WO2022259612A1 - Information processing device and program - Google Patents

Information processing device and program

Info

Publication number
WO2022259612A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
location
information processing
processing device
unit
Prior art date
Application number
PCT/JP2022/004791
Other languages
French (fr)
Japanese (ja)
Inventor
啓宏 王
公伸 西村
篤史 内田
雅友 倉田
崇 小形
Original Assignee
Sony Group Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Priority to JP2023527485A (patent/JPWO2022259612A1/ja)
Publication of WO2022259612A1 (patent/WO2022259612A1/en)


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/64Protecting data integrity, e.g. using checksums, certificates or signatures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/08Insurance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/32Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials

Definitions

  • the present technology relates to an information processing device and a program, and more particularly to an information processing device and a program that can guarantee the authenticity of position information of the information processing device.
  • This technology has been developed in view of such circumstances, and makes it possible to guarantee the authenticity of location information of an information processing device without using a dedicated device.
  • An information processing apparatus includes a position detection unit that detects a current position when a predetermined trigger is detected and generates first position information including the current position and current time; a location certification acquisition unit that requests location certification from a first information processing device existing in the vicinity and receives first location certification information from the first information processing device; and a location registration unit that transmits the first location information and the first location certification information to a second information processing device that records the first location information and the first location certification information.
  • A program causes a computer to execute a process of detecting a current position when a predetermined trigger is detected, generating position information including the current position and current time, requesting location certification from a first information processing device existing in the surroundings when the trigger is detected, receiving location certification information from the first information processing device, and transmitting the location information and the location certification information to a second information processing device that records the location information and the location certification information.
  • When a predetermined trigger is detected, a current position is detected, position information including the current position and current time is generated, a first information processing device existing in the surroundings is requested to provide location certification, location certification information is received from the first information processing device, and the location information and the location certification information are transmitted to a second information processing device that records the location information and the location certification information.
  • An information processing apparatus includes a location certification unit that, upon receiving a request for location certification transmitted from a first information processing apparatus that has detected a predetermined trigger to other information processing apparatuses in the surrounding area, generates location certification information and transmits the location certification information.
  • A program causes a computer to execute a process of generating location certification information upon receiving a request for location certification transmitted by an information processing device that has detected a predetermined trigger to other information processing devices in the vicinity, and transmitting the location certification information.
  • An information processing device includes a verification unit that verifies report data received from a first information processing device, the report data including first position information that indicates the current position and current time of the first information processing device when the first information processing device detected a predetermined trigger, using location certification information generated, in response to a request for location certification from the first information processing device, by a second information processing device that was present around the first information processing device when the trigger was detected; and an execution unit that executes processing corresponding to the report data when the verification unit determines that the report data is authentic.
  • The report data received from the first information processing device, which includes first position information indicating the current position and current time of the first information processing device when the first information processing device detected a predetermined trigger, is verified using the location certification information generated, in response to the request for location certification from the first information processing device, by the second information processing device that was present in the vicinity of the first information processing device at the time of detection of the trigger, and when the report data is determined to be authentic, a process corresponding to the report data is executed.
  • FIG. 10 is a diagram showing a format example of a PoL request;
  • FIG. 10 is a diagram showing a format example of a PoL response;
  • FIG. 10 is a diagram showing a format example of a PoL transaction;
  • FIG. 4 is a diagram showing a format example of a PoL block;
  • FIG. 6 is a sequence diagram showing the processing of FIG. 5;
  • FIG. 10 is a flowchart for explaining details of PoL request verification processing;
  • FIG. 10 is a flowchart for explaining the details of PoL response verification processing;
  • It is a diagram showing the flow of collection processing of eyewitness information of an accident;
  • FIG. 11 is a diagram showing an example of a display screen of a witness questionnaire;
  • FIG. 11 is a diagram showing an example of a display screen of a witness questionnaire;
  • FIG. 11 is a diagram showing an example of a display screen of a witness questionnaire;
  • FIG. 15 is a sequence diagram showing the processing of FIG. 14;
  • It is a diagram showing the flow of calculation processing of insurance premiums and the like for risk-segmented insurance;
  • FIG. 20 is a sequence diagram showing the processing of FIG. 19;
  • FIG. 10 is a diagram showing the flow of calculation processing for insurance claims for leisure insurance or non-life insurance;
  • FIG. 10 is a diagram showing the flow of calculation processing of health promotion insurance premiums and the like;
  • FIG. 10 is a sequence diagram showing a process of storing image data and recording position information and position certification information at the time of shooting;
  • FIG. 10 is a sequence diagram showing eyewitness information collection processing;
  • FIG. 11 is a block diagram showing a second embodiment of an information processing system to which the present technology is applied;
  • It is a block diagram showing a functional configuration example of an accident trigger generator;
  • FIG. 2 is a block diagram showing an example of the functional configuration of a camera
  • FIG. 3 is a block diagram showing a functional configuration example of a server
  • FIG. 26 is a sequence diagram showing the flow of processing of the information processing system of FIG. 25
  • FIG. 4 is a flowchart for explaining the details of position verification processing
  • FIG. 1 shows a configuration example of an information processing system 1 as a first embodiment of an information processing system to which the present technology is applied.
  • the information processing system 1 is a system for executing various insurance processing.
  • the information processing system 1 includes information processing terminals 11 - 1 to 11 - n, a server 12 , and a blockchain network 13 .
  • the information processing terminals 11-1 to 11-n, the server 12, and the blockchain network 13 are connected via a network 21, and can communicate with each other.
  • the information processing terminals 11-1 to 11-n can communicate directly using short-range wireless communication without going through the network 21.
  • the information processing terminals 11-1 to 11-n are simply referred to as the information processing terminal 11 when there is no need to distinguish them individually.
  • the information processing terminal 11 is configured by, for example, a portable information processing device that can be carried by the user or worn by the user.
  • the information processing terminal 11 is configured by a smart phone, a mobile phone, a tablet terminal, a wearable device, an action camera, a portable music player, a portable game machine, or the like.
  • the information processing terminal 11 is configured by an information processing device such as a drive recorder, which is mounted on a mobile object such as a vehicle (including a two-wheeled vehicle), and shoots and records the surroundings of the mobile object.
  • the information processing terminal 11 is composed of, for example, a dedicated information processing device installed at an arbitrary location outdoors or indoors.
  • The information processing terminal 11 is used, for example, by a user of the insurance provided by the server 12. For example, the information processing terminal 11 generates declaration data for making various declarations regarding insurance, and transmits the declaration data to the server 12 via the network 21. The information processing terminal 11 receives various data transmitted by the server 12 in response to the declaration data via the network 21.
  • the information processing terminal 11 is used for certifying the positions of other information processing terminals 11 in the vicinity.
  • For example, the information processing terminal 11 (hereinafter referred to as the Prover) possessed by the insurance user detects the current position and transmits position information including the detected current position and the current time to surrounding information processing terminals 11 (hereinafter referred to as Witnesses) to request location certification.
  • the position verification is processing for verifying the position information of the Prover.
  • position proof is a process of proving that the Prover was present at the position indicated by the position information at the time indicated by the position information.
  • the location certification information is information that certifies the location information of the Prover.
  • location proof information is information that proves that the Prover was present at the location indicated by the location information at the time indicated by the location information.
  • the location proof information includes location information (Witness location information) including the current location and current time of the Witness.
  • the Prover transmits transaction data including location information and location proof information to the blockchain network 13 via the network 21, and causes the Prover's location information and location proof information to be recorded in the blockchain.
  • The time may include not only the time of day but also the date and the day of the week.
  • Similarly, the current time may include not only the current time of day but also the current date and day of the week.
  • each information processing terminal 11 normally operates as a Witness, operates as a Prover when a predetermined trigger is detected, and returns to the Witness after completing the operation as a Prover.
  • For example, when the information processing terminal 11 is present near the accident site when an accident occurs, it generates eyewitness information of the accident in response to a request from the server 12 and transmits it to the server 12 via the network 21.
  • the server 12 executes various processes related to insurance while exchanging various data with the information processing terminal 11 and the blockchain network 13.
  • the blockchain network 13 is composed of a network in which multiple nodes are connected.
  • (Each node of) the blockchain network 13 updates and maintains a blockchain in which blocks including the location information and location certification information of each information processing terminal 11 are connected.
  • The blockchain network 13 extracts from the blockchain blocks containing location information and location certification information that meet the conditions presented by the server 12, and sends them to the server 12 via the network 21.
  • FIG. 2 is a block diagram showing a functional configuration example of the information processing terminal 11 of FIG. 1.
  • The information processing terminal 11 includes a CPU (Central Processing Unit) 101, a memory 102, a storage 103, an operation unit 104, a display unit 105, a speaker 106, an imaging unit 107, a GNSS (Global Navigation Satellite System) receiver 108, a sensing unit 109, a communication unit 110, an external I/F 111, and a drive 112.
  • the CPU 101 to drive 112 are connected to a bus and perform necessary communications with each other.
  • the CPU 101 performs various processes by executing programs installed in the memory 102 and storage 103.
  • the memory 102 is composed of, for example, a volatile memory or the like, and temporarily stores programs executed by the CPU 101 and necessary data.
  • the storage 103 is composed of, for example, a hard disk or non-volatile memory, and stores programs executed by the CPU 101 and necessary data.
  • the operation unit 104 is composed of physical keys (including a keyboard), a mouse, a touch panel, and the like.
  • the operation unit 104 outputs an operation signal corresponding to the user's operation onto the bus.
  • the display unit 105 is composed of, for example, an LCD (Liquid Crystal Display) or the like, and displays an image according to data supplied from the bus.
  • the touch panel as the operation unit 104 is made of a transparent member and can be configured integrally with the display unit 105 . Accordingly, the user can input information by operating icons, buttons, and the like displayed on the display unit 105 .
  • the speaker 106 outputs sound according to the data supplied from the bus.
  • the imaging unit 107 captures an image (still image, moving image) (perceives light) and outputs the corresponding image data onto the bus.
  • the GNSS receiver 108 receives signals from GNSS satellites and detects the current position of the information processing terminal 11 based on the received signals.
  • the GNSS receiver 108 outputs data indicating the detection result of the current position (hereinafter referred to as position detection data) onto the bus.
  • the sensing unit 109 includes various sensors, and outputs sensor data output from each sensor onto the bus.
  • the sensing unit 109 includes, for example, sensors for detecting user behavior, such as motion sensors, acceleration sensors, angular velocity sensors, and the like.
  • The communication unit 110 includes a communication circuit, an antenna, and the like, and communicates with other information processing terminals 11, the server 12, and the blockchain network 13 via the network 21. The communication unit 110 also performs short-range wireless communication with other information processing terminals 11 using a predetermined method, for example Bluetooth (registered trademark, hereinafter referred to as BT), without going through the network 21.
  • An external I/F (interface) 111 is an interface for exchanging data with various external devices.
  • the drive 112 is capable of attaching and detaching a removable medium 112A such as a memory card, and drives the attached removable medium 112A.
  • The program executed by the CPU 101 can be recorded in advance in the storage 103 as a recording medium incorporated in the information processing terminal 11.
  • The program can be stored (recorded) in the removable medium 112A, provided as so-called package software, and installed in the information processing terminal 11 from the removable medium 112A.
  • the program can be downloaded from a server (not shown) or the like via the network 21 and the communication unit 110 and installed in the information processing terminal 11.
  • FIG. 3 shows a configuration example of functions realized by the CPU 101 executing a program installed in the information processing terminal 11.
  • Functions including, for example, the control unit 131, the position detection unit 132, the accident detection unit 133, the location certification processing unit 134, and the report unit 135 are realized.
  • the control unit 131 controls the processing of each unit of the information processing terminal 11 .
  • the position detection unit 132 detects the current position of the information processing terminal 11 based on the position detection data output from the GNSS receiver 108 .
  • the position detection unit 132 generates position information including the current position and current time of the information processing terminal 11 .
  • Based on at least one of the image data output from the imaging unit 107 and the sensor data output from the sensing unit 109, the accident detection unit 133 detects an accident involving the user who carries the information processing terminal 11 or an accident occurring around the information processing terminal 11.
  • the location certification processing unit 134 performs processing related to location certification of the information processing terminal 11 .
  • the location proof processing unit 134 includes a Prover processing unit 141 and a Witness processing unit 142 .
  • the Prover processing unit 141 performs processing when the information processing terminal 11 operates as a Prover, that is, when the information processing terminal 11 asks surrounding information processing terminals 11 to verify its location.
  • the Prover processing unit 141 includes a location certification acquisition unit 151 , a location registration unit 152 and an image registration unit 153 .
  • When location certification is required, the location certification acquisition unit 151 generates a PoL (Proof of Location) request, including the location information of the information processing terminal 11, for requesting location certification from the surrounding information processing terminals 11.
  • the location certification acquisition unit 151 transmits a PoL request to the surrounding information processing terminals 11 via the communication unit 110 by BT.
  • the location certification acquisition unit 151 receives, via the communication unit 110, a PoL response including location certification information that has been sent from the surrounding information processing terminal 11 in response to a PoL request.
  • the location registration unit 152 registers the location of the information processing terminal 11 in the blockchain network 13. Specifically, the location registration unit 152 generates a PoL transaction including the location information of the information processing terminal 11 and the location certification information acquired from the surrounding information processing terminals 11 . The location registration unit 152 broadcasts PoL transactions to the blockchain network 13 via the communication unit 110 and the network 21 . As a result, a PoL block based on the PoL transaction is added to the blockchain, and the location information and the location proof information of the information processing terminal 11 are recorded in the blockchain.
  • the image registration unit 153 registers image data corresponding to images captured by the image capturing unit 107 in the server 12 .
  • the image registration unit 153 transmits the image data to the server 12 via the communication unit 110 and the network 21 and causes the server 12 to store the image data as necessary.
  • The Witness processing unit 142 performs processing when the information processing terminal 11 operates as a Witness, that is, when the information processing terminal 11 performs location certification of the surrounding information processing terminals 11.
  • The Witness processing unit 142 includes a location certification unit 161 and an information providing unit 162.
  • When the location certification unit 161 receives a PoL request from the Prover via the communication unit 110, it generates, based on the PoL request, a PoL response including location certification information that contains the Witness's location information. The location certification unit 161 transmits the PoL response to the Prover by BT via the communication unit 110.
  • the information providing unit 162 generates information requested by the server 12 (for example, eyewitness information of an accident, etc.) and transmits it to the server 12 via the communication unit 110 and the network 21 .
  • the declaration unit 135 generates declaration data for making various declarations regarding insurance provided by the server 12 and transmits the declaration data to the server 12 via the communication unit 110 and the network 21 .
  • the reporting unit 135 receives various data transmitted from the server 12 with respect to the reporting data via the network 21 and the communication unit 110 .
  • FIG. 4 is a block diagram showing a functional configuration example of the server 12. As shown in FIG.
  • the server 12 includes a CPU 201 , a memory 202 , a storage 203 , an insurance DB (Data Base) 204 , an operation section 205 , a display section 206 , a communication section 207 , an external I/F 208 and a drive 209 .
  • the CPU 201 to drive 209 are connected to a bus and perform necessary communications with each other.
  • The CPU 201 through the storage 203, the operation unit 205, the display unit 206, the external I/F 208, and the drive 209 are configured similarly to the CPU 101 through the storage 103, the operation unit 104, the display unit 105, the external I/F 111, and the drive 112, respectively.
  • the insurance DB 204 stores various data related to insurance provided by the server 12.
  • the insurance DB 204 stores various data related to insurance to be provided and policyholders.
  • insurance DB 204 stores eyewitness information for accidents involving policyholders.
  • the insurance DB 204 stores image data transmitted from the information processing terminal 11 of the policyholder.
  • the communication unit 207 includes a communication circuit, an antenna, etc., and communicates with the information processing terminal 11 and the blockchain network 13 via the network 21 .
  • the program executed by the CPU 201 can be recorded in advance in the storage 203 as a recording medium incorporated in the server 12 .
  • the program can be stored (recorded) in the removable media 209A, provided as package software, and installed in the server 12 from the removable media 209A.
  • the program can be downloaded from another server (not shown) or the like via the network 21 and the communication unit 207 and installed on the server 12 .
  • Functions including a control unit 231, a verification unit 232, an information collection unit 233, and an insurance processing unit 234 are realized by the CPU 201 executing a program installed in the server 12.
  • The control unit 231 controls the processing of each unit of the server 12.
  • the verification section 232 verifies the declaration data received from the information processing terminal 11 via the communication section 207 and the network 21 .
  • the verification unit 232 requests the blockchain network 13 via the communication unit 207 and the network 21 to collate data (for example, location information) included in the declaration data.
  • the verification unit 232 receives data indicating the result of data matching from the blockchain network 13 via the network 21 and the communication unit 207 .
  • the verification unit 232 verifies the declaration data based on the verification result or the like received from the blockchain network 13 .
  • the information collection unit 233 requests information necessary for insurance processing from the information processing terminal 11 via the communication unit 207 and the network 21 as necessary, and receives the requested information. For example, the information collecting unit 233 requests eyewitness information of an accident from the information processing terminal 11 via the communication unit 207 and the network 21 and receives the eyewitness information from the information processing terminal 11 .
  • the insurance processing unit 234 performs various processes related to the insurance provided by the server 12.
  • the insurance processing unit 234 has a calculation unit 241 and an execution unit 242 .
  • The calculation unit 241 calculates insurance premiums, insurance money, and privileges (for example, cashback) related to the insurance contracted by the user, based on the declaration data received from the information processing terminal 11.
  • the privilege does not necessarily have to be money, and may be, for example, goods or points.
  • the execution unit 242 executes processing related to various insurance services. For example, the execution unit 242 executes insurance premium billing processing, insurance payment processing, privilege provision processing, and the like with the information processing terminal 11 via the communication unit 207 and the network 21 .
  • In step S1, the control unit 131 of the Prover activates APP1 that implements the location proof processing unit 134.
  • APP1 may always operate in the background of the OS (Operating System).
  • the Witness control unit 131 similarly activates APP2 that implements the location proof processing unit 134 .
  • APP2 may always operate in the background of the OS.
  • In step S2, the position detection unit 132 of the Prover generates position information. Specifically, the position detection unit 132 detects the current position of the Prover based on the position detection data output from the GNSS receiver 108 when a predetermined trigger is detected.
  • a predetermined event or predetermined timing is set as a predetermined trigger.
  • For example, an event such as the occurrence of an accident (e.g., a collision or a fall) or a predetermined user operation on the operation unit 104 is set as a trigger.
  • the timing of elapse of a predetermined time, arrival at a predetermined time, etc. is set as a trigger.
  • a trigger is detected at predetermined time intervals. Note that the time interval may or may not be constant.
  • the position detection unit 132 generates position information including the current position and current time of the Prover.
  • the location detection unit 132 supplies the location information to the location certification acquisition unit 151 and stores it in the storage 103 . If there is metadata associated with the location information, the location detection unit 132 stores the metadata in the storage 103 in association with the location information.
  • As the metadata associated with the position information, for example, image data corresponding to an image (moving image or still image) captured by the imaging unit 107 at the current position of the Prover, sensor data, and the like are assumed.
  • In step S3, the Prover's location certification acquisition unit 151 generates a metadata fingerprint as necessary. For example, if there is metadata associated with the location information generated in step S2, the location certification acquisition unit 151 generates a fingerprint by calculating a hash value of the metadata.
  • In step S4, the Prover's location certification acquisition unit 151 generates a PoL request.
  • FIG. 6 shows a format example of a PoL request.
  • The PoL request contains prover_address, latitude, longitude, timestamp, metadata_fingerprint, and signature.
  • prover_address is the Prover's public key.
  • latitude indicates the latitude of the Prover's current position.
  • longitude indicates the longitude of the Prover's current position.
  • timestamp indicates the generation time of the PoL request.
  • the current time included in the position information generated in the process of step S2 is set as timestamp.
  • the PoL request includes location information (latitude, longitude, and timestamp) that is the target of location certification.
  • metadata_fingerprint is the fingerprint of the metadata associated with the Prover's location information.
  • If there is no metadata associated with the location information, the value of metadata_fingerprint is set to NULL. Also, for example, instead of the fingerprint, the metadata itself may be stored in the PoL request.
  • the signature is the Prover's electronic signature.
  • The signature is generated by encrypting a plaintext hash value containing prover_address, latitude, longitude, timestamp, and metadata_fingerprint with a private key corresponding to prover_address (the public key).
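  • For illustration only, the following sketch shows one way the fields above could be assembled and signed. The JSON serialization, the SECP256K1 curve, and the Python cryptography package are assumptions of the example and are not specified by the present disclosure; "encrypting the plaintext hash value with the private key" is realized here as an ordinary ECDSA signature.

```python
# Hypothetical sketch of assembling and signing a PoL request.
# Field names follow the format example above; key format, curve,
# and serialization are assumptions, not part of the disclosure.
import hashlib
import json
import time

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec


def metadata_fingerprint(metadata):
    # Fingerprint = hash value of the metadata (image data, sensor data, etc.);
    # NULL (None) when no metadata is associated with the location information.
    return hashlib.sha256(metadata).hexdigest() if metadata else None


def build_pol_request(prover_key, latitude, longitude, metadata=None):
    # prover_address is the Prover's public key.
    prover_address = prover_key.public_key().public_bytes(
        serialization.Encoding.X962,
        serialization.PublicFormat.CompressedPoint,
    ).hex()
    body = {
        "prover_address": prover_address,
        "latitude": latitude,
        "longitude": longitude,
        "timestamp": int(time.time()),  # generation time of the PoL request
        "metadata_fingerprint": metadata_fingerprint(metadata),
    }
    # An ECDSA signature over the serialized fields stands in for
    # "encrypting the plaintext hash value with the private key".
    plaintext = json.dumps(body, sort_keys=True).encode()
    body["signature"] = prover_key.sign(plaintext, ec.ECDSA(hashes.SHA256())).hex()
    return body


# Example usage (keys would normally be provisioned to the terminal):
# prover_key = ec.generate_private_key(ec.SECP256K1())
# request = build_pol_request(prover_key, 35.6812, 139.7671, b"<image data>")
```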
  • In step S5, the Prover's location certification acquisition unit 151 transmits the PoL request by BT via the communication unit 110 to other information processing terminals 11 (Witnesses) existing within a predetermined range from the Prover.
  • The predetermined range is set, for example, to the BT communicable range. In this way, proof of the Prover's location is requested from the Witnesses that exist around the Prover.
  • the witness CPU 101 receives the PoL request via the communication unit 110 .
  • In step S6, the Witness's location certification unit 161 verifies the PoL request. A method of verifying the PoL request will be described later with reference to FIG.
  • In step S7, when the Witness's location certification unit 161 determines as a result of the verification that the PoL request is valid, it generates a PoL response, which is a response to the PoL request.
  • FIG. 7 shows an example of the PoL response format.
  • the PoL response includes pol_request_signed, witness_address, latitude, longitude, timestamp, metadata_fingerprint, and signature.
  • pol_request_signed is PoL request data corresponding to the PoL response. For example, all data of the PoL request is stored as it is in the PoL response as pol_request_signed. Therefore, the PoL response contains the Prover's location information included in the PoL request.
  • the witness_address is the public key of the Witness.
  • Latitude indicates the latitude of the Witness's current position.
  • longitude indicates the longitude of the Witness's current position.
  • timestamp indicates the PoL response generation time (location proof time). For example, the detection time of the Witness's current position (latitude and longitude) is set to timestamp.
  • the PoL response contains the Witness's location information (latitude, longitude, and timestamp) at the time of location verification.
  • metadata_fingerprint is the fingerprint of the metadata associated with the Witness's location information.
  • If there is no metadata associated with the location information, the value of metadata_fingerprint is set to NULL. Also, for example, instead of the fingerprint, the metadata itself may be stored in the PoL response.
  • the signature is the Witness's electronic signature.
  • a signature is generated by encrypting a plaintext hash value containing pol_request_signed, witness_address, latitude, longitude, timestamp, and metadata_fingerprint with a private key corresponding to witness_address (public key).
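  • A corresponding sketch for the Witness side is shown below. It echoes the entire signed PoL request as pol_request_signed and signs the Witness's own fields together with it, under the same assumed conventions (JSON serialization, SECP256K1, the cryptography package) as the request sketch.

```python
# Hypothetical sketch of a Witness building a PoL response for a received
# PoL request, reusing the assumed ECDSA/JSON conventions of the request sketch.
import hashlib
import json
import time

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec


def build_pol_response(witness_key, pol_request, latitude, longitude, metadata=None):
    witness_address = witness_key.public_key().public_bytes(
        serialization.Encoding.X962,
        serialization.PublicFormat.CompressedPoint,
    ).hex()
    body = {
        "pol_request_signed": pol_request,   # the PoL request data stored as-is
        "witness_address": witness_address,  # the Witness's public key
        "latitude": latitude,                # the Witness's current position
        "longitude": longitude,
        "timestamp": int(time.time()),       # time of the location proof
        "metadata_fingerprint": (
            hashlib.sha256(metadata).hexdigest() if metadata else None
        ),
    }
    plaintext = json.dumps(body, sort_keys=True).encode()
    body["signature"] = witness_key.sign(plaintext, ec.ECDSA(hashes.SHA256())).hex()
    return body
```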
  • In step S8, the Witness's location certification unit 161 transmits the PoL response to the Prover by BT via the communication unit 110.
  • the Prover's CPU 101 receives the PoL response sent from the Witness via the communication unit 110 .
  • In step S9, the Prover generates and broadcasts a PoL transaction. Specifically, the Prover's location certification acquisition unit 151 verifies the PoL response. A PoL response verification method will be described later with reference to FIG. When the location certification acquisition unit 151 determines that the PoL response is valid as a result of the verification, it supplies the PoL response to the location registration unit 152.
  • the location registration unit 152 generates a PoL transaction corresponding to the PoL response.
  • FIG. 8 shows a format example of a PoL transaction.
  • a PoL transaction includes sender_address, recipient_address, value, data, and signature.
  • sender_address indicates the address of the sender. For example, sender_address is set to "THE BLOCKCHAIN".
  • recipient_address indicates the address of the recipient. For example, the recipient_address is set to "THE BLOCKCHAIN".
  • value is set to 0, for example.
  • data contains the PoL response data received in response to the PoL request. Therefore, the PoL transaction contains the Prover's location information and the Witness's location information. Note that when PoL responses are received from a plurality of Witnesses in response to a PoL request, data includes the data of the plurality of PoL responses.
  • In the case of a blockchain dedicated to location proof data (a blockchain that does not include remittance information), for example, it is possible to omit sender_address, recipient_address, and value so that the PoL transaction includes only data.
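  • As a minimal sketch, a PoL transaction as described above could be assembled as follows. The literal "THE BLOCKCHAIN" addresses and the zero value follow the format example; the signing and broadcast steps are only indicated.

```python
# Hypothetical sketch of wrapping one or more PoL responses into a PoL
# transaction before it is broadcast to the blockchain network.
def build_pol_transaction(pol_responses):
    return {
        "sender_address": "THE BLOCKCHAIN",     # per the format example
        "recipient_address": "THE BLOCKCHAIN",  # per the format example
        "value": 0,                             # no remittance amount
        # data holds the PoL response(s) received for the PoL request, i.e.
        # both the Prover's and the Witnesses' location information.
        "data": list(pol_responses),
        # An electronic signature would be attached here in the same way as
        # in the PoL request and PoL response sketches above.
    }
```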
  • the location registration unit 152 broadcasts the PoL transaction to the blockchain network 13 via the communication unit 110 and the network 21.
  • each node of the blockchain network 13 receives PoL transactions via the network 21.
  • In step S10, the blockchain network 13 generates a PoL block based on the PoL transaction and adds it to the blockchain.
  • FIG. 9 shows a format example of a PoL block.
  • A PoL block includes block_number, timestamp, transactions, previous_hash, nonce, miner_address, and signature.
  • block_number indicates the block number of the PoL block.
  • timestamp indicates the generation time of the PoL block.
  • transactions contains a list of one or more PoL transactions.
  • previous_hash is the hash value of the PoL block immediately preceding the PoL block in question in the blockchain.
  • a nonce is a nonce value calculated by, for example, PoW (Proof of Work) or PoS (Proof of Stake).
  • miner_address is the public key of the miner who mined the PoL transaction.
  • the signature is the miner's electronic signature.
  • The signature is generated by encrypting a plaintext hash value including block_number, timestamp, transactions, previous_hash, nonce, and miner_address with a private key corresponding to miner_address (the public key).
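  • The sketch below illustrates how previous_hash links each PoL block to its predecessor. Hashing a JSON serialization of the block is an assumption of the example; mining (the nonce search by PoW or PoS) and the miner's signature are only indicated.

```python
# Hypothetical sketch of forming a PoL block that links to the previous block
# via previous_hash. The nonce search and the miner's signature are omitted.
import hashlib
import json
import time


def block_hash(block):
    # Hash of a canonical JSON serialization of the block (assumption).
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


def build_pol_block(previous_block, pol_transactions, miner_address, nonce):
    return {
        "block_number": previous_block["block_number"] + 1,
        "timestamp": int(time.time()),           # generation time of the block
        "transactions": list(pol_transactions),  # one or more PoL transactions
        "previous_hash": block_hash(previous_block),
        "nonce": nonce,                          # found by PoW/PoS in practice
        "miner_address": miner_address,          # the miner's public key
        # signature: the miner's electronic signature over the fields above
    }
```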
  • the declaration unit 135 of the Prover generates and transmits declaration data.
  • The declaration data includes data necessary for the declaration, for example, a registration ID for identifying user U1, the location information of the Prover, the metadata associated with the location information of the Prover (the metadata on which the metadata_fingerprint of the PoL request is based), the public key of the Prover (the prover_address of the PoL request), the insurance contract period, and the like.
  • the declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21 .
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • the verification unit 232 of the server 12 requests verification of the declaration data. For example, the verification unit 232 extracts from the insurance DB 204 information about the user corresponding to the registration ID included in the declaration data. If the verification unit 232 determines that the user U1 is a legitimate user (for example, an insurance policyholder) based on the extracted information, the verification unit 232 requests the blockchain network 13 to verify the declaration data.
  • For example, if the declaration data includes location information but no metadata, the verification unit 232 requests the blockchain network 13 via the communication unit 207 and the network 21 to collate the location information included in the declaration data.
  • For example, if the declaration data includes location information and metadata, the verification unit 232 generates a fingerprint of the metadata. The verification unit 232 then requests the blockchain network 13 via the communication unit 207 and the network 21 to collate the location information included in the declaration data and the generated fingerprint.
  • In step S13, the blockchain network 13 verifies the declaration data and transmits the verification result.
  • For example, when collation of location information is requested, the blockchain network 13 searches the blockchain for a PoL block containing location information that matches that location information; that is, it searches for the PoL block containing the location information included in the declaration data and the location certification information for that location information.
  • When collation of location information and a metadata fingerprint is requested, the blockchain network 13 searches the blockchain for a PoL block containing location information and a fingerprint that match them; that is, it searches for the PoL block containing the location information included in the declaration data, the fingerprint of the metadata, and the location certification information for that location information.
  • When the blockchain network 13 detects the relevant PoL block, it transmits the detected PoL block to the server 12 via the network 21.
  • the CPU 201 of the server 12 receives the PoL block via the network 21 and the communication unit 207.
  • If the blockchain network 13 fails to detect the relevant PoL block, it notifies the server 12 via the network 21 that the relevant PoL block does not exist.
  • In step S14, the server 12 verifies the declaration data and executes various services. For example, when the verification unit 232 of the server 12 receives a PoL block from the blockchain network 13, it verifies the PoL response included in the PoL block by the same processing as in FIG. 11, which will be described later. When the verification unit 232 determines that the PoL response is valid as a result of the verification, the verification unit 232 verifies the PoL request included in the PoL response by the same processing as in FIG. 12 described later.
  • When the verification unit 232 determines that the PoL request is valid, it extracts the Prover's location information (latitude, longitude, timestamp) from the PoL request. If the Prover's location information included in the declaration data matches the Prover's location information extracted from the PoL request, the verification unit 232 extracts the Witness's location information (latitude, longitude, timestamp) from the PoL response. Then, if the differences in position and time between the Prover's location information and the Witness's location information are within predetermined ranges, the verification unit 232 determines that the location information of the declaration data is authentic.
  • When the verification unit 232 determines that the PoL request is valid and the declaration data includes metadata, the verification unit 232 extracts the fingerprint of the metadata (metadata_fingerprint) from the PoL request. If the fingerprint generated from the metadata included in the declared data matches the fingerprint extracted from the PoL request, the verification unit 232 determines that the metadata of the declared data is authentic.
  • If the declared data does not contain metadata, the verification unit 232 determines that the declared data is authentic when the position information of the declared data is authentic, and determines that the declared data is not authentic when the position information of the declared data is not authentic.
  • If the declared data contains metadata, the verification unit 232 determines that the declared data is authentic when both the position information and the metadata of the declared data are authentic, and determines that the declared data is not authentic when at least one of the position information and the metadata of the declared data is not authentic.
  • When the verification unit 232 determines that the declaration data is authentic, it supplies the declaration data to the insurance processing unit 234.
  • the insurance processing unit 234 executes processing related to various insurance services provided by the server 12 based on the declaration data. A specific example of processing will be described later.
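  • The authenticity decision described above can be summarized as in the following sketch. The structure of the declared data, the haversine distance, and the distance and time tolerances are placeholder assumptions standing in for whatever an actual implementation would use.

```python
# Hypothetical sketch of the server-side authenticity decision for declared
# data, following the rules described above. Thresholds are placeholders.
import hashlib
import math


def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two latitude/longitude points.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def declaration_is_authentic(declared, pol_request, pol_response,
                             max_distance_m=100.0, max_time_gap_s=60.0):
    # 1. The Prover's location information in the declaration must match the
    #    location information recorded in the PoL request.
    if (declared["latitude"], declared["longitude"], declared["timestamp"]) != (
            pol_request["latitude"], pol_request["longitude"],
            pol_request["timestamp"]):
        return False
    # 2. The Prover's and the Witness's positions and times must agree
    #    within predetermined tolerances.
    if haversine_m(pol_request["latitude"], pol_request["longitude"],
                   pol_response["latitude"], pol_response["longitude"]) > max_distance_m:
        return False
    if abs(pol_request["timestamp"] - pol_response["timestamp"]) > max_time_gap_s:
        return False
    # 3. If metadata is declared, its fingerprint must match metadata_fingerprint.
    metadata = declared.get("metadata")
    if metadata is not None:
        if hashlib.sha256(metadata).hexdigest() != pol_request["metadata_fingerprint"]:
            return False
    return True
```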
  • In step S31, the Prover executes the process of step S2 in FIG. 5 described above to generate position information.
  • the Prover's location certification acquisition unit 151 scans the surrounding witnesses. That is, the location certification acquisition unit 151 scans for witnesses (other information processing terminals 11) existing within a predetermined range around the Prover. For example, the range in which the communication unit 110 can communicate by BT is set as the range to scan Witness. Further, the location certification acquisition unit 151 confirms whether or not the Witness detected by scanning can communicate with the Prover by BT.
  • the Prover generates and transmits a PoL request. That is, the Prover executes the processes of steps S3 to S5 in FIG. 5 described above, generates a PoL request, and transmits it to the witness detected in the process of step S32 by BT via the communication unit 110 .
  • the witness CPU 101 receives the PoL request via the communication unit 110 .
  • In step S34, the Witness's location certification unit 161 verifies the PoL request.
  • In step S35, if the Witness's location certification unit 161 determines that the PoL request is valid, it generates and transmits a PoL response.
  • In step S51, the location certification unit 161 determines whether the format of the PoL request is normal. If the received PoL request conforms to the format of FIG. 6, the location certification unit 161 determines that the format of the PoL request is normal, and the process proceeds to step S52.
  • In step S52, the location certification unit 161 determines whether the PoL request is authentic.
  • For example, the location certification unit 161 calculates a hash value of the plaintext including the prover_address, latitude, longitude, timestamp, and metadata_fingerprint of the PoL request. The location certification unit 161 also obtains a hash value by decrypting the signature of the PoL request using the prover_address of the PoL request. If the hash value calculated from the plaintext of the PoL request matches the hash value decrypted from the signature of the PoL request, the location certification unit 161 determines that the PoL request is authentic, and the process proceeds to step S53.
  • In step S53, the location certification unit 161 determines whether the Prover exists within a predetermined distance range. For example, if the distance between the position of the Prover indicated in the position information included in the PoL request and the current position of the Witness is equal to or less than a predetermined threshold, the location certification unit 161 determines that the Prover exists within the predetermined distance range, and the process proceeds to step S54.
  • In step S54, the location certification unit 161 transmits an OK message to the Prover by BT via the communication unit 110.
  • the witness generates and transmits a PoL response. That is, the witness executes the processes of steps S7 and S8 in FIG. 5 described above, generates a PoL response to the PoL request, and transmits it to the Prover by BT via the communication unit 110 .
  • the Prover's CPU 101 receives the PoL response via the communication unit 110 .
  • On the other hand, in step S53, if the distance between the position of the Prover indicated in the position information included in the PoL request and the current position of the Witness exceeds the predetermined threshold, the location certification unit 161 determines that the Prover does not exist within the predetermined distance range, and the process proceeds to step S56.
  • In step S52, if the hash value calculated from the plaintext of the PoL request and the hash value decrypted from the signature of the PoL request do not match, the location certification unit 161 determines that the PoL request is not authentic, and the process proceeds to step S56.
  • In step S51, if the received PoL request does not conform to the format shown in FIG. 6, the location certification unit 161 determines that the format of the PoL request is not normal, and the process proceeds to step S56.
  • In step S56, the location certification unit 161 transmits an NG message to the Prover by BT via the communication unit 110.
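  • A compact sketch of the checks in steps S51 to S53 is shown below. The required field set, the ECDSA verification call, and the distance threshold are assumptions consistent with the formats and libraries assumed in the earlier sketches.

```python
# Hypothetical sketch of PoL request verification on the Witness side
# (format check, signature check, distance check), mirroring steps S51-S53.
import json
import math

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

REQUEST_FIELDS = {"prover_address", "latitude", "longitude",
                  "timestamp", "metadata_fingerprint", "signature"}


def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters (same helper as in the earlier sketch).
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def verify_pol_request(request, witness_lat, witness_lon, max_distance_m=100.0):
    # S51: is the format normal?
    if set(request) != REQUEST_FIELDS:
        return False
    # S52: is the request authentic? Verify the signature with prover_address.
    body = {k: v for k, v in request.items() if k != "signature"}
    plaintext = json.dumps(body, sort_keys=True).encode()
    public_key = ec.EllipticCurvePublicKey.from_encoded_point(
        ec.SECP256K1(), bytes.fromhex(request["prover_address"]))
    try:
        public_key.verify(bytes.fromhex(request["signature"]),
                          plaintext, ec.ECDSA(hashes.SHA256()))
    except InvalidSignature:
        return False
    # S53: is the Prover within the predetermined distance range of the Witness?
    return haversine_m(request["latitude"], request["longitude"],
                       witness_lat, witness_lon) <= max_distance_m
```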
  • In step S36, the Prover's location certification acquisition unit 151 verifies the PoL response.
  • In step S37, the Prover's location registration unit 152 generates and broadcasts a PoL transaction.
  • In step S71, the location certification acquisition unit 151 determines whether the format of the PoL response is normal. If the received PoL response conforms to the format of FIG. 7, the location certification acquisition unit 151 determines that the format of the PoL response is normal, and the process proceeds to step S72.
  • In step S72, the location certification acquisition unit 151 determines whether the PoL response is authentic.
  • For example, the location certification acquisition unit 151 calculates a hash value of the plaintext including the pol_request_signed, witness_address, latitude, longitude, timestamp, and metadata_fingerprint of the PoL response. The location certification acquisition unit 151 also obtains a hash value by decrypting the signature of the PoL response using the witness_address of the PoL response. If the hash value calculated from the plaintext of the PoL response and the hash value decrypted from the signature of the PoL response match, the location certification acquisition unit 151 determines that the PoL response is authentic, and the process proceeds to step S73.
  • In step S73, the location certification acquisition unit 151 determines whether or not the Witness exists within a predetermined distance range. For example, when the distance between the position of the Witness indicated in the position information included in the PoL response and the current position of the Prover is equal to or less than a predetermined threshold, the location certification acquisition unit 151 determines that the Witness exists within the predetermined distance range, and the process proceeds to step S74.
  • In step S74, the Prover executes the process of step S9 in FIG. 5 described above, generates a PoL transaction, and broadcasts it.
  • each node of the blockchain network 13 receives PoL transactions via the network 21.
  • On the other hand, in step S73, if the distance between the position of the Witness indicated in the position information included in the PoL response and the current position of the Prover exceeds the predetermined threshold, the location certification acquisition unit 151 determines that the Witness does not exist within the predetermined distance range, and the process proceeds to step S75.
  • In step S72, when the hash value calculated from the plaintext of the PoL response and the hash value decrypted from the signature of the PoL response do not match, the location certification acquisition unit 151 determines that the PoL response is not authentic, and the process proceeds to step S75.
  • In step S71, if the received PoL response does not conform to the format of FIG. 7, the location certification acquisition unit 151 determines that the format of the PoL response is not normal, and the process proceeds to step S75.
  • In step S75, the location certification acquisition unit 151 discards the PoL response.
  • the PoL response verification process ends, and the Prover process ends. That is, the Prover processing ends without the PoL transaction being sent to the blockchain network 13 .
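  • The Prover's checks on the PoL response (steps S71 to S73) mirror the request checks. The condensed sketch below extends the request-verification sketch above and reuses its imports and its haversine_m helper; the same caveats apply.

```python
# Hypothetical sketch of PoL response verification on the Prover side
# (steps S71-S73). Uses the imports and the haversine_m helper defined in
# the request-verification sketch above.
RESPONSE_FIELDS = {"pol_request_signed", "witness_address", "latitude",
                   "longitude", "timestamp", "metadata_fingerprint", "signature"}


def verify_pol_response(response, prover_lat, prover_lon, max_distance_m=100.0):
    # S71: is the format normal?
    if set(response) != RESPONSE_FIELDS:
        return False
    # S72: is the response authentic? Verify the signature with witness_address.
    body = {k: v for k, v in response.items() if k != "signature"}
    plaintext = json.dumps(body, sort_keys=True).encode()
    public_key = ec.EllipticCurvePublicKey.from_encoded_point(
        ec.SECP256K1(), bytes.fromhex(response["witness_address"]))
    try:
        public_key.verify(bytes.fromhex(response["signature"]),
                          plaintext, ec.ECDSA(hashes.SHA256()))
    except InvalidSignature:
        return False
    # S73: is the Witness within the predetermined distance range of the Prover?
    return haversine_m(response["latitude"], response["longitude"],
                       prover_lat, prover_lon) <= max_distance_m
```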
  • In step S38, a miner of the blockchain network 13 performs mining and generates a PoL block.
  • In step S39, the blockchain network 13 verifies the generated PoL block and adds it to the blockchain.
  • Specifically, the miner of the blockchain network 13 verifies the validity of the location information and the location certification information included in the generated PoL block and, if it determines that the PoL block is valid, broadcasts the PoL block to the other nodes of the blockchain network 13.
  • Each node that receives a PoL block adds the received PoL block to the blockchain.
  • In step S101, the Prover executes the process of step S11 in FIG. 5 described above.
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • In step S102, the verification unit 232 of the server 12 generates a metadata fingerprint as necessary. For example, if the declaration data includes metadata corresponding to the metadata_fingerprint of the PoL request in FIG. 6, the verification unit 232 generates a fingerprint of the metadata.
  • In step S103, the verification unit 232 of the server 12 queries PoL records based on the declaration data.
  • For example, if the declared data includes location information but not metadata, the verification unit 232 generates a query requesting extraction of a PoL block including location information that matches that location information.
  • For example, if the declaration data includes location information and metadata, the verification unit 232 generates a fingerprint of the metadata. The verification unit 232 then generates a query requesting extraction of a PoL block containing location information and a fingerprint that match the location information included in the declaration data and the generated fingerprint.
  • the verification unit 232 transmits the generated query to the blockchain network 13 via the communication unit 207 and the network 21.
  • (Each node of) the blockchain network 13 receives the query via the network 21.
  • In step S104, the blockchain network 13 extracts and transmits PoL records based on the query. Specifically, (a node of) the blockchain network 13 extracts PoL blocks that match the conditions indicated by the query from the PoL blocks included in the blockchain.
  • For example, when the declaration data includes location information but no metadata, PoL blocks containing location information that matches that location information are extracted.
  • When the declaration data includes location information and metadata, PoL blocks containing location information and a fingerprint that match the location information and the fingerprint of the metadata are extracted.
  • the blockchain network 13 transmits the extracted PoL blocks to the server 12 via the network 21.
  • the CPU 201 of the server 12 receives the extracted PoL block via the network 21 and the communication unit 207.
  • If no matching PoL block is found, the blockchain network 13 notifies the server 12 via the network 21 that the PoL block does not exist.
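  • A sketch of how such a query could be evaluated against the recorded PoL blocks is given below. The flat in-memory list of blocks stands in for the distributed ledger, and the matching keys follow the PoL request format; both are assumptions of the example.

```python
# Hypothetical sketch of extracting the PoL blocks whose recorded location
# information (and, if given, metadata fingerprint) matches a query.
def _request_matches(request, latitude, longitude, timestamp, fingerprint):
    if (request["latitude"], request["longitude"],
            request["timestamp"]) != (latitude, longitude, timestamp):
        return False
    return fingerprint is None or request["metadata_fingerprint"] == fingerprint


def matching_pol_blocks(blockchain, latitude, longitude, timestamp, fingerprint=None):
    # blockchain: iterable of PoL blocks; each block holds PoL transactions,
    # each transaction holds PoL responses, each response echoes the PoL request.
    return [
        block for block in blockchain
        if any(_request_matches(resp["pol_request_signed"], latitude, longitude,
                                timestamp, fingerprint)
               for tx in block["transactions"] for resp in tx["data"])
    ]
```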
  • In step S105, the server 12 verifies the declaration data and executes various services. That is, the server 12 performs the process of step S14 in FIG. 5 described above, and when it determines as a result of verification that the declared data is authentic, it executes processing related to the various insurance services provided by the server 12 based on the declared data.
  • In this way, the Prover can reliably guarantee the authenticity of location information and metadata without using a dedicated device. That is, the Witnesses existing around the Prover transmit the Witness's location information to the Prover as location certification information in association with the Prover's location information. The Prover can then use the location certification information to guarantee the authenticity of the Prover's location information and the metadata associated with the location information.
  • the server 12 can easily confirm the authenticity of the declared data, and if it determines that the declared data is authentic, it can provide an appropriate insurance service based on the declared data.
  • the management cost of personal information and the operating cost of the server 12 can be reduced.
  • For example, the insurance money paid to the user in the event of an accident, or the insurance money paid to the victim of an accident caused by the user, is calculated based on the percentage of negligence of the parties to the accident (the victim and the perpetrator). Eyewitness information from persons other than the parties involved becomes an important basis for determining the percentage of fault.
  • this technology can be applied to the process of collecting eyewitness information about accidents.
  • user U1 is assumed to be a party (perpetrator or victim) of the accident
  • user U2 is assumed to be a witness who was around the accident site when the accident occurred.
  • the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover
  • the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
  • In step S201, the Prover activates APP1 and the Witness activates APP2, as in the process of step S1 in FIG. 5.
  • In step S202, the Prover detects an accident and generates location information. For example, when the accident detection unit 133 detects an accident in which the user U1 is a party, based on at least one of the image data output from the imaging unit 107 and the sensor data output from the sensing unit 109, it notifies the position detection unit 132 of the occurrence of the accident.
  • The type and number of data used for accident detection (hereinafter referred to as accident data) are not particularly limited. For example, image data, impact data from an impact sensor, and the like are used.
  • the position detection unit 132 detects the current position of the Prover based on the position detection data output from the GNSS receiver 108 .
  • the position detector 132 generates position information including the current position and current time of the Prover.
  • the position detection unit 132 acquires accident data acquired before and after the accident from the accident detection unit 133 .
  • the location detection unit 132 supplies location information and accident data to the location certification acquisition unit 151 .
  • the position detection unit 132 associates the position information and the accident data and stores them in the storage 103 .
  • step S203 the Prover's location certification acquisition unit 151 generates a fingerprint of the accident data.
  • step S204 the Prover generates a PoL request, similar to the process at step S4 in FIG.
  • the fingerprint of the accident data is stored in the PoL request as metadata_fingerprint.
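  • The following is a minimal sketch of how such a fingerprint could be computed and placed into a PoL request; SHA-256 and all field names other than metadata_fingerprint are assumptions made for illustration.

      import hashlib
      import time

      def fingerprint(data: bytes) -> str:
          # The specification does not fix the hash function; SHA-256 is assumed here.
          return hashlib.sha256(data).hexdigest()

      def build_pol_request(prover_public_key: str, latitude: float,
                            longitude: float, accident_data: bytes) -> dict:
          # Only "metadata_fingerprint" is a field name used in the text;
          # the other keys are illustrative.
          return {
              "prover_address": prover_public_key,
              "latitude": latitude,
              "longitude": longitude,
              "timestamp": int(time.time()),
              "metadata_fingerprint": fingerprint(accident_data),
          }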
  • steps S205 to S210 the same processing as steps S5 to S10 in FIG. 5 described above is performed.
  • a PoL Response corresponding to the PoL Request is generated by the Witness around the Prover, and a PoL Block containing the PoL Response is added to the blockchain.
  • the location information of the Prover at the time of the accident, the location verification information by the Witness, and the fingerprint of the accident data are recorded in the blockchain.
  • step S211 the declaration unit 135 of the Prover generates and transmits declaration data.
  • Specifically, the reporting unit 135 acquires from the storage 103 the location information generated by the Prover when the accident occurred and the accident data associated with that location information.
  • the report unit 135 generates report data including the acquired position information and accident data, the registration ID of the user U1, and the public key of the Prover.
  • the declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21 .
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • steps S212 and S213 processing similar to steps S12 and S13 in FIG. 5 described above is executed.
  • PoL blocks containing location information and fingerprints that match the fingerprints of the location information and accident data included in the report data are extracted from the blockchain and sent to the server 12 .
  • step S214 the information collection unit 233 of the server 12 estimates the witness. Specifically, the information collecting unit 233 identifies the witness (for example, the information processing terminal 11 - 2 ) that generated the PoL response included in the PoL block received from the blockchain network 13 . In addition, the information collecting unit 233 estimates the specified Witness user (for example, the user U2, etc.) as a witness.
  • the information collection unit 233 of the server 12 generates and broadcasts a witness questionnaire.
  • the information collecting unit 233 generates a witness questionnaire for collecting eyewitness information from eyewitnesses based on the PoL responses of each witness included in the PoL block received from the blockchain network 13 .
  • the eyewitness questionnaire includes a public key (witness_address) and location information (latitude, longitude, timestamp) included in the Witness PoL response, and information indicating the contents of the questionnaire.
  • the information collecting unit 233 broadcasts the eyewitness questionnaire to the information processing terminal 11 (Witness) of the eyewitness estimated in step S214 via the communication unit 207 and the network 21 .
  • the Witness CPU 101 receives the eyewitness questionnaire via the network 21 and the communication unit 110 .
  • step S216 the Witness information providing unit 162 generates and transmits responses to the eyewitness questionnaire. Specifically, when the public key (witness_address) included in the eyewitness questionnaire matches the public key of the Witness, the information providing unit 162 displays the screen of FIG. 15 on the display unit 105.
  • a map 301 showing the site of the accident is displayed in the background.
  • An image 302 of the vicinity of the accident site is displayed on the map 301 .
  • a window 303 is displayed above the map 301 containing a message to a user (eg, user U2) and information about the accident.
  • the window 303 displays a message stating that a witness is being searched for, the time when the accident occurred, and an overview. Also displayed are a "Yes” button and a “No” button, along with a message asking whether the user was a witness to the accident.
  • When the "Yes" button is pressed, the screen in FIG. 16 is displayed on the display unit 105.
  • When the "No" button is pressed, the display of the eyewitness questionnaire ends.
  • the screen in FIG. 16 differs from the screen in FIG. 15 in that window 304 is displayed instead of window 303 .
  • the window 304 displays the conditions regarding the shooting location and shooting time of the image (eyewitness information) requested to be sent. For example, a message requesting the provision of images (photographs) taken near the accident site during a predetermined time period before and after the accident (in this example, after the accident) is displayed. A “+” button, a “next” button, and a “back” button are also displayed.
  • the screen in FIG. 17 differs from the screen in FIG. 16 in that window 305 is displayed instead of window 304 .
  • a window 305 displays a message expressing gratitude for providing information, a "Submit” button, and a "Cancel” button.
  • the image data corresponding to the image selected on the screen of FIG. 16 is transmitted to the server 12.
  • This image data corresponds to an image captured near the site and time of occurrence of an accident involving user U1 (near the position and time indicated by the position information of Prover when the accident occurred).
  • the information providing unit 162 generates eyewitness information including the selected image data.
  • eyewitness information may include information other than image data, such as text data indicating testimony of eyewitnesses.
  • the information providing unit 162 generates an electronic signature of the sighting information using the secret key corresponding to the witness_address (public key).
  • the information providing unit 162 generates answers to the eyewitness questionnaire including eyewitness information and electronic signatures, and transmits them to the server 12 via the communication unit 110 and the network 21 .
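  • A minimal sketch of generating such an electronic signature is shown below, assuming an ECDSA key pair and the third-party Python package ecdsa; the specification only states that the secret key corresponding to witness_address is used, so the curve and encodings are assumptions.

      import hashlib
      from ecdsa import SECP256k1, SigningKey  # assumed signature scheme

      def sign_eyewitness_info(eyewitness_info: bytes, signing_key: SigningKey) -> bytes:
          # Hash the eyewitness information and sign the digest with the
          # private key corresponding to witness_address.
          digest = hashlib.sha256(eyewitness_info).digest()
          return signing_key.sign_digest(digest)

      # Illustrative assembly of the answer to the eyewitness questionnaire.
      sk = SigningKey.generate(curve=SECP256k1)
      witness_address = sk.get_verifying_key().to_string().hex()
      info = b"selected image data and optional testimony text"
      answer = {
          "witness_address": witness_address,
          "eyewitness_info": info,
          "signature": sign_eyewitness_info(info, sk).hex(),
      }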
  • the CPU 201 of the server 12 receives the responses to the eyewitness questionnaire via the network 21 and the communication unit 207.
  • The processing of step S211 to step S216 of FIG. 14 and the subsequent processing will now be described in more detail.
  • step S231 the Prover executes the process of step S11 in FIG.
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • step S232 the verification unit 232 of the server 12 generates a fingerprint of the accident data included in the report data.
  • step S233 the verification unit 232 of the server 12 queries PoL records based on the declaration data. Specifically, the verification unit 232 generates a query requesting extraction of PoL blocks containing position information and fingerprints that match the location information included in the declaration data and the generated fingerprints. The verification unit 232 transmits the generated query to the blockchain network 13 via the communication unit 207 and network 21 .
  • the blockchain network 13 receives the query via the network 21.
  • step S234 the blockchain network 13 extracts and transmits PoL records based on the query, similar to the process of step S103 in FIG. 13 described above. PoL blocks that meet the conditions indicated by the query are thereby extracted from the blockchain and sent to the server 12 .
  • step S235 the verification unit 232 of the server 12 verifies the report data and estimates the witness. Specifically, the verification unit 232 verifies the declaration data by the same process as in step S105 of FIG. 13 described above. When the verification unit 232 determines that the declaration data is authentic, the verification unit 232 supplies the declaration data and the PoL block received from the blockchain network 13 to the information collection unit 233 .
  • the information collection unit 233 identifies the witness that generated the PoL response included in the PoL block.
  • the information collecting unit 233 also presumes that the specified Witness user is a witness.
  • step S236 the server 12 executes the process of step S215 in FIG. 14 described above, generates a witness questionnaire, and broadcasts it to the Witness via the network 21.
  • the Witness CPU 101 receives the eyewitness questionnaire via the network 21 and the communication unit 110 .
  • step S237 the Witness information providing unit 162 confirms the public key of the witness questionnaire. That is, the information providing unit 162 confirms whether or not the public key included in the eyewitness questionnaire matches the public key (witness_address) of the Witness.
  • step S238 if the public key of the eyewitness questionnaire matches the public key of the Witness, the Witness executes the process of step S216 in FIG. 14 described above, generating responses to the eyewitness questionnaire and transmitting them to the server 12.
  • the CPU 201 of the server 12 receives the responses to the eyewitness questionnaire via the network 21 and the communication unit 207.
  • step S239 the information collection unit 233 of the server 12 verifies the answers to the eyewitness questionnaire. Specifically, the information collecting unit 233 calculates a hash value of the eyewitness information included in the answers to the eyewitness questionnaire. The information collecting unit 233 also obtains a hash value by decrypting the electronic signature included in the answers using the Witness's public key (witness_address). If the hash value calculated from the eyewitness information matches the hash value decrypted from the electronic signature, the information collecting unit 233 determines that the answers to the eyewitness questionnaire are authentic. Otherwise, it determines that the answers are not authentic.
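  • The corresponding server-side check can be sketched as follows, under the same assumption of an ECDSA signature; recomputing the hash of the eyewitness information and verifying the signature against witness_address plays the role of the hash comparison described above.

      import hashlib
      from ecdsa import SECP256k1, VerifyingKey, BadSignatureError  # assumed scheme

      def answer_is_authentic(answer: dict) -> bool:
          # Recompute the hash of the eyewitness information and check it against
          # the electronic signature using the Witness public key (witness_address).
          vk = VerifyingKey.from_string(bytes.fromhex(answer["witness_address"]),
                                        curve=SECP256k1)
          digest = hashlib.sha256(answer["eyewitness_info"]).digest()
          try:
              return vk.verify_digest(bytes.fromhex(answer["signature"]), digest)
          except BadSignatureError:
              return False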
  • step S240 the server 12 executes reward remittance processing. Specifically, when the information collecting unit 233 determines that the answers to the eyewitness questionnaire are authentic, the information collecting unit 233 supplies the eyewitness information included in the answers to the insurance processing unit 234.
  • the calculation unit 241 calculates the reward to be paid to the eyewitness based on the content of the eyewitness information, and notifies the execution unit 242 of the calculation result.
  • the execution unit 242 communicates with the Witness via the communication unit 207 and the network 21 to perform the reward remittance processing.
  • step S241 the server 12 transmits information regarding insurance claims. Specifically, the calculation unit 241 calculates the insurance money to be paid to the user U1, for example, based on the report data and eyewitness information, and notifies the execution unit 242 of the calculation result. The execution unit 242 generates information including calculation results of the insurance money, and transmits the information to the Prover via the communication unit 207 and the network 21 .
  • the Prover's CPU 101 receives information on the insurance money via the network 21 and the communication unit 110 .
  • Risk-segmented automobile insurance has conventionally been widespread.
  • premiums are set based on, for example, annual mileage.
  • It is assumed that risk-segmented insurance for bicycles and pedestrians will also spread in the future.
  • the risk is estimated based on the moving route and moving distance of the user (contractor) during the contract period. Then, based on the estimated risk, it is assumed that the insurance premium for the next contract period is set, or benefits such as cash back for the current contract period are given.
  • user U1 is assumed to be a user for whom insurance premiums or benefits are calculated
  • user U2 is assumed to be a user existing around user U1.
  • the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover
  • the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
  • step S301 the Prover activates APP1, and the Witness activates APP2, similar to the process of step S1 in FIG.
  • steps S302 to S309 processes similar to steps S2 and steps S4 to S10 in FIG. 5 described above are periodically performed at predetermined time intervals (for example, one minute intervals).
  • the Prover periodically detects the current location and generates location information.
  • witnesses around the Prover also generate a PoL response proving the Prover's location.
  • a PoL block containing the generated PoL response is then added to the blockchain.
  • Prover's location information and Witness's location proof information are periodically recorded in the blockchain.
  • the declaration unit 135 of the Prover generates and transmits declaration data.
  • the declaring unit 135 reads out from the storage 103 position information for a plurality of different dates and times generated periodically within the contract period of the insurance, and generates movement data by arranging the read position information in chronological order.
  • the declaration unit 135 generates declaration data that includes movement data, the registration ID of the user U1, and the public key of the Prover, and is used for calculating insurance premiums or benefits.
  • the reporting unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21.
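  • A minimal sketch of assembling such movement data and declaration data is given below; the record layout and field names are assumptions for illustration.

      def build_movement_data(location_records: list) -> list:
          # Arrange the periodically generated location information in
          # chronological order (each record is assumed to carry a timestamp).
          return sorted(location_records, key=lambda rec: rec["timestamp"])

      def build_declaration_data(location_records: list, registration_id: str,
                                 prover_public_key: str) -> dict:
          return {
              "movement_data": build_movement_data(location_records),
              "registration_id": registration_id,
              "prover_address": prover_public_key,
          }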
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • the verification unit 232 of the server 12 requests verification of the declaration data. Specifically, the verification unit 232 requests the blockchain network 13, via the communication unit 207 and the network 21, to collate each piece of location information included in the movement data of the declaration data.
  • step S312 the blockchain network 13 verifies the declaration data and transmits the verification result. Specifically, the blockchain network 13 extracts from the blockchain a plurality of PoL blocks each containing location information that matches each location information requested to be verified. Blockchain network 13 transmits the extracted PoL blocks to server 12 via network 21 .
  • the CPU 201 of the server 12 receives the PoL block via the network 21 and the communication unit 207.
  • step S313 the server 12 calculates insurance premiums and the like based on the declaration data.
  • the verification unit 232 of the server 12 verifies the declaration data by the same processing as in step S105 of FIG. 13 described above.
  • the verification unit 232 supplies the movement data to the calculation unit 241 when it is determined that the movement data (each position information contained therein) included in the declaration data is authentic.
  • the calculation unit 241 detects the travel route and travel distance of the user U1 during the contract period based on the travel data.
  • the calculator 241 estimates the risk of the user U1 during the contract period based on the travel route and travel distance of the user U1.
  • calculation unit 241 may acquire information about the means of transportation of the user U1, or estimate the means of transportation based on the route and speed of movement of the user U1. Then, the calculation unit 241 may estimate the risk of the user U1 during the contract period, taking into consideration the means of transportation.
  • the calculation unit 241 calculates benefits such as insurance premiums for the next contract period or cashback for the current contract period based on the estimated risk.
  • the calculation unit 241 supplies information on the calculated insurance premium or benefits to the execution unit 242 .
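  • The specification does not define a concrete risk model; as a toy illustration, the sketch below derives a travel distance from the verified movement data and scales the next-period premium linearly with that distance.

      import math

      def haversine_km(p: dict, q: dict) -> float:
          # Great-circle distance between two location records, in kilometres.
          lat1, lon1, lat2, lon2 = map(math.radians,
                                       (p["latitude"], p["longitude"],
                                        q["latitude"], q["longitude"]))
          a = (math.sin((lat2 - lat1) / 2) ** 2
               + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
          return 2 * 6371.0 * math.asin(math.sqrt(a))

      def travel_distance_km(movement_data: list) -> float:
          return sum(haversine_km(a, b)
                     for a, b in zip(movement_data, movement_data[1:]))

      def next_period_premium(movement_data: list, base_premium: float = 10000.0,
                              rate_per_km: float = 1.0) -> float:
          # Toy model: premium grows linearly with the verified travel distance.
          return base_premium + rate_per_km * travel_distance_km(movement_data)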
  • step S314 the server 12 and the Prover execute settlement processing for insurance premiums and the like.
  • Specifically, the execution unit 242 of the server 12 and the control unit 131 of the Prover communicate with each other via the communication unit 207, the network 21, and the communication unit 110 to perform insurance premium claim processing, insurance premium payment processing, benefit granting processing, and the like.
  • step S331 the Prover executes the process of step S310 in FIG.
  • step S332 the verification unit 232 of the server 12 queries PoL records based on the declaration data.
  • the verification unit 232 generates a query requesting extraction of a plurality of PoL blocks each including the location information of the Prover that matches the location information included in the movement data of the declaration data.
  • the verification unit 232 transmits the generated query to the blockchain network 13 via the communication unit 207 and network 21 .
  • the blockchain network 13 receives the query via the network 21.
  • step S333 the blockchain network 13 extracts and transmits PoL records based on the query. Specifically, the blockchain network 13 extracts PoL blocks that match the conditions indicated by the query from the PoL blocks included in the blockchain. As a result, a plurality of PoL blocks each containing location information that matches each location information indicated in the query are extracted.
  • the blockchain network 13 transmits the extracted PoL blocks to the server 12 via the network 21.
  • the CPU 201 of the server 12 receives the extracted PoL block via the network 21 and the communication unit 207.
  • step S334 the server 12 executes the process of step S313 in FIG. 19 described above, and calculates insurance premiums and the like based on the declaration data.
  • step S335 the server 12 and the Prover execute the process of step S314 in FIG.
  • The calculation unit 241 may further use the conditions of the travel route of the user U1 (e.g., weather, congestion, past occurrence of accidents, etc.) and the purpose of travel (e.g., work, travel, sports, etc.) to estimate the risk.
  • user U1 can certify the travel route and travel time, and the certified travel route and travel time can be used for purposes other than risk segmentation insurance.
  • user U1 can use the travel route and travel time certified by the present technology when receiving recognition of a commuting accident.
  • user U1 can use the travel route and travel time certified by the present technology when proving that he was late due to an unforeseen factor.
  • the present technology can be applied, for example, when providing benefits of insurance (hereinafter referred to as leisure insurance) against injuries, damages, compensation, etc. that occur during leisure activities such as travel, skiing, golf, and hiking.
  • a service will be introduced that gives benefits to policyholders based on the locations visited by the user (policyholder) during the contract period.
  • For example, a service is assumed in which a policyholder of leisure insurance for golf or skiing receives benefits such as cashback (celebration money) or a discount on insurance premiums for the next contract period, depending on the number of golf courses or ski resorts visited during the contract period.
  • the contractor must prove that they actually visited the location.
  • this technology can be applied, for example, when calculating insurance claims for property and casualty insurance against fires and natural disasters.
  • the insurance money is calculated based on the damage situation.
  • the disaster situation is proved based on the image of the disaster area.
  • For example, it is possible for a user (contractor) to use an image of a disaster-stricken area to prove the disaster situation.
  • the user U1 is assumed to be a user who takes an image for proof and declares insurance money or benefits
  • the user U2 is assumed to be a user who exists around the user U1 when the image for proof is taken.
  • the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover
  • the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
  • step S401 the Prover activates APP1 and the Witness activates APP2, as in the process of step S1 in FIG.
  • step S402 the Prover captures a certification image and generates location information.
  • the photographing unit 107 of the Prover photographs a certification image in response to a user's operation on the operation unit 104 and supplies the corresponding image data to the CPU 101 .
  • the position detection unit 132 detects the current position of the Prover when the certification image was captured based on the position detection data output from the GNSS receiver 108 .
  • the position detection unit 132 generates position information including the position and time of the Prover when the certification image was captured.
  • the location detection unit 132 supplies the certification image and the location information to the location certification acquisition unit 151 .
  • the position detection unit 132 stores the certification image and the position information in the storage 103 in association with each other.
  • step S403 the Prover's location certification acquisition unit 151 generates a fingerprint of the image data of the certification image.
  • step S404 the Prover's location certification acquisition unit 151 generates a PoL request in the same manner as in step S4 of FIG. 5 described above.
  • the fingerprint of the image data of the certification image is stored in the PoL request as metadata_fingerprint.
  • steps S405 through S410 processing similar to steps S5 through S10 in FIG. 5 described above is executed.
  • a PoL Response corresponding to the PoL Request is generated by the Witness around the Prover, and a PoL Block containing the PoL Response is added to the blockchain. That is, the location information of the Prover at the time of photographing the certification image, the location certification information by the Witness, and the fingerprint of the image data of the certification image are recorded in the blockchain.
  • step S411 the declaration unit 135 of the Prover generates and transmits declaration data. Specifically, the declaration unit 135 generates declaration data including the image data of the certification image, the position information when the certification image was captured, the registration ID of the user U1, and the public key of the Prover. The declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21 .
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • steps S412 and S413 the same processes as steps S12 and S13 in FIG. 5 are executed.
  • the PoL block containing the location information included in the declaration data and the location information and the fingerprint matching the fingerprint of the image data of the certification image is extracted from the blockchain.
  • step S414 the server 12 analyzes the proof image and calculates the insurance money.
  • the verification unit 232 of the server 12 verifies the declaration data in the same manner as the process of step S14 in FIG. 5 described above.
  • When the verification unit 232 determines, as a result of the verification, that the declared data is authentic, that is, when it determines that the certification image was taken at the declared position and time, the verification unit 232 supplies the declared data to the calculation unit 241.
  • In the case of leisure insurance, for example, the calculation unit 241 analyzes the certification image and identifies the location visited by the contractor.
  • The calculation unit 241 then calculates the benefits to be given to the contractor based on the locations visited by the contractor.
  • In the case of property and casualty insurance, for example, the calculation unit 241 analyzes the certification image and estimates the disaster situation of the contractor.
  • The calculation unit 241 then calculates the insurance money to be paid to the policyholder based on the estimated disaster situation.
  • the calculation unit 241 supplies the execution unit 242 with data indicating the calculation result of the insurance money or benefits.
  • step S415 the server 12 and the Prover execute settlement processing for insurance claims and the like.
  • Specifically, the execution unit 242 of the server 12 and the control unit 131 of the Prover communicate with each other via the communication unit 207, the network 21, and the communication unit 110 to perform insurance money payment processing, benefit granting processing, and the like.
  • In this way, while guaranteeing the authenticity of the proof image taken by the user (contractor), it is possible to use the proof image to appropriately calculate insurance money or benefits. In addition, the user can use the proof image to quickly receive the insurance money or benefits without performing troublesome procedures.
  • health promotion insurance is insurance that discounts insurance premiums and provides benefits such as cash back depending on the policyholder's health condition and efforts to improve health.
  • the present technology can be applied when discounting insurance premiums or granting benefits based on the policyholder's health promotion activities.
  • user U1 is assumed to be a user (contractor) for whom insurance premiums or benefits are calculated
  • user U2 is assumed to be a user existing around user U1.
  • the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover
  • the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
  • step S501 the Prover activates APP1, and the Witness activates APP2, as in the process of step S1 in FIG.
  • step S502 the Prover acquires activity data and generates location information. Specifically, the position detection unit 132 of the Prover acquires from the sensing unit 109 activity data, which is sensor data used for detecting the activity of the user U1. For example, when insurance premium discounts and benefits are provided according to the user's number of steps and walking distance, sensor data indicating the acceleration and angular velocity that represent the user's walking motion is used as the activity data.
  • the position detection unit 132 detects the current position of the Prover at the time of acquisition of the activity data based on the position detection data output from the GNSS receiver 108 .
  • the position detector 132 generates position information including the current position and current time of the Prover.
  • the location detection unit 132 supplies the activity data and the location information to the location certification acquisition unit 151 and causes the storage 103 to store the activity data and the location information in association with each other.
  • step S503 the Prover's location certification acquisition unit 151 generates a fingerprint of the activity data.
  • step S504 the Prover's location certification acquisition unit 151 generates a PoL request in the same manner as the processing in step S4 of FIG. 5 described above.
  • the activity data fingerprint is stored in the PoL request as metadata_fingerprint.
  • steps S505 to S510 the same processes as steps S5 to S10 in FIG. 5 described above are executed.
  • a PoL Response corresponding to the PoL Request is generated by the Witness around the Prover, and a PoL Block containing the PoL Response is added to the blockchain. That is, the location information of the Prover at the time of acquisition of the activity data, the location verification information by the Witness, and the fingerprint of the activity data are recorded in the blockchain.
  • The processes of steps S502 to S510 are repeatedly executed during the contract period while the user U1 is active. For example, every time the user U1 walks a predetermined number of steps (for example, 100 steps), the processing from step S502 to step S510 is executed. As a result, the activity data and location information of the user U1 during activity, and the location proof information for that location information, are recorded in the blockchain.
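  • A possible shape of this periodic, step-count-triggered recording is sketched below; the callback that issues the PoL request (and everything else about the interface) is an assumption for illustration.

      STEP_INTERVAL = 100  # "predetermined number of steps" from the text

      class ActivityRecorder:
          """Invokes a PoL-request callback every STEP_INTERVAL detected steps."""

          def __init__(self, issue_pol_request):
              # issue_pol_request(activity_sample, location_info) is assumed to
              # fingerprint the activity data and send a PoL request to nearby Witnesses.
              self.issue_pol_request = issue_pol_request
              self._steps_since_last = 0

          def on_step(self, activity_sample: dict, location_info: dict) -> None:
              self._steps_since_last += 1
              if self._steps_since_last >= STEP_INTERVAL:
                  self._steps_since_last = 0
                  self.issue_pol_request(activity_sample, location_info)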
  • step S511 the declaration unit 135 of the Prover generates and transmits declaration data.
  • Specifically, the reporting unit 135 generates reporting data including the activity data and movement data of the user U1 during activity, the registration ID of the user U1, and the Prover's public key.
  • the movement data is, for example, data in which the position information generated when the activity data is acquired is arranged in chronological order.
  • the declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21 .
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • step S512 the verification unit 232 of the server 12 requests verification of the declaration data. Specifically, the verification unit 232 generates a fingerprint for each piece of activity data included in the declaration data. The verification unit 232 then requests the blockchain network 13, via the communication unit 207 and the network 21, to collate each combination of the location information contained in the movement data and the fingerprint of the corresponding activity data.
  • step S513 the blockchain network 13 verifies the declaration data and transmits the verification result. Specifically, for each combination of location information and activity data fingerprints, the blockchain network 13 extracts PoL blocks containing matching location information and fingerprints from the blockchain. Blockchain network 13 transmits the extracted PoL blocks to server 12 via network 21 .
  • the CPU 201 of the server 12 receives each PoL block via the network 21 and the communication unit 207.
  • step S514 the server 12 calculates insurance premiums, etc., based on the declaration data.
  • the verification unit 232 of the server 12 verifies the declaration data by the same processing as in step S105 of FIG. 13 described above.
  • the verification unit 232 supplies the movement data to the calculation unit 241 when it is determined that the activity data included in the report data and the location information included in the movement data are authentic as a result of the verification.
  • the calculation unit 241 estimates the activity content of the user U1 during the contract period based on each combination of the position information and the activity data. For example, the calculation unit 241 estimates the walking distance or the like during the contract period of the user U1.
  • the calculation unit 241 calculates insurance premiums or benefits for user U1 based on the estimated activity content. For example, the calculation unit 241 calculates a discount on insurance premiums for the next contract period of user U1, an amount to be cashed back for insurance premiums for the current contract period, and the like, based on the estimated activity content.
  • the calculation unit 241 supplies the calculation result of insurance premiums or benefits to the execution unit 242 .
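  • As a toy illustration of such a calculation (the actual discount rules are not specified), a discount could grow with the verified walking distance and be capped at a maximum rate:

      def premium_discount(walking_distance_km: float,
                           base_premium: float = 10000.0,
                           max_discount_rate: float = 0.2,
                           target_km: float = 300.0) -> float:
          # Discount grows linearly with the verified walking distance during the
          # contract period, capped at max_discount_rate of the base premium.
          rate = min(walking_distance_km / target_km, 1.0) * max_discount_rate
          return base_premium * rate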
  • step S515 the server 12 and the Prover execute settlement processing for insurance premiums, etc., similar to the processing in step S314 of FIG. 19 described above.
  • each information processing terminal 11 saves the obtained image data in the server 12 while shooting, and records the location information and location certification information at the time of shooting in the blockchain. It is conceivable to use the image data stored in the server 12 as eyewitness information when an accident occurs, using the location information and location certification information recorded in the blockchain.
  • step S601 the communication unit 110 of the Prover checks the communication status with surrounding witnesses.
  • step S602 the photographing unit 107 of the Prover photographs an image (still image or moving image).
  • the photographing unit 107 supplies image data corresponding to the photographed image to the CPU 101 .
  • step S603 the image registration unit 153 of the Prover encrypts and uploads the image data. Specifically, the image registration unit 153 encrypts the image data using the private key held by the Prover immediately after shooting is completed, and transmits the encrypted image data to the server 12 via the communication unit 110 and the network 21.
  • the CPU 201 of the server 12 receives the image data via the network 21 and the communication unit 207.
  • step S604 the position detection unit 132 of the Prover generates position information and calculates a hash value of the encrypted image data. Specifically, the position detector 132 detects the current position of the Prover based on the position detection data output from the GNSS receiver 108 . The position detector 132 generates position information including the current position and current time of the Prover. The image registration unit 153 calculates a hash value of the image data encrypted in step S603. The image registration unit 153 supplies the calculated hash value to the position detection unit 132 .
  • step S605 the information collection unit 233 of the server 12 saves the encrypted image and its hash value. Specifically, the information collecting unit 233 calculates a hash value of the encrypted image data received in the process of step S603. The information collection unit 233 issues an image ID for accessing the encrypted image data. The information collection unit 233 associates the image ID, the encrypted image data, and the hash value, and stores them in the insurance DB 204 .
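  • A minimal sketch of this server-side step is shown below; the in-memory store stands in for the insurance DB 204, and SHA-256 and the UUID-based image ID are assumptions.

      import hashlib
      import uuid

      IMAGE_STORE = {}  # stands in for the insurance DB 204 in this sketch

      def save_encrypted_image(encrypted_image: bytes) -> str:
          # Store the encrypted image together with its hash value and return an
          # image ID that can later be used to access the stored image.
          image_id = uuid.uuid4().hex
          IMAGE_STORE[image_id] = {
              "ciphertext": encrypted_image,
              "hash": hashlib.sha256(encrypted_image).hexdigest(),
          }
          return image_id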
  • step S606 the information collection unit 233 of the server 12 transmits to the Prover, via the communication unit 207 and the network 21, an image ID that allows access to the image saved in step S605.
  • the Prover's CPU 101 receives the image ID via the network 21 and the communication unit 110 .
  • the position detection unit 132 of the Prover associates the position information and the encrypted image data obtained in the process of step S604 with the received image ID and stores them in the storage 103 .
  • the position detection unit 132 also supplies the position information and the image data to the position certification acquisition unit 151 .
  • the Prover generates and transmits a PoL request in the same manner as the process at step S33 in FIG. 10 described above.
  • At this time, a fingerprint of metadata including the image data and the sensor data of a motion sensor that detects the movement of the Prover (hereinafter referred to as motion data) is stored in the PoL request as metadata_fingerprint.
  • steps S608 through S613 the same processes as in steps S34 through S39 of FIG. 10 described above are executed.
  • a PoL Response corresponding to the PoL Request is generated by the Witness around the Prover, and a PoL Block containing the PoL Response is added to the blockchain. That is, the location information at the time the image was taken and the location proof information by Witness are recorded in the blockchain together with the hash value of the image data and the fingerprint of the metadata including the motion data.
  • the information processing terminal 11 of the accident party (accident victim or perpetrator) is simply referred to as the accident party.
  • the information processing terminal 11 of the accident eyewitness is simply referred to as the accident eyewitness.
  • step S631 the party involved in the accident generates report data and transmits it to the server 12 in the same manner as in the process of step S211 in FIG.
  • the reporting data includes, for example, location information of the parties involved in the accident and accident data when the accident occurred.
  • the accident data includes, for example, image data corresponding to images captured when the accident occurred, and motion data acquired when the accident occurred.
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • the verification unit 232 of the server 12 generates a fingerprint of the accident data included in the report data.
  • step S633 the server 12 makes a PoL record query based on the declaration data, similar to the processing in step S233 of FIG. 18 described above.
  • step S634 the blockchain network 13 extracts PoL records based on the query and transmits them to the server 12, similar to the process of step S234 in FIG.
  • step S635 the server 12 verifies the report data and estimates the accident eyewitness by the same processing as in step S235 of FIG. 18 described above.
  • step S636 the server 12 generates an eyewitness questionnaire through the same processing as in step S236 of FIG. 18 described above, and broadcasts it to the accident eyewitnesses.
  • the accident eyewitness CPU 101 receives the eyewitness questionnaire via the network 21 and the communication unit 110 .
  • step S637 the eyewitness to the accident confirms the public key of the eyewitness questionnaire by the same process as in step S237 of FIG. 18 described above.
  • step S638 the accident eyewitness information provision unit 162 generates responses to the eyewitness questionnaire and transmits them to the server 12, in the same manner as in the process of step S216 in FIG.
  • In the example of FIG. 14 described above, the responses to the eyewitness questionnaire included the image data itself, taken near the accident site before and after the accident.
  • Here, by contrast, the responses to the eyewitness questionnaire include the image ID given by the server 12 to the image data corresponding to the images taken near the accident site before and after the accident.
  • Responses to the eyewitness questionnaire also include decryption keys corresponding to encryption keys used to encrypt image data corresponding to images taken near the accident site before and after the accident.
  • the CPU 201 of the server 12 receives the responses to the eyewitness questionnaire via the network 21 and the communication unit 207.
  • step S639 the information collection unit 233 of the server 12 verifies the answers to the eyewitness questionnaire and acquires images. Specifically, the information collecting unit 233 verifies the answers to the eyewitness questionnaire by the same processing as in step S239 of FIG. 18 described above.
  • When the information collecting unit 233 determines that the answers to the eyewitness questionnaire are authentic, it acquires the image data corresponding to the image ID included in the answers from the insurance DB 204.
  • the information collecting unit 233 decrypts the acquired image data using the decryption key included in the answers to the eyewitness questionnaire. As a result, image data corresponding to images taken near the accident site before and after the accident is acquired.
  • steps S640 and S641 processing similar to steps S240 and S241 in FIG. 18 described above is executed.
  • the image data obtained by the information processing terminal 11 is encrypted and stored in the server 12 .
  • Fingerprints of metadata including the image data and motion data are also recorded on the blockchain in association with the location information and location proof information at the time of shooting. This guarantees the authenticity of the image data and of the location information at the time of shooting. Therefore, image data whose authenticity is guaranteed can be used as eyewitness information.
  • Since the image data is stored in the server 12, even if the image data stored in the information processing terminal 11 is deleted, it can still be used later as eyewitness information.
  • the position information may be generated at a predetermined timing other than the end of the above-described image capturing and recorded together with the position certification information during the period from the start to the end of image capturing. Further, when shooting a moving image, location information may be generated at a plurality of timings from the start to the end of shooting the moving image and recorded together with the location certification information.
  • When the information processing terminal 11 is a mobile information terminal such as a smartphone, the above APP is not necessarily running and able to communicate at all times. Therefore, even if an information processing terminal 11 exists around the Prover, it does not necessarily become a Witness.
  • the number of Witnesses around each information processing terminal 11 can be increased. This makes it possible to increase the reliability of the location proof information.
  • the information processing terminal 11 installed outdoors or indoors may operate only as a Witness, or may operate as a Prover. In the latter case, for example, the information processing terminal 11 normally operates as a Witness, and operates as a Prover when requested by another information processing terminal 11 .
  • FIG. 25 shows a configuration example of an information processing system 401 that is a second embodiment of an information processing system to which the present technology is applied.
  • the information processing system 401 is a system that collects image data corresponding to images taken near the accident site before and after the occurrence of the accident.
  • the information processing system 401 includes accident trigger generators 411-1 to 411-m, camera 412-1 to camera 412-n, and a server 413.
  • The accident trigger generators 411-1 to 411-m, the cameras 412-1 to 412-n, and the server 413 are connected via a network 421 and can communicate with each other.
  • The accident trigger generators 411-1 to 411-m and the cameras 412-1 to 412-n can also communicate directly with each other using short-range wireless communication, without going through the network 421.
  • the accident trigger generators 411-1 to 411-m are simply referred to as the accident trigger generator 411 when there is no need to distinguish them individually.
  • the cameras 412-1 to 412-n are simply referred to as the camera 412 when there is no need to distinguish them individually.
  • the accident trigger generator 411 is composed of, for example, a portable information processing device that can be carried by the user or worn by the user.
  • the accident trigger generator 411 performs an accident detection process, and when an accident is detected, transmits a request trigger to the camera 412 present in the surroundings to request position proof.
  • the photographing device 412 is configured by an information processing device having a photographing function and a communication function.
  • the photographing device 412 is configured by a portable information processing device that can be carried by the user or worn by the user.
  • the camera 412 is configured by a smart phone, a mobile phone, a tablet terminal, a wearable device, an action camera, a portable game machine, or the like.
  • Alternatively, the camera 412 is configured by an information processing device, such as a drive recorder, that is mounted on a mobile object such as a vehicle (including a two-wheeled vehicle) and photographs and records the surroundings of the mobile object.
  • the camera 412 is configured by, for example, a dedicated camera installed at an arbitrary location outdoors or indoors.
  • the camera 412 receives the private key from the server 413 via the network 421 .
  • the image capturing device 412 captures images of the surroundings, and superimposes a watermark on the obtained image data using the secret key received from the server 413 .
  • the camera 412 stores the watermark-superimposed image data in association with position information indicating the shooting position and shooting time.
  • the camera 412 collects image data corresponding to the images captured during the period before and after receiving the request trigger (near the time when the position proof is requested), and the image Location information associated with the data is transmitted to server 413 via network 421 .
  • the server 413 generates a private key and transmits it to the camera 412 via the network 421.
  • the server 413 receives image data and location information from the camera 412 via the network 421 and verifies the received image data. If the image data is valid, the server 413 creates a PoL block containing the image data and location information and adds it to the blockchain.
  • the server 413 also transmits the PoL block to other nodes (not shown) that make up the blockchain network to add it to the blockchain.
  • FIG. 26 is a block diagram showing a functional configuration example of the accident trigger generator 411.
  • the accident trigger generator 411 includes a CPU 501, a memory 502, a storage 503, an operation unit 504, a display unit 505, a speaker 506, an imaging unit 507, a sensing unit 508, a communication unit 509, an external I/F 510, and a drive 511.
  • the CPU 501 to drive 511 are connected to a bus and perform necessary communications with each other.
  • the CPU 501 through the drive 511 are configured similarly to the CPU 101 through the imaging unit 107 and the sensing unit 109 through the drive 112 of the information processing terminal 11 in FIG.
  • the program executed by the CPU 501 can be recorded in advance in the storage 503 as a recording medium built into the accident trigger generator 411 .
  • the program can be stored (recorded) in the removable media 510A, provided as package software, and installed in the accident trigger generator 411 from the removable media 510A.
  • the program can be downloaded from another server (not shown) or the like via the network 421 and the communication unit 509 and installed in the accident trigger generator 411.
  • the control unit 531 controls the processing of each unit of the accident trigger generator 411.
  • The accident detection unit 532 detects an accident related to the user of the accident trigger generator 411 or an accident occurring around the accident trigger generator 411. When detecting an accident, the accident detection unit 532 transmits a request trigger by BT to the cameras 412 around the accident trigger generator 411 via the communication unit 509.
  • FIG. 27 is a block diagram showing a functional configuration example of the camera 412.
  • the imaging device 412 includes a CPU 601, a memory 602, a storage 603, an operation unit 604, a display unit 605, a speaker 606, an imaging unit 607, a GNSS receiver 608, a sensing unit 609, a communication unit 610, an external I/F 611, and a drive 612. Prepare.
  • the CPU 601 to drive 612 are connected to a bus and perform necessary communications with each other.
  • the CPU 601 to drive 612 are configured similarly to the CPU 101 to drive 112 of the information processing terminal 11 in FIG.
  • the program executed by the CPU 601 can be recorded in advance in the storage 603 as a recording medium built into the camera 412 .
  • the program can be stored (recorded) in the removable media 612A, provided as package software, and installed in the camera 412 from the removable media 612A.
  • the program can be downloaded from another server (not shown) or the like via the network 421 and the communication unit 610 and installed in the camera 412 .
  • Functions including a control unit 631, a position detection unit 632, a watermark superimposition unit 633, and a position verification unit 634 are realized by the CPU 601 executing the program installed in the camera 412.
  • a control unit 631 controls processing of each unit of the camera 412 .
  • the position detection unit 632 detects the current position of the camera 412 based on the position detection data output from the GNSS receiver 608 .
  • the position detection unit 632 generates position information including the detected current position and current time.
  • the watermark superimposing unit 633 receives the secret key from the server 413 via the network 421 and the communication unit 610.
  • the watermark superimposing unit 633 generates a watermark using the secret key and superimposes the watermark on the image data supplied from the shooting unit 607.
  • the watermark superimposing unit 633 associates the watermark-superimposed image data with position information indicating the shooting position of the image data and stores them in the storage 603 .
  • the location certification unit 634 acquires from the storage 603 image data of a predetermined interval before and after receiving the request trigger and location information associated with the image data. .
  • the location certification unit 634 transmits location certification information including the acquired image data and location information to the server 413 via the communication unit 610 and the network 421.
  • FIG. 28 is a block diagram showing a functional configuration example of the server 413.
  • the server 413 includes a CPU 701 , a memory 702 , a storage 703 , an image DB (Data Base) 704 , an operation section 705 , a display section 706 , a communication section 707 , an external I/F 708 and a drive 709 .
  • the CPU 701 to drive 709 are connected to a bus and perform necessary communications with each other.
  • the CPU 701 to storage 703 and operation unit 705 to drive 709 are configured in the same manner as the CPU 201 to storage 203 and operation unit 205 to drive 209 of the server 12 in FIG.
  • the image DB 704 accumulates map information, image data around each position of the map information, and feature data indicating features of each image data.
  • the program executed by the CPU 701 can be recorded in advance in the storage 703 as a recording medium incorporated in the server 413 .
  • the program can be stored (recorded) in the removable media 709A, provided as package software, and installed in the server 413 from the removable media 709A.
  • the program can be downloaded from another server (not shown) or the like via the network 421 and the communication unit 707 and installed on the server 413 .
  • Functions including a control unit 731, a position verification unit 732, and a PoL block generation unit 733 are realized by the CPU 701 executing a program installed in the server 413.
  • the control unit 731 controls the processing of each unit of the server 413.
  • The location verification unit 732 verifies the watermark superimposed on the image data included in the location certification information received from the camera 412, and also verifies the location information included in the location certification information by comparing the image data included in the location certification information with the image data accumulated in the image DB 704.
  • the position verification section 732 includes a private key generation section 741 , a watermark extraction section 742 , a watermark verification section 743 , a feature extraction section 744 and a feature verification section 745 .
  • a secret key generation unit 741 generates a secret key and transmits it to the camera 412 via the communication unit 707 and network 421 .
  • the watermark extraction unit 742 extracts the watermark from the image data received from the camera 412 using the secret key generated by the secret key generation unit 741 and supplies the extracted watermark to the watermark verification unit 743 .
  • a watermark verification unit 743 verifies the watermark extracted from the image data and supplies the verification result to the feature extraction unit 744 .
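  • The watermark scheme itself is not specified; purely as an illustration, the sketch below embeds an HMAC-SHA256 tag keyed with the shared secret key into the least significant bits of a frame, and the matching check stands in for the extraction and verification performed by the watermark extraction unit 742 and the watermark verification unit 743.

      import hashlib
      import hmac
      import numpy as np

      def superimpose_watermark(frame: np.ndarray, secret_key: bytes) -> np.ndarray:
          # frame is assumed to be a uint8 array; the HMAC tag of the untouched
          # pixels is written into the LSBs of the first 256 bytes (one bit each).
          out = frame.copy()
          flat = out.reshape(-1)
          tag = hmac.new(secret_key, flat[256:].tobytes(), hashlib.sha256).digest()
          bits = np.unpackbits(np.frombuffer(tag, dtype=np.uint8))  # 256 bits
          flat[:256] = (flat[:256] & 0xFE) | bits
          return out

      def watermark_is_valid(frame: np.ndarray, secret_key: bytes) -> bool:
          # Server-side check: recompute the tag and compare it with the embedded bits.
          flat = frame.reshape(-1)
          embedded = np.packbits(flat[:256] & 0x01).tobytes()
          expected = hmac.new(secret_key, flat[256:].tobytes(), hashlib.sha256).digest()
          return hmac.compare_digest(embedded, expected)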
  • the feature extraction unit 744 extracts features of the image data received from the camera 412 and supplies data indicating the extraction result of the features of the image data to the feature verification unit 745 .
  • the feature verification unit 745 verifies the position information received from the camera 412 by comparing the features extracted from the image data received from the camera 412 and the features of the image data stored in the image DB 704 . .
  • the PoL block generation unit 733 mines the image data and position information received from the camera 412, generates a PoL block, and adds it to the blockchain. In addition, the PoL block generation unit 733 transmits the PoL block to other nodes constituting the blockchain network to add it to the blockchain.
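  • A minimal sketch of the block structure the PoL block generation unit 733 might produce is shown below; the field names, the hashing, and the absence of a real consensus step are all simplifications made for illustration.

      import hashlib
      import json
      import time

      def block_hash(block: dict) -> str:
          return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

      def generate_pol_block(previous_block: dict, image_data: bytes,
                             location_info: dict) -> dict:
          # Chain the new PoL block to the previous one and record the hash of the
          # received image data together with the location information.
          return {
              "previous_hash": block_hash(previous_block),
              "timestamp": int(time.time()),
              "image_hash": hashlib.sha256(image_data).hexdigest(),
              "location_info": location_info,
          }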
  • step S701 the private key generation unit 741 of the server 413 generates a private key.
  • step S702 the private key generation unit 741 of the server 413 shares the private key. Specifically, the private key generation unit 741 transmits the private key to the camera 412 via the communication unit 707 and network 421 .
  • the watermark superimposing unit 633 of the camera 412 receives the secret key via the communication unit 610 and stores it in the storage 603.
  • the secret key is shared between the server 413 and the camera 412.
  • Any method of transmitting the private key to the camera 412 can be adopted as long as the private key can be transmitted safely and secretly.
  • step S703 the image capturing device 412 starts capturing moving images, superimposing watermarks, and storing position information.
  • the shooting unit 607 shoots a moving image and starts processing to supply the obtained moving image data to the watermark superimposing unit 633 .
  • the position detection unit 632 starts processing for detecting the current position of the camera 412 based on the position detection data output from the GNSS receiver 608 .
  • the position detection unit 632 generates position information including the detected current position and current time, and starts processing to supply the generated position information to the watermark superimposition unit 633 .
  • the watermark superimposing unit 633 generates a watermark using the secret key stored in the storage 603 and starts superimposing it on each frame of the video data. Also, the watermark superimposing unit 633 associates each frame of the watermark superimposed moving image data with the position information, and starts the process of storing them in the storage 603 .
  • Note that the location information may be associated with every predetermined number of frames instead of with every frame.
  • In step S704, the accident detection unit 532 of the accident trigger generator 411 detects the occurrence of an accident based on at least one of the image data output from the imaging unit 507 and the sensor data output from the sensing unit 508.
  • Accidents to be detected include not only accidents related to the user who owns the accident trigger generator 411 but also accidents occurring around the accident trigger generator 411.
  • In step S705, the accident detection unit 532 of the accident trigger generator 411 scans for surrounding cameras 412 (Witnesses) via the communication unit 509. For example, the accident detection unit 532 scans for cameras 412 within the BT communication range of the communication unit 509.
  • In step S706, the accident detection unit 532 of the accident trigger generator 411 transmits a request trigger. Specifically, the accident detection unit 532 transmits the request trigger by BT via the communication unit 509 to the camera 412 detected in the process of step S705.
  • the CPU 601 of the camera 412 receives the request trigger via the communication unit 610.
  • In step S707, the location certification unit 634 of the camera 412 confirms whether or not moving image data for the target period exists. Specifically, the location certification unit 634 checks whether moving image data for a predetermined period before and after the reception of the request trigger (that is, before and after the accident was detected by the accident trigger generator 411) is stored in the storage 603.
  • In step S708, the camera 412 transmits the moving image data and the position information to the server 413.
  • Specifically, the location certification unit 634 reads from the storage 603 the moving image data for the predetermined target period before and after the reception of the request trigger, together with the position information associated with each frame of the moving image data.
  • the location certification unit 634 transmits location certification information including the read moving image data and position information to the server 413 via the communication unit 610 and the network 421.
  • the CPU 701 of the server 413 receives the location certification information via the network 421 and the communication unit 707.
  • In step S709, the server 413 performs position verification processing.
  • In step S710, the server 413 performs mining and generates a PoL block.
  • In step S711, the server 413 verifies the PoL block and adds it to the blockchain.
  • In step S731, the watermark extraction unit 742 extracts the watermark from the moving image data. Specifically, the watermark extraction unit 742 extracts the watermark from each frame of the moving image data included in the location certification information received from the camera 412, using the private key generated in the process of step S701. The watermark extraction unit 742 supplies the extracted watermark to the watermark verification unit 743.
  • In step S732, the watermark verification unit 743 determines whether the watermark extracted from the moving image data is valid. If the watermark is determined to be valid, the process proceeds to step S733.
  • In step S733, the feature extraction unit 744 extracts features of the moving image data. Specifically, the watermark verification unit 743 first notifies the feature extraction unit 744 that the watermark of the moving image data is valid.
  • the feature extraction unit 744 uses an arbitrary method to extract features of each frame of video data.
  • the features extracted at this time are features that change little over time.
  • For example, the feature extraction unit 744 uses SIFT (Scale-Invariant Feature Transform) features or the like to extract feature points of stationary objects, such as buildings and mountains, that change little over time.
  • the feature extraction unit 744 recognizes characters such as guide signs and signboards in the moving image, and extracts the recognized characters as features.
  • the feature extraction unit 744 supplies the feature verification unit 745 with data indicating the extraction result of the feature of the moving image data.
  • In step S734, the feature verification unit 745 refers to the image DB 704 using the position information received from the camera 412 as a key. That is, the feature verification unit 745 extracts from the image DB 704, as the image data corresponding to each frame of the received moving image data, a plurality of pieces of image data captured at the positions indicated by the position information associated with each frame. The feature verification unit 745 also extracts from the image DB 704 the feature data indicating the features of each piece of extracted image data.
  • In step S735, the feature verification unit 745 determines whether the correlation between the features of the moving image data and the features in the image DB 704 is equal to or greater than a threshold. Specifically, the feature verification unit 745 calculates the correlation between the features extracted from each frame of the moving image data and the features of the corresponding image data extracted from the image DB 704 (a minimal sketch of this check, together with the watermark check, is given after this list). When the feature verification unit 745 determines that the calculated correlation is equal to or greater than a predetermined threshold, that is, when the features of each frame of the moving image data are similar to the features of the corresponding image data in the image DB 704, the process proceeds to step S736.
  • In this case, it is verified that the moving image data received from the camera 412 corresponds to a moving image captured at the position indicated by the position information received from the camera 412, and therefore that the camera 412 was present at the position indicated by the position information when the moving image was captured. That is, the authenticity of the moving image data and the position information included in the location certification information is verified.
  • In step S736, the PoL block generation unit 733 performs mining and generates a PoL block. Specifically, the feature verification unit 745 notifies the PoL block generation unit 733 that the moving image data and position information received from the camera 412 are authentic.
  • the PoL block generation unit 733 mines the moving image data and the position information by a predetermined method and generates a PoL block including the location certification information.
  • In step S737, the PoL block generation unit 733 verifies the PoL block and adds it to the blockchain. Specifically, the PoL block generation unit 733 verifies the PoL block by a predetermined method and, when determining that the PoL block is valid, adds the PoL block to the blockchain. In addition, when the PoL block generation unit 733 determines that the PoL block is valid, it transmits the PoL block to the other nodes constituting the blockchain network so that they add the PoL block to their blockchains.
  • On the other hand, if the feature verification unit 745 determines in step S735 that the calculated correlation is less than the predetermined threshold, that is, if the features of each frame of the moving image data and the features of the corresponding image data in the image DB 704 are not similar, the processes of steps S736 and S737 are skipped and the position verification process ends. This is the case where at least one of the moving image data and the position information included in the location certification information is not authentic.
  • the camera 412 can transmit still image data instead of moving image data.
  • the accident trigger generator 411 and the camera 412 may be included in one housing to constitute one device. In this case, for example, it is possible to delete one of the overlapping functions.
  • the accident trigger generator 411 can transmit a witness request trigger when a trigger other than an accident (e.g., a predetermined event or predetermined timing) is detected. This makes it possible to collect moving image data and position information captured before and after the occurrence of a predetermined trigger.
  • the position of the information processing terminal 11 is represented by latitude and longitude, but the position of the information processing terminal 11 may be represented by other methods.
  • the position of the information processing terminal 11 may be represented using Geohash, S2 Geometry, etc., in consideration of the ease of searching for the position, the protection of privacy, and the like.
  • the position of the information processing terminal 11 may be represented by adding the altitude to the latitude and longitude.
  • the method of detecting the position of the information processing terminal 11 is not limited to the method using the GNSS receiver 108 described above, and other methods can be used.
  • the server 12 in FIG. 1 may constitute one of the nodes of the blockchain network 13.
  • this technology can be used in technologies and situations other than the insurance example described above where it is necessary to guarantee the authenticity of a user's location.
  • the program executed by the computer may be a program in which processing is performed in chronological order according to the order described in this specification, or may be a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
  • in this specification, a system means a set of multiple components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • this technology can take the configuration of cloud computing in which one function is shared by multiple devices via a network and processed jointly.
  • each step described in the flowchart above can be executed by a single device, or can be shared by a plurality of devices.
  • when one step includes multiple processes, the multiple processes included in that one step can be executed by one device or shared by multiple devices.
  • (1) An information processing apparatus comprising: a position detection unit that detects a current position when a predetermined trigger is detected and generates first position information including the current position and the current time; a location certification acquisition unit that, when the trigger is detected, requests location certification from a first information processing device present in the vicinity and receives first location certification information from the first information processing device; and a location registration unit that transmits the first location information and the first location certification information to a second information processing device that records the first location information and the first location certification information.
  • (2) The information processing device according to (1), wherein the location certification acquisition unit transmits a location certification request including the first location information to the first information processing device by short-range wireless communication, and receives a location certification response including the first location certification information from the first information processing device.
  • (3) The information processing device according to (2), wherein the location certification request includes the first location information, a public key, and an electronic signature generated from a plaintext including the first location information using a private key corresponding to the public key.
  • (4) The information processing device according to (3), wherein the location certification request further includes metadata associated with the first location information or a fingerprint of the metadata, and the plaintext further includes the metadata or the fingerprint.
  • the metadata includes at least one of: accident data used to detect an accident, a hash value of the accident data, image data corresponding to an image of the surroundings, a hash value of the image data, activity data used to detect user activity, and a hash value of the activity data.
  • the information processing device according to (4) or (5).
  • An image registration unit that transmits the image data encrypted using the encryption key to a third information processing device and receives an image ID assigned to the image data from the third information processing device.
  • the position detection unit detects a current position and generates the first position information when detecting a predetermined timing from the start to the end of capturing an image corresponding to the image data
  • the location certification acquisition unit requests location certification from the first information processing device when the timing is detected, and receives the first location certification information from the first information processing device.
  • an information providing unit configured to, when the third information processing device requests image data corresponding to an image captured near the position and time indicated by the first position information, transmit the image ID corresponding to the corresponding image data and a decryption key corresponding to the encryption key to the third information processing device.
  • the information processing apparatus according to (7).
  • the location certification acquisition unit receives the location certification response from the first information processing device; the information processing device according to any one of (3) to (8).
  • the location certification response includes second location information including the current location and current time of the first information processing device, a public key, and an electronic signature generated from a plaintext including the second location information using a private key corresponding to the public key; and when the location registration unit determines, using the public key and the electronic signature, that the location certification response is valid, the location registration unit transmits a location information transaction including the first location information and the first location certification information to the second information processing device; the information processing device according to (2).
  • the location proof response further includes the location proof request; said plaintext further comprising said location proof request;
  • the location certification acquisition unit transmits the location certification request to the first information processing device existing within a communication range of the short-range wireless communication.
  • the first location proof information includes second location information including the current location and current time of the first information processing device.
  • the information processing device includes movement data including the first position information in time series.
  • the report data is used for calculating premiums, insurance benefits, or benefits of the insurance.
  • the trigger is a predetermined event or predetermined timing.
  • the information processing device according to any one of (1) to (17), wherein the first location information and the first location proof information are recorded in a block chain by the second information processing device.
  • the location certification unit receives, from the first information processing device by short-range wireless communication, a location certification request including first location information that includes the current location and current time of the first information processing device, and transmits a location certification response including the location certification information to the first information processing device; the information processing device according to (21).
  • the location certification request includes the first location information, a public key, and an electronic signature generated from plaintext including the first location information using a private key corresponding to the public key;
  • the location certification unit transmits the location certification response to the first information processing device when determining that the location certification request is valid using the public key and the electronic signature.
  • information processing equipment further comprising a location detection unit that detects a current location and generates second location information including the current location and the current time when the request for location certification is received;
  • the location proof response includes the second location information and a public key, and a first digital signature generated from a first plaintext containing the second location information using a private key corresponding to the public key.
  • the location proof response further includes the location proof request;
  • the location proof response further includes metadata associated with the second location information or a fingerprint of the metadata;
  • when image data corresponding to an image taken near the position and time indicated by the first position information is requested from the second information processing device that has acquired the first position information and the location certification information; the information processing device according to (24).
  • the information processing apparatus according to any one of (21) to (28), wherein the location proof information includes the location information.
  • the location certification information includes image data superimposed with a watermark corresponding to an image of the surroundings captured around the time when the request for location certification was received, and the location information;
  • the information processing apparatus according to (29), wherein the location certification unit transmits the location certification information to a second information processing apparatus that records the location certification information.
  • A program for causing a computer to execute processing of, when a request for location certification transmitted by an information processing device that has detected a predetermined trigger toward other information processing devices in the vicinity is received, generating location certification information and transmitting the location certification information.
  • (33) An information processing apparatus comprising: a verification unit that verifies report data received from a first information processing device, the report data including first position information including the current position and current time of the first information processing device when the first information processing device detected a predetermined trigger, using location certification information generated, in response to a location certification request from the first information processing device, by a second information processing device that was present in the vicinity of the first information processing device when the trigger was detected; and an execution unit that executes processing corresponding to the report data when the verification unit determines that the report data is authentic.
  • the verification unit acquires the location proof information from a third information processing device that records a location proof block containing the first location information and the location proof information, based on the first location information.
  • the claim data further includes metadata associated with the first location information;
  • the location proof block includes the first location information, the location proof information, and the metadata or a fingerprint of the metadata;
  • the information processing device according to (34) wherein the verification unit acquires the location proof information from the second information processing device based on the first location information and the metadata or the fingerprint.
  • the information processing device according to (35) wherein the execution unit executes processing corresponding to the report data based on the first position information and the metadata.
  • the information processing device according to (34) or (35), wherein the location proof block is recorded in a blockchain.
  • the information processing apparatus according to (33), further comprising an information collecting unit.
  • the position proof information includes second position information including the current position and current time of the second information processing device when the position proof is requested from the first information processing device,
  • the information collection unit extracts the first location information, the location proof information, a public key, and a first plaintext containing the first location information and the location proof information to a private key corresponding to the public key.
  • the information collecting unit sends the second position information and the public key to the second information processing device to request the image data, and obtains the image data and a second plaintext including the image data.
  • (40) The information processing device according to (33), further comprising an information collecting unit that receives image data encrypted using an encryption key from a third information processing device, transmits an image ID assigned to the image data to the third information processing device, and stores the encrypted image data and the image ID in a database in association with each other, wherein the information collecting unit requests, from the third information processing device, an image taken near the position and near the time indicated by the first position information, and receives the image ID corresponding to the corresponding image data and a decryption key corresponding to the encryption key from the third information processing device.
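As referenced in the description of steps S731 to S735 above, the sketch below illustrates, only under stated assumptions, the two checks the server 413 applies to a received clip: a keyed watermark check and a feature-correlation check against reference imagery stored for the reported position. The HMAC-based watermark, the histogram features standing in for SIFT keypoints or recognized sign text, the NumPy dependency, and the 0.5 threshold are stand-ins chosen for the sketch; the patent only requires a watermark tied to the shared private key and a feature correlation compared against a predetermined threshold.

import hashlib, hmac
import numpy as np

def frame_watermark(secret_key: bytes, frame: np.ndarray) -> bytes:
    # Keyed watermark stand-in: an HMAC over the frame contents. The camera
    # would embed this; the server recomputes it with the shared private key.
    return hmac.new(secret_key, frame.tobytes(), hashlib.sha256).digest()

def watermark_is_valid(secret_key: bytes, frame: np.ndarray, embedded: bytes) -> bool:
    return hmac.compare_digest(frame_watermark(secret_key, frame), embedded)

def frame_features(frame: np.ndarray, bins: int = 32) -> np.ndarray:
    # Crude scene descriptor: a normalised intensity histogram (a stand-in for
    # SIFT features of static objects or characters recognised on signboards).
    hist, _ = np.histogram(frame, bins=bins, range=(0, 255), density=True)
    return hist

def position_is_plausible(frame: np.ndarray, reference: np.ndarray,
                          threshold: float = 0.5) -> bool:
    # Compare the clip's features with reference imagery stored for the
    # reported position and accept when the correlation clears the threshold.
    corr = np.corrcoef(frame_features(frame), frame_features(reference))[0, 1]
    return corr >= threshold

key = b"shared-secret-key"
frame = np.random.default_rng(0).integers(0, 256, (120, 160), dtype=np.uint8)
wm = frame_watermark(key, frame)
print(watermark_is_valid(key, frame, wm), position_is_plausible(frame, frame))

In this toy run both checks pass because the frame is compared against itself; in the described flow the reference would come from the image DB 704 and the threshold would be tuned to the chosen feature type.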

Abstract

The present technology pertains to an information processing device and a program which can guarantee the authenticity of location information from an information processing device without using a dedicated device. The information processing device comprises: a location detection unit which detects the current location when detecting a prescribed trigger, and generates first location information including the current location and the current time; a location certification acquisition unit which requires location certification of a first information processing device that is present in the periphery when detecting the trigger, and receives first location certification information from the first information processing device; and a location registration unit which transmits the first location information and the first location certification information to a second information processing device that records the first location information and the first location certification information. The present technology can be applied to, for example, a system which performs reports pertaining to insurance.

Description

Information processing device and program
The present technology relates to an information processing device and a program, and more particularly to an information processing device and a program that can guarantee the authenticity of position information of the information processing device.
Conventionally, techniques have been proposed for guaranteeing the authenticity of position information of a portable information terminal by using a dedicated device, such as an IC card or a position assurance device attached to the portable information terminal (see, for example, Patent Documents 1 and 2).
Patent Document 1: WO 2008/010287
Patent Document 2: JP 2018-107514 A
However, in the inventions described in Patent Documents 1 and 2, the authenticity of position information cannot be guaranteed unless a dedicated device is attached to the information processing terminal. In addition, there is a risk that the position information may be tampered with, for example, between the information processing terminal and the dedicated device.
The present technology has been developed in view of such circumstances, and makes it possible to guarantee the authenticity of position information of an information processing device without using a dedicated device.
An information processing apparatus according to a first aspect of the present technology includes: a position detection unit that detects a current position when a predetermined trigger is detected and generates first position information including the current position and the current time; a location certification acquisition unit that, when the trigger is detected, requests location certification from a first information processing device present in the vicinity and receives first location certification information from the first information processing device; and a location registration unit that transmits the first location information and the first location certification information to a second information processing device that records the first location information and the first location certification information.
A program according to the first aspect of the present technology causes a computer to execute processing of: detecting a current position when a predetermined trigger is detected and generating position information including the current position and the current time; requesting, when the trigger is detected, location certification from a first information processing device present in the vicinity and receiving location certification information from the first information processing device; and transmitting the position information and the location certification information to a second information processing device that records the position information and the location certification information.
In the first aspect of the present technology, a current position is detected when a predetermined trigger is detected, position information including the current position and the current time is generated, location certification is requested, when the trigger is detected, from a first information processing device present in the vicinity, location certification information is received from the first information processing device, and the position information and the location certification information are transmitted to a second information processing device that records the position information and the location certification information.
An information processing apparatus according to a second aspect of the present technology includes a location certification unit that, when receiving a request for location certification transmitted by a first information processing device that has detected a predetermined trigger toward other information processing devices in the vicinity, generates location certification information and transmits the location certification information.
A program according to the second aspect of the present technology causes a computer to execute processing of generating location certification information and transmitting the location certification information when a request for location certification transmitted by an information processing device that has detected a predetermined trigger toward other information processing devices in the vicinity is received.
In the second aspect of the present technology, when a request for location certification transmitted by an information processing device that has detected a predetermined trigger toward other information processing devices in the vicinity is received, location certification information is generated and transmitted.
An information processing apparatus according to a third aspect of the present technology includes: a verification unit that verifies report data received from a first information processing device, the report data including first position information including the current position and current time of the first information processing device when the first information processing device detected a predetermined trigger, using location certification information generated, in response to a location certification request from the first information processing device, by a second information processing device that was present in the vicinity of the first information processing device when the trigger was detected; and an execution unit that executes processing corresponding to the report data when the verification unit determines that the report data is authentic.
In the third aspect of the present technology, report data received from a first information processing device, including first position information including the current position and current time of the first information processing device when the first information processing device detected a predetermined trigger, is verified using location certification information generated, in response to a location certification request from the first information processing device, by a second information processing device that was present in the vicinity of the first information processing device when the trigger was detected, and when the report data is determined to be authentic, processing corresponding to the report data is executed.
FIG. 1 is a block diagram showing a first embodiment of an information processing system to which the present technology is applied.
FIG. 2 is a block diagram showing a functional configuration example of an information processing terminal.
FIG. 3 is a block diagram showing a configuration example of functions realized by a CPU of the information processing terminal.
FIG. 4 is a block diagram showing a functional configuration example of a server.
FIG. 5 is a diagram showing the flow of basic processing of the information processing system.
FIG. 6 is a diagram showing a format example of a PoL request.
FIG. 7 is a diagram showing a format example of a PoL response.
FIG. 8 is a diagram showing a format example of a PoL transaction.
FIG. 9 is a diagram showing a format example of a PoL block.
FIG. 10 is a sequence diagram showing the processing of FIG. 5.
FIG. 11 is a flowchart for explaining the details of PoL request verification processing.
FIG. 12 is a flowchart for explaining the details of PoL response verification processing.
FIG. 13 is a sequence diagram showing the processing of FIG. 5.
FIG. 14 is a diagram showing the flow of processing for collecting eyewitness information about an accident.
FIG. 15 is a diagram showing an example of a display screen of a witness questionnaire.
FIG. 16 is a diagram showing an example of a display screen of a witness questionnaire.
FIG. 17 is a diagram showing an example of a display screen of a witness questionnaire.
FIG. 18 is a sequence diagram showing the processing of FIG. 14.
FIG. 19 is a diagram showing the flow of calculation processing of premiums and the like of risk-segmented insurance.
FIG. 20 is a sequence diagram showing the processing of FIG. 19.
FIG. 21 is a diagram showing the flow of calculation processing of insurance benefits and the like of leisure insurance or non-life insurance.
FIG. 22 is a diagram showing the flow of calculation processing of premiums and the like of health promotion insurance.
FIG. 23 is a sequence diagram showing processing of storing image data and recording position information and location certification information at the time of shooting.
FIG. 24 is a sequence diagram showing processing of collecting eyewitness information.
FIG. 25 is a block diagram showing a second embodiment of an information processing system to which the present technology is applied.
FIG. 26 is a block diagram showing a functional configuration example of an accident trigger generator.
FIG. 27 is a block diagram showing a functional configuration example of a camera.
FIG. 28 is a block diagram showing a functional configuration example of a server.
FIG. 29 is a sequence diagram showing the flow of processing of the information processing system of FIG. 25.
FIG. 30 is a flowchart for explaining the details of position verification processing.
Embodiments for implementing the present technology will be described below. The description is given in the following order.
1. First embodiment
2. Second embodiment
3. Modifications
4. Others
<<1. First embodiment>>
First, a first embodiment of the present technology will be described with reference to FIGS. 1 to 24.
<Configuration example of information processing system 1>
FIG. 1 shows a configuration example of an information processing system 1, which is a first embodiment of an information processing system to which the present technology is applied.
The information processing system 1 is a system for executing various kinds of insurance processing. The information processing system 1 includes information processing terminals 11-1 to 11-n, a server 12, and a blockchain network 13. The information processing terminals 11-1 to 11-n, the server 12, and the blockchain network 13 are connected via a network 21 and can communicate with each other. In addition, the information processing terminals 11-1 to 11-n can communicate with each other directly by short-range wireless communication without going through the network 21.
Hereinafter, the information processing terminals 11-1 to 11-n are simply referred to as the information processing terminal 11 when there is no need to distinguish them individually.
The information processing terminal 11 is configured by, for example, a portable information processing device that the user can carry or wear. For example, the information processing terminal 11 is a smartphone, a mobile phone, a tablet terminal, a wearable device, an action camera, a portable music player, a portable game machine, or the like.
Also, for example, the information processing terminal 11 may be an information processing device, such as a drive recorder, that is mounted on a mobile object such as a vehicle (including a two-wheeled vehicle) and that captures and records the surroundings of the mobile object.
Further, the information processing terminal 11 may be, for example, a dedicated information processing device installed at an arbitrary location outdoors or indoors.
The information processing terminal 11 is used, for example, to use the insurance provided via the server 12. For example, the information processing terminal 11 generates report data for making various reports regarding insurance and transmits the report data to the server 12 via the network 21. The information processing terminal 11 receives, via the network 21, various data that the server 12 transmits in response to the report data.
The information processing terminal 11 is also used to certify the positions of other information processing terminals 11 in its vicinity.
For example, as will be described later, an information processing terminal 11 carried by an insurance user (hereinafter referred to as a Prover) detects its current position and transmits position information including the detected current position and the current time to surrounding information processing terminals 11 (hereinafter referred to as Witnesses) to request location certification. Here, location certification is processing for certifying the position information of the Prover; in other words, it is processing for proving that the Prover was present at the position indicated by the position information at the time indicated by the position information.
In response, a Witness generates location certification information and transmits it to the Prover. Here, the location certification information is information that certifies the position information of the Prover; in other words, it is information proving that the Prover was present at the position indicated by the position information at the time indicated by the position information. For example, the location certification information includes position information including the current position and current time of the Witness (the Witness's position information).
The Prover then transmits transaction data including the position information and the location certification information to the blockchain network 13 via the network 21, and has the Prover's position information and location certification information recorded in the blockchain.
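For illustration only, the sketch below models this Prover-to-Witness-to-blockchain exchange with simple Python data structures. The field names, the SHA-256 digests standing in for real electronic signatures, and the purely in-memory flow are assumptions made for the sketch; the actual PoL request, PoL response, and PoL transaction formats are the ones shown later in FIGS. 6 to 8.

import hashlib, json, time
from dataclasses import dataclass, asdict

def digest(obj) -> str:
    # Stand-in for a real electronic signature: hash of the canonical JSON.
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

@dataclass
class PositionInfo:
    latitude: float
    longitude: float
    timestamp: float          # current time at detection

@dataclass
class PoLRequest:
    prover_position: PositionInfo
    fingerprint: str          # stand-in for the Prover's signature over its position

@dataclass
class PoLResponse:
    witness_position: PositionInfo
    request_fingerprint: str  # ties the response to the request it answers
    fingerprint: str          # stand-in for the Witness's signature

def make_request(position: PositionInfo) -> PoLRequest:
    return PoLRequest(position, digest(asdict(position)))

def make_response(request: PoLRequest, witness_position: PositionInfo) -> PoLResponse:
    body = {"witness": asdict(witness_position), "request": request.fingerprint}
    return PoLResponse(witness_position, request.fingerprint, digest(body))

def make_transaction(request: PoLRequest, responses: list) -> dict:
    # The PoL transaction bundles the Prover's position information with the
    # location certification information gathered from surrounding Witnesses.
    return {
        "position": asdict(request.prover_position),
        "proofs": [asdict(r) for r in responses],
        "submitted_at": time.time(),
    }

# Example flow: one Prover, one nearby Witness.
prover_pos = PositionInfo(35.6812, 139.7671, time.time())
witness_pos = PositionInfo(35.6813, 139.7669, time.time())
req = make_request(prover_pos)
resp = make_response(req, witness_pos)
tx = make_transaction(req, [resp])
print(json.dumps(tx, indent=2)[:200])

Because each response records the fingerprint of the request it answers, a third party replaying the transaction later can still tell which position claim each Witness was vouching for.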
In this specification, the time may include not only the so-called time of day but also the date and the day of the week. For example, the current time may include not only the current time of day but also the current date and day of the week.
For example, each information processing terminal 11 normally operates as a Witness, operates as a Prover when it detects a predetermined trigger, and returns to operating as a Witness after its operation as a Prover ends.
Furthermore, when the information processing terminal 11 was present near an accident site at the time of an accident, for example, it generates eyewitness information about the accident in accordance with a request from the server 12 and transmits it to the server 12 via the network 21.
The server 12 executes various kinds of processing related to insurance while exchanging various data with the information processing terminals 11 and the blockchain network 13.
The blockchain network 13 is a network in which a plurality of nodes are connected. Each node of the blockchain network 13 updates and holds a blockchain in which blocks including the position information and location certification information of each information processing terminal 11 are linked. In response to a request from the server 12, the blockchain network 13 extracts from the blockchain blocks containing position information and location certification information that match the conditions presented by the server 12, and transmits them to the server 12 via the network 21.
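A node answering such a request essentially filters its copy of the chain by the conditions the server presents. The following is a minimal sketch under the assumption that each block carries a latitude, a longitude, and a timestamp; the haversine distance test, the 200 m radius, and the field names are illustrative choices, not taken from the patent.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two latitude/longitude points.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def select_blocks(chain, lat, lon, t_from, t_to, radius_m=200.0):
    # Return the blocks whose recorded position and time match the requested
    # area and time window, in the spirit of the server's query to the nodes.
    return [
        b for b in chain
        if t_from <= b["timestamp"] <= t_to
        and haversine_m(lat, lon, b["latitude"], b["longitude"]) <= radius_m
    ]

chain = [
    {"latitude": 35.6812, "longitude": 139.7671, "timestamp": 1700000000, "payload": "PoL #1"},
    {"latitude": 35.0000, "longitude": 135.0000, "timestamp": 1700000300, "payload": "PoL #2"},
]
print(select_blocks(chain, 35.6810, 139.7670, 1699999000, 1700001000))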
<Configuration example of information processing terminal 11>
FIG. 2 is a block diagram showing a functional configuration example of the information processing terminal 11 of FIG. 1.
The information processing terminal 11 includes a CPU (Central Processing Unit) 101, a memory 102, a storage 103, an operation unit 104, a display unit 105, a speaker 106, an imaging unit 107, a GNSS (Global Navigation Satellite System) receiver 108, a sensing unit 109, a communication unit 110, an external I/F 111, and a drive 112. The CPU 101 to the drive 112 are connected to a bus and perform necessary communications with each other.
The CPU 101 performs various kinds of processing by executing programs installed in the memory 102 and the storage 103.
The memory 102 is composed of, for example, a volatile memory or the like, and temporarily stores programs executed by the CPU 101 and necessary data.
The storage 103 is composed of, for example, a hard disk or a non-volatile memory, and stores programs executed by the CPU 101 and necessary data.
The operation unit 104 is composed of physical keys (including a keyboard), a mouse, a touch panel, and the like. In response to a user operation, the operation unit 104 outputs an operation signal corresponding to the operation onto the bus.
The display unit 105 is composed of, for example, an LCD (Liquid Crystal Display) or the like, and displays an image according to data supplied from the bus.
Here, for example, the touch panel as the operation unit 104 is made of a transparent member and can be configured integrally with the display unit 105. This allows the user to input information by operating icons, buttons, and the like displayed on the display unit 105.
The speaker 106 outputs sound according to data supplied from the bus.
The imaging unit 107 captures images (still images and moving images) by sensing light and outputs the corresponding image data onto the bus.
The GNSS receiver 108 receives signals from GNSS satellites and detects the current position of the information processing terminal 11 based on the received signals. The GNSS receiver 108 outputs data indicating the detection result of the current position (hereinafter referred to as position detection data) onto the bus.
The sensing unit 109 includes various sensors and outputs the sensor data output from each sensor onto the bus. The sensing unit 109 includes, for example, sensors for detecting user behavior, such as a motion sensor, an acceleration sensor, and an angular velocity sensor.
The communication unit 110 includes a communication circuit, an antenna, and the like, and communicates with the other information processing terminals 11, the server 12, and the blockchain network 13 via the network 21. The communication unit 110 also performs short-range wireless communication of a predetermined scheme with other information processing terminals 11 without going through the network 21.
In the following, an example is described in which the communication unit 110 performs short-range wireless communication with other information processing terminals 11 by Bluetooth (registered trademark, hereinafter referred to as BT).
The external I/F (interface) 111 is an interface for exchanging data with various external devices.
The drive 112 is capable of attaching and detaching a removable medium 112A such as a memory card, and drives the removable medium 112A attached thereto.
In the information processing terminal 11 configured as described above, the program executed by the CPU 101 can be recorded in advance in the storage 103 as a recording medium built into the information processing terminal 11.
Alternatively, the program can be stored (recorded) in a removable medium 112A, provided as so-called packaged software, and installed in the information processing terminal 11 from the removable medium 112A.
In addition, the program can be downloaded from a server or the like (not shown) via the network 21 and the communication unit 110 and installed in the information processing terminal 11.
FIG. 3 shows a configuration example of functions realized by the CPU 101 executing a program installed in the information processing terminal 11. By executing the program installed in the information processing terminal 11, the CPU 101 realizes functions including, for example, a control unit 131, a position detection unit 132, an accident detection unit 133, a location certification processing unit 134, and a report unit 135.
The control unit 131 controls the processing of each unit of the information processing terminal 11.
The position detection unit 132 detects the current position of the information processing terminal 11 based on the position detection data output from the GNSS receiver 108, and generates position information including the current position and current time of the information processing terminal 11.
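A minimal sketch of what the position detection unit produces, assuming a GNSS fix is already available as latitude and longitude; the structure and field names are illustrative only.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class PositionInfo:
    latitude: float
    longitude: float
    detected_at: str  # current time in ISO 8601, UTC

def build_position_info(lat: float, lon: float) -> PositionInfo:
    # Combine the GNSS fix with the current time, as the position detection
    # unit 132 is described as doing.
    return PositionInfo(lat, lon, datetime.now(timezone.utc).isoformat())

print(build_position_info(35.6812, 139.7671))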
The accident detection unit 133 detects an accident involving the user who carries the information processing terminal 11, or an accident occurring around the information processing terminal 11, based on at least one of the image data output from the imaging unit 107 and the sensor data output from the sensing unit 109.
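The patent does not prescribe how the detection is done. Purely as an assumption for illustration, one common approach is to flag a strong acceleration spike (an impact) in the accelerometer stream, as sketched below; the 4 g threshold is arbitrary.

def detect_accident(accel_samples, threshold_g=4.0):
    # accel_samples: list of (ax, ay, az) readings in units of g.
    # Flag an accident candidate when the acceleration magnitude exceeds the
    # threshold, a simple stand-in for whatever detector the terminal uses.
    for ax, ay, az in accel_samples:
        if (ax * ax + ay * ay + az * az) ** 0.5 >= threshold_g:
            return True
    return False

print(detect_accident([(0.0, 0.0, 1.0), (3.5, 2.8, 1.2)]))  # True: roughly a 4.6 g spike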
The location certification processing unit 134 performs processing related to location certification of the information processing terminal 11. The location certification processing unit 134 includes a Prover processing unit 141 and a Witness processing unit 142.
The Prover processing unit 141 performs processing when the information processing terminal 11 operates as a Prover, that is, when the information processing terminal 11 has surrounding information processing terminals 11 certify its position. The Prover processing unit 141 includes a location certification acquisition unit 151, a location registration unit 152, and an image registration unit 153.
When location certification is required, the location certification acquisition unit 151 generates a PoL (Proof of Location) request that includes the position information of the information processing terminal 11 and is used to request location certification from the surrounding information processing terminals 11. The location certification acquisition unit 151 transmits the PoL request to the surrounding information processing terminals 11 by BT via the communication unit 110, and receives, via the communication unit 110, the PoL responses that the surrounding information processing terminals 11 transmit in response to the PoL request and that include location certification information.
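The configurations listed earlier describe the PoL request as carrying the position information, a public key, and an electronic signature generated with the corresponding private key from a plaintext containing that position information. The sketch below shows one way such a request could be assembled using Ed25519 from the third-party cryptography package; the JSON encoding, hex serialization, and field names are assumptions, and the exact format is the one shown in FIG. 6.

import json
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def build_pol_request(private_key: Ed25519PrivateKey, position: dict) -> dict:
    # Plaintext covering the position information; the signature is made with
    # the private key and shipped together with the matching public key.
    plaintext = json.dumps({"position": position}, sort_keys=True).encode()
    public_bytes = private_key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)
    return {
        "position": position,
        "public_key": public_bytes.hex(),
        "signature": private_key.sign(plaintext).hex(),
    }

prover_key = Ed25519PrivateKey.generate()
request = build_pol_request(
    prover_key, {"lat": 35.6812, "lon": 139.7671, "time": "2022-02-08T12:00:00Z"})
print(request["signature"][:16], "...")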
The location registration unit 152 registers the position of the information processing terminal 11 in the blockchain network 13. Specifically, the location registration unit 152 generates a PoL transaction including the position information of the information processing terminal 11 and the location certification information acquired from the surrounding information processing terminals 11. The location registration unit 152 broadcasts the PoL transaction to the blockchain network 13 via the communication unit 110 and the network 21. As a result, a PoL block based on the PoL transaction is added to the blockchain, and the position information and location certification information of the information processing terminal 11 are recorded in the blockchain.
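What "adding a PoL block to the blockchain" amounts to can be sketched as hashing the block contents together with the hash of the previous block. The fields below are placeholders; the actual PoL block format is the one shown in FIG. 9, and the consensus/mining rules of the blockchain network 13 are not modelled here.

import hashlib, json, time

def make_pol_block(prev_hash: str, pol_transaction: dict) -> dict:
    # Link the new block to its predecessor by including prev_hash in the
    # material that is hashed, which is what makes later tampering detectable.
    header = {"prev_hash": prev_hash, "timestamp": time.time(), "tx": pol_transaction}
    block_hash = hashlib.sha256(json.dumps(header, sort_keys=True).encode()).hexdigest()
    return {**header, "hash": block_hash}

genesis = make_pol_block("0" * 64, {"note": "genesis"})
block1 = make_pol_block(genesis["hash"], {"position": {"lat": 35.68, "lon": 139.76},
                                          "proofs": ["witness proof ..."]})
print(block1["prev_hash"] == genesis["hash"])  # True: blocks are chained by hash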
The image registration unit 153 registers image data corresponding to images captured by the imaging unit 107 in the server 12. For example, the image registration unit 153 transmits the image data to the server 12 via the communication unit 110 and the network 21 as necessary, and has the server 12 store the image data.
The Witness processing unit 142 performs processing when the information processing terminal 11 operates as a Witness, that is, when the information processing terminal 11 certifies the positions of surrounding information processing terminals 11. The Witness processing unit 142 includes a location certification unit 161 and an information providing unit 162.
When the location certification unit 161 receives a PoL request from a Prover via the communication unit 110, it generates, based on the PoL request, a PoL response including location certification information that includes the Witness's position information. The location certification unit 161 transmits the PoL response to the Prover by BT via the communication unit 110.
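The configurations listed earlier also describe the Witness first checking, with the public key and electronic signature carried in the PoL request, that the request is valid, and only then returning a signed response containing its own position information. A sketch of that check follows, reusing the Ed25519-based request format assumed in the earlier sketch; a request built with build_pol_request above can be passed straight to build_pol_response.

import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey)

def verify_pol_request(request: dict) -> bool:
    # Recompute the plaintext and verify the Prover's signature with the
    # public key shipped inside the request itself.
    plaintext = json.dumps({"position": request["position"]}, sort_keys=True).encode()
    public_key = Ed25519PublicKey.from_public_bytes(bytes.fromhex(request["public_key"]))
    try:
        public_key.verify(bytes.fromhex(request["signature"]), plaintext)
        return True
    except InvalidSignature:
        return False

def build_pol_response(witness_key: Ed25519PrivateKey, request: dict,
                       witness_position: dict) -> dict:
    # The response carries the Witness's position information and is signed
    # over a plaintext that also covers the request it answers.
    assert verify_pol_request(request), "invalid PoL request"
    plaintext = json.dumps({"witness_position": witness_position,
                            "request": request}, sort_keys=True).encode()
    return {"witness_position": witness_position,
            "signature": witness_key.sign(plaintext).hex()}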
The information providing unit 162 generates the information requested by the server 12 (for example, eyewitness information about an accident) and transmits it to the server 12 via the communication unit 110 and the network 21.
The report unit 135 generates report data for making various reports regarding the insurance provided via the server 12, and transmits the report data to the server 12 via the communication unit 110 and the network 21. The report unit 135 receives, via the network 21 and the communication unit 110, the various data that the server 12 transmits in response to the report data.
<Configuration example of server 12>
FIG. 4 is a block diagram showing a functional configuration example of the server 12.
 サーバ12は、CPU201、メモリ202、ストレージ203、保険DB(Data Base)204、操作部205、表示部206、通信部207、外部I/F208、及び、ドライブ209を備える。CPU201乃至ドライブ209は、バスに接続されており、相互に、必要な通信を行う。 The server 12 includes a CPU 201 , a memory 202 , a storage 203 , an insurance DB (Data Base) 204 , an operation section 205 , a display section 206 , a communication section 207 , an external I/F 208 and a drive 209 . The CPU 201 to drive 209 are connected to a bus and perform necessary communications with each other.
 CPU201乃至ストレージ203、操作部205、表示部206、外部I/F208、及び、ドライブ209は、図2の情報処理端末11のCPU101乃至ストレージ103、操作部104、表示部105、外部I/F111、及び、ドライブ112とそれぞれ同様に構成される。 The CPU 201 through the storage 203, the operation unit 205, the display unit 206, the external I/F 208, and the drive 209 are the CPU 101 through the storage 103, the operation unit 104, the display unit 105, the external I/F 111, and the and are configured similarly to the drive 112, respectively.
 保険DB204は、サーバ12が提供する保険に関する各種のデータを格納する。例えば、保険DB204は、提供する保険や保険契約者に関する各種のデータを格納する。例えば、保険DB204は、保険契約者が関連する事故の目撃情報を格納する。例えば、保険DB204は、保険契約者の情報処理端末11から送信されてくる画像データを格納する。 The insurance DB 204 stores various data related to insurance provided by the server 12. For example, the insurance DB 204 stores various data related to insurance to be provided and policyholders. For example, insurance DB 204 stores eyewitness information for accidents involving policyholders. For example, the insurance DB 204 stores image data transmitted from the information processing terminal 11 of the policyholder.
 通信部207は、通信回路及びアンテナ等を含み、ネットワーク21を介して、情報処理端末11及びブロックチェーンネットワーク13と通信を行う。 The communication unit 207 includes a communication circuit, an antenna, etc., and communicates with the information processing terminal 11 and the blockchain network 13 via the network 21 .
 サーバ12では、情報処理端末11と同様に、CPU201が実行するプログラムは、サーバ12に内蔵されている記録媒体としてのストレージ203にあらかじめ記録しておくことができる。 In the server 12 , as in the information processing terminal 11 , the program executed by the CPU 201 can be recorded in advance in the storage 203 as a recording medium incorporated in the server 12 .
 また、プログラムは、リムーバブルメディア209Aに格納(記録)して、パッケージソフトウエアとして提供し、リムーバブルメディア209Aからサーバ12にインストールすることができる。 Also, the program can be stored (recorded) in the removable media 209A, provided as package software, and installed in the server 12 from the removable media 209A.
 その他、プログラムは、ネットワーク21及び通信部207を介して、図示せぬ他のサーバ等からダウンロードし、サーバ12にインストールすることができる。 In addition, the program can be downloaded from another server (not shown) or the like via the network 21 and the communication unit 207 and installed on the server 12 .
 CPU201がサーバ12にインストールされたプログラムを実行することにより、制御部231、検証部232、情報収集部233、及び、保険処理部234を含む機能が実現される。 Functions including a control unit 231, a verification unit 232, an information collection unit 233, and an insurance processing unit 234 are realized by the CPU 201 executing a program installed in the server 12.
 The control unit 231 controls the processing of each unit of the server 12.
 The verification unit 232 verifies the declaration data received from the information processing terminal 11 via the communication unit 207 and the network 21. For example, the verification unit 232 requests the blockchain network 13, via the communication unit 207 and the network 21, to collate data included in the declaration data (for example, position information). The verification unit 232 receives data indicating the collation result from the blockchain network 13 via the network 21 and the communication unit 207. The verification unit 232 verifies the declaration data based on the collation result and the like received from the blockchain network 13.
 The information collection unit 233 requests, as necessary, information required for insurance-related processing from the information processing terminal 11 via the communication unit 207 and the network 21, and receives the requested information. For example, the information collection unit 233 requests eyewitness information about an accident from the information processing terminal 11 via the communication unit 207 and the network 21, and receives the eyewitness information from the information processing terminal 11.
 The insurance processing unit 234 performs various processes related to the insurance provided by the server 12. The insurance processing unit 234 includes a calculation unit 241 and an execution unit 242.
 The calculation unit 241 calculates, based on the declaration data and the like received from the information processing terminal 11, the premiums, insurance payouts, or benefits (for example, cashback) related to the insurance contracted by the user. Note that the benefit does not necessarily have to be money, and may be, for example, goods or points.
 The execution unit 242 executes processing related to various insurance services. For example, the execution unit 242 executes premium billing processing, insurance payout processing, benefit granting processing, and the like with the information processing terminal 11 via the communication unit 207 and the network 21.
 <Flow of basic processing of information processing system 1>
 Next, the basic processing flow of the information processing system 1 will be described with reference to FIG. 5.
 In the following, the processing in the case where the information processing terminal 11-1 carried by the user U1 is the Prover and the information processing terminal 11-2 carried by the user U2 is the Witness will be described.
 In step S1, the control unit 131 of the Prover activates APP1, which implements the location proof processing unit 134. Note that, for example, APP1 may run constantly in the background of the OS (Operating System).
 Similarly, the control unit 131 of the Witness activates APP2, which implements the location proof processing unit 134. Note that, for example, APP2 may run constantly in the background of the OS.
 In step S2, the position detection unit 132 of the Prover generates position information. Specifically, when a predetermined trigger is detected, the position detection unit 132 detects the current position of the Prover based on the position detection data output from the GNSS receiver 108.
 Note that, for example, a predetermined event or a predetermined timing is set as the predetermined trigger. For example, an event such as an accident (for example, a collision or a fall) detected based on image data from the imaging unit 107 and sensor data from the sensing unit 109, the start or end of capturing an image (still image or moving image) by the imaging unit 107, or a predetermined user operation on the operation unit 104 is set as the trigger.
 For example, a timing such as the elapse of a predetermined time or the arrival of a predetermined time of day is set as the trigger. In the former case, for example, the trigger is detected at predetermined time intervals. Note that the time intervals may or may not be constant.
 The position detection unit 132 generates position information including the current position and current time of the Prover. The position detection unit 132 supplies the position information to the location certification acquisition unit 151 and stores it in the storage 103. If there is metadata to be associated with the position information, the position detection unit 132 stores the metadata in the storage 103 in association with the position information.
 For example, the metadata associated with the position information is assumed to be image data corresponding to an image (moving image or still image) captured by the imaging unit 107 at the current position of the Prover, sensor data acquired by the sensing unit 109 at the current position of the Prover, and the like.
 In step S3, the location certification acquisition unit 151 of the Prover generates a fingerprint of the metadata as necessary. For example, if there is metadata to be associated with the position information generated in the process of step S2, the location certification acquisition unit 151 generates the fingerprint by calculating a hash value of the metadata.
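 The hash function used for the fingerprint is not specified here; the following is a minimal Python sketch assuming SHA-256 over the raw metadata bytes, with a purely illustrative payload.

import hashlib

def make_fingerprint(metadata: bytes) -> str:
    # Hash the raw metadata bytes (e.g. an encoded image or serialized
    # sensor samples) to obtain a fixed-length fingerprint value.
    return hashlib.sha256(metadata).hexdigest()

# Illustrative payload standing in for image or sensor data.
metadata = b"example sensor payload"
metadata_fingerprint = make_fingerprint(metadata)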
 In step S4, the location certification acquisition unit 151 of the Prover generates a PoL request.
 FIG. 6 shows a format example of the PoL request. The PoL request includes prover_address, latitude, longitude, timestamp, metadata_fingerprint, and signature.
 prover_address is the Prover's public key.
 latitude indicates the latitude of the Prover's current position.
 longitude indicates the longitude of the Prover's current position.
 timestamp indicates the generation time of the PoL request. For example, the current time included in the position information generated in the process of step S2 is set as the timestamp.
 Therefore, the PoL request includes the position information (latitude, longitude, and timestamp) that is the target of the location proof.
 metadata_fingerprint is the fingerprint of the metadata associated with the Prover's position information.
 Note that if there is no metadata associated with the Prover's position information, the value of metadata_fingerprint is set to NULL. Alternatively, for example, the metadata itself may be stored in the PoL request instead of the fingerprint.
 signature is the electronic signature of the Prover. For example, the signature is generated by encrypting a hash value of the plaintext including prover_address, latitude, longitude, timestamp, and metadata_fingerprint with the private key corresponding to prover_address (the public key).
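 The description fixes the fields of the PoL request but not the serialization or the signature algorithm. The sketch below assumes JSON serialization and an ECDSA (P-256) key pair handled with the Python cryptography package; the ECDSA signature over the serialized fields stands in for the "encrypt the plaintext hash with the private key" description, and the key generation shown inline would in practice be done once per terminal.

import json, time
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

# Hypothetical Prover key pair; prover_address is the public key.
prover_key = ec.generate_private_key(ec.SECP256R1())
prover_address = prover_key.public_key().public_bytes(
    serialization.Encoding.X962,
    serialization.PublicFormat.CompressedPoint).hex()

def build_pol_request(latitude, longitude, metadata_fingerprint=None):
    body = {
        "prover_address": prover_address,
        "latitude": latitude,
        "longitude": longitude,
        "timestamp": int(time.time()),
        "metadata_fingerprint": metadata_fingerprint,  # None stands for NULL
    }
    # Sign the canonical plaintext; ECDSA with SHA-256 hashes it internally,
    # which corresponds to signing the hash value of the plaintext.
    plaintext = json.dumps(body, sort_keys=True).encode()
    body["signature"] = prover_key.sign(plaintext, ec.ECDSA(hashes.SHA256())).hex()
    return body

pol_request = build_pol_request(35.6812, 139.7671)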
 In step S5, the location certification acquisition unit 151 of the Prover transmits the PoL request by BT via the communication unit 110 to other information processing terminals 11 (Witnesses) existing within a predetermined range from the Prover. Note that the predetermined range is set, for example, to the BT communicable range. As a result, proof of the Prover's location is requested from the Witnesses existing around the Prover.
 In response, the CPU 101 of the Witness receives the PoL request via the communication unit 110.
 In step S6, the location certification unit 161 of the Witness verifies the PoL request. The method of verifying the PoL request will be described later with reference to FIG. 11.
 In step S7, when the location certification unit 161 of the Witness determines as a result of the verification that the PoL request is valid, it generates a PoL response, which is a response to the PoL request.
 FIG. 7 shows a format example of the PoL response.
 The PoL response includes pol_request_signed, witness_address, latitude, longitude, timestamp, metadata_fingerprint, and signature.
 pol_request_signed is the data of the PoL request to which the PoL response corresponds. For example, all the data of the PoL request is stored as-is in the PoL response as pol_request_signed. Therefore, the PoL response includes the Prover's position information contained in the PoL request.
 witness_address is the Witness's public key.
 latitude indicates the latitude of the Witness's current position.
 longitude indicates the longitude of the Witness's current position.
 timestamp indicates the generation time of the PoL response (the location proof time). For example, the detection time of the Witness's current position (latitude and longitude) is set as the timestamp.
 Therefore, the PoL response includes the Witness's position information (latitude, longitude, and timestamp) at the time of the location proof.
 metadata_fingerprint is the fingerprint of the metadata associated with the Witness's position information.
 Note that if there is no metadata associated with the Witness's position information, the value of metadata_fingerprint is set to NULL. Alternatively, for example, the metadata itself may be stored in the PoL response instead of the fingerprint.
 signature is the electronic signature of the Witness. For example, the signature is generated by encrypting a hash value of the plaintext including pol_request_signed, witness_address, latitude, longitude, timestamp, and metadata_fingerprint with the private key corresponding to witness_address (the public key).
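 Correspondingly, a Witness could assemble and sign the PoL response as sketched below, embedding the received PoL request verbatim. The JSON/ECDSA conventions are the same assumptions as in the PoL request sketch above.

import json, time
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

# Hypothetical Witness key pair; witness_address is the public key.
witness_key = ec.generate_private_key(ec.SECP256R1())
witness_address = witness_key.public_key().public_bytes(
    serialization.Encoding.X962,
    serialization.PublicFormat.CompressedPoint).hex()

def build_pol_response(pol_request, latitude, longitude, metadata_fingerprint=None):
    body = {
        "pol_request_signed": pol_request,   # received PoL request, stored as-is
        "witness_address": witness_address,
        "latitude": latitude,                # Witness position at proof time
        "longitude": longitude,
        "timestamp": int(time.time()),
        "metadata_fingerprint": metadata_fingerprint,
    }
    plaintext = json.dumps(body, sort_keys=True).encode()
    body["signature"] = witness_key.sign(plaintext, ec.ECDSA(hashes.SHA256())).hex()
    return body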
 In step S8, the location certification unit 161 of the Witness transmits the PoL response to the Prover by BT via the communication unit 110.
 In response, the CPU 101 of the Prover receives the PoL response transmitted from the Witness via the communication unit 110.
 In step S9, the Prover generates a PoL transaction and broadcasts it. Specifically, the location certification acquisition unit 151 of the Prover verifies the PoL response. The method of verifying the PoL response will be described later with reference to FIG. 12. When the location certification acquisition unit 151 determines as a result of the verification that the PoL response is valid, it supplies the PoL response to the location registration unit 152.
 The location registration unit 152 generates a PoL transaction corresponding to the PoL response.
 FIG. 8 shows a format example of the PoL transaction.
 The PoL transaction includes sender_address, recipient_address, value, data, and signature.
 sender_address indicates the address of the sender. For example, "THE BLOCKCHAIN" is set as sender_address.
 recipient_address indicates the address of the recipient. For example, "THE BLOCKCHAIN" is set as recipient_address.
 value is set to 0.
 data includes the data of the PoL response(s) received in response to the PoL request. Therefore, the PoL transaction includes the Prover's position information and the location proof information provided by the Witness. Note that when PoL responses are received from a plurality of Witnesses in response to the PoL request, data includes the data of the plurality of PoL responses.
 No particular value is set for signature.
 Note that in a blockchain for location proof data (a blockchain that does not include remittance information), it is possible, for example, to omit sender_address, recipient_address, and value so that the PoL transaction includes only data.
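 A minimal sketch of assembling the PoL transaction from one or more received PoL responses follows; the literal field values follow the description above, while the dictionary layout is an assumption.

def build_pol_transaction(pol_responses):
    # One transaction can carry the PoL responses of several Witnesses.
    return {
        "sender_address": "THE BLOCKCHAIN",
        "recipient_address": "THE BLOCKCHAIN",
        "value": 0,
        "data": pol_responses,      # list of PoL responses for the PoL request
        "signature": None,          # no particular value is set
    }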
 The location registration unit 152 broadcasts the PoL transaction to the blockchain network 13 via the communication unit 110 and the network 21.
 In response, each node of the blockchain network 13 receives the PoL transaction via the network 21.
 In step S10, the blockchain network 13 generates a PoL block based on the PoL transaction and adds it to the blockchain.
 FIG. 9 shows a format example of the PoL block.
 The PoL block includes block_number, timestamp, transactions, previous_hash, nonce, miner_address, and signature.
 block_number indicates the block number of the PoL block.
 timestamp indicates the generation time of the PoL block.
 transactions indicates the transaction list. The transaction list includes one or more PoL transactions.
 previous_hash is the hash value of the PoL block immediately preceding this PoL block in the blockchain.
 nonce is a nonce value calculated by, for example, PoW (Proof of Work) or PoS (Proof of Stake).
 miner_address is the public key of the miner that mined the PoL transaction.
 signature is the electronic signature of the miner. For example, the signature is generated by encrypting a hash value of the plaintext including block_number, timestamp, transactions, previous_hash, nonce, and miner_address with the private key corresponding to miner_address (the public key).
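 The following sketch shows how a miner could assemble a PoL block and chain it to the previous block. The SHA-256 block hash, the toy proof-of-work rule, and the omission of the miner's signature are assumptions made only for illustration; real consensus rules (PoW or PoS) are outside this sketch.

import hashlib, json, time

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def mine_pol_block(block_number, transactions, previous_hash, miner_address,
                   difficulty=3):
    block = {
        "block_number": block_number,
        "timestamp": int(time.time()),
        "transactions": transactions,   # list of PoL transactions
        "previous_hash": previous_hash, # hash of the preceding PoL block
        "nonce": 0,
        "miner_address": miner_address,
    }
    # Toy PoW: increase the nonce until the block hash has `difficulty`
    # leading zeros.
    while not block_hash(block).startswith("0" * difficulty):
        block["nonce"] += 1
    # The miner's signature over the block fields would be added here.
    return block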
 In step S11, the declaration unit 135 of the Prover generates declaration data and transmits it. For example, the declaration data includes, among a registration ID for identifying the user U1, the Prover's position information, the metadata associated with the Prover's position information (the metadata from which metadata_fingerprint of the PoL request was generated), the Prover's public key (prover_address of the PoL request), the insurance contract period, and the like, the data necessary for the declaration. The declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21.
 In response, the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
 In step S12, the verification unit 232 of the server 12 requests collation of the declaration data. For example, the verification unit 232 extracts information about the user corresponding to the registration ID included in the declaration data from the insurance DB 204. When the verification unit 232 determines, based on the extracted information, that the user U1 is a legitimate user (for example, a policyholder), it requests the blockchain network 13 to collate the declaration data.
 Specifically, for example, when the declaration data includes position information but does not include metadata, the verification unit 232 requests the blockchain network 13, via the communication unit 207 and the network 21, to collate the position information.
 For example, when the declaration data includes position information and metadata, the verification unit 232 generates a fingerprint of the metadata. The verification unit 232 requests the blockchain network 13, via the communication unit 207 and the network 21, to collate the position information included in the declaration data and the generated fingerprint.
 In step S13, the blockchain network 13 collates the declaration data and transmits the collation result.
 For example, when collation of position information is requested, the blockchain network 13 searches the blockchain for a PoL block containing position information that matches that position information. That is, a PoL block containing the position information included in the declaration data and the location proof information for that position information is searched for.
 For example, when collation of position information and a metadata fingerprint is requested, the blockchain network 13 searches the blockchain for a PoL block containing position information and a fingerprint that match that position information and fingerprint. That is, a PoL block containing the position information and the metadata fingerprint included in the declaration data, as well as the location proof for that position information, is searched for.
 When the blockchain network 13 detects a matching PoL block, it transmits the detected PoL block to the server 12 via the network 21.
 In response, the CPU 201 of the server 12 receives the PoL block via the network 21 and the communication unit 207.
 When the blockchain network 13 cannot detect a matching PoL block, it notifies the server 12 via the network 21 that no matching PoL block exists.
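 A sketch of how a node could search the chain for a matching PoL block, assuming the in-memory structures built in the earlier sketches; matching on exact equality of latitude, longitude, timestamp, and (optionally) the fingerprint follows the description above.

def find_pol_block(chain, latitude, longitude, timestamp, fingerprint=None):
    # `chain` is assumed to be a list of PoL blocks as sketched above.
    for block in chain:
        for tx in block["transactions"]:
            for response in tx["data"]:
                request = response["pol_request_signed"]
                if (request["latitude"] == latitude and
                        request["longitude"] == longitude and
                        request["timestamp"] == timestamp and
                        (fingerprint is None or
                         request["metadata_fingerprint"] == fingerprint)):
                    return block
    return None   # no matching PoL block exists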
 In step S14, the server 12 verifies the declaration data and executes various services. For example, when the verification unit 232 of the server 12 receives a PoL block from the blockchain network 13, it verifies the PoL response included in the PoL block by processing similar to that of FIG. 12 described later. When the verification unit 232 determines as a result of the verification that the PoL response is valid, it verifies the PoL request included in the PoL response by processing similar to that of FIG. 11 described later.
 When the verification unit 232 determines that the PoL request is valid, it extracts the Prover's position information (latitude, longitude, timestamp) from the PoL request. When the Prover's position information included in the declaration data matches the Prover's position information extracted from the PoL request, the verification unit 232 extracts the Witness's position information (latitude, longitude, timestamp) from the PoL response. Then, when the differences in position and time between the Prover's position information and the Witness's position information are within predetermined ranges, the verification unit 232 determines that the position information of the declaration data is authentic.
 Furthermore, when the verification unit 232 determines that the PoL request is valid and the declaration data includes metadata, it extracts the metadata fingerprint (metadata_fingerprint) from the PoL request. When the fingerprint generated from the metadata included in the declaration data matches the fingerprint extracted from the PoL request, the verification unit 232 determines that the metadata of the declaration data is authentic.
 When the declaration data does not include metadata, the verification unit 232 determines that the declaration data is authentic if the position information of the declaration data is authentic. On the other hand, when the declaration data does not include metadata and the position information of the declaration data is not authentic, the verification unit 232 determines that the declaration data is not authentic.
 When the declaration data includes metadata, the verification unit 232 determines that the declaration data is authentic if both the position information and the metadata of the declaration data are authentic. On the other hand, when the declaration data includes metadata and at least one of the position information and the metadata of the declaration data is not authentic, the verification unit 232 determines that the declaration data is not authentic.
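 The position and time tolerances used in this decision are not quantified in the description. The sketch below assumes a haversine distance check with illustrative thresholds and SHA-256 fingerprints, and takes the declaration as a dictionary with a "position" entry and optional raw "metadata" bytes; all of these names and values are assumptions.

import hashlib, math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two latitude/longitude points.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def declaration_is_authentic(declaration, pol_response,
                             max_distance_m=100.0, max_dt_s=60.0):
    request = pol_response["pol_request_signed"]
    declared = declaration["position"]   # latitude, longitude, timestamp
    # The declared Prover position must match the one recorded in the PoL request.
    if any(declared[k] != request[k] for k in ("latitude", "longitude", "timestamp")):
        return False
    # The Witness position must be close to the Prover position in space and time.
    if haversine_m(request["latitude"], request["longitude"],
                   pol_response["latitude"], pol_response["longitude"]) > max_distance_m:
        return False
    if abs(request["timestamp"] - pol_response["timestamp"]) > max_dt_s:
        return False
    # If metadata was declared, its fingerprint must match the recorded one.
    if declaration.get("metadata") is not None:
        fingerprint = hashlib.sha256(declaration["metadata"]).hexdigest()
        return fingerprint == request["metadata_fingerprint"]
    return True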
 When the verification unit 232 determines that the declaration data is authentic, it supplies the declaration data to the insurance processing unit 234.
 The insurance processing unit 234 executes, based on the declaration data, processing related to the various insurance services provided by the server 12. Specific examples of the processing will be described later.
 Next, the processing of steps S2 to S14 in FIG. 5 will be supplemented with reference to FIGS. 10 to 13.
 First, the processing of steps S2 to S10 in FIG. 5 will be supplemented with reference to the sequence diagram of FIG. 10.
 In step S31, the Prover executes the process of step S2 in FIG. 5 described above to generate position information.
 In step S32, the location certification acquisition unit 151 of the Prover scans for surrounding Witnesses. That is, the location certification acquisition unit 151 scans for Witnesses (other information processing terminals 11) existing within a predetermined range around the Prover. For example, the range in which the communication unit 110 can communicate by BT is set as the range for scanning for Witnesses. The location certification acquisition unit 151 also confirms whether a Witness detected by the scan can communicate with the Prover by BT.
 In step S33, the Prover generates a PoL request and transmits it. That is, the Prover executes the processes of steps S3 to S5 in FIG. 5 described above, generates the PoL request, and transmits it by BT via the communication unit 110 to the Witness detected in the process of step S32.
 In response, the CPU 101 of the Witness receives the PoL request via the communication unit 110.
 In step S34, the location certification unit 161 of the Witness verifies the PoL request.
 In step S35, when the location certification unit 161 of the Witness determines that the PoL request is valid, it generates a PoL response and transmits it.
 Here, the details of the processing of steps S34 and S35 (the PoL request verification processing) will be described with reference to the flowchart of FIG. 11.
 In step S51, the location certification unit 161 determines whether the format of the PoL request is normal. When the received PoL request conforms to the format of FIG. 6, the location certification unit 161 determines that the format of the PoL request is normal, and the process proceeds to step S52.
 In step S52, the location certification unit 161 determines whether the PoL request is authentic. The location certification unit 161 calculates a hash value of the plaintext including prover_address, latitude, longitude, timestamp, and metadata_fingerprint of the PoL request. The location certification unit 161 also obtains a hash value by decrypting the signature of the PoL request using prover_address of the PoL request. When the hash value calculated from the plaintext of the PoL request matches the hash value decrypted from the signature of the PoL request, the location certification unit 161 determines that the PoL request is authentic, and the process proceeds to step S53.
 In step S53, the location certification unit 161 determines whether the Prover exists within a predetermined distance range. For example, when the distance between the position of the Prover indicated by the position information included in the PoL request and the current position of the Witness is equal to or less than a predetermined threshold, the location certification unit 161 determines that the Prover exists within the predetermined distance range, and the process proceeds to step S54.
 In step S54, the location certification unit 161 transmits an OK message to the Prover by BT via the communication unit 110.
 In step S55, the Witness generates a PoL response and transmits it. That is, the Witness executes the processes of steps S7 and S8 in FIG. 5 described above, generates the PoL response to the PoL request, and transmits it to the Prover by BT via the communication unit 110.
 In response, the CPU 101 of the Prover receives the PoL response via the communication unit 110.
 After that, the PoL request verification processing ends.
 On the other hand, in step S53, for example, when the distance between the position of the Prover indicated by the position information included in the PoL request and the current position of the Witness exceeds the predetermined threshold, the location certification unit 161 determines that the Prover does not exist within the predetermined distance range, and the process proceeds to step S56.
 Also, in step S52, when the hash value calculated from the plaintext of the PoL request does not match the hash value decrypted from the signature of the PoL request, the location certification unit 161 determines that the PoL request is not authentic, and the process proceeds to step S56.
 Furthermore, in step S51, when the received PoL request does not conform to the format of FIG. 6, the location certification unit 161 determines that the format of the PoL request is not normal, and the process proceeds to step S56.
 In step S56, the location certification unit 161 transmits an NG message to the Prover by BT via the communication unit 110.
 After that, the PoL request verification processing ends.
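 The authenticity check of step S52 could be realized as below, continuing the JSON/ECDSA assumptions of the earlier sketches (the public key is the compressed P-256 point carried in prover_address). Verifying an ECDSA signature implicitly performs the hash comparison described above; the analogous check for a PoL response in step S72 uses witness_address instead.

import json
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

def pol_request_is_authentic(pol_request):
    # Re-serialize every field except the signature itself.
    body = {k: v for k, v in pol_request.items() if k != "signature"}
    plaintext = json.dumps(body, sort_keys=True).encode()
    public_key = ec.EllipticCurvePublicKey.from_encoded_point(
        ec.SECP256R1(), bytes.fromhex(pol_request["prover_address"]))
    try:
        public_key.verify(bytes.fromhex(pol_request["signature"]),
                          plaintext, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False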
 Returning to FIG. 10, in step S36, the location certification acquisition unit 151 of the Prover verifies the PoL response.
 In step S37, the location registration unit 152 of the Prover generates a PoL transaction and broadcasts it.
 Here, the details of the processing of steps S36 and S37 (the PoL response verification processing) will be described with reference to the flowchart of FIG. 12.
 In step S71, the location certification acquisition unit 151 determines whether the format of the PoL response is normal. When the received PoL response conforms to the format of FIG. 7, the location certification acquisition unit 151 determines that the format of the PoL response is normal, and the process proceeds to step S72.
 In step S72, the location certification acquisition unit 151 determines whether the PoL response is authentic. The location certification acquisition unit 151 calculates a hash value of the plaintext including pol_request_signed, witness_address, latitude, longitude, timestamp, and metadata_fingerprint of the PoL response. The location certification acquisition unit 151 also obtains a hash value by decrypting the signature of the PoL response using witness_address of the PoL response. When the hash value calculated from the plaintext of the PoL response matches the hash value decrypted from the signature of the PoL response, the location certification acquisition unit 151 determines that the PoL response is authentic, and the process proceeds to step S73.
 In step S73, the location certification acquisition unit 151 determines whether the Witness exists within a predetermined distance range. For example, when the distance between the position of the Witness indicated by the position information included in the PoL response and the current position of the Prover is equal to or less than a predetermined threshold, the location certification acquisition unit 151 determines that the Witness exists within the predetermined distance range, and the process proceeds to step S74.
 In step S74, the Prover executes the process of step S9 in FIG. 5 described above, generates the PoL transaction, and broadcasts it.
 In response, each node of the blockchain network 13 receives the PoL transaction via the network 21.
 After that, the PoL response verification processing ends.
 On the other hand, in step S73, for example, when the distance between the position of the Witness indicated by the position information included in the PoL response and the current position of the Prover exceeds the predetermined threshold, the location certification acquisition unit 151 determines that the Witness does not exist within the predetermined distance range, and the process proceeds to step S75.
 Also, in step S72, when the hash value calculated from the plaintext of the PoL response does not match the hash value decrypted from the signature of the PoL response, the location certification acquisition unit 151 determines that the PoL response is not authentic, and the process proceeds to step S75.
 In step S71, when the received PoL response does not conform to the format of FIG. 7, the location certification acquisition unit 151 determines that the format of the PoL response is not normal, and the process proceeds to step S75.
 In step S75, the location certification acquisition unit 151 discards the PoL response.
 After that, the PoL response verification processing ends, and the processing of the Prover ends. That is, the processing of the Prover ends without the PoL transaction being transmitted to the blockchain network 13.
 Returning to FIG. 10, in step S38, a miner of the blockchain network 13 performs mining and generates a PoL block.
 In step S39, the blockchain network 13 verifies the generated PoL block and adds it to the blockchain. For example, when the miner of the blockchain network 13 determines that the PoL block is valid by verifying the validity of the position information and the location proof information included in the generated PoL block, it transmits the PoL block to the other nodes in the blockchain network 13. Each node that receives the PoL block adds the received PoL block to the blockchain.
 Next, the processing of steps S12 to S14 in FIG. 5 will be supplemented with reference to the sequence diagram of FIG. 13.
 In step S101, the Prover executes the process of step S11 in FIG. 5 described above, generates declaration data, and transmits it to the server 12 via the network 21.
 In response, the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
 In step S102, the verification unit 232 of the server 12 generates a fingerprint of the metadata as necessary. For example, when the declaration data includes the metadata corresponding to metadata_fingerprint of the PoL request in FIG. 6, the verification unit 232 generates the fingerprint of that metadata.
 In step S103, the verification unit 232 of the server 12 queries the PoL records based on the declaration data.
 For example, when the declaration data includes position information but does not include metadata, the verification unit 232 generates a query requesting extraction of a PoL block containing position information that matches that position information.
 For example, when the declaration data includes position information and metadata, the verification unit 232 generates a fingerprint of the metadata. The verification unit 232 generates a query requesting extraction of a PoL block containing position information and a fingerprint that match the position information included in the declaration data and the generated fingerprint.
 The verification unit 232 transmits the generated query to the blockchain network 13 via the communication unit 207 and the network 21.
 In response, (each node of) the blockchain network 13 receives the query via the network 21.
 In step S104, the blockchain network 13 extracts PoL records based on the query and transmits them. Specifically, (a node of) the blockchain network 13 extracts, from the PoL blocks included in the blockchain, the PoL blocks that meet the conditions indicated by the query.
 As a result, for example, when the declaration data includes position information but does not include metadata, a PoL block containing position information that matches that position information is extracted. For example, when the declaration data includes position information and metadata, a PoL block containing position information and a fingerprint that match that position information and the fingerprint of that metadata is extracted.
 The blockchain network 13 transmits the extracted PoL block to the server 12 via the network 21.
 In response, the CPU 201 of the server 12 receives the extracted PoL block via the network 21 and the communication unit 207.
 Note that when there is no PoL block that meets the conditions indicated by the query, the blockchain network 13 notifies the server 12 via the network 21 that no matching PoL block exists.
 In step S105, the server 12 verifies the declaration data and executes various services. That is, the server 12 executes the process of step S14 in FIG. 5 described above, and when it determines as a result of verifying the declaration data that the declaration data is authentic, it executes processing related to the various insurance services provided by the server 12 based on the declaration data.
 As described above, the Prover can reliably guarantee the authenticity of position information and metadata without using a dedicated device. That is, the Witnesses existing around the Prover transmit the Witness's position information to the Prover as location proof information in association with the Prover's position information. The Prover can then use the location proof information to reliably guarantee the authenticity of the Prover's position information and of the metadata associated with that position information.
 In addition, the server 12 can easily confirm the authenticity of the declaration data, and when it determines that the declaration data is authentic, it can provide appropriate insurance services based on the declaration data.
 Furthermore, since the position information and location proof information of each information processing terminal 11 are recorded and managed in the blockchain, the management cost of personal information and the operating cost of the server 12 can be reduced.
 Next, a more specific example of the processing of the information processing system 1 will be described.
 <Process of collecting eyewitness information about accidents>
 The insurance payout made when a user is involved in an accident, or the payout covering the compensation the user pays to a victim when the user causes an accident, is calculated based on, among other factors, the percentage of fault of the parties to the accident (the victim and the perpetrator). Eyewitness information from people other than the parties involved is an important basis for determining the percentage of fault.
 In this regard, the present technology can be applied to the process of collecting eyewitness information about accidents.
 Here, the processing in the case where the present technology is applied to the process of collecting eyewitness information about an accident will be described with reference to FIGS. 14 to 18.
 First, the overall flow of the processing will be described with reference to FIG. 14.
 In the following, the user U1 is a party to an accident (the perpetrator or the victim), and the user U2 is a witness who was around the accident site when the accident occurred. The information processing terminal 11-1 carried by the user U1 is the Prover, and the information processing terminal 11-2 carried by the user U2 is the Witness.
 In step S201, as in the process of step S1 in FIG. 5, the Prover activates APP1 and the Witness activates APP2.
 In step S202, the Prover detects an accident and generates position information. For example, when the accident detection unit 133 detects an accident in which the user U1 is a party, based on at least one of the image data output from the imaging unit 107 and the sensor data output from the sensing unit 109, it notifies the position detection unit 132 of the occurrence of the accident.
 Note that the type and number of data used for detecting the accident (hereinafter referred to as accident data) are not particularly limited. For example, image data, impact data from an impact sensor, and the like are used.
 Triggered by the detection of the accident, the position detection unit 132 detects the current position of the Prover based on the position detection data output from the GNSS receiver 108. The position detection unit 132 generates position information including the current position and current time of the Prover. The position detection unit 132 acquires the accident data obtained before and after the accident from the accident detection unit 133. The position detection unit 132 supplies the position information and the accident data to the location certification acquisition unit 151. The position detection unit 132 associates the position information and the accident data with each other and stores them in the storage 103.
 In step S203, the location certification acquisition unit 151 of the Prover generates a fingerprint of the accident data.
 In step S204, as in the process of step S4 in FIG. 5, the Prover generates a PoL request. At this time, the fingerprint of the accident data is stored in the PoL request as metadata_fingerprint.
 In steps S205 to S210, processing similar to that of steps S5 to S10 in FIG. 5 described above is executed. As a result, a PoL response corresponding to the PoL request is generated by the Witnesses around the Prover, and a PoL block containing the PoL response is added to the blockchain. That is, the position information of the Prover at the time of the accident, the location proof information provided by the Witness, and the fingerprint of the accident data are recorded in the blockchain.
 In step S211, the declaration unit 135 of the Prover generates declaration data and transmits it. For example, the declaration unit 135 acquires, from the storage 103, the position information generated by the Prover when the accident occurred and the accident data associated with that position information. The declaration unit 135 generates declaration data including the acquired position information and accident data, the registration ID of the user U1, and the public key of the Prover. The declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21.
 In response, the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
 In steps S212 and S213, processing similar to that of steps S12 and S13 in FIG. 5 described above is executed. As a result, a PoL block containing position information and a fingerprint that match the position information and the fingerprint of the accident data included in the declaration data is extracted from the blockchain and transmitted to the server 12.
 In step S214, the information collection unit 233 of the server 12 estimates witnesses. Specifically, the information collection unit 233 identifies the Witness (for example, the information processing terminal 11-2) that generated the PoL response included in the PoL block received from the blockchain network 13. The information collection unit 233 then estimates the user of the identified Witness (for example, the user U2) to be a witness.
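 A sketch of this witness estimation: the server collects the witness_address values from the PoL responses recorded in the matching PoL block and treats the users of the corresponding terminals as candidate witnesses. The block layout follows the earlier sketches and is an assumption.

def estimate_witnesses(pol_block):
    # Collect the public keys of all Witnesses that certified a position
    # in this block; each address identifies one candidate eyewitness terminal.
    addresses = set()
    for tx in pol_block["transactions"]:
        for response in tx["data"]:
            addresses.add(response["witness_address"])
    return addresses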
 In step S215, the information collection unit 233 of the server 12 generates a witness questionnaire and broadcasts it. Specifically, the information collection unit 233 generates, based on the PoL response of each Witness included in the PoL block received from the blockchain network 13, a witness questionnaire for collecting eyewitness information from the witnesses. The witness questionnaire includes the public key (witness_address) and position information (latitude, longitude, timestamp) included in the PoL response of the Witness, as well as information indicating the contents of the questionnaire.
 The information collection unit 233 broadcasts the witness questionnaire, via the communication unit 207 and the network 21, to the information processing terminals 11 (Witnesses) of the witnesses estimated in the process of step S214.
 In response, the CPU 101 of the Witness receives the witness questionnaire via the network 21 and the communication unit 110.
 In step S216, the information providing unit 162 of the Witness generates an answer to the witness questionnaire and transmits it. Specifically, when the public key (witness_address) included in the witness questionnaire matches the public key of the Witness, the information providing unit 162 displays, for example, the screen of FIG. 15 on the display unit 105 based on the witness questionnaire.
 On the screen of FIG. 15, a map 301 showing the site of the accident is displayed in the background. An image 302 of the vicinity of the accident site is displayed on the map 301. A window 303 containing a message to the user (for example, the user U2) and information about the accident is displayed on the map 301.
 The window 303 displays a message stating that witnesses are being sought, as well as the time of occurrence and an outline of the accident. A "Yes" button and a "No" button are also displayed, together with a message asking whether the user witnessed the accident. When the "Yes" button is pressed, the screen of FIG. 16 is displayed on the display unit 105. On the other hand, when the "No" button is pressed, the display of the witness questionnaire ends.
 The screen of FIG. 16 differs from the screen of FIG. 15 in that a window 304 is displayed instead of the window 303.
 The window 304 displays conditions regarding the shooting location and shooting time of the images (eyewitness information) requested to be transmitted. For example, a message requesting the provision of images (photographs) taken near the accident site during a predetermined time period around the occurrence of the accident (in this example, after the occurrence of the accident) is displayed. A "+" button, a "Next" button, and a "Back" button are also displayed.
 When the "+" button is pressed, a window (not shown) for selecting the images (photographs) to be provided is displayed, enabling image selection. When the "Next" button is selected after the images are selected, the screen of FIG. 17 is displayed on the display unit 105. On the other hand, when the "Back" button is selected, the screen of FIG. 15 is displayed on the display unit 105.
 The screen of FIG. 17 differs from the screen of FIG. 16 in that a window 305 is displayed instead of the window 304.
 The window 305 displays a message expressing gratitude for providing information, a "Submit" button, and a "Cancel" button. When the "Submit" button is pressed, the image data corresponding to the images selected on the screen of FIG. 16 is transmitted to the server 12. This image data corresponds to images captured near the site and around the time of the accident involving the user U1 (near the position and time indicated by the position information of the Prover at the time of the accident).
 具体的には、情報提供部162は、選択した画像データを含む目撃情報を生成する。なお、目撃情報には、画像データ以外の情報、例えば、目撃者の証言を示すテキストデータ等が含まれてもよい。情報提供部162は、witness_address(公開鍵)に対応する秘密鍵を用いて、目撃情報の電子署名を生成する。情報提供部162は、目撃情報及び電子署名を含む目撃者アンケートの回答を生成し、通信部110及びネットワーク21を介して、サーバ12に送信する。 Specifically, the information providing unit 162 generates eyewitness information including the selected image data. Note that eyewitness information may include information other than image data, such as text data indicating testimony of eyewitnesses. The information providing unit 162 generates an electronic signature of the sighting information using the secret key corresponding to the witness_address (public key). The information providing unit 162 generates answers to the eyewitness questionnaire including eyewitness information and electronic signatures, and transmits them to the server 12 via the communication unit 110 and the network 21 .
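 For illustration only, the sketch below shows one way the signing step described above could look in code. It is not part of the disclosure: the signature scheme (ECDSA over secp256k1 via the Python `cryptography` package), the JSON serialization, and all field names are assumptions.

```python
# Illustrative sketch of how a Witness might sign eyewitness information before
# returning it in the questionnaire answer. Algorithm and field names are assumed.
import json
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Key pair held by the Witness; the public key corresponds to witness_address.
witness_private_key = ec.generate_private_key(ec.SECP256K1())
witness_public_key = witness_private_key.public_key()

# Eyewitness information: selected image data plus optional testimony text.
sighting_info = {
    "image_data": "…base64-encoded photo bytes…",   # placeholder
    "testimony": "A bicycle entered the crossing from the north side.",
}
payload = json.dumps(sighting_info, sort_keys=True).encode("utf-8")

# Sign the eyewitness information with the private key paired with witness_address.
signature = witness_private_key.sign(payload, ec.ECDSA(hashes.SHA256()))

# The questionnaire answer returned to the server bundles the data and the signature.
questionnaire_answer = {
    "sighting_info": sighting_info,
    "signature": signature.hex(),
}
```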
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、目撃者アンケートの回答を受信する。 In response, the CPU 201 of the server 12 receives the responses to the eyewitness questionnaire via the network 21 and the communication unit 207.
 一方、ウインドウ305内の「キャンセル」ボタンが選択された場合、目撃者アンケートの回答の送信はキャンセルされ、図16の画面が表示される。 On the other hand, when the "Cancel" button in the window 305 is selected, the transmission of the answer to the eyewitness questionnaire is canceled and the screen of FIG. 16 is displayed.
 次に、図18のシーケンス図を参照して、図14のステップS211乃至ステップS216の処理及びそれ以降の処理について補足する。 Next, with reference to the sequence diagram of FIG. 18, the processing from step S211 to step S216 of FIG. 14 and the subsequent processing will be supplemented.
 ステップS231において、Proverは、上述した図14のステップS211の処理を実行し、申告データを生成し、サーバ12に送信する。 In step S231, the Prover executes the process of step S211 in FIG. 14 described above, generates the declaration data, and transmits it to the server 12.
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、申告データを受信する。 In response, the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
 ステップS232において、サーバ12の検証部232は、申告データに含まれる事故データのフィンガープリントを生成する。 At step S232, the verification unit 232 of the server 12 generates a fingerprint of the accident data included in the report data.
 ステップS233において、サーバ12の検証部232は、申告データに基づいて、PoLの記録のクエリを行う。具体的には、検証部232は、申告データに含まれる位置情報、及び、生成したフィンガープリントと一致する位置情報及びフィンガープリントを含むPoLブロックの抽出を要求するクエリを生成する。検証部232は、通信部207及びネットワーク21を介して、生成したクエリをブロックチェーンネットワーク13に送信する。 In step S233, the verification unit 232 of the server 12 queries PoL records based on the declaration data. Specifically, the verification unit 232 generates a query requesting extraction of PoL blocks containing position information and fingerprints that match the location information included in the declaration data and the generated fingerprints. The verification unit 232 transmits the generated query to the blockchain network 13 via the communication unit 207 and network 21 .
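 The query described above could be sketched as a simple filter over PoL records, as below. The block layout, the in-memory chain, and the use of SHA-256 as the fingerprint are illustrative assumptions; the disclosure does not specify a concrete blockchain API.

```python
# Minimal sketch: extract PoL blocks whose position information and metadata
# fingerprint match the declaration data received from the Prover.
import hashlib

def fingerprint(data: bytes) -> str:
    """Fingerprint as assumed here: a SHA-256 digest of the accident data."""
    return hashlib.sha256(data).hexdigest()

def query_pol_blocks(chain, declared_position, declared_time, accident_data: bytes):
    """Return PoL blocks whose position, time, and fingerprint match the declaration."""
    target = fingerprint(accident_data)
    return [
        block for block in chain
        if block["position"] == declared_position
        and block["timestamp"] == declared_time
        and block["metadata_fingerprint"] == target
    ]

# Example chain containing one matching block.
chain = [{
    "position": (35.6586, 139.7454),
    "timestamp": "2022-02-07T09:15:00Z",
    "metadata_fingerprint": fingerprint(b"accident data"),
    "pol_response": {"witness_address": "04ab…"},
}]
print(query_pol_blocks(chain, (35.6586, 139.7454), "2022-02-07T09:15:00Z", b"accident data"))
```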
 これに対して、ブロックチェーンネットワーク13は、ネットワーク21を介して、クエリを受信する。 In response, the blockchain network 13 receives the query via the network 21.
 ステップS234において、ブロックチェーンネットワーク13は、上述した図13のステップS103の処理と同様に、クエリに基づいてPoLの記録を抽出し、送信する。これにより、クエリにより示される条件に適合するPoLブロックがブロックチェーンから抽出され、サーバ12に送信される。 In step S234, the blockchain network 13 extracts and transmits the PoL records based on the query, in the same manner as the process of step S103 in FIG. 13 described above. PoL blocks that meet the conditions indicated by the query are thereby extracted from the blockchain and transmitted to the server 12.
 ステップS235において、サーバ12の検証部232は、申告データを検証し、目撃者を推定する。具体的には、検証部232は、上述した図13のステップS105と同様の処理により、申告データの検証を行う。検証部232は、申告データが真正であると判定した場合、申告データ、及び、ブロックチェーンネットワーク13から受信したPoLブロックを情報収集部233に供給する。 In step S235, the verification unit 232 of the server 12 verifies the report data and estimates the witness. Specifically, the verification unit 232 verifies the declaration data by the same process as in step S105 of FIG. 13 described above. When the verification unit 232 determines that the declaration data is authentic, the verification unit 232 supplies the declaration data and the PoL block received from the blockchain network 13 to the information collection unit 233 .
 情報収集部233は、PoLブロックに含まれるPoLレスポンスを生成したWitnessを特定する。また、情報収集部233は、特定したWitnessのユーザを目撃者として推定する。 The information collection unit 233 identifies the witness that generated the PoL response included in the PoL block. The information collecting unit 233 also presumes that the specified Witness user is a witness.
 ステップS236において、サーバ12は、上述した図14のステップS215の処理を実行し、目撃者アンケートを生成し、ネットワーク21を介して、Witnessにブロードキャストする。 In step S236, the server 12 executes the process of step S215 in FIG. 14 described above, generates a witness questionnaire, and broadcasts it to Witness via the network 21.
 これに対して、WitnessのCPU101は、ネットワーク21及び通信部110を介して、目撃者アンケートを受信する。 On the other hand, the Witness CPU 101 receives the eyewitness questionnaire via the network 21 and the communication unit 110 .
 ステップS237において、Witnessの情報提供部162は、目撃者アンケートの公開鍵を確認する。すなわち、情報提供部162は、目撃者アンケートに含まれる公開鍵が、Witnessの公開鍵(witness_address)と一致するか否かを確認する In step S237, the Witness information providing unit 162 confirms the public key of the witness questionnaire. That is, the information providing unit 162 confirms whether or not the public key included in the eyewitness questionnaire matches the public key (witness_address) of the Witness.
 ステップS238において、Witnessは、目撃者アンケートの公開鍵がWitnessの公開鍵と一致する場合、上述した図14のステップS216の処理を実行し、目撃者アンケートの回答を生成し、サーバ12に送信する。 In step S238, when the public key of the eyewitness questionnaire matches the public key of the Witness, the Witness executes the process of step S216 in FIG. 14 described above, generates the answer to the eyewitness questionnaire, and transmits it to the server 12.
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、目撃者アンケートの回答を受信する。 In response, the CPU 201 of the server 12 receives the responses to the eyewitness questionnaire via the network 21 and the communication unit 207.
 ステップS239において、サーバ12の情報収集部233は、目撃者アンケートの回答を検証する。具体的には、情報収集部233は、目撃者アンケートの回答に含まれる目撃情報のハッシュ値を計算する。また、情報収集部233は、目撃者アンケートの回答に含まれる電子署名を、Witnessの公開鍵(witness_address)を用いて復号することにより、ハッシュ値を復号する。情報収集部233は、目撃情報から計算したハッシュ値と電子署名から復号したハッシュ値とが一致する場合、目撃者アンケートの回答が真正であると判定する。一方、情報収集部233は、目撃情報から計算したハッシュ値と電子署名から復号したハッシュ値とが一致しない場合、目撃者アンケートの回答が真正でないと判定する。 In step S239, the information collection unit 233 of the server 12 verifies the answers to the eyewitness questionnaire. Specifically, the information collecting unit 233 calculates a hash value of the eyewitness information included in the answers to the eyewitness questionnaire. The information collecting unit 233 also decrypts the hash value by decrypting the electronic signature included in the answer to the eyewitness questionnaire using the Witness's public key (witness_address). If the hash value calculated from the eyewitness information matches the hash value decoded from the electronic signature, the information collecting unit 233 determines that the answer to the eyewitness questionnaire is authentic. On the other hand, if the hash value calculated from the eyewitness information does not match the hash value decoded from the electronic signature, the information collecting unit 233 determines that the answer to the eyewitness questionnaire is not authentic.
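 Continuing the ECDSA assumption from the signing sketch above, the server-side check could look as follows. The disclosure describes the check as comparing a hash computed from the eyewitness information with the hash recovered from the signature; with ECDSA the equivalent comparison is performed by `verify()`, which raises `InvalidSignature` on a mismatch.

```python
# Sketch of verifying a questionnaire answer against the Witness's public key
# (witness_address). Serialization and field names are assumptions.
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

def answer_is_authentic(sighting_info: dict, signature_hex: str,
                        witness_public_key: ec.EllipticCurvePublicKey) -> bool:
    payload = json.dumps(sighting_info, sort_keys=True).encode("utf-8")
    try:
        witness_public_key.verify(bytes.fromhex(signature_hex), payload,
                                  ec.ECDSA(hashes.SHA256()))
        return True          # hashes match: the answer is judged authentic
    except InvalidSignature:
        return False         # hashes differ: the answer is judged not authentic
```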
 ステップS240において、サーバ12は、礼金の送金処理を実行する。具体的には、情報収集部233は、目撃者アンケートの回答が真正であると判定した場合、目撃者アンケートの回答に含まれる目撃情報を保険処理部234に供給する。 In step S240, the server 12 executes a process of remitting a reward. Specifically, when the information collecting unit 233 determines that the answer to the eyewitness questionnaire is authentic, it supplies the eyewitness information included in the answer to the insurance processing unit 234.
 計算部241は、例えば、目撃情報の内容に基づいて、目撃者に支払う礼金を計算し、計算結果を実行部242に通知する。実行部242は、通信部207及びネットワーク21を介して、Witnessと通信を行い、礼金の送金処理を行う。 For example, the calculation unit 241 calculates the reward to be paid to the eyewitness based on the content of the eyewitness information and notifies the execution unit 242 of the calculation result. The execution unit 242 communicates with the Witness via the communication unit 207 and the network 21 and carries out the reward remittance process.
 ステップS241において、サーバ12は、保険金に関する情報を送信する。具体的には、計算部241は、例えば、申告データ及び目撃情報等に基づいて、ユーザU1に支払う保険金を計算し、計算結果を実行部242に通知する。実行部242は、保険金の計算結果を含む情報を生成し、通信部207及びネットワーク21を介して、Proverに送信する。 In step S241, the server 12 transmits information regarding insurance claims. Specifically, the calculation unit 241 calculates the insurance money to be paid to the user U1, for example, based on the report data and eyewitness information, and notifies the execution unit 242 of the calculation result. The execution unit 242 generates information including calculation results of the insurance money, and transmits the information to the Prover via the communication unit 207 and the network 21 .
 これに対して、ProverのCPU101は、ネットワーク21及び通信部110を介して、保険金に関する情報を受信する。 In response, the Prover's CPU 101 receives information on the insurance money via the network 21 and the communication unit 110 .
 以上のようにして、多くの目撃者から事故の目撃情報を迅速に収集することができる。また、目撃情報の真正性を確実に保証することができる。 As described above, accident eyewitness information can be collected quickly from many eyewitnesses. In addition, the authenticity of the eyewitness information can be reliably guaranteed.
  <リスク細分型保険への適用例> <Example of application to risk segmented insurance>
 本技術は、例えば、リスク細分型の保険の保険料を計算したり、特典を付与したりする場合に適用することができる。 This technology can be applied, for example, when calculating the premium of risk segmented insurance or when granting benefits.
 具体的には、従来、リスク細分型の自動車保険が普及している。リスク細分型の自動車保険では、例えば、年間の走行距離等に基づいて保険料が設定される。 Specifically, risk segmented automobile insurance is already in widespread use. In risk segmented automobile insurance, the premium is set based on, for example, the annual mileage.
 同様に、例えば、今後、自転車や歩行者に対するリスク細分型の保険が普及することが想定される。例えば、契約期間中のユーザ(契約者)の移動ルートや移動距離に基づいてリスクが見積もられる。そして、見積もられたリスクに基づいて、次の契約期間の保険料が設定されたり、現在の契約期間に対するキャッシュバック等の特典が付与されたりすることが想定される。 Similarly, for example, it is expected that risk segmented insurance for bicycles and pedestrians will spread in the future. For example, the risk is estimated based on the moving route and moving distance of the user (contractor) during the contract period. Then, based on the estimated risk, it is assumed that the insurance premium for the next contract period is set, or benefits such as cash back for the current contract period are given.
 これに対して、例えば、本技術を用いて、ユーザの移動ルートや移動距離を表す移動データを収集し、移動データに基づいてリスク細分型保険の保険料や特典を計算することが考えられる。 On the other hand, for example, it is conceivable to use this technology to collect movement data representing the user's movement route and movement distance, and to calculate insurance premiums and benefits for risk segmented insurance based on the movement data.
 ここで、図19及び図20を参照して、本技術を用いて、ユーザの移動データを収集し、移動データに基づいてリスク細分型保険の保険料又は特典を計算する処理について説明する。 Here, with reference to FIGS. 19 and 20, a process of collecting movement data of the user and calculating insurance premiums or benefits for risk segmented insurance based on the movement data using the present technology will be described.
 まず、図19を参照して、処理の全体的な流れについて説明する。 First, the overall flow of processing will be described with reference to FIG.
 なお、以下、ユーザU1を保険料又は特典の計算対象となるユーザとし、ユーザU2をユーザU1の周囲に存在するユーザとする。以下、ユーザU1が所持する情報処理端末11-1をProverとし、ユーザU2が所持する情報処理端末11-2をWitnessとする。 In the following description, user U1 is assumed to be a user for whom insurance premiums or benefits are calculated, and user U2 is assumed to be a user existing around user U1. Hereinafter, the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover, and the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
 ステップS301において、図5のステップS1の処理と同様に、ProverはAPP1を起動し、WitnessはAPP2を起動する。 In step S301, Prover activates APP1, and Witness activates APP2, similar to the process of step S1 in FIG.
 ステップS302乃至ステップS309において、上述した図5のステップS2及びステップS4乃至ステップS10と同様の処理が、所定の時間間隔(例えば、1分間隔)で定期的に実行される。 In steps S302 to S309, processes similar to steps S2 and steps S4 to S10 in FIG. 5 described above are periodically performed at predetermined time intervals (for example, one minute intervals).
 これにより、Proverは、定期的に現在位置を検出し、位置情報を生成する。また、Proverの周囲のWitnessは、Proverの位置情報を証明するPoLレスポンスを生成する。そして、生成されたPoLレスポンスを含むPoLブロックが、ブロックチェーンに追加される。これにより、Proverの位置情報及びWitnessによる位置証明情報が定期的にブロックチェーンに記録される。 As a result, the Prover periodically detects the current location and generates location information. Witnesses around the Prover also generate a PoL response proving the Prover's location. A PoL block containing the generated PoL response is then added to the blockchain. As a result, Prover's location information and Witness's location proof information are periodically recorded in the blockchain.
 ステップS310において、Proverの申告部135は、申告データを生成し、送信する。例えば、申告部135は、保険の契約期間内に定期的に生成された複数の異なる日時における位置情報をストレージ103から読み出し、読み出した位置情報を時系列に並べた移動データを生成する。申告部135は、移動データ、ユーザU1の登録ID、及び、Proverの公開鍵を含み、保険料又は特典の計算に用いられる申告データを生成する。申告部135は、通信部110及びネットワーク21を介して、サーバ12に送信する。 In step S310, the declaration unit 135 of the Prover generates and transmits declaration data. For example, the declaring unit 135 reads out from the storage 103 position information for a plurality of different dates and times generated periodically within the contract period of the insurance, and generates movement data by arranging the read position information in chronological order. The declaration unit 135 generates declaration data that includes movement data, the registration ID of the user U1, and the public key of the Prover, and is used for calculating insurance premiums or benefits. The reporting unit 135 transmits to the server 12 via the communication unit 110 and the network 21 .
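 As a purely illustrative sketch of the movement-data assembly described above, the periodically stored position records could be read and sorted chronologically as follows; the storage format, field names, and placeholder identifiers are assumptions.

```python
# Sketch of assembling movement data (a chronological series of position records)
# and wrapping it into declaration data. Values and identifiers are placeholders.
from datetime import datetime, timezone

stored_position_records = [
    {"time": datetime(2022, 2, 7, 9, 2, tzinfo=timezone.utc), "lat": 35.6591, "lon": 139.7449},
    {"time": datetime(2022, 2, 7, 9, 0, tzinfo=timezone.utc), "lat": 35.6586, "lon": 139.7454},
    {"time": datetime(2022, 2, 7, 9, 1, tzinfo=timezone.utc), "lat": 35.6589, "lon": 139.7451},
]

# Arrange the periodically generated position records in chronological order.
movement_data = sorted(stored_position_records, key=lambda r: r["time"])

declaration_data = {
    "movement_data": movement_data,
    "registration_id": "user-U1-registration-id",   # placeholder
    "prover_public_key": "04c3…",                    # placeholder
}
```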
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、申告データを受信する。 In response, the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
 ステップS311において、サーバ12の検証部232は、申告データの照合を要求する。具体的には、検証部232は、申告データに含まれる移動データに含まれる各位置情報の照合を、通信部207及びネットワーク21を介して、ブロックチェーンネットワーク13に要求する。 At step S311, the verification unit 232 of the server 12 requests verification of the declaration data. Specifically, the verification unit 232 requests the block chain network 13 via the communication unit 207 and the network 21 to collate each location information included in the movement data included in the declaration data.
 ステップS312において、ブロックチェーンネットワーク13は、申告データの照合を行い、照合結果を送信する。具体的には、ブロックチェーンネットワーク13は、照合が要求された各位置情報と一致する位置情報をそれぞれ含む複数のPoLブロックをブロックチェーンから抽出する。ブロックチェーンネットワーク13は、ネットワーク21を介して、抽出した複数のPoLブロックをサーバ12に送信する。 In step S312, the blockchain network 13 verifies the declaration data and transmits the verification result. Specifically, the blockchain network 13 extracts from the blockchain a plurality of PoL blocks each containing location information that matches each location information requested to be verified. Blockchain network 13 transmits the extracted PoL blocks to server 12 via network 21 .
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、PoLブロックを受信する。 In response, the CPU 201 of the server 12 receives the PoL block via the network 21 and the communication unit 207.
 ステップS313において、サーバ12は、申告データに基づいて、保険料等を計算する。具体的には、サーバ12の検証部232は、上述した図13のステップS105と同様の処理により、申告データの検証を行う。検証部232は、検証の結果、申告データに含まれる移動データ(に含まれる各位置情報)が真正であると判定した場合、移動データを計算部241に供給する。 In step S313, the server 12 calculates insurance premiums and the like based on the declaration data. Specifically, the verification unit 232 of the server 12 verifies the declaration data by the same processing as in step S105 of FIG. 13 described above. As a result of verification, the verification unit 232 supplies the movement data to the calculation unit 241 when it is determined that the movement data (each position information contained therein) included in the declaration data is authentic.
 計算部241は、移動データに基づいて、契約期間中のユーザU1の移動ルート及び移動距離を検出する。計算部241は、ユーザU1の移動ルート及び移動距離に基づいて、契約期間中のユーザU1のリスクを見積もる。 The calculation unit 241 detects the travel route and travel distance of the user U1 during the contract period based on the travel data. The calculator 241 estimates the risk of the user U1 during the contract period based on the travel route and travel distance of the user U1.
 なお、計算部241は、ユーザU1の移動手段に関する情報を取得したり、ユーザU1の移動ルート及び移動速度等に基づいて移動手段を推定したりするようにしてもよい。そして、計算部241は、さらに移動手段も考慮して、契約期間中のユーザU1のリスクを見積もるようにしてもよい。 Note that the calculation unit 241 may acquire information about the means of transportation of the user U1, or estimate the means of transportation based on the route and speed of movement of the user U1. Then, the calculation unit 241 may estimate the risk of the user U1 during the contract period, taking into consideration the means of transportation.
 計算部241は、見積もったリスクに基づいて、次の契約期間の保険料、又は、現在の契約期間に対するキャッシュバック等の特典を計算する。計算部241は、計算した保険料又は特典に関する情報を実行部242に供給する。 The calculation unit 241 calculates benefits such as insurance premiums for the next contract period or cashback for the current contract period based on the estimated risk. The calculation unit 241 supplies information on the calculated insurance premium or benefits to the execution unit 242 .
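 A toy sketch of this calculation step is given below: the travel distance is derived from the verified movement data and mapped to a risk score, a next-period premium, and a cashback amount. The distance formula, the rate table, and the base premium are illustrative assumptions, not values from the disclosure.

```python
# Toy premium/cashback calculation from verified movement data.
import math

def haversine_km(p, q):
    """Approximate great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def estimate_premium(movement_data, base_premium=1000.0):
    points = [(r["lat"], r["lon"]) for r in movement_data]
    distance_km = sum(haversine_km(a, b) for a, b in zip(points, points[1:]))
    risk = min(distance_km / 100.0, 1.0)          # longer travel -> higher assumed risk
    next_premium = base_premium * (0.8 + 0.4 * risk)
    cashback = base_premium * 0.1 * (1.0 - risk)
    return distance_km, next_premium, cashback
```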
 ステップS314において、サーバ12とProverとは、保険料等の清算処理を実行する。例えば、サーバ12の実行部242と、Proverの制御部131とは、通信部207、ネットワーク21、及び、通信部110を介して、通信を行いながら、保険料の請求処理及び保険料の支払い処理、又は、特典の付与処理等を行う。 In step S314, the server 12 and the Prover execute settlement processing for insurance premiums and the like. For example, the execution unit 242 of the server 12 and the control unit 131 of the Prover communicate with each other via the communication unit 207, the network 21, and the communication unit 110 to perform insurance premium claim processing and insurance premium payment processing. Alternatively, a privilege granting process or the like is performed.
 次に、図20のシーケンス図を介して、図19のステップS310乃至ステップS314の処理について補足する。 Next, the processing of steps S310 to S314 in FIG. 19 will be supplemented through the sequence diagram in FIG.
 ステップS331において、Proverは、上述した図19のステップS310の処理を実行し、申告データを生成し、サーバ12に送信する。 In step S331, the Prover executes the process of step S310 in FIG. 19 described above, generates the declaration data, and transmits it to the server 12.
 ステップS332において、サーバ12の検証部232は、申告データに基づいて、PoLの記録のクエリを行う。検証部232は、申告データの移動データに含まれる各位置情報と一致するProverの位置情報をそれぞれ含む複数のPoLブロックの抽出を要求するクエリを生成する。検証部232は、通信部207及びネットワーク21を介して、生成したクエリをブロックチェーンネットワーク13に送信する。 In step S332, the verification unit 232 of the server 12 queries PoL records based on the declaration data. The verification unit 232 generates a query requesting extraction of a plurality of PoL blocks each including the location information of the Prover that matches the location information included in the movement data of the declaration data. The verification unit 232 transmits the generated query to the blockchain network 13 via the communication unit 207 and network 21 .
 これに対して、ブロックチェーンネットワーク13は、ネットワーク21を介して、クエリを受信する。 In response, the blockchain network 13 receives the query via the network 21.
 ステップS333において、ブロックチェーンネットワーク13は、クエリに基づいてPoLの記録を抽出し、送信する。具体的には、ブロックチェーンネットワーク13は、ブロックチェーンに含まれるPoLブロックの中から、クエリにより示される条件に適合するPoLブロックを抽出する。これにより、クエリに示される各位置情報と一致する位置情報をそれぞれ含む複数のPoLブロックが抽出される。 In step S333, the blockchain network 13 extracts and transmits PoL records based on the query. Specifically, the blockchain network 13 extracts PoL blocks that match the conditions indicated by the query from the PoL blocks included in the blockchain. As a result, a plurality of PoL blocks each containing location information that matches each location information indicated in the query are extracted.
 ブロックチェーンネットワーク13は、ネットワーク21を介して、抽出したPoLブロックをサーバ12に送信する。 The blockchain network 13 transmits the extracted PoL blocks to the server 12 via the network 21.
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、抽出されたPoLブロックを受信する。 In response, the CPU 201 of the server 12 receives the extracted PoL block via the network 21 and the communication unit 207.
 ステップS334において、サーバ12は、上述した図19のステップS313の処理を実行し、申告データに基づいて、保険料等を計算する。 In step S334, the server 12 executes the process of step S313 in FIG. 19 described above, and calculates insurance premiums and the like based on the declaration data.
 ステップS335において、サーバ12とProverは、上述した図19のステップS314の処理を実行し、保険料等の清算処理を実行する。 In step S335, the server 12 and the Prover execute the process of step S314 in FIG. 19 described above and carry out the settlement processing for insurance premiums and the like.
 以上のようにして、自転車や歩行者に対するリスク細分型の保険を提供することが可能になる。また、契約期間中のユーザの移動データに基づいて、ユーザ(契約者)のリスクをより正確に見積もることができる。さらに、移動データの真正性を確実に保証することができる。 In this way, it is possible to provide risk segmented insurance for bicycles and pedestrians. In addition, the user's (contractor's) risk can be estimated more accurately based on the user's movement data during the contract period. Furthermore, the authenticity of the transfer data can be reliably guaranteed.
 なお、例えば、計算部241は、ユーザU1の移動ルートにおける状況(例えば、天候、混み具合、過去の事故の発生状況等)や、移動する目的(例えば、仕事、旅行、スポーツ等)にさらに基づいて、リスクを見積もるようにしてもよい。 Note that, for example, the calculation unit 241 may estimate the risk further on the basis of conditions along the travel route of the user U1 (for example, weather, congestion, and past accident occurrence) and the purpose of the travel (for example, work, travel, or sports).
 また、以上の処理により、ユーザU1は、移動ルート及び移動時刻を証明することができるが、証明された移動ルート及び移動時刻を、リスク細分型保険以外の目的に利用することができる。例えば、ユーザU1は、通勤災害の認定を受ける場合に、本技術により証明された移動ルート及び移動時刻を利用することができる。例えば、ユーザU1は、不慮の要因による遅刻を証明する場合に、本技術により証明された移動ルート及び移動時刻を利用することができる。 In addition, through the above processing, user U1 can certify the travel route and travel time, and the certified travel route and travel time can be used for purposes other than risk segmentation insurance. For example, user U1 can use the travel route and travel time certified by the present technology when receiving recognition of a commuting accident. For example, user U1 can use the travel route and travel time certified by the present technology when proving that he was late due to an unforeseen factor.
  <レジャー保険及び損害保険への適用例> <Examples of application to leisure insurance and non-life insurance>
 本技術は、例えば、旅行、スキー、ゴルフ、ハイキング等のレジャー時に発生した怪我、損害、賠償等に対する保険(以下、レジャー保険と称する)の特典を付与する場合に適用することが可能である。 The present technology can be applied, for example, when granting benefits of insurance against injuries, damage, liability, and the like that occur during leisure activities such as travel, skiing, golf, and hiking (hereinafter referred to as leisure insurance).
 具体的には、例えば、レジャー保険において、契約期間中にユーザ(契約者)が訪れた場所に基づいて、契約者に特典を付与するサービスを導入することが想定される。例えば、ゴルフやスキーのレジャー保険の契約者が、契約期間中に訪れたゴルフ場やスキー場の数により、キャッシュバック(お祝い金)や次の契約期間の保険料の割引き等の特典を付与するサービスが想定される。 Specifically, for example, in leisure insurance, it is envisioned that a service will be introduced that gives benefits to policyholders based on the locations visited by the user (policyholder) during the contract period. For example, policyholders of leisure insurance for golf or skiing may receive benefits such as cashback (celebration money) or discounts on insurance premiums for the next contract period, depending on the number of golf courses or ski resorts visited during the contract period. service is assumed.
 この場合、契約者は、実際にその場所を訪れたことを証明する必要がある。これに対して、例えば、本技術を用いて、契約者が訪れた場所で撮影した画像を用いて、契約者が実際にその場所を訪れたことを証明することが可能である。 In this case, the contractor must prove that they actually visited the location. On the other hand, for example, using the present technology, it is possible to prove that the contractor actually visited the place by using an image taken at the place visited by the contractor.
 また、本技術は、例えば、火災や自然災害に対する損害保険の保険金を計算する場合に適用することが可能である。 In addition, this technology can be applied, for example, when calculating insurance claims for property and casualty insurance against fires and natural disasters.
 具体的には、損害保険では、被災状況に基づいて保険金が計算される。その際、例えば、被災場所の画像に基づいて、被災状況を証明することが想定される。これに対して、例えば、本技術を用いて、ユーザ(契約者)が被災場所を撮影した画像を用いて、被災状況を証明することが可能である。 Specifically, in non-life insurance, the insurance money is calculated based on the damage situation. At that time, for example, it is assumed that the disaster situation is proved based on the image of the disaster area. On the other hand, for example, using the present technology, it is possible for a user (contractor) to use an image of a disaster-stricken area to prove the disaster situation.
 ここで、図21を参照して、契約者が撮影した証明用の画像(以下、証明用画像)に基づいて、保険金又は特典の計算を行う場合の処理について説明する。 Here, with reference to FIG. 21, the process of calculating insurance benefits or benefits based on images for certification taken by the policyholder (hereinafter referred to as images for certification) will be described.
 なお、以下、ユーザU1を、証明用画像を撮影し、保険金又は特典の申告を行うユーザとし、ユーザU2を証明用画像の撮影時にユーザU1の周囲に存在するユーザとする。以下、ユーザU1が所持する情報処理端末11-1をProverとし、ユーザU2が所持する情報処理端末11-2をWitnessとする。 It should be noted that hereinafter, the user U1 is assumed to be a user who takes an image for proof and declares insurance money or benefits, and the user U2 is assumed to be a user who exists around the user U1 when the image for proof is taken. Hereinafter, the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover, and the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
 ステップS401において、図5のステップS1の処理と同様に、ProverはAPP1を起動し、WitnessはAPP2を起動する。 In step S401, Prover activates APP1 and Witness activates APP2, as in the process of step S1 in FIG.
 ステップS402において、Proverは、証明用画像を撮影し、位置情報を生成する。 In step S402, the Prover captures a certification image and generates location information.
 例えば、Proverの撮影部107は、操作部104に対するユーザ操作に応じて、証明用画像の撮影を行い、対応する画像データをCPU101に供給する。 For example, the photographing unit 107 of the Prover photographs a certification image in response to a user's operation on the operation unit 104 and supplies the corresponding image data to the CPU 101 .
 位置検出部132は、GNSS受信機108から出力される位置検出データに基づいて、証明用画像の撮影時のProverの現在位置を検出する。位置検出部132は、証明用画像の撮影時のProverの位置及び時刻を含む位置情報を生成する。位置検出部132は、証明用画像と位置情報を位置証明取得部151に供給する。位置検出部132は、証明用画像と位置情報を関連付けてストレージ103に記憶させる。 The position detection unit 132 detects the current position of the Prover when the certification image was captured based on the position detection data output from the GNSS receiver 108 . The position detection unit 132 generates position information including the position and time of the Prover when the certification image was captured. The location detection unit 132 supplies the certification image and the location information to the location certification acquisition unit 151 . The position detection unit 132 stores the certification image and the position information in the storage 103 in association with each other.
 ステップS403において、Proverの位置証明取得部151は、証明用画像の画像データのフィンガープリントを生成する。 In step S403, the Prover's location certification acquisition unit 151 generates a fingerprint of the image data of the certification image.
 ステップS404において、Proverの位置証明取得部151は、上述した図5のステップS4の処理と同様に、PoLリクエストを生成する。この場合、証明用画像の画像データのフィンガープリントが、metadata_fingerprintとしてPoLリクエストに格納される。 In step S404, the Prover's location certification acquisition unit 151 generates a PoL request in the same manner as in step S4 of FIG. 5 described above. In this case, the fingerprint of the image data of the certification image is stored in the PoL request as metadata_fingerprint.
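 The sketch below illustrates how the certification image's fingerprint might be placed in the PoL request as metadata_fingerprint. The request layout and the use of SHA-256 are assumptions inferred from the fields named in this description.

```python
# Sketch of building a PoL request that carries the fingerprint of the
# certification image as metadata_fingerprint. Layout and values are assumed.
import hashlib
from datetime import datetime, timezone

def build_pol_request(image_bytes: bytes, latitude: float, longitude: float) -> dict:
    return {
        "position": {"lat": latitude, "lon": longitude},
        "time": datetime.now(timezone.utc).isoformat(),
        "metadata_fingerprint": hashlib.sha256(image_bytes).hexdigest(),
    }

image_bytes = b"\xff\xd8\xff\xe0…"   # placeholder for the certification image data
pol_request = build_pol_request(image_bytes, 35.6586, 139.7454)
print(pol_request["metadata_fingerprint"])
```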
 ステップS405乃至ステップS410において、上述した図5のステップS5乃至ステップS10と同様の処理が実行される。これにより、Proverの周囲のWitnessによりPoLリクエストに対応するPoLレスポンスが生成され、PoLレスポンスを含むPoLブロックが、ブロックチェーンに追加される。すなわち、証明用画像撮影時のProverの位置情報及びWitnessによる位置証明情報、並びに、証明用画像の画像データのフィンガープリントがブロックチェーンに記録される。 In steps S405 through S410, processing similar to steps S5 through S10 in FIG. 5 described above is executed. As a result, a PoL Response corresponding to the PoL Request is generated by the Witness around the Prover, and a PoL Block containing the PoL Response is added to the blockchain. That is, the location information of the Prover at the time of photographing the certification image, the location certification information by the Witness, and the fingerprint of the image data of the certification image are recorded in the blockchain.
 ステップS411において、Proverの申告部135は、申告データを生成し、送信する。具体的には、申告部135は、証明用画像の画像データ、証明用画像の撮影時の位置情報、ユーザU1の登録ID、及び、Proverの公開鍵を含む申告データを生成する。申告部135は、通信部110及びネットワーク21を介して、申告データをサーバ12に送信する。 In step S411, the declaration unit 135 of the Prover generates and transmits declaration data. Specifically, the declaration unit 135 generates declaration data including the image data of the certification image, the position information when the certification image was captured, the registration ID of the user U1, and the public key of the Prover. The declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21 .
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、申告データを受信する。 In response, the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
 ステップS412及びステップS413において、図5のステップS12及びステップS13と同様の処理が実行される。これにより、申告データに含まれる位置情報、及び、証明用画像の画像データのフィンガープリントと一致する位置情報及びフィンガープリントを含むPoLブロックが、ブロックチェーンから抽出される。 In steps S412 and S413, the same processes as steps S12 and S13 in FIG. 5 are executed. As a result, the PoL block containing the location information included in the declaration data and the location information and the fingerprint matching the fingerprint of the image data of the certification image is extracted from the blockchain.
 ステップS414において、サーバ12は、証明用画像を解析し、保険金等を計算する。 In step S414, the server 12 analyzes the proof image and calculates the insurance money.
 具体的には、サーバ12の検証部232は、上述した図5のステップS14の処理と同様に、申告データの検証を行う。検証部232は、検証の結果、申告データが真正であると判定した場合、すなわち、証明用画像が申告された位置及び時刻に撮影されたものであると判定した場合、申告データを計算部241に供給する。 Specifically, the verification unit 232 of the server 12 verifies the declaration data in the same manner as the process of step S14 in FIG. 5 described above. When the verification unit 232 determines as a result of the verification that the declaration data is authentic, that is, that the certification image was captured at the declared position and time, it supplies the declaration data to the calculation unit 241.
 例えば、計算部241は、証明用画像を解析して、契約者が訪れた場所を特定する。計算部241は、契約者が訪れた場所に基づいて、契約者に付与する特典を計算する。 For example, the calculation unit 241 analyzes the certification image and identifies the location visited by the contractor. The calculation unit 241 calculates benefits to be given to the contractor based on the locations visited by the contractor.
 例えば、計算部241は、証明用画像を解析して、契約者の被災状況を推定する。計算部241は、推定した被災状況に基づいて、契約者に支払う保険金を計算する。 For example, the calculation unit 241 analyzes the certification image and estimates the damage situation of the contractor. The calculation unit 241 calculates the insurance money to be paid to the policyholder based on the estimated disaster situation.
 計算部241は、保険金又は特典の計算結果を示すデータを実行部242に供給する。 The calculation unit 241 supplies the execution unit 242 with data indicating the calculation result of the insurance money or benefits.
 ステップS415において、サーバ12とProverとは、保険金等の清算処理を実行する。例えば、サーバ12の実行部242と、Proverの制御部131とは、通信部207、ネットワーク21、及び、通信部110を介して、通信を行いながら、保険金の支払い処理、又は、特典の付与処理等を行う。 In step S415, the server 12 and the Prover execute settlement processing for insurance claims and the like. For example, the execution unit 242 of the server 12 and the control unit 131 of the Prover communicate with each other via the communication unit 207, the network 21, and the communication unit 110 while performing insurance payment processing or provision of benefits. processing, etc.
 以上のようにして、ユーザ(契約者)が撮影した証明用画像の真正性を保証しつつ、証明用画像を用いて、保険金又は特典を適切に計算することが可能になる。また、ユーザは、証明用画像を用いて、煩わしい手続きを行うことなく、迅速に保険金又は特典を受けることができる。 As described above, while guaranteeing the authenticity of the proof image taken by the user (contractor), it is possible to use the proof image to appropriately calculate insurance benefits or benefits. In addition, the user can use the proof image to quickly receive the insurance money or benefit without performing troublesome procedures.
 なお、例えば、契約者以外のユーザが撮影した被災状況を証明する証明用画像を用いることも可能である。 It should be noted that, for example, it is also possible to use a proof image that proves the disaster situation taken by a user other than the contractor.
  <健康増進型保険への適用例> <Example of application to health promotion insurance>
 本技術は、例えば、健康増進型保険の保険料を計算したり、特典を付与したりする場合に適用することができる。 This technology can be applied, for example, when calculating the premium of health promotion insurance or when granting benefits.
 具体的には、健康増進型保険とは、契約者の健康状態や健康増進への取り組みによって、保険料を割り引いたり、キャッシュバック等の特典を付与したりする保険である。これに対して、契約者の健康増進を促す活動に基づいて、保険料の割引き又は特典の付与を行う場合に、本技術を適用することができる。 Specifically, health promotion insurance is insurance that discounts insurance premiums and provides benefits such as cash back depending on the policyholder's health condition and efforts to improve health. On the other hand, the present technology can be applied when discounting insurance premiums or granting benefits based on the policyholder's health promotion activities.
 ここで、図22を参照して、本技術を健康増進型保険の処理に適用する場合の例について説明する。 Here, with reference to FIG. 22, an example of applying the present technology to health promotion insurance processing will be described.
 なお、以下、ユーザU1を保険料又は特典の計算対象となるユーザ(契約者)とし、ユーザU2をユーザU1の周囲に存在するユーザとする。以下、ユーザU1が所持する情報処理端末11-1をProverとし、ユーザU2が所持する情報処理端末11-2をWitnessとする。 In the following description, user U1 is assumed to be a user (contractor) for whom insurance premiums or benefits are calculated, and user U2 is assumed to be a user existing around user U1. Hereinafter, the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover, and the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
 ステップS501において、図5のステップS1の処理と同様に、ProverはAPP1を起動し、WitnessはAPP2を起動する。 In step S501, Prover activates APP1, and Witness activates APP2, as in the process of step S1 in FIG.
 ステップS502において、Proverは、活動データを取得し、位置情報を生成する。具体的には、Proverの位置検出部132は、ユーザU1の活動の検出に用いられるセンサデータである活動データをセンシング部109から取得する。例えば、ユーザの歩数や歩行距離に応じて、保険料の割引きや特典の付与が行われる場合、ユーザの歩行動作を示す加速度及び角速度を示すセンサデータが、活動データに用いられる。 In step S502, the Prover acquires activity data and generates location information. Specifically, the position detection unit 132 of the Prover acquires from the sensing unit 109 activity data, which is sensor data used for detecting the activity of the user U1. For example, when insurance premium discounts and benefits are provided according to the number of steps and walking distance of the user, sensor data indicating the acceleration and angular velocity indicating the walking motion of the user are used as the activity data.
 位置検出部132は、GNSS受信機108から出力される位置検出データに基づいて、活動データ取得時のProverの現在位置を検出する。位置検出部132は、Proverの現在位置及び現在時刻を含む位置情報を生成する。位置検出部132は、活動データと位置情報を位置証明取得部151に供給するとともに、活動データと位置情報を関連付けてストレージ103に記憶させる。 The position detection unit 132 detects the current position of the Prover at the time of acquisition of the activity data based on the position detection data output from the GNSS receiver 108 . The position detector 132 generates position information including the current position and current time of the Prover. The location detection unit 132 supplies the activity data and the location information to the location certification acquisition unit 151 and causes the storage 103 to store the activity data and the location information in association with each other.
 ステップS503において、Proverの位置証明取得部151は、活動データのフィンガープリントを生成する。 In step S503, the Prover's location certification acquisition unit 151 generates a fingerprint of the activity data.
 ステップS504において、Proverの位置証明取得部151は、上述した図5のステップS4の処理と同様に、PoLリクエストを生成する。この場合、活動データのフィンガープリントが、metadata_fingerprintとしてPoLリクエストに格納される。 In step S504, the Prover's location certification acquisition unit 151 generates a PoL request in the same manner as the processing in step S4 of FIG. 5 described above. In this case, the activity data fingerprint is stored in the PoL request as metadata_fingerprint.
 ステップS505乃至ステップS510において、上述した図5のステップS5乃至ステップS10と同様の処理が実行される。これにより、Proverの周囲のWitnessによりPoLリクエストに対応するPoLレスポンスが生成され、PoLレスポンスを含むPoLブロックが、ブロックチェーンに追加される。すなわち、活動データ取得時のProverの位置情報及びWitnessによる位置証明情報、並びに、活動データのフィンガープリントがブロックチェーンに記録される。 In steps S505 to S510, the same processes as steps S5 to S10 in FIG. 5 described above are executed. As a result, a PoL Response corresponding to the PoL Request is generated by the Witness around the Prover, and a PoL Block containing the PoL Response is added to the blockchain. That is, the location information of the Prover at the time of acquisition of the activity data, the location verification information by the Witness, and the fingerprint of the activity data are recorded in the blockchain.
 そして、例えば、ユーザU1の契約期間中かつ活動中に、ステップS502乃至ステップS510の処理が、繰り返し実行される。例えば、ユーザU1が所定の歩数(例えば100歩)を歩く毎、ステップS502乃至ステップS510の処理が実行される。これにより、ユーザU1の活動中の活動データ及び位置情報、並びに、位置情報に対する位置証明情報が、ブロックチェーンに記録される。 Then, for example, the processes of steps S502 to S510 are repeatedly executed during the contract period and activity of user U1. For example, every time the user U1 walks a predetermined number of steps (for example, 100 steps), the processing from step S502 to step S510 is executed. As a result, the activity data and location information during activity of the user U1, and the location proof information for the location information are recorded in the blockchain.
 ステップS511において、Proverの申告部135は、申告データを生成し、送信する。具体的には、申告部135は、ユーザU1の活動中の活動データ及び移動データ、ユーザU1の登録ID、並びに、Proverの公開鍵を含む申告データを生成する。移動データは、例えば、活動データの取得時に生成された位置情報を時系列に並べたデータである。申告部135は、通信部110及びネットワーク21を介して、申告データをサーバ12に送信する。 In step S511, the declaration unit 135 of the Prover generates and transmits declaration data. Specifically, the reporting unit 135 generates reporting data including the active activity data and movement data of the user U1, the registration ID of the user U1, and the Prover's public key. The movement data is, for example, data in which the position information generated when the activity data is acquired is arranged in chronological order. The declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21 .
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、申告データを受信する。 In response, the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
 ステップS512において、サーバ12の検証部232は、申告データの照合を要求する。具体的には、検証部232は、申告データに含まれる各活動データのフィンガープリントを生成する。検証部232は、通信部207及びネットワーク21を介して、移動データに含まれる各位置情報と活動データのフィンガープリントの組み合わせ毎に、位置情報と活動データのフィンガープリントの照合を要求する。 At step S512, the verification unit 232 of the server 12 requests verification of the declaration data. Specifically, the verification unit 232 generates a fingerprint for each activity data included in the declaration data. The verification unit 232 requests, via the communication unit 207 and the network 21, verification of the fingerprints of the location information and the activity data for each combination of the fingerprints of the location information and the activity data contained in the movement data.
 ステップS513において、ブロックチェーンネットワーク13は、申告データの照合を行い、照合結果を送信する。具体的には、ブロックチェーンネットワーク13は、位置情報及び活動データのフィンガープリントの組み合わせ毎に、一致する位置情報及びフィンガープリントを含むPoLブロックをブロックチェーンから抽出する。ブロックチェーンネットワーク13は、ネットワーク21を介して、抽出した複数のPoLブロックをサーバ12に送信する。 In step S513, the blockchain network 13 verifies the declaration data and transmits the verification result. Specifically, for each combination of location information and activity data fingerprints, the blockchain network 13 extracts PoL blocks containing matching location information and fingerprints from the blockchain. Blockchain network 13 transmits the extracted PoL blocks to server 12 via network 21 .
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、各PoLブロックを受信する。 In response, the CPU 201 of the server 12 receives each PoL block via the network 21 and the communication unit 207.
 ステップS514において、サーバ12は、申告データに基づいて、保険料等の計算を行う。具体的には、サーバ12の検証部232は、上述した図13のステップS105と同様の処理により、申告データの検証を行う。検証部232は、検証の結果、申告データに含まれる活動データ、及び、移動データに含まれる各位置情報が真正であると判定した場合、移動データを計算部241に供給する。 In step S514, the server 12 calculates insurance premiums, etc., based on the declaration data. Specifically, the verification unit 232 of the server 12 verifies the declaration data by the same processing as in step S105 of FIG. 13 described above. The verification unit 232 supplies the movement data to the calculation unit 241 when it is determined that the activity data included in the report data and the location information included in the movement data are authentic as a result of the verification.
 計算部241は、位置情報及び活動データの各組み合わせに基づいて、ユーザU1の契約期間中の活動内容を推定する。例えば、計算部241は、ユーザU1の契約期間中の歩行距離等を推定する。 The calculation unit 241 estimates the activity content of the user U1 during the contract period based on each combination of the position information and the activity data. For example, the calculation unit 241 estimates the walking distance or the like during the contract period of the user U1.
 計算部241は、推定した活動内容に基づいて、ユーザのU1の保険料又は特典を計算する。例えば、計算部241は、推定した活動内容に基づいて、ユーザのU1の次の契約期間の保険料の割引きや、現在の契約期間の保険料に対してキャッシュバックする金額等を計算する。 The calculation unit 241 calculates insurance premiums or benefits for user U1 based on the estimated activity content. For example, the calculation unit 241 calculates a discount on insurance premiums for the next contract period of user U1, an amount to be cashed back for insurance premiums for the current contract period, and the like, based on the estimated activity content.
 計算部241は、保険料又は特典の計算結果を実行部242に供給する。 The calculation unit 241 supplies the calculation result of insurance premiums or benefits to the execution unit 242 .
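 As an illustration of this step, the sketch below sums verified activity data (represented here as step counts per recorded interval) and maps the total to a discount rate. The activity representation, the tiers, and the base premium are invented for the example only.

```python
# Toy sketch of a health-promotion discount derived from verified activity data.
def premium_discount(activity_records, base_premium=500.0):
    """activity_records: list of dicts with a 'steps' field per verified interval."""
    total_steps = sum(r["steps"] for r in activity_records)
    if total_steps >= 2_000_000:      # roughly a very active contract period
        rate = 0.15
    elif total_steps >= 1_000_000:
        rate = 0.10
    else:
        rate = 0.05
    return total_steps, base_premium * rate

# Example: 15,000 verified intervals of 100 steps each.
print(premium_discount([{"steps": 100} for _ in range(15_000)]))
```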
 ステップS515において、サーバ12とProverとは、上述した図19のステップS314の処理と同様に、保険料等の清算処理を実行する。 In step S515, the server 12 and the Prover carry out the settlement processing for insurance premiums and the like in the same manner as the process of step S314 in FIG. 19 described above.
 以上のようにして、ユーザの活動データ及び移動データの真正性を保証しつつ、活動データ及び移動データに基づいて、健康増進型保険の保険料又は特典を適切に設定することができる。 As described above, while ensuring the authenticity of the user's activity data and movement data, it is possible to appropriately set insurance premiums or benefits for health promotion insurance based on the activity data and movement data.
  <撮影時の位置情報及び位置証明情報を記録して利用する処理> <Process of recording and using location information and location certification information at the time of image capture>
 図14乃至図18を参照して上述した処理では、事故の発生前後にProverの周囲の情報処理端末11(Witness)がAPPを起動し、撮影を実行している必要がある。そのため、事故発生時にProverの周囲に情報処理端末11が存在しても、有力な目撃情報を得られない場合がある。 In the processing described above with reference to FIGS. 14 to 18, the information processing terminals 11 (Witnesses) around the Prover need to have activated the APP and to be capturing images before and after the occurrence of the accident. Therefore, even if information processing terminals 11 exist around the Prover when an accident occurs, convincing eyewitness information may not be obtained.
 これに対して、例えば、各情報処理端末11が、撮影を行いながら、得られた画像データをサーバ12に保存するとともに、撮影時の位置情報及び位置証明情報をブロックチェーンに記録する。そして、ブロックチェーンに記録された位置情報及び位置証明情報を用いて、サーバ12に保存された画像データを事故発生時の目撃情報に用いることが考えられる。 On the other hand, for example, each information processing terminal 11 saves the obtained image data in the server 12 while shooting, and records the location information and location certification information at the time of shooting in the blockchain. It is conceivable to use the image data stored in the server 12 as eyewitness information when an accident occurs, using the location information and location certification information recorded in the blockchain.
 ここで、撮影時の位置情報及び位置証明を記録して利用する処理について説明する。 Here, we will explain the process of recording and using the location information and location certification at the time of shooting.
 まず、図23のシーケンス図を参照して、画像データを保存するとともに、撮影時の位置情報及び位置証明情報を記録する処理について説明する。 First, with reference to the sequence diagram of FIG. 23, the process of saving image data and recording position information and position verification information at the time of shooting will be described.
 ステップS601において、Proverの通信部110は、周囲のWitnessとの通信状況を確認する。 In step S601, the communication unit 110 of the Prover checks the communication status with surrounding witnesses.
 ステップS602において、Proverの撮影部107は、画像(静止画又は動画)を撮影する。撮影部107は、撮影した画像に対応する画像データをCPU101に供給する。 In step S602, the photographing unit 107 of the Prover photographs an image (still image or moving image). The photographing unit 107 supplies image data corresponding to the photographed image to the CPU 101 .
 ステップS603において、Proverの画像登録部153は、画像データを暗号化してアップロードする。具体的には、画像登録部153は、撮影が終了した直後に、Proverが保持する秘密鍵を用いて画像データを暗号化する。画像登録部153は、通信部110及びネットワーク21を介して、暗号化した画像データをサーバ12に送信する。 In step S603, the image registration unit 153 of the Prover encrypts and uploads the image data. Specifically, immediately after the image capture is completed, the image registration unit 153 encrypts the image data using the secret key held by the Prover. The image registration unit 153 transmits the encrypted image data to the server 12 via the communication unit 110 and the network 21.
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、画像データを受信する。 In response, the CPU 201 of the server 12 receives the image data via the network 21 and the communication unit 207.
 ステップS604において、Proverの位置検出部132は、位置情報を生成し、暗号化した画像データのハッシュ値を計算する。具体的には、位置検出部132は、GNSS受信機108から出力される位置検出データに基づいて、Proverの現在位置を検出する。位置検出部132は、Proverの現在位置及び現在時刻を含む位置情報を生成する。画像登録部153は、ステップS603の処理で暗号化した画像データのハッシュ値を計算する。画像登録部153は、計算したハッシュ値を位置検出部132に供給する。 In step S604, the position detection unit 132 of the Prover generates position information, and the hash value of the encrypted image data is calculated. Specifically, the position detection unit 132 detects the current position of the Prover based on the position detection data output from the GNSS receiver 108 and generates position information including the current position and the current time of the Prover. The image registration unit 153 calculates the hash value of the image data encrypted in the process of step S603 and supplies the calculated hash value to the position detection unit 132.
 ステップS605において、サーバ12の情報収集部233は、暗号化された画像及びそのハッシュ値を保存する。具体的には、情報収集部233は、ステップS603の処理で受信した暗号化された画像データのハッシュ値を計算する。情報収集部233は、暗号化された画像データにアクセスするための画像IDを発行する。情報収集部233は、画像ID、暗号化された画像データ、及び、ハッシュ値を関連付けて、保険DB204に保存させる。 In step S605, the information collection unit 233 of the server 12 saves the encrypted image and its hash value. Specifically, the information collecting unit 233 calculates a hash value of the encrypted image data received in the process of step S603. The information collection unit 233 issues an image ID for accessing the encrypted image data. The information collection unit 233 associates the image ID, the encrypted image data, and the hash value, and stores them in the insurance DB 204 .
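 For illustration, this registration step could be sketched as below, with an in-memory dictionary standing in for the insurance DB 204; the ID format and field names are assumptions.

```python
# Sketch of server-side registration of an uploaded (already encrypted) image:
# compute its hash, issue an image ID, and store the three values together.
import hashlib
import uuid

insurance_db = {}   # image_id -> {"encrypted_image": ..., "hash": ...}

def register_encrypted_image(encrypted_image: bytes) -> str:
    image_id = uuid.uuid4().hex                     # ID later used to access the image
    insurance_db[image_id] = {
        "encrypted_image": encrypted_image,
        "hash": hashlib.sha256(encrypted_image).hexdigest(),
    }
    return image_id                                  # returned to the Prover

image_id = register_encrypted_image(b"\x01\x02encrypted-bytes")
print(image_id, insurance_db[image_id]["hash"])
```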
 ステップS606において、サーバ12の情報収集部233は、通信部207及びネットワーク21を介して、ステップS605の処理で保存した画像にアクセスできる画像IDをProverに送信する。 In step S606, the information collection unit 233 of the server 12 transmits to the Prover, via the communication unit 207 and the network 21, an image ID that allows access to the image saved in step S605.
 これに対して、ProverのCPU101は、ネットワーク21及び通信部110を介して、画像IDを受信する。Proverの位置検出部132は、ステップS604の処理で得られた位置情報及び暗号化した画像データと、受信した画像IDとを関連付けて、ストレージ103に記憶させる。また、位置検出部132は、位置情報及び画像データを位置証明取得部151に供給する。 In response, the Prover's CPU 101 receives the image ID via the network 21 and the communication unit 110 . The position detection unit 132 of the Prover associates the position information and the encrypted image data obtained in the process of step S604 with the received image ID and stores them in the storage 103 . The position detection unit 132 also supplies the position information and the image data to the position certification acquisition unit 151 .
 ステップS607において、Proverは、上述した図10のステップS33の処理と同様に、PoLリクエストを生成し、送信する。このとき、例えば、画像データ、及び、Proverの動きを検出するモーションセンサのセンサデータ(以下、モーションデータと称する)を含むメタデータのフィンガープリントが、metadata_fingerprintとしてPoLリクエストに格納される。 In step S607, the Prover generates and transmits a PoL request in the same manner as the process of step S33 in FIG. 10 described above. At this time, for example, a fingerprint of metadata including the image data and sensor data of a motion sensor that detects the movement of the Prover (hereinafter referred to as motion data) is stored in the PoL request as metadata_fingerprint.
 ステップS608乃至ステップS613において、上述した図10のステップS34乃至ステップS39と同様の処理が実行される。これにより、Proverの周囲のWitnessによりPoLリクエストに対応するPoLレスポンスが生成され、PoLレスポンスを含むPoLブロックが、ブロックチェーンに追加される。すなわち、画像の撮影時の位置情報及びWitnessによる位置証明情報が、画像データのハッシュ値及びモーションデータを含むメタデータのフィンガープリントとともにブロックチェーンに記録される。 In steps S608 through S613, the same processes as in steps S34 through S39 of FIG. 10 described above are executed. As a result, a PoL Response corresponding to the PoL Request is generated by the Witness around the Prover, and a PoL Block containing the PoL Response is added to the blockchain. That is, the location information at the time the image was taken and the location proof information by Witness are recorded in the blockchain together with the hash value of the image data and the fingerprint of the metadata including the motion data.
 次に、図24のシーケンス図を参照して、目撃情報の収集処理について説明する。 Next, the eyewitness information collection process will be described with reference to the sequence diagram of FIG.
 なお、以下、事故当事者(事故の被害者又は加害者)の情報処理端末11を、単に事故当事者と称する。以下、事故目撃者の情報処理端末11を、単に事故目撃者と称する。 In addition, hereinafter, the information processing terminal 11 of the accident party (accident victim or perpetrator) is simply referred to as the accident party. Hereinafter, the information processing terminal 11 of the accident eyewitness is simply referred to as the accident eyewitness.
 ステップS631において、事故当事者は、上述した図14のステップS211の処理と同様に、申告データを生成し、サーバ12に送信する。申告データは、例えば、事故発生時の事故当事者の位置情報及び事故データを含む。事故データは、例えば、事故発生時に撮影された画像に対応する画像データ、及び、事故発生時に取得されたモーションデータを含む。 In step S631, the party involved in the accident generates report data and transmits it to the server 12 in the same manner as in the process of step S211 in FIG. The reporting data includes, for example, location information of the parties involved in the accident and accident data when the accident occurred. The accident data includes, for example, image data corresponding to images captured when the accident occurred, and motion data acquired when the accident occurred.
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、申告データを受信する。 In response, the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
 ステップS632において、サーバ12の検証部232は、申告データに含まれる事故データのフィンガープリントを生成する。 At step S632, the verification unit 232 of the server 12 generates a fingerprint of the accident data included in the report data.
 ステップS633において、サーバ12は、上述した図18のステップS233の処理と同様に、申告データに基づいて、PoLの記録のクエリを行う。 In step S633, the server 12 makes a PoL record query based on the declaration data, similar to the processing in step S233 of FIG. 18 described above.
 ステップS634において、ブロックチェーンネットワーク13は、上述した図18のステップS234の処理と同様に、クエリに基づいてPoLの記録を抽出し、サーバ12に送信する。 In step S634, the blockchain network 13 extracts PoL records based on the query and transmits them to the server 12, similar to the process of step S234 in FIG.
 ステップS635において、サーバ12は、上述した図18のステップS235と同様の処理により、申告データを検証し、事故目撃者を推定する。 In step S635, the server 12 verifies the report data and estimates the accident eyewitness by the same processing as in step S235 of FIG. 18 described above.
 ステップS636において、サーバ12は、上述した図18のステップS236と同様の処理により、目撃者アンケートを生成し、事故目撃者にブロードキャストする。 In step S636, the server 12 generates an eyewitness questionnaire through the same processing as in step S236 of FIG. 18 described above, and broadcasts it to the accident eyewitnesses.
 これに対して、事故目撃者のCPU101は、ネットワーク21及び通信部110を介して、目撃者アンケートを受信する。 In response, the accident eyewitness CPU 101 receives the eyewitness questionnaire via the network 21 and the communication unit 110 .
 ステップS637において、事故目撃者は、上述した図18のステップS237と同様の処理により、目撃者アンケートの公開鍵を確認する。 In step S637, the eyewitness to the accident confirms the public key of the eyewitness questionnaire by the same process as in step S237 of FIG. 18 described above.
 ステップS638において、事故目撃者の情報提供部162は、上述した図14のステップS216の処理と同様に、目撃者アンケートの回答を生成し、サーバ12に送信する。 In step S638, the accident eyewitness information provision unit 162 generates responses to the eyewitness questionnaire and transmits them to the server 12, in the same manner as in the process of step S216 in FIG.
 ただし、図14のステップS216の処理では、目撃者アンケートの回答は、事故発生前後に事故現場付近で撮影された画像のデータを含んでいた。 However, in the process of step S216 in FIG. 14, the answer to the eyewitness questionnaire included the data of images captured near the accident site before and after the occurrence of the accident.
 一方、この処理では、目撃者アンケートの回答は、事故発生前後に事故現場付近で撮影された画像に対応する画像データに対して、サーバ12が付与した画像IDを含む。また、目撃者アンケートの回答は、事故発生前後に事故現場付近で撮影された画像に対応する画像データの暗号化に用いた暗号鍵に対応する復号鍵を含む。 On the other hand, in this process, the responses to the eyewitness questionnaire include the image ID given by the server 12 to the image data corresponding to the images taken near the accident site before and after the accident. Responses to the eyewitness questionnaire also include decryption keys corresponding to encryption keys used to encrypt image data corresponding to images taken near the accident site before and after the accident.
 これに対して、サーバ12のCPU201は、ネットワーク21及び通信部207を介して、目撃者アンケートの回答を受信する。 In response, the CPU 201 of the server 12 receives the responses to the eyewitness questionnaire via the network 21 and the communication unit 207.
 ステップS639において、サーバ12の情報収集部233は、目撃者アンケートの回答を検証し、画像を取得する。具体的には、情報収集部233は、上述した図18のステップS239と同様の処理により、目撃者アンケートの回答を検証する。情報収集部233は、目撃者アンケートの回答が真正であると判定した場合、目撃者アンケートの回答に含まれる画像IDに対応する画像データを保険DB204から取得する。情報収集部233は、目撃者アンケートの回答に含まれる復号鍵を用いて、取得した画像データを復号する。これにより、事故発生前後に事故現場付近で撮影された画像に対応する画像データが取得される。 In step S639, the information collecting unit 233 of the server 12 verifies the answer to the eyewitness questionnaire and acquires the image. Specifically, the information collecting unit 233 verifies the answer to the eyewitness questionnaire by the same processing as in step S239 of FIG. 18 described above. When the information collecting unit 233 determines that the answer is authentic, it acquires the image data corresponding to the image ID included in the answer from the insurance DB 204 and decrypts the acquired image data using the decryption key included in the answer. As a result, image data corresponding to images captured near the accident site before and after the occurrence of the accident is obtained.
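 The retrieval-and-decryption step could be sketched as follows. Symmetric Fernet encryption is used purely as an illustration; the disclosure only states that a decryption key corresponding to the encryption key is supplied in the questionnaire answer.

```python
# Sketch of looking up a stored image by its image ID and decrypting it with the
# key supplied in the questionnaire answer. IDs and keys are placeholders.
from cryptography.fernet import Fernet

# Stand-in for the insurance DB 204: image ID -> encrypted image bytes.
insurance_db = {}

# Earlier, on the Witness side: encrypt the photo before uploading, keep the key.
key = Fernet.generate_key()
insurance_db["img-0001"] = Fernet(key).encrypt(b"raw jpeg bytes")

# Now, on the server side: the answer supplies the image ID and the decryption key.
answer = {"image_id": "img-0001", "decryption_key": key}
decrypted = Fernet(answer["decryption_key"]).decrypt(insurance_db[answer["image_id"]])
assert decrypted == b"raw jpeg bytes"
```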
 ステップS640及びステップS641において、上述した図18のステップS239及びステップS240と同様の処理が実行される。 In steps S640 and S641, processing similar to steps S239 and S240 in FIG. 18 described above is executed.
 以上のようにして、情報処理端末11において得られた画像データが、暗号化されてサーバ12に保存される。また、画像データ及びモーションデータを含むメタデータのフィンガープリントが、撮影時の位置情報及び位置証明情報と関連付けられて、ブロックチェーンに記録される。これにより、画像データ及び画像データの撮影時の位置情報の真正性が保証される。従って、真正性が保証された画像データを目撃情報に用いることができる。 As described above, the image data obtained by the information processing terminal 11 is encrypted and stored in the server 12 . Metadata fingerprints, including image data and motion data, are also recorded on the blockchain in association with location information and location proof information at the time of capture. This ensures the authenticity of the image data and the positional information of the image data at the time of shooting. Therefore, image data whose authenticity is guaranteed can be used for eyewitness information.
 また、画像データがサーバ12に保存されるため、情報処理端末11に記憶されている画像データが削除されたとしても、後で目撃情報として使用することが可能になる。 Also, since the image data is stored in the server 12, even if the image data stored in the information processing terminal 11 is deleted, it can be used later as eyewitness information.
 なお、画像の撮影の開始からの終了までの間において、上述した撮影終了時以外の所定のタイミングで位置情報が生成され、位置証明情報とともに記録されるようにしてもよい。また、動画を撮影する場合、動画の撮影の開始からの終了までの間の複数のタイミングにおいて、位置情報が生成され、位置証明情報とともに記録されるようにしてもよい。 It should be noted that the position information may be generated at a predetermined timing other than the end of the above-described image capturing and recorded together with the position certification information during the period from the start to the end of image capturing. Further, when shooting a moving image, location information may be generated at a plurality of timings from the start to the end of shooting the moving image and recorded together with the location certification information.
  <位置証明情報の信頼度を高める方法> <Method for increasing the reliability of location certification information>
 例えば、1人のユーザが複数の情報処理端末11を所持したり、情報処理端末11を所持する複数のユーザが共謀したりして、Witnessをなりすまさせることにより、不正な位置証明情報が生成されることが想定される。 For example, it is conceivable that fraudulent location certification information could be generated by a single user possessing a plurality of information processing terminals 11, or by a plurality of users possessing information processing terminals 11 colluding, to have Witnesses impersonated.
 これに対して、1つの位置情報に対して所定の数以上のWitnessによる位置証明情報が存在しない場合、位置証明情報を無効にすることが考えられる。これにより、なりすまさせるWitnessの数を増やす必要が生じ、位置証明情報の不正の難易度が高くなる。また、例えば、1つの位置情報に対して複数のWitnessによる位置証明情報が存在する場合、各位置証明情報の位置及び時刻を比較することにより、位置証明情報の不正を防止し、信頼度を高めることができる。 To address this, the location certification information may be invalidated when fewer than a predetermined number of pieces of location certification information from Witnesses exist for one piece of location information. This makes it necessary to increase the number of impersonated Witnesses, raising the difficulty of falsifying location certification information. In addition, for example, when pieces of location certification information from a plurality of Witnesses exist for one piece of location information, comparing the position and time of each piece of location certification information makes it possible to prevent falsification of the location certification information and to increase its reliability.
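 The plausibility check suggested here could be sketched as below: require a minimum number of Witness proofs for one position record and check that their reported positions and times agree within tolerances. The thresholds and field names are assumptions, not values from the disclosure.

```python
# Sketch of a consistency check over multiple Witness location proofs.
from datetime import datetime, timedelta

MIN_WITNESSES = 3
MAX_POSITION_DELTA_DEG = 0.001     # roughly 100 m
MAX_TIME_DELTA_S = 60

def proofs_are_consistent(position, timestamp, proofs):
    if len(proofs) < MIN_WITNESSES:
        return False                               # too few proofs: treat as invalid
    for proof in proofs:
        if abs(proof["lat"] - position[0]) > MAX_POSITION_DELTA_DEG:
            return False
        if abs(proof["lon"] - position[1]) > MAX_POSITION_DELTA_DEG:
            return False
        if abs((proof["time"] - timestamp).total_seconds()) > MAX_TIME_DELTA_S:
            return False
    return True

# Example: three Witness proofs within 100 m and 60 s of the claimed record.
base_time = datetime(2022, 2, 7, 9, 0, 0)
proofs = [{"lat": 35.6586, "lon": 139.7454, "time": base_time + timedelta(seconds=10 * i)}
          for i in range(3)]
print(proofs_are_consistent((35.6586, 139.7454), base_time, proofs))   # True
```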
 However, especially in the early stages of the service, there may not be a sufficient number of information processing terminals 11 around the Prover that can act as Witnesses. Moreover, when an information processing terminal 11 is a mobile information terminal such as a smartphone, it is not necessarily running the APP described above and able to communicate. Therefore, even if information processing terminals 11 exist around the Prover, they cannot necessarily act as Witnesses.
 To address this, the number of Witnesses around each information processing terminal 11 can be increased by, for example, installing information processing terminals 11 that operate as Witnesses outdoors or indoors. This increases the reliability of the location proof information.
 Note that an information processing terminal 11 installed outdoors or indoors may operate only as a Witness, or may also operate as a Prover. In the latter case, for example, the terminal normally operates as a Witness and operates as a Prover when requested by another information processing terminal 11.
 <<2. Second Embodiment>>
 Next, a second embodiment of the present technology will be described with reference to FIGS. 25 to 30.
  <Configuration example of the information processing system 401>
 FIG. 25 shows a configuration example of an information processing system 401, which is a second embodiment of an information processing system to which the present technology is applied.
 The information processing system 401 is a system that collects image data corresponding to images captured near an accident site before and after the occurrence of the accident. The information processing system 401 includes accident trigger generators 411-1 to 411-m, cameras 412-1 to 412-n, and a server 413. The accident trigger generators 411-1 to 411-m, the cameras 412-1 to 412-n, and the server 413 are connected via a network 421 and can communicate with one another. In addition, the accident trigger generators 411-1 to 411-m and the cameras 412-1 to 412-n can communicate directly using short-range wireless communication without going through the network 421.
 Hereinafter, the accident trigger generators 411-1 to 411-m are simply referred to as the accident trigger generator 411 when there is no need to distinguish them individually. Similarly, the cameras 412-1 to 412-n are simply referred to as the camera 412 when there is no need to distinguish them individually.
 The accident trigger generator 411 is composed of, for example, a portable information processing device that the user can carry or wear. The accident trigger generator 411 performs accident detection processing and, when an accident is detected, transmits a request trigger to the cameras 412 present in the surroundings to request location proof.
 The camera 412 is composed of an information processing device having a shooting function and a communication function. For example, the camera 412 is a portable information processing device that the user can carry or wear, such as a smartphone, mobile phone, tablet terminal, wearable device, action camera, or portable game machine.
 The camera 412 may also be an information processing device, such as a drive recorder, that is mounted on a mobile body such as a vehicle (including a two-wheeled vehicle) and captures and records the surroundings of the mobile body.
 Furthermore, the camera 412 may be a dedicated imaging device installed at an arbitrary location outdoors or indoors.
 The camera 412 receives a secret key from the server 413 via the network 421. The camera 412 captures images of its surroundings and superimposes a watermark on the obtained image data using the secret key received from the server 413. The camera 412 stores the watermarked image data in association with position information indicating the shooting position and shooting time. When the camera 412 receives a request trigger from the accident trigger generator 411, it transmits to the server 413, via the network 421, the image data corresponding to the images captured during the period before and after the request trigger was received (around the time when location proof was requested), together with the position information associated with that image data.
 The server 413 generates a secret key and transmits it to the camera 412 via the network 421. The server 413 receives image data and position information from the camera 412 via the network 421 and verifies the received image data. If the image data is valid, the server 413 generates a PoL block containing the image data and position information and adds it to the blockchain. The server 413 also transmits the PoL block to other nodes (not shown) constituting the blockchain network so that they add it to their blockchains.
  <Configuration example of the accident trigger generator 411>
 FIG. 26 is a block diagram showing a functional configuration example of the accident trigger generator 411.
 The accident trigger generator 411 includes a CPU 501, a memory 502, a storage 503, an operation unit 504, a display unit 505, a speaker 506, an imaging unit 507, a sensing unit 508, a communication unit 509, an external I/F 510, and a drive 511. The CPU 501 through the drive 511 are connected to a bus and perform necessary communications with one another.
 The CPU 501 through the drive 511 are configured similarly to the CPU 101 through the imaging unit 107 and the sensing unit 109 through the drive 112 of the information processing terminal 11 in FIG. 2, respectively.
 In the accident trigger generator 411, as in the information processing terminal 11, the program executed by the CPU 501 can be recorded in advance in the storage 503 serving as a recording medium built into the accident trigger generator 411.
 The program can also be stored (recorded) on removable media 510A, provided as packaged software, and installed in the accident trigger generator 411 from the removable media 510A.
 Alternatively, the program can be downloaded from another server (not shown) or the like via the network 421 and the communication unit 509 and installed in the accident trigger generator 411.
 When the CPU 501 executes the program installed in the accident trigger generator 411, functions including a control unit 531 and an accident detection unit 532 are realized.
 The control unit 531 controls the processing of each unit of the accident trigger generator 411.
 The accident detection unit 532 detects an accident involving the user of the accident trigger generator 411, or an accident occurring around the accident trigger generator 411, based on at least one of the image data output from the imaging unit 507 and the sensor data output from the sensing unit 508. When an accident is detected, the accident detection unit 532 transmits a request trigger by BT via the communication unit 509 to the cameras 412 around the accident trigger generator 411.
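 The patent does not specify a particular detection algorithm, so the following is only a minimal sketch of what sensor-based accident detection could look like, assuming accelerometer samples expressed in units of g; the threshold value and function name are arbitrary illustrative assumptions.

```python
import math

IMPACT_THRESHOLD_G = 4.0   # assumption: an acceleration spike above this suggests a collision

def detect_accident(accel_samples):
    """Return True if any accelerometer sample (ax, ay, az in g) exceeds the impact threshold.
    A real implementation could also use image data, audio, or learned models."""
    return any(math.sqrt(ax * ax + ay * ay + az * az) >= IMPACT_THRESHOLD_G
               for ax, ay, az in accel_samples)
```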
  <Configuration example of the camera 412>
 FIG. 27 is a block diagram showing a functional configuration example of the camera 412.
 The camera 412 includes a CPU 601, a memory 602, a storage 603, an operation unit 604, a display unit 605, a speaker 606, an imaging unit 607, a GNSS receiver 608, a sensing unit 609, a communication unit 610, an external I/F 611, and a drive 612. The CPU 601 through the drive 612 are connected to a bus and perform necessary communications with one another.
 The CPU 601 through the drive 612 are configured similarly to the CPU 101 through the drive 112 of the information processing terminal 11 in FIG. 2, respectively.
 In the camera 412, as in the information processing terminal 11, the program executed by the CPU 601 can be recorded in advance in the storage 603 serving as a recording medium built into the camera 412.
 The program can also be stored (recorded) on removable media 612A, provided as packaged software, and installed in the camera 412 from the removable media 612A.
 Alternatively, the program can be downloaded from another server (not shown) or the like via the network 421 and the communication unit 610 and installed in the camera 412.
 When the CPU 601 executes the program installed in the camera 412, functions including a control unit 631, a position detection unit 632, a watermark superimposition unit 633, and a location proof unit 634 are realized.
 The control unit 631 controls the processing of each unit of the camera 412.
 The position detection unit 632 detects the current position of the camera 412 based on the position detection data output from the GNSS receiver 608, and generates position information including the detected current position and the current time.
 The watermark superimposition unit 633 receives a secret key from the server 413 via the network 421 and the communication unit 610, generates a watermark using the secret key, and superimposes the watermark on the image data supplied from the imaging unit 607. The watermark superimposition unit 633 stores the watermarked image data in the storage 603 in association with position information indicating the shooting position of the image data.
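 The patent leaves the watermarking scheme open ("generates a watermark using the secret key"). Purely as one hedged illustration, the sketch below derives a key-dependent bit pattern per frame with HMAC-SHA256 and hides it in pixel least-significant bits using NumPy; the function names, bit budget, and embedding location are assumptions made for the example, not the claimed method.

```python
import hmac
import hashlib
import numpy as np

def derive_mark_bits(secret_key: bytes, frame_id: int, timestamp: float, n_bits: int) -> np.ndarray:
    """Derive a key-dependent pseudorandom bit pattern for one frame."""
    digest = hmac.new(secret_key, f"{frame_id}:{timestamp}".encode(), hashlib.sha256).digest()
    rng = np.random.default_rng(int.from_bytes(digest[:8], "big"))
    return rng.integers(0, 2, size=n_bits, dtype=np.uint8)

def embed_watermark(frame: np.ndarray, secret_key: bytes, frame_id: int, timestamp: float) -> np.ndarray:
    """Hide the derived bits in the least-significant bits of the first pixels of the frame.
    Assumes an 8-bit frame (grayscale or RGB) held as a NumPy uint8 array."""
    marked = frame.copy()
    flat = marked.reshape(-1)
    bits = derive_mark_bits(secret_key, frame_id, timestamp, n_bits=min(1024, flat.size))
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return marked
```

 Because the bit pattern depends on the shared secret key and on per-frame data, a frame that has been replaced or re-encoded without the key would no longer carry the expected pattern.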
 When a request trigger is received from the accident trigger generator 411, the location proof unit 634 acquires from the storage 603 the image data for a predetermined period before and after the request trigger was received, together with the position information associated with that image data. The location proof unit 634 then transmits location proof information including the acquired image data and position information to the server 413 via the communication unit 610 and the network 421.
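 For illustration only, the following is a minimal sketch of selecting the stored frames that fall in a window around the request trigger; the StoredFrame record and the window lengths are assumptions introduced here, not values taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical storage record; field names are assumptions.
@dataclass
class StoredFrame:
    timestamp: float      # shooting time (UNIX time)
    latitude: float
    longitude: float
    data: bytes           # watermarked frame data

def frames_around_trigger(frames: List[StoredFrame],
                          trigger_time: float,
                          before_s: float = 30.0,
                          after_s: float = 30.0) -> Optional[List[StoredFrame]]:
    """Return the frames recorded in the window around the request trigger,
    or None if nothing was recorded in that window."""
    window = [f for f in frames
              if trigger_time - before_s <= f.timestamp <= trigger_time + after_s]
    return window or None
```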
  <Configuration example of the server 413>
 FIG. 28 is a block diagram showing a functional configuration example of the server 413.
 The server 413 includes a CPU 701, a memory 702, a storage 703, an image DB (database) 704, an operation unit 705, a display unit 706, a communication unit 707, an external I/F 708, and a drive 709. The CPU 701 through the drive 709 are connected to a bus and perform necessary communications with one another.
 The CPU 701 through the storage 703 and the operation unit 705 through the drive 709 are configured similarly to the CPU 201 through the storage 203 and the operation unit 205 through the drive 209 of the server 12 in FIG. 4, respectively.
 The image DB 704 accumulates map information, image data of the surroundings of each position in the map information, and feature data indicating the features of each piece of image data.
 In the server 413, as in the information processing terminal 11, the program executed by the CPU 701 can be recorded in advance in the storage 703 serving as a recording medium built into the server 413.
 The program can also be stored (recorded) on removable media 709A, provided as packaged software, and installed in the server 413 from the removable media 709A.
 Alternatively, the program can be downloaded from another server (not shown) or the like via the network 421 and the communication unit 707 and installed in the server 413.
 When the CPU 701 executes the program installed in the server 413, functions including a control unit 731, a position verification unit 732, and a PoL block generation unit 733 are realized.
 The control unit 731 controls the processing of each unit of the server 413.
 The position verification unit 732 verifies the watermark superimposed on the image data included in the location proof information received from the camera 412, and also verifies the position information included in the location proof information by comparing the image data accumulated in the image DB 704 with the image data included in the location proof information. The position verification unit 732 includes a secret key generation unit 741, a watermark extraction unit 742, a watermark verification unit 743, a feature extraction unit 744, and a feature verification unit 745.
 The secret key generation unit 741 generates a secret key and transmits it to the camera 412 via the communication unit 707 and the network 421.
 The watermark extraction unit 742 extracts the watermark from the image data received from the camera 412 using the secret key generated by the secret key generation unit 741, and supplies the extracted watermark to the watermark verification unit 743.
 The watermark verification unit 743 verifies the watermark extracted from the image data and supplies the verification result to the feature extraction unit 744.
 The feature extraction unit 744 extracts features of the image data received from the camera 412 and supplies data indicating the feature extraction result to the feature verification unit 745.
 The feature verification unit 745 verifies the position information received from the camera 412 by comparing the features extracted from the image data received from the camera 412 with the features of the image data stored in the image DB 704.
 The PoL block generation unit 733 mines the image data and position information received from the camera 412, generates a PoL block, and adds it to the blockchain. The PoL block generation unit 733 also transmits the PoL block to the other nodes constituting the blockchain network so that they add it to their blockchains.
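 The patent states only that mining is performed "by a predetermined method" without specifying it. As an assumption-laden sketch for orientation, the following shows a PoL-style block with a simple hash-prefix proof-of-work; the block fields, the difficulty scheme, and the use of JSON serialization are all illustrative choices, and the real consensus mechanism could be entirely different.

```python
import hashlib
import json
import time

def make_pol_block(prev_hash: str, location_proof: dict, difficulty: int = 4) -> dict:
    """Assemble a PoL block and search for a nonce whose hash has the required prefix.
    `location_proof` is assumed to be a JSON-serializable dict (e.g. a video fingerprint
    plus position information)."""
    block = {
        "timestamp": time.time(),
        "prev_hash": prev_hash,
        "location_proof": location_proof,
        "nonce": 0,
    }
    target = "0" * difficulty
    while True:
        payload = json.dumps(block, sort_keys=True).encode()
        digest = hashlib.sha256(payload).hexdigest()
        if digest.startswith(target):
            block["hash"] = digest
            return block
        block["nonce"] += 1

def verify_pol_block(block: dict, difficulty: int = 4) -> bool:
    """Recompute the hash and check both the stored value and the difficulty target."""
    body = {k: v for k, v in block.items() if k != "hash"}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return digest == block.get("hash") and digest.startswith("0" * difficulty)
```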
  <Processing of the information processing system 401>
 Next, the processing of the information processing system 401 will be described with reference to the timing chart of FIG. 29.
 In the following, an example in which the camera 412 shoots a moving image will be described.
 In step S701, the secret key generation unit 741 of the server 413 generates a secret key.
 In step S702, the secret key generation unit 741 of the server 413 shares the secret key. Specifically, the secret key generation unit 741 transmits the secret key to the camera 412 via the communication unit 707 and the network 421.
 In response, the watermark superimposition unit 633 of the camera 412 receives the secret key via the communication unit 610 and stores it in the storage 603.
 As a result, the secret key is shared between the server 413 and the camera 412.
 Note that any method of transmitting the secret key to the camera 412 can be adopted as long as the secret key can be transmitted safely and secretly.
 In step S703, the camera 412 starts shooting a moving image, superimposing the watermark, and storing position information.
 Specifically, the imaging unit 607 starts shooting a moving image and supplying the obtained moving image data to the watermark superimposition unit 633.
 The position detection unit 632 starts processing to detect the current position of the camera 412 based on the position detection data output from the GNSS receiver 608, to generate position information including the detected current position and current time, and to supply the position information to the watermark superimposition unit 633.
 The watermark superimposition unit 633 starts processing to generate a watermark using the secret key stored in the storage 603 and superimpose it on each frame of the moving image data. The watermark superimposition unit 633 also starts processing to associate each watermarked frame of the moving image data with position information and store them in the storage 603.
 Note that it is not necessary to associate position information with every frame of the moving image data; for example, position information may be associated with every predetermined number of frames.
 In step S704, the accident detection unit 532 of the accident trigger generator 411 detects the occurrence of an accident based on at least one of the image data output from the imaging unit 507 and the sensor data output from the sensing unit 508. Accidents to be detected include not only accidents involving the user who carries the accident trigger generator 411 but also accidents occurring around the accident trigger generator 411.
 In step S705, the accident detection unit 532 of the accident trigger generator 411 scans for surrounding cameras 412 (Witnesses) via the communication unit 509. For example, the accident detection unit 532 scans for cameras 412 within the BT communication range of the communication unit 509.
 In step S706, the accident detection unit 532 of the accident trigger generator 411 transmits a request trigger. Specifically, the accident detection unit 532 transmits a request trigger by BT via the communication unit 509 to the cameras 412 detected in the process of step S705.
 In response, the CPU 601 of the camera 412 receives the request trigger via the communication unit 610.
 In step S707, the location proof unit 634 of the camera 412 checks whether moving image data for the target period exists. Specifically, the location proof unit 634 checks whether moving image data for a predetermined period before and after the request trigger was received (that is, before and after the accident was detected by the accident trigger generator 411) is stored in the storage 603.
 In step S708, the camera 412 transmits the moving image data and the position information to the server 413. Specifically, the location proof unit 634 reads from the storage 603 the moving image data for the predetermined target period before and after the request trigger was received, together with the position information associated with each frame of the moving image data. The location proof unit 634 transmits location proof information including the read moving image data and position information to the server 413 via the communication unit 610 and the network 421.
 In response, the CPU 701 of the server 413 receives the location proof information via the network 421 and the communication unit 707.
 In step S709, the server 413 performs position verification processing.
 In step S710, the server 413 performs mining and generates a PoL block.
 In step S711, the server 413 verifies the PoL block and adds it to the blockchain.
 Here, the details of the processing of steps S709 to S711 (position verification processing) will be described with reference to the flowchart of FIG. 30.
 In step S731, the watermark extraction unit 742 extracts the watermark from the moving image data. Specifically, the watermark extraction unit 742 extracts the watermark of each frame of the moving image data included in the location proof information received from the camera 412, using the secret key generated in the process of step S701. The watermark extraction unit 742 supplies the extracted watermark to the watermark verification unit 743.
 In step S732, the watermark verification unit 743 determines whether the watermark extracted from the moving image data is valid. If the watermark is determined to be valid, the process proceeds to step S733.
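 Continuing the earlier watermark assumption (HMAC-derived bits hidden in least-significant bits), a verification sketch would re-derive the expected bits with the shared secret key and compare them with what is embedded in the frame. Again, this is an illustrative assumption, not the verification method claimed in the patent.

```python
import hmac
import hashlib
import numpy as np

def expected_mark_bits(secret_key: bytes, frame_id: int, timestamp: float, n_bits: int) -> np.ndarray:
    """Re-derive the key-dependent bit pattern for one frame (same derivation as on the camera side)."""
    digest = hmac.new(secret_key, f"{frame_id}:{timestamp}".encode(), hashlib.sha256).digest()
    rng = np.random.default_rng(int.from_bytes(digest[:8], "big"))
    return rng.integers(0, 2, size=n_bits, dtype=np.uint8)

def watermark_is_valid(frame: np.ndarray, secret_key: bytes, frame_id: int, timestamp: float) -> bool:
    """Compare the least-significant bits of the frame with the expected pattern."""
    flat = frame.reshape(-1)
    bits = expected_mark_bits(secret_key, frame_id, timestamp, n_bits=min(1024, flat.size))
    return bool(np.array_equal(flat[:bits.size] & 0x01, bits))
```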
 This proves that the moving image data received from the camera 412 is genuine.
 In step S733, the feature extraction unit 744 extracts features of the moving image data. Specifically, the watermark verification unit 743 notifies the feature extraction unit 744 that the watermark of the moving image data is valid.
 The feature extraction unit 744 extracts features of each frame of the moving image data using an arbitrary method. The features extracted at this time are features that change little over time. For example, the feature extraction unit 744 extracts feature points of stationary objects that change little, such as buildings and mountains, using SIFT (Scale Invariant Feature Transform) features or the like. As another example, the feature extraction unit 744 recognizes characters on guide signs, signboards, and the like in the moving image and extracts the recognized characters as features. The feature extraction unit 744 supplies data indicating the feature extraction result of the moving image data to the feature verification unit 745.
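 Since the text explicitly mentions SIFT features as one possibility, the following short sketch shows feature extraction with OpenCV's SIFT implementation; the function name and the assumption that frames arrive as BGR arrays are illustrative choices.

```python
import cv2  # OpenCV; SIFT is available in recent releases via cv2.SIFT_create()

def extract_frame_features(frame_bgr):
    """Extract SIFT keypoints and descriptors from one video frame (BGR NumPy array)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(gray, None)
    return keypoints, descriptors
```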
 In step S734, the feature verification unit 745 refers to the image DB 704 using the position information received from the camera 412 as a key. That is, the feature verification unit 745 extracts from the image DB 704, as image data corresponding to each frame of the received moving image data, a plurality of pieces of image data for the positions indicated by the position information associated with each frame. The feature verification unit 745 also extracts from the image DB 704 the feature data indicating the features of each piece of extracted image data.
 In step S735, the feature verification unit 745 determines whether the correlation between the features of the moving image data and the features in the image DB 704 is equal to or greater than a threshold. Specifically, the feature verification unit 745 calculates the correlation between the features extracted from each frame of the moving image data and the features of the corresponding image data extracted from the image DB 704. When the feature verification unit 745 determines that the calculated correlation is equal to or greater than a predetermined threshold, that is, when the features of each frame of the moving image data are similar to the features of the corresponding image data in the image DB 704, the process proceeds to step S736.
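 One way to turn the "correlation is at least a threshold" test into code is descriptor matching with a ratio test, scoring the fraction of confidently matched descriptors; the matcher choice, the 0.75 ratio, and the 0.3 threshold below are assumptions made for illustration (the patent only speaks of "a predetermined threshold").

```python
import cv2
import numpy as np

SIMILARITY_THRESHOLD = 0.3  # assumed value for illustration

def feature_similarity(desc_frame: np.ndarray, desc_reference: np.ndarray) -> float:
    """Return the fraction of frame descriptors with a confident match in the
    reference descriptors, using Lowe's ratio test."""
    if desc_frame is None or desc_reference is None or len(desc_frame) == 0:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(desc_frame, desc_reference, k=2)
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
    return len(good) / len(desc_frame)

def position_is_consistent(desc_frame, desc_reference) -> bool:
    """Accept the claimed position when the frame resembles the reference imagery enough."""
    return feature_similarity(desc_frame, desc_reference) >= SIMILARITY_THRESHOLD
```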
 This proves that the moving image data received from the camera 412 corresponds to a moving image shot at the position indicated by the position information received from the camera 412. In other words, it proves that the camera 412 was present at the position indicated by the position information when the moving image was shot. That is, the authenticity of the moving image data and the position information included in the location proof information is proved.
 In step S736, the PoL block generation unit 733 performs mining and generates a PoL block. Specifically, the feature verification unit 745 notifies the PoL block generation unit 733 that the moving image data and position information received from the camera 412 are authentic. The PoL block generation unit 733 mines the moving image data and position information by a predetermined method and generates a PoL block including the location proof information.
 In step S737, the PoL block generation unit 733 verifies the PoL block and adds it to the blockchain. Specifically, the PoL block generation unit 733 verifies the PoL block by a predetermined method and, when it determines that the PoL block is valid, adds the PoL block to the blockchain. When it determines that the PoL block is valid, the PoL block generation unit 733 also transmits the PoL block to the other nodes constituting the blockchain network so that they add it to their blockchains.
 The position verification processing then ends.
 On the other hand, when the feature verification unit 745 determines in step S735 that the calculated correlation is less than the predetermined threshold, that is, when the features of each frame of the moving image data are not similar to the features of the corresponding image data in the image DB 704, the processing of steps S736 and S737 is skipped and the position verification processing ends. This is the case where at least one of the moving image data and the position information included in the location proof information is not authentic.
 As described above, it is possible to prove that the camera 412 was present near the accident site before and after the accident occurred. In addition, moving image data captured near the accident site can be easily collected and used as a basis for, for example, calculating the percentage of fault in the accident. Furthermore, falsification of the moving image data and position information can be prevented, increasing their value as evidence.
 Note that the camera 412 can also transmit still image data instead of moving image data.
 For example, the accident trigger generator 411 and the camera 412 may be included in one housing and configured as a single device. In this case, for example, one of the functions duplicated between the two can be omitted.
 For example, the accident trigger generator 411 may transmit a Witness request trigger when it detects a trigger other than an accident (for example, a predetermined event or a predetermined timing). This makes it possible to collect moving image data and position information captured before and after the occurrence of a predetermined trigger.
 <<3. Modifications>>
 Modifications of the embodiments of the present technology described above will be described below.
 In the above description, the position of the information processing terminal 11 is represented by latitude and longitude, but the position may be represented by other methods. For example, in consideration of the ease of position search, privacy protection, and the like, the position of the information processing terminal 11 may be represented using Geohash, S2 Geometry, or the like. Altitude may also be added to the latitude and longitude to represent the position of the information processing terminal 11.
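 As a concrete reference for the Geohash option mentioned above, the sketch below implements standard Geohash encoding (interleaving longitude and latitude bits into a base-32 string). Shorter strings describe coarser cells, which is one way to trade search convenience against location privacy; the function name and precision default are illustrative.

```python
_BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # standard Geohash alphabet

def geohash_encode(latitude: float, longitude: float, precision: int = 9) -> str:
    """Encode a latitude/longitude pair into a Geohash string of the given length."""
    lat_range, lon_range = [-90.0, 90.0], [-180.0, 180.0]
    code, ch, bit_count = [], 0, 0
    use_longitude = True  # Geohash interleaves bits starting with longitude
    while len(code) < precision:
        if use_longitude:
            mid = (lon_range[0] + lon_range[1]) / 2
            if longitude >= mid:
                ch = (ch << 1) | 1
                lon_range[0] = mid
            else:
                ch <<= 1
                lon_range[1] = mid
        else:
            mid = (lat_range[0] + lat_range[1]) / 2
            if latitude >= mid:
                ch = (ch << 1) | 1
                lat_range[0] = mid
            else:
                ch <<= 1
                lat_range[1] = mid
        use_longitude = not use_longitude
        bit_count += 1
        if bit_count == 5:
            code.append(_BASE32[ch])
            ch, bit_count = 0, 0
    return "".join(code)

# e.g. geohash_encode(35.6586, 139.7454, precision=5) gives a cell of roughly a few
# kilometres around central Tokyo; longer strings narrow the cell down.
```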
 The method of detecting the position of the information processing terminal 11 is not limited to the method using the GNSS receiver 108 described above, and other methods can be used.
 Furthermore, for example, the server 12 in FIG. 1 may constitute one of the nodes of the blockchain network 13.
 Note that, in addition to the technology described above for guaranteeing the authenticity of position information, adopting a technology that proves that the information processing terminal 11 or the camera 412 is in the possession of the user rather than another person can further increase the reliability of the position information. Any technology can be adopted for this purpose.
 The present technology can also be used in technologies and situations other than the insurance example described above in which the authenticity of a user's position needs to be guaranteed.
 <<4. Others>>
  <Configuration example of a computer>
 The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed in a computer. Here, the computer includes a computer built into dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
 The program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
 In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules is housed in one housing, are both systems.
 Furthermore, the embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible without departing from the gist of the present technology.
 For example, the present technology can have a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
 Each step described in the above flowcharts can be executed by one device or shared and executed by a plurality of devices.
 Furthermore, when a plurality of processes is included in one step, the plurality of processes included in that one step can be executed by one device or shared and executed by a plurality of devices.
  <Examples of combinations of configurations>
 The present technology can also have the following configurations.
(1)
 所定のトリガの検出時に現在位置を検出し、現在位置及び現在時刻を含む第1の位置情報を生成する位置検出部と、
 前記トリガの検出時に、周囲に存在する第1の情報処理装置に位置証明を要求し、第1の位置証明情報を前記第1の情報処理装置から受信する位置証明取得部と、
 前記第1の位置情報及び前記第1の位置証明情報を記録する第2の情報処理装置に前記第1の位置情報及び前記第1の位置証明情報を送信する位置登録部と
 を備える情報処理装置。
(2)
 前記位置証明取得部は、近距離無線通信により、前記第1の位置情報を含む位置証明リクエストを前記第1の情報処理装置に送信し、前記第1の位置証明情報を含む位置証明レスポンスを前記第1の情報処理装置から受信する
 前記(1)に記載の情報処理装置。
(3)
 前記位置証明リクエストは、前記第1の位置情報及び公開鍵と、前記第1の位置情報を含む平文から前記公開鍵に対応する秘密鍵を用いて生成される電子署名とを含む
 前記(2)に記載の情報処理装置。
(4)
 前記位置証明リクエストは、前記第1の位置情報に関連付けるメタデータ、又は、前記メタデータのフィンガープリントをさらに含み、
 前記平文は、前記メタデータ又は前記フィンガープリントをさらに含む
 前記(3)に記載の情報処理装置。
(5)
 前記第1の位置情報及び前記メタデータを含み、保険に関する申告を行うための申告データを第3の情報処理装置に送信する申告部を
 さらに備える前記(4)に記載の情報処理装置。
(6)
 前記メタデータは、事故の検出に用いられる事故データ、前記事故データのハッシュ値、周囲を撮影した画像に対応する画像データ、前記画像データのハッシュ値、ユーザの活動の検出に用いられる活動データ、及び、前記活動データのハッシュ値のうち少なくとも1つを含む
 前記(4)又は(5)に記載の情報処理装置。
(7)
 暗号鍵を用いて暗号化した前記画像データを第3の情報処理装置に送信し、前記画像データに対して付与された画像IDを前記第3の情報処理装置から受信する画像登録部を
 さらに備え、
 前記位置検出部は、前記画像データに対応する画像の撮影開始から終了までの間の所定のタイミングの検出時に、現在位置を検出し、前記第1の位置情報を生成し、
 前記位置証明取得部は、前記タイミングの検出時に、前記第1の情報処理装置に位置証明を要求し、前記第1の位置証明情報を前記第1の情報処理装置から受信する
 前記(6)に記載の情報処理装置。
(8)
 前記第3の情報処理装置から前記第1の位置情報に示される位置付近及び時刻付近において撮影された画像に対応する前記画像データが要求された場合、該当する前記画像データに対応する前記画像ID、及び、前記暗号鍵に対応する復号鍵を前記第3の情報処理装置に送信する情報提供部を
 さらに備える前記(7)に記載の情報処理装置。
(9)
 前記位置証明取得部は、前記第2の情報処理装置において、前記公開鍵及び前記電子署名を用いて前記位置証明リクエストが正当であると判定された場合、前記位置証明レスポンスを前記第1の情報処理装置から受信する
 前記(3)乃至(8)のいずれかに記載の情報処理装置。
(10)
 前記位置証明レスポンスは、前記第1の情報処理装置の現在位置及び現在時刻を含む第2の位置情報及び公開鍵と、前記第2の位置情報を含む平文から前記公開鍵に対応する秘密鍵を用いて生成される電子署名とを含み、
 前記位置登録部は、前記公開鍵及び前記電子署名を用いて前記位置証明レスポンスが正当であると判定した場合、前記第1の位置情報及び前記第1の位置証明情報を含む位置情報トランザクションを前記第2の情報処理装置に送信する
 前記(2)に記載の情報処理装置。
(11)
 前記位置証明レスポンスは、前記位置証明リクエストをさらに含み、
 前記平文は、前記位置証明リクエストをさらに含み、
 前記位置情報トランザクションは、前記位置証明レスポンスを含む
 前記(10)に記載の情報処理装置。
(12)
 前記位置証明取得部は、前記近距離無線通信の通信範囲内に存在する前記第1の情報処理装置に前記位置証明リクエストを送信する
 前記(2)乃至(11)のいずれかに記載の情報処理装置。
(13)
 前記第1の位置証明情報は、前記第1の情報処理装置の現在位置及び現在時刻を含む第2の位置情報を含む
 前記(1)乃至(12)のいずれかに記載の情報処理装置。
(14)
 前記第1の位置情報を含み、保険に関する申告を行うための申告データを第3の情報処理装置に送信する申告部を
 さらに備える前記(1)に記載の情報処理装置。
(15)
 前記申告データは、時系列の前記第1の位置情報を含む移動データを含む
 前記(14)に記載の情報処理装置。
(16)
 前記申告データは、前記保険の保険料、保険金、又は、特典の計算に用いられる
 前記(14)又は(15)に記載の情報処理装置。
(17)
 前記トリガは、所定のイベント又は所定のタイミングである
 前記(1)乃至(16)のいずれかに記載の情報処理装置。
(18)
 前記第1の位置情報及び前記第1の位置証明情報は、前記第2の情報処理装置によりブロックチェーンに記録される
 前記(1)乃至(17)のいずれかに記載の情報処理装置。
(19)
 周囲に存在する第3の情報処理装置から位置証明が要求された場合、第2の位置証明情報を生成し、前記第3の情報処理装置に送信する位置証明部を
 さらに備える前記(1)に記載の情報処理装置。
(20)
 所定のトリガの検出時に現在位置を検出し、現在位置及び現在時刻を含む位置情報を生成し、
 前記トリガの検出時に、周囲に存在する第1の情報処理装置に位置証明を要求し、位置証明情報を前記第1の情報処理装置から受信し、
 前記位置情報及び前記位置証明情報を記録する第2の情報処理装置に前記位置情報及び前記位置証明情報を送信する
 処理をコンピュータに実行させるためのプログラム。
(21)
 所定のトリガを検出した第1の情報処理装置が周囲の他の情報処理装置に向けて送信した位置証明の要求を受信したとき、位置証明情報を生成し、前記位置証明情報を送信する位置証明部を
 備える情報処理装置。
(22)
 前記位置証明部は、近距離無線通信により、前記第1の情報処理装置の現在位置及び現在時刻を含む第1の位置情報を含む位置証明リクエストを前記第1の情報処理装置から受信し、前記位置証明情報を含む位置証明レスポンスを前記第1の情報処理装置に送信する
 前記(21)に記載の情報処理装置。
(23)
 前記位置証明リクエストは、前記第1の位置情報及び公開鍵と、前記第1の位置情報を含む平文から前記公開鍵に対応する秘密鍵を用いて生成される電子署名とを含み、
 前記位置証明部は、前記公開鍵及び前記電子署名を用いて前記位置証明リクエストが正当であると判定した場合、前記位置証明レスポンスを前記第1の情報処理装置に送信する
 前記(22)に記載の情報処理装置。
(24)
 前記位置証明の要求を受信したとき、現在位置を検出し、現在位置及び現在時刻を含む第2の位置情報を生成する位置検出部を
 さらに備え、
 前記位置証明レスポンスは、前記第2の位置情報及び公開鍵と、前記第2の位置情報を含む第1の平文から前記公開鍵に対応する秘密鍵を用いて生成される第1の電子署名とを含む
 前記(22)に記載の情報処理装置。
(25)
 前記位置証明レスポンスは、前記位置証明リクエストをさらに含み、
 前記第1の平文は、前記位置証明リクエストをさらに含む
 前記(24)に記載の情報処理装置。
(26)
 前記位置証明レスポンスは、前記第2の位置情報に関連付けるメタデータ、又は、前記メタデータのフィンガープリントをさらに含み、
 前記第1の平文は、前記メタデータ又は前記フィンガープリントをさら含む
 前記(24)又は(25)に記載の情報処理装置。
(27)
 前記第1の位置情報及び前記位置証明情報を取得した第2の情報処理装置から前記第1の位置情報に示される位置付近及び時刻付近において撮影した画像に対応する画像データが要求された場合、前記画像データと、前記画像データを含む第2の平文から前記秘密鍵を用いて生成される第2の電子署名とを前記第2の情報処理装置に送信する情報提供部を
 さらに備える前記(24)に記載の情報処理装置。
(28)
 前記情報提供部は、前記第2の情報処理装置から要求された前記画像データの撮影場所及び撮影時刻に関する条件の表示を制御する
 前記(27)に記載の情報処理装置。
(29)
 前記位置証明の要求を受信したとき、現在位置を検出し、現在位置及び現在時刻を含む位置情報を生成する位置検出部を
 さらに含み、
 前記位置証明情報は、前記位置情報を含む
 前記(21)乃至(28)のいずれかに記載の情報処理装置。
(30)
 前記位置証明情報は、前記位置証明の要求を受信した時刻付近において周囲を撮影した画像に対応し、透かしが重畳された画像データ、及び、前記位置情報を含み、
 前記位置証明部は、前記位置証明情報を記録する第2の情報処理装置に前記位置証明情報を送信する
 前記(29)に記載の情報処理装置。
(31)
 前記位置証明情報は、前記第2の情報処理装置によりブロックチェーンに記録される
 前記(30)に記載の情報処理装置。
(32)
 所定のトリガを検出した情報処理装置が周囲の他の情報処理装置に向けて送信した位置証明の要求を受信したとき、位置証明情報を生成し、前記位置証明情報を送信する
 処理をコンピュータに実行させるためのプログラム。
(33)
 第1の情報処理装置が所定のトリガを検出したときの前記第1の情報処理装置の現在位置及び現在時刻を含む第1の位置情報を含み、前記第1の情報処理装置から受信した申告データを、前記トリガの検出時に前記第1の情報処理装置の周囲に存在していた第2の情報処理装置が前記第1の情報処理装置からの位置証明の要求に応じて生成した位置証明情報を用いて検証する検証部と、
 前記検証部により前記申告データが真正であると判定された場合、前記申告データに対応した処理を実行する実行部と
 を備える情報処理装置。
(34)
 前記検証部は、前記第1の位置情報に基づいて、前記第1の位置情報及び前記位置証明情報を含む位置証明ブロックを記録する第3の情報処理装置から前記位置証明情報を取得する
 前記(33)に記載の情報処理装置。
(35)
 前記申告データは、前記第1の位置情報に関連付けられたメタデータをさらに含み、
 前記位置証明ブロックは、前記第1の位置情報、前記位置証明情報、及び、前記メタデータ又は前記メタデータのフィンガープリントを含み、
 前記検証部は、前記第1の位置情報、及び、前記メタデータ又は前記フィンガープリントに基づいて、前記第2の情報処理装置から前記位置証明情報を取得する
 前記(34)に記載の情報処理装置。
(36)
 前記実行部は、前記第1の位置情報及び前記メタデータに基づいて、前記申告データに対応した処理を実行する
 前記(35)に記載の情報処理装置。
(37)
 前記位置証明ブロックは、ブロックチェーンに記録されている
 前記(34)又は(35)に記載の情報処理装置。
(38)
 前記第1の位置情報に示される位置付近及び時刻付近において撮影された画像に対応する画像データを前記第2の情報処理装置に要求し、前記画像データを前記第2の情報処理装置から受信する情報収集部を
 さらに備える前記(33)に記載の情報処理装置。
(39)
 前記位置証明情報は、前記第1の情報処理装置から前記位置証明が要求されたときの前記第2の情報処理装置の現在位置及び現在時刻を含む第2の位置情報を含み、
 前記情報収集部は、前記第1の位置情報、前記位置証明情報、及び、公開鍵と、前記第1の位置情報及び前記位置証明情報を含む第1の平文から前記公開鍵に対応する秘密鍵を用いて生成された第1の電子署名とを含む位置証明ブロックを記録する第3の情報処理装置から前記位置証明情報を取得し、
 前記情報収集部は、前記第2の情報処理装置に前記第2の位置情報及び前記公開鍵を送付して前記画像データを要求し、前記画像データと、前記画像データを含む第2の平文から前記秘密鍵を用いて生成された第2の電子署名とを前記第2の情報処理装置から受信し、前記第2の電子署名を用いて前記画像データを検証する
 前記(38)に記載の情報処理装置。
(40) 
 暗号鍵を用いて暗号化された画像データを第3の情報処理装置から受信し、前記画像データに対して付与した画像IDを前記第3の情報処理装置に送信し、暗号化された前記画像データ及び前記画像IDを関連付けてデータベースに保存させる情報収集部を
 さらに備え、
 前記第1の位置情報に示される位置付近及び時刻付近において撮影された画像を前記第3の情報処理装置に要求し、該当する前記画像データに対応する前記画像ID及び前記暗号鍵に対応する復号鍵を前記第3の情報処理装置から受信する
 前記(33)に記載の情報処理装置。
(41) 
 前記第1の情報処理装置から位置証明が要求された時刻付近において撮影された画像に対応し、透かしが重畳された第1の画像データ、及び、前記画像を撮影した位置及び時刻を含む第2の位置情報を含む第2の位置証明情報を第3の情報処理装置から受信し、前記透かしを検証するとともに、地図情報の各位置周辺の画像データを蓄積するデータベースから前記第2の位置情報に示される位置に対応する第2の画像データを抽出し、前記第2の画像データと前記第1の画像データとを比較することにより、前記第2の位置証明情報を検証する位置検証部を
 さらに備える前記(33)に記載の情報処理装置。
(42)
 前記位置検証部は、前記第2の位置証明情報をブロックチェーンに記録する
 前記(41)に記載の情報処理装置。
(43)
 前記申告データは、保険に関する申告を行うためのデータであり、
 前記実行部は、前記申告データに基づいて、前記保険の保険料、保険金、又は、特典を計算する
 前記(33)乃至(42)のいずれかに記載の情報処理装置。
(1)
a position detection unit that detects a current position when a predetermined trigger is detected and generates first position information including the current position and the current time;
a location certification acquisition unit that requests location certification from a first information processing device present in the vicinity when the trigger is detected and receives first location certification information from the first information processing device;
an information processing apparatus comprising: a location registration unit that transmits the first location information and the first location proof information to a second information processing apparatus that records the first location information and the first location proof information. .
(2)
The location certification acquisition unit transmits a location certification request including the first location information to the first information processing device by short-range wireless communication, and transmits a location certification response including the first location certification information to the The information processing device according to (1), received from the first information processing device.
(3)
The location certification request includes the first location information, a public key, and an electronic signature generated from plaintext including the first location information using a private key corresponding to the public key. (2) The information processing device according to .
(4)
the location request further includes metadata associated with the first location information or a fingerprint of the metadata;
The information processing apparatus according to (3), wherein the plaintext further includes the metadata or the fingerprint.
(5)
The information processing device according to (4), further comprising: a reporting unit that transmits reporting data for reporting insurance, including the first location information and the metadata, to a third information processing device.
(6)
The metadata includes accident data used to detect an accident, a hash value of the accident data, image data corresponding to an image of the surroundings, a hash value of the image data, activity data used to detect user activity, and at least one of hash values of the activity data. The information processing device according to (4) or (5).
(7)
An image registration unit that transmits the image data encrypted using the encryption key to a third information processing device and receives an image ID assigned to the image data from the third information processing device. ,
The position detection unit detects a current position and generates the first position information when detecting a predetermined timing from the start to the end of capturing an image corresponding to the image data,
The location certification acquisition unit requests location certification from the first information processing device when the timing is detected, and receives the first location certification information from the first information processing device. The information processing device described.
(8)
When the third information processing device requests the image data corresponding to the image taken near the position and the time indicated by the first position information, the image ID corresponding to the corresponding image data , and an information providing unit configured to transmit a decryption key corresponding to the encryption key to the third information processing apparatus. The information processing apparatus according to (7).
(9)
When the second information processing device determines that the location certification request is valid using the public key and the electronic signature, the location certification acquisition unit converts the location certification response to the first information. The information processing device according to any one of (3) to (8), received from the processing device.
(10)
The location proof response includes second location information including the current location and current time of the first information processing device, a public key, and a private key corresponding to the public key from plaintext including the second location information. an electronic signature generated using
When the location registration unit determines that the location certification response is valid using the public key and the electronic signature, the location registration unit transmits the location information transaction including the first location information and the first location certification information to the The information processing device according to (2), which transmits to the second information processing device.
(11)
the location proof response further includes the location proof request;
said plaintext further comprising said location proof request;
The information processing device according to (10), wherein the location information transaction includes the location certification response.
(12)
The information processing according to any one of (2) to (11), wherein the location certification acquisition unit transmits the location certification request to the first information processing device existing within a communication range of the short-range wireless communication. Device.
(13)
The information processing device according to any one of (1) to (12), wherein the first location proof information includes second location information including the current location and current time of the first information processing device.
(14)
The information processing device according to (1), further comprising: a reporting unit that transmits reporting data for reporting insurance, including the first position information, to a third information processing device.
(15)
The information processing device according to (14), wherein the report data includes movement data including the first position information in time series.
(16)
The information processing apparatus according to (14) or (15), wherein the report data is used for calculating premiums, insurance benefits, or benefits of the insurance.
(17)
The information processing apparatus according to any one of (1) to (16), wherein the trigger is a predetermined event or predetermined timing.
(18)
The information processing device according to any one of (1) to (17), wherein the first location information and the first location proof information are recorded in a block chain by the second information processing device.
(19)
(1), further comprising a position proof unit that generates second position proof information and transmits the second position proof information to the third information processing device when a position proof is requested from a third information processing device existing in the vicinity; The information processing device described.
(20)
detecting a current position upon detection of a predetermined trigger and generating position information including the current position and current time;
When the trigger is detected, requesting position proof from a first information processing device existing in the vicinity, receiving position proof information from the first information processing device,
A program for causing a computer to execute a process of transmitting the position information and the position proof information to a second information processing device that records the position information and the position proof information.
(21)
Location certification for generating location certification information and transmitting the location certification information when a first information processing device that has detected a predetermined trigger receives a request for location certification transmitted to other surrounding information processing devices An information processing device comprising:
(22)
The location certification unit receives, from the first information processing device, a location certification request including first location information including the current location and current time of the first information processing device by short-range wireless communication, and The information processing device according to (21), wherein a position proof response including position proof information is transmitted to the first information processing device.
(23)
the location certification request includes the first location information, a public key, and an electronic signature generated from plaintext including the first location information using a private key corresponding to the public key;
The location certification unit transmits the location certification response to the first information processing device when determining that the location certification request is valid using the public key and the electronic signature. information processing equipment.
(24)
further comprising a location detection unit that detects a current location and generates second location information including the current location and the current time when the request for location certification is received;
The location proof response includes the second location information and a public key, and a first digital signature generated from a first plaintext containing the second location information using a private key corresponding to the public key. The information processing apparatus according to (22).
(25)
the location proof response further includes the location proof request;
The information processing apparatus according to (24), wherein the first plaintext further includes the location certification request.
(26)
the location proof response further includes metadata associated with the second location information or a fingerprint of the metadata;
The information processing device according to (24) or (25), wherein the first plaintext further includes the metadata or the fingerprint.
(27)
When image data corresponding to an image taken near the position and time indicated by the first position information is requested from the second information processing device that has acquired the first position information and the position proof information, The (24 ).
(28)
The information processing apparatus according to (27), wherein the information providing unit controls display of conditions relating to a shooting location and shooting time of the image data requested by the second information processing apparatus.
(29)
a location detection unit that detects a current location and generates location information including the current location and current time when the request for location certification is received;
The information processing apparatus according to any one of (21) to (28), wherein the location proof information includes the location information.
(30)
The location certification information includes image data superimposed with a watermark corresponding to an image of the surroundings captured around the time when the request for location certification was received, and the location information;
The information processing apparatus according to (29), wherein the location certification unit transmits the location certification information to a second information processing apparatus that records the location certification information.
(31)
The information processing device according to (30), wherein the position proof information is recorded in a block chain by the second information processing device.
(32)
When an information processing device that has detected a predetermined trigger receives a request for position proof sent to another information processing device in the vicinity, the computer executes processing for generating position proof information and sending the position proof information. program to make
(33)
Reporting data received from the first information processing device, including first position information including the current position and current time of the first information processing device when the first information processing device detects a predetermined trigger. position proof information generated in response to a position proof request from the first information processing device by a second information processing device that was present in the vicinity of the first information processing device when the trigger was detected. a verification unit that verifies using
An information processing apparatus comprising: an execution unit that executes processing corresponding to the declaration data when the verification unit determines that the declaration data is authentic.
(34)
The verification unit acquires the location proof information from a third information processing device that records a location proof block containing the first location information and the location proof information, based on the first location information. 33) The information processing device described in 33).
(35)
The information processing apparatus according to (34), wherein the declaration data further includes metadata associated with the first position information, the location proof block includes the first position information, the location proof information, and the metadata or a fingerprint of the metadata, and the verification unit acquires the location proof information from the second information processing device on the basis of the first position information and the metadata or the fingerprint.
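As a rough sketch of the lookup implied by (34) and (35) — the recorded block layout and index keys here are assumptions, not the disclosed format — the verifier could use a hash of the metadata as the fingerprint and search the recorded location proof blocks by first position information plus fingerprint:
```python
# Sketch of locating the proof for a declaration, in the spirit of (34)/(35).
# The block layout and the lookup keys are assumptions for illustration.
import hashlib
import json


def metadata_fingerprint(metadata: bytes) -> str:
    return hashlib.sha256(metadata).hexdigest()


def find_location_proof(blocks, first_position: dict, metadata: bytes):
    wanted_fp = metadata_fingerprint(metadata)
    wanted_pos = json.dumps(first_position, sort_keys=True)
    for block in blocks:                     # e.g. blocks read from the ledger
        same_position = json.dumps(block["first_position"], sort_keys=True) == wanted_pos
        same_fingerprint = block.get("metadata_fingerprint") == wanted_fp
        if same_position and same_fingerprint:
            return block["location_proof"]   # proof info used to check the claim
    return None
```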
(36)
The information processing apparatus according to (35), wherein the execution unit executes the processing corresponding to the declaration data on the basis of the first position information and the metadata.
(37)
The information processing device according to (34) or (35), wherein the location proof block is recorded in a blockchain.
(38)
The information processing apparatus according to (33), further including an information collection unit that requests, from the second information processing device, image data corresponding to an image captured near the position and near the time indicated by the first position information, and receives the image data from the second information processing device.
(39)
The information processing apparatus according to (38), wherein the location proof information includes second position information including a current position and a current time of the second information processing device at the time the location proof was requested by the first information processing device, the information collection unit acquires the location proof information from a third information processing device that records a location proof block including the first position information, the location proof information, a public key, and a first electronic signature generated, using a private key corresponding to the public key, from a first plaintext including the first position information and the location proof information, and the information collection unit sends the second position information and the public key to the second information processing device to request the image data, receives, from the second information processing device, the image data and a second electronic signature generated, using the private key, from a second plaintext including the image data, and verifies the image data using the second electronic signature.
(40)
The information processing apparatus according to (33), further including an information collection unit that receives image data encrypted using an encryption key from a third information processing device, transmits an image ID assigned to the image data to the third information processing device, and stores the encrypted image data and the image ID in a database in association with each other, wherein the information collection unit requests, from the third information processing device, an image captured near the position and near the time indicated by the first position information, and receives the image ID corresponding to the relevant image data and a decryption key corresponding to the encryption key from the third information processing device.
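Item (40) can be pictured, under the assumption of a symmetric scheme such as Fernet from the Python `cryptography` package and invented identifiers, roughly as follows: the collector stores ciphertext under a server-issued image ID and can decrypt a stored image only once the uploader discloses the matching key.
```python
# Sketch of item (40): store encrypted images under server-issued IDs and
# decrypt one later once the holder hands over the matching key.
# Uses Fernet (symmetric) from the "cryptography" package; names are illustrative.
import uuid

from cryptography.fernet import Fernet


class EncryptedImageStore:
    def __init__(self):
        self._db = {}                       # image_id -> encrypted bytes

    def register(self, encrypted_image: bytes) -> str:
        image_id = uuid.uuid4().hex         # image ID returned to the uploader
        self._db[image_id] = encrypted_image
        return image_id

    def decrypt(self, image_id: str, decryption_key: bytes) -> bytes:
        # Called after the uploader discloses the key for a requested image.
        return Fernet(decryption_key).decrypt(self._db[image_id])


# Uploader side (the third information processing device of item (40)):
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"...jpeg bytes...")

store = EncryptedImageStore()
image_id = store.register(ciphertext)       # server stores ciphertext, returns ID
plaintext = store.decrypt(image_id, key)    # later, with the disclosed key
```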
(41)
The information processing apparatus according to (33), further including a position verification unit that receives, from a third information processing device, second location proof information including first image data on which a watermark is superimposed, the first image data corresponding to an image captured around the time when location proof was requested by the first information processing device, and second position information including the position and time at which the image was captured, verifies the watermark, extracts, from a database that accumulates image data around each position of map information, second image data corresponding to the position indicated by the second position information, and verifies the second location proof information by comparing the second image data with the first image data.
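The disclosure visible here does not fix a watermark or image-matching algorithm for (41); the sketch below therefore uses two deliberate stand-ins — a keyed tag in place of the superimposed watermark and an exact-digest comparison in place of robust image matching — only to show where the two checks sit in the flow.
```python
# Very loose stand-in for the two checks of item (41): (a) a keyed tag playing
# the role of the superimposed watermark, (b) a digest comparison playing the
# role of matching the submitted image against reference imagery for that spot.
# Neither is the scheme of the disclosure; both are placeholders.
import hashlib
import hmac


def verify_watermark_tag(image_bytes: bytes, tag: bytes, secret: bytes) -> bool:
    expected = hmac.new(secret, image_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)


def images_match(first_image: bytes, reference_image: bytes) -> bool:
    # Placeholder comparison; a real system would use robust image matching.
    return hashlib.sha256(first_image).digest() == hashlib.sha256(reference_image).digest()


def verify_second_location_proof(proof: dict, reference_db: dict, secret: bytes) -> bool:
    position_key = (round(proof["lat"], 4), round(proof["lon"], 4))
    reference_image = reference_db.get(position_key)
    if reference_image is None:
        return False
    return (verify_watermark_tag(proof["first_image"], proof["watermark_tag"], secret)
            and images_match(proof["first_image"], reference_image))
```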
(42)
The information processing apparatus according to (41), wherein the position verification unit records the second location proof information in a blockchain.
(43)
The information processing apparatus according to any one of (33) to (42), wherein the declaration data is data for making a declaration regarding insurance, and the execution unit calculates a premium, an insurance payout, or a benefit of the insurance on the basis of the declaration data.
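Item (43) leaves the actual rating logic open; the following toy calculation, with an invented base premium and discount rule, is only meant to illustrate "processing corresponding to the declaration data" once the declaration has been verified.
```python
# Toy example for item (43): adjust a premium from verified declaration data.
# The base premium and discount rule are invented for illustration only.
def compute_premium(declaration: dict, base_premium: float = 10000.0) -> float:
    # e.g. movement data proven by location proofs could earn a low-mileage
    # style discount; the threshold and rate here are made up.
    monthly_km = declaration.get("verified_monthly_km", 0.0)
    discount = 0.1 if monthly_km < 300 else 0.0
    return round(base_premium * (1.0 - discount), 2)


print(compute_premium({"verified_monthly_km": 250.0}))  # -> 9000.0
```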
Note that the effects described in this specification are merely examples and are not limiting; other effects may be provided.
1 information processing system, 11-1 to 11-n information processing terminal, 12 server, 13 blockchain network, 101 CPU, 107 imaging unit, 108 GNSS receiver, 109 sensing unit, 131 control unit, 132 position detection unit, 133 accident detection unit, 134 location proof processing unit, 135 declaration unit, 141 Prove processing unit, 142 Witness processing unit, 151 location proof acquisition unit, 152 location registration unit, 153 image registration unit, 161 location proof unit, 162 information providing unit, 201 CPU, 204 insurance DB, 231 control unit, 232 verification unit, 233 information collection unit, 234 insurance processing unit, 241 calculation unit, 242 execution unit, 401 information processing system, 411-1 to 411-m accident trigger generator, 412-1 to 412-n imaging device, 413 server, 501 CPU, 507 imaging unit, 508 sensing unit, 531 control unit, 532 accident detection unit, 601 CPU, 607 imaging unit, 608 GNSS receiver, 631 control unit, 632 position detection unit, 633 watermark superimposition unit, 634 location proof unit, 701 CPU, 704 image DB, 731 control unit, 732 position verification unit, 733 PoL block generation unit, 741 private key generation unit, 742 watermark extraction unit, 743 watermark verification unit, 744 feature extraction unit, 745 feature verification unit

Claims (43)

  1.  An information processing apparatus comprising:
      a position detection unit that, when a predetermined trigger is detected, detects a current position and generates first position information including the current position and a current time;
      a location proof acquisition unit that, when the trigger is detected, requests location proof from a first information processing device present in the vicinity and receives first location proof information from the first information processing device; and
      a location registration unit that transmits the first position information and the first location proof information to a second information processing device that records the first position information and the first location proof information.
  2.  The information processing apparatus according to claim 1, wherein the location proof acquisition unit transmits a location proof request including the first position information to the first information processing device by short-range wireless communication, and receives a location proof response including the first location proof information from the first information processing device.
  3.  The information processing apparatus according to claim 2, wherein the location proof request includes the first position information, a public key, and an electronic signature generated, using a private key corresponding to the public key, from a plaintext including the first position information.
  4.  The information processing apparatus according to claim 3, wherein the location proof request further includes metadata to be associated with the first position information or a fingerprint of the metadata, and the plaintext further includes the metadata or the fingerprint.
  5.  The information processing apparatus according to claim 4, further comprising a declaration unit that transmits, to a third information processing device, declaration data for making a declaration regarding insurance, the declaration data including the first position information and the metadata.
  6.  The information processing apparatus according to claim 4, wherein the metadata includes at least one of accident data used for detecting an accident, a hash value of the accident data, image data corresponding to an image of the surroundings, a hash value of the image data, activity data used for detecting an activity of a user, and a hash value of the activity data.
  7.  The information processing apparatus according to claim 6, further comprising an image registration unit that transmits the image data encrypted using an encryption key to a third information processing device and receives an image ID assigned to the image data from the third information processing device, wherein
      the position detection unit detects the current position and generates the first position information when a predetermined timing between the start and the end of capturing of the image corresponding to the image data is detected, and
      the location proof acquisition unit, when the timing is detected, requests location proof from the first information processing device and receives the first location proof information from the first information processing device.
  8.  The information processing apparatus according to claim 7, further comprising an information providing unit that, when the image data corresponding to an image captured near the position and near the time indicated by the first position information is requested by the third information processing device, transmits the image ID corresponding to the relevant image data and a decryption key corresponding to the encryption key to the third information processing device.
  9.  The information processing apparatus according to claim 3, wherein the location proof acquisition unit receives the location proof response from the first information processing device in a case where the location proof request is determined to be valid using the public key and the electronic signature in the second information processing device.
  10.  The information processing apparatus according to claim 2, wherein the location proof response includes second position information including a current position and a current time of the first information processing device, a public key, and an electronic signature generated, using a private key corresponding to the public key, from a plaintext including the second position information, and the location registration unit, when it determines using the public key and the electronic signature that the location proof response is valid, transmits a location information transaction including the first position information and the first location proof information to the second information processing device.
  11.  The information processing apparatus according to claim 10, wherein the location proof response further includes the location proof request, the plaintext further includes the location proof request, and the location information transaction includes the location proof response.
  12.  The information processing apparatus according to claim 2, wherein the location proof acquisition unit transmits the location proof request to the first information processing device existing within a communication range of the short-range wireless communication.
  13.  The information processing apparatus according to claim 1, wherein the first location proof information includes second position information including a current position and a current time of the first information processing device.
  14.  The information processing apparatus according to claim 1, further comprising a declaration unit that transmits, to a third information processing device, declaration data for making a declaration regarding insurance, the declaration data including the first position information.
  15.  The information processing apparatus according to claim 14, wherein the declaration data includes movement data including the first position information in time series.
  16.  The information processing apparatus according to claim 14, wherein the declaration data is used for calculating a premium, an insurance payout, or a benefit of the insurance.
  17.  The information processing apparatus according to claim 1, wherein the trigger is a predetermined event or a predetermined timing.
  18.  The information processing apparatus according to claim 1, wherein the first position information and the first location proof information are recorded in a blockchain by the second information processing device.
  19.  The information processing apparatus according to claim 1, further comprising a location proof unit that, when location proof is requested by a third information processing device present in the vicinity, generates second location proof information and transmits the second location proof information to the third information processing device.
  20.  A program for causing a computer to execute processing of:
      detecting, when a predetermined trigger is detected, a current position and generating position information including the current position and a current time;
      requesting, when the trigger is detected, location proof from a first information processing device present in the vicinity and receiving location proof information from the first information processing device; and
      transmitting the position information and the location proof information to a second information processing device that records the position information and the location proof information.
  21.  An information processing apparatus comprising a location proof unit that, when receiving a request for location proof transmitted by a first information processing device that has detected a predetermined trigger toward other surrounding information processing devices, generates location proof information and transmits the location proof information.
  22.  The information processing apparatus according to claim 21, wherein the location proof unit receives, from the first information processing device by short-range wireless communication, a location proof request including first position information including a current position and a current time of the first information processing device, and transmits a location proof response including the location proof information to the first information processing device.
  23.  The information processing apparatus according to claim 22, wherein the location proof request includes the first position information, a public key, and an electronic signature generated, using a private key corresponding to the public key, from a plaintext including the first position information, and the location proof unit transmits the location proof response to the first information processing device when it determines, using the public key and the electronic signature, that the location proof request is valid.
  24.  The information processing apparatus according to claim 22, further comprising a position detection unit that, when the request for location proof is received, detects a current position and generates second position information including the current position and a current time, wherein the location proof response includes the second position information, a public key, and a first electronic signature generated, using a private key corresponding to the public key, from a first plaintext including the second position information.
  25.  The information processing apparatus according to claim 24, wherein the location proof response further includes the location proof request, and the first plaintext further includes the location proof request.
  26.  The information processing apparatus according to claim 24, wherein the location proof response further includes metadata to be associated with the second position information or a fingerprint of the metadata, and the first plaintext further includes the metadata or the fingerprint.
  27.  The information processing apparatus according to claim 24, further comprising an information providing unit that, when image data corresponding to an image captured near the position and near the time indicated by the first position information is requested by a second information processing device that has acquired the first position information and the location proof information, transmits the image data and a second electronic signature, generated using the private key from a second plaintext including the image data, to the second information processing device.
  28.  The information processing apparatus according to claim 27, wherein the information providing unit controls display of a condition relating to a capturing location and a capturing time of the image data requested by the second information processing device.
  29.  The information processing apparatus according to claim 21, further comprising a position detection unit that, when the request for location proof is received, detects a current position and generates position information including the current position and a current time, wherein the location proof information includes the position information.
  30.  The information processing apparatus according to claim 29, wherein the location proof information includes the position information and image data on which a watermark is superimposed, the image data corresponding to an image of the surroundings captured around the time when the request for location proof was received, and the location proof unit transmits the location proof information to a second information processing device that records the location proof information.
  31.  The information processing apparatus according to claim 30, wherein the location proof information is recorded in a blockchain by the second information processing device.
  32.  A program for causing a computer to execute processing of generating location proof information and transmitting the location proof information when receiving a request for location proof transmitted by an information processing device that has detected a predetermined trigger toward other surrounding information processing devices.
  33.  An information processing apparatus comprising:
      a verification unit that verifies declaration data received from a first information processing device, the declaration data including first position information including a current position and a current time of the first information processing device at the time the first information processing device detected a predetermined trigger, by using location proof information generated, in response to a request for location proof from the first information processing device, by a second information processing device that was present around the first information processing device when the trigger was detected; and
      an execution unit that executes processing corresponding to the declaration data when the verification unit determines that the declaration data is authentic.
  34.  The information processing apparatus according to claim 33, wherein the verification unit acquires the location proof information, on the basis of the first position information, from a third information processing device that records a location proof block including the first position information and the location proof information.
  35.  The information processing apparatus according to claim 34, wherein the declaration data further includes metadata associated with the first position information, the location proof block includes the first position information, the location proof information, and the metadata or a fingerprint of the metadata, and the verification unit acquires the location proof information from the second information processing device on the basis of the first position information and the metadata or the fingerprint.
  36.  The information processing apparatus according to claim 35, wherein the execution unit executes the processing corresponding to the declaration data on the basis of the first position information and the metadata.
  37.  The information processing apparatus according to claim 34, wherein the location proof block is recorded in a blockchain.
  38.  The information processing apparatus according to claim 33, further comprising an information collection unit that requests, from the second information processing device, image data corresponding to an image captured near the position and near the time indicated by the first position information, and receives the image data from the second information processing device.
  39.  The information processing apparatus according to claim 38, wherein
      the location proof information includes second position information including a current position and a current time of the second information processing device at the time the location proof was requested by the first information processing device,
      the information collection unit acquires the location proof information from a third information processing device that records a location proof block including the first position information, the location proof information, a public key, and a first electronic signature generated, using a private key corresponding to the public key, from a first plaintext including the first position information and the location proof information, and
      the information collection unit sends the second position information and the public key to the second information processing device to request the image data, receives, from the second information processing device, the image data and a second electronic signature generated, using the private key, from a second plaintext including the image data, and verifies the image data using the second electronic signature.
  40.  The information processing apparatus according to claim 33, further comprising an information collection unit that receives image data encrypted using an encryption key from a third information processing device, transmits an image ID assigned to the image data to the third information processing device, and stores the encrypted image data and the image ID in a database in association with each other, wherein the information collection unit requests, from the third information processing device, an image captured near the position and near the time indicated by the first position information, and receives the image ID corresponding to the relevant image data and a decryption key corresponding to the encryption key from the third information processing device.
  41.  The information processing apparatus according to claim 33, further comprising a position verification unit that receives, from a third information processing device, second location proof information including first image data on which a watermark is superimposed, the first image data corresponding to an image captured around the time when location proof was requested by the first information processing device, and second position information including a position and a time at which the image was captured, verifies the watermark, extracts, from a database that accumulates image data around each position of map information, second image data corresponding to the position indicated by the second position information, and verifies the second location proof information by comparing the second image data with the first image data.
  42.  The information processing apparatus according to claim 41, wherein the position verification unit records the second location proof information in a blockchain.
  43.  The information processing apparatus according to claim 33, wherein the declaration data is data for making a declaration regarding insurance, and the execution unit calculates a premium, an insurance payout, or a benefit of the insurance on the basis of the declaration data.
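The claims describe the exchange abstractly; purely as a hedged prototype of the signed request/response roundtrip in claims 2, 3, 10 and 22 to 24 — with the Ed25519 key type, JSON serialization, and all field names being assumptions of the sketch rather than anything specified in the claims — the flow could be exercised end to end as follows.
```python
# Sketch of the request/response roundtrip in claims 2-3, 10 and 22-24.
# Field names, JSON serialization and the Ed25519 key type are assumptions.
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey)


def _sign(private_key, payload: dict) -> dict:
    plaintext = json.dumps(payload, sort_keys=True).encode()
    public_raw = private_key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)
    return {**payload, "public_key": public_raw.hex(),
            "signature": private_key.sign(plaintext).hex()}


def _verify(message: dict) -> bool:
    payload = {k: v for k, v in message.items() if k not in ("public_key", "signature")}
    plaintext = json.dumps(payload, sort_keys=True).encode()
    key = Ed25519PublicKey.from_public_bytes(bytes.fromhex(message["public_key"]))
    try:
        key.verify(bytes.fromhex(message["signature"]), plaintext)
        return True
    except InvalidSignature:
        return False


# Prover (claims 2-3): sign the first position information when a trigger fires.
prover_key = Ed25519PrivateKey.generate()
request = _sign(prover_key, {"first_position": {"lat": 35.6595, "lon": 139.7005,
                                                "time": time.time()}})

# Witness (claims 22-24): validate the request, then sign its own position.
witness_key = Ed25519PrivateKey.generate()
assert _verify(request)
response = _sign(witness_key, {"second_position": {"lat": 35.6597, "lon": 139.7003,
                                                   "time": time.time()},
                               "request": request})

# Prover (claim 10): validate the response and form the location information
# transaction that is sent to the recording device (e.g., a blockchain node).
assert _verify(response)
transaction = {"first_position": request["first_position"], "location_proof": response}
print(json.dumps(transaction)[:80], "...")
```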
PCT/JP2022/004791 2021-06-09 2022-02-08 Information processing device and program WO2022259612A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023527485A JPWO2022259612A1 (en) 2021-06-09 2022-02-08

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-096304 2021-06-09
JP2021096304 2021-06-09

Publications (1)

Publication Number Publication Date
WO2022259612A1 (en) 2022-12-15

Family

ID=84425664

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/004791 WO2022259612A1 (en) 2021-06-09 2022-02-08 Information processing device and program

Country Status (2)

Country Link
JP (1) JPWO2022259612A1 (en)
WO (1) WO2022259612A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003065261A1 (en) * 2002-01-30 2003-08-07 Fujitsu Limited Insurance transsacting system and method using personal behaivior information
WO2008010287A1 (en) * 2006-07-20 2008-01-24 Panasonic Corporation Position verifying device, position verifying system, and position verifying method
JP2017050763A (en) * 2015-09-03 2017-03-09 日本電信電話株式会社 Permission information management system, user terminal, right holder terminal, permission information management method, and permission information management program
CN106897901A (en) * 2017-02-16 2017-06-27 湖北大学 Based on the shared bicycle Secure Billing method that home is proved

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WENZHE LV; SHENG WU; CHUNXIAO JIANG; YUANHAO CUI; XUESONG QIU; YAN ZHANG: "Decentralized Blockchain for Privacy-Preserving Large-Scale Contact Tracing", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 2 July 2020 (2020-07-02), XP081713528 *
WU WEI; LIU ERWU; GONG XINGLIN; WANG RUI: "Blockchain Based Zero-Knowledge Proof of Location in IoT", ICC 2020 - 2020 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), IEEE, 7 June 2020 (2020-06-07), pages 1 - 7, XP033798373, DOI: 10.1109/ICC40277.2020.9149366 *

Also Published As

Publication number Publication date
JPWO2022259612A1 (en) 2022-12-15

Similar Documents

Publication Publication Date Title
US10019773B2 (en) Authentication and validation of smartphone imagery
CN110689460B (en) Traffic accident data processing method, device, equipment and medium based on block chain
US8751528B2 (en) Accident information aggregation and management systems and methods for accident information aggregation and management thereof
CN111460526A (en) Image data recording, acquiring and verifying method and device based on block chain
CN109523413B (en) Policy processing method and device, computer equipment and storage medium
US10824713B2 (en) Spatiotemporal authentication
KR20120069703A (en) Geographical location authentication method for mobile voting
JPWO2005119539A1 (en) Certificate issuing server and certification system for certifying operating environment
KR102029128B1 (en) Internet of things platform and implementing a method for mutual exchange of video clips of black box or dash cam captured traffic accident between drivers
CN111159474B (en) Multi-line evidence obtaining method, device and equipment based on block chain and storage medium
CN110706371A (en) Block chain-based driving safety management method, system and storage medium
JP2019133419A (en) Data transmission/reception method, data transmission/reception system, processing device, computer program, and construction method for system
CN108268915B (en) Electronic evidence curing system and method
CN110597906A (en) Block chain-based entrance integral generation method, device, equipment and storage medium
WO2022259612A1 (en) Information processing device and program
JP5112363B2 (en) Life log data management system, management method, and program
WO2005107148A1 (en) Authentication system
KR20160082935A (en) Method and apparatus for informing, managing, and trading media data
US20200184430A1 (en) Electronic ticket management system, electronic ticket management method and electronic ticket management program
US8850198B2 (en) Method for validating a road traffic control transaction
KR102231434B1 (en) Platform and implementing a method for p2p transaction/sharing service of black box images among drivers based on block chain technology
JPWO2012014473A1 (en) Image recording device
JP2019086904A (en) Image management server and image management method
JP2004140658A (en) Information processing apparatus, locational information utilizing system, and image management method
JP2019133650A (en) Data transmission/reception method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22819803

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023527485

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE