WO2022259612A1 - Information processing device and program - Google Patents

Information processing device and program

Info

Publication number
WO2022259612A1
WO2022259612A1 (PCT/JP2022/004791)
Authority
WO
WIPO (PCT)
Prior art keywords
information
location
information processing
processing device
unit
Prior art date
Application number
PCT/JP2022/004791
Other languages
English (en)
Japanese (ja)
Inventor
啓宏 王
公伸 西村
篤史 内田
雅友 倉田
崇 小形
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Priority to JP2023527485A (JPWO2022259612A1)
Publication of WO2022259612A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/64Protecting data integrity, e.g. using checksums, certificates or signatures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/08Insurance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/32Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials

Definitions

  • the present technology relates to an information processing device and a program, and more particularly to an information processing device and a program that can guarantee the authenticity of position information of the information processing device.
  • This technology has been developed in view of such circumstances, and makes it possible to guarantee the authenticity of location information of an information processing device without using a dedicated device.
  • An information processing apparatus includes a position detection unit that detects a current position when a predetermined trigger is detected and generates first position information including the current position and current time; a location certification acquisition unit that requests location certification from a first information processing device existing in the vicinity and receives first location certification information from the first information processing device; and a location registration unit that transmits the first location information and the first location certification information to a second information processing device that records the first location information and the first location certification information.
  • a program causes a computer to execute a process of: detecting a current position when a predetermined trigger is detected; generating position information including the current position and current time; requesting location certification from a first information processing device existing in the surroundings when the trigger is detected; receiving location certification information from the first information processing device; and transmitting the location information and the location certification information to a second information processing device that records the location information and the location certification information.
  • in an information processing method, when a predetermined trigger is detected, a current position is detected and position information including the current position and current time is generated; a first information processing device existing in the surroundings is requested to provide location proof; location proof information is received from the first information processing device; and the location information and the location proof information are transmitted to a second information processing device that records the location information and the location proof information.
  • An information processing apparatus includes a location certification unit that, upon receiving a request for location certification transmitted from a first information processing apparatus that has detected a predetermined trigger to other information processing apparatuses in the surrounding area, generates location certification information and transmits the location certification information.
  • a program causes a computer to execute a process of generating location certification information upon receiving a request for location certification transmitted by an information processing device that has detected a predetermined trigger to other information processing devices in the vicinity, and transmitting the location certification information.
  • An information processing device includes a verification unit that verifies report data received from a first information processing device, the report data including first position information containing the current position and current time of the first information processing device at the time the first information processing device detected a predetermined trigger, using location proof information generated by a second information processing device, which was present around the first information processing device when the trigger was detected, in response to a request for location verification from the first information processing device; and an execution unit that executes processing corresponding to the report data when the verification unit determines that the report data is authentic.
  • in an information processing method, report data received from a first information processing device, including first position information containing the current position and current time of the first information processing device at the time the first information processing device detected a predetermined trigger, is verified using location proof information generated by a second information processing device, which was present in the vicinity of the first information processing device when the trigger was detected, in response to a request for location certification from the first information processing device; and when the report data is determined to be authentic, processing corresponding to the report data is executed.
  • FIG. 10 is a diagram showing a format example of a PoL request;
  • FIG. 10 is a diagram showing a format example of a PoL response;
  • FIG. 10 is a diagram showing a format example of a PoL transaction;
  • FIG. 4 is a diagram showing a format example of a PoL block;
  • FIG. 6 is a sequence diagram showing the processing of FIG. 5;
  • FIG. 10 is a flowchart for explaining details of PoL request verification processing;
  • FIG. 10 is a flowchart for explaining the details of PoL response verification processing;
  • FIG. 14 is a diagram showing the flow of collection processing of eyewitness information of an accident;
  • FIG. 11 is a diagram showing an example of a display screen of a witness questionnaire;
  • FIG. 15 is a sequence diagram showing the processing of FIG. 14;
  • FIG. 19 is a diagram showing the flow of calculation processing of insurance premiums and the like for risk-subdivided insurance;
  • FIG. 20 is a sequence diagram showing the processing of FIG. 19;
  • FIG. 10 is a diagram showing the flow of calculation processing for insurance claims for leisure insurance or non-life insurance;
  • FIG. 10 is a diagram showing a flow of calculation processing of health promotion insurance premiums, etc.
  • FIG. 10 is a sequence diagram showing a process of storing image data and recording position information and position certification information at the time of shooting;
  • FIG. 10 is a sequence diagram showing eyewitness information collection processing;
  • FIG. 25 is a block diagram showing a second embodiment of an information processing system to which the present technology is applied;
  • a block diagram showing a functional configuration example of the accident trigger generator;
  • FIG. 2 is a block diagram showing an example of the functional configuration of a camera
  • FIG. 3 is a block diagram showing a functional configuration example of a server
  • FIG. 26 is a sequence diagram showing the flow of processing of the information processing system of FIG. 25
  • FIG. 4 is a flowchart for explaining the details of position verification processing
  • FIG. 1 shows a configuration example of an information processing system 1 as a first embodiment of an information processing system to which the present technology is applied.
  • the information processing system 1 is a system for executing various insurance processing.
  • the information processing system 1 includes information processing terminals 11-1 to 11-n, a server 12, and a blockchain network 13.
  • the information processing terminals 11-1 to 11-n, the server 12, and the blockchain network 13 are connected via a network 21, and can communicate with each other.
  • the information processing terminals 11-1 to 11-n can communicate directly using short-range wireless communication without going through the network 21.
  • the information processing terminals 11-1 to 11-n are simply referred to as the information processing terminal 11 when there is no need to distinguish them individually.
  • the information processing terminal 11 is configured by, for example, a portable information processing device that can be carried by the user or worn by the user.
  • the information processing terminal 11 is configured by a smart phone, a mobile phone, a tablet terminal, a wearable device, an action camera, a portable music player, a portable game machine, or the like.
  • the information processing terminal 11 is configured by an information processing device such as a drive recorder, which is mounted on a mobile object such as a vehicle (including a two-wheeled vehicle), and shoots and records the surroundings of the mobile object.
  • the information processing terminal 11 is composed of, for example, a dedicated information processing device installed at an arbitrary location outdoors or indoors.
  • the information processing terminal 11 is used, for example, by a user of the insurance provided by the server 12. For example, the information processing terminal 11 generates declaration data for making various declarations regarding insurance, and transmits the declaration data to the server 12 via the network 21. The information processing terminal 11 receives, via the network 21, various data transmitted by the server 12 in response to the declaration data.
  • the information processing terminal 11 is used for certifying the positions of other information processing terminals 11 in the vicinity.
  • the information processing terminal 11 possessed by the insurance user (hereinafter referred to as the Prover) detects its current position and transmits position information including the detected current position and the current time to surrounding information processing terminals 11 (hereinafter referred to as Witnesses) to request location verification.
  • the position verification is processing for verifying the position information of the Prover.
  • position proof is a process of proving that the Prover was present at the position indicated by the position information at the time indicated by the position information.
  • the location certification information is information that certifies the location information of the Prover.
  • location proof information is information that proves that the Prover was present at the location indicated by the location information at the time indicated by the location information.
  • the location proof information includes location information (Witness location information) including the current location and current time of the Witness.
  • the Prover transmits transaction data including location information and location proof information to the blockchain network 13 via the network 21, and causes the Prover's location information and location proof information to be recorded in the blockchain.
  • the time may include not only the so-called time but also the date and the day of the week.
  • the current time may include not only the current time, but also the current date and day of the week.
  • each information processing terminal 11 normally operates as a Witness, operates as a Prover when a predetermined trigger is detected, and returns to the Witness after completing the operation as a Prover.
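The role switching just described (normally a Witness, temporarily a Prover on a trigger) can be sketched as a small state machine. The following Python sketch is illustrative only; the class and method names are assumptions, not identifiers from the patent.

```python
from enum import Enum


class Role(Enum):
    WITNESS = "witness"
    PROVER = "prover"


class Terminal:
    """Illustrative sketch of an information processing terminal's role."""

    def __init__(self) -> None:
        # Each terminal normally operates as a Witness.
        self.role = Role.WITNESS

    def on_trigger(self) -> None:
        # A predetermined trigger (accident, timer, user operation)
        # switches the terminal to the Prover role.
        self.role = Role.PROVER

    def on_proof_complete(self) -> None:
        # After completing its operation as a Prover, the terminal
        # returns to operating as a Witness.
        self.role = Role.WITNESS
```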
  • for example, when the information processing terminal 11 is present near the accident site at the time an accident occurs, the information processing terminal 11 generates eyewitness information of the accident in response to a request from the server 12 and transmits it to the server 12 via the network 21.
  • the server 12 executes various processes related to insurance while exchanging various data with the information processing terminal 11 and the blockchain network 13.
  • the blockchain network 13 is composed of a network in which multiple nodes are connected.
  • each node of the blockchain network 13 updates and maintains a blockchain in which blocks including the location information and location proof information of each information processing terminal 11 are linked.
  • the blockchain network 13 extracts from the blockchain blocks containing location information and location proof information that meet the conditions presented by the server 12, and transmits them to the server 12 via the network 21.
  • FIG. 2 is a block diagram showing a functional configuration example of the information processing terminal 11 of FIG.
  • the information processing terminal 11 includes a CPU (Central Processing Unit) 101, a memory 102, a storage 103, an operation unit 104, a display unit 105, a speaker 106, an imaging unit 107, a GNSS (Global Navigation Satellite System) receiver 108, a sensing unit 109, a communication unit 110, an external I/F 111, and a drive 112.
  • the CPU 101 to drive 112 are connected to a bus and perform necessary communications with each other.
  • the CPU 101 performs various processes by executing programs installed in the memory 102 and storage 103.
  • the memory 102 is composed of, for example, a volatile memory or the like, and temporarily stores programs executed by the CPU 101 and necessary data.
  • the storage 103 is composed of, for example, a hard disk or non-volatile memory, and stores programs executed by the CPU 101 and necessary data.
  • the operation unit 104 is composed of physical keys (including a keyboard), a mouse, a touch panel, and the like.
  • the operation unit 104 outputs an operation signal corresponding to the user's operation onto the bus.
  • the display unit 105 is composed of, for example, an LCD (Liquid Crystal Display) or the like, and displays an image according to data supplied from the bus.
  • the touch panel as the operation unit 104 is made of a transparent member and can be configured integrally with the display unit 105 . Accordingly, the user can input information by operating icons, buttons, and the like displayed on the display unit 105 .
  • the speaker 106 outputs sound according to the data supplied from the bus.
  • the imaging unit 107 captures an image (still image, moving image) (perceives light) and outputs the corresponding image data onto the bus.
  • the GNSS receiver 108 receives signals from GNSS satellites and detects the current position of the information processing terminal 11 based on the received signals.
  • the GNSS receiver 108 outputs data indicating the detection result of the current position (hereinafter referred to as position detection data) onto the bus.
  • the sensing unit 109 includes various sensors, and outputs sensor data output from each sensor onto the bus.
  • the sensing unit 109 includes, for example, sensors for detecting user behavior, such as motion sensors, acceleration sensors, angular velocity sensors, and the like.
  • the communication unit 110 includes a communication circuit, an antenna, and the like, and communicates with other information processing terminals 11, the server 12, and the blockchain network 13 via the network 21. The communication unit 110 also performs short-range wireless communication with other information processing terminals 11 using a predetermined method, for example Bluetooth (registered trademark; hereinafter referred to as BT), without going through the network 21.
  • An external I/F (interface) 111 is an interface for exchanging data with various external devices.
  • the drive 112 is capable of attaching and detaching a removable medium 112A such as a memory card, and drives the attached removable medium 112A.
  • the program executed by the CPU 101 can be recorded in advance in the storage 103 as a recording medium incorporated in the information processing terminal 11.
  • the program can be stored (recorded) in the removable medium 112A, provided as so-called package software, and installed in the information processing terminal 11 from the removable medium 112A.
  • the program can be downloaded from a server (not shown) or the like via the network 21 and the communication unit 110 and installed in the information processing terminal 11.
  • FIG. 3 shows a configuration example of functions realized by the CPU 101 executing a program installed in the information processing terminal 11.
  • functions including, for example, the control unit 131, the position detection unit 132, the accident detection unit 133, the position proof processing unit 134, and the report unit 135 are realized.
  • the control unit 131 controls the processing of each unit of the information processing terminal 11 .
  • the position detection unit 132 detects the current position of the information processing terminal 11 based on the position detection data output from the GNSS receiver 108 .
  • the position detection unit 132 generates position information including the current position and current time of the information processing terminal 11 .
  • based on at least one of the image data output from the imaging unit 107 and the sensor data output from the sensing unit 109, the accident detection unit 133 detects an accident involving the user who possesses the information processing terminal 11 or an accident occurring around the information processing terminal 11.
  • the location certification processing unit 134 performs processing related to location certification of the information processing terminal 11 .
  • the location proof processing unit 134 includes a Prover processing unit 141 and a Witness processing unit 142 .
  • the Prover processing unit 141 performs processing when the information processing terminal 11 operates as a Prover, that is, when the information processing terminal 11 asks surrounding information processing terminals 11 to verify its location.
  • the Prover processing unit 141 includes a location certification acquisition unit 151 , a location registration unit 152 and an image registration unit 153 .
  • when location certification is required, the location certification acquisition unit 151 generates a PoL (Proof of Location) request for requesting location certification from surrounding information processing terminals 11, including the location information of the information processing terminal 11.
  • the location certification acquisition unit 151 transmits a PoL request to the surrounding information processing terminals 11 via the communication unit 110 by BT.
  • the location certification acquisition unit 151 receives, via the communication unit 110, a PoL response including location certification information that has been sent from the surrounding information processing terminal 11 in response to a PoL request.
  • the location registration unit 152 registers the location of the information processing terminal 11 in the blockchain network 13. Specifically, the location registration unit 152 generates a PoL transaction including the location information of the information processing terminal 11 and the location certification information acquired from the surrounding information processing terminals 11 . The location registration unit 152 broadcasts PoL transactions to the blockchain network 13 via the communication unit 110 and the network 21 . As a result, a PoL block based on the PoL transaction is added to the blockchain, and the location information and the location proof information of the information processing terminal 11 are recorded in the blockchain.
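The PoL transaction generation described above might be sketched as bundling the Prover's location information with the collected location proof information. The transaction encoding and hashing scheme of the blockchain network 13 are not specified in this excerpt, so the JSON serialization, SHA-256 transaction id, and function name below are assumptions.

```python
import hashlib
import json


def build_pol_transaction(location_info: dict, proofs: list) -> dict:
    """Bundle the Prover's location information with the location proof
    information collected from surrounding Witnesses (sketch only)."""
    body = {"location": location_info, "proofs": proofs}
    # Derive a content-addressed transaction id; the real chain's
    # hashing scheme is not given in this excerpt.
    txid = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode("utf-8")
    ).hexdigest()
    return {"txid": txid, **body}
```

A PoL block added to the chain would then reference one or more such transactions; the broadcast itself (via the communication unit 110 and network 21) is outside this sketch.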
  • the image registration unit 153 registers image data corresponding to images captured by the image capturing unit 107 in the server 12 .
  • the image registration unit 153 transmits the image data to the server 12 via the communication unit 110 and the network 21 and causes the server 12 to store the image data as necessary.
  • the Witness processing unit 142 performs processing when the information processing terminal 11 operates as a Witness, that is, when the information processing terminal 11 performs position verification for surrounding information processing terminals 11.
  • the Witness processing unit 142 includes a location certification unit 161 and an information providing unit 162.
  • when the location certification unit 161 receives a PoL request from the Prover via the communication unit 110, it generates, based on the PoL request, a PoL response including location certification information that contains the Witness's location information. The location certification unit 161 transmits the PoL response to the Prover by BT via the communication unit 110.
  • the information providing unit 162 generates information requested by the server 12 (for example, eyewitness information of an accident, etc.) and transmits it to the server 12 via the communication unit 110 and the network 21 .
  • the declaration unit 135 generates declaration data for making various declarations regarding insurance provided by the server 12 and transmits the declaration data to the server 12 via the communication unit 110 and the network 21 .
  • the reporting unit 135 receives various data transmitted from the server 12 with respect to the reporting data via the network 21 and the communication unit 110 .
  • FIG. 4 is a block diagram showing a functional configuration example of the server 12. As shown in FIG.
  • the server 12 includes a CPU 201 , a memory 202 , a storage 203 , an insurance DB (Data Base) 204 , an operation section 205 , a display section 206 , a communication section 207 , an external I/F 208 and a drive 209 .
  • the CPU 201 to drive 209 are connected to a bus and perform necessary communications with each other.
  • the CPU 201 through storage 203, the operation unit 205, the display unit 206, the external I/F 208, and the drive 209 are configured similarly to the CPU 101 through storage 103, the operation unit 104, the display unit 105, the external I/F 111, and the drive 112, respectively.
  • the insurance DB 204 stores various data related to insurance provided by the server 12.
  • the insurance DB 204 stores various data related to insurance to be provided and policyholders.
  • insurance DB 204 stores eyewitness information for accidents involving policyholders.
  • the insurance DB 204 stores image data transmitted from the information processing terminal 11 of the policyholder.
  • the communication unit 207 includes a communication circuit, an antenna, etc., and communicates with the information processing terminal 11 and the blockchain network 13 via the network 21 .
  • the program executed by the CPU 201 can be recorded in advance in the storage 203 as a recording medium incorporated in the server 12 .
  • the program can be stored (recorded) in the removable media 209A, provided as package software, and installed in the server 12 from the removable media 209A.
  • the program can be downloaded from another server (not shown) or the like via the network 21 and the communication unit 207 and installed on the server 12 .
  • Functions including a control unit 231, a verification unit 232, an information collection unit 233, and an insurance processing unit 234 are realized by the CPU 201 executing a program installed in the server 12.
  • the control unit 231 controls the processing of each unit of the server 12.
  • the verification section 232 verifies the declaration data received from the information processing terminal 11 via the communication section 207 and the network 21 .
  • the verification unit 232 requests the blockchain network 13 via the communication unit 207 and the network 21 to collate data (for example, location information) included in the declaration data.
  • the verification unit 232 receives data indicating the result of data matching from the blockchain network 13 via the network 21 and the communication unit 207 .
  • the verification unit 232 verifies the declaration data based on the verification result or the like received from the blockchain network 13 .
  • the information collection unit 233 requests information necessary for insurance processing from the information processing terminal 11 via the communication unit 207 and the network 21 as necessary, and receives the requested information. For example, the information collecting unit 233 requests eyewitness information of an accident from the information processing terminal 11 via the communication unit 207 and the network 21 and receives the eyewitness information from the information processing terminal 11 .
  • the insurance processing unit 234 performs various processes related to the insurance provided by the server 12.
  • the insurance processing unit 234 has a calculation unit 241 and an execution unit 242 .
  • the calculation unit 241 calculates premiums, benefits, benefits (for example, cashback, etc.) related to the insurance contracted by the user, based on the declaration data received from the information processing terminal 11 .
  • the privilege does not necessarily have to be money, and may be, for example, goods or points.
  • the execution unit 242 executes processing related to various insurance services. For example, the execution unit 242 executes insurance premium billing processing, insurance payment processing, privilege provision processing, and the like with the information processing terminal 11 via the communication unit 207 and the network 21 .
  • in step S1, the control unit 131 of the Prover activates APP1 that implements the location proof processing unit 134.
  • APP1 may always operate in the background of the OS (Operating System).
  • the control unit 131 of the Witness similarly activates APP2 that implements the location proof processing unit 134.
  • APP2 may always operate in the background of the OS.
  • in step S2, the position detection unit 132 of the Prover generates position information. Specifically, when a predetermined trigger is detected, the position detection unit 132 detects the current position of the Prover based on the position detection data output from the GNSS receiver 108.
  • for example, a predetermined event or predetermined timing is set as the trigger.
  • for example, an event such as an accident (e.g., collision or fall) or a predetermined user operation on the operation unit 104 is set as a trigger.
  • the timing of elapse of a predetermined time, arrival at a predetermined time, etc. is set as a trigger.
  • a trigger is detected at predetermined time intervals. Note that the time interval may or may not be constant.
  • the position detection unit 132 generates position information including the current position and current time of the Prover.
  • the location detection unit 132 supplies the location information to the location certification acquisition unit 151 and stores it in the storage 103 . If there is metadata associated with the location information, the location detection unit 132 stores the metadata in the storage 103 in association with the location information.
  • the metadata associated with the position information includes image data corresponding to an image (moving or still image) captured by the imaging unit 107 at the current position of the Prover, Sensor data, etc. are assumed.
  • in step S3, the Prover's location certification acquisition unit 151 generates a metadata fingerprint as necessary. For example, if there is metadata associated with the location information generated in step S2, the location certification acquisition unit 151 generates the fingerprint by calculating a hash value of the metadata.
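The fingerprint in step S3 is described only as a hash value of the metadata. A minimal sketch, assuming SHA-256 as the concrete hash function:

```python
import hashlib


def metadata_fingerprint(metadata):
    """Return a hex fingerprint of the metadata bytes, or None if absent.

    SHA-256 is an assumption; the patent only says 'a hash value'."""
    if metadata is None:
        # With no metadata, the PoL request carries NULL instead.
        return None
    return hashlib.sha256(metadata).hexdigest()
```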
  • in step S4, the Prover's location certification acquisition unit 151 generates a PoL request.
  • Fig. 6 shows a format example of a PoL request.
  • the PoL request contains prover_address, latitude, longitude, timestamp, metadata_fingerprint, and signature.
  • prover_address is the Prover's public key.
  • latitude indicates the latitude of the Prover's current position.
  • longitude indicates the longitude of the Prover's current position.
  • timestamp indicates the generation time of the PoL request.
  • the current time included in the position information generated in the process of step S2 is set as timestamp.
  • the PoL request includes location information (latitude, longitude, and timestamp) that is the target of location certification.
  • Metadata_fingerprint is the fingerprint of the metadata associated with the Prover's location information.
  • if there is no metadata, the value of metadata_fingerprint is set to NULL. Also, for example, the metadata itself may be stored in the PoL request instead of the fingerprint.
  • the signature is the Prover's electronic signature.
  • a signature is generated by encrypting a hash value of the plaintext containing prover_address, latitude, longitude, timestamp, and metadata_fingerprint with the private key corresponding to prover_address (the public key).
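The construction and signing of a PoL request can be sketched as follows. The function names and the canonical-JSON "plaintext" are assumptions, and HMAC-SHA256 over a shared secret stands in for the real scheme, in which a hash of the plaintext is encrypted with the private key of the prover_address key pair.

```python
import hashlib
import hmac
import json

def sign(plaintext: bytes, private_key: bytes) -> str:
    # Stand-in for "encrypt the plaintext hash with the private key":
    # HMAC-SHA256 over a shared secret instead of a real key pair.
    return hmac.new(private_key, plaintext, hashlib.sha256).hexdigest()

def make_pol_request(prover_address, latitude, longitude, timestamp,
                     private_key, metadata_fingerprint=None):
    body = {
        "prover_address": prover_address,  # the Prover's public key
        "latitude": latitude,              # Prover's current latitude
        "longitude": longitude,            # Prover's current longitude
        "timestamp": timestamp,            # generation time of the request
        # NULL (None) when no metadata is associated with the location
        "metadata_fingerprint": metadata_fingerprint,
    }
    # Canonical JSON serves as the "plaintext" that is hashed and signed.
    plaintext = json.dumps(body, sort_keys=True).encode()
    return {**body, "signature": sign(plaintext, private_key)}
```

A Witness (or the server) can later recompute the signature over the same plaintext and compare it with the stored one.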
  • step S5 the Prover's location certification acquisition unit 151 transmits a PoL request to another information processing terminal 11 (Witness) existing within a predetermined range from the Prover by BT via the communication unit 110 .
  • the predetermined range is set, for example, to the BT communicable range. In this way, the Prover requests proof of its location from the Witnesses existing around it.
  • the witness CPU 101 receives the PoL request via the communication unit 110 .
  • In step S6, the Witness's location certification unit 161 verifies the PoL request. A method of verifying the PoL request will be described later with reference to FIG.
  • step S7 when the Witness's location certification unit 161 determines that the PoL request is valid as a result of the verification, it generates a PoL response, which is a response to the PoL request.
  • FIG. 7 shows an example of the PoL response format.
  • the PoL response includes pol_request_signed, witness_address, latitude, longitude, timestamp, metadata_fingerprint, and signature.
  • pol_request_signed is PoL request data corresponding to the PoL response. For example, all data of the PoL request is stored as it is in the PoL response as pol_request_signed. Therefore, the PoL response contains the Prover's location information included in the PoL request.
  • the witness_address is the public key of the Witness.
  • latitude indicates the latitude of the Witness's current position.
  • longitude indicates the longitude of the Witness's current position.
  • timestamp indicates the PoL response generation time (location proof time). For example, the detection time of the Witness's current position (latitude and longitude) is set to timestamp.
  • the PoL response contains the Witness's location information (latitude, longitude, and timestamp) at the time of location verification.
  • metadata_fingerprint is the fingerprint of the metadata associated with the Witness's location information.
  • If there is no metadata associated with the location information, the value of metadata_fingerprint is set to NULL. Also, for example, the metadata itself may be stored in the PoL response instead of the fingerprint.
  • the signature is the Witness's electronic signature.
  • a signature is generated by encrypting a plaintext hash value containing pol_request_signed, witness_address, latitude, longitude, timestamp, and metadata_fingerprint with a private key corresponding to witness_address (public key).
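A PoL response embeds the verified PoL request as-is and is signed by the Witness. In this sketch the field layout follows the format described above, while HMAC-SHA256 over a shared secret is an assumed stand-in for encrypting the plaintext hash with the Witness's private key.

```python
import hashlib
import hmac
import json

def sign(plaintext: bytes, private_key: bytes) -> str:
    # HMAC-SHA256 stands in for encrypting the plaintext hash with the
    # Witness's private key.
    return hmac.new(private_key, plaintext, hashlib.sha256).hexdigest()

def make_pol_response(pol_request, witness_address, latitude, longitude,
                      timestamp, private_key, metadata_fingerprint=None):
    body = {
        # The verified PoL request is stored as-is, so the response also
        # carries the Prover's location information.
        "pol_request_signed": pol_request,
        "witness_address": witness_address,  # the Witness's public key
        "latitude": latitude,                # Witness's current latitude
        "longitude": longitude,              # Witness's current longitude
        "timestamp": timestamp,              # location proof time
        "metadata_fingerprint": metadata_fingerprint,
    }
    plaintext = json.dumps(body, sort_keys=True).encode()
    return {**body, "signature": sign(plaintext, private_key)}
```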
  • In step S8, the Witness's location certification unit 161 transmits the PoL response to the Prover by BT via the communication unit 110.
  • the Prover's CPU 101 receives the PoL response sent from the Witness via the communication unit 110 .
  • In step S9, the Prover generates and broadcasts a PoL transaction. Specifically, the Prover's location certification acquisition unit 151 verifies the PoL response. A method of verifying the PoL response will be described later with reference to FIG. When the location certification acquisition unit 151 determines as a result of the verification that the PoL response is valid, it supplies the PoL response to the location registration unit 152.
  • the location registration unit 152 generates a PoL transaction corresponding to the PoL response.
  • Fig. 8 shows a format example of a PoL transaction.
  • a PoL transaction includes sender_address, recipient_address, value, data, and signature.
  • sender_address indicates the address of the sender. For example, sender_address is set to "THE BLOCKCHAIN".
  • recipient_address indicates the address of the recipient. For example, the recipient_address is set to "THE BLOCKCHAIN".
  • value is set to 0.
  • data contains the PoL response data received in response to the PoL request. Therefore, the PoL transaction contains the Prover's location information and the Witness's location information. Note that when PoL responses are received from a plurality of Witnesses in response to a PoL request, data includes the data of the plurality of PoL responses.
  • If the blockchain is dedicated to location proof data (that is, a blockchain that does not include remittance information), it is possible, for example, to omit sender_address, recipient_address, and value so that the PoL transaction includes only data.
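For a remittance-free chain, assembling a PoL transaction reduces to packaging the received PoL responses; this sketch keeps the fixed placeholder addresses described above and, as a simplification, omits the transaction's own signature field.

```python
def make_pol_transaction(pol_responses):
    # On a chain dedicated to location proof data, sender_address,
    # recipient_address, and value are fixed placeholders (or may be
    # omitted entirely); data holds one PoL response per responding Witness.
    return {
        "sender_address": "THE BLOCKCHAIN",
        "recipient_address": "THE BLOCKCHAIN",
        "value": 0,
        "data": pol_responses,
    }
```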
  • the location registration unit 152 broadcasts the PoL transaction to the blockchain network 13 via the communication unit 110 and the network 21.
  • each node of the blockchain network 13 receives PoL transactions via the network 21.
  • step S10 the blockchain network 13 generates a PoL block based on the PoL transaction and adds it to the blockchain.
  • FIG. 9 shows a format example of a PoL block.
  • a PoL block includes block_number, timestamp, transactions, previous_hash, nonce, miner_address, and signature.
  • block_number indicates the block number of the PoL block.
  • timestamp indicates the generation time of the PoL block.
  • transactions is a transaction list containing one or more PoL transactions.
  • previous_hash is the hash value of the PoL block immediately preceding the PoL block in question in the blockchain.
  • a nonce is a nonce value calculated by, for example, PoW (Proof of Work) or PoS (Proof of Stake).
  • miner_address is the public key of the miner who mined the PoL transaction.
  • the signature is the miner's electronic signature.
  • signature is generated by encrypting a hash value of the plaintext including block_number, timestamp, transactions, previous_hash, nonce, and miner_address with the private key corresponding to miner_address (the public key).
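The chaining of PoL blocks via previous_hash and the nonce search can be illustrated with a toy Proof-of-Work. The JSON header encoding and the tiny difficulty are illustrative assumptions; the miner's signature is left out of the sketch.

```python
import hashlib
import json

def mine_block(block_number, timestamp, transactions, previous_hash,
               difficulty=2):
    # Toy Proof-of-Work: search for a nonce whose block hash starts with
    # `difficulty` zero hex digits (real networks use far higher targets).
    nonce = 0
    while True:
        header = json.dumps(
            [block_number, timestamp, transactions, previous_hash, nonce],
            sort_keys=True).encode()
        digest = hashlib.sha256(header).hexdigest()
        if digest.startswith("0" * difficulty):
            return {
                "block_number": block_number,
                "timestamp": timestamp,
                "transactions": transactions,
                "previous_hash": previous_hash,  # links to the prior block
                "nonce": nonce,
                "hash": digest,
            }
        nonce += 1
```

Because each block's hash covers previous_hash, altering any recorded PoL transaction would invalidate every later block, which is what makes the recorded location proofs tamper-evident.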
  • In step S11, the declaration unit 135 of the Prover generates and transmits declaration data.
  • the declaration data includes data necessary for the declaration, for example, a registration ID for identifying the user U1, the location information of the Prover, the metadata associated with the location information of the Prover (the metadata on which the metadata_fingerprint of the PoL request is based), the public key of the Prover (the prover_address of the PoL request), the insurance contract period, and so on.
  • the declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21 .
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • In step S12, the verification unit 232 of the server 12 requests verification of the declaration data. For example, the verification unit 232 extracts from the insurance DB 204 the information about the user corresponding to the registration ID included in the declaration data. If the verification unit 232 determines, based on the extracted information, that the user U1 is a legitimate user (for example, an insurance policyholder), it requests the blockchain network 13 to verify the declaration data.
  • For example, if the declaration data includes location information but no metadata, the verification unit 232 requests the blockchain network 13, via the communication unit 207 and the network 21, to verify the location information.
  • For example, if the declaration data includes location information and metadata, the verification unit 232 generates a fingerprint of the metadata and requests the blockchain network 13, via the communication unit 207 and the network 21, to verify the location information included in the declaration data together with the generated fingerprint.
  • step S13 the blockchain network 13 verifies the declaration data and transmits the verification result.
  • When verification of location information alone is requested, the blockchain network 13 searches the blockchain for a PoL block containing location information that matches it. That is, it searches for a PoL block containing the location information included in the declaration data and the location proof information for that location information.
  • When verification of location information and a metadata fingerprint is requested, the blockchain network 13 searches the blockchain for a PoL block containing location information and a fingerprint that match them. That is, it searches for a PoL block containing the location information included in the declaration data, the fingerprint of the metadata, and the location proof information for that location information.
  • When the blockchain network 13 detects the relevant PoL block, it transmits the detected PoL block to the server 12 via the network 21.
  • the CPU 201 of the server 12 receives the PoL block via the network 21 and the communication unit 207.
  • If the blockchain network 13 fails to detect the relevant PoL block, it notifies the server 12 via the network 21 that the relevant PoL block does not exist.
  • step S14 the server 12 verifies the declaration data and executes various services. For example, when the verification unit 232 of the server 12 receives a PoL block from the blockchain network 13, it verifies the PoL response included in the PoL block by the same processing as in FIG. 11, which will be described later. When the verification unit 232 determines that the PoL response is valid as a result of the verification, the verification unit 232 verifies the PoL request included in the PoL response by the same processing as in FIG. 12 described later.
  • If the verification unit 232 determines that the PoL request is valid, it extracts the Prover's location information (latitude, longitude, timestamp) from the PoL request. If the Prover's location information included in the declaration data matches the Prover's location information extracted from the PoL request, the verification unit 232 extracts the Witness's location information (latitude, longitude, timestamp) from the PoL response. Then, if the difference in position and time between the Prover's position information and the Witness's position information is within a predetermined range, the verification unit 232 determines that the position information of the declaration data is authentic.
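The "difference in position and time within a predetermined range" check might look like the following sketch. The haversine distance and the 100 m / 300 s thresholds are assumptions, since the description leaves the range unspecified.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two latitude/longitude points.
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def locations_consistent(prover, witness, max_dist_m=100.0, max_dt_s=300.0):
    # prover/witness: dicts with "latitude", "longitude", "timestamp";
    # both the spatial and the temporal difference must be small.
    dist = haversine_m(prover["latitude"], prover["longitude"],
                       witness["latitude"], witness["longitude"])
    dt = abs(prover["timestamp"] - witness["timestamp"])
    return dist <= max_dist_m and dt <= max_dt_s
```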
  • If the verification unit 232 determines that the PoL request is valid and the declaration data includes metadata, the verification unit 232 extracts the fingerprint of the metadata (metadata_fingerprint) from the PoL request. If the fingerprint generated from the metadata included in the declared data matches the fingerprint extracted from the PoL request, the verification unit 232 determines that the metadata of the declared data is authentic.
  • If the declared data does not contain metadata, the verification unit 232 determines that the declared data is authentic when its position information is authentic. On the other hand, when the position information is not authentic, the verification unit 232 determines that the declared data is not authentic.
  • If the declared data contains metadata, the verification unit 232 determines that the declared data is authentic when both its position information and its metadata are authentic. On the other hand, when at least one of the position information and the metadata is not authentic, the verification unit 232 determines that the declared data is not authentic.
  • If the verification unit 232 determines that the declaration data is authentic, it supplies the declaration data to the insurance processing unit 234.
  • the insurance processing unit 234 executes processing related to various insurance services provided by the server 12 based on the declaration data. A specific example of processing will be described later.
  • step S31 the Prover executes the process of step S2 in FIG. 5 described above to generate position information.
  • In step S32, the Prover's location certification acquisition unit 151 scans for surrounding Witnesses. That is, the location certification acquisition unit 151 scans for Witnesses (other information processing terminals 11) existing within a predetermined range around the Prover. For example, the range in which the communication unit 110 can communicate by BT is set as the range in which to scan for Witnesses. Further, the location certification acquisition unit 151 confirms whether or not a Witness detected by scanning can communicate with the Prover by BT.
  • In step S33, the Prover generates and transmits a PoL request. That is, the Prover executes the processes of steps S3 to S5 in FIG. 5 described above, generates a PoL request, and transmits it by BT via the communication unit 110 to the Witness detected in the process of step S32.
  • the witness CPU 101 receives the PoL request via the communication unit 110 .
  • In step S34, the Witness's location certification unit 161 verifies the PoL request.
  • step S35 if the Witness's location certification unit 161 determines that the PoL request is valid, it generates and transmits a PoL response.
  • step S51 the location certification unit 161 determines whether the format of the PoL request is normal. If the received PoL request conforms to the format of FIG. 6, the location certification unit 161 determines that the format of the PoL request is normal, and the process proceeds to step S52.
  • step S52 the location certification unit 161 determines whether the PoL request is authentic.
  • Specifically, the location certification unit 161 calculates a hash value of the plaintext including the prover_address, latitude, longitude, timestamp, and metadata_fingerprint of the PoL request. The location certification unit 161 also recovers a hash value by decrypting the signature of the PoL request using the prover_address (public key) of the PoL request. If the hash value calculated from the plaintext of the PoL request matches the hash value decrypted from the signature of the PoL request, the location certification unit 161 determines that the PoL request is authentic, and the process proceeds to step S53.
  • In step S53, the location certification unit 161 determines whether the Prover exists within a predetermined distance range. For example, if the distance between the position of the Prover indicated in the position information included in the PoL request and the current position of the Witness is equal to or less than a predetermined threshold, the location certification unit 161 determines that the Prover exists within the predetermined distance range, and the process proceeds to step S54.
  • step S54 the location certification unit 161 transmits an OK message to the Prover via the communication unit 110 by BT.
  • In step S55, the Witness generates and transmits a PoL response. That is, the Witness executes the processes of steps S7 and S8 in FIG. 5 described above, generates a PoL response to the PoL request, and transmits it to the Prover by BT via the communication unit 110.
  • the Prover's CPU 101 receives the PoL response via the communication unit 110 .
  • In step S53, for example, if the distance between the position of the Prover indicated in the position information included in the PoL request and the current position of the Witness exceeds the predetermined threshold, the location certification unit 161 determines that the Prover does not exist within the predetermined distance range, and the process proceeds to step S56.
  • In step S52, if the hash value calculated from the plaintext of the PoL request and the hash value decrypted from the signature of the PoL request do not match, the location certification unit 161 determines that the PoL request is not authentic, and the process proceeds to step S56.
  • step S51 if the received PoL request does not conform to the format shown in FIG. 6, the location certification unit 161 determines that the format of the PoL request is not normal, and the process proceeds to step S56.
  • step S56 the location certification unit 161 transmits an NG message to the Prover by BT via the communication unit 110.
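Steps S51 to S56 above amount to a three-stage check: format, signature, distance. The sketch below follows that flow; HMAC-SHA256 over a shared secret is an assumed stand-in for verifying the public-key signature against prover_address, and the degree-space distance threshold is likewise illustrative.

```python
import hashlib
import hmac
import json

REQUIRED_FIELDS = ("prover_address", "latitude", "longitude",
                   "timestamp", "metadata_fingerprint", "signature")

def sign(plaintext: bytes, key: bytes) -> str:
    # HMAC-SHA256 stand-in for the public-key signature scheme.
    return hmac.new(key, plaintext, hashlib.sha256).hexdigest()

def verify_pol_request(request, prover_key, witness_lat, witness_lon,
                       max_dist_deg=0.001):
    # Step S51: is the format normal (all expected fields present)?
    if not all(field in request for field in REQUIRED_FIELDS):
        return "NG"
    # Step S52: is the request authentic? Recompute the signature over the
    # plaintext and compare it with the signature carried in the request.
    body = {k: request[k] for k in REQUIRED_FIELDS if k != "signature"}
    plaintext = json.dumps(body, sort_keys=True).encode()
    if not hmac.compare_digest(sign(plaintext, prover_key),
                               request["signature"]):
        return "NG"
    # Step S53: does the Prover exist within the predetermined distance?
    # (crude degree-space threshold; a metric distance would be used in
    # practice)
    if (abs(request["latitude"] - witness_lat) > max_dist_deg
            or abs(request["longitude"] - witness_lon) > max_dist_deg):
        return "NG"
    return "OK"  # step S54: reply with an OK message
```

Note that tampering with any signed field (for example the latitude) makes the check fail at step S52 before the distance test is even reached.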
  • step S36 the Prover's location proof acquisition unit 151 verifies the PoL response.
  • step S37 the Prover's location registration unit 152 generates and broadcasts a PoL transaction.
  • step S71 the location certification acquisition unit 151 determines whether the format of the PoL response is normal. If the received PoL response conforms to the format of FIG. 7, the location certification acquisition unit 151 determines that the format of the PoL response is normal, and the process proceeds to step S72.
  • step S72 the location certification acquisition unit 151 determines whether the PoL response is authentic.
  • Specifically, the location certification acquisition unit 151 calculates a hash value of the plaintext including the pol_request_signed, witness_address, latitude, longitude, timestamp, and metadata_fingerprint of the PoL response. The location certification acquisition unit 151 also recovers a hash value by decrypting the signature of the PoL response using the witness_address (public key) of the PoL response. If the hash value calculated from the plaintext of the PoL response and the hash value decrypted from the signature of the PoL response match, the location certification acquisition unit 151 determines that the PoL response is authentic, and the process proceeds to step S73.
  • step S73 the location certification acquisition unit 151 determines whether or not the Witness exists within a predetermined distance range. For example, when the distance between the position of the Witness indicated in the position information included in the PoL response and the current position of the Prover is equal to or less than a predetermined threshold, the location proof acquisition unit 151 determines that the Witness is within the range of the predetermined distance. It is determined that it exists, and the process proceeds to step S74.
  • step S74 the Prover executes the process of step S9 in FIG. 5 described above, generates a PoL transaction, and broadcasts it.
  • each node of the blockchain network 13 receives PoL transactions via the network 21.
  • In step S73, for example, if the distance between the position of the Witness indicated in the position information included in the PoL response and the current position of the Prover exceeds the predetermined threshold, the location certification acquisition unit 151 determines that the Witness does not exist within the predetermined distance range, and the process proceeds to step S75.
  • step S72 when the hash value calculated from the plaintext of the PoL response and the hash value decrypted from the signature of the PoL response do not match, the location proof acquisition unit 151 determines that the PoL response is not authentic, The process proceeds to step S75.
  • step S71 if the received PoL response does not conform to the format of FIG. 7, the location certification acquisition unit 151 determines that the format of the PoL response is not normal, and the process proceeds to step S75.
  • step S75 the location certification acquisition unit 151 discards the PoL response.
  • the PoL response verification process ends, and the Prover process ends. That is, the Prover processing ends without the PoL transaction being sent to the blockchain network 13 .
  • step S38 the miner of the blockchain network 13 performs mining and generates PoL blocks.
  • step S39 the blockchain network 13 verifies the generated PoL block and adds it to the blockchain.
  • Specifically, the miner of the blockchain network 13 verifies the validity of the location information and the location proof information included in the generated PoL block and, if it determines that the PoL block is valid, transmits the PoL block to the other nodes in the blockchain network 13.
  • Each node that receives a PoL block adds the received PoL block to the blockchain.
  • In step S101, the Prover executes the process of step S11 in FIG. 5 described above to generate and transmit the declaration data.
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • step S102 the verification unit 232 of the server 12 generates a metadata fingerprint as necessary. For example, if the declaration data includes metadata corresponding to metadata_fingerprint of the PoL request in FIG. 6, the verification unit 232 generates a fingerprint of the metadata.
  • step S103 the verification unit 232 of the server 12 queries PoL records based on the declaration data.
  • For example, if the declared data includes location information but not metadata, the verification unit 232 generates a query requesting extraction of a PoL block including location information that matches that location information.
  • For example, if the declaration data includes location information and metadata, the verification unit 232 generates a fingerprint of the metadata and generates a query requesting extraction of a PoL block containing location information and a fingerprint that match the location information included in the declaration data and the generated fingerprint.
  • the verification unit 232 transmits the generated query to the blockchain network 13 via the communication unit 207 and the network 21.
  • Each node of the blockchain network 13 receives the query via the network 21.
  • step S104 the blockchain network 13 extracts and transmits PoL records based on the query. Specifically, (a node of) the blockchain network 13 extracts PoL blocks that match the conditions indicated by the query from the PoL blocks included in the blockchain.
  • If the declaration data includes location information but not metadata, PoL blocks that contain location information matching that location information are extracted.
  • If the declaration data includes location information and metadata, PoL blocks containing location information and a fingerprint that match the location information and the fingerprint of the metadata are extracted.
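The extraction in step S104 can be sketched as a filter over the chain. The dictionary layout mirrors the PoL request/response/transaction formats described above and is an assumption.

```python
def extract_pol_blocks(blockchain, latitude, longitude, timestamp,
                       fingerprint=None):
    # Return every PoL block whose recorded Prover location (and, when a
    # fingerprint is given, metadata fingerprint) matches the query.
    def tx_matches(tx):
        for response in tx["data"]:
            request = response["pol_request_signed"]
            if (request["latitude"], request["longitude"],
                    request["timestamp"]) != (latitude, longitude, timestamp):
                continue
            if (fingerprint is not None
                    and request["metadata_fingerprint"] != fingerprint):
                continue
            return True
        return False
    return [block for block in blockchain
            if any(tx_matches(tx) for tx in block["transactions"])]
```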
  • the blockchain network 13 transmits the extracted PoL blocks to the server 12 via the network 21.
  • the CPU 201 of the server 12 receives the extracted PoL block via the network 21 and the communication unit 207.
  • If no matching PoL block exists, the blockchain network 13 notifies the server 12 via the network 21 that the PoL block does not exist.
  • In step S105, the server 12 verifies the declaration data and executes various services. That is, the server 12 performs the process of step S14 in FIG. 5 described above and, when it determines as a result of the verification that the declared data is authentic, executes processing related to the various insurance services provided by the server 12 based on the declared data.
  • In this way, the Prover can reliably guarantee the authenticity of location information and metadata without using a dedicated device. That is, the Witnesses existing around the Prover transmit the Witness's location information to the Prover as location proof information in association with the Prover's location information. The Prover can then use the location proof information to guarantee the authenticity of the Prover's location information and the metadata associated with that location information.
  • the server 12 can easily confirm the authenticity of the declared data, and if it determines that the declared data is authentic, it can provide an appropriate insurance service based on the declared data.
  • the management cost of personal information and the operating cost of the server 12 can be reduced.
  • the insurance money paid to the user in the event of an accident, or the insurance money paid to the victim in the event of an accident caused by the user, is calculated based on the percentage of negligence of the parties to the accident (the victim and the perpetrator). Eyewitness information from persons other than the parties involved therefore becomes an important basis for determining the percentage of fault.
  • this technology can be applied to the process of collecting eyewitness information about accidents.
  • user U1 is assumed to be a party (perpetrator or victim) of the accident
  • user U2 is assumed to be a witness who was around the accident site when the accident occurred.
  • the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover
  • the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
  • step S201 Prover activates APP1, and witnesses activates APP2, as in the process of step S1 in FIG.
  • In step S202, the Prover detects an accident and generates location information. For example, when the accident detection unit 133 detects an accident in which the user U1 is a party, based on at least one of the image data output from the imaging unit 107 and the sensor data output from the sensing unit 109, it notifies the position detection unit 132 of the occurrence of the accident.
  • The type and number of data used for accident detection (hereinafter referred to as accident data) are not particularly limited. For example, image data, impact data from an impact sensor, and the like are used.
  • the position detection unit 132 detects the current position of the Prover based on the position detection data output from the GNSS receiver 108 .
  • the position detector 132 generates position information including the current position and current time of the Prover.
  • the position detection unit 132 acquires accident data acquired before and after the accident from the accident detection unit 133 .
  • the location detection unit 132 supplies location information and accident data to the location certification acquisition unit 151 .
  • the position detection unit 132 associates the position information and the accident data and stores them in the storage 103 .
  • step S203 the Prover's location certification acquisition unit 151 generates a fingerprint of the accident data.
  • In step S204, the Prover generates a PoL request, similar to the process of step S4 in FIG. 5 described above.
  • the fingerprint of the accident data is stored in the PoL request as metadata_fingerprint.
  • In steps S205 to S210, the same processing as steps S5 to S10 in FIG. 5 described above is performed.
  • As a result, a PoL response corresponding to the PoL request is generated by the Witnesses around the Prover, and a PoL block containing the PoL response is added to the blockchain.
  • the location information of the Prover at the time of the accident, the location verification information by the Witness, and the fingerprint of the accident data are recorded in the blockchain.
  • In step S211, the declaration unit 135 of the Prover generates and transmits declaration data.
  • Specifically, the declaration unit 135 acquires, from the storage 103, the location information generated by the Prover when the accident occurred and the accident data associated with that location information.
  • the report unit 135 generates report data including the acquired position information and accident data, the registration ID of the user U1, and the public key of the Prover.
  • the declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21 .
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • steps S212 and S213 processing similar to steps S12 and S13 in FIG. 5 described above is executed.
  • As a result, PoL blocks containing location information and a fingerprint that match the location information and the fingerprint of the accident data included in the declaration data are extracted from the blockchain and sent to the server 12.
  • In step S214, the information collection unit 233 of the server 12 estimates eyewitnesses. Specifically, the information collection unit 233 identifies the Witness (for example, the information processing terminal 11-2) that generated the PoL response included in the PoL block received from the blockchain network 13, and estimates the identified Witness's user (for example, the user U2) to be an eyewitness.
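Witness estimation then reduces to collecting the witness_address values from the PoL responses recorded in the extracted blocks. The dictionary layout is again an assumed mirror of the PoL formats described above.

```python
def estimate_witnesses(pol_blocks):
    # Collect the witness_address (public key) of every PoL response
    # recorded in the extracted PoL blocks; each address identifies one
    # Witness whose user is a candidate eyewitness.
    addresses = set()
    for block in pol_blocks:
        for tx in block["transactions"]:
            for response in tx["data"]:
                addresses.add(response["witness_address"])
    return addresses
```

The eyewitness questionnaire generated in the next step would then be broadcast to the terminals identified by these addresses.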
  • In step S215, the information collection unit 233 of the server 12 generates and broadcasts an eyewitness questionnaire.
  • Specifically, the information collection unit 233 generates an eyewitness questionnaire for collecting eyewitness information from the eyewitnesses, based on the PoL response of each Witness included in the PoL block received from the blockchain network 13.
  • the eyewitness questionnaire includes a public key (witness_address) and location information (latitude, longitude, timestamp) included in the Witness PoL response, and information indicating the contents of the questionnaire.
  • the information collecting unit 233 broadcasts the eyewitness questionnaire to the information processing terminal 11 (Witness) of the eyewitness estimated in step S214 via the communication unit 207 and the network 21 .
  • the Witness CPU 101 receives the eyewitness questionnaire via the network 21 and the communication unit 110 .
  • In step S216, the Witness's information providing unit 162 generates and transmits responses to the eyewitness questionnaire. Specifically, when the public key (witness_address) included in the eyewitness questionnaire matches the public key of the Witness, the information providing unit 162 displays the screen of FIG. 15 on the display unit 105.
  • a map 301 showing the site of the accident is displayed in the background.
  • An image 302 of the vicinity of the accident site is displayed on the map 301 .
  • a window 303 is displayed above the map 301 containing a message to a user (eg, user U2) and information about the accident.
  • the window 303 displays a message stating that a witness is being searched for, the time when the accident occurred, and an overview. Also displayed are a "Yes” button and a “No” button, along with a message asking whether the user was a witness to the accident.
  • When the "Yes" button is pressed, the screen in FIG. 16 is displayed on the display unit 105.
  • When the "No" button is pressed, the display of the eyewitness questionnaire ends.
  • the screen in FIG. 16 differs from the screen in FIG. 15 in that window 304 is displayed instead of window 303 .
  • the window 304 displays the conditions regarding the shooting location and shooting time of the image (eyewitness information) requested to be sent. For example, a message requesting the provision of images (photographs) taken near the accident site during a predetermined time period before and after the accident (in this example, after the accident) is displayed. A “+” button, a “next” button, and a “back” button are also displayed.
  • the screen in FIG. 17 differs from the screen in FIG. 16 in that window 305 is displayed instead of window 304 .
  • a window 305 displays a message expressing gratitude for providing information, a "Submit” button, and a "Cancel” button.
  • When the "Submit" button is pressed, the image data corresponding to the image selected on the screen of FIG. 16 is transmitted to the server 12.
  • This image data corresponds to an image captured near the site and time of occurrence of an accident involving user U1 (near the position and time indicated by the position information of Prover when the accident occurred).
  • the information providing unit 162 generates eyewitness information including the selected image data.
  • eyewitness information may include information other than image data, such as text data indicating testimony of eyewitnesses.
  • the information providing unit 162 generates an electronic signature of the eyewitness information using the private key corresponding to witness_address (the public key).
  • the information providing unit 162 generates answers to the eyewitness questionnaire including eyewitness information and electronic signatures, and transmits them to the server 12 via the communication unit 110 and the network 21 .
  • the CPU 201 of the server 12 receives the responses to the eyewitness questionnaire via the network 21 and the communication unit 207.
  • the processing in steps S211 to S216 of FIG. 14 and the subsequent processing is supplemented below.
  • step S231 the Prover executes the process of step S11 in FIG.
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • the verification unit 232 of the server 12 generates a fingerprint of the accident data included in the report data.
  • step S233 the verification unit 232 of the server 12 queries PoL records based on the declaration data. Specifically, the verification unit 232 generates a query requesting extraction of PoL blocks containing position information matching the location information included in the declaration data and fingerprints matching the generated fingerprint. The verification unit 232 transmits the generated query to the blockchain network 13 via the communication unit 207 and the network 21.
  • the blockchain network 13 receives the query via the network 21.
  • step S234 the blockchain network 13 extracts and transmits PoL records based on the query, similar to the process of step S103 in FIG. 13 described above. PoL blocks that meet the conditions indicated by the query are thereby extracted from the blockchain and sent to the server 12.
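The matching performed in steps S233 and S234 can be sketched as follows. The dict fields (`position`, `time`, `metadata_fingerprint`, `pol_response`) are illustrative stand-ins for the PoL block structure, not the patent's exact schema; the fingerprint is assumed to be a SHA-256 digest.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Fingerprint as used throughout this sketch: a SHA-256 hex digest."""
    return hashlib.sha256(data).hexdigest()

def query_pol_blocks(chain, location, accident_fp):
    """Return PoL blocks whose location info and metadata fingerprint
    match the declaration data (steps S233-S234)."""
    return [
        block for block in chain
        if block["position"] == location["position"]
        and block["time"] == location["time"]
        and block["metadata_fingerprint"] == accident_fp
    ]

# Toy chain with one matching block.
accident_data = b"dashcam-frame-0042"
chain = [
    {"position": (35.68, 139.76), "time": 1700000000,
     "metadata_fingerprint": fingerprint(accident_data),
     "pol_response": {"witness_address": "wit-pub-key-1"}},
    {"position": (34.70, 135.50), "time": 1700000300,
     "metadata_fingerprint": fingerprint(b"other"),
     "pol_response": {"witness_address": "wit-pub-key-2"}},
]
declared = {"position": (35.68, 139.76), "time": 1700000000}
matches = query_pol_blocks(chain, declared, fingerprint(accident_data))
print(len(matches))  # 1
```

The Witness identified in step S235 is then read out of the matching block's `pol_response`.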
  • step S235 the verification unit 232 of the server 12 verifies the declaration data and estimates the eyewitnesses. Specifically, the verification unit 232 verifies the declaration data by the same process as in step S105 of FIG. 13 described above. When the verification unit 232 determines that the declaration data is authentic, it supplies the declaration data and the PoL blocks received from the blockchain network 13 to the information collection unit 233.
  • the information collection unit 233 identifies the Witnesses that generated the PoL responses included in the PoL blocks.
  • the information collection unit 233 presumes that the users of the identified Witnesses are eyewitnesses.
  • step S236 the server 12 executes the process of step S215 in FIG. 14 described above, generates an eyewitness questionnaire, and broadcasts it to the Witnesses via the network 21.
  • the CPU 101 of the Witness receives the eyewitness questionnaire via the network 21 and the communication unit 110.
  • step S237 the information providing unit 162 of the Witness confirms the public key of the eyewitness questionnaire. That is, the information providing unit 162 confirms whether the public key included in the eyewitness questionnaire matches the public key (witness_address) of the Witness.
  • step S238 if the public key of the eyewitness questionnaire matches the public key of the Witness, the Witness executes the process of step S216 in FIG. 14 described above, generating answers to the eyewitness questionnaire and transmitting them to the server 12.
  • the CPU 201 of the server 12 receives the responses to the eyewitness questionnaire via the network 21 and the communication unit 207.
  • step S239 the information collection unit 233 of the server 12 verifies the answers to the eyewitness questionnaire. Specifically, the information collection unit 233 calculates a hash value of the eyewitness information included in the answers. It also recovers a hash value by decrypting the electronic signature included in the answers using the Witness's public key (witness_address). If the hash value calculated from the eyewitness information matches the hash value recovered from the electronic signature, the information collection unit 233 determines that the answers to the eyewitness questionnaire are authentic; otherwise, it determines that they are not authentic.
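The hash-comparison check in step S239 can be sketched as follows. For a self-contained example, the Witness's asymmetric signature is modeled with an HMAC over a single demo key; a real implementation would sign with the Witness's private key and verify with witness_address (the public key).

```python
import hashlib
import hmac

# Stand-in for the Witness key pair: one shared key plays both the signing
# key and the verification key. This is a toy substitute for an asymmetric
# signature scheme (e.g. ECDSA keyed by witness_address).
witness_key = b"witness-demo-key"

def sign_eyewitness_info(info: bytes) -> bytes:
    # Witness side (step S216): hash the eyewitness information, then
    # produce the electronic signature over that hash.
    digest = hashlib.sha256(info).digest()
    return hmac.new(witness_key, digest, hashlib.sha256).digest()

def verify_answer(info: bytes, signature: bytes) -> bool:
    # Server side (step S239): recompute the hash of the eyewitness
    # information and compare it with the hash recovered from the signature.
    digest = hashlib.sha256(info).digest()
    expected = hmac.new(witness_key, digest, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

info = b"photo bytes + testimony text"
sig = sign_eyewitness_info(info)
print(verify_answer(info, sig))         # True: answers judged authentic
print(verify_answer(b"tampered", sig))  # False: answers rejected
```

`hmac.compare_digest` is used so that the comparison itself does not leak timing information.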
  • step S240 the server 12 executes reward remittance processing. Specifically, when the information collection unit 233 determines that the answers to the eyewitness questionnaire are authentic, it supplies the eyewitness information included in the answers to the insurance processing unit 234.
  • the calculation unit 241 calculates the reward to be paid to the eyewitness based on the content of the eyewitness information, and notifies the execution unit 242 of the calculation result.
  • the execution unit 242 communicates with the Witness via the communication unit 207 and the network 21 to execute the reward remittance processing.
  • step S241 the server 12 transmits information regarding the insurance money. Specifically, the calculation unit 241 calculates the insurance money to be paid to the user U1, for example, based on the declaration data and the eyewitness information, and notifies the execution unit 242 of the calculation result. The execution unit 242 generates information including the calculation result of the insurance money, and transmits the information to the Prover via the communication unit 207 and the network 21.
  • the Prover's CPU 101 receives the information on the insurance money via the network 21 and the communication unit 110.
  • risk-segmented automobile insurance has already become widespread.
  • premiums are set based on, for example, annual mileage.
  • it is expected that risk-segmented insurance for bicycles and pedestrians will also spread in the future.
  • the risk is estimated based on the movement route and movement distance of the user (contractor) during the contract period. Based on the estimated risk, for example, the insurance premium for the next contract period is set, or benefits such as cashback for the current contract period are granted.
  • user U1 is assumed to be a user for whom insurance premiums or benefits are calculated
  • user U2 is assumed to be a user existing around user U1.
  • the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover
  • the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
  • step S301 the Prover activates APP1 and the Witness activates APP2, similar to the process of step S1 in FIG. 5 described above.
  • steps S302 to S309 processes similar to those in step S2 and steps S4 to S10 in FIG. 5 described above are performed periodically at predetermined time intervals (for example, one-minute intervals).
  • the Prover periodically detects the current location and generates location information.
  • Witnesses around the Prover also generate PoL responses proving the Prover's location.
  • a PoL block containing the generated PoL response is then added to the blockchain.
  • Prover's location information and Witness's location proof information are periodically recorded in the blockchain.
  • the declaration unit 135 of the Prover generates and transmits declaration data.
  • the declaration unit 135 reads out from the storage 103 the position information periodically generated at a plurality of different dates and times within the contract period of the insurance, and generates movement data by arranging the read position information in chronological order.
  • the declaration unit 135 generates declaration data that includes movement data, the registration ID of the user U1, and the public key of the Prover, and is used for calculating insurance premiums or benefits.
  • the declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21.
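The assembly of the declaration data in step S310 might look like the following sketch; the record fields, registration ID, and key strings are illustrative, not the patent's exact data layout.

```python
def build_declaration(position_records, registration_id, public_key):
    """Assemble declaration data (step S310): position records generated
    periodically during the contract period, arranged chronologically
    into movement data."""
    movement_data = sorted(position_records, key=lambda r: r["time"])
    return {
        "movement_data": movement_data,
        "registration_id": registration_id,
        "public_key": public_key,
    }

# Position records arrive out of order; the movement data sorts them.
records = [
    {"time": 1700000120, "position": (35.681, 139.767)},
    {"time": 1700000000, "position": (35.680, 139.766)},
    {"time": 1700000060, "position": (35.681, 139.766)},
]
declaration = build_declaration(records, "user-U1", "prover-pub-key")
print([r["time"] for r in declaration["movement_data"]])
# [1700000000, 1700000060, 1700000120]
```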
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • the verification unit 232 of the server 12 requests verification of the declaration data. Specifically, the verification unit 232 requests the blockchain network 13, via the communication unit 207 and the network 21, to collate each piece of location information included in the movement data of the declaration data.
  • step S312 the blockchain network 13 verifies the declaration data and transmits the verification result. Specifically, the blockchain network 13 extracts from the blockchain a plurality of PoL blocks, each containing location information that matches a piece of location information requested to be verified. The blockchain network 13 transmits the extracted PoL blocks to the server 12 via the network 21.
  • the CPU 201 of the server 12 receives the PoL block via the network 21 and the communication unit 207.
  • step S313 the server 12 calculates insurance premiums and the like based on the declaration data.
  • the verification unit 232 of the server 12 verifies the declaration data by the same processing as in step S105 of FIG. 13 described above.
  • the verification unit 232 supplies the movement data to the calculation unit 241 when it is determined that the movement data (each position information contained therein) included in the declaration data is authentic.
  • the calculation unit 241 detects the travel route and travel distance of the user U1 during the contract period based on the travel data.
  • the calculator 241 estimates the risk of the user U1 during the contract period based on the travel route and travel distance of the user U1.
  • calculation unit 241 may acquire information about the means of transportation of the user U1, or estimate the means of transportation based on the route and speed of movement of the user U1. Then, the calculation unit 241 may estimate the risk of the user U1 during the contract period, taking into consideration the means of transportation.
  • the calculation unit 241 calculates benefits such as insurance premiums for the next contract period or cashback for the current contract period based on the estimated risk.
  • the calculation unit 241 supplies information on the calculated insurance premium or benefits to the execution unit 242.
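A toy version of the distance-based premium calculation in step S313 follows. The distance thresholds and rate factors are invented for illustration only; the patent leaves the actual risk model unspecified.

```python
import math

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def premium_from_movement(movement_data, base_premium=100.0):
    """Toy risk model: the total verified travel distance scales the
    premium for the next contract period."""
    points = [r["position"] for r in movement_data]
    distance = sum(haversine_km(a, b) for a, b in zip(points, points[1:]))
    if distance < 50:
        factor = 0.8    # low mileage: discount (assumed threshold)
    elif distance < 500:
        factor = 1.0
    else:
        factor = 1.2    # high mileage: surcharge (assumed threshold)
    return distance, round(base_premium * factor, 2)

movement = [
    {"time": 0, "position": (35.0, 139.0)},
    {"time": 60, "position": (35.0, 139.1)},   # ~9 km east
]
distance, premium = premium_from_movement(movement)
print(premium)  # 80.0 (short distance -> discounted premium)
```

Means of transportation, route conditions, and travel purpose could be folded in as further multipliers on `factor`.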
  • step S314 the server 12 and the Prover execute settlement processing for insurance premiums and the like.
  • the execution unit 242 of the server 12 and the control unit 131 of the Prover communicate with each other via the communication unit 207, the network 21, and the communication unit 110 to perform insurance premium claim processing and insurance premium payment processing.
  • a privilege granting process or the like is performed.
  • step S331 the Prover executes the process of step S310 in FIG. 19 described above.
  • step S332 the verification unit 232 of the server 12 queries PoL records based on the declaration data.
  • the verification unit 232 generates a query requesting extraction of a plurality of PoL blocks each including the location information of the Prover that matches the location information included in the movement data of the declaration data.
  • the verification unit 232 transmits the generated query to the blockchain network 13 via the communication unit 207 and the network 21.
  • the blockchain network 13 receives the query via the network 21.
  • step S333 the blockchain network 13 extracts and transmits PoL records based on the query. Specifically, the blockchain network 13 extracts PoL blocks that match the conditions indicated by the query from the PoL blocks included in the blockchain. As a result, a plurality of PoL blocks each containing location information that matches each location information indicated in the query are extracted.
  • the blockchain network 13 transmits the extracted PoL blocks to the server 12 via the network 21.
  • the CPU 201 of the server 12 receives the extracted PoL block via the network 21 and the communication unit 207.
  • step S334 the server 12 executes the process of step S313 in FIG. 19 described above, and calculates insurance premiums and the like based on the declaration data.
  • step S335 the server 12 and the Prover execute the process of step S314 in FIG. 19 described above.
  • the calculation unit 241 may further use the conditions of the travel route of the user U1 (e.g., weather, congestion, past occurrence of accidents) and the purpose of travel (e.g., work, travel, sports) to estimate the risk.
  • user U1 can certify the travel route and travel time, and the certified travel route and travel time can be used for purposes other than risk segmentation insurance.
  • user U1 can use the travel route and travel time certified by the present technology when receiving recognition of a commuting accident.
  • user U1 can use the travel route and travel time certified by the present technology when proving that the user was late due to an unforeseen factor.
  • the present technology can be applied, for example, when providing benefits of insurance (hereinafter referred to as leisure insurance) against injuries, damages, compensation, etc. that occur during leisure activities such as travel, skiing, golf, and hiking.
  • a service will be introduced that gives benefits to policyholders based on the locations visited by the user (policyholder) during the contract period.
  • for example, a service is assumed in which policyholders of leisure insurance for golf or skiing receive benefits such as cashback (celebration money) or discounts on insurance premiums for the next contract period, depending on the number of golf courses or ski resorts visited during the contract period.
  • the contractor must prove that they actually visited the location.
  • this technology can be applied, for example, when calculating insurance claims for property and casualty insurance against fires and natural disasters.
  • the insurance money is calculated based on the damage situation.
  • the disaster situation is proved based on the image of the disaster area.
  • it is possible for a user (contractor) to use an image of a disaster-stricken area to prove the disaster situation.
  • the user U1 is assumed to be a user who takes an image for proof and declares insurance money or benefits
  • the user U2 is assumed to be a user who exists around the user U1 when the image for proof is taken.
  • the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover
  • the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
  • step S401 the Prover activates APP1 and the Witness activates APP2, as in the process of step S1 in FIG. 5 described above.
  • step S402 the Prover captures a certification image and generates location information.
  • the photographing unit 107 of the Prover photographs a certification image in response to a user's operation on the operation unit 104 and supplies the corresponding image data to the CPU 101.
  • the position detection unit 132 detects the current position of the Prover when the certification image was captured based on the position detection data output from the GNSS receiver 108 .
  • the position detection unit 132 generates position information including the position and time of the Prover when the certification image was captured.
  • the location detection unit 132 supplies the certification image and the location information to the location certification acquisition unit 151.
  • the position detection unit 132 stores the certification image and the position information in the storage 103 in association with each other.
  • step S403 the Prover's location certification acquisition unit 151 generates a fingerprint of the image data of the certification image.
  • step S404 the Prover's location certification acquisition unit 151 generates a PoL request in the same manner as in step S4 of FIG. 5 described above.
  • the fingerprint of the image data of the certification image is stored in the PoL request as metadata_fingerprint.
  • steps S405 through S410 processing similar to steps S5 through S10 in FIG. 5 described above is executed.
  • a PoL Response corresponding to the PoL Request is generated by the Witness around the Prover, and a PoL Block containing the PoL Response is added to the blockchain. That is, the location information of the Prover at the time of photographing the certification image, the location certification information by the Witness, and the fingerprint of the image data of the certification image are recorded in the blockchain.
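The fingerprint generation in steps S402 to S404 can be sketched as follows, assuming the fingerprint is a SHA-256 digest and the PoL request is a simple record; the field names are illustrative stand-ins.

```python
import hashlib
import time

def make_pol_request(image_data: bytes, position, prover_pubkey):
    """Sketch of steps S402-S404: position information at capture time
    plus a fingerprint of the certification image, packed into a PoL
    request for surrounding Witnesses to sign."""
    return {
        "position": position,
        "time": int(time.time()),
        "prover_address": prover_pubkey,
        # metadata_fingerprint: SHA-256 digest of the image data
        "metadata_fingerprint": hashlib.sha256(image_data).hexdigest(),
    }

image = b"\x89PNG...certification image bytes"
request = make_pol_request(image, (35.68, 139.76), "prover-pub-key")
print(len(request["metadata_fingerprint"]))  # 64 hex characters
```

Because only the digest enters the PoL request, the image itself never needs to be shared with Witnesses, yet any later substitution of the image is detectable.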
  • step S411 the declaration unit 135 of the Prover generates and transmits declaration data. Specifically, the declaration unit 135 generates declaration data including the image data of the certification image, the position information when the certification image was captured, the registration ID of the user U1, and the public key of the Prover. The declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21.
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • steps S412 and S413 the same processes as steps S12 and S13 in FIG. 5 are executed.
  • the PoL block containing location information matching the location information included in the declaration data and a fingerprint matching the fingerprint of the image data of the certification image is extracted from the blockchain.
  • step S414 the server 12 analyzes the proof image and calculates the insurance money.
  • the verification unit 232 of the server 12 verifies the declaration data in the same manner as the process of step S14 in FIG. 5 described above.
  • when the verification unit 232 determines as a result of the verification that the declaration data is authentic, that is, that the certification image was taken at the declared position and time, the verification unit 232 supplies the declaration data to the calculation unit 241.
  • the calculation unit 241 analyzes the certification image and identifies the location visited by the contractor.
  • the calculation unit 241 calculates benefits to be given to the contractor based on the locations visited by the contractor.
  • the calculation unit 241 analyzes the certification image and estimates the damage situation of the contractor.
  • the calculation unit 241 calculates the insurance money to be paid to the policyholder based on the estimated disaster situation.
  • the calculation unit 241 supplies the execution unit 242 with data indicating the calculation result of the insurance money or benefits.
  • step S415 the server 12 and the Prover execute settlement processing for insurance claims and the like.
  • the execution unit 242 of the server 12 and the control unit 131 of the Prover communicate with each other via the communication unit 207, the network 21, and the communication unit 110 to perform insurance payment processing, benefit granting processing, and the like.
  • while guaranteeing the authenticity of the proof image taken by the user (contractor), the proof image can be used to appropriately calculate insurance money or benefits. In addition, the user can use the proof image to quickly receive the insurance money or benefits without performing troublesome procedures.
  • health promotion insurance is insurance that discounts insurance premiums and provides benefits such as cash back depending on the policyholder's health condition and efforts to improve health.
  • the present technology can be applied when discounting insurance premiums or granting benefits based on the policyholder's health promotion activities.
  • user U1 is assumed to be a user (contractor) for whom insurance premiums or benefits are calculated
  • user U2 is assumed to be a user existing around user U1.
  • the information processing terminal 11-1 possessed by the user U1 will be referred to as Prover
  • the information processing terminal 11-2 possessed by the user U2 will be referred to as Witness.
  • step S501 the Prover activates APP1 and the Witness activates APP2, as in the process of step S1 in FIG. 5 described above.
  • step S502 the Prover acquires activity data and generates location information. Specifically, the position detection unit 132 of the Prover acquires from the sensing unit 109 activity data, which is sensor data used for detecting the activity of the user U1. For example, when insurance premium discounts and benefits are provided according to the user's step count and walking distance, sensor data indicating the acceleration and angular velocity that represent the user's walking motion is used as the activity data.
  • the position detection unit 132 detects the current position of the Prover at the time of acquisition of the activity data based on the position detection data output from the GNSS receiver 108 .
  • the position detector 132 generates position information including the current position and current time of the Prover.
  • the location detection unit 132 supplies the activity data and the location information to the location certification acquisition unit 151 and causes the storage 103 to store the activity data and the location information in association with each other.
  • step S503 the Prover's location certification acquisition unit 151 generates a fingerprint of the activity data.
  • step S504 the Prover's location certification acquisition unit 151 generates a PoL request in the same manner as the processing in step S4 of FIG. 5 described above.
  • the activity data fingerprint is stored in the PoL request as metadata_fingerprint.
  • steps S505 to S510 the same processes as steps S5 to S10 in FIG. 5 described above are executed.
  • a PoL Response corresponding to the PoL Request is generated by the Witness around the Prover, and a PoL Block containing the PoL Response is added to the blockchain. That is, the location information of the Prover at the time of acquisition of the activity data, the location verification information by the Witness, and the fingerprint of the activity data are recorded in the blockchain.
  • steps S502 to S510 are repeatedly executed during the contract period and activity of user U1. For example, every time the user U1 walks a predetermined number of steps (for example, 100 steps), the processing from step S502 to step S510 is executed. As a result, the activity data and location information during activity of the user U1, and the location proof information for the location information are recorded in the blockchain.
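The periodic recording loop described above (one record per predetermined number of steps) can be sketched as follows; the class structure and the list standing in for the blockchain are illustrative, and the 100-step threshold is taken from the example in the text.

```python
STEPS_PER_RECORD = 100  # threshold from the example above

class ActivityRecorder:
    """Sketch of the step S502-S510 loop: every STEPS_PER_RECORD steps,
    bundle the accumulated activity with the current position and record
    it (here, by appending to a list standing in for the blockchain)."""

    def __init__(self):
        self.steps_since_record = 0
        self.chain = []

    def on_step(self, position):
        self.steps_since_record += 1
        if self.steps_since_record >= STEPS_PER_RECORD:
            # In the real flow this would trigger PoL request generation
            # (steps S503-S510), not a direct append.
            self.chain.append({"position": position,
                               "steps": self.steps_since_record})
            self.steps_since_record = 0

recorder = ActivityRecorder()
for i in range(250):                      # the user walks 250 steps
    recorder.on_step(position=(35.68, 139.76 + i * 1e-5))
print(len(recorder.chain))                # 2 records (at steps 100 and 200)
```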
  • step S511 the declaration unit 135 of the Prover generates and transmits declaration data.
  • the declaration unit 135 generates declaration data including the activity data and movement data of the user U1, the registration ID of the user U1, and the Prover's public key.
  • the movement data is, for example, data in which the position information generated when the activity data is acquired is arranged in chronological order.
  • the declaration unit 135 transmits the declaration data to the server 12 via the communication unit 110 and the network 21.
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • the verification unit 232 of the server 12 requests verification of the declaration data. Specifically, the verification unit 232 generates a fingerprint for each piece of activity data included in the declaration data. The verification unit 232 then requests the blockchain network 13, via the communication unit 207 and the network 21, to collate each combination of the location information contained in the movement data and the corresponding activity data fingerprint.
  • step S513 the blockchain network 13 verifies the declaration data and transmits the verification result. Specifically, for each combination of location information and activity data fingerprint, the blockchain network 13 extracts PoL blocks containing matching location information and fingerprints from the blockchain. The blockchain network 13 transmits the extracted PoL blocks to the server 12 via the network 21.
  • the CPU 201 of the server 12 receives each PoL block via the network 21 and the communication unit 207.
  • step S514 the server 12 calculates insurance premiums, etc., based on the declaration data.
  • the verification unit 232 of the server 12 verifies the declaration data by the same processing as in step S105 of FIG. 13 described above.
  • the verification unit 232 supplies the movement data to the calculation unit 241 when it is determined that the activity data included in the report data and the location information included in the movement data are authentic as a result of the verification.
  • the calculation unit 241 estimates the activity content of the user U1 during the contract period based on each combination of the position information and the activity data. For example, the calculation unit 241 estimates the walking distance or the like during the contract period of the user U1.
  • the calculation unit 241 calculates insurance premiums or benefits for user U1 based on the estimated activity content. For example, the calculation unit 241 calculates a discount on insurance premiums for the next contract period of user U1, an amount to be cashed back for insurance premiums for the current contract period, and the like, based on the estimated activity content.
  • the calculation unit 241 supplies the calculation result of insurance premiums or benefits to the execution unit 242.
  • step S515 the server 12 and the Prover execute settlement processing for insurance premiums, etc., similar to the processing in step S314 of FIG. 19 described above.
  • each information processing terminal 11 saves the image data obtained while shooting in the server 12, and records the location information and location certification information at the time of shooting in the blockchain. It is conceivable to use the image data stored in the server 12 as eyewitness information when an accident occurs, using the location information and location certification information recorded in the blockchain.
  • step S601 the communication unit 110 of the Prover checks the communication status with surrounding witnesses.
  • step S602 the photographing unit 107 of the Prover photographs an image (still image or moving image).
  • the photographing unit 107 supplies image data corresponding to the photographed image to the CPU 101.
  • step S603 the image registration unit 153 of the Prover encrypts and uploads the image data. Specifically, the image registration unit 153 encrypts the image data using the encryption key held by the Prover immediately after the photographing is completed. The image registration unit 153 transmits the encrypted image data to the server 12 via the communication unit 110 and the network 21.
  • the CPU 201 of the server 12 receives the image data via the network 21 and the communication unit 207.
  • step S604 the position detection unit 132 of the Prover generates position information and calculates a hash value of the encrypted image data. Specifically, the position detection unit 132 detects the current position of the Prover based on the position detection data output from the GNSS receiver 108. The position detection unit 132 generates position information including the current position and current time of the Prover. The image registration unit 153 calculates a hash value of the image data encrypted in step S603. The image registration unit 153 supplies the calculated hash value to the position detection unit 132.
  • step S605 the information collection unit 233 of the server 12 saves the encrypted image and its hash value. Specifically, the information collection unit 233 calculates a hash value of the encrypted image data received in the process of step S603. The information collection unit 233 issues an image ID for accessing the encrypted image data. The information collection unit 233 associates the image ID, the encrypted image data, and the hash value, and stores them in the insurance DB 204.
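The encrypt-upload-store flow of steps S603 to S606 might be sketched as follows. The repeating-key XOR cipher is a deliberately toy stand-in for the Prover's real encryption, and the in-memory dict stands in for the insurance DB 204; image IDs are generated as UUIDs for illustration.

```python
import hashlib
import itertools
import uuid

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Stand-in cipher (repeating-key XOR), for illustration only; the
    terminal would use its real encryption key and algorithm here.
    XOR is its own inverse, so the same call also decrypts."""
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

class ImageStore:
    """Server side of steps S605-S606: store the encrypted image with its
    hash value and hand back an image ID for later access."""
    def __init__(self):
        self.db = {}

    def save(self, encrypted: bytes) -> str:
        image_id = uuid.uuid4().hex
        self.db[image_id] = {
            "data": encrypted,
            "hash": hashlib.sha256(encrypted).hexdigest(),
        }
        return image_id

key = b"prover-secret"
plain = b"dashcam frame bytes"
encrypted = toy_encrypt(plain, key)

store = ImageStore()
image_id = store.save(encrypted)

# Later (step S639): the eyewitness hands over image_id and the
# decryption key, and the server decrypts the stored data.
recovered = toy_encrypt(store.db[image_id]["data"], key)
print(recovered == plain)  # True
```

Note the server only ever sees ciphertext until the eyewitness voluntarily supplies the decryption key, which matches the privacy intent of the flow.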
  • step S606 the information collection unit 233 of the server 12 transmits to the Prover, via the communication unit 207 and the network 21, an image ID that allows access to the image saved in step S605.
  • the Prover's CPU 101 receives the image ID via the network 21 and the communication unit 110.
  • the position detection unit 132 of the Prover associates the position information and the encrypted image data obtained in the process of step S604 with the received image ID and stores them in the storage 103.
  • the position detection unit 132 also supplies the position information and the image data to the position certification acquisition unit 151.
  • the Prover generates and transmits a PoL request in the same manner as the process at step S33 in FIG. 10 described above.
  • a fingerprint of metadata, including the image data and the sensor data of a motion sensor that detects the movement of the Prover (hereinafter referred to as motion data), is stored in the PoL request as metadata_fingerprint.
  • steps S608 through S613 the same processes as in steps S34 through S39 of FIG. 10 described above are executed.
  • a PoL Response corresponding to the PoL Request is generated by the Witness around the Prover, and a PoL Block containing the PoL Response is added to the blockchain. That is, the location information at the time the image was taken and the location proof information by Witness are recorded in the blockchain together with the hash value of the image data and the fingerprint of the metadata including the motion data.
  • the information processing terminal 11 of the accident party (accident victim or perpetrator) is simply referred to as the accident party.
  • the information processing terminal 11 of the accident eyewitness is simply referred to as the accident eyewitness.
  • step S631 the party involved in the accident generates report data and transmits it to the server 12 in the same manner as in the process of step S211 in FIG. 14 described above.
  • the reporting data includes, for example, location information of the parties involved in the accident and accident data when the accident occurred.
  • the accident data includes, for example, image data corresponding to images captured when the accident occurred, and motion data acquired when the accident occurred.
  • the CPU 201 of the server 12 receives the declaration data via the network 21 and the communication unit 207.
  • the verification unit 232 of the server 12 generates a fingerprint of the accident data included in the report data.
  • step S633 the server 12 makes a PoL record query based on the declaration data, similar to the processing in step S233 of FIG. 18 described above.
  • step S634 the blockchain network 13 extracts PoL records based on the query and transmits them to the server 12, similar to the process of step S234 in FIG. 18 described above.
  • step S635 the server 12 verifies the report data and estimates the accident eyewitness by the same processing as in step S235 of FIG. 18 described above.
  • step S636 the server 12 generates an eyewitness questionnaire through the same processing as in step S236 of FIG. 18 described above, and broadcasts it to the accident eyewitnesses.
  • the CPU 101 of the accident eyewitness receives the eyewitness questionnaire via the network 21 and the communication unit 110.
  • step S637 the eyewitness to the accident confirms the public key of the eyewitness questionnaire by the same process as in step S237 of FIG. 18 described above.
  • step S638 the information provision unit 162 of the accident eyewitness generates responses to the eyewitness questionnaire and transmits them to the server 12, in the same manner as in the process of step S216 in FIG. 14 described above.
  • unlike the process of FIG. 18, in which the responses to the eyewitness questionnaire included image data taken near the accident site before and after the accident, here the responses include the image ID assigned by the server 12 to the image data corresponding to the images taken near the accident site before and after the accident.
  • Responses to the eyewitness questionnaire also include decryption keys corresponding to encryption keys used to encrypt image data corresponding to images taken near the accident site before and after the accident.
  • the CPU 201 of the server 12 receives the responses to the eyewitness questionnaire via the network 21 and the communication unit 207.
  • In step S639, the information collection unit 233 of the server 12 verifies the answers to the eyewitness questionnaire and acquires images. Specifically, the information collection unit 233 verifies the answers to the eyewitness questionnaire by the same processing as in step S239 of FIG. 18 described above.
  • if the information collection unit 233 determines that the answers to the eyewitness questionnaire are authentic, it acquires the image data corresponding to the image ID included in the answers from the insurance DB 204.
  • the information collection unit 233 then decrypts the acquired image data using the decryption key included in the answers to the eyewitness questionnaire. As a result, image data corresponding to images taken near the accident site before and after the accident is acquired.
  • In steps S640 and S641, processing similar to steps S239 and S240 in FIG. 18 described above is executed.
  • the image data obtained by the information processing terminal 11 is encrypted and stored in the server 12 .
  • fingerprints of the metadata, including the image data and motion data, are also recorded on the blockchain in association with the location information and location proof information at the time of capture. This guarantees the authenticity of the image data and of its location information at the time of shooting. Therefore, image data whose authenticity is guaranteed can be used as eyewitness information.
  • since the image data is stored in the server 12, it can be used later as eyewitness information even if the image data stored in the information processing terminal 11 is deleted.
  • the position information may be generated at a predetermined timing other than the end of the above-described image capturing and recorded together with the position certification information during the period from the start to the end of image capturing. Further, when shooting a moving image, location information may be generated at a plurality of timings from the start to the end of shooting the moving image and recorded together with the location certification information.
  • when the information processing terminal 11 is a mobile information terminal such as a smartphone, the above app is not necessarily running and able to communicate. Therefore, even if an information processing terminal 11 exists around the Prover, it does not necessarily become a Witness.
  • the number of Witnesses around each information processing terminal 11 can be increased, for example, by installing information processing terminals 11 outdoors or indoors. This makes it possible to increase the reliability of the position proof information.
  • the information processing terminal 11 installed outdoors or indoors may operate only as a Witness, or may operate as a Prover. In the latter case, for example, the information processing terminal 11 normally operates as a Witness, and operates as a Prover when requested by another information processing terminal 11 .
  • FIG. 25 shows a configuration example of an information processing system 401 that is a second embodiment of an information processing system to which the present technology is applied.
  • the information processing system 401 is a system that collects image data corresponding to images taken near the accident site before and after the occurrence of the accident.
  • the information processing system 401 includes accident trigger generators 411-1 to 411-m, camera 412-1 to camera 412-n, and a server 413.
  • the accident trigger generators 411-1 to 411-m, the cameras 412-1 to 412-n, and the server 413 are connected via a network 421 and can communicate with each other.
  • the accident trigger generators 411-1 to 411-m and the cameras 412-1 to 412-n can also communicate directly using short-range wireless communication without going through the network 421.
  • the accident trigger generators 411-1 to 411-m are simply referred to as the accident trigger generator 411 when there is no need to distinguish them individually.
  • the cameras 412-1 to 412-n are simply referred to as the camera 412 when there is no need to distinguish them individually.
  • the accident trigger generator 411 is composed of, for example, a portable information processing device that can be carried by the user or worn by the user.
  • the accident trigger generator 411 performs an accident detection process, and when an accident is detected, transmits a request trigger to the camera 412 present in the surroundings to request position proof.
  • the camera 412 is configured by an information processing device having a photographing function and a communication function.
  • for example, the camera 412 is configured by a portable information processing device that can be carried or worn by the user.
  • for example, the camera 412 is configured by a smartphone, a mobile phone, a tablet terminal, a wearable device, an action camera, a portable game machine, or the like.
  • for example, the camera 412 is configured by an information processing device, such as a drive recorder, that is mounted on a mobile object such as a vehicle (including two-wheeled vehicles) and photographs and records the surroundings of the mobile object.
  • the camera 412 is configured by, for example, a dedicated camera installed at an arbitrary location outdoors or indoors.
  • the camera 412 receives the private key from the server 413 via the network 421 .
  • the camera 412 captures images of the surroundings and superimposes a watermark on the obtained image data using the secret key received from the server 413.
  • the camera 412 stores the watermark-superimposed image data in association with position information indicating the shooting position and shooting time.
  • when receiving the request trigger, the camera 412 collects the image data corresponding to the images captured during the period before and after receiving the request trigger (near the time when the position proof was requested), and transmits the image data and the location information associated with it to the server 413 via the network 421.
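The collection of stored footage around the trigger time described above can be sketched as follows. The 30-second window and the (timestamp, frame) storage format are assumptions for illustration; the specification only says "the period before and after receiving the request trigger":

```python
from typing import List, Tuple

def frames_around_trigger(
    stored: List[Tuple[float, bytes]],   # (capture_time, frame_data), oldest first
    trigger_time: float,
    before_s: float = 30.0,
    after_s: float = 30.0,
) -> List[Tuple[float, bytes]]:
    """Return frames captured within [trigger - before_s, trigger + after_s]."""
    lo, hi = trigger_time - before_s, trigger_time + after_s
    return [(t, f) for t, f in stored if lo <= t <= hi]

# Example: one stored frame every 10 s; request trigger arrives at t = 100
stored = [(float(t), b"frame") for t in range(0, 200, 10)]
window = frames_around_trigger(stored, trigger_time=100.0)
print([t for t, _ in window])  # [70.0, 80.0, 90.0, 100.0, 110.0, 120.0, 130.0]
```

Only this window, together with its associated location information, would then be sent to the server 413.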
  • the server 413 generates a private key and transmits it to the camera 412 via the network 421.
  • the server 413 receives image data and location information from the camera 412 via the network 421 and verifies the received image data. If the image data is valid, the server 413 creates a PoL block containing the image data and location information and adds it to the blockchain.
  • the server 413 also transmits the PoL block to other nodes (not shown) that make up the blockchain network to add it to the blockchain.
  • FIG. 26 is a block diagram showing a functional configuration example of the accident trigger generator 411. As shown in FIG.
  • the accident trigger generator 411 includes a CPU 501, a memory 502, a storage 503, an operation unit 504, a display unit 505, a speaker 506, an imaging unit 507, a sensing unit 508, a communication unit 509, an external I/F 510, and a drive 511.
  • the CPU 501 to drive 511 are connected to a bus and perform necessary communications with each other.
  • the CPU 501 through the drive 511 are configured similarly to the CPU 101 through the imaging unit 107 and the sensing unit 109 through the drive 112 of the information processing terminal 11 in FIG.
  • the program executed by the CPU 501 can be recorded in advance in the storage 503 as a recording medium built into the accident trigger generator 411 .
  • the program can be stored (recorded) in the removable media 510A, provided as package software, and installed in the accident trigger generator 411 from the removable media 510A.
  • the program can be downloaded from another server (not shown) or the like via the network 421 and the communication unit 509 and installed in the accident trigger generator 411.
  • the control unit 531 controls the processing of each unit of the accident trigger generator 411.
  • the accident detection unit 532 detects accidents related to the user of the accident trigger generator 411 or accidents occurring around the accident trigger generator 411. When detecting an accident, the accident detection unit 532 transmits a request trigger by BT to the cameras 412 around the accident trigger generator 411 via the communication unit 509.
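The specification does not describe the detection algorithm itself. As an illustration only, one common approach is to flag an impact when the acceleration magnitude from the sensing unit exceeds a threshold; the 4 g threshold and the sample format below are assumptions, not values from the specification:

```python
import math
from typing import Iterable, Tuple

G = 9.8  # gravitational acceleration, m/s^2

def detect_impact(samples: Iterable[Tuple[float, float, float]],
                  threshold_g: float = 4.0) -> bool:
    """Flag a possible accident when any (ax, ay, az) sample's magnitude
    exceeds threshold_g times gravity."""
    for ax, ay, az in samples:
        if math.sqrt(ax * ax + ay * ay + az * az) >= threshold_g * G:
            return True
    return False

normal = [(0.0, 0.0, 9.8), (0.5, 0.2, 9.7)]     # ordinary motion
crash = normal + [(30.0, 25.0, 15.0)]           # sudden spike
print(detect_impact(normal), detect_impact(crash))  # False True
```

In practice the unit would likely combine such sensor cues with the image data from the imaging unit 507, as the text above suggests.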
  • FIG. 27 is a block diagram showing a functional configuration example of the camera 412. As shown in FIG.
  • the camera 412 includes a CPU 601, a memory 602, a storage 603, an operation unit 604, a display unit 605, a speaker 606, an imaging unit 607, a GNSS receiver 608, a sensing unit 609, a communication unit 610, an external I/F 611, and a drive 612.
  • the CPU 601 to drive 612 are connected to a bus and perform necessary communications with each other.
  • the CPU 601 to drive 612 are configured similarly to the CPU 101 to drive 112 of the information processing terminal 11 in FIG.
  • the program executed by the CPU 601 can be recorded in advance in the storage 603 as a recording medium built into the camera 412 .
  • the program can be stored (recorded) in the removable media 612A, provided as package software, and installed in the camera 412 from the removable media 612A.
  • the program can be downloaded from another server (not shown) or the like via the network 421 and the communication unit 610 and installed in the camera 412 .
  • Functions including a control unit 631, a position detection unit 632, a watermark superimposition unit 633, and a position verification unit 634 are realized by the CPU 601 executing the program installed in the camera 412.
  • a control unit 631 controls processing of each unit of the camera 412 .
  • the position detection unit 632 detects the current position of the camera 412 based on the position detection data output from the GNSS receiver 608 .
  • the position detection unit 632 generates position information including the detected current position and current time.
  • the watermark superimposing unit 633 receives the secret key from the server 413 via the network 421 and the communication unit 610.
  • the watermark superimposing unit 633 generates a watermark using the secret key and superimposes the watermark on the image data supplied from the imaging unit 607.
  • the watermark superimposing unit 633 associates the watermark-superimposed image data with position information indicating the shooting position of the image data and stores them in the storage 603 .
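The specification does not state the watermarking algorithm. As an illustration only, the following sketch embeds a keyed HMAC of the frame into the least-significant bits of the frame bytes, so that the server, holding the same secret key, can later check both the key and the frame's integrity; a production system would use a proper robust image watermark instead:

```python
import hashlib
import hmac

def embed_watermark(image: bytearray, secret_key: bytes) -> bytearray:
    """Embed an HMAC-SHA256 of the image's high bits into its LSBs.

    The HMAC is computed over the image with all LSBs zeroed, so a verifier
    can recompute it from the watermarked image itself.
    """
    carrier = bytearray(b & 0xFE for b in image)               # clear LSBs
    mac = hmac.new(secret_key, bytes(carrier), hashlib.sha256).digest()
    bits = [(byte >> i) & 1 for byte in mac for i in range(8)]  # 256 bits
    for i, bit in enumerate(bits):
        carrier[i] |= bit                                       # write mark
    return carrier

def verify_watermark(image: bytes, secret_key: bytes) -> bool:
    carrier = bytes(b & 0xFE for b in image)
    mac = hmac.new(secret_key, carrier, hashlib.sha256).digest()
    bits = [(byte >> i) & 1 for byte in mac for i in range(8)]
    return all((image[i] & 1) == bit for i, bit in enumerate(bits))

key = b"shared-secret"                 # the key shared in steps S701/S702
frame = bytearray(range(256)) * 4      # stand-in for raw pixel data
marked = embed_watermark(frame, key)
print(verify_watermark(bytes(marked), key))   # True

tampered = bytearray(marked)
tampered[0] ^= 0x80                    # flip a non-LSB pixel bit
print(verify_watermark(bytes(tampered), key)) # False
```

Any modification of the frame (or use of the wrong key) changes the recomputed HMAC, so the embedded mark no longer matches.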
  • the location certification unit 634 acquires, from the storage 603, image data for a predetermined interval before and after receiving the request trigger and the location information associated with the image data.
  • the location certification unit 634 transmits location certification information including the acquired image data and location information to the server 413 via the communication unit 610 and the network 421.
  • FIG. 28 is a block diagram showing a functional configuration example of the server 413. As shown in FIG.
  • the server 413 includes a CPU 701 , a memory 702 , a storage 703 , an image DB (Data Base) 704 , an operation section 705 , a display section 706 , a communication section 707 , an external I/F 708 and a drive 709 .
  • the CPU 701 to drive 709 are connected to a bus and perform necessary communications with each other.
  • the CPU 701 to storage 703 and operation unit 705 to drive 709 are configured in the same manner as the CPU 201 to storage 203 and operation unit 205 to drive 209 of the server 12 in FIG.
  • the image DB 704 accumulates map information, image data around each position of the map information, and feature data indicating features of each image data.
  • the program executed by the CPU 701 can be recorded in advance in the storage 703 as a recording medium incorporated in the server 413 .
  • the program can be stored (recorded) in the removable media 709A, provided as package software, and installed in the server 413 from the removable media 709A.
  • the program can be downloaded from another server (not shown) or the like via the network 421 and the communication unit 707 and installed on the server 413 .
  • Functions including a control unit 731, a position verification unit 732, and a PoL block generation unit 733 are realized by the CPU 701 executing a program installed in the server 413.
  • the control unit 731 controls the processing of each unit of the server 413.
  • the position verification unit 732 verifies the watermark superimposed on the image data included in the location certification information received from the camera 412, and also verifies the position information included in the location certification information by comparing the image data accumulated in the image DB 704 with the image data included in the location certification information.
  • the position verification section 732 includes a private key generation section 741 , a watermark extraction section 742 , a watermark verification section 743 , a feature extraction section 744 and a feature verification section 745 .
  • a secret key generation unit 741 generates a secret key and transmits it to the camera 412 via the communication unit 707 and network 421 .
  • the watermark extraction unit 742 extracts the watermark from the image data received from the camera 412 using the secret key generated by the secret key generation unit 741 and supplies the extracted watermark to the watermark verification unit 743 .
  • a watermark verification unit 743 verifies the watermark extracted from the image data and supplies the verification result to the feature extraction unit 744 .
  • the feature extraction unit 744 extracts features of the image data received from the camera 412 and supplies data indicating the extraction result of the features of the image data to the feature verification unit 745 .
  • the feature verification unit 745 verifies the position information received from the camera 412 by comparing the features extracted from the image data received from the camera 412 with the features of the image data stored in the image DB 704.
  • the PoL block generation unit 733 mines the image data and position information received from the camera 412, generates a PoL block, and adds it to the blockchain. In addition, the PoL block generation unit 733 transmits the PoL block to other nodes constituting the blockchain network to add it to the blockchain.
  • In step S701, the private key generation unit 741 of the server 413 generates a private key.
  • In step S702, the private key generation unit 741 of the server 413 shares the private key. Specifically, the private key generation unit 741 transmits the private key to the camera 412 via the communication unit 707 and the network 421.
  • the watermark superimposing unit 633 of the camera 412 receives the secret key via the communication unit 610 and stores it in the storage 603.
  • the secret key is shared between the server 413 and the camera 412.
  • Any method of transmitting the private key to the camera 412 can be adopted as long as the private key can be transmitted safely and secretly.
  • In step S703, the camera 412 starts capturing moving images, superimposing watermarks, and storing position information.
  • the imaging unit 607 shoots a moving image and starts processing to supply the obtained moving image data to the watermark superimposing unit 633.
  • the position detection unit 632 starts processing for detecting the current position of the camera 412 based on the position detection data output from the GNSS receiver 608 .
  • the position detection unit 632 generates position information including the detected current position and current time, and starts processing to supply the generated position information to the watermark superimposition unit 633 .
  • the watermark superimposing unit 633 generates a watermark using the secret key stored in the storage 603 and starts superimposing it on each frame of the video data. Also, the watermark superimposing unit 633 associates each frame of the watermark superimposed moving image data with the position information, and starts the process of storing them in the storage 603 .
  • the position information may instead be associated with every predetermined number of frames.
  • In step S704, the accident detection unit 532 of the accident trigger generator 411 detects the occurrence of an accident based on at least one of the image data output from the imaging unit 507 and the sensor data output from the sensing unit 508.
  • Accidents to be detected include not only accidents related to the user who owns the accident trigger generator 411 but also accidents occurring around the accident trigger generator 411 .
  • In step S705, the accident detection unit 532 of the accident trigger generator 411 scans for surrounding cameras 412 (Witnesses) via the communication unit 509. For example, the accident detection unit 532 scans for cameras 412 within the BT communication range of the communication unit 509.
  • In step S706, the accident detection unit 532 of the accident trigger generator 411 transmits a request trigger. Specifically, the accident detection unit 532 transmits a request trigger by BT to the cameras 412 detected in the process of step S705 via the communication unit 509.
  • the CPU 601 of the camera 412 receives the request trigger via the communication unit 610 .
  • In step S707, the location certification unit 634 of the camera 412 confirms whether or not moving image data for the target period exists. Specifically, the location certification unit 634 confirms whether moving image data for a predetermined period before and after receiving the request trigger (that is, before and after the accident was detected by the accident trigger generator 411) is stored in the storage 603.
  • In step S708, the camera 412 transmits the moving image data and the position information to the server 413.
  • the location certification unit 634 reads from the storage 603 the moving image data for a predetermined target period before and after receiving the request trigger and the location information associated with each frame of the moving image data.
  • Location certification unit 634 transmits location certification information including the read moving image data and location information to server 413 via communication unit 610 and network 421 .
  • the CPU 701 of the server 413 receives the location certification information via the network 421 and the communication unit 707.
  • In step S709, the server 413 performs position verification processing.
  • In step S710, the server 413 performs mining and generates a PoL block.
  • In step S711, the server 413 verifies the PoL block and adds it to the blockchain.
  • In step S731, the watermark extraction unit 742 extracts the watermark from the video data. Specifically, the watermark extraction unit 742 extracts the watermark of each frame of the moving image data included in the location certification information received from the camera 412, using the private key generated in the process of step S701. The watermark extraction unit 742 supplies the extracted watermark to the watermark verification unit 743.
  • In step S732, the watermark verification unit 743 determines whether the watermark extracted from the video data is valid. If the watermark is determined to be valid, the process proceeds to step S733.
  • In step S733, the feature extraction unit 744 extracts features of the video data. Specifically, the watermark verification unit 743 notifies the feature extraction unit 744 that the watermark of the video data is valid.
  • the feature extraction unit 744 uses an arbitrary method to extract features of each frame of video data.
  • the features extracted at this time are features that change little over time.
  • the feature extraction unit 744 uses a SIFT (Scale Invariant Feature Transform) feature amount or the like to extract feature points of stationary objects such as buildings and mountains that change little.
  • the feature extraction unit 744 recognizes characters such as guide signs and signboards in the moving image, and extracts the recognized characters as features.
  • the feature extraction unit 744 supplies the feature verification unit 745 with data indicating the extraction result of the feature of the moving image data.
  • In step S734, the feature verification unit 745 refers to the image DB 704 using the position information received from the camera 412 as a key. That is, the feature verification unit 745 extracts from the image DB 704 a plurality of image data at the positions indicated by the position information associated with each frame of the received moving image data, as image data corresponding to each frame of the moving image data. The feature verification unit 745 also extracts from the image DB 704 feature data indicating the features of each extracted image data.
  • In step S735, the feature verification unit 745 determines whether the correlation between the features of the moving image data and the features in the image DB 704 is equal to or greater than a threshold. Specifically, the feature verification unit 745 calculates the correlation between the features extracted from each frame of the moving image data and the features of the image data extracted from the image DB 704 corresponding to each frame. When the feature verification unit 745 determines that the calculated correlation is equal to or greater than a predetermined threshold, that is, when the features of each frame of the moving image data are similar to the features of the corresponding image data in the image DB 704, the process proceeds to step S736.
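The correlation measure and threshold are left unspecified in the text above. A simple stand-in is cosine similarity between feature vectors; the 0.8 threshold and the example vectors below are hypothetical:

```python
import math
from typing import Sequence

def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    """Cosine of the angle between two feature vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

THRESHOLD = 0.8  # assumed value; the specification leaves the threshold open

frame_features = [0.9, 0.1, 0.4, 0.7]   # extracted from a received frame
db_features = [0.8, 0.2, 0.5, 0.6]      # reference features from the image DB 704
score = cosine_similarity(frame_features, db_features)
print(score >= THRESHOLD)  # True: the frame resembles the DB imagery
```

If the score falls below the threshold for a frame, the position claim for that frame would be treated as unverified, matching the branch described for step S735.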
  • in this case, it can be determined that the moving image data received from the camera 412 corresponds to video taken at the position indicated by the position information received from the camera 412, in other words, that the camera 412 was present at the position indicated by the position information when the moving image was captured. That is, the authenticity of the moving image data and the location information included in the location certification information is verified.
  • In step S736, the PoL block generation unit 733 performs mining and generates a PoL block. Specifically, the feature verification unit 745 notifies the PoL block generation unit 733 that the moving image data and position information received from the camera 412 are authentic.
  • the PoL block generation unit 733 then mines the moving image data and the location information by a predetermined method and generates a PoL block including the location certification information.
  • In step S737, the PoL block generation unit 733 verifies the PoL block and adds it to the blockchain. Specifically, the PoL block generation unit 733 verifies the PoL block by a predetermined method and adds it to the blockchain when determining that it is valid. In addition, when the PoL block generation unit 733 determines that the PoL block is valid, it transmits the PoL block to the other nodes constituting the blockchain network so that they add it to the blockchain.
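As a rough illustration of steps S736 and S737, a PoL block can link to its predecessor by hash so that later tampering with any recorded location proof is detectable. The field names below are assumptions, and real mining/consensus is omitted:

```python
import hashlib
import json
import time

def make_pol_block(prev_hash: str, location_proof: dict) -> dict:
    """Create a PoL block linking to the previous block by hash."""
    body = {
        "prev_hash": prev_hash,
        "timestamp": time.time(),
        "location_proof": location_proof,  # e.g. video fingerprint + location
    }
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

def verify_chain(chain: list) -> bool:
    """Check per-block integrity and hash linkage across the chain."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_pol_block("0" * 64, {"note": "genesis"})
b1 = make_pol_block(genesis["hash"],
                    {"video_sha256": "video-fingerprint-stub",
                     "lat": 35.68, "lon": 139.76})
print(verify_chain([genesis, b1]))  # True
```

Altering any block's `location_proof` after the fact invalidates its hash and therefore the chain, which is the property the PoL record relies on.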
  • on the other hand, if the feature verification unit 745 determines in step S735 that the calculated correlation is less than the predetermined threshold, that is, if the features of each frame of the moving image data and the features of the corresponding image data in the image DB 704 are not similar, the processes of steps S736 and S737 are skipped and the position verification process ends. This is the case where at least one of the video data and the location information included in the location proof information is not authentic.
  • the camera 412 can transmit still image data instead of moving image data.
  • the accident trigger generator 411 and the camera 412 may be housed in one housing to constitute one device. In this case, for example, duplicated functions can be eliminated.
  • the accident trigger generator 411 can transmit a witness request trigger when a trigger other than an accident (eg, a predetermined event or predetermined timing) is detected. This makes it possible to collect moving image data and position information captured before and after the occurrence of a predetermined trigger.
  • the position of the information processing terminal 11 is represented by latitude and longitude, but the position of the information processing terminal 11 may be represented by other methods.
  • the position of the information processing terminal 11 may be represented using Geohash, S2 Geometry, etc., in consideration of the ease of searching for the position, the protection of privacy, and the like.
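For reference, Geohash encodes a latitude/longitude pair into a short base-32 string whose shared prefixes correspond to nearby areas, which is what makes it convenient for position search and coarse-grained privacy. A compact pure-Python encoder following the standard bit-interleaving scheme:

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # Geohash base-32 alphabet

def geohash_encode(lat: float, lon: float, precision: int = 9) -> str:
    """Encode latitude/longitude as a Geohash string.

    Alternates longitude and latitude bisection bits (longitude first)
    and packs them 5 bits per base-32 character.
    """
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    bits, bit_count, even = 0, 0, True
    result = []
    while len(result) < precision:
        if even:  # longitude bit
            mid = (lon_lo + lon_hi) / 2
            if lon >= mid:
                bits, lon_lo = (bits << 1) | 1, mid
            else:
                bits, lon_hi = bits << 1, mid
        else:     # latitude bit
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                bits, lat_lo = (bits << 1) | 1, mid
            else:
                bits, lat_hi = bits << 1, mid
        even = not even
        bit_count += 1
        if bit_count == 5:
            result.append(BASE32[bits])
            bits, bit_count = 0, 0
    return "".join(result)

print(geohash_encode(35.681236, 139.767125, 7))  # a 7-character geohash
```

Truncating the string widens the cell, so a terminal could, for example, publish only a short prefix when finer location disclosure is undesirable.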
  • the position of the information processing terminal 11 may be represented by adding the altitude to the latitude and longitude.
  • the method of detecting the position of the information processing terminal 11 is not limited to the method using the GNSS receiver 108 described above, and other methods can be used.
  • the server 12 in FIG. 1 may constitute one of the nodes of the blockchain network 13.
  • this technology can be used in other technologies and situations where it is necessary to guarantee the authenticity of the user's location, other than the insurance example described above.
  • the program executed by the computer may be a program in which processing is performed in chronological order according to the order described in this specification, or a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
  • in this specification, a system means a set of multiple components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device housing a plurality of modules in one housing, are both systems.
  • this technology can take the configuration of cloud computing in which one function is shared by multiple devices via a network and processed jointly.
  • each step described in the flowchart above can be executed by a single device, or can be shared by a plurality of devices.
  • when one step includes multiple processes, the multiple processes included in that one step can be executed by one device or shared by multiple devices.
  • a position detection unit that detects a current position when a predetermined trigger is detected and generates first position information including the current position and the current time
  • a location certification acquisition unit that requests location certification from a first information processing device present in the vicinity when the trigger is detected and receives first location certification information from the first information processing device
  • an information processing apparatus comprising: a location registration unit that transmits the first location information and the first location proof information to a second information processing apparatus that records the first location information and the first location proof information.
  • the location certification acquisition unit transmits a location certification request including the first location information to the first information processing device by short-range wireless communication, and transmits a location certification response including the first location certification information to the The information processing device according to (1), received from the first information processing device.
  • the location certification request includes the first location information, a public key, and an electronic signature generated from plaintext including the first location information using a private key corresponding to the public key.
  • the information processing device according to (2). (4) the location certification request further includes metadata associated with the first location information or a fingerprint of the metadata, and the plaintext further includes the metadata or the fingerprint. The information processing apparatus according to (3).
  • the metadata includes accident data used to detect an accident, a hash value of the accident data, image data corresponding to an image of the surroundings, a hash value of the image data, activity data used to detect user activity, and at least one of hash values of the activity data.
  • the information processing device according to (4) or (5).
  • An image registration unit that transmits the image data encrypted using the encryption key to a third information processing device and receives an image ID assigned to the image data from the third information processing device.
  • the position detection unit detects a current position and generates the first position information when detecting a predetermined timing from the start to the end of capturing an image corresponding to the image data
  • the location certification acquisition unit requests location certification from the first information processing device when the timing is detected, and receives the first location certification information from the first information processing device.
  • the third information processing device requests the image data corresponding to the image taken near the position and the time indicated by the first position information, the image ID corresponding to the corresponding image data , and an information providing unit configured to transmit a decryption key corresponding to the encryption key to the third information processing apparatus.
  • the information processing apparatus according to (7).
  • the location certification acquisition unit receives the location certification response from the first information processing device. The information processing device according to any one of (3) to (8).
  • the location proof response includes second location information including the current location and current time of the first information processing device, a public key, and an electronic signature generated from plaintext including the second location information using a private key corresponding to the public key.
  • when the location registration unit determines, using the public key and the electronic signature, that the location certification response is valid, the location registration unit transmits a location information transaction including the first location information and the first location certification information to the second information processing device. The information processing device according to (2).
  • the location proof response further includes the location proof request; said plaintext further comprising said location proof request;
  • the location certification acquisition unit transmits the location certification request to the first information processing device existing within a communication range of the short-range wireless communication.
  • the first location proof information includes second location information including the current location and current time of the first information processing device.
  • the report data includes movement data containing the first position information in time series.
  • the report data is used for calculating premiums, insurance benefits, or benefits of the insurance.
  • the trigger is a predetermined event or predetermined timing.
  • the information processing device according to any one of (1) to (17), wherein the first location information and the first location proof information are recorded in a block chain by the second information processing device.
  • the location certification unit receives, from the first information processing device by short-range wireless communication, a location certification request including first location information including the current location and current time of the first information processing device, and transmits a position proof response including position proof information to the first information processing device; the information processing device according to (21).
  • the location certification request includes the first location information, a public key, and an electronic signature generated from plaintext including the first location information using a private key corresponding to the public key;
  • the location certification unit transmits the location certification response to the first information processing device when determining that the location certification request is valid using the public key and the electronic signature.
  • the information processing device further comprising a location detection unit that detects the current location and generates second location information including the current location and the current time when the request for location certification is received;
  • the location proof response includes the second location information, a public key, and a first digital signature generated from a first plaintext containing the second location information using a private key corresponding to the public key.
  • the location proof response further includes the location proof request;
  • the location proof response further includes metadata associated with the second location information or a fingerprint of the metadata;
  • when image data corresponding to an image taken near the position and time indicated by the first position information is requested from the second information processing device that has acquired the first position information and the position proof information; the information processing apparatus according to (24).
  • the information processing apparatus according to any one of (21) to (28), wherein the location proof information includes the location information.
  • the location certification information includes image data superimposed with a watermark corresponding to an image of the surroundings captured around the time when the request for location certification was received, and the location information;
  • the information processing apparatus according to (29), wherein the location certification unit transmits the location certification information to a second information processing apparatus that records the location certification information.
  • an information processing device receives a request for position proof sent from another information processing device in the vicinity that has detected a predetermined trigger,
  • and the computer executes processing for generating position proof information and sending the position proof information;
  • a program for causing the computer to execute the above. (33) report data received from the first information processing device, including first position information including the current position and current time of the first information processing device when the first information processing device detected a predetermined trigger,
  • a verification unit that verifies the report data using position proof information generated by a second information processing device, present in the vicinity of the first information processing device when the trigger was detected, in response to a position proof request from the first information processing device; and
  • an information processing apparatus comprising: an execution unit that executes processing corresponding to the report data when the verification unit determines that the report data is authentic.
  • the verification unit acquires the location proof information from a third information processing device that records a location proof block containing the first location information and the location proof information, based on the first location information.
  • the report data further includes metadata associated with the first location information;
  • the location proof block includes the first location information, the location proof information, and the metadata or a fingerprint of the metadata;
  • the information processing device according to (34), wherein the verification unit acquires the location proof information from the second information processing device based on the first location information and the metadata or the fingerprint.
  • the information processing device according to (35), wherein the execution unit executes processing corresponding to the report data based on the first position information and the metadata.
  • the information processing device according to (34) or (35), wherein the location proof block is recorded in a blockchain.
  • the information processing apparatus according to (33), further comprising an information collecting unit.
  • the position proof information includes second position information including the current position and current time of the second information processing device when the position proof is requested from the first information processing device,
  • the information collection unit receives the first location information, the location proof information, a public key, and an electronic signature generated from a first plaintext containing the first location information and the location proof information using a private key corresponding to the public key.
  • the information collecting unit sends the second position information and the public key to the second information processing device to request the image data, and obtains the image data and an electronic signature generated from a second plaintext including the image data.
  • (40) further comprising an information collecting unit that receives image data encrypted using an encryption key from a third information processing device, transmits an image ID assigned to the image data to the third information processing device, and associates the encrypted image data with the image ID and saves them in a database;
  • the information collecting unit requests from the third information processing device an image taken near the position and time indicated by the first position information, and receives the image ID corresponding to that image data and a decryption key corresponding to the encryption key from the third information processing device;
  • the information processing device according to (33).
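Several of the claim limitations above recite the same exchange: a requesting device signs plaintext containing its location with a private key, a nearby witness device returns a signed response whose plaintext binds the request together with the witness's own location, and a verifier checks the signatures. A minimal sketch of that flow follows; all names and the message layout are illustrative rather than taken from the filing, and an HMAC stands in for the public-key electronic signature the claims describe.

```python
import hashlib
import hmac
import json
import time

# Hypothetical sketch of the claimed location-proof exchange. An HMAC
# stands in for the electronic signature (plaintext signed with a private
# key and verified with the corresponding public key).

def sign(key: bytes, plaintext: bytes) -> bytes:
    # Stand-in "electronic signature" over the plaintext.
    return hmac.new(key, plaintext, hashlib.sha256).digest()

def make_proof_request(key: bytes, lat: float, lon: float) -> dict:
    # First location information: current location plus current time.
    first_location = {"lat": lat, "lon": lon, "time": time.time()}
    plaintext = json.dumps(first_location, sort_keys=True).encode()
    return {"location": first_location, "signature": sign(key, plaintext).hex()}

def make_proof_response(key: bytes, request: dict, lat: float, lon: float) -> dict:
    # The witness embeds its own (second) location information and echoes
    # the request, so the signed plaintext binds both together.
    second_location = {"lat": lat, "lon": lon, "time": time.time()}
    plaintext = json.dumps(
        {"request": request, "location": second_location}, sort_keys=True
    ).encode()
    return {
        "request": request,
        "location": second_location,
        "signature": sign(key, plaintext).hex(),
    }

def verify_response(key: bytes, response: dict) -> bool:
    # Recompute the signature over the claimed plaintext and compare.
    plaintext = json.dumps(
        {"request": response["request"], "location": response["location"]},
        sort_keys=True,
    ).encode()
    return hmac.compare_digest(sign(key, plaintext).hex(), response["signature"])
```

Because the response plaintext includes the request, tampering with either the witness's location or the echoed request invalidates the signature, which is the property the validity checks in the claims rely on.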

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • Health & Medical Sciences (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • General Business, Economics & Management (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Technology Law (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)

Abstract

The present invention relates to an information processing device and a program capable of guaranteeing the authenticity of location information from an information processing device without using a dedicated device. The information processing device comprises: a location detection unit that detects the current location upon detection of a prescribed trigger and generates first location information including the current location and the current time; a location certification acquisition unit that requests location certification from a first information processing device present in the vicinity when the trigger is detected, and receives first location certification information from the first information processing device; and a location registration unit that transmits the first location information and the first location certification information to a second information processing device that records the first location information and the first location certification information. The present invention can be applied, for example, to a system that prepares insurance-related reports.
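The registration step in the abstract (and the blockchain recording recited in claim (18)) amounts to appending each (location information, location certification information) pair to a tamper-evident ledger. The following is a hypothetical hash-chain sketch of that step; the block layout and field names are assumptions, not the structure used in the filing.

```python
import hashlib
import json

# Hypothetical illustration of the recording step: the second device
# appends each (location information, location proof) pair to a simple
# hash chain, so later tampering with any recorded pair is detectable.

def make_block(prev_hash: str, location_info: dict, proof_info: dict) -> dict:
    body = {"prev": prev_hash, "location": location_info, "proof": proof_info}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def chain_is_valid(chain: list) -> bool:
    for i, block in enumerate(chain):
        body = {k: block[k] for k in ("prev", "location", "proof")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if block["hash"] != expected:
            return False  # block contents were tampered with
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False  # the link to the previous block is broken
    return True
```

Each block's hash covers the previous block's hash, so altering any recorded location pair breaks every later link, which is the integrity property the recording device provides to the report verifier.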
PCT/JP2022/004791 2021-06-09 2022-02-08 Dispositif de traitement d'informations et programme WO2022259612A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023527485A JPWO2022259612A1 (fr) 2021-06-09 2022-02-08

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-096304 2021-06-09
JP2021096304 2021-06-09

Publications (1)

Publication Number Publication Date
WO2022259612A1 true WO2022259612A1 (fr) 2022-12-15

Family

ID=84425664

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/004791 WO2022259612A1 (fr) 2021-06-09 2022-02-08 Dispositif de traitement d'informations et programme

Country Status (2)

Country Link
JP (1) JPWO2022259612A1 (fr)
WO (1) WO2022259612A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003065261A1 (fr) * 2002-01-30 2003-08-07 Fujitsu Limited Systeme de transactions sur les assurances et procede utilisant des informations sur les habitudes personnelles
WO2008010287A1 (fr) * 2006-07-20 2008-01-24 Panasonic Corporation Dispositif, système et procédé de vérification de position
JP2017050763A (ja) * 2015-09-03 2017-03-09 日本電信電話株式会社 許諾情報管理システム、利用者端末、権利者端末、許諾情報管理方法、および、許諾情報管理プログラム
CN106897901A (zh) * 2017-02-16 2017-06-27 湖北大学 基于安全位置证明的共享单车安全计费方法

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WENZHE LV; SHENG WU; CHUNXIAO JIANG; YUANHAO CUI; XUESONG QIU; YAN ZHANG: "Decentralized Blockchain for Privacy-Preserving Large-Scale Contact Tracing", arXiv.org, Cornell University Library, Ithaca, NY, 2 July 2020 (2020-07-02), XP081713528 *
WU WEI; LIU ERWU; GONG XINGLIN; WANG RUI: "Blockchain Based Zero-Knowledge Proof of Location in IoT", ICC 2020 - 2020 IEEE International Conference on Communications (ICC), IEEE, 7 June 2020 (2020-06-07), pages 1-7, XP033798373, DOI: 10.1109/ICC40277.2020.9149366 *

Also Published As

Publication number Publication date
JPWO2022259612A1 (fr) 2022-12-15

Similar Documents

Publication Publication Date Title
US10019773B2 (en) Authentication and validation of smartphone imagery
CN110689460B (zh) 基于区块链的交通事故数据处理方法、装置、设备及介质
TWI451283B (zh) 事故資訊整合及管理系統及其相關事故資訊整合及管理方法
CN111460526A (zh) 基于区块链的影像数据记录、获取、验证方法及装置
CN109523413B (zh) 保单处理方法、装置、计算机设备及存储介质
US10824713B2 (en) Spatiotemporal authentication
KR20120069703A (ko) 모바일 투표를 위한 지리적 위치 인증 방법
KR102029128B1 (ko) 운전자간 블랙박스 사고영상을 상호교환하기 위한 iot 플랫폼 및 그 구현 방법
JPWO2005119539A1 (ja) 動作環境を証明する証明書発行サーバ及び証明システム
JP2019133419A (ja) データ送受信方法、データ送受信システム、処理装置、コンピュータプログラム及びシステムの構築方法
CN108268915B (zh) 电子证据固化系统及方法
CN110597906A (zh) 基于区块链的入学积分生成方法、装置、设备及存储介质
KR20160082935A (ko) 미디어 데이터의 제공, 관리, 거래 방법 및 그 장치
KR102231434B1 (ko) 블록체인 기술 기반 운전자 간의 블랙박스 영상의 p2p 거래/공유 서비스를 위한 플랫폼 및 그 구현 방법
WO2022259612A1 (fr) Dispositif de traitement d'informations et programme
JP5112363B2 (ja) ライフログデータの管理システム、管理方法及びプログラム
WO2005107148A1 (fr) Authentication system
US8850198B2 (en) Method for validating a road traffic control transaction
JP2019086904A (ja) 画像管理サーバ及び画像管理方法
JP2004140658A (ja) 情報処理装置、位置情報利用システム、および画像管理方法
JP2019133650A (ja) データ送受信方法
US11914748B2 (en) Apparatus and method for collecting data
JP7322682B2 (ja) 制御方法、情報処理装置、情報処理システム及びプログラム
KR102088099B1 (ko) 위치 정보를 기반으로 하는 블랙박스 영상 정보 공유 시스템 및 방법
JP2011060081A (ja) 画像管理システム及び管理装置及び管理プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22819803

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023527485

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22819803

Country of ref document: EP

Kind code of ref document: A1