WO2021106180A1 - Information processing system, information processing device, terminal device, server device, program, or method - Google Patents

Info

Publication number
WO2021106180A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
terminal device
image
mobile terminal
Prior art date
Application number
PCT/JP2019/046746
Other languages
French (fr)
Japanese (ja)
Inventor
路威 重松
佐々木 雄一
涵 周
山本 正晃
聞浩 周
ジニト バット
翼 岩切
長屋 茂喜
Original Assignee
ニューラルポケット株式会社 (Neural Pocket Inc.)
Priority date
Filing date
Publication date
Application filed by ニューラルポケット株式会社 (Neural Pocket Inc.)
Priority to JP2019566377A priority Critical patent/JP6704568B1/en
Priority to PCT/JP2019/046746 priority patent/WO2021106180A1/en
Publication of WO2021106180A1 publication Critical patent/WO2021106180A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services

Definitions

  • The technology disclosed in this application relates to an information processing system, an information processing device, a server device, a program, or a method.
  • Various embodiments of the present invention provide an information processing system, an information processing device, a terminal device, a server device, a program, or a method in order to solve the above problems.
  • The first system comprises: a first acquisition unit that acquires, from a first mobile terminal device, first vehicle identification information that identifies a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information that determines the driving state of the first vehicle based on the first image;
  • a second acquisition unit that acquires, from a second mobile terminal device, second vehicle identification information that identifies a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information that determines the driving state of the second vehicle based on the second image;
  • a determination unit that determines the identity of the first vehicle and the second vehicle using the first vehicle identification information and the second vehicle identification information; and
  • a statistical processing unit that, when the first vehicle and the second vehicle are the same, generates statistical information relating to the first vehicle using the first vehicle determination information and the second vehicle determination information.
  • The second system is the first system further comprising a transmission unit that transmits the statistical information relating to the first vehicle to the second mobile terminal device when the first vehicle and the second vehicle are determined to be the same.
  • The third system is any one of the first and second systems, further including a third acquisition unit that acquires information related to an object in the first image from the first mobile terminal device.
  • The fourth system is the third system in which the information related to the object includes information related to the operation of the wipers, information related to the road, information related to the sidewalk, information related to an event on the roadway, information related to an advertisement, and/or fuel information for the vehicle, in the first image.
  • The fifth system is any one of the first to fourth systems, comprising: an acquisition unit that acquires third vehicle identification information identifying a third vehicle in an image captured by a third mobile terminal device, and a message transmitted by the third mobile terminal device;
  • an acquisition unit that acquires fourth vehicle identification information identifying a fourth vehicle registered as the own vehicle in a fourth mobile terminal device; and a transmission unit that transmits the message to the fourth mobile terminal device when the third vehicle and the fourth vehicle are determined to be the same vehicle.
  • The sixth system is any one of the first to fourth systems, comprising: an acquisition unit that acquires third vehicle identification information identifying a third vehicle registered as the own vehicle in a third mobile terminal device, and a message transmitted by the third mobile terminal device;
  • an acquisition unit that acquires fifth vehicle identification information identifying a fifth vehicle related to an image captured by a fifth mobile terminal device; and
  • a transmission unit that transmits the message to the fifth mobile terminal device.
  • The seventh system is the second system in which, after the first vehicle and the second vehicle are determined to be the same, the transmission unit transmits the statistical information relating to the first vehicle within a predetermined time.
  • The eighth system is any one of the first to seventh systems in which the first mobile terminal device and the second mobile terminal device are different mobile terminal devices.
  • The ninth system is any one of the first to eighth systems in which the first acquisition unit acquires a moving image including the first image from the first mobile terminal device.
  • The tenth system is the ninth system in which the moving image is a compressed moving image.
  • The eleventh system is any one of the first to tenth systems in which the first vehicle determination information is information generated by a machine-learned device in the first mobile terminal device.
  • The twelfth system is any one of the first to eleventh systems in which the first vehicle determination information and the second vehicle determination information each include sudden steering, sudden acceleration, and/or sudden braking.
  • The thirteenth method is a method in which a computer executes: an acquisition step of acquiring, from a first mobile terminal device, first vehicle identification information that identifies a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information that determines the driving state of the first vehicle based on the first image;
  • an acquisition step of acquiring, from a second mobile terminal device, second vehicle identification information that identifies a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information that determines the driving state of the second vehicle based on the second image;
  • a determination step of determining the identity of the first vehicle and the second vehicle; and, when the first vehicle and the second vehicle are the same, a statistical processing step of generating statistical information relating to the first vehicle using the first vehicle determination information and the second vehicle determination information.
  • The fourteenth program is a program that causes a computer to operate as:
  • an acquisition means that acquires, from a first mobile terminal device, first vehicle identification information that identifies a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information that determines the driving state of the first vehicle based on the first image;
  • an acquisition means that acquires, from a second mobile terminal device, second vehicle identification information that identifies a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information that determines the driving state of the second vehicle based on the second image; and
  • a determination means that determines the identity of the first vehicle and the second vehicle, and a statistical processing means that, when the first vehicle and the second vehicle are the same, generates statistical information relating to the first vehicle using the first vehicle determination information and the second vehicle determination information.
  • The fifteenth program is a program for operating a computer as any one of the first to twelfth systems.
  • According to the embodiments described above, image information can be used more appropriately.
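The flow of the first and second systems (acquire reports from two terminal devices, determine vehicle identity, aggregate determination information, and transmit statistics) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the names `Report`, `same_vehicle`, and `statistics_for`, and matching vehicles by normalized plate strings, are assumptions.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Report:
    """A report uploaded by one mobile terminal device (hypothetical structure)."""
    plate: str                                   # vehicle identification information
    events: list = field(default_factory=list)   # vehicle determination information

def same_vehicle(a: Report, b: Report) -> bool:
    # Identity determination: here, simply compare normalized plate strings.
    return a.plate.replace(" ", "") == b.plate.replace(" ", "")

def statistics_for(reports: list) -> Counter:
    # Statistical processing: aggregate determination information across reports.
    counts = Counter()
    for r in reports:
        counts.update(r.events)
    return counts

r1 = Report("品川 300 あ 12-34", ["sudden_braking", "sudden_steering"])
r2 = Report("品川 300 あ 12-34", ["sudden_braking"])
if same_vehicle(r1, r2):
    stats = statistics_for([r1, r2])
    # A transmission unit would now send `stats` to the second terminal device.
```

In this sketch the identity check is deliberately naive; the description below also discusses acquiring plate information from images.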
  • FIG. 1 is a diagram illustrating a situation example in which the system according to one embodiment is applied.
  • FIG. 2 is a block diagram showing the functions of the system according to the embodiment.
  • FIG. 3 is a block diagram showing the functions of the system according to the embodiment.
  • FIG. 4 is an example of a data structure used by the system according to the embodiment.
  • FIG. 5 is an example of a data structure used by the system according to the embodiment.
  • FIG. 6 is an example of a data structure used by the system according to the embodiment.
  • FIG. 7 is an example of a data structure used by the system according to the embodiment.
  • FIG. 8 is an example of a data structure used by the system according to the embodiment.
  • FIG. 9 is an example of a data structure used by the system according to the embodiment.
  • FIG. 10 is an example of a data structure used by the system according to the embodiment.
  • FIG. 11 is an example of a data structure used by the system according to the embodiment.
  • FIG. 12 is an example of a data structure used by the system according to the embodiment.
  • FIG. 13 is an example of a data structure used by the system according to the embodiment.
  • FIG. 14 is an example of a data structure used by the system according to the embodiment.
  • FIG. 15 is an example of a data structure used by the system according to the embodiment.
  • FIG. 16 is an example of a data structure used by the system according to the embodiment.
  • FIG. 17 is an example of a data structure used by the system according to the embodiment.
  • FIG. 18 is an example of a data structure used by the system according to the embodiment.
  • FIG. 19 is an example of a data structure used by the system according to the embodiment.
  • FIG. 20 is an example of a data structure used by the system according to the embodiment.
  • FIG. 21 is an example of a data structure used by the system according to the embodiment.
  • FIG. 22 is an example of a transition diagram used by the system according to the embodiment.
  • FIG. 23 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 24 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 25 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 26 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 27 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 28 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 29 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 30 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 31 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 32 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 33 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 34 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 35 is an example of a data structure used by the system according to the embodiment.
  • FIG. 36 shows an example of the flow of the system according to one embodiment.
  • FIG. 37 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 38 is a block diagram showing an overall configuration according to the system according to one embodiment.
  • FIG. 39 is a block diagram showing an overall configuration according to the system according to one embodiment.
  • FIG. 40 shows an example of the flow of the system according to one embodiment.
  • FIG. 41 is a block diagram showing a configuration of an information processing device according to an embodiment.
  • An example system may include one or more terminal devices used by system users and one or more management systems used by system administrators.
  • The user may be an occupant of the vehicle, and may be either the driver or a passenger.
  • the terminal device may be fixed to the vehicle or may be detachable.
  • FIG. 1 shows an example of a situation in which an example system is used.
  • arrow 001 indicates the traveling direction of the vehicle.
  • The vehicle 002 is a vehicle using a user mobile terminal according to the example system. It is assumed that the vehicles 003A to 003I use terminal devices according to the example system as drive recorders; some of these vehicles need not use a terminal device according to the example system. It is assumed that each vehicle travels along the lane boundary lines 004A and 004B.
  • The user terminal device in the vehicle 002 may be able to identify each vehicle with its imaging devices. For example, when the angle of view of one image pickup device of the user terminal device in the vehicle 002 is wide, the three vehicles 003A, 003D, and 003F may be included in its image. Similarly, when the angle of view of the other imaging device, which images the direction opposite to that of the one imaging device, is wide, the four vehicles 003C, 003E, 003H, and 003I may be included in its image. A vehicle such as the vehicle 003I may also come into the field of view of an image pickup device of the user terminal device in the vehicle 002.
  • The vehicles 003B and 003G travel beside the vehicle 002. In the vehicle 002, the front is the imaging direction of one imaging device, and the rear is the imaging direction of the other imaging device, which faces the opposite direction.
  • With the imaging directions set in this way, even if a vehicle traveling directly beside the vehicle 002 does not enter either field of view, it may enter the field of view of an imaging device of the user terminal device in the vehicle 002 by moving forward or backward relative to the vehicle 002 while traveling.
  • A vehicle within the field of view of an imaging device may be referred to as a "surrounding vehicle" in the documents of the present application.
  • When a terminal device related to the example system is used in a surrounding vehicle, such a terminal device, viewed from the user terminal device in the user vehicle 002, may be referred to as a "peripheral terminal device" in the documents of the present application.
  • The vehicle in front of the user vehicle may be referred to as the "front vehicle" in the documents of the present application, and the vehicle behind the user vehicle may be referred to as the "rear vehicle".
  • When terminal devices related to the example system are used in the front vehicle and the rear vehicle, those terminal devices may be referred to as the "front terminal device" and the "rear terminal device", respectively. Since a surrounding vehicle is included in the field of view of the above-mentioned imaging device, information for identifying the surrounding vehicle may be acquired using the information related to the license plate described later.
  • the terminal device may include one or more accelerometers.
  • a plurality of acceleration sensors may be provided in the terminal device so that accelerations in directions orthogonal to each other can be measured.
  • a plurality of acceleration sensors may be provided in the terminal device so that the acceleration in the direction imaged by the image pickup device and the direction orthogonal to the image pickup direction can be measured.
  • the terminal device may include one or more imaging devices.
  • the terminal device may be equipped with an imaging device so that it can image in opposite directions.
  • the terminal device may be installed in the vehicle so that one or more of the plurality of imaging devices have the front of the vehicle as the imaging direction.
  • the terminal device may be installed in the vehicle so that the other one or a plurality of imaging devices among the plurality of imaging devices have the rear of the vehicle as the imaging direction.
  • the installation in the vehicle may be detachably installed in the vehicle or may be fixed.
  • the display direction of the display device in the terminal device may be the same as the image pickup direction of one image pickup device. Further, the display direction of the display device may be the rear of the vehicle.
  • the terminal device may have one or more position measuring devices.
  • The position measuring device may use GPS or a base-station-based positioning function.
  • the terminal device may be provided with a communication device.
  • the communication device may be capable of communicating with an image pickup device installed in the vehicle in which the terminal device is installed.
  • the communication device may be a wireless type or a wired type. In the case of wireless type, it may be WIFI, BLUETOOTH or the like.
  • the image pickup device installed in the vehicle may be an image pickup device incorporated in the vehicle or an image pickup device installed in the vehicle.
  • the terminal device may be a portable terminal device that can be carried by a person.
  • the portable terminal device may be a smartphone, a PDA, or the like.
  • FIG. 2 is a block diagram showing a specific example of the functions related to the management system of this example.
  • FIG. 3 is a block diagram showing a specific example of the functions related to the terminal device.
  • a part of the functions related to the management system may be executed in the terminal device.
  • The system can perform processing based on information acquired from a plurality of terminal devices. When a terminal device executes a function in FIG. 2, however, that function may be executed based only on the information acquired by that one terminal device.
  • The information obtained by executing the function on the one terminal device may then be transmitted to the management system, combined with information acquired from other terminal devices in the management system, and processed, for example by summation or averaging.
  • In the documents of the present application, the term "system" is used as a superordinate concept of the management system and the terminal devices. The system may include the management system without including any terminal device, may include one or more terminal devices without including the management system, or may include both the management system and one or more terminal devices.
  • The vehicle in which the terminal device is installed or placed is referred to as the own vehicle,
  • the information related to such a vehicle is referred to as own-vehicle information, and
  • a vehicle included in an image captured by the imaging device related to the terminal device is referred to as another vehicle.
  • When a plurality of terminal devices are involved, the term "own vehicle" may refer to the vehicle in which one or more of those terminal devices are installed or placed.
  • the image acquisition unit has a function of acquiring an image.
  • the image may be a moving image or a still image.
  • images are treated as a superordinate concept of moving images and still images.
  • The image acquisition unit may store the moving image in a file format divided at each predetermined capacity. This is because communication is more convenient when the capacity per file is small than with a single file of large capacity.
  • The image acquisition unit may acquire the imaging location and/or the imaging time for each predetermined period of the acquired moving image, and store them in association with the corresponding predetermined period of the moving image.
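Such capacity-limited splitting with per-segment location and time can be sketched as follows. This is a hypothetical illustration, assuming fixed-length segments and a `locations` callback that returns the position at a given time offset; the names and segment length are not from the patent.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    filename: str      # one divided file of the moving image
    start_s: float     # imaging time offset of this segment's period
    location: tuple    # (lat, lon) associated with this period

def split_recording(total_s: float, seg_s: float, locations):
    """Divide a recording of total_s seconds into seg_s-second files,
    each stored in association with its imaging time and location."""
    segments = []
    t, i = 0.0, 0
    while t < total_s:
        segments.append(Segment(f"rec_{i:04d}.mp4", t, locations(t)))
        t += seg_s
        i += 1
    return segments

# 150 s of video split into 60 s files yields three files.
segs = split_recording(150.0, 60.0, lambda t: (35.68, 139.76))
```

Smaller files can then be uploaded individually, matching the communication-convenience rationale above.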
  • the image acquisition unit may acquire an image from an image pickup device inside the terminal device, or may acquire an image from an image pickup device outside the terminal device. In this case, it may be connected to an image pickup device outside the terminal device by wire or wirelessly, and in the case of wireless communication, image information may be acquired by a connection method such as WIFI or BLUETOOTH.
  • the image acquisition unit may acquire audio.
  • the image acquisition unit may also acquire audio when acquiring a moving image as an image. This has the advantage that, for example, sounds at the time of an accident, sudden braking, sharp curves, and the like can be acquired and recorded. Further, the image acquisition unit may acquire the sound itself separately from the moving image. As a superordinate concept of image and sound, it may be referred to as "image, etc.” in the documents of the present application.
  • the own vehicle information generation unit may have a function of acquiring information related to the own vehicle.
  • The information related to the own vehicle may include information related to the license plate of the own vehicle, and a message to be transmitted to surrounding vehicles, the vehicle in front, and/or the vehicle behind (also referred to as a "transmission message" in the documents of the present application).
  • The information related to the license plate may be an image of the license plate of the vehicle, information on a number identifying the vehicle generated by analyzing the captured image of the license plate, or information on a number identifying the vehicle entered by the user or the like. In this case, the area name may be selected from options given in advance.
  • the self-vehicle information generation unit may have a function of analyzing the image and extracting the information of the number that identifies the vehicle.
  • One or more items of information related to license plates may be registered in the terminal device, and the information related to one license plate may be selected from among them and used. When the user can use a plurality of own vehicles, the user may select the own vehicle to be actually used, thereby selecting the information related to the license plate of that vehicle. As a result, even when the user uses a plurality of own vehicles, the information related to the license plate can be used quickly without registering a new license plate each time.
  • The transmission message may take various modes as long as the message of the own vehicle can be conveyed.
  • The transmission message may be a character string input by the user before or at the time of boarding.
  • The character string may be any characters input by the user.
  • The transmission message may be a predetermined character string.
  • The user may select one of a plurality of predetermined character strings.
  • The transmission message may be, for example, a character string indicating that the vehicle is driving slowly, that the driver is elderly, that the driver is a beginner, that a child is on board, or that the driver is in a hurry.
  • These transmission messages may be stored in advance or may be input in the terminal device.
  • The transmission messages may include a transmission message to the vehicle in front and a transmission message to the vehicle behind.
  • A plurality of transmission messages may be held, and one transmission message may be selected from among them and used at a timing of the user's choosing.
  • The transmission message may be selected by voice or by gesture.
  • The message conveyed by a gesture may be, for example, gratitude, an apology, or yielding to the other vehicle.
  • Gratitude is assumed to be, for example, gratitude for being let into the line of vehicles when changing lanes, or for being given the right of way when turning right or left.
  • An apology may be given in place of gratitude in the same situations as those for gratitude.
  • These gestures may be accompanied by a gesture or voice that identifies the other vehicle, such as the front vehicle, the rear vehicle, the right-side vehicle, or the left-side vehicle, before or after the message content.
  • The user of the terminal device according to the example system, the driver of the vehicle into which the terminal device is brought, and the owner of the vehicle may be the same or different. Further, when a user of the terminal device according to the example system uses a plurality of vehicles, such a user may be given a function of specifying the vehicle to be boarded.
  • FIG. 4 is an example of a data structure in which the information related to the license plate of a user's vehicle and the message used when that vehicle is boarded are stored in association with each other. Such a data structure has the advantage that the user can select the vehicle to actually board and use the information related to the license plate corresponding to the selected vehicle.
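A FIG. 4-style association of registered own vehicles with plate information and transmission messages might be sketched as follows; the field names and example values are hypothetical illustrations, not the patent's data layout.

```python
# Registered own vehicles: each record associates license-plate information
# with the transmission messages used when that vehicle is boarded.
vehicles = [
    {"id": "001", "plate": "品川 300 あ 12-34",
     "msg_front": "Driving slowly", "msg_rear": "Beginner driver"},
    {"id": "002", "plate": "横浜 500 か 56-78",
     "msg_front": "Child on board", "msg_rear": "Thank you"},
]

def select_vehicle(vehicle_id: str) -> dict:
    """Selecting the own vehicle to board also selects its plate and messages."""
    return next(v for v in vehicles if v["id"] == vehicle_id)

current = select_vehicle("002")
```

Because the plate travels with the vehicle record, switching vehicles needs no re-registration, matching the advantage described above.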
  • The information related to the own vehicle may include information indicating the driving situation of the own vehicle (sometimes referred to as "own-vehicle driving situation information" in the documents of the present application) and/or information indicating the position of the own vehicle ("own-vehicle position information").
  • the own vehicle driving situation information may include sudden acceleration, sudden braking, sudden steering, speed, and / or acceleration of the own vehicle.
  • the sudden acceleration, sudden braking, and / or sudden steering of the own vehicle may be determined by using the function in the terminal device.
  • a sensor in the terminal device may determine the sudden acceleration, sudden braking, and / or sudden steering of the own vehicle.
  • an acceleration sensor as a sensor may determine sudden acceleration, sudden braking, and / or sudden steering when the acceleration is higher or lower than a predetermined acceleration.
  • the accelerometer may be capable of measuring each of the three-dimensional accelerations.
  • When the acceleration in the same direction as the forward direction of the vehicle is higher than a predetermined acceleration, sudden acceleration of the own vehicle may be determined; when the acceleration in the direction opposite to the forward direction is higher than a predetermined acceleration, sudden braking may be determined; and when the lateral acceleration of the vehicle is higher than a predetermined acceleration, sudden steering may be determined.
  • the speed and acceleration of the own vehicle may be measured by measuring the acceleration with a sensor in the terminal device and using the acceleration.
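The threshold-based determination described above can be sketched as follows, assuming the x axis is the vehicle's forward direction and the y axis its lateral direction; the threshold value and function names are illustrative, not from the patent.

```python
# Hypothetical sketch: classifying the own vehicle's driving state from a
# terminal-device accelerometer (ax: forward, ay: lateral), in m/s^2.
A_THRESH = 3.0  # illustrative threshold, not a value from the patent

def classify(ax: float, ay: float) -> list:
    events = []
    if ax > A_THRESH:
        events.append("sudden_acceleration")   # forward acceleration too high
    if ax < -A_THRESH:
        events.append("sudden_braking")        # deceleration too high
    if abs(ay) > A_THRESH:
        events.append("sudden_steering")       # lateral acceleration too high
    return events

evts = classify(-4.2, 0.5)   # strong deceleration, little lateral motion
```

In practice the raw sensor readings would need to be rotated into the vehicle frame and low-pass filtered before a comparison like this.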
  • the self-vehicle driving status information may include information relating to the distance of the self-vehicle from the vehicle in front (sometimes referred to as "distance information" in the documents of the present application).
  • the own vehicle information generation unit may generate distance information using the information related to the image generated by the image information generation unit described later.
  • Such distance information may include information for estimating the distance between the own vehicle and the vehicle in front, information corresponding to that distance, and/or information for determining, using the distance information, that the own vehicle has approached the vehicle in front. As the information corresponding to the distance, for example, the size of the vehicle in front in the image, viewed from the rear, may be used.
  • The size from the rear of the vehicle may be the width or the height of the vehicle. Since the vehicle height varies with the vehicle type, the presence or absence of approach may be determined using a vehicle height set according to the vehicle type in the image. In this case, when the height of the vehicle in front obtained from the image is greater than a predetermined height set according to the vehicle type, it may be determined that the own vehicle has approached.
  • On the other hand, the vehicle width does not differ greatly between vehicle types, so there is the advantage that vehicle-type-specific processing is unnecessary; when the vehicle width of the vehicle in front obtained from the image is larger than a predetermined width, it may be determined that the own vehicle has approached.
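The width-based approach determination can be sketched with a pinhole-camera model, in which a larger apparent width in the image corresponds to a shorter distance. The focal length, assumed real vehicle width, and distance threshold below are illustrative assumptions, not values from the patent.

```python
FOCAL_PX = 1000.0       # camera focal length in pixels (illustrative)
REAL_WIDTH_M = 1.8      # assumed real vehicle width; varies little by type
APPROACH_DIST_M = 10.0  # distance below which we judge "approached"

def estimate_distance(pixel_width: float) -> float:
    # Pinhole model: distance = focal_length * real_width / apparent_width.
    return FOCAL_PX * REAL_WIDTH_M / pixel_width

def has_approached(pixel_width: float) -> bool:
    # Equivalent to comparing the apparent width against a predetermined width.
    return estimate_distance(pixel_width) < APPROACH_DIST_M
```

Comparing the pixel width directly against a fixed threshold, as the text describes, is the same test with the division folded into the constant.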
  • The own vehicle information generation unit may provide a plurality of ranks for each of the degree of sudden acceleration, sudden braking, sudden steering, and approach information, and one of these ranks may be selected.
  • For example, with three ranks, threshold values may be set for low, medium, and high, and the rank may be determined by comparing the obtained information with these threshold values.
  • the number of ranks is not limited to 3, and may be any number.
  • the own vehicle information generation unit may store the sudden acceleration, sudden braking, sudden steering, and / or approach information of the own vehicle in association with the situation in which these are determined.
  • the determined situation may be, for example, the time and / or location where the sudden acceleration, sudden braking, sudden steering, and / or approach information of the own vehicle occur.
  • the own vehicle information generation unit may generate statistical information regarding the driving situation of the own vehicle.
  • the statistical information may be, for example, the number of times of sudden acceleration, sudden braking, sudden steering, and / or approach information in traveling for each predetermined item.
  • the predetermined items may be within a predetermined unit time, one run, one day run, a predetermined period, and the like.
• One run may be, for example, the period from engine start to engine stop, a period of travel separated from other travel by at least a predetermined interval, the period from the start to the stop of the software related to the system of one example, or the period from the start to the end of recording of the moving image.
  • the above-mentioned number of times may be generated in chronological order.
  • a graph may be generated for each of sudden acceleration, sudden braking, sudden steering, and / or approach information with the horizontal axis as the timing and the vertical axis as the number of times of application, and may be used as a drive report.
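A minimal sketch of the tallying behind such a drive report, assuming events arrive as (timestamp, kind) pairs; the bucket format and event names are illustrative, not from the source.

```python
from collections import Counter
from datetime import datetime

def drive_report(events, bucket="%Y-%m-%d"):
    """Tally sudden-acceleration / sudden-braking / sudden-steering /
    approach events into chronological buckets (here: per day), yielding
    the counts the text plots with timing on the horizontal axis and
    occurrences on the vertical axis."""
    report = {}
    for ts, kind in events:
        key = ts.strftime(bucket)  # one bucket per day by default
        report.setdefault(kind, Counter())[key] += 1
    return report
```

Changing `bucket` (e.g. to `"%Y-%m-%d %H"`) switches the predetermined item from one-day runs to unit hours.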
• FIG. 5 is an example of statistical data on the driving situation of the own vehicle, in which each statistic for one day's driving is recorded for each predetermined item.
• The IDs 001 to 004 illustrate a case in which the total counts are stored as statistics for four days of driving.
  • the position information may be acquired from the GPS inside the terminal device, or the position information may be acquired from the GPS outside the terminal device.
• An image pickup device outside the terminal device may be connected by wire or wirelessly; in the wireless case, a connection method such as WIFI or BLUETOOTH may be used.
• The own vehicle information generation unit may generate information prompting the own vehicle to start when the image information generation unit described later generates information that the own vehicle is not moving and that the traffic light for the own vehicle is green, in order to indicate that the own vehicle has not started despite the green light. Likewise, when the image information generation unit described later generates information that the own vehicle is not moving and that the distance to the vehicle ahead has become larger than a predetermined distance, information prompting departure may be generated in order to indicate that the own vehicle has not started even though the vehicle ahead has. In the latter case, when a traffic light targeting the own vehicle is detected in the image, the condition that it is green may be added.
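The departure-prompt conditions above can be sketched as follows. The parameter names, the `None` convention for "no light detected", and the gap threshold are assumptions for illustration, not from the source.

```python
def should_prompt_departure(own_moving, light_state, gap_to_front_m,
                            gap_threshold_m=8.0):
    """Decide whether to prompt the driver to start: (a) stationary at a
    green light, or (b) stationary while the gap to the vehicle ahead has
    opened beyond a threshold (requiring green when a light targeting the
    own vehicle is detected). `light_state` is "green"/"red"/None."""
    if own_moving:
        return False
    if light_state == "green":
        return True
    if gap_to_front_m is not None and gap_to_front_m > gap_threshold_m:
        # when a traffic light for the own vehicle is detected, add the
        # condition that it is green; None means no light was detected
        return light_state in (None, "green")
    return False
```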
• When the image information generation unit described later generates information that the own vehicle is moving and that there is a stop sign or stop marking, the own vehicle information generation unit may determine that a stop violation has occurred and generate information indicating the stop violation.
• When the image information generation unit described later generates information that the own vehicle is moving and information indicating that the red light continues, the own vehicle information generation unit may determine that a red-light violation has occurred and generate information indicating the red-light violation.
• The own vehicle information generation unit may add the total and/or average number of stop violations and/or red-light violations to the above-mentioned statistical information.
• The information indicating a stop violation and/or a red-light violation may be stored in association with the position and/or time at which it occurred.
  • the image information generation unit has a function of generating information related to an image by using an image.
• The information related to the image may be information about various objects in the image (sometimes referred to as "objects" in the documents of the present application) and various situations; its type and range are not limited, but the following information can be mentioned as examples.
  • the image information generation unit may generate information after identifying a target for generating information in the image.
  • the image information generation unit may generate information related to the image by using the image and other information.
  • information related to an image may be generated by using information related to at least a part of an image or the like acquisition unit and a self-vehicle information generation unit.
  • the information related to the image or the like acquisition unit may be, for example, the information at the time of acquiring the image or the like, the image or the like, and the information related to the own vehicle information generation unit may be, for example, the own vehicle position information or the like. However, it is not limited to these.
  • the information related to the image may be information related to one or more other vehicles in the image.
• The information related to other vehicles may include information that identifies another vehicle (sometimes referred to as "vehicle identification information" in the documents of the present application), information that indicates the driving status of another vehicle (sometimes referred to as "other vehicle driving status information" in the documents of the present application), and the like.
  • the vehicle specific information may be information related to the vehicle in the image.
  • the information relating to the vehicle in the image may include information relating to the license plate of the vehicle in the image, or may include information such as the vehicle type, color, and options of the vehicle in the image.
  • the information related to the license plate may be the same as the information related to the license plate related to the user vehicle described above.
  • the other vehicle driving status information may be any information indicating the driving status of another vehicle, and may be, for example, sudden acceleration of the vehicle, sudden braking of the vehicle, and / or sudden steering of the vehicle.
• Sudden acceleration of a vehicle in the image may be determined when the apparent size of the vehicle in the image shrinks at a predetermined rate or more within a predetermined period. The determination may additionally use the acceleration of the own vehicle: when the forward acceleration of the own vehicle is within a predetermined range and the size of the vehicle in the image shrinks at a predetermined rate or more within a predetermined period, it may be determined to be sudden acceleration. In this case, there is the advantage that the determination accuracy is improved.
• Sudden braking of a vehicle in the image may be determined when the apparent size of the vehicle in the image grows at a predetermined rate or more within a predetermined period. The determination may additionally use the acceleration of the own vehicle: when the rearward acceleration of the own vehicle is within a predetermined range and the size of the vehicle in the image grows at a predetermined rate or more within a predetermined period, it may be determined to be sudden braking. In this case, there is the advantage that the determination accuracy is improved.
• Sudden steering of a vehicle in the image may be determined when the lateral area of the vehicle in the image increases at a rate equal to or above a predetermined ratio, when the outline of the vehicle approaches the shape of the vehicle seen from the side at a rate equal to or above a predetermined ratio, or when a combination of these holds. Further, in combination with these, sudden steering may be determined when the apparent size of the vehicle in the image shrinks at a rate equal to or above a predetermined ratio within a predetermined period. The size of the vehicle may be the width of the vehicle.
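A hedged sketch of the apparent-size heuristic above: it compares the detected width of the vehicle ahead between two frames a fixed interval apart, with a flag standing in for the check that the own vehicle's acceleration is within the predetermined range. The threshold and names are illustrative, not from the source.

```python
def classify_front_vehicle(width_t0, width_t1, own_accel_ok, ratio=0.15):
    """Classify the vehicle ahead from the change in its apparent width:
    shrinking rapidly -> it pulled away (sudden acceleration); growing
    rapidly -> it slowed hard (sudden braking). `own_accel_ok` is True
    when the own vehicle's acceleration lies in the predetermined range,
    which the text says improves determination accuracy."""
    if not own_accel_ok:
        return None
    change = (width_t1 - width_t0) / width_t0
    if change <= -ratio:
        return "sudden_acceleration"
    if change >= ratio:
        return "sudden_braking"
    return None
```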
  • the driving status information of other vehicles may be stored in association with the date and time when the acquired target image was captured. This has the advantage of clarifying the date and time related to the driving status information of other vehicles.
  • the driving status information of other vehicles may be stored in association with the captured position information of the acquired target image. This has the advantage of clarifying the position related to the driving status information of other vehicles.
  • FIG. 6 is an example in which information is generated based on a plurality of images.
  • This figure shows the data structure, and the vehicle ID, other vehicle driving status information, the date and time, and the position are stored in association with the image ID.
  • the vehicle ID may be associated with other vehicle specific information in other data structures.
  • the vehicle ID may be one that is sequentially attached to the vehicle specified in the image, or the corresponding vehicle is collated with the vehicle identification information to which the vehicle ID is attached in the past. ID may be used.
  • the storage unit may store a set of information of the vehicle identification information in the image captured in the past by the image pickup device related to the terminal device and the vehicle ID attached to the vehicle. In addition, the information of such a set may be stored for a predetermined period of time.
• In FIG. 6, each piece of driving status information is associated with a different image ID; however, when driving status information for a plurality of other vehicles is generated at the same time, a plurality of vehicle IDs and the corresponding other vehicle driving status information may be associated with a single image.
• Statistical information regarding the vehicles in images may be referred to as "vehicle statistical information in the image" in the documents of the present application.
  • the image information generation unit may generate information on the movement of the own vehicle by using time-series images.
  • the presence or absence of movement of the own vehicle may be determined by using the movements in the corresponding traffic lights, signs, and / or landscapes in the still images of the plurality of adjacent frames in the moving image.
• The information regarding the movement of the own vehicle may include information on the presence or absence of movement of the own vehicle. Such information may be used, for example, to determine that the own vehicle is keeping a stop, that the traffic light for the own vehicle has turned green, or that the own vehicle has started while the traffic light for the own vehicle is green or while the vehicle ahead is moving.
• The image information generation unit may determine whether, after a first image in which the traffic light for the own vehicle shows a red light, the red light continues in the frames following the first image in time series without turning green. When such a determination is made, information indicating the continuation of the red light may be generated.
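One reading of the red-light-continuation check above, as a scan over the per-frame states of the traffic light for the own vehicle; the state strings are assumptions for illustration.

```python
def red_light_continues(frame_lights):
    """Given the time-ordered per-frame state of the own vehicle's traffic
    light, report whether a red light, once seen, persisted through the
    following frames without turning green, per the first-image /
    following-frames description in the text."""
    saw_red = False
    for state in frame_lights:
        if state == "red":
            saw_red = True
        elif saw_red and state == "green":
            return False  # the red light ended; no continuation
    return saw_red
```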
  • the identification of the traffic light for the own vehicle may be made possible by learning by machine learning using an image including the traffic light.
  • the image information generation unit may detect such information and generate information of a stop instruction when there is a stop sign or display in the road sign or road marking of the object related to the image.
  • the information related to the image may be information related to various objects detected in the image.
  • the information related to the target may include information related to the operation of the wiper.
• As the information related to the operation of the wiper, information on the operation of the wiper of the own vehicle may be used, or information on the operation of the wiper of another vehicle may be used.
  • the information related to the operation of the wiper may include information related to the operating status of the wiper.
  • the information related to the operation status of the wiper may be the presence / absence of movement of the wiper and / or the speed of movement of the wiper.
  • the speed of movement of the wiper may be information that identifies one of a plurality of ranks indicating the speed of movement of the wiper.
  • the plurality of ranks indicating the speed of movement of the wiper may be, for example, intermittent, slow, medium, fast, etc., but are not limited to these.
  • the information related to the operation of the wiper may include information related to the operation status of the wiper and the position information obtained by capturing the image obtained by acquiring the information related to the operation status of the wiper.
• The information related to the operation of the wiper may include the information related to the operating status of the wiper in association with the time at which the image from which that information was acquired was captured.
  • FIG. 7 shows an example of information related to the operation of the wiper.
• This figure is an example of data in which the presence or absence of wiper movement and the rank of movement are recorded separately.
• Alternatively, "no movement" may be included as one of the movement ranks, in which case the presence or absence of wiper movement is subsumed in the rank.
• Data for the case where the wiper is not moving may be omitted, so that data are acquired only when the wiper moves.
  • the information relating to the subject may include information relating to the road.
  • the information related to the road may include the information related to the road condition.
  • Information relating to road conditions may include information relating to road signs, information relating to road markings, information relating to traffic lights, and / or information relating to roadways.
  • Information related to road signs may include information on the presence or absence of signs, the contents of signs, and / or abnormalities in signs.
• Abnormalities in a sign may include information on the presence or absence of something that obstructs the sign (for example, at least part of the sign being hidden by a tree) and/or an abnormality in the sign itself (for example, at least part of the sign being damaged).
  • Information related to road markings may include the presence or absence of road markings, the content of road markings, and / or abnormalities in road markings. Anomalies in road markings may include information about the presence or absence of something that interferes with the markings and / or anomalies in the markings themselves.
  • the information related to the traffic light may include information that the traffic light has been identified (information that there is a traffic light) and / or information that the traffic light is abnormal. Anomalies in a traffic light include anomalies in the appearance of the traffic light (for example, information that a part of the traffic light cannot be seen due to a tree or the like) and / or a failure of the traffic light (for example, a part of the traffic light is damaged). ), Information may be included.
  • the information related to the roadway may be an abnormality of the roadway and / or the width of the lane.
• The lane may be a strip-shaped portion of the roadway (excluding sub-roadways) provided so that a single column of automobiles can pass safely and smoothly; when the roadway is divided by lines, it may be a portion through which an automobile or the like can pass.
  • Roadway abnormalities may be lane boundary abnormalities, foreign objects on the roadway, and / or destruction of the roadway.
  • the abnormality of the lane boundary line may be a part or all of the lane boundary line is missing.
  • the foreign matter on the roadway may be, for example, a falling object on the roadway, a fallen tree, a fallen utility pole, or the like.
  • Destruction of the roadway may be an abnormality in the shape of the roadway, such as a collapse of the roadway.
  • the information related to the roadway may include information on parking on the street.
  • the information related to the road may include information related to the road condition and the position information obtained by capturing the image obtained by acquiring the information related to the road condition.
• The information related to the road may include the information related to the road condition in association with the time at which the image from which that information was acquired was captured.
  • Figure 8 shows an example of information related to roads.
  • the information related to the road may be always acquired when it is detected in the image, or may be acquired only when the information detected in the image satisfies a predetermined condition.
• The predetermined condition may be detection of a predetermined target, or detection of an abnormality only.
• As information related to roads, information related to disasters may be generated.
  • Information relating to a disaster may include information on foreign matter on the roadway and / or destruction of the roadway.
• The information related to the road may include the information related to the disaster in association with the position at which the image from which that information was acquired was captured, or in association with the time at which that image was captured.
  • FIG. 9 is an example in which the information related to the road includes the information related to the disaster.
• When the information related to these roads is transmitted to a system related to autonomous driving, it may be used to judge the propriety and priority of autonomous driving.
  • the information relating to the subject may include information relating to the sidewalk.
  • the information on the sidewalk may include information on the condition of the person on the sidewalk.
• Information on the condition of a person on the sidewalk may include information on what the person wears. What the person wears may include an umbrella (in the documents of the present application, the term wearing shall include the meaning of holding up when the object is an umbrella), a raincoat, and short sleeves or long sleeves.
  • the information related to the sidewalk may include information on the state of the person on the sidewalk and the position information obtained by capturing the image obtained by acquiring the information on the state of the person on the sidewalk.
• The information related to the sidewalk may include the information on the state of the person on the sidewalk in association with the time at which the image from which that information was acquired was captured.
  • FIG. 10 is an example of information related to the sidewalk.
  • the information related to the sidewalk may be always acquired when it is detected in the image, or may be acquired only when the information detected in the image satisfies a predetermined condition.
• The predetermined condition may be that the detected information on the state of a person matches a predetermined one, that a predetermined number of people or more are detected in one image, or that no more than a predetermined number of people are detected in one image.
  • the information related to the target may include information related to the degree of congestion of people.
  • the information on the degree of congestion of people may include the number of people in a predetermined area.
• The information relating to the degree of congestion of people may include the number of such people in association with the position at which the image used to determine that number was captured, or with information indicating the predetermined area.
• The information on the degree of congestion of people may include the number of such people in association with the time at which the image used to determine that number was captured.
  • the predetermined area may be predetermined or may be defined at the time of moving image shooting. Examples of the predetermined area include a predetermined area using map information and location information. Further, what is defined at the time of moving image shooting may be a region defined by using a predetermined distance from a coordinate value such as GPS, or may be a region defined by using the elapsed imaging time.
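A region "defined by a predetermined distance from a coordinate value such as GPS" can be sketched as membership in a circle under the haversine great-circle distance; the choice of distance metric and Earth radius (6371 km) are assumptions, as the source does not specify them.

```python
from math import radians, sin, cos, asin, sqrt

def within_area(lat, lon, center_lat, center_lon, radius_m):
    """True when (lat, lon) lies within `radius_m` metres of the centre
    coordinate, using the haversine great-circle distance."""
    dlat = radians(lat - center_lat)
    dlon = radians(lon - center_lon)
    a = (sin(dlat / 2) ** 2
         + cos(radians(center_lat)) * cos(radians(lat)) * sin(dlon / 2) ** 2)
    distance_m = 2 * 6371000 * asin(sqrt(a))  # mean Earth radius in metres
    return distance_m <= radius_m
```

People counted from images whose capture position passes this test would then be attributed to the predetermined area.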
  • Information on the degree of congestion of these people may be used to determine the propriety and priority of autonomous driving when it is transmitted to the system related to autonomous driving.
  • the information related to the target may include information related to an event on the roadway.
  • Information about roadway events may be information about construction and / or accidents.
• Information about the construction may include the presence or absence of construction and/or the scheduled completion date of the construction. The presence or absence of construction may be determined by detecting, in the image, information such as a stopped construction vehicle, a temporary traffic light, or traffic guidance by construction personnel.
  • Information about the accident may include the presence or absence of an accident and the magnitude of the accident.
• The presence or absence of an accident may be determined by the presence or absence of an accident vehicle and/or the presence or absence of police personnel. Construction personnel and police personnel may be detected by the clothes and belongings that characterize them.
• The information on the on-road event may include the information on the construction and/or the accident in association with the position at which the image from which that information was acquired was captured.
• The information on the on-road event may include the information on the construction and/or the accident in association with the time at which the image from which that information was acquired was captured.
  • FIG. 11 shows an example of information regarding an event on the roadway.
  • the information related to the target may be information related to the advertisement.
  • the information related to the advertisement may include the advertisement status information.
  • the advertisement status information may include a specific brand of the advertisement, the advertiser of the advertisement, the field to which the advertisement belongs, the size of the advertisement, and / or the presence or absence of an abnormality in the advertisement.
  • the presence or absence of an abnormality in the advertisement may include a state in which the advertisement is hidden by an object such as a tree and / or damage to the advertisement itself.
  • Advertising status information may be acquired based on the image.
  • the information related to the advertisement may include an association between the advertisement status information and the position information obtained by capturing the image obtained from the advertisement status information.
• The information related to the advertisement may include the advertisement status information in association with the time at which the image from which that information was acquired was captured.
  • FIG. 12 is an example of information related to the advertisement.
  • the size of the advertisement may be configured to select one of a plurality of predetermined ranks for the size.
  • the information relating to the target may be information relating to fuel for vehicles.
  • the information relating to the fuel for the vehicle may include the information relating to the status of the fuel for the vehicle.
  • the information on the status of the fuel for the vehicle may include the type of fuel for the vehicle and the amount of money corresponding to the type of fuel for the vehicle.
  • Vehicle fuel types may include high-octane gasoline, regular gasoline, light oil, and / or kerosene, and the like.
  • Information on the status of fuel for vehicles may be obtained from the presentation of locations in the image that provide fuel for vehicles, such as service stations, including gas stations and refueling stations.
• The information relating to the fuel for the vehicle may include the information relating to the status of the fuel for the vehicle in association with the position at which the image from which that information was acquired was captured.
• The information relating to the fuel for the vehicle may include the information relating to the status of the fuel for the vehicle in association with the time at which the image from which that information was acquired was captured.
  • FIG. 13 shows an example of information on fuel for vehicles.
• When the information related to the fuel for vehicles is transmitted to a system related to automatic driving, it may be used to judge the propriety and priority of automatic driving.
  • the image information acquisition unit may have a machine-learned identification function.
  • the machine-learned identification function may include a function capable of identifying a predetermined object.
  • the image information acquisition unit may acquire information related to the above-mentioned various objects from the image by using the machine-learned identification function.
  • the machine-learned identification function may be stored in the information processing device in the terminal device. Since the object to be identified is limited in advance, there is an advantage that a high identification function can be realized even in a simple information processing device such as in a terminal device.
• As artificial intelligence technologies, machine learning technologies such as neural networks, genetic programming, inductive logic programming, support vector machines, clustering, regression, classification, Bayesian networks, reinforcement learning, representation learning, decision trees, and k-means clustering may be used, for example. In the following, an example using a neural network is described, but the present invention is not necessarily limited to neural networks.
• As the machine learning technology using a neural network, deep learning technology may be used. This is a technology that, by learning the relationship between inputs and outputs using a plurality of layers, makes it possible to generate a corresponding output even for an unknown input.
  • the learning image and the attribute information related to the learning image are associated with each other, and machine learning is performed using these as learning data.
• The machine-learned function related to the image information acquisition unit may, for example, machine-learn the relationship between an image and the information, related to at least a part of the information acquisition unit, the own vehicle information generation unit, and the image information generation unit, that a person has judged to appear in that image.
• Through such machine learning, the image information acquisition unit may have a function capable of identifying, from an image, information related to at least a part of the image information acquisition unit, the own vehicle information generation unit, and the image information generation unit, and of generating the corresponding information.
  • the learning algorithm itself when using the deep learning technique may be a known one. Further, as the program used in the deep learning technique, an open source program may be used, or a program obtained by modifying them as appropriate may be used.
  • the machine-learned function may have an update function.
  • the update function may be a function capable of updating an identifiable object. For example, it may be possible to identify information related to vehicles, information related to roads, information related to advertisements, and the like, and update the information so that the target information can be generated. Such updates may be in a mode in which the program or its parameters are downloaded and installed.
  • the information related to the target may be stored in association with the information related to the time when each information was acquired and / or the information related to the place.
  • the information related to the time may be month, day, week, time, and the like.
  • the information related to the location may be administrative divisions, GPS coordinates, and the like.
  • the information related to the location may be acquired by the GPS related to the terminal device that acquires the image.
  • the machine-learned identification function may use the identification function by using the position information. For example, when there is a high possibility that a specific advertisement is in a specific position, it may have an identification function that emphasizes the identification of the specific advertisement in the vicinity of the specific position including the specific position.
  • the storage unit has a function of storing information.
  • the storage unit may store the above-mentioned information related to the image acquisition unit, the self-vehicle information generation unit, and the image information generation unit.
  • the storage unit may have a function of associating and storing a plurality of pieces of information.
• The information to be associated may include, for example, a part or all of an image or the like, time information, own vehicle driving status information, own vehicle position information, vehicle identification information, and other vehicle driving status information, and these may be stored in association with one another.
  • the storage unit may store the information by the ring buffer. Further, the storage unit may store each piece of information in a ring buffer. In this case, the upper limit of the ring buffer is defined for each information, and when the information exceeding the upper limit is stored, the storage may be deleted from the oldest information.
  • the information for each information may be, for example, information such as an image or the like, own vehicle driving status information, own vehicle position information, vehicle specific information, other vehicle driving status information, and the like.
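The per-information ring buffers described above can be sketched with bounded deques: each kind of information gets its own upper limit, and once the limit is exceeded the oldest record is discarded first. The category names and capacities below are illustrative.

```python
from collections import deque

class RingStore:
    """Per-category ring buffers for the storage unit: images or the like,
    own vehicle driving status, position, vehicle identification, other
    vehicle driving status, etc., each with its own fixed upper limit."""
    def __init__(self, capacities):
        # capacities: mapping of information kind -> upper limit
        self._buffers = {k: deque(maxlen=n) for k, n in capacities.items()}

    def store(self, kind, record):
        # at capacity, deque(maxlen=...) silently evicts the oldest record
        self._buffers[kind].append(record)

    def records(self, kind):
        return list(self._buffers[kind])
```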
  • the output unit may have a function of producing sound or displaying. For example, information related to the own vehicle and / or information related to an image may be displayed.
  • the information related to the own vehicle may include the own vehicle driving status information and / or the distance information.
	• from the information related to the own vehicle, the driving of the own vehicle can be understood.
  • the information related to the image may include vehicle specific information and / or other vehicle driving status information.
	• from the information related to the image, the state of other vehicles can be understood.
  • various information may be included, and such information may be displayed.
  • the output unit may display the information acquired by the information and communication unit. For example, information related to other vehicle statistical information may be displayed. Further, the output unit may display the processed information by using the other vehicle statistical information. This has the advantage that the viewer can understand the information related to other vehicles. In particular, when the driving situation of the vehicle in front is different from the usual one, there is an advantage that the driver can drive with care.
  • the output unit has a function of emitting a sound, and may notify the user of the terminal device. For example, the output unit may generate a sound that prompts the driver of the own vehicle to start the vehicle when the information for prompting the departure of the own vehicle is generated.
  • the output unit may output information related to at least a part of the image information acquisition unit, the self-vehicle information generation unit, the image information generation unit, the storage unit, and the information communication unit.
  • the information and communication unit has a function of communicating information.
  • the information communication unit may communicate information relating to at least a part of an image information acquisition unit, a self-vehicle information generation unit, an image information generation unit, a storage unit, and an output unit.
  • the information and communication unit may transmit such information.
  • the function of communicating information may be executed at an arbitrary timing, or at a timing when a specific condition is satisfied.
	• the latter specific condition may be, for example, the timing at which Wi-Fi communication becomes available. In this case, there is an advantage that the communication cost for the user can be reduced.
  • the information and communication unit may have a function of receiving information.
  • the information and communication unit may acquire other vehicle statistical information described later from the management system.
	• the statistical processing unit has a statistical processing function. By performing statistical processing, the statistical processing unit may generate at least some of the following: other vehicle statistical information, traffic jam information, weather information, road information, road abnormality information, accident information, vehicle fuel provision information, advertisement statistical information, person information, and a 3D map.
	• the statistical processing unit may use the vehicle specific information and other vehicle driving status information obtained from one or more terminal devices to generate statistical information related to other vehicles (sometimes referred to as "other vehicle statistical information" in the present application).
	• the statistic may include, for one vehicle, the total number of sudden steering events, sudden braking events, and/or sudden accelerations, or the average number of such events for one vehicle. Further, the total value and the average value may be computed over vehicles imaged during a predetermined period.
	• such other vehicle statistical information may be generated, for example regarding the number of sudden steering events, as follows: for a specific vehicle obtained from the vehicle specific information, the same vehicle is identified within the vehicle specific information and other vehicle driving status information obtained from the one or more terminal devices, and the presence or absence, or the number, of sudden steering events in the corresponding other vehicle driving status information is acquired over a predetermined range.
	• the total number of sudden steering events may be generated by applying this to the vehicle specific information and other vehicle driving status information obtained from all of the above-mentioned one or more terminal devices.
	• the total numbers of sudden braking events and sudden accelerations may be generated likewise, and the corresponding average values over a predetermined period, a predetermined area, or the like may be generated in the same manner.
  • the statistical processing unit may generate a ranking for the total value or the average value.
	• the ranking may target vehicles imaged during a predetermined period, vehicles imaged in a predetermined area, or vehicles traveling on a predetermined road or under a predetermined road condition.
  • the predetermined road may be, for example, a type of road such as a road in a residential area, a main road, an expressway, or the like.
	• the predetermined road condition may depend on, for example, the speed of the vehicle (e.g., traffic jam, low-speed travel, or high-speed travel), and may target vehicles traveling at a speed within a specified range, with the speed as an evaluation value.
  • the ranking may be a predetermined number of rankings in descending order of the total value or the average value, or may be a predetermined number of rankings in ascending order.
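The per-vehicle totals and the ranking described above might be computed roughly as in the following sketch; the record format and vehicle identifiers are hypothetical assumptions, not taken from the embodiment.

```python
from collections import defaultdict

# Hypothetical per-observation records from multiple terminal devices:
# (vehicle_id, sudden_steering, sudden_braking, sudden_acceleration)
observations = [
    ("car-A", 1, 0, 0),
    ("car-A", 0, 1, 1),
    ("car-B", 0, 0, 1),
    ("car-B", 2, 1, 0),
]

# Total each event type per identified vehicle
totals = defaultdict(lambda: [0, 0, 0])
for vid, steer, brake, accel in observations:
    totals[vid][0] += steer
    totals[vid][1] += brake
    totals[vid][2] += accel

# Rank vehicles by the grand total of sudden events, descending
ranking = sorted(totals, key=lambda v: sum(totals[v]), reverse=True)
print(ranking)  # car-B (4 events) before car-A (3 events)
```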
  • FIG. 14 is an example of other vehicle statistical information.
  • the statistical processing unit may generate congestion information.
  • the traffic jam information may be any data as long as it is information related to the traffic jam of the vehicle.
  • Congestion information may include, for example, information indicating the presence of congestion associated with a particular area.
  • Information indicating the existence of a traffic jam associated with a specific area may be generated, for example, when one or more vehicles having position information in the specific area satisfy a predetermined condition.
	• the predetermined condition may be determined, for example, by whether the number of vehicles determined to be congested is larger than a predetermined number, or by whether the number of vehicles determined to be congested is at least a predetermined multiple of the number of vehicles not determined to be congested.
  • the vehicle may be determined to be in a traffic jam if certain conditions are met for the vehicle speed and / or the above-mentioned distance information.
	• regarding the vehicle speed, when the vehicle speed satisfies a predetermined condition, for example when the vehicle speed is smaller than a predetermined speed, a traffic jam may be determined. Further, when the average speed of the vehicles within a predetermined time or a predetermined distance is smaller than the predetermined speed, a traffic jam may be determined. This is because, in the case of traffic congestion, the average speed of the vehicles falls below the predetermined speed.
  • the speed and / or acceleration in the own vehicle driving status information obtained from the terminal device may be used.
	• regarding the distance information, when the distance information satisfies a predetermined condition, for example when the distance is smaller than a predetermined distance, a traffic jam may be determined. In addition, a traffic jam may be determined when the average of the distance information within a predetermined time or a predetermined distance is equal to or less than a predetermined value. This is because a large size of the vehicle in the image indicates that the distance between the vehicle in front and the own vehicle is short, which is information from which a traffic jam should be determined.
	• information associated with times within a predetermined range may be used, for example the time at which the acceleration sensor in the terminal device performed a measurement, or the time at which the image acquired by the terminal device was captured. This has the advantage that traffic congestion information can be aggregated from information at the same time, or at times within a predetermined range including that time.
  • the traffic jam information may be included in association with the time measured by the acceleration sensor in the terminal device, the time when the image acquired by the terminal device is captured, or the information in the time range including these.
  • FIG. 15 is an example of traffic congestion information.
	• in the above, the traffic jam information is generated for an area; however, information on the length of the traffic jam may also be generated and included in the traffic jam information.
	• the information on the length of the traffic jam may be generated as follows: when areas or position information within a predetermined short-distance range are each determined to be congested, the distance between the terminal devices that acquired the information used for those determinations may be taken, it may be determined that the traffic jam extends at least over that distance, and that distance may be used as the length of the traffic jam.
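The congestion determination described above can be sketched as follows; the speed, distance, and count thresholds are illustrative assumptions rather than values from the embodiment.

```python
def vehicle_congested(avg_speed_kmh, avg_gap_m,
                      speed_limit=10.0, gap_limit=5.0):
    """A vehicle is treated as congested when its average speed is
    below a predetermined speed, or its average distance to the
    vehicle in front is below a predetermined distance."""
    return avg_speed_kmh < speed_limit or avg_gap_m < gap_limit

def area_congested(vehicles, min_count=3):
    """An area is treated as congested when at least a predetermined
    number of its vehicles are determined to be congested."""
    congested = sum(1 for v in vehicles if vehicle_congested(*v))
    return congested >= min_count

# Hypothetical (average speed, average gap) pairs for one area
area = [(5.0, 3.0), (8.0, 4.0), (40.0, 30.0), (6.0, 2.0)]
print(area_congested(area))  # three vehicles meet the condition
```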
  • the statistical processing unit may generate weather information.
  • the weather information may include, for example, weather condition information associated with a particular area.
	• the weather condition information associated with the specific area may be generated, for example, when information related to rain and/or information indicating temperature, acquired from images captured by one or more terminal devices having position information within the specific area, satisfies a predetermined condition.
  • the information relating to rain may include information that it is rain.
	• the information determined to be rain may be generated using the information related to the operation of the wiper. For example, if the wiper is moving, rain may be determined; if the wiper is not moving, rain may not be determined. Further, when the speed of the wiper movement corresponds to one of a plurality of ranks indicating wiper speed, information indicating the degree of rain may be generated according to that rank.
	• a predetermined condition for the information related to rain may be, for example, that the number of items of information determined to correspond to one of the plurality of wiper-speed ranks is larger than a predetermined number, or that the ratio of the number of items so determined to the number of items not so determined is greater than a predetermined ratio; in such a case, rain may be determined.
  • the information relating to the above may include information indicating the corresponding degree of rain.
	• the information indicating the temperature may include information that the weather calls for outerwear, for example when, in the information related to the sidewalk, the number of persons wearing long sleeves is larger than a predetermined number, or larger than the number not wearing long sleeves.
	• the information indicating the temperature may include information that the weather calls for long sleeves, for example when the number of persons wearing short sleeves in the information related to the sidewalk is larger than a predetermined number, or compared in the same manner.
	• the information indicating the temperature may include information that the weather calls for short sleeves, determined in the same manner.
  • the weather information may include the weather condition information and the information indicating the specific area used for generating the weather condition information in association with each other.
	• the weather information may include the weather condition information in association with the time at which the image, from which the wiper-related information and/or the sidewalk-related information used to generate the weather condition information was acquired, was captured.
  • FIG. 16 is an example of weather information.
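The wiper-based rain determination can be sketched roughly as follows; the rank boundaries and the counting condition are hypothetical assumptions.

```python
def wiper_rank(wiper_speed):
    """Map wiper movement speed to a rain-degree rank.
    Boundaries are illustrative assumptions."""
    if wiper_speed <= 0:
        return None          # wiper not moving: not determined as rain
    if wiper_speed < 0.5:
        return "light"
    if wiper_speed < 1.5:
        return "moderate"
    return "heavy"

def area_rain(observed_speeds, min_count=2):
    """Rain is determined when at least a predetermined number of
    observations map to one of the wiper-speed ranks."""
    ranks = [wiper_rank(s) for s in observed_speeds]
    rainy = [r for r in ranks if r is not None]
    return len(rainy) >= min_count

print(area_rain([0.0, 0.7, 1.8]))  # two observations indicate rain
```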
  • the statistical processing unit may generate road abnormality information.
  • the road abnormality information may include information relating to the road abnormality state.
  • the information relating to the road abnormality state may include, for example, an abnormality of a sign, an abnormality of a road marking, an abnormality of a traffic light, and / or an abnormality of a roadway.
	• abnormalities of signs, road markings, traffic lights, and/or the roadway may be generated using the corresponding abnormalities of signs, road markings, traffic lights, and/or the roadway contained in the information related to roads.
	• the statistical processing unit may generate road abnormality information from information related to roads acquired from one terminal device, or from information related to roads acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate road abnormality information when, for an area within a predetermined range including the same position information and within a predetermined time range including the same time, the road-related information acquired from a predetermined number or a predetermined ratio of terminal devices includes an abnormality of a sign, a road marking, a traffic light, and/or the roadway.
  • the road abnormality information may include the information related to the road abnormality state and the position information in which the image related to the abnormality is captured or the area in a predetermined range including the position information in association with each other.
  • the road abnormality information may include the information related to the road abnormality state in association with the time when the image is captured or the time in a predetermined range including the time when the image is taken.
  • the road abnormality information may include information related to the road abnormality state and an image related to the abnormality in association with each other.
  • FIG. 17 is an example of road abnormality information.
  • the statistical processing unit may generate road information.
  • Road information may be generated using information related to roads.
	• the statistical processing unit may generate road information from information related to roads acquired from one terminal device, or from information related to roads acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate road information when the same or similar road-related information is acquired from a predetermined number or a predetermined ratio of terminal devices for an area within a predetermined range including the same position information and within a predetermined time range including the same time.
	• here, similar information may be information that falls within an error range expected of information generated from images.
	• such road information may, like the road abnormality information, be included in association with time information and with area or position information.
  • the statistical processing unit may generate accident information.
  • the statistical processing unit may generate accident information when the information about the event on the roadway includes the information related to the accident. Further, in such a case, the statistical processing unit may set the area where the accident has occurred by using the position information associated with the information related to the accident, and the accident information may include the information of the area. In addition, the accident information may include information related to such an accident in association with the area.
	• accident information may be generated based on information about events on the roadway. Further, the time information in the accident information may be generated using the time information related to the roadway event. In addition, when information related to the roadway event is acquired from a plurality of terminal devices, the time information in the accident information may be a range spanning from the earliest to the latest of the times associated with the roadway event.
	• the statistical processing unit may generate, as accident information, information related to a roadway event whose continuous period is shorter than a predetermined period, among the information related to continuing roadway events. This is because accidents on the road are generally cleared within a short period compared with construction work. Therefore, even when it cannot be determined from the image alone whether an event is construction or an accident, accident information may be determined by attending to the continuity of the information related to the roadway event. From the viewpoint of improving the accuracy of the accident information, the determination may instead be made without relying on such a period.
	• the statistical processing unit may generate accident information from information related to roadway events acquired from one terminal device, or from information related to roadway events acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate accident information when the information related to roadway events, acquired from a predetermined number or a predetermined ratio of terminal devices for an area within a predetermined range including the same position information and within a predetermined time range including the same time, includes information related to an accident.
  • FIG. 18 is an example of such accident information.
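The duration heuristic above, distinguishing accidents from construction, might be sketched as follows; the threshold value is an illustrative assumption.

```python
def classify_roadway_event(duration_minutes, accident_max_minutes=120):
    """Events that persist only for a short period are treated as
    accidents; longer-lasting ones as construction, following the
    duration heuristic described in the text."""
    if duration_minutes < accident_max_minutes:
        return "accident"
    return "construction"

print(classify_roadway_event(45))    # cleared quickly
print(classify_roadway_event(600))   # persists for many hours
```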
  • the statistical processing unit may generate information on the provision of fuel for vehicles.
  • the vehicle fuel provision information may include the type of vehicle fuel and the amount of money corresponding to such type in association with each other.
  • the type of vehicle fuel and the amount of money corresponding to such type may be generated using the information on the vehicle fuel.
  • the vehicle fuel provision information may include the location information corresponding to the type and the amount of money in association with each other. Such location information may be generated using the location information in the information relating to the fuel for the vehicle corresponding to such type and amount.
	• the vehicle fuel provision information may include time information in association with the type and the amount of money.
  • the information at such times may be generated using the information at such times in the information relating to fuels for vehicles corresponding to such types and amounts.
	• the statistical processing unit may generate vehicle fuel provision information from information related to vehicle fuel acquired from one terminal device, or from information related to vehicle fuel acquired from a plurality of terminal devices. In the latter case, the vehicle fuel provision information may be generated when the vehicle-fuel-related information acquired from a predetermined number or a predetermined ratio of terminal devices, for an area within a predetermined range including the same position information and within a predetermined time range including the same time, includes the type of vehicle fuel and the corresponding amount of money.
  • the time in the predetermined range may be a predetermined period retroactive from the present time. This has the advantage of being able to keep up-to-date information.
  • FIG. 19 is an example of vehicle fuel provision information.
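Keeping only the most recent fuel price per location within a period retroactive from the present, as described above, might look like the following sketch; the record format, location names, and window length are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical (time, location, fuel type, price) reports from terminals
now = datetime(2019, 11, 29, 12, 0)
reports = [
    (now - timedelta(days=10), "stand-1", "regular", 148),
    (now - timedelta(days=1),  "stand-1", "regular", 151),
    (now - timedelta(days=2),  "stand-2", "diesel",  128),
]

def fuel_provision_info(reports, now, window=timedelta(days=7)):
    """Keep only reports within a predetermined period retroactive
    from the present, retaining the newest price per
    (location, fuel type) pair."""
    latest = {}
    for t, pos, kind, price in sorted(reports):
        if now - t <= window:
            latest[(pos, kind)] = price
    return latest

info = fuel_provision_info(reports, now)
print(info[("stand-1", "regular")])  # 151: the 10-day-old 148 is stale
```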
  • the statistical processing unit may generate advertising statistical information.
  • the advertisement statistical information may include the number related to the advertisement or may include the ratio related to the advertisement.
	• the number of advertisements may be the number of advertisements satisfying a predetermined condition. Advertisements satisfying a predetermined condition may include advertisements of a specific brand, advertisements by a specific advertiser, advertisements related to a specific field, advertisements of a predetermined size, advertisements larger than a predetermined size, advertisements smaller than a predetermined size, and/or advertisements obstructed by trees or buildings.
  • the ratio related to the advertisement may be the ratio of the advertisement satisfying the second predetermined condition to the advertisement satisfying the first predetermined condition.
  • the first predetermined condition and the second predetermined condition may be the above-mentioned predetermined conditions, and these may be different.
	• the above-mentioned specific brand may be one that can be identified by character recognition in the image.
	• the above-mentioned specific advertiser may likewise be one that can be identified by character recognition in the image.
	• an advertisement may be treated as obstructed when its view is blocked for at least a predetermined ratio of the time during which the advertisement could otherwise be recognized in the image.
  • the advertisement statistical information may include the number related to the advertisement and the information indicating a specific area including the position information obtained by capturing the image used to generate the number related to the advertisement in association with each other.
  • the information indicating such a region may be generated as a specific region including such a position by using the position information associated with the image.
  • the advertisement statistical information may include the number related to the advertisement and the information indicating a specific time range including the time when the image used to generate the number related to the advertisement is imaged in association with each other.
  • Information indicating such a specific time range may be generated as a specific time range including such a time by using the information of the time associated with such an image.
  • the statistical processing unit may generate advertisement statistical information from information related to advertisements acquired from one terminal device, or may generate advertisement statistical information from information related to advertisements acquired from a plurality of terminal devices. In the latter case, regarding the information related to the advertisement related to the same or predetermined range of position information, the advertisement statistical information is obtained only when the information related to the same advertisement is acquired from a predetermined number or a predetermined ratio or more of the terminal devices. It may be configured to generate, or it may be configured to generate advertisement statistics even when the information related to the advertisement is acquired from one terminal device.
  • the statistical processing unit may include the corresponding predetermined conditions for the advertisement statistical information. This has the advantage that it is possible to retain information as to what kind of conditions the advertisement statistical information satisfies.
  • FIG. 20 is an example of advertising statistical information.
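The ratio of advertisements satisfying the second predetermined condition among those satisfying the first might be computed as in the following sketch; the advertisement attributes and conditions are hypothetical.

```python
def advertisement_ratio(ads, first_cond, second_cond):
    """Ratio of advertisements satisfying the second predetermined
    condition among those satisfying the first predetermined
    condition."""
    base = [a for a in ads if first_cond(a)]
    if not base:
        return None
    return sum(1 for a in base if second_cond(a)) / len(base)

# Hypothetical detected advertisements (brand, width in meters, obstructed)
ads = [
    {"brand": "X", "width": 3.0, "obstructed": False},
    {"brand": "X", "width": 1.0, "obstructed": True},
    {"brand": "Y", "width": 2.0, "obstructed": False},
]

# Among brand-X advertisements, what fraction is obstructed?
ratio = advertisement_ratio(
    ads,
    first_cond=lambda a: a["brand"] == "X",
    second_cond=lambda a: a["obstructed"],
)
print(ratio)  # 0.5
```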
  • the statistical processing unit may generate human information.
  • Person information may be generated based on information related to the degree of congestion of people.
  • the person information may include the number of people and the information indicating the area where the person is located in association with each other.
  • the person information may include the number of people and the information including the time when the area where the person is present is imaged in association with each other.
	• the statistical processing unit may generate person information from information related to the degree of congestion of people acquired from one terminal device, or from information related to the degree of congestion of people acquired from a plurality of terminal devices. In the latter case, for information related to the degree of congestion of people associated with times within a predetermined range and with a predetermined area, the information from terminal devices satisfying a predetermined condition and/or the average number of people across the terminal devices may be used to generate the above-mentioned number of people.
  • FIG. 21 is an example of human information.
	• the statistical processing unit may generate a 3D map.
  • the 3D map may be a 3D digital representation of the landscapes on both sides as seen from the roadway.
  • the 3D map may be generated from the image.
  • the 3D map may be generated using one or more images captured by an imaging device that images the front and / or rear of the terminal device.
  • the means for generating a 3D map from one or more images may be a known means.
  • the statistical processing unit may generate a 3D map from an image acquired from one terminal device, or may generate a 3D map from an image acquired from a plurality of terminal devices. In the latter case, an image associated with a predetermined time range and associated with the same location information may be used to generate a 3D map.
	• by using images captured within a predetermined time range, there is an advantage that the 3D map can be generated from information that does not change over time. Further, generating the 3D map from a plurality of images captured at the same place, having the same position information, has the advantage that a more accurate 3D map can be generated.
  • the information and communication unit may communicate information with one or more terminal devices.
  • the information communication unit may acquire information from one or more terminal devices, and may transmit information to one or more terminal devices.
	• the information communication unit may acquire, as information from one or more terminal devices, information relating to at least a part of the image acquisition unit, the self-vehicle information generation unit, the image information generation unit, the storage unit, and the output unit.
	• the information communication unit may transmit, to one or more terminal devices, information relating to at least a part of the image acquisition unit, the self-vehicle information generation unit, the image information generation unit, the storage unit, the output unit, and the statistical processing unit.
  • the information communication unit may have a function of communicating information with various systems.
	• the information communication unit may transmit at least part of the other vehicle statistical information, traffic jam information, weather information, road information, road abnormality information, accident information, vehicle fuel provision information, advertisement statistical information, person information, and 3D map to a system for road users.
	• the road users may include companies that provide maps, carriers that use roads, taxi companies that use roads, companies that provide car navigation, companies that maintain roads, companies that provide services related to automated driving, and/or companies that provide services to municipalities, or municipalities themselves.
	• the information communication unit may transmit the accident information to a system related to an insurance company.
	• the information communication unit may transmit the advertisement statistical information to a system related to an advertising company.
	• the information communication unit may transmit the person information to a system related to mobile-phone radio services.
	• the example system according to the first embodiment is an embodiment used mainly as a drive recorder, using some or all of the above-mentioned functions.
  • a user of an example system may have, for example, a smartphone as a terminal device.
  • the software related to the system of the example may be downloaded and installed in advance on such a smartphone.
	• when boarding a vehicle used by a user of the example system (hereinafter also referred to as a "user vehicle"), the user attaches the terminal device to the user vehicle.
  • the mounting mode may be various. It may be removable.
	• the rectangular smartphone may be attached to the vehicle in a landscape orientation, longer horizontally than vertically, or in a portrait orientation, longer vertically than horizontally. In the landscape case, there is an advantage that the lane in which the own vehicle travels and vehicles traveling in adjacent lanes may also fall within the wide imaging range.
	• the example system may automatically start imaging on startup. That is, the imaging device in the terminal device may be activated to start imaging. Since imaging starts automatically upon startup without any other operation, the user has the advantage of not forgetting to start imaging and of being saved the trouble of starting it. In another example system, imaging need not start at the same time as startup.
  • the shooting may be started by selecting the shooting button by the user.
  • the shooting button may be a mechanical and physical button in the smartphone, or may be a touch on the screen.
	• the terminal device may set the autofocus to infinity at the time of imaging.
	• the bonnet of the own vehicle may enter the imaging field of view, and in experiments conducted by the inventors, the camera sometimes focused on the bonnet.
	• in that case the focus lies at roughly 2 meters and not on the vehicle in front, which may reduce the accuracy of acquiring information for identifying the vehicle in front.
	• the focus of the terminal device may likewise settle on water droplets on the windshield or on the wiper in operation, similarly reducing the accuracy of acquiring information for identifying the vehicle in front.
	• automatically setting the focus to infinity at the time of imaging has the advantage of preventing such decreases in accuracy.
  • the system of one example stores information related to the driving situation at the time together with the captured video.
  • the stored video may be split into files of a fixed size. For example, a new video file may be created for every X MB (megabytes) of a predetermined storage amount; when the predetermined amount is exceeded, the next file is created and recording continues there.
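The file-rollover behavior described above (a new file once a predetermined storage amount is exceeded) can be sketched as follows. This is an illustrative sketch, not the patented implementation: the chunk limit stands in for the unspecified X MB, and bytes are buffered in memory rather than written to the device's storage.

```python
class ChunkedRecorder:
    """Rolls recorded bytes over into a new 'file' whenever the current one
    would exceed the chunk limit. The limit is a parameter, since the text
    leaves the X MB value unspecified."""

    def __init__(self, chunk_limit: int):
        self.chunk_limit = chunk_limit
        self.files = [bytearray()]  # each bytearray stands in for one video file

    def write(self, data: bytes) -> None:
        # Byte-at-a-time keeps the rollover logic trivial to follow.
        for b in data:
            if len(self.files[-1]) >= self.chunk_limit:
                self.files.append(bytearray())  # predetermined amount exceeded: next file
            self.files[-1].append(b)
```

For instance, with a 4-byte limit, writing 10 bytes yields three chunks of sizes 4, 4, and 2.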
  • the video to be shot may include audio. This is because, as a drive recorder, voice in the driving situation is also important information.
  • the system of the example may display various indications or emit sounds on the output unit according to the content of the captured image and the information related to the driving situation. For example, when the distance between the vehicle ahead and the own vehicle is decreasing, a warning may be issued. The warning may be displayed on the output unit in a manner that attracts the viewer's attention, or a sound may be emitted that alerts the people in the vehicle. Further, as information related to the driving situation, sudden braking, sudden acceleration, and sudden steering may similarly be warned of.
  • An example system may generate a drive report when the vehicle has finished driving. For example, an example system may generate statistical information about driving conditions and include it in a drive report.
  • the example system may transmit the stored information to the management server.
  • the captured moving image and information related to the driving situation may be transmitted to the management server.
  • the timing of transmission may be configured so that the terminal device starts transmission when communication with another information processing device becomes possible in a predetermined communication mode.
  • the predetermined communication mode may be, for example, Wi-Fi.
  • One example system may automatically erase the video stored in the terminal device after transmission to the management server.
  • after acquiring confirmation that the video is stored in the management server, the terminal device may automatically delete it.
  • automatic erasing may be performed without confirmation of the user.
  • the video stored as a drive recorder occupies a large amount of the terminal device's storage capacity, so erasing it frees that capacity for the next use.
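The upload-then-erase flow described in the preceding points can be sketched as follows. `upload` and `confirm_stored` are hypothetical stand-ins for the transport and the server-side storage confirmation, neither of which the text specifies.

```python
def sync_and_prune(local_videos, upload, confirm_stored):
    """Upload each locally stored video and delete it only after the
    management server confirms it is stored -- the 'automatic erasing
    without confirmation of the user' described above."""
    kept = []
    for video in local_videos:
        upload(video)
        if confirm_stored(video):
            continue          # confirmed on the server: erase the local copy
        kept.append(video)    # not confirmed yet: keep until the next sync
    return kept
```

A video is never removed on the strength of the upload alone; only the server's confirmation triggers the local erase.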
  • the system of the example may have a function of reproducing the information stored in the terminal device. In addition, it may have a function of displaying information related to the driving situation.
  • FIG. 22 is an example of a screen transition diagram of the terminal device according to the system of this example.
  • the terms-of-use agreement screen 002 is displayed. This screen appears only when the application is started for the first time after installation.
  • the TOP screen 003 is displayed when the agreement is accepted, or on the second and subsequent launches. From the top screen, it is possible to move to the recording screen 004, the captured video list screen 006, and the setting screen 008.
  • the screen can be moved to the drive report screen 005.
  • the drive report screen can display statistical information about the driving status of the drive during recording.
  • the captured video list screen 006 can be switched to the captured video playback screen 007.
  • the screen transition shown here is an example. For example, it may be possible to reproduce the selected one shot video from the shot video list screen, or to display the drive report screen of the shot video.
  • FIG. 23 is a screen waiting for a shooting instruction after the system of one example is started.
  • the image pickup button 002 is displayed large on the display screen.
  • imaging can be performed according to a user's instruction.
  • the image pickup button may have an area of one tenth or more of the area of the display screen.
  • since the image pickup button has an area larger than a predetermined area, there is an advantage that the user can more easily instruct the start of imaging.
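The one-tenth-of-the-screen criterion above reduces to a simple area comparison. A minimal sketch, where `min_fraction` parameterizes the predetermined area that the text leaves open:

```python
def button_large_enough(btn_w, btn_h, screen_w, screen_h, min_fraction=0.1):
    """True when the capture button occupies at least min_fraction of the
    display area (one tenth, per the text's example)."""
    return (btn_w * btn_h) >= min_fraction * (screen_w * screen_h)
```

On a 1080x1920 display, a 500x500 button passes the one-tenth test while a 300x400 button does not.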
  • FIG. 24 is an example of an imaging screen during driving.
  • the vehicles in the forward direction are surrounded by the red frame 002, indicating that each vehicle is recognized.
  • information 003 regarding the driving situation is also displayed.
  • the horizontal axis shows the elapsed time and the vertical axis shows the speed of the vehicle. Therefore, where the angle of the line in the graph is steep, sudden acceleration, sudden braking, or sudden steering can be inferred, and such a display may be made.
  • when the distance to the vehicle in front is short, particularly the distance to the vehicle in front in the same lane, it may be displayed as a dangerous inter-vehicle distance.
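The "steep line" criterion on the speed-time graph can be sketched as a threshold on the change in speed between samples. The threshold and sampling interval are assumptions for illustration; the text does not quantify "steep".

```python
def flag_sudden_events(speeds, dt=1.0, accel_limit=3.0):
    """Flags indices where the speed trace (m/s, sampled every dt seconds)
    changes faster than accel_limit (m/s^2): positive jumps as sudden
    acceleration, negative jumps as sudden braking."""
    events = []
    for i in range(1, len(speeds)):
        a = (speeds[i] - speeds[i - 1]) / dt
        if a > accel_limit:
            events.append((i, "sudden_acceleration"))
        elif a < -accel_limit:
            events.append((i, "sudden_braking"))
    return events
```

A trace that jumps from 10 to 18 m/s in one second and later drops back just as fast would be flagged once for sudden acceleration and once for sudden braking.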
  • the sign 004 on the road may be recognized and the information thereof may be acquired.
  • FIG. 25 shows a list of captured images.
  • a list of each video 001 is displayed according to the time. These may be from the start of one shooting to the end of shooting, or may be listed for each file divided for each predetermined storage amount as described above.
  • FIG. 26 is another example showing a list of captured images.
  • FIG. 27 shows the captured video 001 being displayed, together with a list 002 of other videos.
  • FIG. 28 is an example showing a horizontally long image / display instead of the vertically long image / display described above.
  • the vehicle may be imaged by a horizontally mounted smartphone 001.
  • FIG. 29 is an example of displaying the operating status during imaging.
  • the horizontal axis is the time, and the vertical axis is the speed of the own vehicle, and a mark indicating the time when sudden acceleration, sudden steering, or sudden braking occurs is displayed.
  • FIG. 30 is another example of displaying the operating status during imaging.
  • the screen is a vertical screen.
  • the horizontal axis is the time and the vertical axis is the speed of the own vehicle, and a mark indicating the time when sudden acceleration, sudden steering, or sudden braking occurs is displayed.
  • FIG. 31 is an example showing a state in which a smartphone mounted horizontally is recognizing a vehicle in front.
  • the vehicle in front is surrounded by a frame 001 to indicate that it is recognized.
  • the output unit may display information that identifies the vehicle thus identified. Further, not only the vehicle but also the output unit may display information that identifies a pedestrian, a two-wheeled vehicle (including a bicycle), and the like.
  • when the image information generation unit identifies a person together with something like a two-wheeled vehicle, it may identify the target as a two-wheeled vehicle; when it identifies a person without anything like a two-wheeled vehicle, it may identify the target as a pedestrian.
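The classification rule just described can be sketched directly. The label strings are assumptions; a real detector would emit its own label vocabulary.

```python
def classify_detection(labels):
    """Applies the rule from the text: a person detected together with
    something bike-like is a two-wheeled vehicle; a person alone is a
    pedestrian; anything else is left unclassified here."""
    if "person" in labels and "two_wheeler" in labels:
        return "two_wheeled_vehicle"
    if "person" in labels:
        return "pedestrian"
    return None
```
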
  • the line of information for identifying such a vehicle may be a circle or a polygon instead of a rectangle.
  • the display for identifying the target vehicle indicating such recognition may be a line surrounding the target vehicle or a line not surrounding the target vehicle.
  • the display identifying such a vehicle may be various vehicles. For example, it may be a passenger car, a commercial vehicle, a truck, a taxi, a motorcycle, and the like. Further, the display for identifying such a vehicle may be limited to the case where the above-mentioned distance information includes a distance within a predetermined range. For example, a display identifying such a vehicle may be made only if the vehicle is within a range of 15 meters. In addition, the display that identifies the vehicle may be displayed in various modes.
  • the color, the shape of the display, and the like may be changed.
  • the display for identifying the vehicle may change the mode of the marking according to the information included in the above-mentioned distance information. For example, when the distance information includes information of 2 meters or more and 5 meters or less, it may be yellow, and when the distance information includes information of 5 meters or more and 15 meters or less, it may be blue or the like.
  • the distance included in the distance information may be displayed in association with the display that identifies the vehicle. For example, it may be displayed as "3 m" or the like in association with a display that identifies the vehicle in front. If the distance information includes a distance between 0 meters and 2 meters, it is not necessary to display the distance.
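A minimal sketch of the distance-dependent marking, using the example ranges from the text (yellow for 2-5 m, blue for 5-15 m, no numeric label under 2 m, marking only within 15 m). The color used for the under-2 m range and the handling of exact boundary values are assumptions.

```python
def marking_for_distance(distance_m):
    """Maps inter-vehicle distance to display attributes for the frame
    that identifies the vehicle ahead; returns None beyond the display
    range."""
    if distance_m < 2:
        # Assumed: closest range highlighted in red, with the numeric
        # distance omitted, as the text says it need not be displayed.
        return {"color": "red", "label": None}
    if distance_m <= 5:
        return {"color": "yellow", "label": f"{distance_m:.0f} m"}
    if distance_m <= 15:
        return {"color": "blue", "label": f"{distance_m:.0f} m"}
    return None  # outside the predetermined range: no identifying display
```
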
  • whether the distance information is displayed for a given vehicle may be determined according to the position at which that vehicle's identifying display appears on the output unit.
  • the output unit may display only when the display identifying the vehicle is displayed at a predetermined position of the output unit.
  • the output unit may display only the vehicle whose rectangle that identifies the vehicle includes the center of the screen. Whether or not the rectangle that identifies the vehicle is at the center of the screen may be determined, for example, by comparing the coordinate position that specifies the rectangle with the coordinate position at the center of the screen.
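Comparing the rectangle's coordinates with the screen-center coordinate, as described above, reduces to a containment test:

```python
def contains_screen_center(rect, screen_w, screen_h):
    """True when the rectangle that identifies a vehicle includes the
    center of the screen. rect is (x1, y1, x2, y2) in screen pixels."""
    x1, y1, x2, y2 = rect
    cx, cy = screen_w / 2, screen_h / 2
    return x1 <= cx <= x2 and y1 <= cy <= y2
```

Only vehicles whose identifying rectangle passes this test would then receive the distance display.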
  • FIG. 32 is an example in which the form of such a rectangular line is changed according to the proximity of the vehicle in front on the same lane and the own vehicle.
  • the line may be changed to a color that attracts the user's attention, such as red or yellow, or the thickness or decoration of the line may be changed. Further, in order to indicate that the vehicles are approaching each other, "approaching" may be displayed as shown in this figure.
  • FIG. 33 shows a state in which the image is being viewed after imaging.
  • video 001 is displayed.
  • sudden steering, sudden acceleration, and sudden braking are each indicated on the graph 002 by their own marks.
  • the horizontal axis represents time and the vertical axis represents speed.
  • FIG. 34 shows the case of a vertical screen.
  • Example 2 (gathering information on other vehicles): the system of this example may have only the configuration essential to this embodiment, or may include aspects required for other embodiments.
  • An example system acquires images from one or more terminal devices. Further, the system of one example may acquire one or more vehicle identification information and other vehicle determination information associated with the one or more vehicle identification information from one terminal device.
  • the system of one example may generate other vehicle statistical information by collecting other vehicle determination information for each vehicle specific information.
  • the other vehicle statistical information may be summarized in a predetermined period based on the images captured in the predetermined period.
  • other vehicle statistical information may be summarized in a predetermined area based on an image captured in the predetermined area.
  • a predetermined coefficient may be associated with each of sudden acceleration, sudden steering, and sudden braking, and a weighted total determination score may be generated by multiplying each coefficient by the corresponding number of occurrences and summing the results.
  • FIG. 35 is an example of other vehicle statistical information. In this figure, the vehicles are arranged in order of highest overall judgment score.
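The weighted total determination score and the ordering of Fig. 35 can be sketched as follows. The coefficients here are illustrative, since the patent leaves their values unspecified.

```python
def overall_score(counts, weights=None):
    """Weighted total determination score: each event count multiplied by
    its predetermined coefficient and summed."""
    if weights is None:
        # Illustrative coefficients -- not specified by the text.
        weights = {"sudden_acceleration": 1.0,
                   "sudden_steering": 1.5,
                   "sudden_braking": 2.0}
    return sum(weights[k] * counts.get(k, 0) for k in weights)

def rank_vehicles(stats):
    """Order vehicles by descending overall score, as in the table of
    other-vehicle statistical information."""
    return sorted(stats, key=lambda s: overall_score(s["counts"]),
                  reverse=True)
```
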
  • An example system acquires one or more pieces of specific vehicle information from one terminal device, searches the other vehicle statistical information for that specific vehicle information, and obtains the other vehicle statistical information related to it.
  • Such other vehicle statistical information may be transmitted to the above-mentioned one terminal device.
  • the other vehicle statistical information related to the specific vehicle information may include, for example, the number of sudden accelerations, the number of sudden brakings, the number of sudden steerings, numerical values related to each of these counts, the total judgment score, the ranking position, and so on.
  • the numerical value related to the number of times may be the probability of occurrence within a predetermined period or a coefficient obtained by using them.
  • the other vehicle statistical information related to the specific vehicle information may be processed into information that simply indicates the degree of danger. For example, it may include information of one of three options such as high, normal, and low.
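Collapsing the statistics into the three options mentioned (high, normal, low) might look like this; the cut-off scores are assumptions, as the text gives none.

```python
def risk_level(score, low=2.0, high=6.0):
    """Processes a total judgment score into information that simply
    indicates the degree of danger, as one of three options."""
    if score >= high:
        return "high"
    if score >= low:
        return "normal"
    return "low"
```
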
  • the other vehicle statistical information related to the specific vehicle information may be displayed in association with the one or more other vehicles.
  • FIG. 37 is an example of such a display.
  • this shows a vehicle displayed as having a history of driving more dangerous than a predetermined standard (for example, the number of sudden accelerations, sudden brakings, and/or sudden steerings is higher than a predetermined standard, the total judgment score is higher than a predetermined standard, and/or the ranking is higher than a predetermined standard).
  • since the vehicle 002 does not have a history of driving more dangerous than the prescribed standard, nothing need be displayed for it, as shown in this figure; alternatively, a label such as "normal" may be shown to indicate that it has no driving history more dangerous than the predetermined standard.
  • FIG. 38 shows an example in which an example system collects information acquired from a plurality of terminal devices, processes the information, and transmits the information to one or a plurality of terminal devices.
  • An example management system 004 may acquire information from the terminal devices 001 to 003, perform predetermined processing, and transmit the information to the terminal devices 001 to 003.
  • the terminal device for acquiring information and the terminal device for transmitting information may be the same or different.
  • Example 3 (vehicle information collection): the system of one example may include only the configurations essential to this embodiment, or may include aspects required for other embodiments.
  • An example system may acquire information related to a target from one or a plurality of terminal devices, and the timing may be arbitrary. For example, the system of one example may acquire the information at the timing when the terminal device is connected to a wireless communication device such as Wi-Fi, when the terminal device is connected via a communication standard such as 3G, 4G, or 5G, or in real time at the timing when the information to be transmitted is acquired by the terminal device. Real time may include transmission with a predetermined delay associated with information processing.
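One possible transmission-timing policy covering the cases listed (Wi-Fi, 3G/4G/5G, real time) is sketched below. The policy itself is an assumption; the text only enumerates the possible timings without prescribing a rule.

```python
def should_transmit(connectivity, realtime_enabled=False):
    """Decide whether the terminal sends collected information now:
    immediately on Wi-Fi, on cellular only when real-time reporting is
    enabled, otherwise defer until a suitable connection appears."""
    if connectivity == "wifi":
        return True
    if connectivity in ("3g", "4g", "5g"):
        return realtime_enabled
    return False
```
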
  • FIG. 39 shows an example in which an example system collects information acquired from a plurality of terminal devices, processes the information, and transmits the information to the corresponding companies.
  • An example management system 004 may acquire information related to the target from the terminal devices 001 to 003 and transmit it to the systems 005 to 007 related to the target company.
  • the target company may include road users, insurance companies, local governments or companies related to local governments, advertising companies, mobile phone related companies, and the like.
  • An example system may transmit at least some of the congestion information, weather information, road information, road abnormality information, accident information, vehicle fuel supply information, human information, and 3D maps (including subordinate concepts indicated by the above terms). If such information is sent to the system of a company that provides maps, it can be used to update the maps; if sent to the system of a carrier that uses the road, it can be used to improve the carrier's efficiency; if sent to the system of a taxi company that uses the road, it can be used for efficient taxi operation; if sent to the system of a company that provides car navigation, it can be used to update the car navigation information; and it may also be sent to the system of a company that maintains the road.
  • FIG. 40 is an example of a flow in an example system.
  • An example system may acquire information transmitted from an information communication unit in such a terminal from one or more terminals.
  • the statistical processing unit related to the system of one example may perform statistical processing using the information acquired by the information and communication unit.
  • the example system may then send information, including such statistically processed information, to the target company.
  • the system and terminal device may be composed of one or a plurality of information processing devices.
  • the information processing device 10 may include a bus 15, an arithmetic device 11, a storage device 12, and a communication device 16. Further, the information processing device 10 in one embodiment may include an input device 13 and a display device 14. It is also directly or indirectly connected to the network 17.
  • the bus 15 may have a function of transmitting information between the arithmetic unit 11, the storage device 12, the input device 13, the display device 14, and the communication device 16.
  • An example of the arithmetic unit 11 is a processor. This may be a CPU or an MPU. Further, the arithmetic unit in one embodiment may include a graphics processing unit, a digital signal processor, and the like. In short, the arithmetic unit 11 may be any device capable of executing program instructions.
  • the storage device 12 is a device that records information. This may be either an external memory or an internal memory, and may be either a main storage device or an auxiliary storage device. Further, a magnetic disk (hard disk), an optical disk, a magnetic tape, a semiconductor memory, or the like may be used. Further, it may have a storage device via a network or a storage device on the cloud via a network.
  • registers, L1 cache, L2 cache, and the like, which store information at a position physically close to the arithmetic unit, may be regarded as part of the arithmetic unit 11 in computer-architecture design; however, as devices for recording information, the storage device 12 may also be taken to include them.
  • the arithmetic unit 11, the storage device 12, and the bus 15 may be configured to cooperate with each other to execute information processing.
  • the storage device 12 can include a part or all of a program capable of executing the process according to the present invention. In addition, data necessary for executing the process according to the present invention can be appropriately recorded. Further, the storage device 12 in one embodiment may include a database.
  • the above describes the case where the arithmetic unit 11 executes a program provided in the storage device 12, but this is only one of the forms in which the bus 15, the arithmetic unit 11, and the storage device 12 may be combined.
  • part or all of the information processing according to the present invention may be realized by a programmable logic device capable of changing the hardware circuit itself or a dedicated circuit in which the information processing to be executed is determined.
  • the input device 13 inputs information, but may have other functions.
  • Examples of the input device 13 include a keyboard, a mouse, a touch panel, or a pen-type pointing device.
  • the display device 14 has a function of displaying information.
  • a liquid crystal display, a plasma display, an organic EL display, and the like can be mentioned, but in short, any device capable of displaying information may be used.
  • the input device 13 may be partially provided like a touch panel.
  • the network 17 transmits information together with the communication device 16. That is, it has a function of enabling information of the information processing device 10 to be transmitted to other information terminals (not shown) via the network 17.
  • the communication device 16 may use any connection type, such as IEEE1394, Ethernet (registered trademark), PCI, SCSI, USB, 2G, 3G, 4G, 5G, and the like.
  • the connection to the network 17 may be either wired or wireless.
  • the information processing device may be a general-purpose type or a dedicated type. Further, the information processing device may be a workstation, a desktop personal computer, a laptop personal computer, a PDA, a mobile phone, a smartphone, or the like.
  • the system according to the present invention may be composed of a plurality of information processing devices.
  • the plurality of information processing devices may be internally connected or may be externally connected.
  • the system according to the present invention may be of various types of devices.
  • the system according to the present invention may be a stand-alone system, a server-client system, a peer-to-peer system, or a cloud system.
  • the system according to the present invention may be a stand-alone information processing device, may be composed of a part or all of the information processing device of the server-client type, or may be a part or all of the information processing of the peer-to-peer type. It may be composed of devices, or may be composed of some or all information processing devices in the cloud format.
  • the owner and manager of each information processing device may be different.
  • the information processing device 10 may be a physical existence or a virtual one.
  • the information processing device 10 may be virtually realized by using cloud computing.
  • the configuration implemented by the system of this example may be configurations implemented by one or more information processing devices in the system.
  • the information processing device described above as the portable information processing device may be an information processing device that is appropriately installed and fixed.
  • the invention examples described in the examples of the documents of the present application are not limited to those described in the documents of the present application, and can be applied to various examples within the scope of the technical idea.
  • the system may be configured so that information presented on the screen of one information processing device can be displayed on the screen of another information processing device, whereby the information is transmitted to the other information processing device.
  • the placeholder symbols appearing in the various drawings may contain appropriate values according to each context, and these may be the same or different.
  • the processes and procedures described in the documents of the present application may be implemented not only as explicitly described in the embodiments but also by software, hardware, or a combination thereof. Further, the processes and procedures described in the documents of the present application may be implemented by various computers by implementing them as a computer program. Further, these computer programs may be stored in a storage medium, which may be non-transitory or transitory.

Abstract

[Problem] A system according to one example of the present invention can utilize data that has been obtained from an image more appropriately. [Solution] A system comprising: a first acquisition unit that acquires, from the first portable terminal device, first vehicle specification information for specifying a first vehicle in a first image captured by a first portable terminal device and first vehicle determination information obtained by determining a driving state of the first vehicle on the basis of the first image; a second acquisition unit that acquires, from the second portable terminal device, second vehicle specification information for specifying a second vehicle in a second image captured by a second portable terminal device and second vehicle determination information obtained by determining a driving state of the second vehicle on the basis of the second image; a determination unit that determines identity between the first vehicle and the second vehicle; and a statistical processing unit that, by using the first vehicle determination information and the second vehicle determination information when the first vehicle and the second vehicle are identical to each other, generates statistic information concerning the first vehicle. 

Description

Information processing system, information processing device, terminal device, server device, program, or method
The technology disclosed in this application relates to an information processing system, an information processing device, a server device, a program, or a method.
In recent years, information processing using images has been performed, but the information in images is not always used appropriately.
Japanese Unexamined Patent Publication No. 2009-75858; Japanese Unexamined Patent Publication No. 2004-318905
Accordingly, various embodiments of the present invention provide an information processing system, an information processing device, a terminal device, a server device, a program, or a method in order to solve the above problems.
The first system according to one embodiment comprises:
an acquisition unit that acquires, from a first mobile terminal device, first vehicle identification information identifying a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information obtained by determining the driving state of the first vehicle based on the first image;
an acquisition unit that acquires, from a second mobile terminal device, second vehicle identification information identifying a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information obtained by determining the driving state of the second vehicle based on the second image;
a determination unit that determines, using the first vehicle identification information and the second vehicle identification information, the identity of the first vehicle and the second vehicle; and
a statistical processing unit that, when the first vehicle and the second vehicle are determined to be the same, generates statistical information relating to the first vehicle using the first vehicle determination information and the second vehicle determination information.
The second system according to one embodiment is the first system further comprising:
a transmission unit that transmits the statistical information relating to the first vehicle to the second mobile terminal device when the first vehicle and the second vehicle are determined to be the same.
The third system according to one embodiment is the system of either the first or second system, further comprising:
a third acquisition unit that acquires, from the first mobile terminal device, information relating to an object in the first image.
The fourth system according to one embodiment is the third system, wherein:
the information relating to the object includes, in the first image, information relating to wiper operation, information relating to the road, information relating to the sidewalk, information relating to events on the roadway, information relating to advertisements, and/or information relating to vehicle fuel.
The fifth system according to one embodiment is the system of any one of the first to fourth systems, further comprising:
an acquisition unit that acquires third vehicle identification information identifying a third vehicle in an image captured by a third mobile terminal device, and a message transmitted by the third mobile terminal device;
an acquisition unit that acquires fourth vehicle identification information identifying a fourth vehicle registered as the user's own vehicle in a fourth mobile terminal device; and
a transmission unit that transmits the message to the fourth mobile terminal device when the third vehicle and the fourth vehicle are determined to be the same vehicle.
The sixth system according to one embodiment is the system of any one of the first to fourth systems, further comprising:
an acquisition unit that acquires third vehicle identification information identifying a third vehicle registered as the user's own vehicle in a third mobile terminal device, and a message transmitted by the third mobile terminal device;
an acquisition unit that acquires fifth vehicle identification information identifying a fifth vehicle in an image captured by a fifth mobile terminal device; and
a transmission unit that transmits the message to the fifth mobile terminal device when the third vehicle and the fifth vehicle are determined to be the same vehicle.
The seventh system according to one embodiment is the second system, wherein:
within a predetermined time after the first vehicle and the second vehicle are determined to be the same, the transmission unit transmits the statistical information relating to the first vehicle.
The eighth system according to one embodiment is the system of any one of the first to seventh systems, wherein:
the first mobile terminal device and the second mobile terminal device are different mobile terminal devices.
The ninth system according to one embodiment is the system of any one of the first to eighth systems, wherein:
the first acquisition unit acquires, from the first mobile terminal device, a moving image including the first image.
The tenth system according to one embodiment is the ninth system, wherein:
the moving image is a compressed moving image.
The eleventh system according to one embodiment is the system of any one of the first to tenth systems, wherein:
the first vehicle determination information is information generated by a machine-learned device in the first mobile terminal device.
The twelfth system according to one embodiment is
any one of the first to eleventh systems, wherein the first vehicle determination information and the second vehicle determination information each include sudden steering, sudden acceleration, and/or sudden braking.
The thirteenth method according to one embodiment is a method in which a computer executes:
an acquisition step of acquiring, from a first mobile terminal device, first vehicle identification information identifying a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information in which the driving state of the first vehicle is determined based on the first image;
an acquisition step of acquiring, from a second mobile terminal device, second vehicle identification information identifying a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information in which the driving state of the second vehicle is determined based on the second image;
a determination step of determining the identity of the first vehicle and the second vehicle; and
a statistical processing step of generating, when the first vehicle and the second vehicle are the same, statistical information relating to the first vehicle by using the first vehicle determination information and the second vehicle determination information.
The fourteenth program according to one embodiment is a program that causes a computer to operate as:
an acquisition means for acquiring, from a first mobile terminal device, first vehicle identification information identifying a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information in which the driving state of the first vehicle is determined based on the first image;
an acquisition means for acquiring, from a second mobile terminal device, second vehicle identification information identifying a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information in which the driving state of the second vehicle is determined based on the second image;
a determination means for determining the identity of the first vehicle and the second vehicle; and
a statistical processing means for generating, when the first vehicle and the second vehicle are the same, statistical information relating to the first vehicle by using the first vehicle determination information and the second vehicle determination information.
The fifteenth program according to one embodiment is
a program for causing a computer to function as any one of the first to twelfth systems.
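The pipeline of the thirteenth method and fourteenth program above can be sketched as follows. This is only an illustrative outline, not the claimed implementation: the identity rule (equality of plate strings) and the statistic (event counts) are assumptions made for the example.

```python
from collections import Counter

def generate_statistics(report_a, report_b):
    """Combine two (vehicle identification, vehicle determination) reports.

    Each report is a dict with a 'plate' string identifying the vehicle
    (identification information) and an 'events' list of observed
    driving-state labels (determination information).
    """
    # Determination step: identity is assumed here to mean equal plate strings.
    if report_a["plate"] != report_b["plate"]:
        return None  # not the same vehicle; no combined statistics
    # Statistical processing step: merge both determination results.
    return Counter(report_a["events"]) + Counter(report_b["events"])

stats = generate_statistics(
    {"plate": "ABC-123", "events": ["sudden_braking", "sudden_steering"]},
    {"plate": "ABC-123", "events": ["sudden_braking"]},
)
```

In this sketch the statistics are simple event counts; the claims leave the form of the statistical information open.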
According to one embodiment of the present invention, image information can be used more appropriately.
FIG. 1 is a diagram illustrating an example of a situation in which the system according to one embodiment is applied.
FIGS. 2 and 3 are block diagrams showing the functions of the system according to one embodiment.
FIGS. 4 to 21 are examples of data structures used by the system according to one embodiment.
FIG. 22 is an example of a transition diagram used by the system according to one embodiment.
FIGS. 23 to 34 are diagrams showing display examples of the system according to one embodiment.
FIG. 35 is an example of a data structure used by the system according to one embodiment.
FIG. 36 shows an example of a flow of the system according to one embodiment.
FIG. 37 is a diagram showing a display example of the system according to one embodiment.
FIGS. 38 and 39 are block diagrams showing the overall configuration of the system according to one embodiment.
FIG. 40 shows an example of a flow of the system according to one embodiment.
FIG. 41 is a block diagram showing the configuration of an information processing device according to one embodiment.
1. Introduction
An example system may include one or more terminal devices used by users of the system and one or more management systems used by administrators of the system. The user may be an occupant of the vehicle, either the driver or a passenger. The terminal device may be fixed to the vehicle or may be detachable. FIG. 1 shows an example of a situation in which an example system is used. In this figure, arrow 001 indicates the traveling direction of the vehicles. Vehicle 002 is a vehicle using a user mobile terminal according to the example system. Vehicles 003A to 003I are assumed to use terminal devices according to the example system as drive recorders; some of these vehicles need not use such a terminal device. Each vehicle is assumed to be driven along lane boundary lines 004A and 004B.
The user terminal device in vehicle 002 may be able to identify each vehicle with its imaging devices. For example, when the angle of view of one imaging device of the user terminal device in vehicle 002 is wide, the three vehicles 003A, 003D, and 003F may be included in its image. Similarly, when the angle of view of another imaging device, which images the direction opposite to that of the first imaging device, is wide, the four vehicles 003C, 003E, 003H, and 003I may be included in its image. A vehicle such as 003I may thus come into the field of view of an imaging device of the user terminal device in vehicle 002. Further, vehicles 003B and 003G travel beside vehicle 002; when one imaging device in vehicle 002 images the forward direction and another images the opposite, rearward direction, such vehicles may not be in the field of view while directly beside vehicle 002, but may enter the field of view of the imaging devices as the vehicles move forward and backward relative to each other while traveling. A vehicle that thus enters the field of view of an imaging device may be referred to in this application as a "surrounding vehicle," and when a terminal device according to the example system is used in a surrounding vehicle, that terminal device, viewed from the user terminal device in user vehicle 002, may be referred to as a "surrounding terminal device." Among the surrounding vehicles, a vehicle ahead of the user vehicle may be referred to in this application as a "front vehicle," and a vehicle behind the user vehicle as a "rear vehicle." When terminal devices according to the example system are used in such front and rear vehicles, those terminal devices may be referred to as a "front terminal device" and a "rear terminal device," respectively. Because a surrounding vehicle is included in the field of view of an imaging device as described above, information identifying the surrounding vehicle may be acquired, for example using the license plate information described later.
The terminal device may include one or more acceleration sensors. A plurality of acceleration sensors may be provided in the terminal device so that accelerations in mutually orthogonal directions can be measured. In particular, a plurality of acceleration sensors may be provided so that acceleration can be measured both in the direction imaged by the imaging device and in the direction orthogonal to that imaging direction.
The terminal device may include one or more imaging devices. In the latter case, the terminal device may be provided with imaging devices capable of imaging in opposite directions. The terminal device may be installed in the vehicle so that one or more of the plurality of imaging devices take the front of the vehicle as their imaging direction, and so that another one or more of the imaging devices take the rear of the vehicle as their imaging direction. Installation in the vehicle may be detachable or fixed.
The display direction of the display device in the terminal device may be the same as the imaging direction of one imaging device. The display direction of the display device may also be toward the rear of the vehicle.
The terminal device may have one or more position measuring devices. The position measuring device may provide a position measuring function using GPS or base stations.
The terminal device may include a communication device. The communication device may be capable of communicating with an imaging device installed in the vehicle in which the terminal device is installed, over a wired or wireless connection; in the wireless case, for example, WIFI or BLUETOOTH. The imaging device installed in the vehicle may be one built into the vehicle or one mounted in it.
The terminal device may be a portable terminal device that can be carried by a person, for example a smartphone or a PDA.
2. Functions of the system of this example
Next, the functions of the system of this example will be described with reference to FIGS. 2 and 3. FIG. 2 is a block diagram showing a specific example of the functions of the management system of this example, and FIG. 3 is a block diagram showing a specific example of the functions of the terminal device. As another example, part of the functions of the management system may be executed in a terminal device. While the system can process information acquired from a plurality of terminal devices, a terminal device executing the functions of FIG. 2 may execute those functions on the information acquired by that one terminal device. The information obtained by execution on that one terminal device may then be transmitted to the management system and combined there with information acquired from other terminal devices, with processing such as summation and averaging. In this application, the term "system" is used as a superordinate concept covering the management system and the terminal devices: the system may include the management system without any terminal device, may include one or more terminal devices without the management system, or may include both the management system and one or more terminal devices.
The vehicle in which a terminal device is installed or placed is referred to as the "own vehicle," information relating to that vehicle as "own vehicle information," and a vehicle included in an image captured by the imaging device of the terminal device as an "other vehicle." When a user of the system uses a plurality of vehicles, for example by owning or sharing them, the term "own vehicle" may refer to any one or more of the vehicles in which the user's terminal device may be installed or placed.
2.1. Functions in the terminal device
2.1.1. Image acquisition unit
The image acquisition unit has a function of acquiring an image. The image may be a moving image or a still image. In this application, "image" is treated as a superordinate concept covering moving images and still images.
When acquiring a moving image, the image acquisition unit may store the moving image as files divided at a predetermined size, because smaller per-file sizes improve the convenience of communication compared with a single large file.
The image acquisition unit may also acquire, for each predetermined period of the acquired moving image, the location and/or time of capture, and store them in association with the corresponding period of the moving image. This configuration has the advantage that information on the capture location and/or capture time can be obtained for each predetermined period of the moving image.
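The per-period association above might be modeled as follows. This is a minimal sketch; the period length, the metadata format, and the class and method names are assumptions for illustration, and a real system would obtain the timestamps and locations from the camera and the position measuring device.

```python
class SegmentedRecording:
    """Associates each fixed-length period of a moving image with the
    capture time and location reported for that period."""

    def __init__(self, period_seconds=10):
        self.period_seconds = period_seconds
        self.metadata = {}  # period index -> (timestamp, (lat, lon))

    def record_period(self, period_index, timestamp, location):
        # Store the capture time and place for one period of the video.
        self.metadata[period_index] = (timestamp, location)

    def lookup(self, offset_seconds):
        """Return the metadata covering a given offset into the video."""
        return self.metadata.get(int(offset_seconds // self.period_seconds))

rec = SegmentedRecording(period_seconds=10)
rec.record_period(0, 1700000000.0, (35.68, 139.76))
rec.record_period(1, 1700000010.0, (35.69, 139.77))
```

With this structure, any moment in the recording can be traced back to where and when it was captured, which is the advantage the text describes.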
The image acquisition unit may acquire images from an imaging device inside the terminal device or from an imaging device outside it. In the latter case, it may be connected to the external imaging device by wire or wirelessly; in the wireless case, the image information may be acquired over a connection such as WIFI or BLUETOOTH.
The image acquisition unit may also acquire audio. When acquiring a moving image, it may acquire audio together with the image; this has the advantage that, for example, the sounds of an accident, sudden braking, or a sharp curve can be captured and recorded. The image acquisition unit may also acquire audio by itself, separately from a moving image. As a superordinate concept covering images and audio, this application may use the term "images, etc."
2.1.2. Own vehicle information generation unit
The own vehicle information generation unit may have a function of acquiring information relating to the own vehicle.
The information relating to the own vehicle may include information relating to the license plate of the own vehicle and a message to be conveyed to surrounding vehicles, the front vehicle, and/or the rear vehicle (sometimes referred to in this application as a "transmission message").
The information relating to the license plate may be a captured image of the vehicle's license plate, information on the vehicle-identifying number generated from such a captured license plate image, or information on the vehicle-identifying number entered by the user or another person. In the entered case, the region name may be selected from choices given in advance. When the license plate information is generated from an image, the own vehicle information generation unit may have a function of analyzing the image and extracting the vehicle-identifying number.
One or more items of license plate information may be registered in the terminal device. Among the items of license plate information registered in advance, the information for one license plate may be selected and used. When the user may use a plurality of own vehicles, the user may select the own vehicle actually used, so that the license plate information for that vehicle is selected. This has the advantage that, even when the user may use a plurality of own vehicles, the license plate information can be used quickly without registering a new license plate on each use.
The transmission message may take various forms, as long as it can convey a message from the own vehicle.
For example, the transmission message may be a character string entered by the user before or at the time of boarding; such a string may be any characters the user enters.
The transmission message may also be a predetermined character string, in which case the user may select one of a plurality of predetermined strings. The transmission message may be, for example, a string indicating that the vehicle is driving slowly, that the driver is elderly, that the driver is a beginner, that a child is on board, or that the driver is in a hurry. These transmission messages may be stored in advance or entered on the terminal device.
The transmission message may also be held separately as a message to the front vehicle and a message to the rear vehicle.
A plurality of transmission messages may also be held, and one of them may be selected for use at a timing of the user's choosing. Such selection may be made by voice or by gesture. A message conveyed by gesture may be, for example, thanks, an apology, or yielding to the other party. Thanks may be expected, for example, for being let into a line of vehicles when changing lanes, or for being allowed to go first when turning right or left. An apology may be given in place of thanks in comparable situations. Yielding to the other party is expected to encourage the other vehicle to go first when turning right or left or changing lanes. These gestures may also be accompanied, before or after the message content, by a gesture or voice identifying the target vehicle, such as the front vehicle, rear vehicle, right-side vehicle, or left-side vehicle.
The user of a terminal device according to the example system, the driver of the vehicle into which the terminal device is brought, and the owner of that vehicle may each be different or the same. When the user of a terminal device according to the example system may use a plurality of vehicles, the system may have a function for the user to specify the vehicle to be boarded.
FIG. 4 is an example of a data structure that stores, in association with each other, license plate information for a vehicle the user may board and the message to use when the user boards that vehicle. With such a data structure, the user can select the vehicle actually boarded, and the license plate information corresponding to the selected vehicle can be used.
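The association of FIG. 4 might be modeled as follows. The field names, plate strings, and messages are hypothetical examples, not taken from the figure itself.

```python
# Hypothetical model of the FIG. 4 data: each registered vehicle maps
# to its license-plate information and the message used when boarding it.
registered_vehicles = {
    "family_car": {"plate": "Shinagawa 330 A 12-34", "message": "A child is on board"},
    "work_van":   {"plate": "Nerima 500 B 56-78",    "message": "Driving slowly"},
}

def select_vehicle(name):
    """Return the plate info and message for the vehicle the user selects."""
    entry = registered_vehicles[name]
    return entry["plate"], entry["message"]
```

Selecting a registered vehicle then yields both pieces of information at once, which is the convenience the text describes.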
The information relating to the own vehicle may also include information indicating the driving state of the own vehicle (sometimes referred to in this application as "own vehicle driving state information") and/or information indicating the position of the own vehicle ("own vehicle position information"). The own vehicle driving state information may include the own vehicle's sudden acceleration, sudden braking, sudden steering, speed, and/or acceleration.
Sudden acceleration, sudden braking, and/or sudden steering of the own vehicle may be determined using functions within the terminal device, for example by a sensor in the terminal device. More specifically, using an acceleration sensor, sudden acceleration, sudden braking, and/or sudden steering may be determined when the measured acceleration is higher or lower than a predetermined acceleration. The acceleration sensor may be able to measure acceleration in each of three dimensions. When the acceleration in the same direction as the vehicle's forward direction is higher than a predetermined acceleration, sudden acceleration of the own vehicle may be determined; when the acceleration in the direction opposite to the vehicle's forward direction is higher than a predetermined acceleration, sudden braking may be determined; and when the acceleration in the vehicle's lateral direction is higher than a predetermined acceleration, sudden steering may be determined.
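The threshold comparison above can be sketched as a simple classifier. The numeric thresholds are assumptions for illustration; the text only says each event is determined against "a predetermined acceleration."

```python
def classify_event(ax_forward, ay_lateral,
                   accel_limit=3.0, brake_limit=-3.0, steer_limit=2.5):
    """Classify one accelerometer sample (m/s^2) into a driving event.

    ax_forward: acceleration along the vehicle's forward axis
    ay_lateral: acceleration along the vehicle's sideways axis
    Thresholds are illustrative values, not taken from the source.
    """
    if abs(ay_lateral) > steer_limit:
        return "sudden_steering"       # large lateral acceleration
    if ax_forward > accel_limit:
        return "sudden_acceleration"   # large forward acceleration
    if ax_forward < brake_limit:
        return "sudden_braking"        # large deceleration
    return None                        # normal driving
```

A real implementation would also need to map the sensor's device axes onto the vehicle's forward and lateral axes, which this sketch assumes has already been done.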
The speed and acceleration of the own vehicle may also be obtained by measuring acceleration with a sensor in the terminal device and using that acceleration to derive the speed.
 The own-vehicle driving status information may also include information relating to the distance between the own vehicle and the vehicle ahead (sometimes referred to as "distance information" in the present application). The own vehicle information generation unit may generate the distance information using information relating to an image generated by the image information generation unit described later. Such distance information may include an estimate of the distance between the own vehicle and the vehicle ahead, information corresponding to that distance, and/or information obtained by using these to determine that the vehicle ahead has been approached. The determination of approach may, for example, use the size of the vehicle ahead as seen from behind in the image as information corresponding to the distance between the two vehicles, and judge that the vehicles are close when that size exceeds a predetermined size. The size of the vehicle seen from behind may be its width or its height. Since vehicle height varies with vehicle type, the presence or absence of approach may be determined using a height threshold that depends on the vehicle type identified in the image: when the height of the vehicle ahead obtained from the image exceeds the predetermined height set for that vehicle type, approach may be determined. Vehicle width, on the other hand, does not differ greatly between vehicle types, which has the advantage that no vehicle-type-dependent processing is required; approach may be determined when the width of the vehicle ahead obtained from the image exceeds a predetermined width.
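 The width-based approach judgment described above can be sketched as follows. This is a minimal illustration under a pinhole-camera assumption; all function names, the assumed real vehicle width, the focal length, and the distance threshold are hypothetical values introduced for illustration and are not taken from the present disclosure.

```python
ASSUMED_REAL_WIDTH_M = 1.7    # typical passenger-car width (assumption)
FOCAL_LENGTH_PX = 1000.0      # camera focal length in pixels (assumption)
APPROACH_THRESHOLD_M = 10.0   # distance below which "approaching" is flagged

def estimate_distance_m(apparent_width_px: float) -> float:
    """Pinhole-camera estimate: distance = focal_length * real_width / pixel_width."""
    return FOCAL_LENGTH_PX * ASSUMED_REAL_WIDTH_M / apparent_width_px

def is_approaching(apparent_width_px: float) -> bool:
    """Flag approach when the estimated distance falls below the threshold,
    i.e. when the vehicle's apparent width in the image exceeds a
    predetermined width, as described in the text."""
    return estimate_distance_m(apparent_width_px) < APPROACH_THRESHOLD_M
```

Because apparent width is inversely proportional to distance, comparing the pixel width against a fixed threshold is equivalent to comparing the estimated distance against a fixed distance, which is why no vehicle-type lookup is needed in the width-based variant.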
 The own vehicle information generation unit may also provide a plurality of ranks for each of the degrees of sudden acceleration, sudden braking, sudden steering, and approach, one of which is selected. For example, with three ranks (low, medium, and high), a threshold may be set for each, and the rank may be determined by comparing the obtained value against those thresholds. The number of ranks is not limited to three and may be any number.
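 The threshold-based rank selection just described amounts to a simple comparison chain. In the sketch below, the function name, the rank labels, and the threshold values are illustrative assumptions only; the disclosure does not specify concrete thresholds.

```python
def rank_of(value: float, thresholds=(0.3, 0.6)) -> str:
    """Map a measured degree (e.g. braking severity) onto one of three
    ranks by comparing it against per-rank thresholds. The thresholds
    here are placeholders, not values from the text."""
    low_max, mid_max = thresholds
    if value < low_max:
        return "low"
    if value < mid_max:
        return "medium"
    return "high"
```

Extending to an arbitrary number of ranks, as the text permits, only requires a longer sorted threshold tuple and a matching list of labels.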
 The own vehicle information generation unit may also store the sudden acceleration, sudden braking, sudden steering, and/or approach information of the own vehicle in association with the situation in which each was determined, for example the time and/or place at which it occurred.
 The own vehicle information generation unit may generate statistical information on the driving status of the own vehicle. The statistical information may be, for example, the number of occurrences of sudden acceleration, sudden braking, sudden steering, and/or approach per predetermined item. The predetermined item may be a predetermined unit of time, one trip, one day of driving, a predetermined period, and so on. One trip may be defined, for example, as the interval from engine start to engine stop, as driving separated from other driving by at least a predetermined period, as the interval from start to stop of software according to one example of the system, or as the interval from the start to the end of video recording.
 As statistical information, the above counts may also be generated in time series. A graph may also be generated for each of sudden acceleration, sudden braking, sudden steering, and/or approach, with time on the horizontal axis and the number of occurrences on the vertical axis, and used as a drive report.
 FIG. 5 is an example of statistical information data on the driving status of the own vehicle, in which statistics are recorded per day of driving as the predetermined item. For IDs 001 to 004, total counts are stored as the statistics for four days of driving.
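 One way to build the kind of per-day totals shown in FIG. 5 is a simple aggregation over an event log. The event log contents, field names, and date strings below are hypothetical examples, not data from the disclosure.

```python
from collections import Counter

# Hypothetical event log: (date, event_type) pairs as the own vehicle
# information generation unit might record them.
events = [
    ("2019-11-01", "sudden_braking"),
    ("2019-11-01", "sudden_acceleration"),
    ("2019-11-01", "sudden_braking"),
    ("2019-11-02", "approach"),
]

def daily_statistics(event_log):
    """Total each event type per day, in the style of the FIG. 5 record."""
    stats = {}
    for day, kind in event_log:
        stats.setdefault(day, Counter())[kind] += 1
    return stats
```

The same aggregation keyed by trip ID or by a fixed time window would cover the other "predetermined items" the text mentions.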
 The own vehicle position information may be acquired from a GPS inside the terminal device or from a GPS outside the terminal device. In the latter case, the terminal device may be connected to the external device by wire or wirelessly, and in the wireless case the position information may be acquired via a connection method such as WiFi or Bluetooth.
 When the image information generation unit described later generates information that the own vehicle is not moving and information that the traffic light governing the own vehicle is green, the own vehicle information generation unit may generate information prompting departure, since this indicates that the own vehicle has not departed despite the green light. Likewise, when information that the own vehicle is not moving is generated together with distance information indicating that the vehicle ahead is farther away than a predetermined distance, information prompting departure may be generated, since this indicates that the own vehicle has not departed even though the vehicle ahead has. In the latter case, when a traffic light governing the own vehicle is detected in the image, the condition that it is green may be added.
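 The two departure-prompt conditions above combine into a small predicate. The function name, argument names, and the gap threshold below are illustrative assumptions; the disclosure specifies the conditions qualitatively but not their numeric values.

```python
def should_prompt_departure(moving, light_is_green, front_gap_m,
                            gap_threshold_m=8.0):
    """Prompt departure when the own vehicle is stationary and either
    (a) the traffic light governing it is green, or (b) the gap to the
    vehicle ahead has opened beyond a predetermined distance.
    front_gap_m may be None when no vehicle ahead is tracked."""
    if moving:
        return False
    if light_is_green:
        return True
    return front_gap_m is not None and front_gap_m > gap_threshold_m
```

The optional extra condition in the text (requiring a detected green light even in case (b)) would simply be an additional `and light_is_green` on the last branch.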
 Further, when the image information generation unit described later generates information that the own vehicle is moving and information that a stop sign or stop marking is present, the own vehicle information generation unit may determine a stop violation and generate information indicating the stop violation.
 Further, when the image information generation unit described later generates information that the own vehicle is moving and information indicating continuation of a red light, the own vehicle information generation unit may determine that the red light was ignored and generate information indicating a red-light violation.
 The own vehicle information generation unit may also use the information indicating a stop violation and the information indicating a red-light violation to include their total and/or average counts in the statistical information described above. The stop-violation information and/or red-light-violation information may include, in association, the position and/or time at which each occurred.
2.1.3. Image information generation unit
 The image information generation unit has a function of generating information relating to an image by using the image. The information relating to the image may be information based on various objects in the image (sometimes referred to as "targets" in the present application) and various situations, with no restriction on type or range; examples include the information described below. The image information generation unit may identify the target for which information is to be generated within the image before generating the information. It may also generate the information relating to the image by using the image together with other information, such as information relating to at least part of the image acquisition unit and of the own vehicle information generation unit. Information relating to the image acquisition unit may be, for example, the image itself or information obtained at the time of its acquisition; information relating to the own vehicle information generation unit may be, for example, the own vehicle position information; neither is limited to these examples.
 The information relating to the image may be information relating to one or more other vehicles in the image. The information relating to another vehicle may be information identifying the other vehicle (sometimes referred to as "vehicle identification information" in the present application), information indicating the driving status of the other vehicle (sometimes referred to as "other-vehicle driving status information" in the present application), and so on.
 The vehicle identification information may be information relating to a vehicle in the image, and may include information relating to the vehicle's license plate and/or information such as the vehicle type, color, and options. The license plate information may be the same as the license plate information relating to the user vehicle described above.
 The other-vehicle driving status information may be any information indicating the driving status of another vehicle, for example sudden acceleration, sudden braking, and/or sudden steering of that vehicle.
 Sudden acceleration of a vehicle in the image may be determined when the size of the vehicle in the image decreases at a rate equal to or greater than a predetermined ratio within a predetermined period. The determination may also use the acceleration of the own vehicle: when the forward acceleration of the own vehicle is within a predetermined range and the size of the vehicle in the image decreases at a rate equal to or greater than the predetermined ratio within the predetermined period, sudden acceleration may be determined. This has the advantage of improving determination accuracy.
 Sudden braking of a vehicle in the image may be determined when the size of the vehicle in the image increases at a rate equal to or greater than a predetermined ratio within a predetermined period. The determination may also use the acceleration of the own vehicle: when the rearward acceleration of the own vehicle is within a predetermined range and the size of the vehicle in the image increases at a rate equal to or greater than the predetermined ratio within the predetermined period, sudden braking may be determined. This has the advantage of improving determination accuracy.
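 The two-cue sudden-braking determination above can be sketched as a conjunction of an image-size test and an accelerometer test. All numeric values (the growth ratio and the acceleration range) and the function signature are illustrative assumptions, not parameters from the disclosure.

```python
def detect_sudden_braking(width_t0_px, width_t1_px, rearward_accel_mps2,
                          ratio_threshold=1.2, accel_range=(2.0, 15.0)):
    """Flag sudden braking of the vehicle ahead when its apparent width
    grows by at least ratio_threshold within the sampling period, while
    the own vehicle's rearward (deceleration) acceleration lies within a
    predetermined range, as described in the text."""
    grew = width_t1_px / width_t0_px >= ratio_threshold
    decelerating = accel_range[0] <= rearward_accel_mps2 <= accel_range[1]
    return grew and decelerating
```

Swapping "increases" for "decreases" and rearward for forward acceleration yields the sudden-acceleration determination of the preceding paragraph; the structure is otherwise identical.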
 Sudden steering of a vehicle in the image may be determined when the lateral area of the vehicle in the image increases at a rate equal to or greater than a predetermined ratio, when the lateral shape of the vehicle in the image approaches the shape of the vehicle seen directly from the side at a rate equal to or greater than a predetermined ratio, or from any combination of these. In combination with these, sudden steering may also be determined when the size of the vehicle in the image decreases at a rate equal to or greater than a predetermined ratio within a predetermined period. The size of the vehicle may be the length of its width.
 The other-vehicle driving status information may be stored in association with the date and time at which the corresponding image was captured, which has the advantage of clarifying the date and time to which the information relates. It may likewise be stored in association with the position at which the corresponding image was captured, which has the advantage of clarifying the position to which the information relates.
 When a plurality of vehicles are identified in one image, the other-vehicle driving status information for each vehicle may be stored in association with the vehicle identification information for that vehicle. This has the advantage that the driving status information for each vehicle in the image is organized and accessible.
 FIG. 6 is an example of information generated from a plurality of images, showing a data structure in which a vehicle ID, other-vehicle driving status information, a date and time, and a position are stored in association with an image ID. The vehicle ID may be associated with vehicle identification information in another data structure. The vehicle ID may be assigned sequentially to vehicles identified in images, or the corresponding vehicle ID may be obtained by matching against vehicle identification information to which a vehicle ID was assigned in the past. The storage unit may store sets of the vehicle identification information detected in images previously captured by the imaging device of the terminal device and the vehicle ID assigned to each such vehicle; these sets may be stored for a predetermined period and/or in a ring buffer. Use of a vehicle ID is not essential, and the vehicle identification information may be used directly instead. Although in FIG. 6 each item of driving status information is associated with a different image ID, when other-vehicle driving status information is generated for the same moment, a plurality of vehicles and their corresponding driving status information may be associated with a single image. Such statistical information relating to vehicles in images is sometimes referred to in the present application as "in-image vehicle statistical information".
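 One row of the FIG. 6 style association can be modeled as a small record type. The field names and example values below are assumptions introduced for illustration; the disclosure fixes only which items are associated, not their representation.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DrivingStatusRecord:
    """One row of the FIG. 6 style data structure: a driving-status
    observation for one vehicle in one captured image."""
    image_id: str
    vehicle_id: str                 # may instead be vehicle identification info directly
    status: str                     # e.g. "sudden_braking"
    captured_at: str                # date and time of image capture
    position: Tuple[float, float]   # (latitude, longitude) of image capture
```

Because the text allows several vehicles per image, a practical store would key a list of such records by `image_id`, and a separate table would map `vehicle_id` to the vehicle identification information.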
<Information on the movement of the own vehicle>
 The image information generation unit may also generate information on the movement of the own vehicle using time-series images. For example, the presence or absence of movement of the own vehicle may be determined from the movement of corresponding traffic lights, signs, and/or scenery across adjacent frames of a moving image. The information on the movement of the own vehicle may include information on the presence or absence of such movement. This information may be used to determine whether the own vehicle observed a stop, whether it departed when the traffic light governing it turned green, and whether it departed when the vehicle ahead was moving and the traffic light governing the own vehicle was green.
<Information indicating continuation of a red light>
 When a first image contains a traffic light governing the own vehicle showing red, the image information generation unit may also determine whether, in the frames following that first image in time series, the red light continues to be displayed and then leaves the image without turning green. When such a determination is made, information indicating continuation of the red light may be generated.
 The traffic light governing the own vehicle may be made identifiable through machine learning trained on images containing traffic lights.
<Stop instruction information>
 The image information generation unit may also detect a stop sign or stop marking among the road signs or road markings in the image and, when one is present, generate stop instruction information.
 The information relating to the image may also be information relating to various targets detected in the image.
<Information relating to wiper operation>
 The information relating to a target may include information relating to the operation of wipers, based on the wiper operation of the own vehicle or of another vehicle. The information relating to wiper operation may include information on the operating status of the wipers, such as the presence or absence of wiper movement and/or the speed of wiper movement. The speed of wiper movement may be information identifying one of a plurality of ranks, for example intermittent, slow, medium, and fast, though the ranks are not limited to these.
 The information relating to wiper operation may also include the wiper operating status associated with the position at which the image from which that status was obtained was captured, and/or associated with the time at which that image was captured.
 FIG. 7 shows an example of information relating to wiper operation. The figure is an example of data in which the presence or absence of wiper movement and the rank of the movement are held separately; alternatively, one of the movement ranks may be "no movement", dispensing with the separate presence/absence field, or data may be acquired only when the wipers are moving, with no record when they are not.
<Information relating to roads>
 The information relating to a target may include information relating to roads, which may include information relating to road conditions. The information relating to road conditions may include information relating to road signs, road markings, traffic lights, and/or the roadway.
 The information relating to road signs may include the presence or absence of a sign, the content of the sign, and/or an abnormality of the sign. An abnormality of a sign may include the presence of something obstructing the sign (for example, a state in which at least part of the sign is hidden by a tree) and/or an abnormality of the sign itself (for example, a state in which at least part of the sign is damaged).
 The information relating to road markings may include the presence or absence of a marking, the content of the marking, and/or an abnormality of the marking. An abnormality of a marking may include the presence of something obstructing the marking and/or an abnormality of the marking itself.
 The information relating to traffic lights may include information that a traffic light has been identified (information that a traffic light is present) and/or information on an abnormality of the traffic light. An abnormality of a traffic light may include an abnormality in its visibility (for example, information that part of the traffic light is hidden by a tree or the like) and/or a failure of the traffic light (for example, part of the traffic light being damaged).
 The information relating to the roadway may be an abnormality of the roadway and/or the lane width, among others. In the present application, a lane may be the strip-shaped portion of a roadway (excluding service roads) provided so that a single file of motor vehicles can pass safely and smoothly, and, where the roadway is divided by lines, the portion through which motor vehicles and the like can pass. A roadway abnormality may be an abnormality of a lane boundary line, a foreign object on the roadway, and/or destruction of the roadway. An abnormality of a lane boundary line may be that part or all of the line is missing. A foreign object on the roadway may be, for example, a fallen object, a fallen tree, or a fallen utility pole. Destruction of the roadway may be an abnormality in the shape of the roadway, such as a collapse. The information relating to the roadway may also include information on on-street parking.
 The information relating to roads may also include the road condition information associated with the position at which the image from which that information was obtained was captured, and/or associated with the time at which that image was captured.
 FIG. 8 shows an example of information relating to roads. Depending on the target, such information may be acquired whenever the target is detected in an image, or only when the detected information satisfies a predetermined condition, for example only for predetermined targets or only when an abnormality is detected.
 Information relating to disasters may also be generated as information relating to roads, and may include information on foreign objects on the roadway and/or destruction of the roadway. In this case, the information relating to roads may include the disaster information associated with the position at which the image from which it was obtained was captured, and/or associated with the time at which that image was captured. FIG. 9 is an example in which the information relating to roads includes information relating to a disaster.
 When transmitted to a system relating to autonomous driving, this road information may be used to judge whether autonomous driving is possible and with what priority.
<Information relating to sidewalks>
 The information relating to a target may include information relating to sidewalks, which may include information on the state of people on the sidewalk, such as what those people are wearing. What a person wears may include an umbrella (in the present application, when the target is an umbrella, the term "wear" includes holding it up), an overcoat, and short or long sleeves.
 The information relating to sidewalks may also include the information on the state of people on the sidewalk associated with the position at which the image from which that information was obtained was captured, and/or associated with the time at which that image was captured.
 FIG. 10 shows an example of information relating to sidewalks. Depending on the target, such information may be acquired whenever it is detected in an image, or only when the detected information satisfies a predetermined condition, for example only when the state of a person matches a predetermined state, only when at least a predetermined number of people are detected in one image, or only for at most a predetermined number of the people in one image.
 When transmitted to a system relating to autonomous driving, this sidewalk information may be used to judge whether autonomous driving is possible and with what priority.
<Information relating to the degree of crowding>
 The information relating to a target may include information relating to how crowded an area is with people, which may include the number of people within a predetermined region. The crowding information may include that number of people associated with the position at which the image used to determine it was captured, or with information indicating the predetermined region, and may include that number associated with the time at which the image was captured. The predetermined region may be defined in advance, for example using map information and position information, or defined at the time of video capture, for example as a region defined by a predetermined distance from coordinate values such as GPS, or by elapsed capture time.
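 Counting people within a predetermined region, as described above, reduces to a membership test over detected positions. The rectangular region representation and the sample coordinates below are assumptions for illustration; actual positions would come from a person detector combined with the GPS or map-based region definition the text mentions.

```python
def count_people_in_region(detections, region):
    """Count person detections whose (x, y) position lies inside a
    predetermined rectangular region (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = region
    return sum(1 for (x, y) in detections
               if xmin <= x <= xmax and ymin <= y <= ymax)
```

Associating the returned count with the capture position or capture time, as the text specifies, is then a matter of storing it alongside those fields.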
 これらの人の混雑具合に係る情報は、自動運転に係るシステムに送信された場合、自動運転の可否や優先度の判断に使用されてよい。 Information on the degree of congestion of these people may be used to determine the propriety and priority of autonomous driving when it is transmitted to the system related to autonomous driving.
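As a rough illustration only — the specification fixes no data formats, and every name, coordinate, and bucketing scheme below is invented — the per-area association of person counts with position and time described above might be sketched as:

```python
from collections import defaultdict

def congestion_by_area(detections, area_of):
    """Group per-image person counts by predetermined area.

    detections: iterable of (person_count, gps, captured_at) tuples.
    area_of:    function mapping a GPS coordinate to an area identifier
                (e.g. a grid cell derived from map or position information).
    Returns {area_id: [(person_count, captured_at), ...]}.
    """
    by_area = defaultdict(list)
    for count, gps, captured_at in detections:
        by_area[area_of(gps)].append((count, captured_at))
    return dict(by_area)

# Hypothetical area definition: bucket GPS coordinates into a coarse grid.
def grid_cell(gps, cell=0.01):
    lat, lon = gps
    return (round(lat / cell), round(lon / cell))

detections = [
    (12, (35.6812, 139.7671), "10:00"),
    (3,  (35.6815, 139.7668), "10:01"),  # falls in the same grid cell
    (7,  (35.9000, 139.9000), "10:02"),
]
areas = congestion_by_area(detections, grid_cell)
```

A downstream autonomous-driving system could then apply whatever person-count threshold it chooses to each area's entries.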
<Information on roadway events>
The information related to the target may include information on events on the roadway. Information on roadway events may be information on construction and/or accidents. Information on construction may include the presence or absence of construction and/or the scheduled completion date of the construction. The presence or absence of construction may be determined by detecting, in the image, information such as a stopped construction vehicle, a temporary traffic light, or traffic being directed by construction personnel. Information on an accident may include the presence or absence of an accident and the scale of the accident. The presence or absence of an accident may be determined from the presence or absence of a damaged vehicle and/or of police personnel. Construction personnel and police personnel may be detected from the clothing and equipment that characterize them.
Information on roadway events may also associate the information on construction and/or accidents with the position information of the place where the image from which that information was acquired was captured.
Information on roadway events may also associate the information on construction and/or accidents with the time at which the image from which that information was acquired was captured.
FIG. 11 shows an example of information on roadway events.
When such information on roadway events is transmitted to a system related to autonomous driving, it may be used to determine whether autonomous driving is permitted and its priority.
<Information related to advertisements>
The information related to the target may be information related to advertisements, which may include advertisement status information. The advertisement status information may include the specific brand of the advertisement, its advertiser, the field to which it belongs, its size, and/or the presence or absence of an abnormality in the advertisement. An abnormality in the advertisement may include a state in which the advertisement is hidden by an object such as a tree and/or damage to the advertisement itself. The advertisement status information may be acquired based on the image.
Information related to advertisements may also associate the advertisement status information with the position information of the place where the image from which it was acquired was captured.
Information related to advertisements may also associate the advertisement status information with the time at which that image was captured.
FIG. 12 shows an example of information related to advertisements. As shown in this figure, the size of the advertisement may be expressed by selecting one of a plurality of predetermined size ranks.
<Information on fuel for vehicles>
The information related to the target may be information on fuel for vehicles, which may include information on the status of fuel for vehicles. The status information may include the types of fuel for vehicles and the price corresponding to each type. The types of fuel may include high-octane gasoline, regular gasoline, diesel, and/or kerosene. Information on the status of fuel for vehicles may be acquired from the appearance, in the image, of a place that provides fuel for vehicles, such as a service station, including gas stations and refueling stations.
Information on fuel for vehicles may also associate the status information with the position information of the place where the image from which it was acquired was captured.
Information on fuel for vehicles may also associate the status information with the time at which that image was captured.
FIG. 13 shows an example of information on fuel for vehicles.
When such information on fuel for vehicles is transmitted to a system related to autonomous driving, it may be used to determine whether autonomous driving is permitted and its priority.
The image information acquisition unit may have a machine-learned identification function, which may include a function capable of identifying predetermined targets. Using this function, the image information acquisition unit may acquire the information on the various targets described above from the image. The machine-learned identification function may be stored in the information processing device within the terminal device. Because the targets to be identified are limited in advance, high identification performance can be achieved even on a simple information processing device such as that within a terminal device.
Various artificial intelligence techniques may be used for the machine learning, for example neural networks, genetic programming, functional logic programming, support vector machines, clustering, regression, classification, Bayesian networks, reinforcement learning, representation learning, decision trees, or k-means clustering. The following uses a neural network as an example, but the invention is not necessarily limited to neural networks.
The machine learning technique using a neural network may use deep learning, a technique that learns the relationship between inputs and outputs through a plurality of layers so that an output corresponding to an input can be generated even for unknown inputs. Both supervised and unsupervised methods exist, and either may be applied. In a supervised method, learning images are associated with attribute information related to those images, and machine learning is performed using these pairs as training data. The machine learning function of the image information acquisition unit may, for example, learn the relationship between information related to at least part of the image information acquisition unit, the information acquisition unit, the own-vehicle information generation unit, and the image information generation unit, on the one hand, and images that a person would judge to show that information, on the other. Through such machine learning, the function may identify, from an image, information related to at least part of the image information acquisition unit, the own-vehicle information generation unit, and the image information generation unit, and generate the corresponding information.
The learning algorithm itself used with the deep learning technique may be a known one. The programs used for deep learning may be open-source programs or appropriately modified versions of them.
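The supervised scheme described above — pairs of learning images and attribute information, from which an identification function is learned — can be illustrated without fixing any network architecture by a deliberately simple stand-in model. The nearest-centroid classifier below is not the deep learning method of the specification; it is only a minimal sketch of "train on labeled feature vectors, then identify unknown inputs", and all feature values and labels are invented:

```python
def train(samples):
    """samples: list of (feature_vector, label) pairs, playing the role of
    (learning image, attribute information). Returns one centroid per label."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [s / counts[lbl] for s in acc] for lbl, acc in sums.items()}

def identify(centroids, vec):
    """Return the label whose centroid is closest to vec (the 'identification')."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist2(centroids[lbl], vec))

# Invented training data: 2-D vectors standing in for image features.
training = [
    ([0.9, 0.1], "sign"), ([0.8, 0.2], "sign"),
    ([0.1, 0.9], "advertisement"), ([0.2, 0.8], "advertisement"),
]
model = train(training)
```

In the actual system the `model` would be the downloaded network and parameters, limited to the predetermined target classes.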
The machine-learned function may also have an update function, that is, a function by which the identifiable targets can be updated. For example, it may be updated so that information related to vehicles, information related to roads, information related to advertisements, and so on can be identified and the corresponding information generated. Such an update may take the form of downloading and installing a program or its parameters.
The information related to the target may be stored in association with information on the time at which each piece of information was acquired and/or information on the place. The time-related information may be a month, day, week, time of day, and so on. The place-related information may be an administrative district, GPS coordinates, and so on, and may be acquired by the GPS of the terminal device that captures the image.
The machine-learned identification function may also make use of position information. For example, when a specific advertisement is likely to be at a specific position, the identification function may give greater weight to identifying that advertisement in the vicinity of that position.
2.1.4. Storage unit
The storage unit has a function of storing information. For example, the storage unit may store the information related to the image acquisition unit, the own-vehicle information generation unit, and the image information generation unit described above. The storage unit may also have a function of storing a plurality of pieces of information in association with one another. Besides the associations described above, it may, for example, store an image or the like in association with some or all of the time information, the own-vehicle driving status information, the own-vehicle position information, the vehicle identification information, and the driving status information.
The storage unit may store information in a ring buffer, and may use a separate ring buffer for each type of information. In this case, an upper limit is defined for the ring buffer of each type, and when information exceeding the upper limit is to be stored, the oldest information may be deleted first. The types of information may be, for example, images and the like, own-vehicle driving status information, own-vehicle position information, vehicle identification information, and other-vehicle driving status information.
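The per-type ring-buffer behavior just described can be sketched as follows (the type names and capacities are illustrative only, not taken from the specification):

```python
from collections import deque

# One ring buffer per type of information, each with its own upper limit.
CAPACITY = {"image": 3, "own_vehicle_status": 5}
buffers = {kind: deque(maxlen=cap) for kind, cap in CAPACITY.items()}

def store(kind, item):
    """Append item; once the buffer is at its limit, the oldest entry is dropped."""
    buffers[kind].append(item)

for n in range(5):            # store five images into a 3-slot buffer
    store("image", f"img{n}")  # img0 and img1 are silently discarded
```

`deque(maxlen=...)` implements exactly the "delete from the oldest information" rule: appending to a full deque discards its head.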
2.1.5. Output unit
The output unit may have a function of producing sound or a display. For example, it may display information related to the own vehicle and/or information related to the image.
The information related to the own vehicle may include the own-vehicle driving status information and/or the distance information, which has the advantage of making information about the driving of the own vehicle available. The information related to the image may include the vehicle identification information and/or the other-vehicle driving status information, which has the advantage of making information about other vehicles available. As detailed elsewhere, the information related to the image may include various other information, which may likewise be displayed.
The output unit may also display information acquired by the information communication unit, for example information related to the other-vehicle statistical information, or information processed using the other-vehicle statistical information. This has the advantage that the viewer can understand information about other vehicles; in particular, when the driving behavior of a vehicle ahead differs from the usual, the driver can take extra care.
The output unit may also have a function of emitting sound to notify the user of the terminal device. For example, when information prompting the own vehicle to depart is generated, the output unit may emit a sound prompting the driver of the own vehicle to depart.
The output unit may also output information related to at least part of the image information acquisition unit, the own-vehicle information generation unit, the image information generation unit, the storage unit, and the information communication unit.
2.1.6. Information communication unit
The information communication unit has a function of communicating information. It may communicate, and in particular transmit, information related to at least part of the image information acquisition unit, the own-vehicle information generation unit, the image information generation unit, the storage unit, and the output unit.
The communication function may be executed at an arbitrary timing, or at a timing when a specific condition is satisfied. The specific condition may be, for example, that a Wi-Fi function can be used, in which case the communication cost for the user can be reduced.
The information communication unit may also have a function of receiving information. For example, it may acquire the other-vehicle statistical information described below from the management system.
2.2. Functions of the management system
2.2.1. Statistical processing unit
The statistical processing unit has a statistical processing function. By statistical processing, it may generate at least part of the following: other-vehicle statistical information, congestion information, weather information, road information, road abnormality information, accident information, information on the provision of fuel for vehicles, advertisement statistical information, people information, and a 3D map.
<Other-vehicle statistical information>
Using the vehicle identification information and the other-vehicle driving status information obtained from one or more terminal devices, the statistical processing unit may generate statistical information on other vehicles (also referred to in this application as "other-vehicle statistical information"). The statistical information may include, for one vehicle, the total number of sudden steering maneuvers, sudden braking events, and/or sudden accelerations, or the corresponding average values. Such totals and averages may be restricted to vehicles imaged during a predetermined period.
For example, for the number of sudden steering maneuvers, the other-vehicle statistical information may be generated as follows: for one specific vehicle obtained from the vehicle identification information, the same vehicle is identified within the vehicle identification information obtained from the one or more terminal devices, and the presence or number of sudden steering maneuvers is taken from the corresponding other-vehicle driving status information; applying this to all vehicle identification information and other-vehicle driving status information obtained from the one or more terminal devices over a predetermined range yields the total number of sudden steering maneuvers. The totals for sudden braking and sudden acceleration may be generated in the same way, as may the corresponding averages over a predetermined period, a predetermined area, and so on.
The statistical processing unit may also generate rankings of these totals or averages. A ranking may cover vehicles imaged during a predetermined period, vehicles imaged in a predetermined area, or vehicles traveling on a predetermined type of road or under a predetermined road condition. The predetermined type of road may be, for example, a road in a residential area, a main road, an expressway, or another type. The predetermined road condition may depend, for example, on vehicle speed, covering vehicles traveling at speeds within a specified range, such as in congestion, at low speed, or at high speed. The ranking may list a predetermined number of entries in descending or ascending order of the total or average value.
FIG. 14 shows an example of other-vehicle statistical information.
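The per-vehicle aggregation and ranking just described might be sketched as follows (the record layout, vehicle identifiers, and the combined-count ranking key are all hypothetical):

```python
from collections import defaultdict

def vehicle_totals(records):
    """records: (vehicle_id, sudden_steering, sudden_braking, sudden_accel)
    tuples gathered from one or more terminal devices, one per sighting.
    Returns per-vehicle totals [steering, braking, acceleration]."""
    totals = defaultdict(lambda: [0, 0, 0])
    for vid, steer, brake, accel in records:
        t = totals[vid]
        t[0] += steer
        t[1] += brake
        t[2] += accel
    return dict(totals)

def ranking(totals, top_n=2, descending=True):
    """Rank vehicles by their combined event count (an illustrative choice)."""
    return sorted(totals, key=lambda v: sum(totals[v]), reverse=descending)[:top_n]

records = [
    ("car-A", 2, 1, 0), ("car-B", 0, 0, 1),
    ("car-A", 1, 0, 0), ("car-C", 0, 2, 3),
]
totals = vehicle_totals(records)
```

Restricting `records` to a predetermined period, area, or road type before aggregation yields the corresponding restricted totals and rankings.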
<Congestion information>
The statistical processing unit may generate congestion information, which may be any data related to vehicle congestion. Congestion information may include, for example, information indicating the existence of congestion associated with a specific area. Such information may be generated, for example, when one or more vehicles whose position information lies within the specific area satisfy a predetermined condition. The predetermined condition may be, for example, that the number of vehicles determined to be in congestion exceeds a predetermined number, or that the number of vehicles determined to be in congestion is at least a predetermined multiple of the number of vehicles not determined to be in congestion.
A vehicle may be determined to be in congestion when its speed and/or the distance information described above satisfies a predetermined condition.
Regarding vehicle speed, a vehicle may be determined to be in congestion, for example, when its speed is lower than a predetermined speed, or when its average speed over a predetermined time or distance is lower than a predetermined speed, since in congestion the average speed of a vehicle falls below such a threshold. The speed and/or acceleration in the own-vehicle driving status information obtained from the terminal device may be used as the vehicle speed.
Regarding the distance information, a vehicle may be determined to be in congestion, for example, when the distance information is smaller than a predetermined distance, or when the average of the distance information over a predetermined time or distance is at or below a predetermined value. When the vehicle in the image appears large, the distance between the vehicle ahead and the own vehicle is short, which is information that should lead to a determination of congestion.
The vehicle speed and/or distance information used for the congestion information may be information associated with times within a predetermined range. Such a time may be the time of measurement by the acceleration sensor in the terminal device, or the time at which the image acquired by the terminal device was captured. This has the advantage that congestion information can be collected from information whose times are identical or fall within a predetermined range including an identical time.
The congestion information may also be associated with the time of measurement by the acceleration sensor in the terminal device, the time at which the image acquired by the terminal device was captured, or a time range including these.
FIG. 15 shows an example of congestion information. Although this figure shows congestion information generated per area, information on the length of the congestion may also be generated and included in the congestion information. For example, when congestion is determined over areas or positions lying within a predetermined short distance of one another, the distance between the terminal devices that acquired the information used for that determination may be taken as at least the length of the congestion.
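A minimal sketch of the two-stage judgment above — per-vehicle congestion from speed and/or headway distance, then per-area congestion from vehicle counts — might look like this (every threshold and unit below is an invented example, not a value from the specification):

```python
def in_congestion(speeds_kmh, gaps_m, speed_limit=20.0, gap_limit=10.0):
    """Judge one vehicle to be in congestion when its average speed over the
    window falls below speed_limit and/or its average distance to the vehicle
    ahead falls below gap_limit."""
    slow = (sum(speeds_kmh) / len(speeds_kmh) < speed_limit) if speeds_kmh else False
    close = (sum(gaps_m) / len(gaps_m) < gap_limit) if gaps_m else False
    return slow or close

def area_congested(flags, min_count=3, multiple=2.0):
    """Declare congestion for an area when congested vehicles exceed min_count,
    or outnumber non-congested vehicles by the given multiple."""
    yes = sum(flags)
    no = len(flags) - yes
    return yes > min_count or yes >= multiple * max(no, 1)

flags = [in_congestion([12.0, 8.0], [6.0, 7.0]),   # slow and close together
         in_congestion([55.0, 60.0], [40.0]),       # free flow
         in_congestion([15.0], [12.0]),             # slow
         in_congestion([10.0], [5.0])]              # slow and close
```

In the actual system, `speeds_kmh` and `gaps_m` would come from the own-vehicle driving status information and the image-derived distance information, filtered to a common time window.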
<Weather information>
The statistical processing unit may generate weather information, which may include, for example, weather condition information associated with a specific area. Such weather condition information may be generated, for example, when information related to rain and/or information indicating temperature, acquired from images captured by one or more terminal devices whose position information lies within the specific area, satisfies a predetermined condition.
For the rain-related information, the predetermined condition may be, for example, that the number of pieces of information determined to indicate rain exceeds a predetermined number, or that the ratio of the number of pieces of information determined to indicate rain to the number not so determined exceeds a predetermined ratio, in which case the rain-related information may include information that it is raining.
Here, the determination of rain may be made using information related to the operation of the wipers. For example, if the wipers are moving, rain may be inferred; if they are not, rain need not be inferred. When the speed of the wiper movement is expressed as one of a plurality of ranks, information indicating the degree of rain may be generated from that rank.
The predetermined condition for the rain-related information may also be, for example, that the number of pieces of information determined to indicate a particular wiper-speed rank exceeds a predetermined number, or that the ratio of the number so determined to the number not so determined exceeds a predetermined ratio, in which case the rain-related information may include information indicating the corresponding degree of rain.
For the information indicating temperature, the predetermined condition may be based on the sidewalk-related information: when the number of detections including coats exceeds a predetermined number, or the ratio of detections including coats to those not including coats exceeds a predetermined ratio, the temperature information may include information that the weather calls for a coat; when the corresponding condition holds for long sleeves, that the weather calls for long sleeves; and when it holds for short sleeves, that the weather calls for short sleeves.
The weather information may also associate the weather condition information with information indicating the specific area used to generate it.
The weather information may also associate the weather condition information with the time at which the images from which the wiper-related and/or sidewalk-related information used to generate it was acquired were captured.
FIG. 16 shows an example of weather information.
<Road abnormality information>
 The statistical processing unit may generate road abnormality information. The road abnormality information may include information relating to a road abnormality state. The information relating to the road abnormality state may include, for example, an abnormality of a sign, an abnormality of a road marking, an abnormality of a traffic light, and/or an abnormality of the roadway.
 The sign abnormality, road marking abnormality, traffic light abnormality, and/or roadway abnormality may be generated using the corresponding abnormalities within the road-related information.
 The statistical processing unit may generate the road abnormality information from road-related information acquired from one terminal device, or from road-related information acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate the road abnormality information when, within a predetermined time range including the same time and for an area within a predetermined range including the same position information, the road-related information from a predetermined number or a predetermined proportion of terminal devices includes a sign abnormality, a road marking abnormality, a traffic light abnormality, and/or a roadway abnormality.
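The multi-terminal condition just described (same time window, same area cell, at least a predetermined number or proportion of terminals reporting the same kind of abnormality) might be sketched as below; the record fields, window/cell representation, and threshold values are assumptions for illustration only:

```python
# Hypothetical aggregation: a road abnormality record is produced only when
# enough terminals, within the same time window and area cell, report the same
# kind of abnormality. Field names and thresholds are assumed example values.

from collections import defaultdict

def aggregate_road_abnormalities(reports, min_terminals=3, min_proportion=0.5):
    """reports: dicts with 'terminal', 'time_window', 'area_cell', and
    'abnormality' ('sign', 'marking', 'signal', 'roadway', or None)."""
    groups = defaultdict(set)   # (window, cell, kind) -> terminals reporting it
    seen = defaultdict(set)     # (window, cell) -> all terminals present there
    for r in reports:
        key = (r['time_window'], r['area_cell'])
        seen[key].add(r['terminal'])
        if r['abnormality'] is not None:
            groups[key + (r['abnormality'],)].add(r['terminal'])
    result = []
    for (window, cell, kind), terminals in groups.items():
        total = len(seen[(window, cell)])
        if len(terminals) >= min_terminals or len(terminals) / total >= min_proportion:
            result.append({'time_window': window, 'area_cell': cell,
                           'abnormality': kind,
                           'reporting_terminals': len(terminals)})
    return result
```

A real system would additionally carry the position, time, and image associations described in the following paragraphs.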
 The road abnormality information may include the information relating to the road abnormality state in association with the position information at which the image of the abnormality was captured, or with an area within a predetermined range including that position information.
 The road abnormality information may also include the information relating to the road abnormality state in association with the time at which that image was captured, or with a time range within a predetermined range including that time.
 The road abnormality information may also include the information relating to the road abnormality state in association with the image of the abnormality.
 FIG. 17 shows an example of the road abnormality information.
<Road information>
 The statistical processing unit may generate road information. The road information may be generated using the road-related information. The statistical processing unit may generate the road information from road-related information acquired from one terminal device, or from road-related information acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate the road information when the same or similar road-related information is acquired from a predetermined number or a predetermined proportion of terminal devices within a predetermined time range including the same time and for an area within a predetermined range including the same position information. Here, similar information may be information that falls within the error range of information generated from images. Like the road abnormality information, the road information may be included in association with time information and with area or position information.
<Accident information>
 The statistical processing unit may generate accident information. The statistical processing unit may generate the accident information when the information on roadway events includes information relating to an accident. In such a case, the statistical processing unit may set the area in which the accident occurred using the position information associated with the information relating to the accident, and the accident information may include information on that area. The accident information may also include time information relating to the accident in association with that area.
 The accident information may be generated based on the information on roadway events, and the time information within the accident information may be generated using the time information relating to the roadway event. When time information relating to the roadway event is acquired from a plurality of terminal devices, the time information within the accident information may be the range from the earliest to the latest of the acquired times.
 The statistical processing unit may also generate, as accident information, information on a continuing roadway event whose duration is shorter than a predetermined period. This is because accidents on the road are generally cleared within a shorter period than construction. Therefore, even when it cannot be determined from the image alone whether an event is construction or an accident, accident information may be determined by focusing on the continuity of the information on the roadway event. From the viewpoint of improving the accuracy of the accident information, the system may instead be configured not to determine accident information based on such a period.
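The duration heuristic above, together with the earliest-to-latest time range from the preceding paragraph, could be sketched as follows; the 6-hour threshold is an assumed example value, not a figure from the application:

```python
# Hypothetical classifier: a continuing roadway event is labeled an accident
# when its observed duration is shorter than a threshold (accidents are
# typically cleared sooner than construction). The threshold is an assumption.

from datetime import datetime, timedelta

ACCIDENT_MAX_DURATION = timedelta(hours=6)  # assumed example threshold

def classify_roadway_event(first_seen, last_seen, max_duration=ACCIDENT_MAX_DURATION):
    return 'accident' if (last_seen - first_seen) < max_duration else 'construction'

def accident_time_range(observation_times):
    # With multiple reporting terminals, the accident's time information may be
    # the range from the earliest to the latest observation.
    return min(observation_times), max(observation_times)
```

As the text notes, an implementation aiming for higher precision might disable this duration-only rule.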
 The statistical processing unit may generate the accident information from information on roadway events acquired from one terminal device, or from information on roadway events acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate the accident information when, within a predetermined time range including the same time and for an area within a predetermined range including the same position information, the information on roadway events from a predetermined number or a predetermined proportion of terminal devices includes information relating to an accident.
 FIG. 18 shows an example of such accident information.
<Vehicle fuel provision information>
 The statistical processing unit may generate vehicle fuel provision information. The vehicle fuel provision information may include the type of vehicle fuel in association with the price corresponding to that type. The type of vehicle fuel and the price corresponding to that type may be generated using the information relating to vehicle fuel.
 The vehicle fuel provision information may also include, in association with the type and price, the corresponding position information. Such position information may be generated using the position information within the vehicle fuel information corresponding to that type and price.
 The vehicle fuel provision information may likewise include, in association with the type and price, the corresponding time information. Such time information may be generated using the time information within the vehicle fuel information corresponding to that type and price.
 The statistical processing unit may generate the vehicle fuel provision information from vehicle fuel information acquired from one terminal device, or from vehicle fuel information acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate the vehicle fuel provision information when, within a predetermined time range including the same time and for an area within a predetermined range including the same position information, the vehicle fuel information from a predetermined number or a predetermined proportion of terminal devices includes a vehicle fuel type and the corresponding price. The predetermined time range may be a predetermined period counting back from the current time, which has the advantage that the information can be kept up to date.
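The "predetermined period counting back from the current time" described above might look like this sketch; the record fields, the 24-hour window, and the minimum-report threshold are assumptions for illustration:

```python
# Hypothetical sketch: keep only fuel-price reports from the last `window`,
# then emit (area, type, price) entries supported by enough reports.
# Field names, the 24-hour window, and min_reports are assumed values.

from collections import Counter
from datetime import datetime, timedelta

def fuel_provision_info(reports, now, window=timedelta(hours=24), min_reports=2):
    """reports: dicts with 'time', 'area_cell', 'fuel_type', 'price'."""
    recent = [r for r in reports if now - r['time'] <= window]
    counts = Counter((r['area_cell'], r['fuel_type'], r['price']) for r in recent)
    return [{'area_cell': a, 'fuel_type': t, 'price': p, 'reports': n}
            for (a, t, p), n in counts.items() if n >= min_reports]
```

Restricting the window to recent reports is what keeps stale prices (e.g. from days earlier) out of the output.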
 FIG. 19 shows an example of the vehicle fuel provision information.
<Advertisement statistical information>
 The statistical processing unit may generate advertisement statistical information. The advertisement statistical information may include a count of advertisements and/or a ratio of advertisements. The count may be the number of advertisements satisfying a predetermined condition, such as advertisements of a specific brand, advertisements by a specific advertiser, advertisements in a specific field, advertisements of a predetermined size, advertisements larger than a predetermined size, advertisements smaller than a predetermined size, and/or advertisements obstructed by trees or buildings. The ratio may be the ratio of advertisements satisfying a second predetermined condition to advertisements satisfying a first predetermined condition; the first and second predetermined conditions may each be any of the above-mentioned conditions, and they may differ from each other. A specific brand or a specific advertiser may be identified by character recognition within the image. As for advertisements obstructed by trees or buildings, the view of the advertisement may be obstructed for at least a predetermined proportion of the time period during which the advertisement is recognizable in the image.
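The obstruction condition above (an advertisement counted as obstructed when its view is blocked for at least a predetermined proportion of the time it is recognizable in the image) could be sketched as below; the frame representation and the 0.3 proportion are assumptions for illustration:

```python
# Hypothetical sketch: per-frame flags record whether the ad was recognizable
# in the image and, if so, whether trees/buildings blocked it. The 0.3
# proportion is an assumed example threshold.

def is_obstructed_ad(frames, min_obstructed_proportion=0.3):
    """frames: list of (recognizable: bool, blocked: bool) per video frame."""
    blocked_flags = [blocked for recognizable, blocked in frames if recognizable]
    if not blocked_flags:
        return False
    return sum(blocked_flags) / len(blocked_flags) >= min_obstructed_proportion

def count_ads(ads, condition):
    # "Number of advertisements satisfying a predetermined condition."
    return sum(1 for ad in ads if condition(ad))
```

The same `count_ads` helper could be reused with any of the other predetermined conditions (brand, advertiser, size) mentioned in the text.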
 The advertisement statistical information may also include the advertisement count in association with information indicating a specific area that includes the position information at which the images used to generate that count were captured. The information indicating such an area may be generated, using the position information associated with the images, as a specific area including that position.
 The advertisement statistical information may also include the advertisement count in association with information indicating a specific time range that includes the times at which the images used to generate that count were captured. The information indicating such a time range may be generated, using the time information associated with the images, as a specific time range including those times.
 The statistical processing unit may generate the advertisement statistical information from advertisement-related information acquired from one terminal device, or from advertisement-related information acquired from a plurality of terminal devices. In the latter case, for advertisement-related information associated with the same position information or with position information within a predetermined range, the system may be configured to generate the advertisement statistical information only when information on the same advertisement is acquired from a predetermined number or a predetermined proportion or more of terminal devices, or it may be configured to generate the advertisement statistical information even when advertisement-related information is acquired from a single terminal device.
 The statistical processing unit may include, in the advertisement statistical information, the corresponding predetermined condition. This has the advantage that the conditions that the advertisement statistical information satisfies are retained as information.
 FIG. 20 shows an example of the advertisement statistical information.
<Person information>
 The statistical processing unit may generate person information. The person information may be generated based on information relating to the degree of congestion of people. The person information may include the number of people in association with information indicating the area in which those people are present, and may also include the number of people in association with time information including the time at which that area was imaged.
 The statistical processing unit may generate the person information from congestion information acquired from one terminal device, or from congestion information acquired from a plurality of terminal devices. In the latter case, for congestion information associated with times within a predetermined period and with locations within a predetermined area, the number of people described above may be generated using information from terminal devices satisfying a predetermined condition and/or the average of the people counts from the individual terminal devices.
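The averaging step above, combining per-terminal people counts for the same period and area, might be sketched as follows; the field names and the qualifying condition (here simply a non-negative count) are assumptions, since the application does not specify them:

```python
# Hypothetical sketch: person information for one (period, area) bucket is the
# mean of the people counts reported by qualifying terminals. The qualifying
# condition used here (count must be non-negative) is an assumed placeholder.

from collections import defaultdict

def person_info(reports):
    """reports: list of dicts with 'period', 'area', 'count'."""
    buckets = defaultdict(list)
    for r in reports:
        if r['count'] >= 0:  # assumed qualifying condition
            buckets[(r['period'], r['area'])].append(r['count'])
    return {key: sum(counts) / len(counts) for key, counts in buckets.items()}
```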
 FIG. 21 shows an example of the person information.
<3D map>
 The statistical processing unit may generate a 3D map. The 3D map may be a 3D digital representation of the scenery on both sides as seen from the roadway, and may be generated from images. The 3D map may be generated using one or more images captured by an imaging device that images the area ahead of and/or behind the terminal device; the means for generating a 3D map from one or more images may be any known means.
 The statistical processing unit may generate the 3D map from images acquired from one terminal device, or from images acquired from a plurality of terminal devices. In the latter case, images associated with a predetermined time range and with the same position information may be used to generate the 3D map. Using images captured within a predetermined time range has the advantage that the 3D map can be generated based on information that does not change over time. In addition, generating the 3D map from a plurality of images captured at the same place with the same position information has the advantage that a more accurate 3D map can be generated.
2.2.2. Information communication unit
 The information communication unit may communicate information with one or more terminal devices: it may acquire information from one or more terminal devices, and it may transmit information to one or more terminal devices. As such information, the information communication unit may acquire, from one or more terminal devices, information relating to at least part of the image acquisition unit, the own-vehicle information generation unit, the image information generation unit, the storage unit, and the output unit. The information communication unit may also transmit, to one or more terminal devices, information relating to at least part of the image acquisition unit, the own-vehicle information generation unit, the image information generation unit, the storage unit, the output unit, and the statistical processing unit.
 The information communication unit may also have a function of communicating information with various systems.
<Congestion information>
 The information communication unit may transmit at least part of the other-vehicle statistical information, the congestion information, the weather information, the road information, the road abnormality information, the accident information, the vehicle fuel provision information, the advertisement statistical information, the person information, and the 3D map to a system of a road user. In this application, road users may include a company that provides maps, a carrier that uses roads, a taxi company that uses roads, a company that provides car navigation, a company that maintains roads, a company that provides services relating to automated driving, and/or a municipality or a company that provides services to municipalities.
<Accident information>
 The information communication unit may transmit the accident information to a system of an insurance company.
<Advertisement statistical information>
 The information communication unit may transmit the advertisement statistical information to a system of an advertising company.
<Person information>
 The information communication unit may transmit the person information to a system of a mobile phone radio wave service.
3. Examples
3.1. Example 1
 The system example according to Example 1 is an embodiment used mainly as a drive recorder, using some or all of the functions described above. A user of the example system may have, for example, a smartphone as the terminal device. Software for the example system may be downloaded and installed on the smartphone in advance.
 When boarding a vehicle used by a user of the example system (hereinafter also referred to as the "user vehicle"), the user attaches the terminal device to the user vehicle. The attachment may take various forms and may be removable. A rectangular smartphone may be attached to the vehicle in landscape orientation (wider than tall) or in portrait orientation (taller than wide). Landscape orientation has the advantage that vehicles traveling not only in the own vehicle's lane but also in the adjacent lane are more likely to fall within the imaging range.
 Next, the user launches the software of the example system. When launched, the example system may automatically start imaging; that is, it may activate the imaging device in the terminal device and start capturing. Because imaging starts automatically upon launch with no further operation, the user is prevented from forgetting to start imaging and is saved the effort of doing so. In another example system, imaging need not start at launch; in that case, imaging may start when the user selects a capture button. The capture button may be a mechanical, physical button on the smartphone, or a button touched on the screen.
 The terminal device of the example system may set the autofocus to infinity when imaging. Depending on where the terminal device is mounted in the vehicle, the hood of the own vehicle may enter the imaging field of view, and in the inventors' experiments the camera sometimes focused on the hood. In that case the focal distance becomes about two meters, the vehicle ahead is out of focus, and the accuracy of acquiring information identifying the vehicle ahead may decrease. In rainy weather in particular, the terminal device may focus on raindrops on the windshield or on the moving wipers, similarly reducing that accuracy. As described above, automatically setting the focus to infinity at the time of imaging has the advantage of preventing such a decrease in accuracy.
 The example system stores the captured video while storing information on the driving conditions at each point in time. The stored video may be stored in files of a fixed size: for example, a video file may be created for every X MB of a predetermined storage amount, and when that amount is exceeded, the next file may be created to continue storing the video. This file-division function has the advantage that even when a long video is stored, no single file becomes large, which improves convenience when, for example, communicating with another information processing device. The captured video may include audio, because as a drive recorder the audio in the driving situation is also important information.
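The file-division behavior described above (start a new video file whenever the current one reaches a fixed size) can be illustrated with this sketch; it operates on byte buffers rather than a real video encoder, and the default limit is an assumed example value:

```python
# Hypothetical sketch of size-based file rotation: incoming video data is
# appended to the current chunk until the size limit is reached, then a new
# chunk is started. A real recorder would split at encoder keyframes and
# write each chunk to its own file.

class ChunkedRecorder:
    def __init__(self, max_bytes=10 * 1024 * 1024):  # assumed 10 MB limit
        self.max_bytes = max_bytes
        self.chunks = [bytearray()]

    def write(self, data: bytes):
        while data:
            room = self.max_bytes - len(self.chunks[-1])
            if room == 0:
                self.chunks.append(bytearray())  # rotate to a new "file"
                room = self.max_bytes
            self.chunks[-1].extend(data[:room])
            data = data[room:]
```

Keeping chunks bounded is what makes the later per-file upload to the management server convenient.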
 The example system may show various displays on the output unit or emit sounds according to the content of the captured video and the information on the driving conditions. For example, a warning may be issued when the distance between the vehicle driving ahead and the own vehicle becomes short. The warning may be displayed on the output unit in a manner that attracts the viewer's attention, or a sound that attracts the attention of those in the vehicle may be emitted. Sudden braking, sudden acceleration, and sudden steering in the driving-condition information may likewise trigger warnings.
 The example system may generate a drive report when driving of the vehicle ends. For example, the example system may generate statistical information on the driving-condition information and include it in the drive report.
 The example system may transmit the stored information to the management server; for example, it may transmit the captured video and the driving-condition information to the management server. The terminal device may be configured to start transmission at the timing when transmission to another information processing device becomes possible via a predetermined communication form, for example Wi-Fi. The example system may automatically delete the video stored in the terminal device after transmission to the management server; the terminal device may delete the video automatically after acquiring confirmation that it has been stored in the management server, and this automatic deletion may be performed without the user's confirmation. Video stored as a drive recorder occupies a large amount of the terminal device's storage, so deleting it frees capacity for the next use.
 The example system may have a function of playing back the information stored in the terminal device, and a function of displaying the driving-condition information.
 FIG. 22 is an example of a screen transition diagram of the terminal device of this example. When launched from the startup screen 001, the terms-agreement screen 002 is displayed; this screen appears only the first time the app is launched after installation. When the terms are agreed to, or on the second and subsequent launches, the TOP screen 003 is displayed. From the top screen, the user can move to the recording screen 004, the captured video list screen 006, and the settings screen 008. After the recording screen, the user can move to the drive report screen 005, which can display statistical information on the driving conditions during recording. From the captured video list screen 006, the user can move to the captured video playback screen 007, a screen on which one video selected from the list of past captures can be played back. Various settings may be possible on the settings screen 008. The screen transitions shown here are one example; for instance, from the captured video list screen it may be possible both to play back a selected video and to display the drive report screen for that video.
 図23は、一例のシステムが起動された後、撮影の指示を待っている画面である。表示画面001において、撮像ボタン002が表示画面に大きく表示されている。本例においては、自動的な撮像ではなく、利用者の指示に応じて撮像できるようにされている。かかる撮像ボタンは、表示画面の面積中10分の1以上の大きさの面積を有するものであってよい。撮像ボタンが、所定の面積よりも大きい面積を有する場合、利用者は、より容易に撮像開始を指示しやすい利点がある。 FIG. 23 is a screen waiting for a shooting instruction after the system of one example is started. On the display screen 001, the image pickup button 002 is displayed large on the display screen. In this example, instead of automatic imaging, imaging can be performed according to a user's instruction. The image pickup button may have an area of one tenth or more of the area of the display screen. When the image pickup button has an area larger than a predetermined area, there is an advantage that the user can more easily instruct the start of image pickup.
 FIG. 24 is an example of the capture screen during a drive. On the display screen 001, the vehicles ahead are each enclosed in a red frame 002, indicating that each vehicle has been recognized. Information 003 on the driving status is also displayed. In the graph, the horizontal axis shows elapsed time and the vertical axis shows the vehicle's speed. Points where the line is steep may therefore correspond to sudden acceleration, sudden braking, or sudden steering, and may be marked as such. When the distance to a vehicle ahead is short, in particular to a vehicle ahead in the same lane, it may be displayed as a dangerous inter-vehicle distance. Road signs 004 and the like may also be recognized and their information acquired.
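The steepness-based detection described above can be sketched as follows. The thresholds (in m/s²) and the sampling interval are illustrative assumptions: the text only speaks of a "steep" line in the speed graph, without fixing numeric criteria.

```python
def sudden_events(speeds_kmh, dt_s=1.0, accel_thresh=3.0, brake_thresh=-3.0):
    """Flag sample indices where the speed trace is 'steep'.

    speeds_kmh: speed samples in km/h at a fixed interval dt_s (seconds).
    Thresholds are hypothetical values in m/s^2, not from the source.
    """
    events = []
    for i in range(1, len(speeds_kmh)):
        # convert the km/h change per sample into acceleration in m/s^2
        accel = (speeds_kmh[i] - speeds_kmh[i - 1]) / 3.6 / dt_s
        if accel >= accel_thresh:
            events.append((i, "sudden_acceleration"))
        elif accel <= brake_thresh:
            events.append((i, "sudden_braking"))
    return events
```

Sudden steering would need a heading or lateral-acceleration signal rather than the speed trace alone, so it is omitted here.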
 FIG. 25 shows a list of captured videos. The videos 001 are listed by time. Each entry may cover one session from the start to the end of capturing, or, as described above, the list may show one entry per file split at a predetermined storage size. FIG. 26 is another example of a list of captured videos.
 FIG. 27 shows a captured video being played back at 001, together with a list 002 of several videos.
 FIG. 28 is an example of landscape capture and display, in place of the portrait capture and display described above. A smartphone mounted in landscape orientation may likewise be able to capture vehicles 001.
 FIG. 29 is an example of the driving-status display during capture. The horizontal axis is time and the vertical axis is the speed of the user's own vehicle, with marks indicating the points at which sudden acceleration, sudden steering, or sudden braking occurred. FIG. 30 is another example of the driving-status display during capture, shown on a portrait screen. In this figure too, the horizontal axis is time and the vertical axis is the speed of the user's own vehicle, with marks indicating the points at which sudden acceleration, sudden steering, or sudden braking occurred.
 FIG. 31 is an example showing a smartphone mounted in landscape orientation recognizing a vehicle ahead. The vehicle ahead is enclosed in a frame 001 to indicate that it has been recognized. The output unit may display information identifying a vehicle recognized in this way. Besides vehicles, the output unit may also display information identifying pedestrians, two-wheeled vehicles (including bicycles), and the like. In this case, when the image information generation unit detects a person together with something like a two-wheeled vehicle, it may classify the object as a two-wheeled vehicle, and when it detects a person without anything like a two-wheeled vehicle, it may classify the object as a pedestrian.
 The line identifying a vehicle may be a circle or a polygon instead of a rectangle. The display identifying the recognized target vehicle may be a line that surrounds the vehicle or one that does not. The vehicles so identified may be of various kinds, for example passenger cars, commercial vehicles, trucks, taxis, or two-wheeled vehicles. The identifying display may also be limited to cases where the distance information described above falls within a predetermined range; for example, a vehicle may be identified on screen only when it is within 15 meters. The identifying display may take various forms, and the color, shape of the display, and so on may be varied. The form of the marking may also be changed according to the distance information: for example, yellow when the distance information indicates 2 to 5 meters, and blue or the like when it indicates 5 to 15 meters. The distance contained in the distance information may be displayed in association with the identifying display; for example, "3 m" or the like may be shown in association with the display identifying the vehicle ahead. When the distance information indicates a distance between 0 and 2 meters, the distance need not be displayed. Whether the distance information is displayed may also be determined according to where the vehicle appears on the output unit.
 For example, the output unit may display the distance only when the display identifying the vehicle appears at a predetermined position on the output unit, e.g. only for a vehicle whose identifying rectangle includes the center of the screen. Whether the rectangle includes the center of the screen may be determined, for example, by comparing the coordinates defining the rectangle with the coordinates of the screen center. Displaying the distance only for the central vehicle has the advantage that no distance is shown for vehicles at the edge of the screen, such as vehicles parked on the road (since the terminal device is generally installed at the center of the vehicle, facing straight ahead), which is less distracting for the viewer.
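The display rules above — frame color by distance band, and a distance label shown only when the rectangle contains the screen center and the distance is at least 2 m — might be sketched as below. The color bands follow the example values in the text; the function names and the treatment of the exact 5 m boundary are assumptions.

```python
def frame_color(distance_m):
    """Example color bands from the text: 2-5 m yellow, 5-15 m blue.
    The boundary at exactly 5 m is assigned to yellow here by assumption."""
    if 2 <= distance_m <= 5:
        return "yellow"
    if 5 < distance_m <= 15:
        return "blue"
    return None  # outside the example bands: default styling (or no display)

def show_distance_label(distance_m, box, screen_w, screen_h):
    """Show the distance label only for the centered vehicle, and never under 2 m.

    box is an axis-aligned rectangle (x1, y1, x2, y2) in screen coordinates.
    """
    if distance_m < 2:
        return False
    x1, y1, x2, y2 = box
    # label only when the rectangle contains the screen center
    return x1 <= screen_w / 2 <= x2 and y1 <= screen_h / 2 <= y2
```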
 FIG. 32 is an example in which the form of the rectangular line is changed according to how close the vehicle ahead in the same lane is to the user's own vehicle. Examples of changing the line's form include switching to an attention-drawing color such as red or yellow, or changing the line's thickness or decoration. A label such as "approaching", as in this figure, may also be displayed to indicate that the vehicles are close.
 FIG. 33 shows a video being viewed after capture. When one video is selected from the video list 003, the video 001 is displayed. At this point, sudden steering, sudden acceleration, and sudden braking are each marked on the graph 002, in which the horizontal axis is time and the vertical axis is speed. FIG. 34 likewise shows the portrait-screen case.
3.2. Example 2: Collecting information on other vehicles
 The system of this example may include only the configuration essential to this embodiment, or may include aspects required for other embodiments.
 An example system acquires images from one or more terminal devices. The system may also acquire, from one terminal device, one or more items of vehicle identification information and the other-vehicle determination information associated with that vehicle identification information.
 An example system may generate other-vehicle statistical information by aggregating the other-vehicle determination information for each item of vehicle identification information. The other-vehicle statistical information may be aggregated over a predetermined period, based on images captured during that period, or over a predetermined area, based on images captured in that area. As other-vehicle driving-status information, a predetermined coefficient may be associated with each of sudden acceleration, sudden steering, and sudden braking, and a weighted total judgment score may be generated by multiplying each coefficient by the corresponding event count. FIG. 35 is an example of other-vehicle statistical information, in which the vehicles are ranked in descending order of total judgment score.
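The weighted total judgment score and the ranking of FIG. 35 might be computed as below. The coefficient values are purely illustrative: the text only requires predetermined coefficients multiplied by the event counts and does not fix their magnitudes.

```python
# Hypothetical coefficients; the source only says "predetermined coefficients".
WEIGHTS = {"sudden_acceleration": 1.0, "sudden_steering": 2.0, "sudden_braking": 3.0}

def total_score(counts):
    """Weighted total judgment score: sum of coefficient x event count."""
    return sum(WEIGHTS[event] * counts.get(event, 0) for event in WEIGHTS)

def ranking(stats_by_vehicle):
    """Vehicles ordered by descending total judgment score, as in FIG. 35.

    stats_by_vehicle maps a vehicle identifier to its event counts.
    """
    return sorted(stats_by_vehicle,
                  key=lambda v: total_score(stats_by_vehicle[v]),
                  reverse=True)
```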
 An example system may acquire one or more items of specific-vehicle information from one terminal device, search the other-vehicle statistical information for those items, and transmit the other-vehicle statistical information for the specific vehicles to that terminal device. Here, the other-vehicle statistical information for a specific vehicle may be, for example, that vehicle's number of sudden accelerations, sudden braking events, and sudden steering events, numerical values derived from those counts, its total judgment score, or its ranking position. A value derived from a count may be the probability of occurrence within a predetermined period, or a coefficient based on it. The other-vehicle statistical information for a specific vehicle may also be processed into information that concisely indicates the degree of risk, for example one of three levels such as high, normal, or low.
 The terminal device may display, for one or more other vehicles in the captured image, the other-vehicle statistical information for those specific vehicles in association with them. FIG. 37 is an example of such a display. In this example, the label "caution vehicle" associated with vehicle 001 concisely indicates that the vehicle's driving is more dangerous than a predetermined criterion (for example, that its counts of sudden acceleration, sudden braking, and/or sudden steering exceed a predetermined criterion, that its total judgment score exceeds a predetermined criterion, and/or that its ranking position is above a predetermined criterion). When a vehicle 002 has no history of driving more dangerous than the criterion, nothing may be displayed, as in this figure, or a label such as "normal" may be displayed to indicate the absence of a dangerous driving history.
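The "caution vehicle" labeling of FIG. 37 could be sketched as follows. The thresholds and field names are assumptions, since the text only requires "predetermined criteria"; the three-level variant (high/normal/low) mentioned earlier is omitted for brevity.

```python
def danger_label(stats, score_thresh=10.0, count_thresh=5):
    """Map a vehicle's statistics to the on-screen label of FIG. 37.

    stats is a dict with hypothetical keys "total_score" and "event_count".
    Returns "caution vehicle" when either predetermined criterion is exceeded,
    and None (no label) otherwise; an implementation could return "normal" instead.
    """
    total = stats.get("total_score", 0)
    events = stats.get("event_count", 0)
    if total > score_thresh or events > count_thresh:
        return "caution vehicle"
    return None
```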
 FIG. 38 shows an example in which an example system collects and processes information acquired from a plurality of terminal devices and transmits the processed information to one or more terminal devices. An example management system 004 may acquire information from terminal devices 001 to 003, apply predetermined processing, and transmit the result to terminal devices 001 to 003. The terminal devices from which information is acquired and the terminal devices to which information is transmitted may be the same or different.
3.3. Example 3: Collecting vehicle information
 The system of this example may include only the configuration essential to this embodiment, or may include aspects required for other embodiments. An example system may acquire information on a target from one or more terminal devices, and may do so at any timing. For example, the system may acquire the information when the terminal device connects to a wireless communication device such as Wi-Fi, when the terminal device connects under a communication standard such as 3G, 4G, or 5G, or in real time, at the moment the terminal device obtains the information to be transmitted. Here, real time may include transmission with the predetermined delay that accompanies information processing.
 FIG. 39 shows an example in which an example system collects and processes information acquired from a plurality of terminal devices and transmits it to the corresponding companies. An example management system 004 may acquire target-related information from terminal devices 001 to 003 and transmit it to systems 005 to 007 of the target companies. The target companies may include road users, insurance companies, local governments or companies affiliated with them, advertising companies, mobile-phone-related companies, and the like.
 When an example system transmits at least part of the congestion information, weather information, road information, road abnormality information, accident information, vehicle fuel availability information, people information, and 3D map (which may include information under the narrower concepts each of these terms denotes), the information offers the following advantages: sent to the system of a company providing maps, it can be used to update the maps; sent to the system of a carrier that uses the roads, for more efficient carrier operations; sent to the system of a taxi company that uses the roads, for more efficient taxi operations; sent to the system of a company providing car navigation, to update the navigation system's information; sent to the system of a company maintaining the roads, to generate road information; sent to the system of a company providing autonomous-driving services, to provide those services; sent to a local government's system, for the government's operations; and sent to an insurance company's system, to create insurance information.
 There is also the advantage that, when an example system transmits advertising statistical information to an advertising company's system, it can be used to evaluate advertising, and that, when it transmits people information to a system for mobile-phone radio services, it can be used for those services.
 FIG. 40 is an example of a flow in an example system. The system may acquire, from one or more terminals, the information transmitted by the information communication unit of each terminal. Next, the statistical processing unit of the system may perform statistical processing using the information the information communication unit acquired. The system may then transmit information including the statistically processed information to the target company.
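The flow of FIG. 40 — acquire from terminals, statistically process, transmit to the target company — can be sketched as a minimal pipeline. All names here are illustrative; the processing and transmission steps are passed in as callables because the source does not fix them.

```python
def management_system_flow(terminal_batches, process, send):
    """Sketch of FIG. 40.

    terminal_batches: one iterable of items per terminal device.
    process: statistical-processing step (e.g. aggregation), an assumption.
    send: transmission step toward the target company's system, an assumption.
    """
    # 1. acquire the information transmitted by each terminal's communication unit
    collected = [item for batch in terminal_batches for item in batch]
    # 2. statistical processing on the acquired information
    result = process(collected)
    # 3. transmit the processed information to the target company
    send(result)
    return result
```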
4. Hardware configuration of the system and terminal device
 Each of the system and the terminal device according to the present invention may be composed of one or more information processing devices. As shown in FIG. 41, an information processing device 10 according to the present invention may include a bus 15, an arithmetic device 11, a storage device 12, and a communication device 16. In one embodiment, the information processing device 10 may further include an input device 13 and a display device 14. It is also connected, directly or indirectly, to a network 17.
 The bus 15 may have the function of conveying information among the arithmetic device 11, the storage device 12, the input device 13, the display device 14, and the communication device 16.
 An example of the arithmetic device 11 is a processor, which may be a CPU or an MPU. The arithmetic device in one embodiment may also include a graphics processing unit, a digital signal processor, and the like. In short, the arithmetic device 11 may be any device capable of executing program instructions.
 The storage device 12 is a device that records information. It may be either external memory or internal memory, and either a main storage device or an auxiliary storage device. It may be a magnetic disk (hard disk), an optical disk, a magnetic tape, a semiconductor memory, or the like. It may also include a storage device reached over a network, or a storage device in the cloud via a network.
 Registers, the L1 cache, the L2 cache, and the like, which store information physically close to the arithmetic device, may in this block diagram be included within the arithmetic device 11; however, in the design of the computer architecture, the storage device 12 may include them as devices that record information. In short, it suffices that the arithmetic device 11, the storage device 12, and the bus 15 are configured to cooperate in executing information processing.
 The storage device 12 can hold part or all of a program capable of executing the processing according to the present invention, and can also record, as appropriate, the data needed when executing that processing. The storage device 12 in one embodiment may include a database.
 The above describes the case where the arithmetic device 11 executes a program held in the storage device 12; however, as one form combining the bus 15, the arithmetic device 11, and the storage device 12, part or all of the information processing according to the present invention may instead be realized by a programmable logic device whose hardware circuit can itself be modified, or by a dedicated circuit whose information processing is fixed.
 The input device 13 is for inputting information, but may have other functions as well. Examples of the input device 13 include a keyboard, a mouse, a touch panel, and a pen-type pointing device.
 The display device 14 has the function of displaying information. Examples include a liquid crystal display, a plasma display, and an organic EL display; in short, any device capable of displaying information will do. It may also incorporate part of the input device 13, as in a touch panel.
 The network 17, together with the communication device 16, conveys information. That is, it has the function of allowing information of the information processing device 10 to be conveyed to other information terminals (not shown) via the network 17. The communication device 16 may use any connection type, such as IEEE 1394, Ethernet (registered trademark), PCI, SCSI, USB, 2G, 3G, 4G, or 5G. The connection to the network 17 may be either wired or wireless.
 The information processing device according to the present invention may be general-purpose or dedicated. It may be a workstation, a desktop PC, a laptop PC, a notebook PC, a PDA, a mobile phone, a smartphone, or the like.
 Although this figure describes a single information processing device 10, the system according to the present invention may be composed of a plurality of information processing devices, which may be connected internally or externally.
 The system according to the present invention may also take various device forms. For example, it may be stand-alone, server-client, peer-to-peer, or cloud-based. The system according to the present invention may be a stand-alone information processing device, or be composed of some or all of the information processing devices in a server-client arrangement, in a peer-to-peer arrangement, or in a cloud arrangement.
 When the system according to the present invention is composed of a plurality of information processing devices, the owners and administrators of the individual information processing devices may differ.
 The information processing device 10 may be a physical entity or a virtual one; for example, the information processing device 10 may be realized virtually using cloud computing.
 Although the above is described as configurations implemented by the system of this example, these may instead be configurations implemented by one or more information processing devices within the system. Moreover, what is described above as a portable information processing device may, where appropriate, be an installed, fixed information processing device.
 It goes without saying that the invention examples described in the embodiments of this application are not limited to those described here and can be applied to various examples within the scope of the technical idea. For example, the systems of the embodiments may be configured so that information presented on the screen of one information processing device can be transmitted to another information processing device for display on that device's screen. The 〇 marks in the various drawings may take appropriate values according to their context, and may all be the same or may differ.
 The processes and procedures described in this application may be realizable not only by what is explicitly described in the embodiments but also by software, hardware, or a combination thereof. The processes and procedures described in this application may also be implemented as computer programs and executed by various computers. These computer programs may be stored on a storage medium, which may be non-transitory or transitory.

Claims (15)

  1.  A system comprising:
     an acquisition unit that acquires, from a first mobile terminal device, first vehicle identification information identifying a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information in which the driving state of the first vehicle is determined based on the first image;
     an acquisition unit that acquires, from a second mobile terminal device, second vehicle identification information identifying a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information in which the driving state of the second vehicle is determined based on the second image;
     a determination unit that determines, using the first vehicle identification information and the second vehicle identification information, whether the first vehicle and the second vehicle are identical; and
     a statistical processing unit that, when the first vehicle and the second vehicle are determined to be identical, generates statistical information on the first vehicle using the first vehicle determination information and the second vehicle determination information.
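As a non-authoritative sketch, the processing of claim 1 — matching the two vehicle identifications and, when they are identical, merging the two terminals' determination information into one statistics record — might look like this in outline. The dictionary-based representation of the determination information is an assumption; the claim does not prescribe a data format.

```python
def merge_if_same(vehicle_id_1, determination_1, vehicle_id_2, determination_2):
    """Claim-1 sketch: if the two vehicle identifications (e.g. plate strings)
    match, combine both terminals' per-event counts into one statistics record.

    Returns None when the vehicles are not determined to be identical.
    """
    if vehicle_id_1 != vehicle_id_2:
        return None
    stats = {}
    for determination in (determination_1, determination_2):
        for event, count in determination.items():
            stats[event] = stats.get(event, 0) + count
    return stats
```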
  2.  The system according to claim 1, comprising:
     a transmission unit that, when the first vehicle and the second vehicle are determined to be identical, transmits the statistical information on the first vehicle to the second mobile terminal device.
  3.  The system according to claim 1 or 2, comprising:
     a third acquisition unit that acquires, from the first mobile terminal device, information on an object in the first image.
  4.  The system according to claim 3, wherein
     the information on the object includes, within the first image, information on wiper operation, information on the road, information on the sidewalk, information on events on the roadway, information on advertisements, and/or information on vehicle fuel.
  5.  The system according to any one of claims 1 to 4, further comprising:
    an acquisition unit that acquires third vehicle identification information that identifies a third vehicle in an image captured by a third mobile terminal device, and a message transmitted by the third mobile terminal device;
    an acquisition unit that acquires fourth vehicle identification information that identifies a fourth vehicle registered as its own vehicle in a fourth mobile terminal device; and
    a transmission unit that transmits the message to the fourth mobile terminal device when the third vehicle and the fourth vehicle are determined to be the same vehicle.
  6.  The system according to any one of claims 1 to 4, further comprising:
    an acquisition unit that acquires third vehicle identification information that identifies a third vehicle registered as its own vehicle in a third mobile terminal device, and a message transmitted by the third mobile terminal device;
    an acquisition unit that acquires fifth vehicle identification information that identifies a fifth vehicle in an image captured by a fifth mobile terminal device; and
    a transmission unit that transmits the message to the fifth mobile terminal device when the third vehicle and the fifth vehicle are determined to be the same vehicle.
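Claims 5 and 6 describe message routing keyed on vehicle identity: a sighting terminal addresses a message to a vehicle it photographed (claim 5), or a terminal that registered its own vehicle addresses terminals that photograph it (claim 6), and the server forwards the message once the two vehicle identifications are judged to match. A hypothetical sketch of the claim 5 direction, assuming a plate-string registry; the registry contents and terminal names are illustrative, not from the patent:

```python
from typing import Dict, Optional

# Hypothetical registry: vehicles registered as "own vehicle" on some terminal,
# mapped to that terminal's delivery address.
owner_terminal: Dict[str, str] = {"ABC-1234": "terminal_4"}

def route_message(sighted_vehicle_id: str, message: str) -> Optional[str]:
    # If the photographed vehicle matches a registered vehicle,
    # deliver the sender's message to the registering terminal.
    target = owner_terminal.get(sighted_vehicle_id)
    if target is None:
        return None  # no terminal registered this vehicle; drop the message
    return "deliver to {}: {}".format(target, message)

print(route_message("ABC-1234", "your lights are off"))
# deliver to terminal_4: your lights are off
print(route_message("ZZZ-0000", "hello"))  # None
```

The claim 6 direction is the mirror image: the lookup runs from registered vehicle to sighting terminal instead.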
  7.  The system according to claim 2, wherein the transmission unit transmits the statistical information relating to the first vehicle within a predetermined time after the first vehicle and the second vehicle are determined to be the same.
  8.  The system according to any one of claims 1 to 7, wherein the first mobile terminal device and the second mobile terminal device are different mobile terminal devices.
  9.  The system according to any one of claims 1 to 8, wherein the first acquisition unit acquires, from the first mobile terminal device, a moving image including the first image.
  10.  The system according to claim 9, wherein the moving image is a compressed moving image.
  11.  The system according to any one of claims 1 to 10, wherein the first vehicle determination information is information generated by a machine-learned device in the first mobile terminal device.
  12.  The system according to any one of claims 1 to 11, wherein the first vehicle determination information and the second vehicle determination information each include sudden steering, sudden acceleration, and/or sudden braking.
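Claim 12 names the driving-state labels (sudden steering, sudden acceleration, sudden braking), and claim 11 attributes their detection to a machine-learned device on the terminal. As a non-authoritative stand-in for that model, a simple kinematic threshold over a tracked vehicle's frame-to-frame positions illustrates what such a judgment computes; the threshold value and position data are assumptions made for the example:

```python
from typing import List

def detect_sudden_events(positions: List[float], dt: float = 1.0,
                         accel_limit: float = 3.0) -> List[str]:
    # positions: longitudinal position of one tracked vehicle per frame (metres).
    # Estimate acceleration by finite differences and flag frames whose
    # magnitude exceeds the (assumed) limit in m/s^2.
    events: List[str] = []
    for i in range(2, len(positions)):
        v_prev = (positions[i - 1] - positions[i - 2]) / dt
        v_curr = (positions[i] - positions[i - 1]) / dt
        accel = (v_curr - v_prev) / dt
        if accel <= -accel_limit:
            events.append("sudden_braking")
        elif accel >= accel_limit:
            events.append("sudden_acceleration")
    return events

print(detect_sudden_events([0.0, 10.0, 20.0, 25.0]))  # ['sudden_braking']
```

A learned model would replace the fixed threshold, but the output shape is the same: a list of driving-state labels per observed vehicle, which is what the statistical processing unit aggregates.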
  13.  A method in which a computer executes:
    an acquisition step of acquiring, from a first mobile terminal device, first vehicle identification information that identifies a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information in which a driving state of the first vehicle is determined based on the first image;
    an acquisition step of acquiring, from a second mobile terminal device, second vehicle identification information that identifies a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information in which a driving state of the second vehicle is determined based on the second image;
    a determination step of determining the identity between the first vehicle and the second vehicle; and
    a statistical processing step of generating, when the first vehicle and the second vehicle are the same, statistical information relating to the first vehicle by using the first vehicle determination information and the second vehicle determination information.
  14.  A program that causes a computer to operate as:
    acquisition means for acquiring, from a first mobile terminal device, first vehicle identification information that identifies a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information in which a driving state of the first vehicle is determined based on the first image;
    acquisition means for acquiring, from a second mobile terminal device, second vehicle identification information that identifies a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information in which a driving state of the second vehicle is determined based on the second image;
    determination means for determining the identity between the first vehicle and the second vehicle; and
    statistical processing means for generating, when the first vehicle and the second vehicle are the same, statistical information relating to the first vehicle by using the first vehicle determination information and the second vehicle determination information.
  15.  A program for causing a computer to function as the system according to any one of claims 1 to 12.
PCT/JP2019/046746 2019-11-29 2019-11-29 Information processing system, information processing device, terminal device, server device, program, or method WO2021106180A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019566377A JP6704568B1 (en) 2019-11-29 2019-11-29 Information processing system, information processing device, terminal device, server device, program, or method
PCT/JP2019/046746 WO2021106180A1 (en) 2019-11-29 2019-11-29 Information processing system, information processing device, terminal device, server device, program, or method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/046746 WO2021106180A1 (en) 2019-11-29 2019-11-29 Information processing system, information processing device, terminal device, server device, program, or method

Publications (1)

Publication Number Publication Date
WO2021106180A1

Family

ID=70858192

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/046746 WO2021106180A1 (en) 2019-11-29 2019-11-29 Information processing system, information processing device, terminal device, server device, program, or method

Country Status (2)

Country Link
JP (1) JP6704568B1 (en)
WO (1) WO2021106180A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009060581A1 (en) * 2007-11-05 2009-05-14 Fujitsu Ten Limited Vicinity monitoring device, safe travel supporting system, and vehicle
JP2017069917A * 2015-10-02 2017-04-06 Toshiba Corporation Communication processing device, on-vehicle device, and communication processing method
JP2017182678A * 2016-03-31 2017-10-05 NEC Corporation Driving state determination device, driving state determination method, and program
JP2018112892A * 2017-01-11 2018-07-19 Suzuki Motor Corporation Drive support device
WO2019030802A1 * 2017-08-07 2019-02-14 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009134704A (en) * 2007-11-05 2009-06-18 Fujitsu Ten Ltd Surrounding monitor system, safe driving support system, and vehicle


Also Published As

Publication number Publication date
JPWO2021106180A1 (en) 2021-12-02
JP6704568B1 (en) 2020-06-03

Similar Documents

Publication Publication Date Title
Singh et al. Analyzing driver behavior under naturalistic driving conditions: A review
US10955855B1 (en) Smart vehicle
US10816993B1 (en) Smart vehicle
US10296794B2 (en) On-demand artificial intelligence and roadway stewardship system
US9443152B2 (en) Automatic image content analysis method and system
JP6796798B2 (en) Event prediction system, event prediction method, program, and mobile
US20180075747A1 (en) Systems, apparatus, and methods for improving safety related to movable/ moving objects
US20170132710A1 (en) System and method for monitoring driving to determine an insurance property
US20200058218A1 (en) Determining causation of traffic events and encouraging good driving behavior
JP2019525185A (en) Method and apparatus for providing goal-oriented navigation instructions
EP3676754A1 (en) On-demand artificial intelligence and roadway stewardship system
JPWO2014013985A1 (en) Driving support system and driving support method
KR20190087936A (en) Advertising vehicle and management system for the vehicle
US20200074507A1 (en) Information processing apparatus and information processing method
JP2016126756A (en) Risk determination method, risk determination device, risk output device, and risk determination system
JP2012038089A (en) Information management device, data analysis device, signal, server, information management system, and program
US20230039738A1 (en) Method and apparatus for assessing traffic impact caused by individual driving behaviors
US20160189323A1 (en) Risk determination method, risk determination device, risk determination system, and risk output device
US20240085907A1 (en) Detecting and responding to processions for autonomous vehicles
JP6997471B2 (en) Information processing system, information processing device, terminal device, server device, program, or method
JP6842099B1 (en) Information processing system, information processing device, terminal device, server device, program, or method
US20230245560A1 (en) Location risk determination and ranking based on vehicle events and/or an accident database
WO2021106180A1 (en) Information processing system, information processing device, terminal device, server device, program, or method
Walcott-Bryant et al. Harsh brakes at potholes in Nairobi: Context-based driver behavior in developing cities
US20230052037A1 (en) Method and apparatus for identifying partitions associated with erratic pedestrian behaviors and their correlations to points of interest

Legal Events

Date Code Title Description
ENP Entry into the national phase (Ref document number: 2019566377; Country of ref document: JP; Kind code of ref document: A)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19954411; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19954411; Country of ref document: EP; Kind code of ref document: A1)