WO2021106180A1 - Information processing system, information processing device, terminal device, server device, program, or method

Information processing system, information processing device, terminal device, server device, program, or method

Info

Publication number
WO2021106180A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
terminal device
image
mobile terminal
Application number
PCT/JP2019/046746
Other languages
English (en)
Japanese (ja)
Inventor
路威 重松
佐々木 雄一
涵 周
山本 正晃
聞浩 周
ジニト バット
翼 岩切
長屋 茂喜
Original Assignee
ニューラルポケット株式会社
Application filed by ニューラルポケット株式会社
Priority to PCT/JP2019/046746
Priority to JP2019566377A (granted as patent JP6704568B1)
Publication of WO2021106180A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services

Definitions

  • The technology disclosed in this application relates to an information processing system, an information processing device, a terminal device, a server device, a program, or a method.
  • various embodiments of the present invention provide an information processing system, an information processing device, a terminal device, a server device, a program, or a method in order to solve the above problems.
  • The first system includes: a first acquisition unit that acquires, from a first mobile terminal device, first vehicle identification information that identifies a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information that determines the driving state of the first vehicle based on the first image;
  • a second acquisition unit that acquires, from a second mobile terminal device, second vehicle identification information that identifies a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information that determines the driving state of the second vehicle based on the second image;
  • a determination unit that determines the identity between the first vehicle and the second vehicle by using the first vehicle identification information and the second vehicle identification information; and
  • a statistical processing unit that, when the first vehicle and the second vehicle are determined to be the same, generates statistical information relating to the first vehicle by using the first vehicle determination information and the second vehicle determination information.
  • The second system is the first system further comprising a transmission unit that transmits the statistical information relating to the first vehicle to the second mobile terminal device when it is determined that the first vehicle and the second vehicle are the same.
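The claim structure above (two acquisition units, a determination unit, and a statistical processing unit) can be sketched as follows. This is an illustrative sketch only, not the patented implementation; all identifiers (`same_vehicle`, `aggregate_statistics`, the event names, the plate strings) are hypothetical, and the sketch assumes identification information is a license-plate string that can be compared after normalization.

```python
def same_vehicle(id_info_1: str, id_info_2: str) -> bool:
    """Determination unit: identity via normalized identification info."""
    normalize = lambda s: s.replace(" ", "").upper()
    return normalize(id_info_1) == normalize(id_info_2)

def aggregate_statistics(det_1: dict, det_2: dict) -> dict:
    """Statistical processing unit: combine per-terminal event counts."""
    keys = set(det_1) | set(det_2)
    return {k: det_1.get(k, 0) + det_2.get(k, 0) for k in keys}

# Observations from two different mobile terminal devices (hypothetical data).
obs1 = {"plate": "ABC 123", "events": {"sudden_braking": 2, "sudden_steering": 1}}
obs2 = {"plate": "abc123", "events": {"sudden_braking": 1, "sudden_acceleration": 3}}

stats = {}
if same_vehicle(obs1["plate"], obs2["plate"]):
    # Statistics are generated only when the vehicles are determined to be the same.
    stats = aggregate_statistics(obs1["events"], obs2["events"])
```

In the second system, `stats` would then be transmitted back to the second mobile terminal device.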
  • The third system is any one of the first and second systems further including a third acquisition unit that acquires, from the first mobile terminal device, information related to an object in the first image.
  • The fourth system is the third system in which the information related to the object includes information related to the operation of a wiper, information related to a road, information related to a sidewalk, information related to an event on a roadway, information related to an advertisement, and/or fuel information for the vehicle in the first image.
  • The fifth system is any one of the first to fourth systems comprising: an acquisition unit that acquires third vehicle identification information that identifies a third vehicle in an image captured by a third mobile terminal device, together with a message transmitted by the third mobile terminal device;
  • an acquisition unit that acquires fourth vehicle identification information that identifies a fourth vehicle registered as the own vehicle in a fourth mobile terminal device; and a transmission unit that transmits the message to the fourth mobile terminal device when the third vehicle and the fourth vehicle are determined to be the same vehicle.
  • The sixth system is any one of the first to fourth systems comprising: an acquisition unit that acquires third vehicle identification information that identifies a third vehicle registered as the own vehicle in the third mobile terminal device, together with a message transmitted by the third mobile terminal device;
  • an acquisition unit that acquires fifth vehicle identification information that identifies a fifth vehicle related to an image captured by a fifth mobile terminal device; and
  • a transmission unit that transmits the message to the fifth mobile terminal device.
  • The seventh system is the second system in which, after it is determined that the first vehicle and the second vehicle are the same, the transmission unit transmits the statistical information relating to the first vehicle within a predetermined time.
  • The eighth system is any one of the first to seventh systems in which the first mobile terminal device and the second mobile terminal device are different mobile terminal devices.
  • The ninth system is any one of the first to eighth systems in which the first acquisition unit acquires a moving image including the first image from the first mobile terminal device.
  • The tenth system is the ninth system in which the moving image is a compressed moving image.
  • The eleventh system is any one of the first to tenth systems in which the first vehicle determination information is information generated by a machine-learned device in the first mobile terminal device.
  • The twelfth system is any one of the first to eleventh systems in which the first vehicle determination information and the second vehicle determination information each include sudden steering, sudden acceleration, and/or sudden braking.
  • The thirteenth method is a method in which a computer executes: an acquisition step of acquiring, from a first mobile terminal device, first vehicle identification information that identifies a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information that determines the driving state of the first vehicle based on the first image;
  • an acquisition step of acquiring, from a second mobile terminal device, second vehicle identification information that identifies a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information that determines the driving state of the second vehicle based on the second image;
  • a determination step of determining the identity of the first vehicle and the second vehicle; and, when the first vehicle and the second vehicle are the same, a statistical processing step of generating statistical information relating to the first vehicle by using the first vehicle determination information and the second vehicle determination information.
  • The fourteenth program is a program that causes a computer to operate as:
  • a first acquisition means for acquiring, from a first mobile terminal device, first vehicle identification information that identifies a first vehicle in a first image captured by the first mobile terminal device, and first vehicle determination information that determines the driving state of the first vehicle based on the first image;
  • a second acquisition means for acquiring, from a second mobile terminal device, second vehicle identification information that identifies a second vehicle in a second image captured by the second mobile terminal device, and second vehicle determination information that determines the driving state of the second vehicle based on the second image;
  • a determination means for determining the identity of the first vehicle and the second vehicle; and
  • a statistical processing means for generating, when the first vehicle and the second vehicle are the same, statistical information relating to the first vehicle by using the first vehicle determination information and the second vehicle determination information.
  • The fifteenth program is a program for operating a computer as any one of the first to twelfth systems.
  • According to the various embodiments disclosed above, image information can be used more appropriately.
  • FIG. 1 is a diagram illustrating a situation example in which the system according to one embodiment is applied.
  • FIG. 2 is a block diagram showing the functions of the system according to the embodiment.
  • FIG. 3 is a block diagram showing the functions of the system according to the embodiment.
  • FIG. 4 is an example of a data structure used by the system according to the embodiment.
  • FIG. 5 is an example of a data structure used by the system according to the embodiment.
  • FIG. 6 is an example of a data structure used by the system according to the embodiment.
  • FIG. 7 is an example of a data structure used by the system according to the embodiment.
  • FIG. 8 is an example of a data structure used by the system according to the embodiment.
  • FIG. 9 is an example of a data structure used by the system according to the embodiment.
  • FIG. 10 is an example of a data structure used by the system according to the embodiment.
  • FIG. 11 is an example of a data structure used by the system according to the embodiment.
  • FIG. 12 is an example of a data structure used by the system according to the embodiment.
  • FIG. 13 is an example of a data structure used by the system according to the embodiment.
  • FIG. 14 is an example of a data structure used by the system according to the embodiment.
  • FIG. 15 is an example of a data structure used by the system according to the embodiment.
  • FIG. 16 is an example of a data structure used by the system according to the embodiment.
  • FIG. 17 is an example of a data structure used by the system according to the embodiment.
  • FIG. 18 is an example of a data structure used by the system according to the embodiment.
  • FIG. 19 is an example of a data structure used by the system according to the embodiment.
  • FIG. 20 is an example of a data structure used by the system according to the embodiment.
  • FIG. 21 is an example of a data structure used by the system according to the embodiment.
  • FIG. 22 is an example of a transition diagram used by the system according to the embodiment.
  • FIG. 23 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 24 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 25 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 26 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 27 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 28 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 29 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 30 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 31 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 32 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 33 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 34 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 35 is an example of a data structure used by the system according to the embodiment.
  • FIG. 36 shows an example of the flow of the system according to one embodiment.
  • FIG. 37 is a diagram showing a display example of the system according to one embodiment.
  • FIG. 38 is a block diagram showing an overall configuration according to the system according to one embodiment.
  • FIG. 39 is a block diagram showing an overall configuration according to the system according to one embodiment.
  • FIG. 40 shows an example of the flow of the system according to one embodiment.
  • FIG. 41 is a block diagram showing a configuration of an information processing device according to an embodiment.
  • An example system may include one or more terminal devices used by system users and one or more management systems used by system administrators.
  • The user may be an occupant of the vehicle; the user may be the driver or a passenger.
  • the terminal device may be fixed to the vehicle or may be detachable.
  • FIG. 1 shows an example of a situation in which an example system is used.
  • arrow 001 indicates the traveling direction of the vehicle.
  • Vehicle 002 indicates a vehicle using a user mobile terminal according to the example system. It is assumed that vehicles 003A to 003I use the terminal device according to the example system as a drive recorder, although some of these vehicles need not use it. It is assumed that each vehicle travels along lane boundary line 004A and lane boundary line 004B.
  • The user terminal device in vehicle 002 may be able to identify each vehicle by its imaging device. For example, when the angle of view of one imaging device of the user terminal device in vehicle 002 is wide, the three vehicles 003A, 003D, and 003F may be included in the image. Similarly, when the angle of view of the other imaging device, which images the direction opposite to that of the one imaging device, is wide, the four vehicles 003C, 003E, 003H, and 003I may be included in the image. In some cases, a vehicle such as vehicle 003I comes into the field of view of the imaging device of the user terminal device in vehicle 002.
  • Vehicles 003B and 003G travel beside vehicle 002, where in vehicle 002 the front is the imaging direction of one imaging device and the rear is the imaging direction of the other imaging device, which images the direction opposite to the one imaging device.
  • Given these imaging directions, such a vehicle may not enter either field of view while traveling directly beside vehicle 002, but may enter the field of view of an imaging device of the user terminal device in vehicle 002 by moving forward or backward relative to vehicle 002 while traveling.
  • A vehicle in the field of view of the imaging device may be referred to as a "surrounding vehicle" in the documents of the present application.
  • When a terminal device related to the example system is used in a surrounding vehicle, such a terminal device, as viewed from the user terminal device in the user vehicle 002, may be referred to as a "peripheral terminal device" in the documents of the present application.
  • the vehicle in front of the user vehicle may be referred to as the "front vehicle” in the document of the present application
  • the vehicle behind the user vehicle may be referred to as the “rear vehicle” in the document of the present application.
  • When the terminal device related to the example system is used in the front vehicle and in the rear vehicle, the terminal devices used in those vehicles may be referred to as the "front terminal device" and the "rear terminal device", respectively. Since a surrounding vehicle is included in the field of view of the above-mentioned imaging device, the information for identifying the surrounding vehicle may be acquired by using the information related to the license plate described later.
  • the terminal device may include one or more accelerometers.
  • a plurality of acceleration sensors may be provided in the terminal device so that accelerations in directions orthogonal to each other can be measured.
  • a plurality of acceleration sensors may be provided in the terminal device so that the acceleration in the direction imaged by the image pickup device and the direction orthogonal to the image pickup direction can be measured.
  • the terminal device may include one or more imaging devices.
  • the terminal device may be equipped with an imaging device so that it can image in opposite directions.
  • the terminal device may be installed in the vehicle so that one or more of the plurality of imaging devices have the front of the vehicle as the imaging direction.
  • the terminal device may be installed in the vehicle so that the other one or a plurality of imaging devices among the plurality of imaging devices have the rear of the vehicle as the imaging direction.
  • the installation in the vehicle may be detachably installed in the vehicle or may be fixed.
  • the display direction of the display device in the terminal device may be the same as the image pickup direction of one image pickup device. Further, the display direction of the display device may be the rear of the vehicle.
  • the terminal device may have one or more position measuring devices.
  • the position measuring device may be a position measuring function using GPS or a base station.
  • the terminal device may be provided with a communication device.
  • the communication device may be capable of communicating with an image pickup device installed in the vehicle in which the terminal device is installed.
  • the communication device may be a wireless type or a wired type. In the case of wireless type, it may be WIFI, BLUETOOTH or the like.
  • the image pickup device installed in the vehicle may be an image pickup device incorporated in the vehicle or an image pickup device installed in the vehicle.
  • the terminal device may be a portable terminal device that can be carried by a person.
  • the portable terminal device may be a smartphone, a PDA, or the like.
  • FIG. 2 is a block diagram showing a specific example of the function related to the management system of this example
  • FIG. 3 is a block diagram showing a specific example of the function related to the terminal device.
  • a part of the functions related to the management system may be executed in the terminal device.
  • The system can perform processing based on information acquired from a plurality of terminal devices; however, when a terminal device executes the functions in FIG. 2, the functions may be executed based only on the information acquired by that one terminal device.
  • In that case, the information obtained by the processing on the one terminal device may be transmitted to the management system, combined in the management system with information acquired from other terminal devices, and processed, for example by summation or averaging.
  • In the documents of the present application, the term "system" is used as a superordinate concept of a management system and a terminal device: a system may include a management system without including any terminal devices, may include one or a plurality of terminal devices without including a management system, or may include both a management system and one or a plurality of terminal devices.
  • the vehicle in which the terminal device is installed or placed is referred to as a self-vehicle
  • the information related to such a vehicle is referred to as self-vehicle information
  • A vehicle included in an image captured by the imaging device related to the terminal device is referred to as another vehicle.
  • Note that the term "own vehicle" may refer to a vehicle in which one or more of these terminal devices are installed or placed.
  • the image acquisition unit has a function of acquiring an image.
  • the image may be a moving image or a still image.
  • images are treated as a superordinate concept of moving images and still images.
  • The image acquisition unit may store the moving image in a file format divided at each predetermined capacity, because communication is more convenient when the capacity per file is smaller than that of a single large file.
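The division by predetermined capacity can be sketched as follows. This is a hypothetical illustration only: a real recorder would split a compressed stream on codec boundaries, whereas this sketch simply splits raw bytes.

```python
def split_by_capacity(data: bytes, max_bytes: int) -> list:
    """Divide recorded data into files of at most max_bytes each."""
    return [data[i:i + max_bytes] for i in range(0, len(data), max_bytes)]

# 2500 bytes of recording split at a predetermined capacity of 1000 bytes
# -> three files of 1000, 1000, and 500 bytes.
chunks = split_by_capacity(b"x" * 2500, max_bytes=1000)
```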
  • the image acquisition unit may acquire the imaged location and / or the imaging time for each predetermined period of the acquired moving image, and store the acquired moving image in association with each predetermined period of the corresponding moving image.
  • the image acquisition unit may acquire an image from an image pickup device inside the terminal device, or may acquire an image from an image pickup device outside the terminal device. In this case, it may be connected to an image pickup device outside the terminal device by wire or wirelessly, and in the case of wireless communication, image information may be acquired by a connection method such as WIFI or BLUETOOTH.
  • the image acquisition unit may acquire audio.
  • the image acquisition unit may also acquire audio when acquiring a moving image as an image. This has the advantage that, for example, sounds at the time of an accident, sudden braking, sharp curves, and the like can be acquired and recorded. Further, the image acquisition unit may acquire the sound itself separately from the moving image. As a superordinate concept of image and sound, it may be referred to as "image, etc.” in the documents of the present application.
  • the own vehicle information generation unit may have a function of acquiring information related to the own vehicle.
  • The information related to the own vehicle may include information related to the license plate of the own vehicle, and/or a message to be transmitted to a surrounding vehicle, the vehicle in front, and/or the vehicle behind the own vehicle (sometimes referred to as a "transmission message" in the documents of the present application).
  • The information related to the license plate may be an image obtained by capturing the license plate of the vehicle, may be information on a number that identifies the vehicle generated from the captured image of the license plate, or may be information on a number, entered by the user or the like, that identifies the vehicle. Further, in this case, the area name may be entered in a mode of selecting from names given in advance.
  • the self-vehicle information generation unit may have a function of analyzing the image and extracting the information of the number that identifies the vehicle.
  • One or more items of information related to license plates may be registered in the terminal device, and information related to one license plate may be selected from among those registered in advance and used. In addition, when the user can use a plurality of own vehicles, the user may select the own vehicle to be actually used, thereby selecting the information related to the license plate of that vehicle. As a result, even when the user uses a plurality of own vehicles, the information related to the license plate can be used quickly without registering the license plate anew each time the user uses a vehicle.
  • the transmission message may be in various modes as long as the message of the own vehicle can be transmitted.
  • the transmitted message may be a character string input by the user before or at the time of boarding.
  • a character string may be any character input by the user.
  • the transmitted message may be a predetermined character string.
  • The user may select one of a plurality of predetermined character strings.
  • The transmitted message may be, for example, a character string indicating that the vehicle is driving slowly, a character string indicating that the driver is elderly, a character string indicating that the driver is a beginner, a character string indicating that a child is on board, or a character string indicating that the driver is in a hurry.
  • These transmitted messages may be stored in advance or may be input in the terminal device.
  • the transmission message may include a transmission message to the vehicle in front and a transmission message to the vehicle behind.
  • The terminal device may hold a plurality of transmission messages, and one transmission message may be selected from among them and used at a timing appropriate for the user.
  • The transmission message may be selected by voice or by gesture.
  • The message conveyed by the gesture may be, for example, gratitude, apology, or yielding to the other party.
  • Gratitude is assumed to include, for example, thanks for being let into the line of vehicles when changing lanes, or for being given the right of way when turning right or left.
  • An apology is assumed to be conveyed in place of gratitude in the same situations as those described for gratitude.
  • These gestures may include gestures or voices that identify the other party's vehicle, such as the front vehicle, the rear vehicle, the right-side vehicle, or the left-side vehicle, before or after the message content.
  • The user of the terminal device according to the example system, the driver of the vehicle into which the terminal device is brought, and the owner of that vehicle may be the same or different. Further, when a user of the terminal device according to the example system uses a plurality of vehicles, the system may have a function for such a user to specify the vehicle to be boarded.
  • FIG. 4 is an example of a data structure in which information related to the license plate of the user's vehicle and a message used when boarding that vehicle are stored in association with each other. Such a data structure has the advantage that the user can select the vehicle to actually board, and the information related to the license plate of the selected vehicle can then be used.
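A data structure of the kind FIG. 4 describes might look like the following. This is a hypothetical sketch; the field names (`vehicle_id`, `plate`, `message_front`, `message_rear`) and the sample values are assumptions, not taken from the figure.

```python
# Each registered own vehicle associates license-plate information with the
# transmission messages used while boarding that vehicle.
vehicles = [
    {"vehicle_id": 1, "plate": "AB 12-34",
     "message_front": "please go ahead", "message_rear": "beginner driver on board"},
    {"vehicle_id": 2, "plate": "CD 56-78",
     "message_front": "driving slowly", "message_rear": "child on board"},
]

def select_vehicle(vehicle_id):
    """Selecting the own vehicle to board also selects its plate and messages."""
    return next(v for v in vehicles if v["vehicle_id"] == vehicle_id)

chosen = select_vehicle(2)
```

Selecting a record once per boarding avoids re-entering the license plate each time, which is the advantage the passage above describes.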
  • The information related to the own vehicle may include information indicating the driving situation of the own vehicle (sometimes referred to as "own vehicle driving situation information" in the documents of the present application) and/or information indicating the position of the own vehicle (sometimes referred to as "own vehicle position information").
  • the own vehicle driving situation information may include sudden acceleration, sudden braking, sudden steering, speed, and / or acceleration of the own vehicle.
  • the sudden acceleration, sudden braking, and / or sudden steering of the own vehicle may be determined by using the function in the terminal device.
  • a sensor in the terminal device may determine the sudden acceleration, sudden braking, and / or sudden steering of the own vehicle.
  • an acceleration sensor as a sensor may determine sudden acceleration, sudden braking, and / or sudden steering when the acceleration is higher or lower than a predetermined acceleration.
  • the accelerometer may be capable of measuring each of the three-dimensional accelerations.
  • When the acceleration in the same direction as the forward direction of the vehicle is higher than a predetermined acceleration, it may be determined that the own vehicle is suddenly accelerating; when the acceleration in the direction opposite to the forward direction of the vehicle is higher than a predetermined acceleration, it may be determined that the own vehicle is suddenly braking; and when the lateral acceleration of the vehicle is higher than a predetermined acceleration, it may be determined that the own vehicle is being steered suddenly.
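The threshold test just described can be sketched as follows. This is an illustrative sketch only: the axis convention (forward-positive longitudinal axis, signed lateral axis) and the threshold values are assumptions, not values from the disclosure.

```python
FORWARD_THRESH = 3.0   # m/s^2, hypothetical predetermined acceleration
LATERAL_THRESH = 4.0   # m/s^2, hypothetical predetermined acceleration

def classify(forward_acc, lateral_acc):
    """Classify driving events from longitudinal and lateral acceleration."""
    events = []
    if forward_acc > FORWARD_THRESH:
        events.append("sudden_acceleration")   # forward acceleration too high
    if -forward_acc > FORWARD_THRESH:
        events.append("sudden_braking")        # deceleration too high
    if abs(lateral_acc) > LATERAL_THRESH:
        events.append("sudden_steering")       # lateral acceleration too high
    return events
```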
  • The speed and acceleration of the own vehicle may be obtained by measuring the acceleration with a sensor in the terminal device and deriving the speed from the measured acceleration.
  • the self-vehicle driving status information may include information relating to the distance of the self-vehicle from the vehicle in front (sometimes referred to as "distance information" in the documents of the present application).
  • the own vehicle information generation unit may generate distance information using the information related to the image generated by the image information generation unit described later.
  • Such distance information may include information for estimating the distance between the own vehicle and the vehicle in front, information corresponding to that distance, and/or information determining, by using the distance information, that the own vehicle has approached the vehicle in front. As the information used for determining approach, for example, the size in the image of the vehicle in front viewed from the rear may be used as information corresponding to the distance between the own vehicle and the vehicle in front.
  • The size of the vehicle viewed from the rear may be the width or the height of the vehicle. Since the vehicle height can differ depending on the vehicle type, the presence or absence of approach may be determined by using a vehicle height corresponding to the vehicle type in the image. In this case, when the height of the vehicle in front obtained from the image is greater than a predetermined height set according to the vehicle type, it may be determined that the own vehicle has approached.
  • On the other hand, since the vehicle width does not differ greatly depending on the vehicle type, there is an advantage that information processing based on the vehicle type need not be performed; when the width of the vehicle in front obtained from the image is greater than a predetermined width, it may be determined that the own vehicle has approached.
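The width-based approach test can be sketched with a simple pinhole-camera relation: the apparent width in pixels grows as the distance shrinks, roughly w_px ≈ f_px · W / d. This is a hypothetical illustration; the focal length, assumed real vehicle width, and pixel threshold below are assumptions, not values from the disclosure.

```python
F_PX = 1000.0        # assumed camera focal length in pixels
REAL_WIDTH_M = 1.8   # assumed typical vehicle width (roughly type-independent)

def estimated_distance_m(width_px):
    """Estimate distance to the vehicle ahead from its apparent width."""
    return F_PX * REAL_WIDTH_M / width_px

def has_approached(width_px, width_thresh_px=360.0):
    # Wider than the predetermined width in the image => vehicle is close.
    return width_px > width_thresh_px

# At the 360-pixel threshold, the estimated distance is 1000 * 1.8 / 360 = 5 m.
d = estimated_distance_m(360.0)
```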
  • The own vehicle information generation unit may be provided with a plurality of ranks for each of the degrees of sudden acceleration, sudden braking, sudden steering, and approach, and one of these ranks may be selected.
  • When the number of ranks is three, threshold values may be set for low, medium, and high, respectively, and the rank may be determined by comparing the obtained information with these thresholds.
  • the number of ranks is not limited to 3, and may be any number.
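A three-rank classification of the kind described above can be sketched as follows; the two boundary thresholds splitting the range into low, medium, and high are hypothetical.

```python
def rank(value, low_thresh, high_thresh):
    """Assign a low / medium / high rank by comparing a measured degree
    (e.g., peak acceleration of an event) with two boundary thresholds."""
    if value < low_thresh:
        return "low"
    if value < high_thresh:
        return "medium"
    return "high"
```

As the passage notes, the number of ranks need not be three; adding boundaries extends this to any number of ranks.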
  • the own vehicle information generation unit may store the sudden acceleration, sudden braking, sudden steering, and / or approach information of the own vehicle in association with the situation in which these are determined.
  • the determined situation may be, for example, the time and / or location where the sudden acceleration, sudden braking, sudden steering, and / or approach information of the own vehicle occur.
  • the own vehicle information generation unit may generate statistical information regarding the driving situation of the own vehicle.
  • the statistical information may be, for example, the number of times of sudden acceleration, sudden braking, sudden steering, and / or approach information in traveling for each predetermined item.
  • the predetermined items may be within a predetermined unit time, one run, one day run, a predetermined period, and the like.
• One run may be, for example, the period from engine start to engine stop, a period of travel separated from other travel by at least a predetermined interval, the period from the start to the stop of software related to the system of one example, or the period from the start to the end of video recording.
  • the above-mentioned number of times may be generated in chronological order.
• A graph may be generated for each of sudden acceleration, sudden braking, sudden steering, and/or approach information, with the horizontal axis representing time and the vertical axis the number of occurrences, and such graphs may be used as a drive report.
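As a hedged sketch of the per-day counts underlying such a drive report (the event names and timestamps below are illustrative, not from the original):

```python
from collections import Counter
from datetime import datetime

def daily_event_counts(events):
    """Aggregate (timestamp, kind) driving events into per-day counts,
    the basis of the per-item statistical information described above."""
    counts = Counter()
    for ts, kind in events:
        day = datetime.fromisoformat(ts).date().isoformat()
        counts[(day, kind)] += 1
    return dict(counts)

# Illustrative events: two sudden-braking events on one day,
# one sudden-steering event on the next.
events = [
    ("2019-11-01T08:00:00", "sudden_braking"),
    ("2019-11-01T18:30:00", "sudden_braking"),
    ("2019-11-02T09:15:00", "sudden_steering"),
]
report = daily_event_counts(events)
```

Plotting these counts with the day on the horizontal axis and the count on the vertical axis yields one graph per event kind.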
  • FIG. 5 is an example of statistical information data on the driving situation of the own vehicle. This is an example in which each statistical information in one day's driving is recorded for each predetermined item.
• IDs 001 to 004 show an example in which the total number of occurrences is stored as each item of statistical information for four days of driving.
  • the position information may be acquired from the GPS inside the terminal device, or the position information may be acquired from the GPS outside the terminal device.
• The image pickup device outside the terminal device may be connected by wire or wirelessly; in the case of wireless communication, the information may be acquired via a connection method such as WIFI or BLUETOOTH.
• When the image information generation unit described later generates information that the own vehicle is not moving and that the traffic light for the own vehicle is green, the own vehicle information generation unit may generate information prompting the own vehicle to depart, in order to indicate that the own vehicle has not started. Likewise, when the image information generation unit described later generates information that the own vehicle is not moving and that the distance to the vehicle ahead is greater than a predetermined distance, information prompting the departure of the own vehicle may be generated, in order to indicate that the own vehicle has not started even though the vehicle ahead has. In the latter case, when a traffic light targeting the own vehicle is detected in the image, the condition that it is green may be added.
• When the image information generation unit described later generates information that the own vehicle is moving and that there is a stop sign or stop marking, the own vehicle information generation unit may determine that a stop violation has occurred and generate information indicating the stop violation.
• When the image information generation unit described later generates information that the own vehicle is moving and information indicating that a red light continues, the own vehicle information generation unit may determine that the red light has been ignored and generate information indicating the red-light violation.
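The three rule-based determinations above (departure prompt, stop violation, red-light violation) can be sketched as simple rules over flags; the flag and result names are assumptions standing in for outputs of the image information generation unit.

```python
def evaluate(moving: bool, green_light: bool, stop_sign: bool,
             red_light_continues: bool) -> list:
    """Combine own-vehicle movement with image-derived flags.

    - not moving + green light        -> prompt departure
    - moving     + stop sign/marking  -> stop violation
    - moving     + red light remains  -> red-light violation
    """
    results = []
    if not moving and green_light:
        results.append("prompt_departure")
    if moving and stop_sign:
        results.append("stop_violation")
    if moving and red_light_continues:
        results.append("red_light_violation")
    return results
```

For instance, a stationary vehicle facing a green light yields only a departure prompt, while a moving vehicle passing a stop marking under a continuing red light yields both violation results.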
• Using the information indicating stop violations and the information indicating red-light violations, the own vehicle information generation unit may add to the above-mentioned statistical information the total number and/or average number of such stop violations and/or red-light violations.
• The information indicating a stop violation and/or a red-light violation may be stored in association with the location and/or time at which it occurred.
  • the image information generation unit has a function of generating information related to an image by using an image.
• The information related to the image may be information using various objects in the image (sometimes referred to as "objects" in the documents of the present application) and various situations; its type and range are not limited, but examples include the following.
  • the image information generation unit may generate information after identifying a target for generating information in the image.
  • the image information generation unit may generate information related to the image by using the image and other information.
• Information related to the image may be generated using information related to at least a part of the image or the like acquisition unit and the own vehicle information generation unit.
• The information related to the image or the like acquisition unit may be, for example, the image or the like and the information at the time of its acquisition, and the information related to the own vehicle information generation unit may be, for example, the own vehicle position information; however, they are not limited to these.
  • the information related to the image may be information related to one or more other vehicles in the image.
• Information related to other vehicles may include information that identifies another vehicle (sometimes referred to as "vehicle identification information" in the documents of the present application) and information that indicates the driving status of another vehicle (sometimes referred to as "other vehicle driving status information" in the documents of the present application).
  • the vehicle specific information may be information related to the vehicle in the image.
  • the information relating to the vehicle in the image may include information relating to the license plate of the vehicle in the image, or may include information such as the vehicle type, color, and options of the vehicle in the image.
  • the information related to the license plate may be the same as the information related to the license plate related to the user vehicle described above.
  • the other vehicle driving status information may be any information indicating the driving status of another vehicle, and may be, for example, sudden acceleration of the vehicle, sudden braking of the vehicle, and / or sudden steering of the vehicle.
• Sudden acceleration of a vehicle in the image may be determined when the size of the vehicle in the image decreases by a predetermined ratio or more within a predetermined period. The determination may also use the acceleration of the own vehicle: when the forward acceleration of the own vehicle is within a predetermined range and the size of the vehicle in the image decreases by a predetermined ratio or more within a predetermined period, sudden acceleration may be determined. In this case, there is an advantage that the determination accuracy is improved.
• Sudden braking of a vehicle in the image may be determined when the size of the vehicle in the image increases by a predetermined ratio or more within a predetermined period. The determination may also use the acceleration of the own vehicle: when the rearward acceleration of the own vehicle is within a predetermined range and the size of the vehicle in the image increases by a predetermined ratio or more within a predetermined period, sudden braking may be determined. In this case, there is an advantage that the determination accuracy is improved.
• Sudden steering of a vehicle in the image may be determined when the lateral area of the vehicle in the image increases at a rate of a predetermined ratio or more, when the shape of the vehicle in the image approaches the shape of the vehicle seen from the side at a rate of a predetermined ratio or more, or when a combination of these holds. Further, in combination with these, sudden steering may be determined when the size of the vehicle in the image decreases at a rate of a predetermined ratio or more within a predetermined period. The size of the vehicle may be the vehicle width.
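A minimal sketch of the size-ratio determinations for the vehicle ahead, combined with the own-vehicle acceleration as described above; the ratio threshold, acceleration range, and sign conventions are assumptions for illustration.

```python
def classify_front_vehicle(size_ratio: float, own_accel: float,
                           ratio_threshold: float = 0.2,
                           accel_range: float = 1.0):
    """Classify the vehicle ahead from the change in its apparent size.

    size_ratio: (new_size - old_size) / old_size over a predetermined period.
    own_accel:  own-vehicle longitudinal acceleration; requiring it to be
                within a predetermined range improves determination
                accuracy, as described above.
    """
    if abs(own_accel) > accel_range:
        return None  # the own vehicle's motion dominates; no determination
    if size_ratio <= -ratio_threshold:
        return "sudden_acceleration"  # vehicle ahead shrinks: pulling away
    if size_ratio >= ratio_threshold:
        return "sudden_braking"       # vehicle ahead grows: closing fast
    return None
```

A 30% shrink in apparent size while the own vehicle holds steady would thus be classified as sudden acceleration of the vehicle ahead.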
  • the driving status information of other vehicles may be stored in association with the date and time when the acquired target image was captured. This has the advantage of clarifying the date and time related to the driving status information of other vehicles.
  • the driving status information of other vehicles may be stored in association with the captured position information of the acquired target image. This has the advantage of clarifying the position related to the driving status information of other vehicles.
  • FIG. 6 is an example in which information is generated based on a plurality of images.
  • This figure shows the data structure, and the vehicle ID, other vehicle driving status information, the date and time, and the position are stored in association with the image ID.
  • the vehicle ID may be associated with other vehicle specific information in other data structures.
• The vehicle ID may be assigned sequentially to vehicles identified in the image, or the corresponding vehicle may be collated against vehicle identification information to which a vehicle ID was attached in the past, and that same ID may be used.
  • the storage unit may store a set of information of the vehicle identification information in the image captured in the past by the image pickup device related to the terminal device and the vehicle ID attached to the vehicle. In addition, the information of such a set may be stored for a predetermined period of time.
• In this figure, each piece of driving status information is associated with a different image ID; however, when driving status information for multiple other vehicles is generated at the same time, a plurality of vehicles and the corresponding other vehicle driving status information may be associated with a single image.
• Statistical information on vehicles in images may be referred to as "vehicle statistical information in the image" in the documents of the present application.
  • the image information generation unit may generate information on the movement of the own vehicle by using time-series images.
  • the presence or absence of movement of the own vehicle may be determined by using the movements in the corresponding traffic lights, signs, and / or landscapes in the still images of the plurality of adjacent frames in the moving image.
• The information regarding the movement of the own vehicle may include information on whether the own vehicle is moving. Such information may be used to determine, for example, that the own vehicle is maintaining a stop, that it departed when the traffic light for the own vehicle turned green, or that it departed when the traffic light for the own vehicle was green and the vehicle ahead was moving.
• The image information generation unit may determine whether, starting from a first image in which the traffic light for the own vehicle shows a red light, the frames following the first image in time series continue to show the red traffic light, and the light then leaves the image without turning green. When such a determination is made, information indicating the continuation of the red light may be generated.
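The red-light-continuation determination over a time series of frames might be sketched as follows, assuming a per-frame traffic light state of "red", "green", or None when no light for the own vehicle is detected (these labels are assumptions, not from the original):

```python
def red_light_continued(frames) -> bool:
    """Scan per-frame traffic light states in time-series order.

    Returns True when a red light is observed and the light leaves
    the sequence without ever turning green, matching the continuation
    determination described above.
    """
    seen_red = False
    for state in frames:
        if state == "red":
            seen_red = True
        elif state == "green":
            return False  # turned green before leaving the image
    return seen_red
```

Combined with information that the own vehicle is moving, a True result here corresponds to the red-light violation determination described earlier.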
  • the identification of the traffic light for the own vehicle may be made possible by learning by machine learning using an image including the traffic light.
• When the road signs or road markings among the objects related to the image include a stop sign or stop marking, the image information generation unit may detect it and generate stop instruction information.
  • the information related to the image may be information related to various objects detected in the image.
  • the information related to the target may include information related to the operation of the wiper.
• As the information related to the operation of the wiper, the operation of the wiper of the own vehicle may be used, or the operation of the wiper of another vehicle may be used.
  • the information related to the operation of the wiper may include information related to the operating status of the wiper.
  • the information related to the operation status of the wiper may be the presence / absence of movement of the wiper and / or the speed of movement of the wiper.
  • the speed of movement of the wiper may be information that identifies one of a plurality of ranks indicating the speed of movement of the wiper.
  • the plurality of ranks indicating the speed of movement of the wiper may be, for example, intermittent, slow, medium, fast, etc., but are not limited to these.
• The information related to the operation of the wiper may include the wiper operating status information associated with the position at which the image from which that information was acquired was captured.
• The information related to the operation of the wiper may include the wiper operating status information associated with the time at which the image from which that information was acquired was captured.
  • FIG. 7 shows an example of information related to the operation of the wiper.
  • this figure is an example of data in which the presence / absence of movement of the wiper and the rank of movement are separated.
• Alternatively, one of the movement ranks may be a rank indicating no movement, in which case the presence or absence of wiper movement is conveyed by the rank itself.
• Data for times when the wiper does not move may be excluded, and data may be acquired only when the wiper moves.
  • the information relating to the subject may include information relating to the road.
  • the information related to the road may include the information related to the road condition.
  • Information relating to road conditions may include information relating to road signs, information relating to road markings, information relating to traffic lights, and / or information relating to roadways.
  • Information related to road signs may include information on the presence or absence of signs, the contents of signs, and / or abnormalities in signs.
• Abnormalities in a sign may include information on the presence or absence of something obstructing the sign (e.g., at least part of the sign is hidden by a tree) and/or an abnormality in the sign itself (e.g., at least part of the sign is damaged).
  • Information related to road markings may include the presence or absence of road markings, the content of road markings, and / or abnormalities in road markings. Anomalies in road markings may include information about the presence or absence of something that interferes with the markings and / or anomalies in the markings themselves.
• The information related to the traffic light may include information that a traffic light has been identified (information that there is a traffic light) and/or information that the traffic light is abnormal. Abnormalities in a traffic light may include an abnormality in its appearance (for example, information that part of the traffic light cannot be seen because of a tree or the like) and/or a failure of the traffic light (for example, information that part of the traffic light is damaged).
  • the information related to the roadway may be an abnormality of the roadway and / or the width of the lane.
• A lane may be a strip-shaped portion of the roadway (excluding service roads) provided so that a single file of automobiles can pass safely and smoothly; when the roadway is divided by lines, it may be a portion through which an automobile or the like can pass.
  • Roadway abnormalities may be lane boundary abnormalities, foreign objects on the roadway, and / or destruction of the roadway.
  • the abnormality of the lane boundary line may be a part or all of the lane boundary line is missing.
  • the foreign matter on the roadway may be, for example, a falling object on the roadway, a fallen tree, a fallen utility pole, or the like.
  • Destruction of the roadway may be an abnormality in the shape of the roadway, such as a collapse of the roadway.
  • the information related to the roadway may include information on parking on the street.
• The information related to the road may include information related to the road condition associated with the position at which the image from which the road condition information was acquired was captured.
• The information related to the road may include information related to the road condition associated with the time at which the image from which the road condition information was acquired was captured.
  • Figure 8 shows an example of information related to roads.
  • the information related to the road may be always acquired when it is detected in the image, or may be acquired only when the information detected in the image satisfies a predetermined condition.
• The predetermined condition may be detection of a predetermined target, or may be limited to detection of an abnormality.
• As information related to roads, information related to disasters may be generated.
  • Information relating to a disaster may include information on foreign matter on the roadway and / or destruction of the roadway.
• The information related to the road may include information related to the disaster associated with the position at which the image from which the disaster information was acquired was captured, or associated with the time at which that image was captured.
  • FIG. 9 is an example in which the information related to the road includes the information related to the disaster.
• When such road information is transmitted to a system related to autonomous driving, it may be used to judge the propriety and priority of autonomous driving.
  • the information relating to the subject may include information relating to the sidewalk.
  • the information on the sidewalk may include information on the condition of the person on the sidewalk.
• Information on the condition of a person on the sidewalk may include information on what the person is wearing. What the person wears may include an umbrella (in the documents of the present application, the term "wearing" includes holding up when the subject is an umbrella). It may also include a raincoat, or short sleeves and long sleeves.
• The information related to the sidewalk may include information on the state of a person on the sidewalk associated with the position at which the image from which that information was acquired was captured.
• The information related to the sidewalk may include information on the state of a person on the sidewalk associated with the time at which the image from which that information was acquired was captured.
  • FIG. 10 is an example of information related to the sidewalk.
  • the information related to the sidewalk may be always acquired when it is detected in the image, or may be acquired only when the information detected in the image satisfies a predetermined condition.
• The predetermined condition may be, for example, that a predetermined state of a person is detected, that at least a predetermined number of people are detected in one image, or that no more than a predetermined number of people are detected in one image.
  • the information related to the target may include information related to the degree of congestion of people.
  • the information on the degree of congestion of people may include the number of people in a predetermined area.
• The information relating to the degree of congestion of people may include the number of such people associated with the position at which the image used for determining that number was captured, or with information indicating the predetermined area.
• The information on the degree of congestion of people may include the number of such people associated with the time at which the image used for determining that number was captured.
• The predetermined area may be defined in advance or at the time of video shooting. Examples of an area defined in advance include an area specified using map information and location information. An area defined at the time of video shooting may be a region defined using a predetermined distance from a coordinate value such as GPS, or a region defined using the elapsed imaging time.
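Counting people within a predetermined area defined as a radius around a GPS coordinate, as described above, might be sketched as follows; the equirectangular approximation, function name, and radius are assumptions for illustration.

```python
import math

def count_people_in_area(detections, center, radius_m):
    """Count detected people within radius_m metres of a GPS coordinate.

    detections: iterable of (lat, lon) for each detected person.
    center:     (lat, lon) of the predetermined area's centre.
    Uses an equirectangular approximation, adequate for small areas.
    """
    lat0, lon0 = center
    n = 0
    for lat, lon in detections:
        dx = (lon - lon0) * 111320.0 * math.cos(math.radians(lat0))
        dy = (lat - lat0) * 110540.0
        if math.hypot(dx, dy) <= radius_m:
            n += 1
    return n
```

Two detections a few metres from the centre and one a degree of latitude away would give a congestion count of 2 within a 50 m radius.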
• When transmitted to a system related to autonomous driving, information on the degree of congestion of people may be used to judge the propriety and priority of autonomous driving.
  • the information related to the target may include information related to an event on the roadway.
  • Information about roadway events may be information about construction and / or accidents.
• Information about the construction may include the presence or absence of construction and/or the scheduled completion date of the construction. The presence or absence of construction may be determined by detecting, in the image, information such as a stopped construction vehicle, installation of a temporary traffic light, and vehicle guidance by a person involved in the construction.
  • Information about the accident may include the presence or absence of an accident and the magnitude of the accident.
• The presence or absence of an accident may be determined from the presence or absence of an accident vehicle and/or of police personnel. Construction personnel and police personnel may be detected from the clothes and belongings that characterize them.
• The information on the on-road event may include information on the construction and/or the accident associated with the position at which the image from which that information was acquired was captured.
• The information on the on-road event may include information on the construction and/or the accident associated with the time at which the image from which that information was acquired was captured.
  • FIG. 11 shows an example of information regarding an event on the roadway.
  • the information related to the target may be information related to the advertisement.
  • the information related to the advertisement may include the advertisement status information.
  • the advertisement status information may include a specific brand of the advertisement, the advertiser of the advertisement, the field to which the advertisement belongs, the size of the advertisement, and / or the presence or absence of an abnormality in the advertisement.
  • the presence or absence of an abnormality in the advertisement may include a state in which the advertisement is hidden by an object such as a tree and / or damage to the advertisement itself.
  • Advertising status information may be acquired based on the image.
• The information related to the advertisement may include the advertisement status information associated with the position at which the image from which it was acquired was captured.
• The information related to the advertisement may include the advertisement status information associated with the time at which the image from which it was acquired was captured.
  • FIG. 12 is an example of information related to the advertisement.
  • the size of the advertisement may be configured to select one of a plurality of predetermined ranks for the size.
  • the information relating to the target may be information relating to fuel for vehicles.
  • the information relating to the fuel for the vehicle may include the information relating to the status of the fuel for the vehicle.
  • the information on the status of the fuel for the vehicle may include the type of fuel for the vehicle and the amount of money corresponding to the type of fuel for the vehicle.
  • Vehicle fuel types may include high-octane gasoline, regular gasoline, light oil, and / or kerosene, and the like.
• Information on the status of fuel for vehicles may be acquired from signage, appearing in the image, at locations that provide vehicle fuel, such as service stations, including gas stations and refueling stations.
• The information relating to the fuel for the vehicle may include information relating to the status of the fuel for the vehicle associated with the position at which the image from which that information was acquired was captured.
• The information relating to the fuel for the vehicle may include information relating to the status of the fuel for the vehicle associated with the time at which the image from which that information was acquired was captured.
  • FIG. 13 shows an example of information on fuel for vehicles.
• When such vehicle fuel information is transmitted to a system related to automatic driving, it may be used to judge the propriety and priority of automatic driving.
  • the image information acquisition unit may have a machine-learned identification function.
  • the machine-learned identification function may include a function capable of identifying a predetermined object.
  • the image information acquisition unit may acquire information related to the above-mentioned various objects from the image by using the machine-learned identification function.
  • the machine-learned identification function may be stored in the information processing device in the terminal device. Since the object to be identified is limited in advance, there is an advantage that a high identification function can be realized even in a simple information processing device such as in a terminal device.
• As artificial intelligence technology, machine learning technologies such as neural networks, genetic programming, functional logic programming, support vector machines, clustering, regression, classification, Bayesian networks, reinforcement learning, representation learning, decision trees, and k-means clustering may be used. The following uses an example based on a neural network, but the present invention is not necessarily limited to neural networks.
• As the machine learning technology using a neural network, deep learning technology may be used. This is a technique that, by learning the relationship between inputs and outputs using a plurality of layers, makes it possible to generate an output corresponding to an input even for an unknown input.
  • the learning image and the attribute information related to the learning image are associated with each other, and machine learning is performed using these as learning data.
• The machine learning function related to the image information acquisition unit may, for example, machine-learn the relationship between an image and information related to at least a part of the image information acquisition unit, the image or the like acquisition unit, the own vehicle information generation unit, and the image information generation unit, where that information has been determined by a person viewing the image to appear in it.
• The machine learning related to the image information acquisition unit may provide a function capable of identifying, from an image, information related to at least a part of the image information acquisition unit, the own vehicle information generation unit, and the image information generation unit, and of generating the corresponding information.
  • the learning algorithm itself when using the deep learning technique may be a known one. Further, as the program used in the deep learning technique, an open source program may be used, or a program obtained by modifying them as appropriate may be used.
  • the machine-learned function may have an update function.
  • the update function may be a function capable of updating an identifiable object. For example, it may be possible to identify information related to vehicles, information related to roads, information related to advertisements, and the like, and update the information so that the target information can be generated. Such updates may be in a mode in which the program or its parameters are downloaded and installed.
  • the information related to the target may be stored in association with the information related to the time when each information was acquired and / or the information related to the place.
  • the information related to the time may be month, day, week, time, and the like.
  • the information related to the location may be administrative divisions, GPS coordinates, and the like.
  • the information related to the location may be acquired by the GPS related to the terminal device that acquires the image.
• The machine-learned identification function may make use of position information. For example, when a specific advertisement is highly likely to be at a specific position, the function may emphasize identification of that advertisement in the vicinity of, and including, that position.
  • the storage unit has a function of storing information.
  • the storage unit may store the above-mentioned information related to the image acquisition unit, the self-vehicle information generation unit, and the image information generation unit.
  • the storage unit may have a function of associating and storing a plurality of pieces of information.
• The information to be associated and stored may include, for example, a part or all of the image or the like, time information, own vehicle driving status information, own vehicle position information, vehicle identification information, and other vehicle driving status information.
• The storage unit may store information in a ring buffer, and may store each kind of information in its own ring buffer. In this case, an upper limit is defined for each ring buffer, and when information exceeding the upper limit is stored, entries may be deleted starting from the oldest.
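A per-kind ring buffer with an upper limit, deleting from the oldest entry as described above, might be sketched as follows; the class name, kinds, and limits are illustrative assumptions.

```python
from collections import deque

class RingBufferStore:
    """Store each kind of information in its own ring buffer.

    Each buffer has its own upper limit; once the limit is exceeded,
    the oldest entry is discarded first.
    """
    def __init__(self, limits):
        # deque(maxlen=n) drops the oldest item automatically on overflow
        self.buffers = {kind: deque(maxlen=n) for kind, n in limits.items()}

    def store(self, kind, item):
        self.buffers[kind].append(item)

    def items(self, kind):
        return list(self.buffers[kind])

# Illustrative limits: keep at most 3 images, 2 position records.
store = RingBufferStore({"image": 3, "own_vehicle_position": 2})
for i in range(5):
    store.store("image", f"frame{i}")
```

After storing five frames with an upper limit of three, only the newest three frames remain.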
  • the information for each information may be, for example, information such as an image or the like, own vehicle driving status information, own vehicle position information, vehicle specific information, other vehicle driving status information, and the like.
  • the output unit may have a function of producing sound or displaying. For example, information related to the own vehicle and / or information related to an image may be displayed.
  • the information related to the own vehicle may include the own vehicle driving status information and / or the distance information.
• from the information related to the own vehicle, the viewer can obtain information on the driving of the own vehicle.
  • the information related to the image may include vehicle specific information and / or other vehicle driving status information.
• from the information related to the image, the viewer can obtain information on other vehicles.
  • various information may be included, and such information may be displayed.
  • the output unit may display the information acquired by the information and communication unit. For example, information related to other vehicle statistical information may be displayed. Further, the output unit may display the processed information by using the other vehicle statistical information. This has the advantage that the viewer can understand the information related to other vehicles. In particular, when the driving situation of the vehicle in front is different from the usual one, there is an advantage that the driver can drive with care.
  • the output unit has a function of emitting a sound, and may notify the user of the terminal device. For example, the output unit may generate a sound that prompts the driver of the own vehicle to start the vehicle when the information for prompting the departure of the own vehicle is generated.
  • the output unit may output information related to at least a part of the image information acquisition unit, the self-vehicle information generation unit, the image information generation unit, the storage unit, and the information communication unit.
  • the information and communication unit has a function of communicating information.
  • the information communication unit may communicate information relating to at least a part of an image information acquisition unit, a self-vehicle information generation unit, an image information generation unit, a storage unit, and an output unit.
  • the information and communication unit may transmit such information.
  • the function of communicating information may be executed at an arbitrary timing, or at a timing when a specific condition is satisfied.
• the latter specific condition may be, for example, the timing at which WIFI communication becomes available. In this case, there is an advantage that the communication cost for the user can be reduced.
  • the information and communication unit may have a function of receiving information.
  • the information and communication unit may acquire other vehicle statistical information described later from the management system.
• the statistical processing unit has a statistical processing function. By performing statistical processing, the statistical processing unit may generate at least some of other vehicle statistical information, traffic jam information, weather information, road information, road abnormality information, accident information, vehicle fuel provision information, advertising statistical information, human information, and a 3D map.
• the statistical processing unit may generate statistical information related to other vehicles (sometimes referred to as "other vehicle statistical information" in the documents of the present application) by using vehicle identification information and other vehicle driving status information obtained from one or more terminal devices.
• the statistic may include, for one vehicle, the total number of sudden steering operations, sudden braking operations, and / or sudden accelerations, or the average number of sudden steering operations, sudden braking operations, and / or sudden accelerations. Further, the total value and the average value may be computed for vehicles imaged during a predetermined period.
• Such other vehicle statistical information may be generated as follows. For example, regarding the number of sudden steering operations for a specific vehicle obtained from the vehicle specific information, the vehicle specific information and the other vehicle driving status information obtained from the one or more terminal devices are examined, the same vehicle as the specific vehicle is identified from the vehicle identification information, and the presence or absence or the number of sudden steering operations in the corresponding other vehicle driving status information is acquired for a predetermined range. The total value of sudden steering operations may then be generated by applying this to the vehicle identification information and the other vehicle driving status information obtained from all of the above-mentioned one or more terminal devices. The total numbers of sudden braking operations and sudden accelerations may be generated in the same manner, and the corresponding average values over a predetermined period, a predetermined area, or the like may likewise be generated.
  • the statistical processing unit may generate a ranking for the total value or the average value.
• the ranking may target vehicles imaged during a predetermined period, vehicles imaged in a predetermined area, or vehicles traveling on a predetermined road or under a predetermined road condition.
  • the predetermined road may be, for example, a type of road such as a road in a residential area, a main road, an expressway, or the like.
• the predetermined road condition may depend, for example, on the speed of the vehicle, such as traveling in a traffic jam, low speed running, or high speed running, and the ranking may target vehicles traveling at a speed within a specified range, with the speed as an evaluation value.
  • the ranking may be a predetermined number of rankings in descending order of the total value or the average value, or may be a predetermined number of rankings in ascending order.
  • FIG. 14 is an example of other vehicle statistical information.
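The per-vehicle aggregation and ranking described above can be sketched as follows (an illustrative example, not the original implementation — the record layout and vehicle IDs are assumed for illustration):

```python
from collections import defaultdict

# Hypothetical records reported by several terminal devices:
# (vehicle_id, sudden_steering, sudden_braking, sudden_accelerations),
# where vehicle_id comes from vehicle-specific information and the counts
# come from other-vehicle driving status information.
records = [
    ("car-A", 2, 1, 0),
    ("car-B", 0, 0, 1),
    ("car-A", 1, 2, 1),
    ("car-C", 0, 1, 0),
]

# Total the sudden steering, braking, and acceleration counts per vehicle.
totals = defaultdict(int)
for vehicle_id, steering, braking, accel in records:
    totals[vehicle_id] += steering + braking + accel

# Ranking in descending order of the total value.
ranking = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

A predetermined number of entries from the top (or bottom) of `ranking` would then form the ranking included in the other vehicle statistical information.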
  • the statistical processing unit may generate congestion information.
  • the traffic jam information may be any data as long as it is information related to the traffic jam of the vehicle.
  • Congestion information may include, for example, information indicating the presence of congestion associated with a particular area.
  • Information indicating the existence of a traffic jam associated with a specific area may be generated, for example, when one or more vehicles having position information in the specific area satisfy a predetermined condition.
• the predetermined condition may be satisfied, for example, when the number of vehicles determined to be in a traffic jam is larger than a predetermined number, or when the number of vehicles determined to be in a traffic jam is a predetermined multiple or more of the number of vehicles not determined to be in a traffic jam.
  • the vehicle may be determined to be in a traffic jam if certain conditions are met for the vehicle speed and / or the above-mentioned distance information.
• regarding the vehicle speed, when the vehicle speed satisfies a predetermined condition, for example, when the vehicle speed is smaller than a predetermined speed, it may be determined that there is a traffic jam. Further, when the average speed of the vehicle within a predetermined time or a predetermined distance is smaller than the predetermined speed, it may be determined that there is a traffic jam. This is because, in the case of traffic congestion, the average speed of the vehicle becomes smaller than the predetermined speed.
  • the speed and / or acceleration in the own vehicle driving status information obtained from the terminal device may be used.
• regarding the distance information, when the distance information satisfies a predetermined condition, for example, when the distance information is smaller than a predetermined distance, it may be determined that there is a traffic jam. In addition, it may be determined that there is a traffic jam when the average of the distance information within a predetermined time or a predetermined distance is equal to or less than a predetermined value. This is because, when the size of the vehicle in the image is large, the distance between the vehicle in front and the own vehicle is short, which is information that should be determined to indicate a traffic jam.
• information associated with times within a predetermined range may be used. Such times may be the time when measurement was performed by the acceleration sensor in the terminal device, or the time when the image acquired by the terminal device was captured. This has the advantage that traffic congestion information can be aggregated from information of the same time, or of times within a predetermined range including that time.
• the traffic jam information may include, in association with it, the time measured by the acceleration sensor in the terminal device, the time when the image acquired by the terminal device was captured, or information on a time range including these.
  • FIG. 15 is an example of traffic congestion information.
• in the above, the traffic jam information is generated for an area, but information on the length of the traffic jam may also be generated and included in the traffic jam information.
• the information on the length of the traffic jam may be generated, when areas or position information within a predetermined short-distance range are each determined to be in a traffic jam, by using the length of the distance between the terminal devices that acquired the information on which those determinations were based; it may be determined that the traffic jam extends over at least that distance, and that distance may be used as the length of the traffic jam.
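The congestion determination described above can be sketched as follows (an illustrative example only — the thresholds for "predetermined speed", "predetermined distance", and "predetermined number" are assumed values, not figures from the disclosure):

```python
# Assumed thresholds for illustration.
SPEED_THRESHOLD_KMH = 20.0   # "predetermined speed"
DISTANCE_THRESHOLD_M = 5.0   # "predetermined distance" to the vehicle in front
MIN_CONGESTED_VEHICLES = 2   # "predetermined number" of jammed vehicles

def vehicle_in_jam(avg_speed_kmh: float, avg_distance_m: float) -> bool:
    """A vehicle is judged to be in a jam when its average speed is below
    the speed threshold or its inter-vehicle distance is below the
    distance threshold, per the conditions in the text."""
    return (avg_speed_kmh < SPEED_THRESHOLD_KMH
            or avg_distance_m < DISTANCE_THRESHOLD_M)

def area_congested(vehicles) -> bool:
    """The area is judged congested when the number of jammed vehicles
    exceeds the predetermined number."""
    jammed = sum(1 for speed, dist in vehicles if vehicle_in_jam(speed, dist))
    return jammed > MIN_CONGESTED_VEHICLES

# Three of these four vehicles are either slow or close to the vehicle
# in front, so the area is judged congested.
sample = [(10.0, 3.0), (15.0, 8.0), (50.0, 4.0), (60.0, 30.0)]
```

The same structure also accommodates the ratio-based condition (jammed vehicles a predetermined multiple of non-jammed ones) by comparing the two counts instead.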
  • the statistical processing unit may generate weather information.
  • the weather information may include, for example, weather condition information associated with a particular area.
• the weather condition information associated with the specific area may be generated, for example, when information related to rain and / or information indicating temperature, acquired from images captured by one or more terminal devices having location information in the specific area, satisfies a predetermined condition.
  • the information relating to rain may include information that it is rain.
• the determination that it is raining may be made by using information related to the operation of the wiper. For example, if there is movement of the wiper, it may be determined to be raining, and if there is no movement of the wiper, it may be determined not to be raining. Further, when the speed of movement of the wiper indicates one of a plurality of ranks indicating the speed of movement of the wiper, information indicating the degree of rain may be generated according to which of the plurality of ranks it is.
• a predetermined condition on the information related to rain may be, for example, that there are more pieces of information determined to be one of the plurality of ranks indicating the speed of movement of the wiper than a predetermined number, or that the ratio of the number of pieces of information determined to be one of the plurality of ranks to the number of pieces of information not so determined is greater than a predetermined ratio. In such cases it may be determined that it is raining, and the information related to rain may include information indicating the corresponding degree of rain.
• the information indicating the temperature may include information that the weather calls for a coat when, in the information related to people on the sidewalk, the number of people wearing a coat is larger than a predetermined number, or larger than the number of people not wearing one.
• the information indicating the temperature may include information that the weather calls for long sleeves when, in the information related to people on the sidewalk, the number of people wearing long sleeves is larger than a predetermined number, or larger than the number of people wearing half sleeves.
• the information indicating the temperature may likewise include information that the weather calls for half sleeves.
  • the weather information may include the weather condition information and the information indicating the specific area used for generating the weather condition information in association with each other.
• the weather information may include, in association with each other, the weather condition information and the time when the image, from which the information related to the wiper and / or the information related to the sidewalk used to generate the weather condition information was acquired, was captured.
  • FIG. 16 is an example of weather information.
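The wiper-based rain estimation above can be sketched as follows (an illustrative example — the rank labels and the "predetermined ratio" value are assumptions, not figures from the disclosure):

```python
from collections import Counter

RAIN_RATIO = 1.0  # assumed "predetermined ratio"

def estimate_rain(wiper_ranks):
    """Each terminal device reports a wiper-speed rank (None = wiper not
    moving). The area is judged raining when rank-bearing reports outnumber
    rank-less ones by the predetermined ratio; the degree of rain is then
    taken as the most common rank among the reports."""
    with_rank = [r for r in wiper_ranks if r is not None]
    without_rank = [r for r in wiper_ranks if r is None]
    raining = len(with_rank) > RAIN_RATIO * max(len(without_rank), 1)
    degree = Counter(with_rank).most_common(1)[0][0] if raining else None
    return raining, degree

# Three of four devices report a moving wiper, mostly at the "slow" rank.
reports = ["slow", "fast", "slow", None, "slow"]
```

Taking the most common rank is one possible way to condense multiple reports into a single degree-of-rain value; the disclosure leaves the exact aggregation open.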
  • the statistical processing unit may generate road abnormality information.
  • the road abnormality information may include information relating to the road abnormality state.
  • the information relating to the road abnormality state may include, for example, an abnormality of a sign, an abnormality of a road marking, an abnormality of a traffic light, and / or an abnormality of a roadway.
• an abnormality of a sign, an abnormality of a road marking, an abnormality of a traffic light, and / or an abnormality of a roadway may be generated by using the corresponding abnormality of a sign, road marking, traffic light, and / or roadway contained in the information related to roads.
• the statistical processing unit may generate road abnormality information from information related to roads acquired from one terminal device, or from information related to roads acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate the road abnormality information when, for a predetermined range of areas including the same position information and within a predetermined range including the same time, the information related to roads acquired from a predetermined number or a predetermined ratio of terminal devices includes an abnormality of a sign, an abnormality of a road marking, an abnormality of a traffic light, and / or an abnormality of a roadway.
  • the road abnormality information may include the information related to the road abnormality state and the position information in which the image related to the abnormality is captured or the area in a predetermined range including the position information in association with each other.
  • the road abnormality information may include the information related to the road abnormality state in association with the time when the image is captured or the time in a predetermined range including the time when the image is taken.
  • the road abnormality information may include information related to the road abnormality state and an image related to the abnormality in association with each other.
  • FIG. 17 is an example of road abnormality information.
  • the statistical processing unit may generate road information.
  • Road information may be generated using information related to roads.
• the statistical processing unit may generate road information from information related to roads acquired from one terminal device, or from information related to roads acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate the road information when the same or similar information on roads is acquired from a predetermined number or a predetermined ratio of terminal devices for a predetermined range of areas including the same position information and within a predetermined range including the same time.
• here, "similar" information may mean information that agrees within the error range expected of information generated from images.
• such road information may, like the road abnormality information, be included in association with time information and with area or location information.
  • the statistical processing unit may generate accident information.
  • the statistical processing unit may generate accident information when the information about the event on the roadway includes the information related to the accident. Further, in such a case, the statistical processing unit may set the area where the accident has occurred by using the position information associated with the information related to the accident, and the accident information may include the information of the area. In addition, the accident information may include information related to such an accident in association with the area.
• Accident information may be generated based on information about events on the roadway. Further, the time information in the accident information may be generated by using the time information related to the event on the roadway. In addition, when the information related to the event on the roadway is acquired from a plurality of terminal devices, the time information in the accident information may be a range from the earliest to the latest of the times related to the event on the roadway.
  • the statistical processing unit may generate information related to a roadway event whose continuous period is shorter than a predetermined period as accident information among the information related to the continuous roadway event. This is because accidents on the road are generally removed in a short period of time compared to construction information. Therefore, even when it is not possible to determine whether the work is a construction work or an accident only from the image, the accident information may be determined by paying attention to the continuity of the information related to the event on the roadway. From the viewpoint of improving the accuracy of the accident information, the accident information may not be determined by paying attention to such a period.
• the statistical processing unit may generate accident information from information related to roadway events acquired from one terminal device, or from information related to roadway events acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate the accident information when, for a predetermined range of areas including the same position information and within a predetermined range including the same time, the information related to the roadway event acquired from a predetermined number or a predetermined ratio of terminal devices includes information related to an accident.
  • FIG. 18 is an example of such accident information.
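The duration-based distinction above (accidents are generally cleared faster than construction) can be sketched as follows; the six-hour threshold is an assumed stand-in for the "predetermined period", which the disclosure does not specify:

```python
from datetime import datetime, timedelta

# Assumed "predetermined period": events continuing longer than this are
# treated as construction rather than accidents.
ACCIDENT_MAX_DURATION = timedelta(hours=6)

def classify_event(first_seen: datetime, last_seen: datetime) -> str:
    """Classify a continuing roadway event by how long it has persisted,
    following the heuristic in the text."""
    duration = last_seen - first_seen
    return "accident" if duration < ACCIDENT_MAX_DURATION else "construction"

# A two-hour event reads as an accident; a two-day event as construction.
short_event = classify_event(datetime(2019, 11, 1, 9), datetime(2019, 11, 1, 11))
long_event = classify_event(datetime(2019, 11, 1, 9), datetime(2019, 11, 3, 9))
```

As the text notes, a system prioritizing accuracy could skip this heuristic entirely and rely on image content alone.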
  • the statistical processing unit may generate information on the provision of fuel for vehicles.
  • the vehicle fuel provision information may include the type of vehicle fuel and the amount of money corresponding to such type in association with each other.
  • the type of vehicle fuel and the amount of money corresponding to such type may be generated using the information on the vehicle fuel.
  • the vehicle fuel provision information may include the location information corresponding to the type and the amount of money in association with each other. Such location information may be generated using the location information in the information relating to the fuel for the vehicle corresponding to such type and amount.
• the vehicle fuel provision information may include time information in association with the type and the amount of money.
  • the information at such times may be generated using the information at such times in the information relating to fuels for vehicles corresponding to such types and amounts.
• the statistical processing unit may generate vehicle fuel provision information from information related to vehicle fuel acquired from one terminal device, or from information related to vehicle fuel acquired from a plurality of terminal devices. In the latter case, the statistical processing unit may generate the vehicle fuel provision information when, for a predetermined range of areas including the same position information and within a predetermined range including the same time, the information related to the fuel of the vehicle acquired from a predetermined number or a predetermined ratio of terminal devices includes the type of fuel for the vehicle and the corresponding amount of money.
  • the time in the predetermined range may be a predetermined period retroactive from the present time. This has the advantage of being able to keep up-to-date information.
  • FIG. 19 is an example of vehicle fuel provision information.
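The retroactive time window above (keeping only recent reports so the provided information stays current) can be sketched as follows; the seven-day retention period, station names, and prices are assumed for illustration:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=7)  # assumed "predetermined period" from the present

def current_fuel_info(reports, now: datetime):
    """For each (position, fuel type), keep only the newest report within
    the retroactive window, discarding anything older."""
    latest = {}
    for position, fuel_type, price, seen_at in reports:
        if now - seen_at > RETENTION:
            continue  # outside the retroactive window: too old to provide
        key = (position, fuel_type)
        if key not in latest or seen_at > latest[key][1]:
            latest[key] = (price, seen_at)
    return {key: price for key, (price, _) in latest.items()}

now = datetime(2019, 11, 27, 12, 0)
reports = [
    ("station-1", "regular", 140, datetime(2019, 11, 26)),
    ("station-1", "regular", 138, datetime(2019, 11, 20)),  # older than 7 days
    ("station-2", "diesel", 120, datetime(2019, 10, 1)),    # expired
]
info = current_fuel_info(reports, now)
```

Only the recent station-1 report survives the window, giving the up-to-date price the text aims for.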
  • the statistical processing unit may generate advertising statistical information.
  • the advertisement statistical information may include the number related to the advertisement or may include the ratio related to the advertisement.
• the number related to advertisements may be the number of advertisements satisfying a predetermined condition. Advertisements satisfying a predetermined condition may include advertisements of a specific brand, advertisements by a specific advertiser, advertisements related to a specific field, advertisements of a predetermined size, advertisements larger than a predetermined size, advertisements smaller than a predetermined size, and / or advertisements whose view is blocked by trees or buildings.
  • the ratio related to the advertisement may be the ratio of the advertisement satisfying the second predetermined condition to the advertisement satisfying the first predetermined condition.
  • the first predetermined condition and the second predetermined condition may be the above-mentioned predetermined conditions, and these may be different.
• the above-mentioned specific brand may be identified by character recognition in an image.
• the above-mentioned advertiser likewise may be identified by character recognition in the image.
• an advertisement blocked by trees or buildings may be one whose view is obstructed for a predetermined ratio or more of the time period during which the advertisement could otherwise be recognized in the image.
  • the advertisement statistical information may include the number related to the advertisement and the information indicating a specific area including the position information obtained by capturing the image used to generate the number related to the advertisement in association with each other.
  • the information indicating such a region may be generated as a specific region including such a position by using the position information associated with the image.
  • the advertisement statistical information may include the number related to the advertisement and the information indicating a specific time range including the time when the image used to generate the number related to the advertisement is imaged in association with each other.
  • Information indicating such a specific time range may be generated as a specific time range including such a time by using the information of the time associated with such an image.
  • the statistical processing unit may generate advertisement statistical information from information related to advertisements acquired from one terminal device, or may generate advertisement statistical information from information related to advertisements acquired from a plurality of terminal devices. In the latter case, regarding the information related to the advertisement related to the same or predetermined range of position information, the advertisement statistical information is obtained only when the information related to the same advertisement is acquired from a predetermined number or a predetermined ratio or more of the terminal devices. It may be configured to generate, or it may be configured to generate advertisement statistics even when the information related to the advertisement is acquired from one terminal device.
  • the statistical processing unit may include the corresponding predetermined conditions for the advertisement statistical information. This has the advantage that it is possible to retain information as to what kind of conditions the advertisement statistical information satisfies.
  • FIG. 20 is an example of advertising statistical information.
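The count and ratio described above (a first predetermined condition selecting advertisements, a second condition applied within that selection) can be sketched as follows; the brand names, sizes, and the choice of "not blocked" as the second condition are assumptions for illustration:

```python
# Hypothetical advertisements detected in images, each with attributes that
# the predetermined conditions might test.
ads = [
    {"brand": "X", "width_m": 3.0, "blocked": False},
    {"brand": "X", "width_m": 1.0, "blocked": True},
    {"brand": "Y", "width_m": 2.0, "blocked": False},
    {"brand": "X", "width_m": 4.0, "blocked": False},
]

def ad_statistics(ads, brand: str):
    """Return the count of ads satisfying the first condition (specific
    brand) and the ratio of those also satisfying the second condition
    (here: not blocked by trees or buildings)."""
    first = [a for a in ads if a["brand"] == brand]   # first condition
    second = [a for a in first if not a["blocked"]]   # second condition
    ratio = len(second) / len(first) if first else 0.0
    return len(first), ratio

count, visible_ratio = ad_statistics(ads, "X")
```

Here two of the three brand-X advertisements are unobstructed, so the ratio is 2/3; the two conditions can of course be any pair of the conditions listed in the text.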
  • the statistical processing unit may generate human information.
  • Person information may be generated based on information related to the degree of congestion of people.
  • the person information may include the number of people and the information indicating the area where the person is located in association with each other.
  • the person information may include the number of people and the information including the time when the area where the person is present is imaged in association with each other.
• the statistical processing unit may generate human information from information related to the degree of congestion of people acquired from one terminal device, or from information related to the degree of congestion of people acquired from a plurality of terminal devices. In the latter case, regarding the information related to the degree of congestion of people associated with a time within a predetermined range and with a predetermined area, the information from terminal devices satisfying a predetermined condition and / or the average number of people across the terminal devices may be used to generate the above-mentioned number of people.
  • FIG. 21 is an example of human information.
• the statistical processing unit may generate a 3D map.
  • the 3D map may be a 3D digital representation of the landscapes on both sides as seen from the roadway.
  • the 3D map may be generated from the image.
  • the 3D map may be generated using one or more images captured by an imaging device that images the front and / or rear of the terminal device.
  • the means for generating a 3D map from one or more images may be a known means.
  • the statistical processing unit may generate a 3D map from an image acquired from one terminal device, or may generate a 3D map from an image acquired from a plurality of terminal devices. In the latter case, an image associated with a predetermined time range and associated with the same location information may be used to generate a 3D map.
• by using images captured within a predetermined time range, there is an advantage that the 3D map can be generated based on information that does not change over time. Further, there is an advantage that a more accurate 3D map can be generated by generating it from a plurality of images of the same place, captured with the same position information.
  • the information and communication unit may communicate information with one or more terminal devices.
  • the information communication unit may acquire information from one or more terminal devices, and may transmit information to one or more terminal devices.
• the information communication unit may acquire, from one or more terminal devices, information related to at least a part of the image acquisition unit, the self-vehicle information generation unit, the image information generation unit, the storage unit, and the output unit.
• the information communication unit may transmit, to one or more terminal devices, information relating to at least a part of the image acquisition unit, the self-vehicle information generation unit, the image information generation unit, the storage unit, the output unit, and the statistical processing unit.
  • the information communication unit may have a function of communicating information with various systems.
• the information communication unit may transmit at least some of the other vehicle statistical information, congestion information, weather information, road information, road abnormality information, accident information, vehicle fuel provision information, advertising statistical information, human information, and 3D map to a system for road users.
• the road users may include a company that provides maps, a carrier that uses roads, a taxi company that uses roads, a company that provides car navigation, a company that maintains roads, a company that provides services related to automatic driving, and / or a company that provides services to municipalities, or the municipalities themselves.
• the information communication unit may transmit the accident information to a system related to an insurance company.
• the information communication unit may transmit the advertisement statistical information to a system related to an advertising company.
• the information communication unit may transmit the person information to a system related to the radio wave service of mobile phones.
  • the system example according to the first embodiment is an embodiment mainly used as a drive recorder by using some or all of the above-mentioned functions.
  • a user of an example system may have, for example, a smartphone as a terminal device.
  • the software related to the system of the example may be downloaded and installed in advance on such a smartphone.
• when boarding a vehicle used by a user of the example system (hereinafter also referred to as a "user vehicle"), the user attaches the terminal device to the user vehicle.
  • the mounting mode may be various. It may be removable.
• the rectangular smartphone may be attached to the vehicle in a horizontally long orientation, longer in the horizontal direction than in the vertical direction, or in a vertically long orientation, longer in the vertical direction than in the horizontal direction. In the case of the horizontally long orientation, there is an advantage that the lane in which the own vehicle travels and vehicles traveling in adjacent lanes may also fall within the wide imaging range.
• the example system may automatically start imaging. That is, upon startup, the imaging device in the terminal device may be activated to begin imaging. Since imaging starts automatically upon startup without any other operation, there is an advantage that the user can avoid forgetting to start the imaging operation and is saved the trouble of starting it. In another example system, imaging does not have to start at the same time as startup.
  • the shooting may be started by selecting the shooting button by the user.
  • the shooting button may be a mechanical and physical button in the smartphone, or may be a touch on the screen.
  • the terminal device may set the autofocus to infinity at the time of imaging and perform imaging.
• the bonnet of the own vehicle may enter the imaging field of view, and when the inventor conducted an experiment, the camera sometimes focused on the bonnet. In that case the focus is at about 2 meters and not on the vehicle in front, which may reduce the accuracy of acquiring information for identifying the vehicle in front.
• the terminal device may also focus on water droplets on the windshield or on the wiper in operation, and similarly the accuracy of acquiring information for identifying the vehicle in front may be reduced.
  • there is an advantage that such a decrease in accuracy can be prevented by automatically setting the focus at the time of imaging of the terminal device to infinity.
  • the system of one example stores information related to the driving situation at that time together with the captured video.
  • the video to be stored may be divided into files of a fixed size. For example, a video file may be created for each X MB (megabytes) of a predetermined storage amount, and when the predetermined amount is exceeded, the next file may be created to continue storing the video.
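The fixed-size file rotation described above can be sketched as follows; this is an illustrative Python sketch, and the class name, file naming, and sizes are assumptions for illustration, not part of the disclosure:

```python
import os

class ChunkedRecorder:
    """Write an incoming byte stream into numbered files, starting a new
    file whenever the current one would exceed max_bytes (the X MB limit)."""

    def __init__(self, directory, max_bytes):
        self.directory = directory
        self.max_bytes = max_bytes
        self.index = 0           # current file number
        self.current_size = 0    # bytes written to the current file
        os.makedirs(directory, exist_ok=True)

    def _path(self):
        return os.path.join(self.directory, f"drive_{self.index:04d}.bin")

    def write(self, chunk: bytes):
        # Roll over to the next file when the stored amount would exceed the limit.
        if self.current_size + len(chunk) > self.max_bytes and self.current_size > 0:
            self.index += 1
            self.current_size = 0
        with open(self._path(), "ab") as f:
            f.write(chunk)
        self.current_size += len(chunk)
```

With a 1000-byte limit, five 400-byte writes would produce three files (800, 800, and 400 bytes).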
  • the video to be shot may include audio. This is because, as a drive recorder, voice in the driving situation is also important information.
  • the system of the example may present various displays or emit sounds on the output unit according to the content of the captured image and the information related to the driving situation. For example, when the distance between the vehicle driving ahead and the own vehicle is closing, a warning may be issued; the warning may be displayed on the output unit in a manner that attracts the viewer's attention, or a sound may be emitted to alert the people in the vehicle. Further, as information related to the driving situation, sudden braking, sudden acceleration, and sudden steering may be warned of in the same way.
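The warning logic described above, covering a closing inter-vehicle distance as well as sudden braking, sudden acceleration, and sudden steering, can be sketched as follows; the function name and all threshold values are illustrative assumptions:

```python
def driving_warnings(prev_distance_m, distance_m, accel_ms2, steer_rate_dps,
                     close_threshold_m=5.0, accel_threshold=3.0,
                     steer_threshold=90.0):
    """Return a list of warning strings for the output unit.
    All thresholds are illustrative assumptions, not disclosed values."""
    warnings = []
    # Warn when the vehicle ahead is close and the gap is shrinking.
    if distance_m < close_threshold_m and distance_m < prev_distance_m:
        warnings.append("approaching vehicle ahead")
    # Sudden acceleration / sudden braking from longitudinal acceleration.
    if accel_ms2 >= accel_threshold:
        warnings.append("sudden acceleration")
    elif accel_ms2 <= -accel_threshold:
        warnings.append("sudden braking")
    # Sudden steering from the steering (yaw) rate.
    if abs(steer_rate_dps) >= steer_threshold:
        warnings.append("sudden steering")
    return warnings
```

Each returned string could then be rendered as an attention-attracting display or sound on the output unit.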
  • An example system may generate a drive report when the vehicle has finished driving. For example, an example system may generate statistical information about driving conditions and include it in a drive report.
  • the example system may transmit the stored information to the management server.
  • the captured moving image and information related to the driving situation may be transmitted to the management server.
  • the timing of transmission may be configured so that the terminal device starts transmission when communication with the other information processing device becomes possible in a predetermined communication mode.
  • the predetermined communication mode may be, for example, WIFI.
  • One example system may automatically erase the video stored in the terminal device after transmission to the management server.
  • the terminal device may automatically delete the video after it acquires information confirming that the video has been stored in the management server.
  • automatic erasing may be performed without confirmation of the user.
  • because video stored as a drive recorder consumes a large amount of the terminal device's storage capacity, erasing it after transmission frees capacity for the next time the device is used.
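The erase-after-confirmation behavior described above can be sketched as follows; `upload` and `confirm_stored` are hypothetical stand-ins for the terminal device's actual transmission API, which is not specified here:

```python
import os

def erase_after_upload(path, upload, confirm_stored):
    """Upload the file at `path`, then delete it locally only after the
    management server confirms it has been stored. `upload` and
    `confirm_stored` stand in for the real transmission API (assumed)."""
    video_id = upload(path)
    if confirm_stored(video_id):
        os.remove(path)          # automatic erasure, no user confirmation
        return True
    return False                 # keep the local copy until storage is confirmed
```

Keeping the local copy until the server confirms storage guards against losing a recording to a failed transfer.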
  • the system of the example may have a function of reproducing the information stored in the terminal device. In addition, it may have a function of displaying information related to the driving situation.
  • FIG. 22 is an example of a screen transition diagram of the terminal device according to the system of this example.
  • the agreement screen 002 is displayed. Such a screen is displayed only when the application is started for the first time after installation.
  • the TOP screen 003 is displayed when the agreement has been accepted, or on the second and subsequent startups. From the top screen, it is possible to move to the recording screen 004, the captured video list screen 006, and the setting screen 008.
  • the screen can be moved to the drive report screen 005.
  • the drive report screen can display statistical information about the driving status of the drive during recording.
  • the captured video list screen 006 can be switched to the captured video playback screen 007.
  • the screen transitions shown here are an example. For instance, it may be possible to play back a selected captured video from the captured video list screen, or to display the drive report screen for a captured video.
  • FIG. 23 is a screen waiting for a shooting instruction after the system of one example is started.
  • the image pickup button 002 is displayed large on the display screen.
  • imaging can be performed according to a user's instruction.
  • the image pickup button may have an area of one tenth or more of the area of the display screen.
  • when the image pickup button has an area larger than a predetermined area, there is an advantage that the user can more easily instruct the start of imaging.
  • FIG. 24 is an example of an imaging screen during driving.
  • the vehicles ahead are each surrounded by a red frame 002, indicating that each vehicle is recognized.
  • information 003 regarding the driving situation is also displayed.
  • the horizontal axis shows the elapsed time and the vertical axis shows the speed of the vehicle. Therefore, where the slope of the line in the graph is steep, sudden acceleration, sudden braking, or sudden steering can be inferred, and such a display may be made.
  • when the distance to the vehicle in front is short, particularly when the distance to the vehicle in front in the same lane is short, it may be displayed as a dangerous inter-vehicle distance.
  • the sign 004 on the road may be recognized and the information thereof may be acquired.
  • FIG. 25 shows a list of captured images.
  • a list of each video 001 is displayed according to the time. These may be from the start of one shooting to the end of shooting, or may be listed for each file divided for each predetermined storage amount as described above.
  • FIG. 26 is another example showing a list of captured images.
  • FIG. 27 shows the captured video being displayed at 001, and also shows a list 002 of some videos.
  • FIG. 28 is an example showing a horizontally long image / display instead of the vertically long image / display described above.
  • the image may be captured by a horizontally mounted smartphone 001.
  • FIG. 29 is an example of displaying the operating status during imaging.
  • the horizontal axis is the time, and the vertical axis is the speed of the own vehicle, and a mark indicating the time when sudden acceleration, sudden steering, or sudden braking occurs is displayed.
  • FIG. 30 is another example of displaying the operating status during imaging.
  • the screen is a vertical screen.
  • the horizontal axis is the time and the vertical axis is the speed of the own vehicle, and a mark indicating the time when sudden acceleration, sudden steering, or sudden braking occurs is displayed.
  • FIG. 31 is an example showing a state in which a smartphone mounted horizontally is recognizing a vehicle in front.
  • the vehicle in front is surrounded by a frame 001 to indicate that it is recognized.
  • the output unit may display information that identifies the vehicle identified in this way. Further, not only vehicles: the output unit may also display information that identifies pedestrians, two-wheeled vehicles (including bicycles), and the like.
  • when the image information generation unit identifies a person together with what appears to be a two-wheeled vehicle, it may identify the object as a two-wheeled vehicle; when it identifies a person without identifying anything like a two-wheeled vehicle, it may identify the object as a pedestrian.
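The pedestrian versus two-wheeled-vehicle rule described above can be sketched as a simple overlap test on detected bounding boxes; the box format, function name, and overlap threshold are assumptions for illustration:

```python
def classify_person(person_box, two_wheeler_boxes, overlap_threshold=0.3):
    """Classify a detected person as 'two-wheeled' when their bounding box
    overlaps something that looks like a two-wheeled vehicle, otherwise
    'pedestrian'. Boxes are (x1, y1, x2, y2); the threshold is assumed."""
    px1, py1, px2, py2 = person_box
    p_area = max(0, px2 - px1) * max(0, py2 - py1)
    for (vx1, vy1, vx2, vy2) in two_wheeler_boxes:
        # Intersection of the person box with the candidate two-wheeler box.
        ix = max(0, min(px2, vx2) - max(px1, vx1))
        iy = max(0, min(py2, vy2) - max(py1, vy1))
        if p_area and (ix * iy) / p_area >= overlap_threshold:
            return "two-wheeled"
    return "pedestrian"
```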
  • the line of information for identifying such a vehicle may be a circle or a polygon instead of a rectangle.
  • the display for identifying the target vehicle indicating such recognition may be a line surrounding the target vehicle or a line not surrounding the target vehicle.
  • the display identifying such a vehicle may cover various types of vehicles, for example a passenger car, a commercial vehicle, a truck, a taxi, a motorcycle, and the like. Further, the display identifying such a vehicle may be limited to cases where the above-mentioned distance information includes a distance within a predetermined range; for example, a display identifying a vehicle may be made only if the vehicle is within a range of 15 meters. In addition, the display identifying the vehicle may be presented in various modes.
  • the color, the shape of the display, and the like may be changed.
  • the display for identifying the vehicle may change the mode of the marking according to the information included in the above-mentioned distance information. For example, when the distance information includes information of 2 meters or more and 5 meters or less, it may be yellow, and when the distance information includes information of 5 meters or more and 15 meters or less, it may be blue or the like.
  • the distance included in the distance information may be displayed in association with the display that identifies the vehicle. For example, it may be displayed as "3 m" or the like in association with a display that identifies the vehicle in front. If the distance information includes a distance between 0 meters and 2 meters, it is not necessary to display the distance.
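The distance-dependent marking described above (yellow for 2 to 5 meters, blue for 5 to 15 meters, no display beyond 15 meters, and no distance label under 2 meters) can be sketched as follows; the red marking for the under-2-meter case is an added assumption, not stated in the text:

```python
def marking_style(distance_m):
    """Choose the frame colour and distance label for a recognized vehicle.
    Bands follow the text: 2-5 m yellow, 5-15 m blue, nothing beyond 15 m,
    and no distance label under 2 m (red there is an assumption)."""
    if distance_m > 15.0:
        return None                           # outside the display range
    if distance_m < 2.0:
        return {"color": "red", "label": ""}  # mark the vehicle, hide distance
    color = "yellow" if distance_m <= 5.0 else "blue"
    return {"color": color, "label": f"{distance_m:.0f} m"}
```

The returned style would drive how the frame around the recognized vehicle is rendered on the output unit.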
  • whether or not the distance information is displayed for a vehicle may be determined according to the position on the output unit at which the display identifying that vehicle appears.
  • the output unit may display only when the display identifying the vehicle is displayed at a predetermined position of the output unit.
  • the output unit may display only the vehicle whose rectangle that identifies the vehicle includes the center of the screen. Whether or not the rectangle that identifies the vehicle is at the center of the screen may be determined, for example, by comparing the coordinate position that specifies the rectangle with the coordinate position at the center of the screen.
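The screen-center test described above, comparing the coordinates specifying the rectangle with the coordinates of the screen center, can be sketched as; the function name and box format are assumptions:

```python
def contains_screen_center(box, screen_w, screen_h):
    """Return True when the rectangle identifying a vehicle contains the
    center of the screen. `box` is (x1, y1, x2, y2) in screen pixels."""
    cx, cy = screen_w / 2, screen_h / 2   # screen-center coordinates
    x1, y1, x2, y2 = box
    return x1 <= cx <= x2 and y1 <= cy <= y2
```

Only vehicles for which this test returns True would then receive the identifying display.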
  • FIG. 32 is an example in which the form of such a rectangular line is changed according to the proximity of the vehicle in front on the same lane and the own vehicle.
  • the line may be changed to a color that attracts the user's attention, such as red or yellow, or the thickness or decoration of the line may be changed. Further, in order to indicate that the vehicles are approaching each other, "approaching" may be displayed as shown in this figure.
  • FIG. 33 shows a state in which the image is being viewed after imaging.
  • video 001 is displayed.
  • sudden steering, sudden acceleration, and sudden braking are each marked in the graph 002.
  • the horizontal axis represents time and the vertical axis represents speed.
  • in FIG. 34, the case of a vertical screen is displayed.
  • Example 2: Other-vehicle information collection. The system of this example may have only the configuration essential to this embodiment, or may also have aspects required for other embodiments.
  • An example system acquires images from one or more terminal devices. Further, the system of one example may acquire one or more vehicle identification information and other vehicle determination information associated with the one or more vehicle identification information from one terminal device.
  • the system of one example may generate other vehicle statistical information by collecting other vehicle determination information for each vehicle specific information.
  • the other vehicle statistical information may be summarized in a predetermined period based on the images captured in the predetermined period.
  • other vehicle statistical information may be summarized in a predetermined area based on an image captured in the predetermined area.
  • a predetermined coefficient may be associated with each of sudden acceleration, sudden steering, and sudden braking, and a weighted total judgment score may be generated by multiplying each coefficient by the corresponding number of occurrences.
  • FIG. 35 is an example of other vehicle statistical information. In this figure, the vehicles are arranged in order of highest overall judgment score.
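The weighted total judgment score and the highest-score-first ordering of FIG. 35 can be sketched as follows; the coefficient values and the structure of `records` (vehicle-specific information mapped to event counts) are illustrative assumptions:

```python
def total_judgment_scores(records, w_accel=1.0, w_steer=1.5, w_brake=2.0):
    """Compute a weighted total judgment score per vehicle and rank the
    vehicles from highest to lowest score. The coefficients are assumed
    example values; `records` maps vehicle-specific information (e.g. a
    number plate) to counts of each sudden-driving event."""
    scores = {
        vehicle: (w_accel * c["sudden_acceleration"]
                  + w_steer * c["sudden_steering"]
                  + w_brake * c["sudden_braking"])
        for vehicle, c in records.items()
    }
    # Order of highest overall judgment score, as in FIG. 35.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```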
  • An example system acquires one or more pieces of specific vehicle information from one terminal device, searches the other vehicle statistical information for that specific vehicle information, and extracts the other vehicle statistical information related to that specific vehicle information.
  • Such other vehicle statistical information may be transmitted to the above-mentioned one terminal device.
  • the other vehicle statistical information related to the specific vehicle information may be, for example, the number of sudden accelerations, the number of sudden brakings, the number of sudden steerings, numerical values related to each of these counts, the total judgment score, the ranking, and the like.
  • the numerical value related to the number of times may be the probability of occurrence within a predetermined period or a coefficient obtained by using them.
  • the other vehicle statistical information related to the specific vehicle information may be processed into information that simply indicates the degree of danger. For example, it may include information of one of three options such as high, normal, and low.
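Reducing the score to a simple three-option degree of danger, as described above, can be sketched as follows; the threshold values are assumptions for illustration:

```python
def danger_level(total_score, high_threshold=10.0, low_threshold=3.0):
    """Process a total judgment score into one of three options:
    'high', 'normal', or 'low'. Thresholds are assumed example values."""
    if total_score >= high_threshold:
        return "high"
    if total_score <= low_threshold:
        return "low"
    return "normal"
```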
  • the other vehicle statistical information related to the specific vehicle information may be displayed in association with the one or more other vehicles.
  • FIG. 37 is an example of such a display.
  • In this figure, a display is associated with a vehicle to show that it has a history of driving more dangerous than a predetermined standard (for example, that the number of sudden accelerations, sudden brakings, and/or sudden steerings exceeds a predetermined standard, that the total judgment score exceeds a predetermined standard, and/or that the ranking is higher than a predetermined standard).
  • Since the vehicle 002 does not have a history of driving more dangerous than the predetermined standard, nothing need be displayed for it, as shown in this figure; alternatively, an indication such as "normal" may show that it does not have a dangerous driving history.
  • FIG. 38 shows an example in which an example system collects information acquired from a plurality of terminal devices, processes the information, and transmits the information to one or a plurality of terminal devices.
  • An example management system 004 may acquire information from the terminal devices 001 to 003, perform predetermined processing, and transmit the information to the terminal devices 001 to 003.
  • the terminal device for acquiring information and the terminal device for transmitting information may be the same or different.
  • Example 3: Vehicle information collection. The system of this example may include only the configurations essential to this embodiment, or may include aspects required for other embodiments.
  • An example system may acquire information related to a target from one or more terminal devices, and the timing may be arbitrary. For example, information may be acquired when the terminal device connects to a wireless communication device such as WIFI, when it connects under a communication standard such as 3G, 4G, or 5G, or in real time as the terminal device acquires the information to be transmitted. Real time may include transmission with a predetermined delay associated with information processing.
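The transmission-timing policy described above can be sketched as follows; the mode names and the wait-for-WIFI default are illustrative assumptions:

```python
def should_transmit(connection_type, realtime_mode=False):
    """Decide whether the terminal device should start transmitting now.
    By default, transmission waits for an unmetered WIFI connection; in
    real-time mode any supported network is acceptable. The mode names
    and the policy itself are assumptions for illustration."""
    unmetered = {"wifi"}
    cellular = {"3g", "4g", "5g"}
    c = connection_type.lower()
    if realtime_mode:
        return c in unmetered or c in cellular
    return c in unmetered
```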
  • FIG. 39 shows an example in which an example system collects information acquired from a plurality of terminal devices, processes the information, and transmits the information to the corresponding companies.
  • An example management system 004 may acquire information related to the target from the terminal devices 001 to 003 and transmit it to the systems 005 to 007 related to the target company.
  • the target company may include road users, insurance companies, local governments or companies related to local governments, advertising companies, mobile phone related companies, and the like.
  • An example system may transmit at least some of congestion information, weather information, road information, road abnormality information, accident information, vehicle fuel supply information, human information, and 3D maps (the above terms may include subordinate conceptual information). If such information is sent to a system related to a company that provides maps, it can be used to update the maps; if sent to a system related to a carrier that uses the road, it can be used to improve the carrier's efficiency; if sent to a system related to a taxi company that uses the road, it can be used for efficient taxi operation; if sent to a system related to a company that provides car navigation, it can be used to update the navigation information; and it may also be sent to a system related to a company that maintains the road.
  • FIG. 40 is an example of a flow in an example system.
  • An example system may acquire information transmitted from an information communication unit in such a terminal from one or more terminals.
  • the statistical processing unit related to the system of one example may perform statistical processing using the information acquired by the information and communication unit.
  • the example system may then send information, including such statistically processed information, to the target company.
  • the system and terminal device may be composed of one or a plurality of information processing devices.
  • the information processing device 10 may include a bus 15, an arithmetic device 11, a storage device 12, and a communication device 16. Further, the information processing device 10 in one embodiment may include an input device 13 and a display device 14. It is also directly or indirectly connected to the network 17.
  • the bus 15 may have a function of transmitting information between the arithmetic unit 11, the storage device 12, the input device 13, the display device 14, and the communication device 16.
  • An example of the arithmetic unit 11 is a processor, which may be a CPU or an MPU. Further, the arithmetic unit in one embodiment may include a graphics processing unit, a digital signal processor, and the like. In short, the arithmetic unit 11 may be any device capable of executing program instructions.
  • the storage device 12 is a device that records information. This may be either an external memory or an internal memory, and may be either a main storage device or an auxiliary storage device. Further, a magnetic disk (hard disk), an optical disk, a magnetic tape, a semiconductor memory, or the like may be used. Further, it may have a storage device via a network or a storage device on the cloud via a network.
  • registers, L1 cache, L2 cache, and the like, which store information at positions physically close to the arithmetic unit, may be included in the arithmetic unit 11; however, in the design of the computer architecture, the storage device 12, as the device for recording information, may include these.
  • the arithmetic unit 11, the storage device 12, and the bus 15 may be configured to cooperate with each other to execute information processing.
  • the storage device 12 can include a part or all of a program capable of executing the process according to the present invention. In addition, data necessary for executing the process according to the present invention can be appropriately recorded. Further, the storage device 12 in one embodiment may include a database.
  • the above describes the case where the arithmetic unit 11 executes based on the program provided in the storage device 12, but this is one of the forms in which the bus 15, the arithmetic unit 11, and the storage device 12 are combined.
  • part or all of the information processing according to the present invention may be realized by a programmable logic device capable of changing the hardware circuit itself or a dedicated circuit in which the information processing to be executed is determined.
  • the input device 13 is used to input information, but may also have other functions.
  • Examples of the input device 13 include a keyboard, a mouse, a touch panel, or a pen-type pointing device.
  • the display device 14 has a function of displaying information.
  • a liquid crystal display, a plasma display, an organic EL display, and the like can be mentioned, but in short, any device capable of displaying information may be used.
  • the display device 14 may also partially serve as the input device 13, as with a touch panel.
  • the network 17 transmits information together with the communication device 16; that is, it has the function of enabling information of the information processing device 10 to be transmitted to other information terminals (not shown) via the network 17.
  • the communication device 16 may use any connection type, such as IEEE1394, Ethernet (registered trademark), PCI, SCSI, USB, 2G, 3G, 4G, 5G, and the like.
  • the connection to the network 17 may be either wired or wireless.
  • the information processing device may be of a general-purpose type or a dedicated type. Further, the information processing device may be a workstation, a desktop personal computer, a laptop personal computer, a PDA, a mobile phone, a smartphone, or the like.
  • the system according to the present invention may be composed of a plurality of information processing devices.
  • the plurality of information processing devices may be internally connected or may be externally connected.
  • the system according to the present invention may be of various types of devices.
  • the system according to the present invention may be a stand-alone system, a server-client system, a peer-to-peer system, or a cloud system.
  • the system according to the present invention may be a stand-alone information processing device, or may be composed of some or all of the information processing devices of a server-client configuration, of a peer-to-peer configuration, or of a cloud configuration.
  • the owner and manager of each information processing device may be different.
  • the information processing device 10 may be a physical existence or a virtual one.
  • the information processing device 10 may be virtually realized by using cloud computing.
  • the configuration implemented by the system of this example may be configurations implemented by one or more information processing devices in the system.
  • the information processing device described above as the portable information processing device may be an information processing device that is appropriately installed and fixed.
  • the invention examples described in the documents of the present application are not limited to what is described therein, and can be applied in various forms within the scope of the technical idea.
  • the system may be configured so that information presented on the screen of one information processing device can be displayed on the screen of another information processing device, thereby transmitting the information to that other information processing device.
  • the placeholder values shown in the various drawings may take appropriate values according to each context, and those values may be the same as or different from one another.
  • the processes and procedures described in the documents of the present application may be realizable not only by what is explicitly described in the embodiments but also by software, hardware, or a combination thereof. Further, these processes and procedures may be implemented by various computers by implementing them as a computer program. These computer programs may be stored in a storage medium, which may be non-transitory or transitory.

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Image Analysis (AREA)

Abstract

[Problem] A system according to an example of the present invention makes it possible to use data obtained from an image more appropriately. [Solution] A system comprising: a first acquisition unit that acquires, from a first portable terminal device, first vehicle specification information for specifying a first vehicle in a first image captured by the first portable terminal device, and first vehicle determination information obtained by determining a driving situation of the first vehicle from the first image; a second acquisition unit that acquires, from a second portable terminal device, second vehicle specification information for specifying a second vehicle in a second image captured by the second portable terminal device, and second vehicle determination information obtained by determining a driving situation of the second vehicle from the second image; a determination unit that determines whether the first vehicle and the second vehicle are identical; and a statistical processing unit that generates, using the first and second vehicle determination information when the first vehicle and the second vehicle are identical, statistical information concerning the first vehicle.
PCT/JP2019/046746 2019-11-29 2019-11-29 Système de traitement d'informations, dispositif de traitement d'informations, dispositif terminal, dispositif de serveur, programme, ou procédé WO2021106180A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/046746 WO2021106180A1 (fr) 2019-11-29 2019-11-29 Système de traitement d'informations, dispositif de traitement d'informations, dispositif terminal, dispositif de serveur, programme, ou procédé
JP2019566377A JP6704568B1 (ja) 2019-11-29 2019-11-29 情報処理システム、情報処理装置、端末装置、サーバ装置、プログラム、又は方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/046746 WO2021106180A1 (fr) 2019-11-29 2019-11-29 Système de traitement d'informations, dispositif de traitement d'informations, dispositif terminal, dispositif de serveur, programme, ou procédé

Publications (1)

Publication Number Publication Date
WO2021106180A1 true WO2021106180A1 (fr) 2021-06-03

Family

ID=70858192

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/046746 WO2021106180A1 (fr) 2019-11-29 2019-11-29 Système de traitement d'informations, dispositif de traitement d'informations, dispositif terminal, dispositif de serveur, programme, ou procédé

Country Status (2)

Country Link
JP (1) JP6704568B1 (fr)
WO (1) WO2021106180A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009060581A1 (fr) * 2007-11-05 2009-05-14 Fujitsu Ten Limited Dispositif de surveillance de voisinage, système d'assistance à la circulation en toute sécurité, et véhicule
JP2017069917A (ja) * 2015-10-02 2017-04-06 株式会社東芝 通信処理装置、車載装置、及び通信処理方法
JP2017182678A (ja) * 2016-03-31 2017-10-05 日本電気株式会社 運転状態判定装置、運転状態判定方法、プログラム
JP2018112892A (ja) * 2017-01-11 2018-07-19 スズキ株式会社 運転支援装置
WO2019030802A1 (fr) * 2017-08-07 2019-02-14 本田技研工業株式会社 Système de commande de véhicule, procédé de commande de véhicule et programme

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009134704A (ja) * 2007-11-05 2009-06-18 Fujitsu Ten Ltd 周辺監視装置、安全走行支援システム、及び車両

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009060581A1 (fr) * 2007-11-05 2009-05-14 Fujitsu Ten Limited Dispositif de surveillance de voisinage, système d'assistance à la circulation en toute sécurité, et véhicule
JP2017069917A (ja) * 2015-10-02 2017-04-06 株式会社東芝 通信処理装置、車載装置、及び通信処理方法
JP2017182678A (ja) * 2016-03-31 2017-10-05 日本電気株式会社 運転状態判定装置、運転状態判定方法、プログラム
JP2018112892A (ja) * 2017-01-11 2018-07-19 スズキ株式会社 運転支援装置
WO2019030802A1 (fr) * 2017-08-07 2019-02-14 本田技研工業株式会社 Système de commande de véhicule, procédé de commande de véhicule et programme

Also Published As

Publication number Publication date
JPWO2021106180A1 (ja) 2021-12-02
JP6704568B1 (ja) 2020-06-03

Similar Documents

Publication Publication Date Title
Singh et al. Analyzing driver behavior under naturalistic driving conditions: A review
US11640174B2 (en) Smart vehicle
US10816993B1 (en) Smart vehicle
US9443152B2 (en) Automatic image content analysis method and system
JP6796798B2 (ja) イベント予測システム、イベント予測方法、プログラム、及び移動体
US20180075747A1 (en) Systems, apparatus, and methods for improving safety related to movable/ moving objects
US20180211117A1 (en) On-demand artificial intelligence and roadway stewardship system
US20170132710A1 (en) System and method for monitoring driving to determine an insurance property
US20200058218A1 (en) Determining causation of traffic events and encouraging good driving behavior
JP2019525185A (ja) 目標指向のナビゲーション指示を提供する方法及び装置
IL247503A (en) Traffic information system
JPWO2014013985A1 (ja) 運転支援システム及び運転支援方法
KR20190087936A (ko) 광고 차량 및 차량 관리 시스템
US20200074507A1 (en) Information processing apparatus and information processing method
JP7176098B2 (ja) 自律型車両のための行列の検出および行列に対する応答
JP2008269178A (ja) 交通情報表示装置
JP2016126756A (ja) 危険判定方法、危険判定装置、危険出力装置及び危険判定システム
JP2012038089A (ja) 情報管理装置、データ解析装置、信号機、サーバ、情報管理システム、およびプログラム
US20160189323A1 (en) Risk determination method, risk determination device, risk determination system, and risk output device
US20230039738A1 (en) Method and apparatus for assessing traffic impact caused by individual driving behaviors
JP6997471B2 (ja) 情報処理システム、情報処理装置、端末装置、サーバ装置、プログラム、又は方法
JP6842099B1 (ja) 情報処理システム、情報処理装置、端末装置、サーバ装置、プログラム、又は方法
WO2021106180A1 (fr) Système de traitement d'informations, dispositif de traitement d'informations, dispositif terminal, dispositif de serveur, programme, ou procédé
Walcott-Bryant et al. Harsh brakes at potholes in Nairobi: Context-based driver behavior in developing cities
US20230052037A1 (en) Method and apparatus for identifying partitions associated with erratic pedestrian behaviors and their correlations to points of interest

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019566377

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19954411

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19954411

Country of ref document: EP

Kind code of ref document: A1