CN111179582B - Image management device, road surface information management system, vehicle, computer-readable storage medium, and image management method - Google Patents


Info

Publication number
CN111179582B
Authority
CN
China
Prior art keywords
image
vehicle
condition
unit
determining
Prior art date
Legal status
Active
Application number
CN201911012891.1A
Other languages
Chinese (zh)
Other versions
CN111179582A (en)
Inventor
浅海寿夫
坂本浩介
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Publication of CN111179582A publication Critical patent/CN111179582A/en
Application granted granted Critical
Publication of CN111179582B publication Critical patent/CN111179582B/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/017 Detecting movement of traffic to be counted or controlled identifying vehicles

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Processing Or Creating Images (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention provides an image management device, a road surface information management system, a vehicle, a computer-readable storage medium, and an image management method that address the problem that storing all image data acquired during travel on a server uselessly consumes the resources of the server and the communication network. The image management device includes a storage determination unit that determines the images to be stored among a plurality of images captured by an imaging unit that images the exterior of a moving body, and a condition determination unit that determines the condition for determining the images to be stored. The storage determination unit determines the images to be stored according to the condition determined by the condition determination unit.

Description

Image management device, road surface information management system, vehicle, computer-readable storage medium, and image management method
Technical Field
The invention relates to an image management device, a road surface information management system, a vehicle, a computer-readable storage medium, and an image management method.
Background
A technology is known in which an in-vehicle terminal transmits road information related to the road on which a vehicle is traveling (for example, the pavement state of the road surface) to a server (see, for example, Patent Document 1).
Patent Document 1: Japanese Laid-Open Patent Publication No. 2015-184183
Disclosure of Invention
If all the image data acquired during traveling is stored in the server, resources of the server and the communication network are uselessly consumed.
In the first aspect of the present invention, an image management device is provided. The image management device includes, for example, a storage determination unit that determines the image to be stored from among a plurality of images captured by an imaging unit that images the exterior of a moving body. The image management device includes, for example, a condition determination unit that determines a condition for determining the image to be stored. In the image management device, the imaging unit is mounted on the moving body, for example. In the image management device, the storage determination unit determines the image to be stored, for example, according to the condition determined by the condition determination unit. In the image management device, the condition determination unit includes, for example, a speed information acquisition unit that acquires speed information indicating the speed of the moving body. In the image management device, the condition determination unit includes, for example, an interval condition determination unit that determines, based on the speed of the moving body, at least one of (i) a condition relating to the time interval between when one image is determined to be a storage target and when the next image to be stored is determined, and (ii) a condition relating to the distance that the moving body moves between when one image is determined to be a storage target and when the next image to be stored is determined, as the condition for determining the image to be stored.
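Purely as an illustration and not part of the disclosure, the sketch below shows one way such an interval condition might be derived from the speed of the moving body; the function name interval_condition and the 25 m target spacing are assumptions for this example only.

```python
# Hypothetical sketch: derive the time interval between saved images from the
# speed of the moving body so that saved images keep a target spatial spacing.
# `interval_condition` and `target_spacing_m` are illustrative assumptions.

def interval_condition(speed_kmh: float, target_spacing_m: float = 25.0) -> float:
    """Return the time interval (seconds) between consecutive save decisions."""
    speed_ms = max(speed_kmh, 1.0) / 3.6  # avoid division by zero when stopped
    return target_spacing_m / speed_ms

print(interval_condition(90.0))   # 25 m/s -> save one image per second
print(interval_condition(180.0))  # 50 m/s -> save one image every 0.5 s
```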
The image management device described above may include a distance information acquisition unit that acquires distance information indicating the distance between the moving body and an object present in the moving direction of the moving body. In the image management device described above, the condition determination unit may include a distance condition determination unit that determines, as the condition for determining the image to be stored, a condition that an image captured while the distance indicated by the distance information is smaller than a predetermined value is not a storage target.
In the image management device described above, the moving body may be a vehicle. The object present in the moving direction of the moving body may be another vehicle present in front of the moving body. A plurality of vehicles may be assigned unique vehicle identification numbers. In the image management device described above, the condition determination unit may include an image condition determination unit that determines, as the condition for determining the image to be stored, a condition that an image is not a storage target when the vehicle identification number assigned to another vehicle included in the image captured by the imaging unit is recognized.
In the second aspect of the present invention, an image management device is provided. The image management device includes, for example, a storage determination unit that determines the image to be stored from among a plurality of images captured by an imaging unit that images the exterior of a moving body. The image management device includes, for example, a distance information acquisition unit that acquires distance information indicating the distance between the moving body and an object present in the moving direction of the moving body. In the image management device, the imaging unit is mounted on the moving body, for example. In the image management device, the storage determination unit determines, for example, that an image captured by the imaging unit while the distance indicated by the distance information is smaller than a predetermined value is not a storage target.
In the image management device described above, the moving body may be a vehicle. The object present in the moving direction of the moving body may be another vehicle present in front of the moving body. A plurality of vehicles may be assigned unique vehicle identification numbers. In the image management device described above, the distance information may be information indicating that the vehicle identification number assigned to another vehicle included in the image captured by the imaging unit has been recognized. In the image management device described above, the storage determination unit may determine that the distance indicated by the distance information is smaller than the predetermined value when the vehicle identification number assigned to the other vehicle is recognized.
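A minimal sketch of this aspect is given below, assuming a license-plate recognizer whose output stands in for the distance information; the names and the 20 m threshold are illustrative assumptions, not part of the disclosure.

```python
from typing import Optional

# Hypothetical sketch of the second aspect: if the identification number of a
# preceding vehicle can be read from the image, the inter-vehicle distance is
# regarded as smaller than the predetermined value and the image is not saved.

def is_storage_target(plate: Optional[str], distance_m: Optional[float],
                      min_distance_m: float = 20.0) -> bool:
    if plate is not None:
        return False  # readable plate -> preceding vehicle is too close
    if distance_m is not None and distance_m < min_distance_m:
        return False  # distance sensor reports a too-small gap
    return True
```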
In the first and second aspects described above, the image management device may include an identification number storage unit that stores the vehicle identification number assigned to another vehicle included in the image captured by the imaging unit when that vehicle identification number is recognized. The image management device may include an identification number transmission unit that transmits information including the recognized vehicle identification number to a vehicle information collection device that collects vehicle identification numbers. The image management device may include an image transmission unit that transmits information including the image data of the image to be stored to an image information collection device that collects images.
In the third aspect of the present invention, a vehicle is provided. The vehicle includes, for example, the image management device according to the first or second aspect described above.
In the fourth aspect of the present invention, a road surface information management system is provided. The road surface information management system includes, for example, the image management device according to the first or second aspect described above. The road surface information management system includes, for example, an image information collection device that collects image data of the images to be stored from a plurality of the image management devices. In the road surface information management system, for example, the images to be stored include images of roads. In the road surface information management system, for example, the image information collection device analyzes the images to be stored to generate road surface information relating to the state of the road surface of a road.
In the fifth aspect of the present invention, a program is provided. A non-transitory computer-readable medium storing the program may also be provided. The program may be a program for causing a computer to function as the image management device according to the first or second aspect described above. The program may be a program for causing a computer to execute the information processing method in the image management device according to the first or second aspect described above.
In the sixth aspect of the present invention, an image management method is provided. The image management method may include a storage determination step in which a computer determines the image to be stored from among a plurality of images captured by an imaging unit that is mounted on a moving body and images the exterior of the moving body. The image management method may include a condition determination step in which a computer determines a condition for determining the image to be stored. The condition determination step includes, for example, a speed information acquisition step in which a computer acquires speed information indicating the speed of the moving body. The condition determination step includes, for example, an interval condition determination step in which the computer determines, based on the speed of the moving body, (i) a condition relating to the time interval between when one image is determined to be a storage target and when the next image to be stored is determined, and (ii) a condition relating to the distance that the moving body moves between when one image is determined to be a storage target and when the next image to be stored is determined, as the condition for determining the image to be stored. The storage determination step includes, for example, a step in which the computer determines the image to be stored according to the condition determined in the condition determination step.
In the seventh aspect of the present invention, an image management method is provided. The image management method includes, for example, a storage determination step in which a computer determines the image to be stored from among a plurality of images captured by an imaging unit that is mounted on a moving body and images the exterior of the moving body. The image management method includes, for example, a distance information acquisition step in which a computer acquires distance information indicating the distance between the moving body and an object present in the moving direction of the moving body. In the image management method, the storage determination step includes, for example, a step in which the computer determines that an image captured by the imaging unit while the distance indicated by the distance information is smaller than a predetermined value is not a storage target.
The above summary of the invention does not list all of the necessary features of the invention. Sub-combinations of these feature groups may also constitute inventions.
Drawings
Fig. 1 schematically shows an example of a system configuration of the management system 100.
Fig. 2 schematically shows an example of the internal configuration of the management server 110.
Fig. 3 schematically shows an example of the internal structure of vehicle 120.
Fig. 4 schematically shows an example of the internal configuration of the image management unit 124.
Fig. 5 schematically shows an example of the internal configuration of the storage condition determining unit 420.
Fig. 6 schematically shows an example of information processing in the image management unit 124.
Fig. 7 schematically shows an example of information processing performed by the interval condition determination unit 524.
Fig. 8 schematically shows an example of information processing performed by the interval condition determination unit 524.
Fig. 9 schematically shows an example of an image captured by the vehicle exterior imaging unit 122 of the vehicle 120.
Description of reference numerals
10 communication network, 100 management system, 110 management server, 120 vehicle, 122 vehicle exterior photographing part, 124 image management part, 126 license plate, 140 communication terminal, 220 road surface management server, 222 road surface information management part, 224 road surface information storage part, 240 vehicle management server, 242 vehicle information management part, 244 vehicle information storage part, 322 GPS signal receiving part, 324 running state detection part, 330 driving part, 340 communication part, 350 control part, 352 input/output control part, 354 vehicle control part, 356 communication control part, 420 storage condition determining part, 432 storage time determining part, 434 inter-vehicle distance determining part, 440 image extracting part, 450 image analyzing part, 460 additional information acquiring part, 472 vehicle information transmitting part, 474 image information transmitting part, 522 speed information acquiring part, 524 interval condition determining part, 526 distance condition determining part, 528 image condition determining part, 720 area, 722 area, 740 area, 744 area, 822 running speed, 824 required time, 826 required time, 828 sampling frequency, 902 image, 904 image, 910 dotted line, 920 leading vehicle, 926 license plate.
Detailed Description
The present invention will be described below with reference to embodiments of the invention, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention. In the drawings, the same or similar components are denoted by the same reference numerals, and redundant description thereof may be omitted.
[ overview of the management system 100 ]
Fig. 1 schematically shows an example of the system configuration of the management system 100. In the present embodiment, the management system 100 includes a management server 110 and a vehicle 120. The management system 100 may include a communication terminal 140 used by a user of the management system 100. In the present embodiment, vehicle 120 includes vehicle exterior imaging unit 122, image management unit 124, and license plate 126.
The management system 100 may be an example of a road surface information management system. The management server 110 may be an example of an image information collection device and a vehicle information collection device. The vehicle 120 may be an example of the image management apparatus and the vehicle. The vehicle 120 may be an example of a mobile body. The vehicle exterior imaging unit 122 may be an example of an imaging unit. The image management section 124 may be an example of an image management apparatus.
In the present embodiment, the management server 110 and the vehicle 120 can transmit and receive information to and from each other via the communication network 10. The management server 110 and the communication terminal 140 can transmit and receive information via the communication network 10. The vehicle 120 and the communication terminal 140 can transmit and receive information via the communication network 10.
In the present embodiment, the communication network 10 may be a wired communication transmission path, a wireless communication transmission path, or a combination of the two. The communication network 10 may include a wireless packet communication network, the Internet, a P2P network, a private line, a VPN, a wired communication line, and the like. The communication network 10 may include (i) a mobile communication network such as a cellular phone network and (ii) a wireless communication network such as a wireless MAN (e.g., WiMAX (registered trademark)), a wireless LAN (e.g., WiFi (registered trademark)), Bluetooth (registered trademark), Zigbee (registered trademark), or NFC (Near Field Communication).
In the present embodiment, the management server 110 collects images. More specifically, the management server 110 collects information (sometimes referred to as image information) relating to images captured by the vehicle exterior imaging unit 122 mounted on one or more vehicles 120. The image information may include image data of the image, information indicating at least one of a position and a time at which the image is captured, or any information that is recognized by analyzing the image.
When the collected image includes an image of a road, the management server 110 may analyze the collected image to generate road surface information relating to the state of the road surface of the road. The management server 110 detects at least one of deterioration of the road surface, presence or absence or degree of freezing of the road surface, and presence or absence or degree of wetness or flooding of the road surface, for example, based on the road surface information.
In the present embodiment, the management server 110 collects various information (sometimes referred to as vehicle information) related to one or more vehicles 120. The vehicle information may include a vehicle identification number given to the vehicle 120, and information indicating at least one of a position and a time at which the vehicle identification number is captured.
In the present embodiment, the vehicle identification number is identification information for identifying each of a plurality of vehicles. The plurality of vehicles are assigned unique vehicle identification numbers. The vehicle identification number may be identification information composed of letters, numbers, symbols, or a combination thereof. Examples of the vehicle identification number include an automobile registration number. In the present embodiment, the vehicle identification number of each vehicle is indicated on the license plate 126 attached to that vehicle.
Management server 110 may generate information indicating the movement history of a vehicle 120 identified by a specific vehicle identification number, for example, in response to a request from communication terminal 140. As a response to the request, the management server 110 may transmit information indicating the movement history to the communication terminal 140. The management server 110 can use the movement history, for example, to identify the position or movement history of a vehicle detected as stolen, uninspected, or the like.
In the present embodiment, the vehicle 120 captures an image of the outside of the vehicle while the vehicle is traveling. The vehicle 120 transmits image data of an image satisfying a specific condition among the captured images to the management server 110. On the other hand, the vehicle 120 does not transmit image data of an image that does not meet the above-described condition to the management server 110. The vehicle 120 can discard the image data of the image that does not meet the above-described condition at an appropriate timing.
The management server 110 can detect, for example, (i) irregularities on the road surface, (ii) deterioration of the road surface, and (iii) dividing lines or road signs arranged on the road surface, using the transmitted images. The management server 110 can also evaluate, for example, (i) the unevenness of the road surface, (ii) the deterioration of the road surface, and (iii) the state of dividing lines or road signs arranged on the road surface, using the transmitted images.
Examples of the vehicle 120 include automobiles, two-wheeled vehicles, and the like. Examples of two-wheeled vehicles include (i) motorcycles, (ii) motor tricycles, and (iii) standing-type vehicles having a power unit, such as a kickboard (registered trademark) with a power unit or a skateboard with a power unit.
In the present embodiment, the vehicle exterior imaging unit 122 is mounted on the vehicle 120. Vehicle exterior imaging unit 122 images the exterior of vehicle 120. The vehicle 120 outputs image data of the captured image to the image management unit 124.
In the present embodiment, the image management unit 124 manages the image data of the images captured by the vehicle exterior capturing unit 122. In one embodiment, the image management unit 124 divides the images captured by the vehicle exterior imaging unit 122 into images to be saved (sometimes referred to as save targets) and images to be discarded (sometimes referred to as discard targets). The image management unit 124 transmits the image data of the images to be saved to the management server 110. On the other hand, the image management unit 124 discards the image data of the images to be discarded without transmitting it to the management server 110.
In another embodiment, the image management unit 124 analyzes at least a part of the image captured by the vehicle exterior capturing unit 122 to determine whether or not a vehicle identification number can be recognized. When a vehicle identification number is recognized in the image, the image management unit 124 transmits, for example, vehicle information including the vehicle identification number and information indicating at least one of the position and the time at which the image was captured to the management server 110. Details of the image management unit 124 will be described later.
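As an informal illustration of how these two embodiments could fit together, the sketch below assumes callback functions recognize_plate, send_vehicle_info, and send_image_info, and assumes the condition, described later, under which an image containing a recognized vehicle identification number is excluded from the save targets; none of these names come from the patent.

```python
# Hypothetical sketch of the flow of the image management unit 124: recognized
# vehicle identification numbers are reported as vehicle information, images
# that satisfy the save condition are reported as image information, and all
# other image data is discarded.

def handle_frame(image, meta, save_condition, recognize_plate,
                 send_vehicle_info, send_image_info):
    plate = recognize_plate(image)              # None when no plate is readable
    if plate is not None:
        send_vehicle_info({"vehicle_id": plate,
                           "position": meta["position"],
                           "time": meta["time"]})
    if plate is None and save_condition(image, meta):
        send_image_info({"image_data": image, **meta})
    # Images that are neither saved nor needed for vehicle information are
    # discarded at an appropriate timing.
```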
In the present embodiment, the details of the communication terminal 140 are not particularly limited as long as it can transmit and receive information to and from at least one of the management server 110 and the vehicle 120. Examples of the communication terminal 140 include personal computers and portable terminals. Examples of the portable terminal include a mobile phone, a smartphone, a PDA, a tablet computer, a notebook computer, a portable computer, and a wearable computer.
The communication terminal 140 may support one or more communication systems. Examples of the communication systems include mobile communication systems, wireless MAN systems, wireless LAN systems, and wireless PAN systems. Examples of the mobile communication systems include the GSM (registered trademark) system, the 3G system, the LTE system, the 4G system, and the 5G system. Examples of the wireless MAN system include WiMAX (registered trademark). Examples of the wireless LAN system include WiFi (registered trademark). Examples of the wireless PAN system include Bluetooth (registered trademark), Zigbee (registered trademark), NFC (Near Field Communication), and the like.
[ specific configuration of the units of the management system 100 ]
The units of the management system 100 may be implemented by hardware, software, or both. At least some of the units of the management system 100 may be implemented by a single server or by a plurality of servers. At least some of the units of the management system 100 may be implemented on a virtual machine or a cloud system. At least some of the units of the management system 100 may be implemented by a personal computer or a portable terminal. Examples of the portable terminal include a mobile phone, a smartphone, a PDA, a tablet computer, a notebook computer, a portable computer, and a wearable computer. The units of the management system 100 may store information using distributed ledger technology such as a blockchain, or a distributed network.
When at least some of the components constituting the management system 100 are realized by software, those components can be realized by starting, on an information processing apparatus having a general configuration, a program that defines the operations of those components. The information processing apparatus includes, for example, (i) a data processing device including a processor such as a CPU or a GPU, a ROM, a RAM, a communication interface, and the like, (ii) input devices such as a keyboard, a touch panel, a camera, a microphone, various sensors, and a GPS receiver, (iii) output devices such as a display device, a speaker, and a vibration device, and (iv) storage devices (including external storage devices) such as a memory and an HDD.
In the information processing apparatus, the data processing apparatus or the storage apparatus may store a program. The program described above may be stored in a non-transitory computer-readable recording medium. The program causes the information processing apparatus to execute the operation defined by the program when the program is executed by the processor.
The program may be stored in a computer-readable medium such as a CD-ROM, a DVD-ROM, a memory, a hard disk, or a storage device connected to a network. The program may be installed into a computer constituting at least a part of the management system 100 from a computer-readable medium or a storage device connected to a network. By executing the program, the computer can function as at least a part of each unit of the management system 100.
The program that causes the computer to function as at least a part of each unit of the management system 100 may include a module that defines the operation of each unit of the management system 100. These programs and modules operate on a data processing device, an input device, an output device, a storage device, and the like, cause a computer to function as each unit of the management system 100, or cause a computer to execute an information processing method in each unit of the management system 100.
When the program is read into a computer, the information processing described in the program functions as a concrete means in which the software related to the program cooperates with the various hardware resources of the management system 100. This concrete means realizes the calculation or processing of information corresponding to the purpose of use of the computer in the present embodiment, thereby constructing the management system 100 corresponding to that purpose of use.
The program may be a program for causing a computer to function as the management system 100 or the image management unit 124. The program may be a program for causing a computer to execute the information processing method in the management system 100 or the image management section 124.
In one embodiment, the information processing method described above may be an image management method. The image management method includes, for example, a storage determination step of determining the image to be stored from among a plurality of images captured by an imaging unit that is mounted on a moving body and images the exterior of the moving body. The image management method includes, for example, a condition determination step of determining a condition for determining the image to be stored.
In the image management method, the condition determination step may include a speed information acquisition step of acquiring speed information indicating the speed of the moving body. In the image management method, the condition determination step may include an interval condition determination step of determining, based on the speed of the moving body, at least one of (i) a condition relating to the time interval between when one image is determined to be a storage target and when the next image to be stored is determined, and (ii) a condition relating to the distance that the moving body moves between when one image is determined to be a storage target and when the next image to be stored is determined, as the condition for determining the image to be stored. In the image management method, the storage determination step may include a step of determining the image to be stored according to the condition determined in the condition determination step.
In another embodiment, the information processing method described above may be an image management method. The image management method includes, for example, a storage determination step of determining the image to be stored from among a plurality of images captured by an imaging unit that is mounted on a moving body and images the exterior of the moving body. The image management method includes, for example, a distance information acquisition step of acquiring distance information indicating the distance between the moving body and an object present in the moving direction of the moving body. In the image management method, the storage determination step may include a step of determining that an image captured by the imaging unit while the distance indicated by the distance information is smaller than a predetermined value is not a storage target.
[ overview of the units of the management server 110 ]
Fig. 2 schematically shows an example of the internal configuration of the management server 110. In the present embodiment, the management server 110 includes a road surface management server 220 and a vehicle management server 240. In the present embodiment, road surface management server 220 includes a road surface information management unit 222 and a road surface information storage unit 224. In the present embodiment, the vehicle management server 240 includes a vehicle information management unit 242 and a vehicle information storage unit 244.
The road surface management server 220 may be an example of an image information collection device. The vehicle management server 240 may be an example of a vehicle information collection device.
In the present embodiment, the road surface management server 220 manages image information. The road surface information management unit 222 receives image information including image data of an image to be stored from the image management unit 124 of one or more vehicles 120. The road surface information management unit 222 manages the image information. For example, the road surface information management unit 222 adds an appropriate index to the image information and stores the image information in the road surface information storage unit 224. This facilitates storage, search, analysis, and the like of the image information.
For example, when the image to be stored includes an image of a road, the road surface information management unit 222 analyzes the image to be stored, and generates road surface information relating to the state of the road surface of the road. The state of the road surface of the road includes (i) the presence or absence of irregularities on the road surface or the state thereof, (ii) the presence or absence of reflection from the road surface or the state thereof, and (iii) the presence or absence of a dividing line or a road mark disposed on the road surface or the state thereof. The irregularities of the road surface include undulation of the road surface, depressions of the road surface, cracks of the road surface, and the like.
The road surface information management unit 222 may detect at least one of deterioration of the road surface, freezing of the road surface, and wetting or flooding of the road surface based on the road surface information. The road surface information management unit 222 may also evaluate the degree of at least one of deterioration of the road surface, freezing of the road surface, and wetting or flooding of the road surface. The road surface information management unit 222 may add an appropriate index to the road surface information, the detection result, or the evaluation result, and store it in the road surface information storage unit 224.
For example, the road surface information management unit 222 evaluates the state of deterioration of the road surface based on the state of irregularities on the road surface. The road surface information management unit 222 may also evaluate the state of deterioration of the road surface based on the state of irregularities on the road surface and the state of a dividing line or a road sign disposed on the road surface. When, as a result of this evaluation, the state of deterioration of the road surface is determined to be worse than a predetermined state, the road surface information management unit 222 can detect the deterioration of the road surface.
For example, the road surface information management unit 222 detects at least one of freezing of the road surface and wetting or flooding of the road surface based on the state of reflection of the road surface. The road surface information management unit 222 may also determine the degree of at least one of freezing of the road surface and wetting or flooding of the road surface based on the state of reflection of the road surface.
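As a loose illustration only: assuming a normalized specular-reflection score has already been computed for the road region of an image, a coarse mapping to road surface states might look like the sketch below; the thresholds and labels are assumptions, not values from the patent.

```python
# Hypothetical sketch of a reflection-based check: map a normalized
# specular-reflection score (0 = matte, 1 = mirror-like) for the road region
# to a coarse road surface state. Thresholds are illustrative assumptions.

def classify_road_reflection(reflection_score: float) -> str:
    if reflection_score > 0.8:
        return "possibly frozen"
    if reflection_score > 0.5:
        return "wet or flooded"
    return "dry"
```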
Through the processing described above, the road surface information management unit 222 can identify, for example, the positions of irregularities on the road surface, positions where the road surface has deteriorated, and the positions of dividing lines, road signs, and the like. The road surface information management unit 222 can identify positions where vehicles should pass with caution, positions where passage should be prohibited, positions where restoration work is required, positions where maintenance of the road surface is required, positions where maintenance of a dividing line or a road sign is required, and the like. The road surface information management unit 222 can also determine the priority of maintenance of the road surface, dividing lines, or road signs, the priority of restoration work, and the like.
In the present embodiment, the vehicle management server 240 manages vehicle information. The vehicle information management unit 242 receives vehicle information, including the vehicle identification numbers recognized in the images captured by each vehicle, from the image management units 124 of one or more vehicles 120. The vehicle information management unit 242 manages this vehicle information. For example, the vehicle information management unit 242 adds an appropriate index to the vehicle information and stores it in the vehicle information storage unit 244. This facilitates the storage, search, analysis, and the like of the vehicle information.
For example, the vehicle information management unit 242 generates a movement history of the vehicle 120 identified by the vehicle identification number included in the search request, in accordance with the search request from the communication terminal 140. In response to the request, the vehicle information management unit 242 may transmit information indicating the movement history to the communication terminal 140.
More specifically, first, the vehicle information management unit 242 acquires the vehicle identification number of the vehicle 120 to be searched for from the communication terminal 140. Next, the vehicle information management unit 242 refers to a database in which the movement histories of a plurality of vehicles 120 are stored, and extracts information indicating the movement history of the vehicle 120 to be searched for. The vehicle information management unit 242 then transmits information indicating the most recent position of the vehicle 120 to be searched for, or information indicating its movement history, to the communication terminal 140. Examples of the vehicle 120 to be searched for include a stolen vehicle, an uninspected vehicle, and a friend's vehicle.
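A small sketch of this search flow is given below, assuming the vehicle information storage can be modelled as a mapping from vehicle identification number to a list of reported sightings; the data layout and function names are assumptions for illustration.

```python
# Hypothetical sketch of the movement-history search: the vehicle information
# storage is modelled as {vehicle_id: [{"time": ..., "position": ...}, ...]}.

def movement_history(vehicle_db: dict, vehicle_id: str) -> list:
    """Return the reported sightings of the requested vehicle, newest first."""
    return sorted(vehicle_db.get(vehicle_id, []),
                  key=lambda s: s["time"], reverse=True)

def latest_position(vehicle_db: dict, vehicle_id: str):
    history = movement_history(vehicle_db, vehicle_id)
    return history[0]["position"] if history else None
```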
In the present embodiment, the road surface management server 220 and the vehicle management server 240 may be physically or logically separate servers. The vehicle management server 240 holds no image data, or holds a smaller amount or volume of image data than the road surface management server 220. The load of data processing in the vehicle management server 240 is therefore smaller than that in the road surface management server 220. As a result, the vehicle management server 240 can quickly execute processing such as searching for the movement history of a specific vehicle 120, detecting theft of a vehicle 120, and searching for the position and movement history of a stolen or uninspected vehicle.
[ overview of each unit of vehicle 120 ]
Fig. 3 schematically shows an example of the internal structure of vehicle 120. In the present embodiment, vehicle 120 includes vehicle exterior imaging unit 122, image management unit 124, GPS signal receiving unit 322, traveling state detection unit 324, driving unit 330, communication unit 340, and control unit 350. In the present embodiment, the control unit 350 includes an input/output control unit 352, a vehicle control unit 354, and a communication control unit 356.
In the present embodiment, the GPS signal receiving unit 322 receives a GPS signal. The GPS signal receiving unit 322 may generate position information indicating the position of the vehicle 120 based on the received GPS signal.
In the present embodiment, the traveling state detection unit 324 detects the traveling state of the vehicle 120 based on the output of any internal or external sensor disposed in the vehicle 120. The traveling state of the vehicle 120 includes speed, acceleration, inclination, vibration, noise, the operating state of the driving unit 330, the occurrence of an abnormality, the current position, the movement route, the temperature, humidity, and pressure of the external environment, the temperature, humidity, and pressure of the interior space, the relative position with respect to surrounding objects, the relative speed with respect to surrounding objects, the presence or absence or degree of congestion, the continuous driving time, the presence or absence or frequency of rapid acceleration, the presence or absence or frequency of rapid deceleration, and the like. The traveling state of the vehicle 120 may be an example of the moving state of a moving body.
In the present embodiment, the driving unit 330 drives the vehicle 120. The driving section 330 may drive the vehicle 120 according to a command from the control section 350. The driving unit 330 may generate power by an internal combustion engine or may generate power by an electric motor. The driving unit 330 may include at least one of an automatic safety device and an automatic driving device. The drive unit 330 may include various accessories as well as devices directly required for driving the vehicle 120. Examples of the accessory devices include security devices, seat adjusting devices, door lock management devices, window opening/closing devices, lighting devices, air conditioning devices, navigation devices, audio devices, and video devices.
In the present embodiment, the communication unit 340 transmits and receives information between the vehicle 120 and the management server 110 via the communication network 10. The communication unit 340 may support one or more communication systems.
In the present embodiment, the control unit 350 controls each unit of the vehicle 120. For example, the control unit 350 receives, as input, at least one of image data of an image captured by the vehicle exterior imaging unit 122, the position information generated by the GPS signal receiving unit 322, and information indicating the traveling state detected by the traveling state detecting unit 324. The control unit 350 transfers the input image data to the image management unit 124.
In the present embodiment, input/output control unit 352 controls input/output of information in vehicle 120. In the present embodiment, the input/output control unit 352 controls the vehicle exterior imaging unit 122, the image management unit 124, the GPS signal receiving unit 322, and the traveling state detecting unit 324. The input/output control unit 352 may control at least one of the other types of input devices and output devices (not shown).
In the present embodiment, the vehicle control unit 354 controls the operation of the vehicle 120. The vehicle control unit 354 may control the operation of the drive unit 330. For example, the vehicle control unit 354 acquires information output by the traveling state detection unit 324. The vehicle control unit 354 controls the operation of the driving unit 330 based on the information output from the traveling state detection unit 324.
In the present embodiment, the communication control unit 356 controls communication between the vehicle 120 and external devices. The communication control unit 356 may control the operation of the communication unit 340. The communication control unit 356 may be a communication interface. The communication control unit 356 may support one or more communication systems. The communication control unit 356 may detect or monitor the communication state between the management server 110 and the vehicle 120. The communication control unit 356 may generate communication information indicating the communication state based on the result of this detection or monitoring.
The communication information includes information on the availability of communication, the radio wave status, the communication quality, the type of communication system, the type of communication carrier, and the like. The radio wave status includes the radio wave reception level, the radio wave intensity, the RSCP (Received Signal Code Power), the CID (Cell ID), and the like. The communication quality includes the communication speed, the data communication traffic, the data communication delay time, and the like.
[ overview of each unit of the image management unit 124 ]
The details of the image management unit 124 will be described with reference to fig. 4 and 5. Fig. 4 schematically shows an example of the internal configuration of the image management unit 124. Fig. 5 schematically shows an example of the internal configuration of the storage condition determining unit 420.
As shown in fig. 4, in the present embodiment, the image management unit 124 includes a storage condition determining unit 420, a storage time determination unit 432, an inter-vehicle distance determination unit 434, an image extraction unit 440, an image analysis unit 450, an additional information acquisition unit 460, a vehicle information transmission unit 472, and an image information transmission unit 474. As shown in fig. 5, in the present embodiment, the storage condition determining unit 420 includes a speed information acquisition unit 522, an interval condition determination unit 524, a distance condition determination unit 526, and an image condition determination unit 528.
The storage condition determining unit 420 may be an example of the condition determination unit, the speed information acquisition unit, the interval condition determination unit, the distance condition determination unit, and the image condition determination unit. The storage time determination unit 432 may be an example of the storage determination unit. The inter-vehicle distance determination unit 434 may be an example of the storage determination unit and the distance information acquisition unit. The image analysis unit 450 may be an example of the storage determination unit and the identification number storage unit. The vehicle information transmission unit 472 may be an example of the identification number transmission unit. The image information transmission unit 474 may be an example of the image transmission unit.
In the present embodiment, the storage condition determining unit 420 determines a condition for determining an image to be a storage target or a candidate thereof. The storage condition determining unit 420 determines, for example, a temporal or geographical condition of an image to be stored or a candidate thereof. The temporal or geographical condition may be a condition related to the shooting time or shooting position of the image.
The storage condition determining unit 420 determines a condition for determining whether or not an image needs to be stored, based on, for example, the inter-vehicle distance between the vehicle 120 and another vehicle present in the vicinity of the vehicle 120. Examples of the condition for determining whether or not an image needs to be stored include (i) a condition that an image is not a storage target when the vehicle identification number of a vehicle (sometimes referred to as a captured vehicle) other than the vehicle 120 on which the vehicle exterior imaging unit 122 that captured the image is mounted is recognized in the image, (ii) a condition that an image is not a storage target when the license plate 126 of a captured vehicle is recognized in the image, and (iii) a condition that an image is not a storage target when the size of the image of a captured vehicle, or of a part of a captured vehicle, in the image satisfies a predetermined condition.
Examples of the part of the captured vehicle include parts or patterns whose size is predetermined by law, standard, or business practice. Examples of such parts include a license plate and a vehicle inspection sticker. Examples of such patterns include the vehicle identification number on a license plate and the characters or symbols on a vehicle inspection sticker. Examples of the predetermined condition include (i) a condition that the ratio of the size of the image of the captured vehicle, or of a part thereof, to the size of the entire image captured by the vehicle exterior capturing unit 122 is larger than a predetermined value, and (ii) a condition that the size of the image of the captured vehicle, or of a part thereof, is larger than a predetermined size.
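For condition (i) of the predetermined condition above, a minimal sketch is given below; the 5% ratio threshold and the bounding-box inputs are assumptions for illustration only.

```python
# Hypothetical sketch of the size-ratio condition: when the image of the
# captured vehicle (or of a part such as its license plate) occupies more than
# a predetermined fraction of the whole image, the image is excluded from the
# save targets. The threshold is an illustrative assumption.

def exclude_by_size(box_w: int, box_h: int, img_w: int, img_h: int,
                    max_ratio: float = 0.05) -> bool:
    """Return True when the image should not be saved."""
    return (box_w * box_h) / float(img_w * img_h) > max_ratio
```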
The storage condition determining unit 420 outputs information indicating the determined condition to at least one of the storage time determination unit 432, the inter-vehicle distance determination unit 434, and the image analysis unit 450. For example, the storage condition determining unit 420 outputs the condition determined by the interval condition determination unit 524 shown in fig. 5 to the storage time determination unit 432. The storage condition determining unit 420 may output the condition determined by the distance condition determination unit 526 shown in fig. 5 to the inter-vehicle distance determination unit 434. The storage condition determining unit 420 may output the condition determined by the image condition determination unit 528 shown in fig. 5 to the image analysis unit 450. Details of the storage condition determining unit 420 will be described later.
[ (a) an example of determining a save target based on a temporal or geographical condition ]
In the present embodiment, the storage time determination unit 432 determines candidate images to be stored from among the plurality of images captured by the vehicle exterior imaging unit 122. For example, the storage time determination unit 432 determines the time at which the vehicle 120 executes the processing for at least temporarily storing an image, thereby determining the candidate images to be stored. The storage time determination unit 432 may determine the candidate images to be stored based on the capturing time of the images. The storage time determination unit 432 may determine the candidate images to be stored based on the capturing position of the images.
The storage time determination unit 432 determines the candidate images to be stored, for example, according to the condition determined by the storage condition determining unit 420. The storage time determination unit 432 may determine the candidate images to be stored according to the condition determined by the interval condition determination unit 524 of the storage condition determining unit 420. The storage time determination unit 432 may output information for specifying the images determined as candidates to be stored to the image extraction unit 440. The information for specifying the images determined as candidates to be stored includes information indicating the capturing time of the candidate image, a signal indicating that the execution timing of the image saving process has come, a command for executing the image saving process, and the like.
[ (a-1) example of a procedure for determining a storage object based on a time condition ]
In one embodiment, the storage time determination unit 432 first acquires, from the storage condition determining unit 420, information indicating the temporal condition for an image to become a storage target or a candidate thereof. Next, the storage time determination unit 432 determines whether the temporal condition is satisfied. For example, the storage time determination unit 432 determines whether or not the capturing time of an image satisfies the temporal condition. When it determines that the capturing time of the image satisfies the temporal condition (that is, when the temporal condition is satisfied), the storage time determination unit 432 determines that the image is a candidate to be stored. In this way, the storage time determination unit 432 can determine the candidate images to be stored.
More specifically, first, the storage time determination unit 432 acquires, from the interval condition determination unit 524 of the storage condition determining unit 420, information indicating a condition relating to the time interval at which images are to be stored. The time interval is determined based on, for example, a setting of the capturing interval of the candidate images to be stored and the traveling speed of the vehicle 120.
For example, when it is required that images be at least temporarily stored at least once every 25 m of travel, the condition relating to the time interval may specify that one image is stored per second when the traveling speed is 90 km/h or less, two images per second when the traveling speed is between 90 km/h and 180 km/h, three images per second when the traveling speed is between 180 km/h and 270 km/h, and four images per second when the traveling speed is between 270 km/h and 360 km/h. Details of the condition relating to the time interval will be described later.
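The stepwise rule quoted above can be summarized as in the sketch below; the speed bands are taken directly from the example, while the function name is an assumption.

```python
# Sketch of the quoted rule: number of images to save per second so that, at
# the given traveling speed, saved images are spaced at most about 25 m apart.

def images_per_second(speed_kmh: float) -> int:
    if speed_kmh <= 90:
        return 1
    if speed_kmh <= 180:
        return 2
    if speed_kmh <= 270:
        return 3
    return 4  # covers speeds up to 360 km/h in the quoted example

print(images_per_second(120))  # -> 2 (one image roughly every 16.7 m)
```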
Next, the storage time determination unit 432 acquires information indicating the current time or information indicating the elapsed time from the imaging time of the image that has been determined as the candidate to be stored most recently. The storage time determination unit 432 determines whether or not the current time or the elapsed time satisfies the condition relating to the time interval, and determines the candidate images to be stored based on the determination result.
For example, the storage time determination unit 432 determines that an image captured at a time satisfying the condition relating to the interval of time described above is a candidate to be stored. The storage time determination unit 432 outputs information indicating the imaging time of the candidate images to be stored to the image extraction unit 440, for example. In addition, when the elapsed time satisfies the condition regarding the time interval, the storage time determination unit 432 may output a signal indicating that the execution time of the image storage process has come, or an instruction for executing the image storage process, to the image extraction unit 440. Thus, the image taken at the appropriate point in time is at least temporarily saved on the vehicle 120 side.
[ (a-2) an example of a procedure for determining a stored object based on geographical conditions ]
In another embodiment, the storage time determination unit 432 first acquires information indicating the geographical conditions of the images to be candidates for storage from the storage condition determination unit 420. Next, the storage time determination unit 432 determines whether or not the geographical condition described above is satisfied. For example, the storage time determination unit 432 determines whether or not the image capturing position of the image satisfies the geographical condition described above. For example, when it is determined that the imaging position of the image satisfies the geographical condition (that is, when the geographical condition is satisfied), the storage time determination unit 432 determines the image as the storage candidate. Thus, the storage time determination unit 432 can determine the candidate images to be stored.
More specifically, first, the storage time determination unit 432 acquires, from the interval condition determination unit 524 of the storage condition determining unit 420, information indicating a condition relating to the geographical interval at which images are to be stored. The geographical interval is determined based on, for example, a setting of the capturing interval of the candidate images to be stored. For example, the setting may require that images captured at 25 m intervals be at least temporarily stored. The geographical interval may also be determined based on a setting relating to the position coordinates of a position or an area at which images are to be stored. Details of the condition relating to the geographical interval will be described later.
Next, the storage time determination unit 432 acquires information indicating the current position of the vehicle 120 or information indicating the movement distance from the imaging position of the image that is determined as the candidate to be stored most recently. The storage time determination unit 432 determines whether or not the current position or the movement distance satisfies the condition relating to the geographical interval, and determines the candidate images to be stored based on the determination result.
For example, the storage time determination unit 432 determines that an image captured at a position where the current position or the movement distance satisfies the condition relating to the geographical interval is a candidate to be stored. The storage time determination unit 432 may determine whether or not the current position or the movement distance satisfies the geographical interval condition by determining whether or not the current time or the elapsed time from the latest shooting time satisfies the time interval condition.
The storage time determination unit 432 outputs, for example, information indicating the time when the vehicle 120 passes through a position satisfying the condition relating to the geographical interval to the image extraction unit 440 as information indicating the capturing time of the candidate image to be stored. In addition, when the elapsed time satisfies the condition regarding the geographical interval, the storage time determination unit 432 may output a signal indicating that the execution time of the image storage process has come, or an instruction to execute the image storage process, to the image extraction unit 440. Thus, the image taken at the appropriate point in time is at least temporarily saved on the vehicle 120 side.
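As a rough illustration of the position-based determination described above, the following sketch flags a frame as a save candidate once the vehicle has moved a set distance since the last candidate. The class name, the 25m default, and the use of a simple travelled-distance counter are assumptions for illustration only, not the actual implementation of the storage time determination unit 432.

```python
# Hypothetical sketch: mark a save candidate each time the vehicle has moved
# at least `geo_interval_m` since the previous candidate (names are illustrative).

class GeoIntervalChecker:
    def __init__(self, geo_interval_m: float = 25.0):
        self.geo_interval_m = geo_interval_m
        self.distance_since_last_m = 0.0

    def update(self, moved_m: float) -> bool:
        """Accumulate travelled distance; return True when a new candidate is due."""
        self.distance_since_last_m += moved_m
        if self.distance_since_last_m >= self.geo_interval_m:
            self.distance_since_last_m = 0.0
            return True
        return False

# Example: the vehicle reports roughly 5m of movement per update.
checker = GeoIntervalChecker(geo_interval_m=25.0)
print([checker.update(5.0) for _ in range(10)])
# -> every 5th update is flagged as a save candidate
```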
[ (b) an example of determining an element to be stored based on the inter-vehicle distance from another vehicle ]
In the present embodiment, the inter-vehicle distance determination unit 434 determines an image to be a storage target or a candidate thereof from among a plurality of images captured by the vehicle exterior imaging unit 122. The inter-vehicle distance determination unit 434 determines images to be stored or candidates thereof, for example, according to the conditions determined by the storage condition determination unit 420. The inter-vehicle distance determination unit 434 may output information for identifying an image to be stored or a candidate thereof to the image extraction unit 440.
For example, the inter-vehicle distance determination unit 434 acquires distance information indicating a distance between the vehicle 120 and an object existing in the movement direction of the vehicle 120. The object existing in the moving direction of the vehicle 120 may be another vehicle existing in front of the vehicle 120. The object existing in the moving direction of the vehicle 120 may be a vehicle traveling on the same traveling lane as the vehicle 120. The object existing in the moving direction of the vehicle 120 may be a vehicle that travels on a traveling lane heading in the same direction as, but different from, the traveling lane of the vehicle 120. The object existing in the moving direction of the vehicle 120 may be an oncoming vehicle. For example, the inter-vehicle distance determination unit 434 acquires information indicating the inter-vehicle distance to the vehicle ahead by using a distance sensor disposed in the vehicle 120. The inter-vehicle distance determination unit 434 may determine whether or not the distance indicated by the distance information satisfies the condition determined by the distance condition determination unit 526, and determine an image to be a storage target or a candidate thereof based on the determination result.
For example, as a condition for determining an image to be stored, the distance condition determining unit 526 determines a condition for defining a relationship between the image to be stored and the distance indicated by the distance information. At this time, the inter-vehicle distance determination unit 434 determines, as the storage target, an image captured while the distance indicated by the distance information satisfies the condition determined by the distance condition determination unit 526. The inter-vehicle distance determination unit 434 may determine that an image captured during a period in which the distance indicated by the distance information does not satisfy the condition determined by the distance condition determination unit 526 is not to be a storage target. The condition for defining the relationship between the image to be saved and the distance indicated by the distance information may be a condition that the distance indicated by the distance information is greater than a predetermined value.
In another embodiment, the distance condition determination unit 526 may determine a condition that defines a relationship between an image that is not to be stored and the distance indicated by the distance information. In this case, the inter-vehicle distance determination unit 434 may determine, as the storage target, an image captured during a period in which the distance indicated by the distance information does not satisfy the condition determined by the distance condition determination unit 526.
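A minimal sketch of this distance condition is shown below: a frame counts as a save target only while the measured inter-vehicle distance exceeds a threshold. The threshold value and the function name are assumptions for illustration, not the actual implementation of the inter-vehicle distance determination unit 434.

```python
# Hypothetical sketch of the inter-vehicle-distance condition (values are illustrative).

def is_save_target(inter_vehicle_distance_m: float, min_distance_m: float = 30.0) -> bool:
    """Keep the frame only while the preceding object is far enough away."""
    return inter_vehicle_distance_m > min_distance_m

print(is_save_target(55.0))  # True: the road surface dominates the frame
print(is_save_target(12.0))  # False: the preceding vehicle dominates the frame
```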
In the present embodiment, the image extracting unit 440 acquires information for specifying an image to be saved or a candidate thereof from at least one of the saving time determining unit 432 and the inter-vehicle distance determining unit 434. The image extracting unit 440 extracts image data of an image to be saved or a candidate thereof from the image data of a plurality of images captured by the vehicle exterior capturing unit 122. The image extracting unit 440 may output the extracted image data to the image analyzing unit 450.
In one embodiment, the extracted image data is output to the image analysis unit 450. In another embodiment, the extracted image data is temporarily stored in the image extracting unit 440, for example. The length of the period during which the image data is stored in the image extraction unit 440 may be 1 month or less, 2 weeks or less, 1 week or less, 3 days or less, 2 days or less, 1 day or less, 12 hours or less, or 6 hours or less.
[ (c) an example of determining an element to be saved based on an analysis result of an image ]
In the present embodiment, the image analysis unit 450 determines an image to be saved from among a plurality of images captured by the vehicle exterior imaging unit 122. The image analysis unit 450 determines an image to be stored, for example, according to the conditions determined by the storage condition determination unit 420. The image analysis unit 450 may determine the image to be stored according to the condition determined by the image condition determination unit 528.
The image analysis unit 450 may extract image data of an image to be saved from among image data of a plurality of images captured by the vehicle exterior imaging unit 122 based on the determination result. The image analysis unit 450 may output the extracted image data to the image information transmission unit 474.
In the present embodiment, the image analysis unit 450 analyzes (i) each of the image data of the plurality of images captured by the vehicle exterior imaging unit 122 or (ii) each of the image data extracted by the image extraction unit 440, and determines whether or not 1 or more vehicle identification numbers are included in the image indicated by each image data. The 1 or more vehicle identification numbers included in the image may be vehicle identification numbers given to the photographed vehicles. The photographed vehicle may be another vehicle existing in front of the vehicle 120. The photographed vehicle described above may be a vehicle that travels on the same travel lane as the vehicle 120. The photographed vehicle described above may be a vehicle that travels on a travel lane heading in the same direction as, but different from, the travel lane of the vehicle 120.
In one embodiment, when the vehicle identification number is recognized, the image analysis unit 450 determines that the vehicle identification number is included in the image. In another embodiment, the image analysis unit 450 determines that the image includes the vehicle identification number when (i) the size of the image of the characters or symbols constituting the vehicle identification number, or (ii) the ratio of that size to the size of the entire image, is larger than a predetermined value.
When it is determined that the image includes the vehicle identification number, the image analysis unit 450 may determine not to store the image. At this time (particularly, when 1 or more vehicle identification numbers in the image are recognized), the image analysis unit 450 may determine to store the 1 or more vehicle identification numbers in the management server 110. The image analysis unit 450 may output information indicating the recognized vehicle identification number to the vehicle information transmission unit 472.
On the other hand, when it is determined that the vehicle identification number is not included in the image, the image analysis unit 450 may determine to store the image in the management server 110. The image analysis unit 450 may output the image data of the image to the vehicle information transmission unit 472.
The image analysis unit 450 may analyze (i) each of the image data of the plurality of images captured by the vehicle exterior imaging unit 122 or (ii) each of the image data extracted by the image extraction unit 440, and determine whether or not a specific type of component disposed on the photographed vehicle is discriminated in the image indicated by each of the image data. The above-mentioned components include a license plate, a vehicle inspection seal, and the like. For example, the image analysis unit 450 determines that the component is discriminated in the image when (i) the size of the image of the component in the image, or (ii) the ratio of that size to the size of the entire image, is larger than a predetermined value.
When it is determined that the above-described member is discriminated in the image, the image analyzer 450 may determine not to store the image. At this time, the image analysis unit 450 may identify 1 or more vehicle identification numbers included in the image. The image analysis unit 450 may also determine to store the identified vehicle identification number in the management server 110. The image analysis unit 450 may output information indicating the identified vehicle identification number to the vehicle information transmission unit 472.
On the other hand, when it is determined that the image does not include the image of the component, the image analysis unit 450 may determine to store the image. The image analysis unit 450 may output the image data of the image to the vehicle information transmission unit 472.
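The size-based criteria described above (the absolute size of the characters or component, and the ratio of that size to the whole image) can be sketched roughly as follows. The detection-result format, the threshold values, and the routing labels are assumptions for illustration; a real implementation would rely on an actual license plate or character recognizer.

```python
# Hypothetical sketch: decide whether a detected plate (or the characters of a
# vehicle identification number) is discriminable, and route the frame accordingly.
from dataclasses import dataclass

@dataclass
class Detection:
    width_px: int            # bounding box of the plate / characters
    height_px: int
    frame_width_px: int
    frame_height_px: int

def is_discriminable(det: Detection,
                     min_area_px: int = 1200,
                     min_area_ratio: float = 0.002) -> bool:
    area = det.width_px * det.height_px
    frame_area = det.frame_width_px * det.frame_height_px
    return area > min_area_px or area / frame_area > min_area_ratio

def route_frame(detections: list) -> str:
    """'save_vehicle_number': discard the image, keep only the identification number.
       'save_image': the road surface is visible, keep the image itself."""
    if any(is_discriminable(d) for d in detections):
        return "save_vehicle_number"
    return "save_image"

print(route_frame([Detection(80, 20, 1920, 1080)]))  # save_vehicle_number
print(route_frame([Detection(20, 6, 1920, 1080)]))   # save_image
```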
In the present embodiment, the additional information acquiring unit 460 acquires additional information indicating at least one of the position and the time at which the image is captured. The additional information acquisition unit 460 acquires information indicating the position where the image is captured, for example, from the GPS signal reception unit 322. The additional information acquisition unit 460 acquires information indicating the time when the image is captured, for example, from the vehicle exterior imaging unit 122.
The additional information acquisition unit 460 may output, to the vehicle information transmission unit 472, information indicating at least one of the position and the time at which the image including the vehicle identification number is captured, which is output to the vehicle information transmission unit 472 by the image analysis unit 450, in association with the identification information of the image data. The additional information acquiring unit 460 may output, to the image information transmitting unit 474, information indicating at least one of the position and the time at which the image of the image data is captured, which is output from the image analyzing unit 450 to the image information transmitting unit 474, in association with the identification information of the image data.
In the present embodiment, the vehicle information transmitting unit 472 acquires information indicating the vehicle identification number from the image analyzing unit 450. The vehicle information transmitting unit 472 acquires the additional information corresponding to the vehicle identification number from the additional information acquiring unit 460. The additional information includes information indicating the position and time at which the image was captured. The vehicle information transmitting unit 472 generates vehicle information including information indicating the vehicle identification number and additional information. The vehicle information transmitting unit 472 may transmit the generated vehicle information to the vehicle management server 240.
In the present embodiment, the image information transmitting unit 474 acquires image data from the image analyzing unit 450. The image information transmitting unit 474 acquires the additional information corresponding to the image data from the additional information acquiring unit 460. The additional information includes information indicating the position and time at which the image was captured. The image information transmitting unit 474 generates image information including image data and additional information. The image information transmitting unit 474 may transmit the generated image information to the road surface management server 220.
The image information transmitting unit 474 can also generate image information including the image data, the additional information, and vibration data obtained by an acceleration sensor provided in the vehicle 120. Thus, the road surface information management unit 222 can estimate the state of the road surface using the vibration data.
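As a rough illustration of the two kinds of payloads handled by the transmitting units, the following sketch assembles a vehicle information record and an image information record. The field names and the dictionary layout are assumptions for illustration and do not specify the actual transmission format.

```python
# Hypothetical sketch of the two payloads (field names are illustrative assumptions).

def build_vehicle_info(vehicle_id_number: str, position, captured_at: str) -> dict:
    return {
        "vehicle_identification_number": vehicle_id_number,
        "position": position,        # e.g. (latitude, longitude) from the GPS receiver
        "captured_at": captured_at,  # time the source image was captured
    }

def build_image_info(image_data: bytes, position, captured_at: str, vibration=None) -> dict:
    info = {"image_data": image_data, "position": position, "captured_at": captured_at}
    if vibration is not None:
        # acceleration-sensor samples, usable for estimating the road surface state
        info["vibration"] = vibration
    return info

image_info = build_image_info(b"...jpeg bytes...", (35.68, 139.69),
                              "2019-10-23T10:15:00", vibration=[0.02, 0.11, -0.05])
print(sorted(image_info))  # ['captured_at', 'image_data', 'position', 'vibration']
```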
As described above, fig. 5 schematically shows an example of the internal configuration of the storage condition determining unit 420. The details of the storage condition determination unit 420 according to one embodiment will be described with reference to fig. 5.
In the present embodiment, speed information acquisition unit 522 acquires speed information indicating the speed of vehicle 120, for example. The speed information acquisition unit 522 may acquire speed information indicating the speed of the vehicle 120 from the running state detection unit 324. The speed information acquisition unit 522 outputs speed information indicating the speed of the vehicle 120 to, for example, the interval condition determination unit 524.
In the present embodiment, the interval condition determination unit 524 determines, for example, a condition related to a time point at which an image is stored and a condition related to a temporal or geographic interval. The interval condition determination unit 524 may determine, as the condition for determining the image to be the storage target or the image candidate thereof, at least one of (i) a condition on a time interval until the image to be the storage target or the image candidate thereof is determined next after one image is determined to be the storage target or the image candidate thereof and (ii) a condition on a distance that the mobile object moves until the image to be the storage target or the image candidate thereof is determined next after one image is determined to be the storage target or the image candidate thereof, based on the speed of the vehicle 120.
The temporal or geographical interval defines the time points at which an image to be a storage target or a candidate thereof is extracted or clipped from the image data continuously output from the vehicle exterior photographing unit 122. The temporal or geographical interval may be determined based on the speed of the vehicle 120, and may vary continuously or in stages according to the speed of the vehicle 120. In one embodiment, the interval of time is determined to be shorter the faster the speed of the vehicle 120. In another embodiment, the geographical interval is determined to be constant regardless of the speed of the vehicle 120.
The temporal or geographical interval may be determined based on at least 1 of (i) the speed of the vehicle 120, (ii) the specification or performance of the vehicle exterior photographing unit 122, (iii) the specification or performance of the vehicle exterior photographing unit 122 of a vehicle 120 traveling nearby, (iv) the amount of traffic at the position or in the area where the image is photographed, (v) the weather condition at the position or in the area where the image is photographed, and (vi) the time period during which the image is photographed. The temporal or geographical interval may vary continuously or in stages.
For example, the higher the resolution of the vehicle exterior imaging unit 122 of the host vehicle, the longer the temporal or geographical interval. For example, the higher the resolution of the vehicle exterior imaging unit 122 of another vehicle traveling in the vicinity of the host vehicle, the longer the temporal or geographical interval relating to the host vehicle, and the lower that resolution, the shorter the interval. For example, the worse the visibility due to the weather conditions, the shorter the temporal or geographical interval. For example, the poorer the visibility during the time period in which the image is captured, the shorter the temporal or geographical interval. For example, the greater the number of other vehicles traveling in the vicinity of the host vehicle, the longer the temporal or geographical interval.
The interval condition determination unit 524 acquires, for example, information on at least 1 of the specification or performance of the vehicle exterior imaging unit 122, the specification or performance of the vehicle exterior imaging unit 122 of the vehicle 120 traveling in the vicinity, the amount of traffic in the position or area where the image is captured, the weather condition in the position or area where the image is captured, and the time period during which the image is captured from the management server 110. The interval condition decision unit 524 may acquire information indicating a temporal or geographic interval from the management server 110.
[ example of procedure for determining geographical intervals ]
The geographical interval for determining the image to be stored is determined, for example, according to the following procedure. According to one embodiment, first, assuming that the vehicle exterior imaging unit 122 is disposed at an appropriate position and orientation, the distance at which the vehicle exterior imaging unit 122 can capture a discriminable image of a vehicle identification number (sometimes referred to as the effective range) is determined based on the resolution of the vehicle exterior imaging unit 122 and the size of the characters or symbols of the vehicle identification number on a general license plate 126. As the effective range of the vehicle exterior imaging unit 122, a value of the typical effective range of a commercially available vehicle-mounted camera can be used.
Next, the degree of overlap between the 1st image and the 2nd image that are stored consecutively is determined. Then, the geographical interval is determined based on the effective range of the vehicle exterior imaging unit 122 and the degree of overlap of the images.

For the purpose of simple explanation, it is assumed that the effective range of the vehicle exterior imaging unit 122 is 50m and the traveling speed of the vehicle 120 is constant. Here, for example, it is determined that the 1st image and the 2nd image, which are stored consecutively, overlap by 25m. In this case, half of each image overlaps; that is, the image of the lower half of the 1st image coincides with the image of the upper half of the 2nd image. The geographical interval is then 25m. Similarly, when the images overlap by 40m, the lower 4/5 of the 1st image and the upper 4/5 of the 2nd image coincide, and the geographical interval is 10m.
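The numbers in this example follow from a simple relationship, sketched below under the assumption that the geographical interval equals the effective range minus the overlap distance between two consecutively stored images.

```python
# Illustrative check of the example above (assumed relationship, not a defined API).

def geographical_interval(effective_range_m: float, overlap_m: float) -> float:
    return effective_range_m - overlap_m

print(geographical_interval(50.0, 25.0))  # 25.0m: lower half of image 1 matches upper half of image 2
print(geographical_interval(50.0, 40.0))  # 10.0m: 4/5 of the two images coincide
```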
According to another embodiment, the geographical interval is determined based on a setting of the position coordinates of positions or areas at which images are to be stored. For example, the setting includes the position coordinates of a plurality of positions at which images should be stored. The geographical interval is calculated as the distance between 2 of these position coordinates, for example as the distance between 2 adjacent position coordinates on the travel route of the vehicle 120. The travel route of the vehicle 120 is specified, for example, by the user of the vehicle 120.
In the present embodiment, the effective range of the vehicle exterior imaging unit 122 is determined as the distance at which characters or symbols of the size used for the vehicle identification number on a general license plate 126 can be discriminated. However, the method of determining the effective range of the vehicle exterior imaging unit 122 is not limited to the present embodiment. The effective range of the vehicle exterior imaging unit 122 can be determined based on the size of an arbitrary discrimination object.
In another embodiment, the effective range of the vehicle exterior imaging unit 122 is determined based on the resolution of the vehicle exterior imaging unit 122 and the size of the irregularities of the road surface to be detected. For example, when an irregularity having a length of at least 1 side of 5cm or more is to be detected, the effective range of the vehicle exterior imaging unit 122 is determined as a distance at which an object having a length of 1 side of 5cm can be discriminated by the vehicle exterior imaging unit 122.
[ example of flow for determining time intervals ]
The time interval used to determine the images to be stored is determined, for example, according to the following procedure. First, according to the above-described flow, the geographical interval for determining the image to be saved or the candidate thereof is determined. Next, the current traveling speed of the vehicle 120 is measured. Then, the time interval is determined based on the geographical interval and the traveling speed of the vehicle 120. The time interval may be determined in consideration of the frame rate of the vehicle exterior imaging unit 122. As the frame rate of the vehicle exterior imaging unit 122, a value of the typical frame rate of a commercially available vehicle-mounted camera can be used.
For the purpose of simple explanation, it is assumed that the geographical interval is 25m and the frame rate of the vehicle exterior imaging unit 122 is 30 frames per second (fps). In this case, as a condition relating to the time interval, for example, when the traveling speed of the vehicle 120 is 90km/h or less, it is determined that one image is stored per second. That is, it is decided to save an image every 30 frames.

Similarly, when the traveling speed of the vehicle 120 is between 90km/h and 180km/h, it is determined that two images are stored per second, that is, an image is saved every 15 frames. When the traveling speed of the vehicle 120 is between 180km/h and 270km/h, it is determined that three images are stored per second, that is, an image is saved every 10 frames.

On the other hand, when the traveling speed is between 270km/h and 360km/h, it is determined that four images are stored per second. In this case, an image may be saved every 8 frames or every 7 frames.
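The frame counts in these examples can be reproduced with the short sketch below, assuming a geographical interval of 25m, a frame rate of 30fps, and rounding the number of images per second up to an integer; the function names and rounding choices are assumptions for illustration.

```python
# Illustrative sketch reproducing the examples above (assumptions: 25m interval, 30fps).
import math

def images_per_second(speed_kmh: float, geo_interval_m: float = 25.0) -> int:
    speed_ms = speed_kmh / 3.6
    return max(1, math.ceil(speed_ms / geo_interval_m))

def frames_between_saves(speed_kmh: float, fps: int = 30, geo_interval_m: float = 25.0) -> int:
    return fps // images_per_second(speed_kmh, geo_interval_m)

for v in (90, 180, 270, 360):
    print(v, "km/h:", images_per_second(v), "image(s)/s, every", frames_between_saves(v), "frames")
# 90 km/h: 1 image/s, every 30 frames; 180: 2, every 15; 270: 3, every 10; 360: 4, every 7
```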
In the present embodiment, the distance condition determination unit 526 determines, as a condition for determining an image to be saved, for example, a condition that the image is not captured while the distance indicated by the distance information is less than a predetermined value. The distance condition determination unit 526 may determine, as a condition for determining an image to be saved, a condition that the image is captured during a period in which the distance indicated by the distance information is greater than a predetermined value.
The predetermined value may be determined based on at least 1 of (i) the speed of the vehicle 120, (ii) the specification or performance of the vehicle exterior imaging unit 122 that captures the image, (iii) the specification or performance of the vehicle exterior imaging unit 122 of the vehicle 120 that travels nearby, (iv) the weather condition of the location or area where the image is captured, (v) the time period during which the image is captured, and (vi) the amount of traffic in the location or area where the image is captured. The predetermined value may be varied continuously or in stages.
When the inter-vehicle distance to the preceding vehicle is small, most of the image is occupied by the image of the preceding vehicle and the amount of information relating to the road surface is small, so the value of transmitting the image to the road surface information management unit 222 is low. On the other hand, when the inter-vehicle distance to the preceding vehicle is small, the vehicle identification number can be read with high accuracy. By determining the distance condition as described above, the data capacity and the communication capacity can be used efficiently.
In the present embodiment, for example, when the vehicle identification number of the photographed vehicle is recognized in the image captured by the vehicle exterior imaging unit 122, the image condition determination unit 528 determines, as the condition for determining the image to be stored, that the image is not to be stored. For example, when the vehicle identification number of the preceding vehicle is recognized in an image captured by the vehicle exterior imaging unit 122 of a specific vehicle 120, the inter-vehicle distance between that vehicle 120 and the preceding vehicle is highly likely to be small, and the value of transmitting the image to the road surface information management unit 222 is therefore low. Thus, by determining the image condition as described above, the data capacity and the communication capacity can be used efficiently.
In the present embodiment, the details of the storage condition determining unit 420, the storage time determining unit 432, and the inter-vehicle distance determining unit 434 are described by taking as an example a case where the storage condition determining unit 420, the storage time determining unit 432, and the inter-vehicle distance determining unit 434 are disposed in the vehicle 120. However, the storage condition determining unit 420, the storage time determining unit 432, and the inter-vehicle distance determining unit 434 are not limited to the present embodiment. In another embodiment, at least a part of the functions of the storage condition determination unit 420 may be arranged in the management server 110. For example, at least one of the interval condition determination unit 524 and the distance condition determination unit 526 may be disposed in the management server 110. In another embodiment, at least a part of the functions of the retention time determination unit 432 may be disposed in the management server 110. In another embodiment, at least a part of the functions of the inter-vehicle distance determination unit 434 may be disposed in the management server 110.
In the present embodiment, the details of the image management unit 124 have been described by taking as examples a case where the inter-vehicle distance determination unit 434 acquires information indicating the distance to the vehicle ahead from a sensor disposed in the vehicle 120 and determines the image to be stored based on the distance indicated by the distance information, and a case where the image analysis unit 450 analyzes the image and determines the image to be stored. However, the image management unit 124 is not limited to these embodiments.
In another embodiment, the distance information may be information indicating that a vehicle identification number assigned to another vehicle is recognized in the image captured by the vehicle exterior imaging unit 122. In this case, for example, the image analysis unit 450 (i) analyzes the image, and (ii) outputs information indicating whether or not the vehicle identification number given to another vehicle is recognized in the image to the inter-vehicle distance determination unit 434. The inter-vehicle distance determination unit 434 may determine that the distance indicated by the distance information is smaller than a predetermined value when the vehicle identification number assigned to the other vehicle is identified.
The distance information may be information indicating that the license plate 126 of another vehicle is recognized in the image captured by the vehicle exterior imaging unit 122. In this case, for example, the image analysis unit 450 (i) analyzes the image, and (ii) outputs information indicating whether or not the license plate 126 disposed in another vehicle is recognized in the image to the inter-vehicle distance determination unit 434. When the license plate 126 disposed in another vehicle is recognized, the inter-vehicle distance determination unit 434 may determine that the distance indicated by the distance information is smaller than a predetermined value.
The distance information may be information indicating that the size of the image of the vehicle or a part of the vehicle satisfies a predetermined condition in the image captured by the vehicle exterior capturing unit 122. In this case, for example, the image analysis unit 450 (i) analyzes the image, and (ii) outputs information indicating whether or not the size of the image of the captured vehicle or a part of the captured vehicle in the image satisfies a predetermined condition to the inter-vehicle distance determination unit 434. The inter-vehicle distance determination unit 434 may determine that the distance indicated by the distance information is smaller than a predetermined value when the size of the image satisfies the predetermined condition. The predetermined condition includes (i) a condition that a ratio of a size of the image of the vehicle or a part of the vehicle to a size of the entire image is larger than a predetermined value, and (ii) a condition that a size of the image of the vehicle or a part of the vehicle is larger than a predetermined size.
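A minimal sketch of how such image-derived proxies could be turned into a determination that the distance is below the predetermined value is shown below; the argument names and the area-ratio threshold are assumptions for illustration.

```python
# Hypothetical sketch: infer "preceding vehicle too close" from image analysis results
# instead of a distance sensor (thresholds and argument names are illustrative).

def distance_below_threshold(vin_recognized: bool,
                             plate_recognized: bool,
                             vehicle_area_ratio: float,
                             max_area_ratio: float = 0.3) -> bool:
    """Treat the distance as below the predetermined value when any proxy holds."""
    return vin_recognized or plate_recognized or vehicle_area_ratio > max_area_ratio

print(distance_below_threshold(False, False, 0.05))  # False: keep the frame as a save target
print(distance_below_threshold(True, False, 0.05))   # True: exclude the frame
```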
Details of information processing in the image management unit 124 will be described with reference to fig. 6, 7, 8, and 9. Specifically, an example of a series of information processing in the image management unit 124 will be described with reference to fig. 6. An example of processing in which the interval condition determination unit 524 determines the imaging interval (which may be referred to as the imaging frequency) of the candidate images to be saved will be described with reference to fig. 7 and 8. An example of processing in which the image analysis unit 450 distinguishes between an image stored in the management server 110 and an image that is not stored in the management server 110 and is discarded will be described with reference to fig. 9.
Fig. 6 schematically shows an example of information processing in the image management section 124. In one embodiment, the information processing shown in fig. 6 is repeatedly executed at a predetermined cycle. The above-described period may vary according to the traveling speed of the vehicle 120. In another embodiment, the information processing shown in fig. 6 is executed, for example, each time a frame image constituting a moving image is captured.
According to the present embodiment, first, in S622 (the step may be abbreviated as S), speed information acquisition unit 522 acquires information indicating the traveling speed of vehicle 120. Next, in S624, the interval condition determination unit 524 determines the time interval or the geographical interval related to the storage of the image, based on the traveling speed of the vehicle 120.
In one embodiment, when the vehicle exterior imaging unit 122 captures a still image, the interval condition determination unit 524 determines an interval or frequency at which the vehicle exterior imaging unit 122 captures the still image. In another embodiment, when the vehicle exterior imaging unit 122 captures a moving image, the interval condition determining unit 524 determines an interval or frequency at which the image extracting unit 440 extracts frame images to be output to the image analyzing unit 450 from among a plurality of frame images captured in a unit time.
More specifically, the interval condition determination unit 524 determines the temporal or geographical interval at which still images are captured or frame images are extracted (which may be referred to as image sampling), for example, according to the following procedure. The interval of time at which the image is sampled may be the elapsed time between 2 consecutive sampling times. The geographical interval at which the image is sampled may be the distance between 2 consecutive sampling positions. The interval of time at which the image is sampled may be an example of a condition related to the sampling timing of the image. The geographical interval at which the image is sampled may be an example of a condition related to the sampling position of the image.
First, the interval condition determining unit 524 acquires a target value related to the distance between 2 consecutive sampling positions. The interval condition determination unit 524 may acquire information indicating the target value from an appropriate storage device (not shown) disposed in the vehicle 120 or the management server 110. The interval condition determination unit 524 may determine the target value based on the specification or performance of the vehicle exterior imaging unit 122 mounted on the vehicle 120. The interval condition determination unit 524 may analyze the image captured by the vehicle exterior imaging unit 122 to determine the target value.
The target value of the distance between the 2 consecutive sampling positions is determined based on, for example, (i) the value of the effective range of the vehicle exterior imaging unit 122 mounted on the vehicle 120 or the value of the general effective range of a commercially available vehicle-mounted camera, and (ii) the target value of the degree of overlap of the images among the 2 images that are consecutively sampled. The target value relating to the degree of image overlap may specify that at least a part of the objects of the 2 images overlap, or may specify that the objects of the 2 images do not overlap.
The interval condition determination unit 524 may acquire information indicating the effective range of the vehicle exterior imaging unit 122 from an appropriate storage device (not shown) disposed in the vehicle 120 or the management server 110. The interval condition determination unit 524 may analyze the image captured by the vehicle exterior imaging unit 122 to estimate the effective range of the vehicle exterior imaging unit 122. The interval condition determination unit 524 may acquire information indicating a target value regarding the degree of image duplication from an appropriate storage device (not shown) disposed in the vehicle 120 or the management server 110.
For the purpose of easy understanding of the information processing in the interval condition determining unit 524, an example of a flow in which the interval condition determining unit 524 determines a target value relating to the distance between consecutive 2 sampling positions so that a part of a region corresponding to the effective range of the vehicle exterior imaging unit 122 overlaps in 2 images that are consecutively sampled will be described using the specific example shown in fig. 7. In the example shown in fig. 7, the position L72 and the position L74 represent 2 consecutive sampling positions. In addition, the time T72 represents the time when the vehicle 120 passes through the position L72. The time T74 represents the time when the vehicle 120 passes through the position L74.
In the specific example shown in fig. 7, the image captured when the vehicle 120 passes through the position L72 includes an image of the area 720 on the road surface. The area 720 may be an area corresponding to the effective range of the vehicle exterior imaging unit 122 disposed at the position L72. Similarly, the image of the area 740 on the road surface is included in the image captured when the vehicle 120 passes through the position L74. The area 740 may be an area corresponding to the effective range of the vehicle exterior imaging unit 122 disposed at the position L74.
In the specific example shown in fig. 7, the effective range of the vehicle exterior imaging unit 122 is 50 m. The interval condition determination unit 524 determines the target value relating to the distance between the consecutive 2 sampling positions so that 50% of the area corresponding to the effective range of the vehicle exterior imaging unit 122 overlaps among the consecutively sampled 2 images.
The interval condition determination unit 524 calculates the target value L_T for the distance between 2 consecutive sampling positions based on the following equation (1), using, for example, (i) the effective range ER [m] of the vehicle exterior imaging unit 122 and (ii) the ratio DR [%] by which the regions corresponding to the effective range of the vehicle exterior imaging unit 122 overlap in the 2 consecutively sampled images.

[Numerical formula 1]

L_T [m] = ER [m] × (100 - DR [%]) / 100
According to equation (1), in the example shown in fig. 7, the target value L_T for the distance between 2 consecutive sampling positions is 25m. At this time, the distance between the position L72 and the position L74 is 25m, and 1 image is sampled every time the vehicle 120 moves forward by 25m. Further, the region 722, which occupies the front 50% of the region 720 of the road surface photographed when the vehicle 120 passes through the position L72 with respect to the traveling direction of the vehicle 120, and the region 744, which occupies the rear 50% of the region 740 of the road surface photographed when the vehicle 120 passes through the position L74, may be substantially the same region. Thus, for example, even on a road with a relatively small amount of traffic, the image of the road surface of the road can be uniformly sampled by the single vehicle 120.
Next, the interval condition determination unit 524 determines a target value related to the elapsed time between 2 consecutive sampling times. For example, the interval condition determination unit 524 acquires information indicating the traveling speed of the vehicle 120 from a sensor disposed in the vehicle 120. Further, the interval condition determination unit 524 determines the target value related to the elapsed time between 2 consecutive sampling times, based on the target value related to the distance between 2 consecutive sampling positions and the traveling speed of the vehicle 120.
The interval condition determination unit 524 determines the target value T_T for the elapsed time between 2 consecutive sampling times based on the following expression (2), using, for example, (i) the target value L_T [m] for the distance between 2 consecutive sampling positions and (ii) the traveling speed v [m/s] of the vehicle 120.

[Numerical formula 2]

T_T [s] = L_T [m] ÷ v [m/s]
The interval condition determination unit 524 may determine the target value T_T for the elapsed time between 2 consecutive sampling times such that the target value F_T [images/s] of the sampling frequency becomes an integer. The target value F_T of the sampling frequency may be the reciprocal of the target value T_T for the elapsed time between 2 consecutive sampling times.
For example, as shown in fig. 8, if the traveling speed 822 of the vehicle 120 varies, (i) the required time 824 for the vehicle 120 to travel the distance corresponding to the effective range of the vehicle exterior imaging unit 122 (50m in the example of fig. 7) and (ii) the required time 826 for the vehicle 120 to travel the target value L_T for the distance between 2 consecutive sampling positions (25m in the example of fig. 7) also vary.
In one embodiment, the interval condition determination unit 524 determines the required time 826 to be the target value T_T for the elapsed time between 2 consecutive sampling times. For example, in the specific example shown in fig. 7, when the traveling speed of the vehicle 120 is 60km/h, the interval condition determination unit 524 determines the target value T_T for the elapsed time between 2 consecutive sampling times to be 1.5 seconds. When the traveling speed of the vehicle 120 is 90km/h, the interval condition determination unit 524 determines the target value T_T for the elapsed time between 2 consecutive sampling times to be 1 second.
In another embodiment, the interval condition determination unit 524 first calculates the sampling frequency 828 [images/s] at the current traveling speed of the vehicle 120 as the reciprocal of the required time 826 corresponding to that traveling speed. Then, the interval condition determination unit 524 rounds the calculated value of the sampling frequency 828 [images/s] up to the next integer to obtain the target value F_T [images/s] of the sampling frequency 828. Next, the interval condition determination unit 524 determines the reciprocal of the target value F_T [images/s] of the sampling frequency 828 as the target value T_T for the elapsed time between 2 consecutive sampling times.
For example, in the example shown in fig. 7, when the traveling speed of the vehicle 120 is more than 0km/h and 90km/h or less, the target value F_T [images/s] of the sampling frequency is 1. At this time, the interval condition determination unit 524 determines the target value T_T for the elapsed time between 2 consecutive sampling times to be 1 second. When the traveling speed of the vehicle 120 is more than 90km/h and 180km/h or less, the target value F_T [images/s] of the sampling frequency is 2. At this time, the interval condition determination unit 524 determines the target value T_T for the elapsed time between 2 consecutive sampling times to be 0.5 seconds.
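These F_T and T_T values can be reproduced with the following sketch, which rounds the computed sampling frequency up to an integer and takes its reciprocal, assuming L_T = 25m; the function name is an assumption for illustration.

```python
# Illustrative sketch of the F_T / T_T determination (assumption: L_T = 25m).
import math

def sampling_targets(speed_kmh: float, l_t_m: float = 25.0):
    speed_ms = speed_kmh / 3.6
    f_t = max(1, math.ceil(speed_ms / l_t_m))  # target sampling frequency F_T [images/s]
    t_t = 1.0 / f_t                            # target elapsed time T_T between samples [s]
    return f_t, t_t

print(sampling_targets(60.0))   # (1, 1.0): the 1.5s required time is shortened to 1s
print(sampling_targets(90.0))   # (1, 1.0)
print(sampling_targets(120.0))  # (2, 0.5)
```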
Next, in S626, the storage time determination unit 432 determines whether or not the current time or the current position is a time or a position for storing an image. If it is determined that the current time or the current position is not the time or the position at which the image is sampled (no in S626), the storage time determination unit 432 determines that sampling of the image captured at the current time or the current position is not necessary in S628, and ends the process.
On the other hand, when determining that the current time or the current position is the time or the position at which the image is stored (yes in S626), the image extraction unit 440 identifies the candidate image to be stored and outputs the image data of the image to the image analysis unit 450. After that, the process proceeds to S660. The processing in S622 and S624 may be repeatedly executed to determine the next sampling time or sampling position. The period at which the information processing shown in fig. 6 is executed next may be decided based on the next sampling time or the next sampling position.
According to the present embodiment, the processes of S642 to S648 are executed in parallel with the processes of S622 to S628. In S642, the inter-vehicle distance determination unit 434 measures the inter-vehicle distance from the preceding vehicle 120. In S646, it is determined whether or not the inter-vehicle distance is equal to or less than a predetermined threshold value.
If the inter-vehicle distance is equal to or less than the predetermined threshold value (yes in S646), the inter-vehicle distance determination unit 434 determines that the image captured at the current time is not necessary to be stored in S648, and ends the process. On the other hand, when the inter-vehicle distance is not equal to or less than the predetermined threshold (no in S646), the image extracting unit 440 specifies an image to be stored, and outputs image data of the image to the image analyzing unit 450. After that, the process proceeds to S660.
In another embodiment, if the inter-vehicle distance is not equal to or less than the predetermined threshold (no in S646), the process may proceed to S626. In another embodiment, if the inter-vehicle distance is not equal to or less than the predetermined threshold (no in S646), the process may proceed to S682.
Next, in S660, the image analysis unit 450 determines whether or not the vehicle identification number arranged on the license plate 126 of 1 or more vehicles 120 is recognized in the image extracted by the image extraction unit 440. If a vehicle identification number in the image is identified (yes in S660), in S672, the image analysis unit 450 determines that the image is not to be stored in the road surface management server 220. In S674, the image analysis unit 450 determines to store the vehicle identification number, and outputs the vehicle identification number to the vehicle information transmission unit 472. Then, the vehicle information transmitting unit 472 transmits the vehicle information including the vehicle identification number to the vehicle management server 240, and the process ends.
On the other hand, when the vehicle identification number in the image is not recognized (no in S660), in S682, the image analysis unit 450 determines that the image is to be stored in the road surface management server 220, and outputs the image data of the image to the image information transmission unit 474. Then, the image information transmitting unit 474 transmits the image information including the image data to the road surface management server 220, and ends the processing.
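Pulling the branches of fig. 6 together, a rough orchestration sketch of the per-frame decision might look like the following; it combines the sampling-time check and the inter-vehicle-distance check sequentially, roughly corresponding to the variant in which the process proceeds from S646 to S626, and every helper is a hypothetical stand-in for the corresponding unit or step, not an actual interface of the image management unit 124.

```python
# Hypothetical end-to-end sketch of the per-frame decision flow of fig. 6
# (all arguments and helpers are assumed stand-ins for the units described above).

def process_frame(frame: bytes,
                  is_sampling_time: bool,           # result of the S626 check
                  inter_vehicle_distance_m: float,  # measured in S642
                  distance_threshold_m: float,
                  recognize_vin) -> str:
    if not is_sampling_time:
        return "discard"                            # S628: not a sampling time/position
    if inter_vehicle_distance_m <= distance_threshold_m:
        return "discard"                            # S648: preceding vehicle too close
    vin = recognize_vin(frame)                      # S660: image analysis
    if vin is not None:
        return f"send_vehicle_info:{vin}"           # S672/S674: keep only the number
    return "send_image_info"                        # S682: image goes to the road surface server

# Example with a dummy recognizer that reads nothing from the frame.
print(process_frame(b"...", True, 60.0, 30.0, lambda f: None))  # send_image_info
```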
Fig. 9 schematically shows an example of an image captured by the vehicle exterior imaging unit 122 of the vehicle 120. In S660 of the information processing method described in connection with fig. 6, the details of processing in which the image analysis unit 450 analyzes an image and determines whether or not the image is to be saved will be described with reference to a specific example shown in fig. 9.
In the present embodiment, the vehicle exterior imaging unit 122 of the vehicle 120 captures a plurality of frame images per second and samples 1 frame image per second. In fig. 9, the triangular marks indicate the time points at which a frame image is sampled. In addition, the image 902 represents an image sampled at time T92, and the image 904 represents an image sampled at time T94. In the images 902 and 904, the broken line 910 indicates the area within the effective range of the vehicle exterior imaging unit 122. The line 930 represents a dividing line of the road.
In the image 902, the preceding vehicle 920 traveling on the same driving lane as the vehicle 120 is outside the effective range of the vehicle exterior photographing unit 122. Therefore, the image analysis unit 450 cannot recognize the vehicle identification number disposed on the license plate 926 of the preceding vehicle 920. The image analysis unit 450 therefore determines that the image 902 is to be stored in the road surface management server 220, and outputs the image data of the image 902 to the image information transmission unit 474.
On the other hand, in the image 904, the preceding vehicle 920 traveling on the same traveling lane as the vehicle 120 is within the effective range of the vehicle exterior imaging unit 122. Therefore, unless the weather conditions or the like are poor, the image analysis unit 450 can recognize the vehicle identification number disposed on the license plate 926 of the preceding vehicle 920.
As shown in fig. 9, according to the present embodiment, the image analysis unit 450 can recognize the vehicle identification number disposed on the license plate 926 of the preceding vehicle 920. Then, the image analysis unit 450 determines that the image 904 is not to be stored in the road surface management server 220. On the other hand, the image analysis unit 450 determines that the vehicle identification number of the preceding vehicle 920 is a storage target of the vehicle management server 240. Then, the image analysis unit 450 discards the image data of the image 904, and outputs information indicating the vehicle identification number of the preceding vehicle 920 captured in the image 904 to the vehicle information transmission unit 472.
According to the present embodiment, an image in which the road surface occupies a large area, like the image 902, is transmitted to the management server 110, while an image in which the road surface occupies only a small area, like the image 904, is not transmitted to the management server 110. This enables the resources of the communication network 10 and the management server 110 to be used effectively. Moreover, even for an image in which the road surface occupies only a small area, like the image 904, if the vehicle identification number of the preceding vehicle 920 is recognized in the image, information indicating the vehicle identification number is transmitted to the management server 110. Thereby, the images captured by the vehicle 120 can be utilized effectively.
The present invention has been described above with reference to the embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and modifications can be made in the above embodiments. In addition, in a range where there is no technical contradiction, the matters described with respect to a specific embodiment can be applied to other embodiments. It is apparent from the description of the claims that such modifications and improvements can be made within the technical scope of the present invention.
Note that the order of execution of the processes such as the operations, procedures, steps, and stages in the devices, systems, programs, and methods shown in the claims, the description, and the drawings may be realized in any order unless the output of a preceding process is used in a subsequent process and unless the order is explicitly indicated by terms such as "before" or "prior to". Even if the operation flows in the claims, the description, and the drawings are described using "first", "next", and the like for convenience, this does not mean that the operations must be performed in this order.

Claims (14)

1. An image management apparatus includes:
a storage determination unit configured to determine an image to be stored from among a plurality of images captured by an imaging unit mounted on a mobile object and imaging a state outside the mobile object; and
a condition determining unit that determines a condition for determining the image to be saved,
the condition determining section includes:
a speed information acquisition unit that acquires speed information indicating a speed of the mobile body; and
an interval condition determination unit that determines, as a condition for determining the image to be saved, at least one of (i) a condition relating to a time interval from when one image is determined to be the saving target until the next image to be saved is determined and (ii) a condition relating to a distance that the mobile object moves from when one image is determined to be the saving target until the next image to be saved is determined, based on a speed of the mobile object,
The storage determining unit determines the image to be stored according to the condition determined by the condition determining unit,
the mobile body is a vehicle,
the condition determining unit may further include an image condition determining unit that determines, as a condition for determining the image to be stored, a condition that the image does not become the storage target when a vehicle identification number assigned to another vehicle existing in the moving direction of the moving object included in the image captured by the imaging unit is recognized.
2. An image management apparatus includes:
a storage determination unit configured to determine an image to be stored from among a plurality of images captured by an imaging unit mounted on a mobile object and imaging a state outside the mobile object; and
a condition determining unit that determines a condition for determining the image to be saved,
the condition determining section includes:
a speed information acquisition unit that acquires speed information indicating a speed of the mobile body; and
an interval condition determination unit that determines, as a condition for determining the image to be saved, at least one of (i) a condition relating to a time interval from when one image is determined to be the saving target until the next image to be saved is determined and (ii) a condition relating to a distance that the mobile object moves from when one image is determined to be the saving target until the next image to be saved is determined, based on a speed of the mobile object,
The storage determination unit determines the image to be stored according to the condition determined by the condition determination unit,
the image management device further includes a distance information acquisition unit that acquires distance information indicating a distance between the mobile body and an object existing in a moving direction of the mobile body,
the condition determining unit may further include a distance condition determining unit that determines, as a condition for determining the image to be saved, a condition that the image is not captured when the distance indicated by the distance information is smaller than a predetermined value.
3. The image management apparatus according to claim 2,
the moving body is a vehicle,
the object existing in the moving direction of the moving body is another vehicle existing in front of the moving body,
unique vehicle identification numbers are assigned to a plurality of vehicles,
the condition determining unit may further include an image condition determining unit that determines, as a condition for determining an image to be stored, a condition that the image does not become the storage target when the vehicle identification number assigned to the other vehicle included in the image captured by the image capturing unit is recognized.
4. An image management apparatus includes:
a storage determination unit configured to determine an image to be stored from among a plurality of images captured by an imaging unit mounted on a mobile object and imaging a state outside the mobile object; and
a distance information acquisition unit that acquires distance information indicating a distance between the moving body and an object existing in a moving direction of the moving body,
the storage determination unit determines that the image captured by the imaging unit is not to be the storage target when the distance indicated by the distance information is smaller than a predetermined value.
5. The image management apparatus according to claim 4,
the moving body is a vehicle,
the object existing in the moving direction of the moving body is another vehicle existing in front of the moving body,
unique vehicle identification numbers are assigned to a plurality of vehicles,
the distance information is information indicating that a vehicle identification number assigned to the other vehicle included in the image captured by the image capturing unit is recognized,
the storage determination unit determines that the distance indicated by the distance information is smaller than the predetermined value when the vehicle identification number assigned to the other vehicle is identified.
6. The image management apparatus according to claim 3,
the vehicle identification device further includes an identification number storage unit configured to store the identified vehicle identification number when the vehicle identification number assigned to the other vehicle included in the image captured by the imaging unit is identified.
7. The image management apparatus according to claim 6,
the vehicle information collection device further includes an identification number transmission unit that transmits information including the vehicle identification number determined to be stored to a vehicle information collection device that collects the vehicle identification number.
8. The image management apparatus according to any one of claim 1 to claim 7,
the image management apparatus further includes an image transmission unit that transmits information including image data of the image to be saved to an image information collection device that collects images.
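(Illustration only, not part of the claims.) A sketch of the image transmission unit of claim 8, assuming each saved image is uploaded together with minimal capture metadata; the endpoint URL and field names are hypothetical.

```python
# Hypothetical sketch: bundle the saved image with position/time metadata and
# upload it to an assumed image information collection endpoint.
import base64
import json
import urllib.request
from datetime import datetime, timezone

IMAGE_COLLECTOR_URL = "https://example.invalid/road-images"   # placeholder endpoint


def transmit_image(image_bytes: bytes, latitude: float, longitude: float) -> None:
    """Package image data with capture position and time and POST it to the collector."""
    payload = json.dumps({
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "latitude": latitude,
        "longitude": longitude,
        "image": base64.b64encode(image_bytes).decode("ascii"),
    }).encode("utf-8")
    req = urllib.request.Request(IMAGE_COLLECTOR_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)   # retries/error handling omitted in this sketch
```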
9. A vehicle, wherein,
an image management apparatus according to any one of claims 1 to 8 is provided.
10. A road surface information management system is provided with:
the image management apparatus of any one of claim 1 to claim 8; and
An image information collection device that collects image data of the image to be saved from the plurality of image management devices,
the image to be stored includes an image of a road,
the image information collection device analyzes the image to be stored to generate road surface information relating to a state of the road surface of the road.
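(Illustration only, not part of the claims.) A server-side sketch of how the image information collection device of claim 10 might turn collected road images into road surface information; the damage-score analysis is a placeholder for whatever detector the system actually uses, and all names are assumptions.

```python
# Hypothetical server-side sketch: analyse each collected road image and emit
# per-location road surface information.
from dataclasses import dataclass


@dataclass
class CollectedImage:
    latitude: float
    longitude: float
    image_bytes: bytes


@dataclass
class RoadSurfaceInfo:
    latitude: float
    longitude: float
    damage_score: float    # 0.0 (good) .. 1.0 (severely damaged)


def estimate_damage(image_bytes: bytes) -> float:
    """Placeholder analysis; a real system might run a crack/pothole detector here."""
    return 0.0


def generate_road_surface_info(images: list[CollectedImage]) -> list[RoadSurfaceInfo]:
    """Analyse each stored image and produce road surface information per capture point."""
    return [RoadSurfaceInfo(img.latitude, img.longitude, estimate_damage(img.image_bytes))
            for img in images]
```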
11. A computer-readable storage medium storing a program, characterized in that,
when the program is executed by a processor, it performs:
a storage determination step of determining an image to be stored from among a plurality of images captured by an imaging unit mounted on a mobile object and imaging a state outside the mobile object; and
a condition determining step of determining a condition for determining an image to be saved,
the condition determining step includes:
a speed information acquisition step of acquiring speed information indicating a speed of the mobile body,
an interval condition determining step of determining, as a condition for determining an image to be saved, at least one of (i) a condition relating to a time interval from when one image is determined to be saved to when a next image to be saved is determined and (ii) a condition relating to a distance that the mobile object moves during a period from when one image is determined to be saved to when a next image to be saved is determined, based on a speed of the mobile object,
The storage determining step includes a step of determining the image to be stored according to the condition determined in the condition determining step,
the moving body is a vehicle,
the condition determining step further includes a step of determining, as a condition for determining the image to be stored, a condition that the image does not become the storage target when a vehicle identification number assigned to another vehicle present in the moving direction of the moving object is recognized in the captured image.
12. A computer-readable storage medium storing a program, characterized in that,
when the program is executed by a processor, it performs:
a storage determination step of determining an image to be stored from among a plurality of images captured by an imaging unit mounted on a mobile object and imaging a state outside the mobile object; and
a distance information acquisition step of acquiring distance information indicating a distance between the moving body and an object existing in a moving direction of the moving body,
the storage determination step includes a step in which the computer determines that the image captured by the imaging unit does not become the storage target when the distance indicated by the distance information is smaller than a predetermined value.
13. An image management method includes:
a storage determination step in which a computer determines an image to be stored from among a plurality of images captured by an imaging unit mounted on a mobile object and imaging a state outside the mobile object; and
a condition determining step of determining a condition for determining the image to be saved by the computer,
the condition determining step includes:
a speed information acquisition step in which the computer acquires speed information indicating a speed of the mobile body; and
an interval condition determining step in which the computer determines, as a condition for determining the image to be saved, at least one of (i) a condition relating to a time interval from when one image is determined to be a storage target until the next image to be saved is determined and (ii) a condition relating to a distance that the mobile object moves from when one image is determined to be a storage target until the next image to be saved is determined, based on a speed of the mobile object,
the storage determining step includes a step in which the computer determines the image to be stored according to the condition determined in the condition determining step,
the moving body is a vehicle,
the condition determining step further includes a step of determining, as a condition for determining the image to be stored, a condition that the image does not become the storage target when a vehicle identification number assigned to another vehicle present in the moving direction of the moving object is recognized in the captured image.
14. An image management method, comprising:
a storage determination step in which a computer determines an image to be stored from among a plurality of images captured by an imaging unit mounted on a mobile object and imaging a state outside the mobile object; and
a distance information acquisition step of acquiring distance information indicating a distance between the moving body and an object existing in a moving direction of the moving body,
the storage determination step includes a step in which the computer determines that the image captured by the imaging unit does not become the storage target when the distance indicated by the distance information is smaller than a predetermined value.
CN201911012891.1A 2018-11-12 2019-10-23 Image management device, road surface information management system, vehicle, computer-readable storage medium, and image management method Active CN111179582B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-212563 2018-11-12
JP2018212563A JP7128723B2 (en) 2018-11-12 2018-11-12 Image management device, road surface information management system, vehicle, program, and image management method

Publications (2)

Publication Number Publication Date
CN111179582A CN111179582A (en) 2020-05-19
CN111179582B true CN111179582B (en) 2022-06-28

Family

ID=70646140

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911012891.1A Active CN111179582B (en) 2018-11-12 2019-10-23 Image management device, road surface information management system, vehicle, computer-readable storage medium, and image management method

Country Status (2)

Country Link
JP (1) JP7128723B2 (en)
CN (1) CN111179582B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20240096110A1 (en) * 2021-01-29 2024-03-21 Nec Corporation Data collection apparatus, onboard apparatus, data collection method, data transmission method, and program recording medium
JP2022157556A (en) * 2021-03-31 2022-10-14 トヨタ自動車株式会社 Information processing device, program, and information processing method
US20240127604A1 (en) * 2021-04-07 2024-04-18 Mitsubishi Electric Corporation Road surface information collecting device, road surface deterioration detecting system, and road surface information collecting method
US20240135717A1 (en) 2021-04-12 2024-04-25 Mitsubishi Electric Corporation Data extraction device, data extraction method, and data transmission device
CN115550251B (en) * 2022-12-01 2023-03-10 杭州蚂蚁酷爱科技有限公司 Block chain network, node set maintenance method and device

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1131295A (en) * 1997-07-14 1999-02-02 Toshiba Corp Road information management system and road information terminal equipment
JP2003085690A (en) * 2001-09-13 2003-03-20 Alpine Electronics Inc Live view device and live view system
JP2003309846A (en) * 2002-04-16 2003-10-31 Nippon Telegr & Teleph Corp <Ntt> Time series image data storage apparatus
JP2005348329A (en) * 2004-06-07 2005-12-15 Fuji Photo Film Co Ltd Automobile monitoring system
JP4387315B2 (en) * 2005-01-26 2009-12-16 セイコープレシジョン株式会社 In-vehicle camera system
JP2007057437A (en) * 2005-08-25 2007-03-08 Auto Network Gijutsu Kenkyusho:Kk Vehicle moving distance detector, and on-vehicle camera system with range finding function
JP2008236007A (en) * 2007-03-16 2008-10-02 Honda Motor Co Ltd Mobile imaging apparatus
JP4458131B2 (en) * 2007-08-23 2010-04-28 ソニー株式会社 Image imaging apparatus and imaging method
JP4470992B2 (en) * 2007-12-05 2010-06-02 セイコーエプソン株式会社 Video management system
WO2011039989A1 (en) * 2009-09-30 2011-04-07 パナソニック株式会社 Vehicle-surroundings monitoring device
JP5551236B2 (en) * 2010-03-03 2014-07-16 パナソニック株式会社 Road condition management system and road condition management method
WO2014118877A1 (en) * 2013-01-29 2014-08-07 Kajiyama Toshio Local image / map information collecting and providing system
JP2015195569A (en) * 2014-03-25 2015-11-05 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Imaging device for mobile
JP2016122328A (en) * 2014-12-25 2016-07-07 康郎 桑原 Drive recorder and on-vehicle system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102474570A (en) * 2010-06-08 2012-05-23 松下电器产业株式会社 Information display device, integrated circuit for display control, and display control method
CN106604847A (en) * 2014-09-02 2017-04-26 株式会社电装 Image processing device for vehicle
CN108476273A (en) * 2015-11-18 2018-08-31 麦克赛尔株式会社 Information processing unit and its control method for image data
CN105959536A (en) * 2016-04-29 2016-09-21 深圳市中智仿真科技有限公司 Real scenery obtaining method and obtaining device
WO2018124987A1 (en) * 2016-12-28 2018-07-05 Tty Motorlu Araclar Turizm Tasimacilik Insaat Tekstil Elektronik Bilisim Sanayi Ve Ticaret Limited Sirketi Modular safe driving assistant

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Simulation of an image stitching algorithm for vehicle-mounted panoramic (surround-view) systems; Zhang Baolong; Journal of Electronics & Information Technology; 2015-05-31; full text *

Also Published As

Publication number Publication date
JP2020080462A (en) 2020-05-28
CN111179582A (en) 2020-05-19
JP7128723B2 (en) 2022-08-31

Similar Documents

Publication Publication Date Title
CN111179582B (en) Image management device, road surface information management system, vehicle, computer-readable storage medium, and image management method
CN109804367B (en) Distributed video storage and search using edge computation
CN109993969B (en) Road condition judgment information acquisition method, device and equipment
US9336450B2 (en) Methods and systems for selecting target vehicles for occupancy detection
Koukoumidis et al. Signalguru: leveraging mobile phones for collaborative traffic signal schedule advisory
CA2778499C (en) Method and apparatus for traffic management
Orhan et al. Road hazard detection and sharing with multimodal sensor analysis on smartphones
KR102352666B1 (en) System and Method for Predicting Traffic Accident Risk
JP2019067201A (en) Vehicle search system, vehicle search method, and vehicle and program employed in the same
US10930145B2 (en) Traffic system for predicting and providing traffic signal switching timing
KR20170039465A (en) System and Method for Collecting Traffic Information Using Real time Object Detection
US11837084B2 (en) Traffic flow estimation apparatus, traffic flow estimation method, traffic flow estimation program, and storage medium storing traffic flow estimation program
KR101494514B1 (en) Vehicle search system using vehicle blackbox
CN113380039B (en) Data processing method and device and electronic equipment
KR101788219B1 (en) System and method for gathering weather information using vehicle
CN109308806A (en) A kind of the traveling detection method and server of the vehicles
US20220101509A1 (en) Deterioration diagnostic device, deterioration diagnostic system, deterioration diagnostic method, and recording medium
CN110545322A (en) Internet of vehicles system and processing method and device of tire pressure information thereof
CN111627224A (en) Vehicle speed abnormality detection method, device, equipment and storage medium
Ji et al. Using mobile signaling data to classify vehicles on highways in real time
JP5789482B2 (en) Driving support system, first information processing apparatus, and program
KR20170045061A (en) Apparatus for guiding route using vehicle data and mehtod thereof
JP2018018128A (en) Information providing method and information providing device
CN108010319B (en) Road state identification method and device
JP5640926B2 (en) Destination prediction device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant