WO2017208371A1 - Image search device, method, and program - Google Patents

Image search device, method, and program

Info

Publication number
WO2017208371A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
range
road
search
information
Prior art date
Application number
PCT/JP2016/066073
Other languages
English (en)
Japanese (ja)
Inventor
仁志 影山
孝司 島田
耕世 高野
Original Assignee
富士通株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通株式会社
Priority to JP2018520263A (JP6693560B2)
Priority to PCT/JP2016/066073 (WO2017208371A1)
Publication of WO2017208371A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor

Definitions

  • the present invention relates to an image search program, an image search method, and an image search device.
  • On-board devices such as digital tachographs, which monitor operating status such as position and speed, are attached to business vehicles such as trucks and buses, and operation management is carried out based on the information collected from these on-board devices.
  • Some in-vehicle devices have a function of capturing an image on the road at regular intervals and transmitting the captured image.
  • The image transmitted from the vehicle-mounted device is stored in the storage unit of the system together with its imaging time and imaging position. The system then accepts, for example, designation of a time zone and a road range for which images are to be extracted, searches the storage unit for images whose imaging time and imaging position fall within the designated time zone and road range, and displays them.
  • However, the imaging positions of the images collected by the system from the vehicle-mounted devices of the business vehicles are not always evenly distributed. For this reason, even when the retrieved images are displayed, it may be difficult to grasp the road situation. For example, on roads where business vehicles pass infrequently, few images are available to be retrieved, making it difficult to grasp the road conditions.
  • An object of the present invention is to provide an image search program, an image search method, and an image search apparatus that can provide an image that makes it easy to grasp road conditions.
  • the image search program causes the computer to execute a process of receiving specification of a range condition including a time zone and a position area.
  • The image search program also causes the computer to execute a process of searching a storage unit that stores images associated with position information and time information and, when the number of stored images satisfying the received range condition is equal to or less than a predetermined number, re-searching after expanding either the time zone or the position range of the range condition.
  • FIG. 1 is an explanatory diagram illustrating an example of a logistics support system.
  • FIG. 2 is a diagram illustrating an example of a functional configuration of the road condition providing platform.
  • FIG. 3 is a diagram schematically illustrating a functional configuration of the image search apparatus.
  • FIG. 4 is a diagram illustrating an example of a data configuration of imaging information.
  • FIG. 5 is a diagram illustrating an example of a data configuration of road information.
  • FIG. 6 is a diagram illustrating an example of a data configuration of travel section information.
  • FIG. 7 is a diagram illustrating an example of matching and aggregation.
  • FIG. 8 is a diagram for explaining expansion and reduction of the time zone and the position range of the range condition.
  • FIG. 9 is a diagram illustrating an example of a flow for searching for road conditions.
  • FIG. 10 is a diagram illustrating another example of a flow for searching for road conditions.
  • FIG. 11 is a flowchart illustrating an example of an image search procedure.
  • FIG. 12 is a diagram illustrating an example of the configuration of a computer that executes an image search program.
  • FIG. 1 is an explanatory diagram illustrating an example of a logistics support system.
  • the logistics support system 10 is a system that supports the operation of vehicles in the logistics industry.
  • the logistics support system 10 is, for example, a cloud system.
  • In the logistics support system 10, an operation management platform 11, a probe analysis platform 12, and a road condition providing platform 13 are constructed.
  • The logistics support system 10 may be realized by a single computer such as one server device, or may be implemented as a computer system composed of a plurality of computers.
  • The logistics support system 10 may also be divided across a plurality of computers for each service or function to be provided.
  • a business vehicle 16 such as a truck or a bus is provided with a vehicle-mounted device 15 such as a digital tachograph.
  • the vehicle-mounted device 15 is connected to the logistics support system 10 via the network N so as to be communicable.
  • An operator terminal device 17 and a road manager terminal device 18 are also connected to the logistics support system 10 via the network N so as to be communicable.
  • As the network N, any type of communication network can be adopted, whether wired or wireless, such as a mobile communication network (for example, a mobile phone network), the Internet, a LAN (Local Area Network), or a VPN (Virtual Private Network).
  • Although FIG. 1 illustrates a particular number of business vehicles 16 equipped with the vehicle-mounted device 15, the number is not limited to this; there may be any number of business vehicles 16 equipped with the vehicle-mounted device 15.
  • Likewise, although one operator terminal device 17 and one road manager terminal device 18 are illustrated, the number is not limited to this; there may be any number of operator terminal devices 17 and road manager terminal devices 18.
  • The in-vehicle device 15 detects various information, such as the current position, the time, and the speed of the business vehicle 16, at predetermined intervals (for example, every 1 second) during operation.
  • The in-vehicle device 15 transmits the detected information to the logistics support system 10 as probe information.
  • the vehicle-mounted device 15 has a function of capturing an image on the road at regular time intervals and transmitting the captured image.
  • the business vehicle 16 is provided with a camera at a position where the front of the vehicle can be imaged, and the camera is connected to the vehicle-mounted device 15.
  • The vehicle-mounted device 15 captures images of the road ahead of the vehicle with the camera at regular intervals. This interval is, for example, 5 minutes, but is not limited thereto.
  • The vehicle-mounted device 15 captures an image of the area ahead of the vehicle at regular time intervals and transmits the image data of the captured image to the logistics support system 10 together with position information indicating the imaging position and time information indicating the imaging time.
  • the logistics support system 10 provides various services based on various information such as probe information and image data transmitted from the vehicle-mounted device 15.
  • the logistics support system 10 provides an operation management service for managing the operation of the business vehicle 16 by using the operation management platform 11.
  • the operation management platform 11 manages the operation time, break time, and speed of the business vehicle 16 based on the probe information transmitted from the vehicle-mounted device 15.
  • In response to a request from an operator who operates business vehicles 16, the operation management platform 11 provides the images transmitted from the vehicle-mounted devices 15 of the business vehicles 16 whose operation that operator manages. That is, the operation management platform 11 provides each operator with the images captured by the business vehicles 16 it operates.
  • Thereby, each operator can grasp the operating state of the business vehicles 16 under its management.
  • the logistics support system 10 analyzes the probe information using the probe analysis platform 12.
  • the probe analysis platform 12 analyzes the road on which the business vehicle 16 travels and the congestion of the road based on the probe information transmitted from the vehicle-mounted device 15.
  • For example, the probe analysis platform 12 identifies the road being traveled, the section of that road, and the direction of travel based on the position information included in the probe information.
  • the probe analysis platform 12 analyzes the road congestion state from the time when the business vehicle 16 travels on the road.
  • the logistics support system 10 provides a road condition providing service for providing road conditions using the road condition providing platform 13.
  • For example, the logistics support system 10 provides the images transmitted from the vehicle-mounted devices 15 of each operator's business vehicles 16, as road conditions, to third parties such as the operators themselves and road managers who manage the roads.
  • The operator terminal device 17 is a terminal device, such as a personal computer, installed at an operator that operates business vehicles 16.
  • An operator who operates business vehicles 16 accesses the logistics support system 10 using the operator terminal device 17 and manages the operation of the business vehicles 16 using the operation management service provided by the logistics support system 10.
  • The road manager terminal device 18 is a terminal device, such as a personal computer, installed at a road management company, such as an expressway company, that manages roads.
  • A road manager who manages roads accesses the logistics support system 10 using the road manager terminal device 18 and grasps the road conditions using the road condition providing service provided by the logistics support system 10.
  • For example, a road manager who manages an expressway can grasp various road-related conditions, such as congestion and deterioration, by referring to the images stored by the logistics support system 10.
  • FIG. 2 is a diagram illustrating an example of a functional configuration of the road condition providing platform.
  • The road condition providing platform 13 receives, from the operation management platform 11, imaging information that includes the image data of the images captured by the vehicle-mounted device 15 and the imaging positions at which those images were captured. It also receives, from the probe analysis platform 12, road information about the roads and travel section information about the road sections traveled by the business vehicles 16 equipped with the vehicle-mounted devices 15.
  • the road condition providing platform 13 identifies the type of the road on which the image is captured based on the position information when the image is captured, using the identification function (FIG. 2A).
  • The road condition providing platform 13 provides the received images, as road conditions, to third parties such as each operator and the road managers who manage the roads, using an output function (FIG. 2B).
  • the operation management platform 11, the probe analysis platform 12, and the road condition providing platform 13 may be implemented on the same computer or on different computers.
  • the logistics support system 10 will be described by taking as an example a case where the operation management platform 11, the probe analysis platform 12, and the road condition providing platform 13 are mounted on different computers.
  • FIG. 3 is a diagram schematically illustrating a functional configuration of the image search apparatus.
  • the image search device 20 is a computer such as a server computer.
  • The road condition providing platform 13 is implemented on the image search device 20, which provides the road condition providing service.
  • the case where the image search device 20 is a single computer will be described as an example.
  • the image search device 20 may be implemented as a computer system using a plurality of computers.
  • the image search device 20 includes a communication unit 21, a storage unit 22, and a control unit 23.
  • the communication unit 21 is a communication interface that performs wireless communication or wired communication with other devices.
  • The communication unit 21 receives, from the operation management platform 11, imaging information that includes the image data of the images captured by the vehicle-mounted device 15 and the imaging positions at which those images were captured. It also receives, from the probe analysis platform 12, road information about the roads and travel section information about the road sections traveled by the business vehicles 16 equipped with the vehicle-mounted devices 15.
  • the storage unit 22 is a storage device such as a hard disk, an SSD (Solid State Drive), or an optical disk.
  • the storage unit 22 may be a semiconductor memory capable of rewriting data such as RAM (Random Access Memory), flash memory, NVSRAM (Non Volatile Static Random Access Memory).
  • the storage unit 22 stores an OS (Operating System) executed by the control unit 23 and various programs.
  • the storage unit 22 stores various information.
  • the storage unit 22 stores image data 30, imaging information 31, road information 32, travel section information 33, matching information 34, and total information 35.
  • the storage unit 22 may store other various information.
  • the image data 30 is image data of an image captured by the vehicle-mounted device 15 received from the operation management platform 11.
  • the imaging information 31 is data including an imaging position when an image of the image data 30 is captured.
  • FIG. 4 is a diagram illustrating an example of a data configuration of imaging information. As illustrated in FIG. 4, the imaging information 31 includes items of an image ID, an imaging position, a vehicle ID, and an imaging time. The imaging information 31 may store various information other than the above.
  • the item of image ID is an area for storing identification information for identifying image data of an image captured by the vehicle-mounted device 15. For example, a unique image ID (identification) such as a number is assigned to the image data of the image captured by the vehicle-mounted device 15 as the identification information in the order of image capture. In the item of image ID, an image ID assigned to image data of an image captured by the vehicle-mounted device 15 is stored.
  • the item of imaging position is an area for storing position information indicating the imaging position when an image is captured. In the present embodiment, the position information is information indicating the position in latitude, longitude, etc. by a predetermined geodetic system such as the Japanese geodetic system or the world geodetic system.
  • position information such as latitude (x) and longitude (y) indicating the imaging position is stored.
  • the position information may be other information as long as the position can be specified.
  • the item of vehicle ID is an area for storing identification information of the business vehicle 16 equipped with the vehicle-mounted device 15 that has transmitted the image data.
  • a unique vehicle ID is set in the vehicle-mounted device 15 as identification information of the installed business vehicle 16.
  • the vehicle-mounted device 15 transmits image data and probe information to which the set vehicle ID is added to the logistics support system 10.
  • the vehicle ID of the business vehicle 16 equipped with the vehicle-mounted device 15 that has transmitted the image data of the image ID is stored.
  • the item of imaging time is an area for storing the imaging time when an image is captured by the vehicle-mounted device 15.
  • For example, the record for image ID "image 1" indicates that the image was captured at latitude x and longitude y, that the vehicle ID of the business vehicle 16 equipped with the vehicle-mounted device 15 that transmitted the image data is "Aaa", and that the imaging time is 10:20:20.
  • the road information 32 is data storing various information related to the road.
  • the road information 32 stores information such as the type of road for each section dividing the road.
  • FIG. 5 is a diagram illustrating an example of a data configuration of road information. As shown in FIG. 5, the road information 32 includes items of section number, type, direction, facility, and section range. The road information 32 may store various information other than the above.
  • the section number item is an area for storing identification information for identifying a road section.
  • Each road is divided into one or more sections according to a predetermined standard, and a unique section number is assigned to each section as identification information.
  • The predetermined standard may be based on distance, for example every 100 m, or may treat intersections with other roads as section breaks.
  • The predetermined standard may also use conditions such as the boundaries of road structures or facilities, for example bridges, tunnels, service areas, parking areas, and junctions.
  • the predetermined standard may be a combination of a plurality of conditions.
  • The section number assigned to each road section is stored in the section number item.
  • the type item is an area for storing the type of road in the section having the section number. For the road section, a type indicating what kind of road the section is, such as an expressway, an automobile-only road, and a general road, is defined. The type item stores the type determined for the road in the section with the section number.
  • The direction item is an area for storing the traveling direction of the road section. When a traveling direction such as up or down is defined for the section, that direction is stored; when no traveling direction is defined, "-" is stored to indicate that it is undefined.
  • the facility item is an area for storing facilities in a road section.
  • In the facility item, if the road section is a facility such as a service area or a parking area, the facility name is stored; otherwise, "-" is stored to indicate that the section is not a facility.
  • the section range item is an area for storing information related to the section range. For example, when a section is defined as a rectangular area, position information indicating the positions of two opposing vertices in the section area is stored in the section range item.
  • the section range may be indicated using any information. For example, the section range may be indicated by the positions of all the vertices in the section area.
  • the travel section information 33 is data that stores various types of information related to the travel section of the road on which the business vehicle 16 equipped with the vehicle-mounted device 15 travels.
  • FIG. 6 is a diagram illustrating an example of a data configuration of travel section information. As shown in FIG. 6, the travel section information 33 includes items of time zone, section number, direction, average speed, and vehicle ID. The travel section information 33 may store various information other than the above.
  • the item of the time zone is an area for storing the time zone in which the business vehicle 16 equipped with the vehicle-mounted device 15 travels in the section.
  • the time zone is, for example, in units of 5 minutes and is indicated by the start time of the time zone.
  • the section number item is an area for storing a section number for identifying a road section.
  • The direction item is an area for storing the traveling direction of the road section on which the business vehicle 16 equipped with the vehicle-mounted device 15 travels. When a traveling direction is defined for the section, that direction is stored; when it is not defined, "-" is stored to indicate that it is undefined.
  • the item of average speed is an area for storing the average speed when the business vehicle 16 equipped with the vehicle-mounted device 15 travels in the section.
  • the item of vehicle ID is an area for storing the vehicle ID of the business vehicle 16 that has traveled in the section.
  • For example, the record for section number 0002 indicates that the business vehicle 16 with vehicle ID "Aaa" traveled through that section in the 10:20 time zone at an average speed of 34.5 km/h.
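  • As a concrete illustration of the record layouts of FIGS. 4 to 6, the following is a minimal sketch in Python; the class and field names are assumptions chosen for readability, and the patent does not prescribe any particular implementation.

        from dataclasses import dataclass
        from datetime import datetime, time

        @dataclass
        class ImagingRecord:            # one row of the imaging information 31 (FIG. 4)
            image_id: str               # e.g. "image 1"
            latitude: float             # imaging position x
            longitude: float            # imaging position y
            vehicle_id: str             # e.g. "Aaa"
            captured_at: datetime       # imaging time, e.g. 10:20:20

        @dataclass
        class RoadSection:              # one row of the road information 32 (FIG. 5)
            section_number: str         # e.g. "0002"
            road_type: str              # "expressway", "general road", ...
            direction: str | None       # "up", "down", or None when undefined ("-")
            facility: str | None        # service/parking area name, or None ("-")
            bounds: tuple[float, float, float, float]   # two opposing vertices of the rectangular section range

        @dataclass
        class TravelSectionRecord:      # one row of the travel section information 33 (FIG. 6)
            time_slot: time             # start of the 5-minute time zone, e.g. 10:20
            section_number: str
            direction: str | None
            average_speed_kmh: float    # e.g. 34.5
            vehicle_id: str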
  • the matching information 34 is data obtained based on the imaging information 31 and the travel section information 33 for a section corresponding to the image capturing position of the image data 30. Details of the matching information 34 will be described later.
  • the total information 35 is data obtained by using the image data 30 to determine the number of facility usages in a pseudo manner. Details of the total information 35 will be described later.
  • the control unit 23 is a device that controls the entire image search apparatus 20.
  • As the control unit 23, an electronic circuit such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit), or an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array), can be employed.
  • the control unit 23 has an internal memory for storing programs defining various processing procedures and control data, and executes various processes using these.
  • the control unit 23 functions as various processing units by operating various programs.
  • the control unit 23 includes a storage unit 40, a specifying unit 41, a totaling unit 42, a receiving unit 43, a search unit 44, and an output unit 45.
  • the storage unit 40 stores various received data in the storage unit 22.
  • the storage unit 40 stores the image data of the image captured by the vehicle-mounted device 15 received from the operation management platform 11 as the image data 30 in the storage unit 22.
  • the storage unit 40 stores the imaging information received from the operation management platform 11 including the imaging position when the image is captured as the imaging information 31 in the storage unit 22.
  • the storage unit 40 stores the road information received from the probe analysis platform 12 in the storage unit 22 as road information 32. Further, the storage unit 40 stores the travel section information received from the probe analysis platform 12 in the storage unit 22 as travel section information 33.
  • The specifying unit 41 performs various types of identification. For example, when the image data 30 of an image captured by the vehicle-mounted device 15 and the imaging information 31 including the position information at the time of capture are received, the specifying unit 41 identifies the type of the road on which the image was captured based on that position information. For example, the specifying unit 41 identifies, based on the road information 32, which road section corresponds to the imaging position of each image data 30 stored in the imaging information 31. Thereby, for example, the image data with the image ID "image 1" in FIG. 4 is identified as corresponding to the section with section number "0001", and the image data with the image ID "image 2" as corresponding to the section with section number "0002".
  • the image data of the image ID “image 3” is specified as corresponding to the section of the section number “0003”.
  • the image data with the image ID “image 4” is identified as corresponding to the section with the section number “0004”.
  • The specifying unit 41 also specifies the type of road for the section corresponding to the imaging position. For example, for the image IDs "image 1" to "image 4", the type of road for the section corresponding to the imaging position is specified as "expressway".
  • The specifying unit 41 may also use the travel section information 33 to specify which road section the imaging position corresponds to, based on the imaging time of the image data 30 and the vehicle ID of the business vehicle 16.
  • The specifying unit 41 generates the matching information 34 by matching the imaging information 31 with the travel section information 33 based on the identified sections.
  • FIG. 7 is a diagram illustrating an example of matching and aggregation.
  • For example, for each image ID in the imaging information 31, the specifying unit 41 identifies the record in the travel section information 33 whose time zone includes the imaging time and whose section corresponds to the imaging position.
  • The specifying unit 41 then generates the matching information 34 by associating each image ID with the identified travel section record.
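  • A minimal sketch of this matching step, building on the record types above (the in-memory layout of the matching information 34 and the helper position_in_section() are assumptions, not part of the patent):

        def position_in_section(lat: float, lon: float,
                                bounds: tuple[float, float, float, float]) -> bool:
            # bounds holds two opposing vertices (lat1, lon1, lat2, lon2) of the section range
            lat1, lon1, lat2, lon2 = bounds
            return (min(lat1, lat2) <= lat <= max(lat1, lat2)
                    and min(lon1, lon2) <= lon <= max(lon1, lon2))

        def build_matching_info(imaging: list[ImagingRecord],
                                sections: list[RoadSection],
                                travel: list[TravelSectionRecord],
                                slot_minutes: int = 5) -> list[dict]:
            """For each image, find the section containing its imaging position and the
            travel-section record whose time zone contains its imaging time."""
            matching_info = []
            for img in imaging:
                section = next((s for s in sections
                                if position_in_section(img.latitude, img.longitude, s.bounds)), None)
                if section is None:
                    continue
                # floor the imaging time to the start of its 5-minute time zone
                slot_start = img.captured_at.replace(
                    minute=img.captured_at.minute - img.captured_at.minute % slot_minutes,
                    second=0, microsecond=0)
                record = next((t for t in travel
                               if t.section_number == section.section_number
                               and t.time_slot == slot_start.time()), None)
                matching_info.append({
                    "image_id": img.image_id,
                    "section_number": section.section_number,
                    "time_slot": slot_start.time(),
                    "direction": record.direction if record else section.direction,
                    "average_speed_kmh": record.average_speed_kmh if record else None,
                })
            return matching_info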
  • The totaling unit 42 counts the number of images whose imaging positions are included in a predetermined area. For example, when an imaging position falls within a service area or a parking area, the totaling unit 42 counts the number of captured images for each time zone and facility. The totaling unit 42 then stores the result in the total information 35.
  • FIG. 7 shows an example of the total information 35.
  • the total information 35 is provided with items of time zone, facility, and number of uses.
  • the total information 35 stores the number of captured images as the number of uses for each time zone and facility.
  • Since the vehicle-mounted device 15 captures images at regular intervals, many images are captured in congested sections.
  • Therefore, the number of captured images is stored in the total information 35 as the number of uses; by using this count, the congestion status of facilities such as service areas and parking areas can be estimated in a pseudo manner.
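  • A minimal sketch of this aggregation, building on the sketches above (keying the total information 35 by (time zone, facility) pairs follows FIG. 7; the dictionary layout itself is an assumption):

        from collections import Counter

        def build_total_info(matching_info: list[dict],
                             sections: list[RoadSection]) -> Counter:
            """Count the captured images per (time zone, facility); the count serves as a
            pseudo 'number of uses' of the service area or parking area."""
            facility_of = {s.section_number: s.facility for s in sections}
            totals = Counter()
            for row in matching_info:
                facility = facility_of.get(row["section_number"])
                if facility:                      # only service areas / parking areas
                    totals[(row["time_slot"], facility)] += 1
            return totals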
  • The reception unit 43 receives various operations. For example, upon receiving access from the operator terminal device 17 or the road manager terminal device 18, the reception unit 43 transmits information on various operation screens to the access source, causes it to display those screens, and accepts various operations from it. For example, the reception unit 43 provides a website for the road condition providing service. When access to this website is received from the operator terminal device 17 or the road manager terminal device 18, the reception unit 43 transmits login screen information to the access source to display a login screen, and accepts a login operation in which a login ID and a password are entered.
  • When receiving the login operation, the reception unit 43 performs authentication by comparing the entered login ID and password with a previously registered login ID and password.
  • When authentication succeeds, the reception unit 43 transmits condition designation screen information to the access source to display the condition designation screen.
  • The reception unit 43 receives road condition search conditions from the condition designation screen.
  • the accepting unit 43 accepts designation of a range condition including a time zone, a position area, and a traveling direction as a road condition search condition.
  • For example, the reception unit 43 receives designation of a time zone, a road section, and an up or down traveling direction.
  • the search unit 44 searches the image data 30 stored in the storage unit 22 for image data that satisfies the search condition received by the receiving unit 43. For example, the search unit 44 uses the matching information 34 to specify the image ID of the image corresponding to the time zone and road section specified as the search condition. Then, the search unit 44 searches whether the image data 30 of the specified image ID is stored in the storage unit 22. Further, when the road section specified as the search condition is a service area or a parking area, the search unit 44 reads the number of usages corresponding to the facility in the specified section from the total information 35.
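  • A minimal sketch of this base search over the matching information and the total information (the function signature is an assumption; a variant of it is also reused as the search callable in the re-search sketches further below):

        def search_images(matching_info: list[dict], total_info: dict,
                          time_slots: set, section_numbers: set,
                          facility_of=None):
            """Return the image IDs whose matched time zone and section satisfy the range
            condition and, for designated facility sections, the pseudo numbers of uses."""
            image_ids = [row["image_id"] for row in matching_info
                         if row["time_slot"] in time_slots
                         and row["section_number"] in section_numbers]
            usage_counts = {}
            if facility_of:
                for sec in section_numbers:
                    facility = facility_of.get(sec)
                    if facility:                  # service area or parking area
                        for slot in time_slots:
                            usage_counts[(slot, facility)] = total_info.get((slot, facility), 0)
            return image_ids, usage_counts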
  • However, the imaging positions of the images collected by the logistics support system 10 from the vehicle-mounted devices 15 of the business vehicles 16 are not necessarily evenly distributed. For this reason, even when the images of the retrieved image data are displayed, it may be difficult to grasp the road condition. For example, on roads where business vehicles 16 pass infrequently, few images are retrieved, and it may be difficult to grasp the road conditions. Conversely, on roads where business vehicles 16 pass frequently, the number of retrieved images increases, and when all of them are displayed, referring to the images becomes cumbersome and it is again difficult to grasp the road condition.
  • Therefore, when the number of retrieved image data 30 is equal to or less than a predetermined number, the search unit 44 expands either the time zone or the position range of the range condition and searches again.
  • Conversely, when the number of retrieved image data 30 is equal to or greater than an upper limit number, the search unit 44 reduces either the time zone or the position range of the range condition and searches again.
  • the predetermined number is, for example, 10, but is not limited thereto.
  • the upper limit is set to 15, for example, this is not limited.
  • The predetermined number and the upper limit number may be changeable by a user, an administrator of the logistics support system 10, or the like.
  • FIG. 8 is a diagram for explaining expansion and reduction of the time zone and the position range of the range condition.
  • FIG. 8 shows a period T1 of a designated time zone and a section range T2 of a road section.
  • the search unit 44 first searches the image data 30 that has an imaging time in the period T1 of the designated time zone and has an imaging position in the section range T2 of the road section.
  • If the number of retrieved image data 30 is equal to or less than the predetermined number, the search unit 44 searches again after extending either the period T1 or the section range T2.
  • the search unit 44 gives priority to time extension, and alternately repeats time extension and position range extension until the number of searched image data 30 exceeds a predetermined number.
  • That is, the search unit 44 first extends the period T1 and searches again; if the number of retrieved image data 30 is still equal to or less than the predetermined number, it then expands the section range T2 and searches again.
  • For example, the search unit 44 extends the period T1 toward the past.
  • the search unit 44 extends the section range T2 to both sides with the designated section as the center.
  • the unit for extending the period T1 may be a predetermined time (for example, 2 minutes) unit or a time zone unit.
  • The extension width of the period T1 may be increased as the number of extensions increases. For example, the search unit 44 may extend the period T1 by 2 minutes the first time and by 3 minutes the second time.
  • the unit for extending the section range T2 may be a predetermined distance (for example, 100 m) unit or a section unit.
  • Likewise, the expansion width of the section range T2 may be increased as the number of expansions increases.
  • For example, the search unit 44 may expand the section range T2 by 100 m the first time and by 200 m the second time.
  • the search unit 44 may extend both the period T1 and the section range T2 when the searched image data 30 is equal to or less than a predetermined number.
  • the search unit 44 may repeat the expansion for only one of the period T1 or the section range T2.
  • Further, the search unit 44 may extend the period T1 both toward the past and toward the future, centered on the period T1. The search unit 44 may also extend the section range T2 only to the front or only to the rear of the specified traveling direction. For example, when extending the section range T2, the search unit 44 may use a section range T2 extended ahead in the specified traveling direction of the road as the new range condition. Since the situation of vehicles passing through the section at the designated position is thereby known, it becomes easy to provide a prediction of changes in the road situation, such as congestion, in the section at the designated position.
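  • A minimal sketch of the time-priority expansion described above, assuming a search(period, section_range) callable that returns the matching image IDs, a period represented as a (start, end) datetime pair, and a section range represented as (start_m, end_m) positions along the road; the 2-minute and 100 m steps follow the examples in the text, and a real implementation would also cap the number of expansions.

        from datetime import timedelta

        def expand_and_research(search, period, section_range, predetermined_number=10):
            """Alternately extend the period T1 (priority) and the section range T2 until
            more than the predetermined number of images is retrieved."""
            results = search(period, section_range)
            extend_time = True                        # time extension has priority
            while len(results) <= predetermined_number:
                if extend_time:
                    # extend the period T1 toward the past, e.g. by 2 minutes
                    period = (period[0] - timedelta(minutes=2), period[1])
                else:
                    # widen the section range T2, e.g. by 100 m on both sides of the designated section
                    section_range = (section_range[0] - 100.0, section_range[1] + 100.0)
                extend_time = not extend_time         # alternate time and position expansion
                results = search(period, section_range)
            return results, period, section_range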
  • If the number of retrieved image data 30 is equal to or greater than the upper limit number, the search unit 44 searches again after reducing either the period T1 or the section range T2.
  • the search unit 44 gives priority to the reduction of the section range T2, and alternately repeats the reduction of the position range and the reduction of time until the searched image data 30 becomes smaller than the upper limit number.
  • That is, the search unit 44 first reduces the section range T2 and searches again; if the number of retrieved image data 30 is still equal to or greater than the upper limit number, it then reduces the period T1 and searches again.
  • For example, the search unit 44 reduces the period T1 only on the past side.
  • For example, the search unit 44 reduces the section range T2 from both sides, centered on the designated section.
  • the unit for reducing the period T1 is a predetermined time (for example, 2 minutes) unit.
  • the reduction width of the period T1 may be reduced as the number of reductions increases.
  • the search unit 44 may reduce by 2 minutes in the first reduction of the period T1, and reduce by 1 minute in the second reduction of the period T1.
  • the unit for reducing the section range T2 is a predetermined distance (for example, 50 m) unit. In the reduction of the section range T2, the reduction width of the section range T2 may be reduced as the number of times of reduction increases.
  • For example, the search unit 44 may reduce the section range T2 by 50 m the first time and by 25 m the second time.
  • the search unit 44 may reduce both the period T1 and the section range T2 when the searched image data 30 is equal to or greater than the upper limit number. Further, the search unit 44 may repeat the reduction for only one of the period T1 or the section range T2.
  • Further, the search unit 44 may reduce the period T1 symmetrically with reference to its center, or may reduce the section range T2 only from the front or only from the rear of the specified traveling direction.
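  • A corresponding sketch of the reduction, which gives priority to shrinking the section range (same representation assumptions as the expansion sketch above; the 50 m and 2-minute steps follow the examples in the text):

        from datetime import timedelta

        def reduce_and_research(search, period, section_range, upper_limit=15):
            """Alternately reduce the section range T2 (priority) and the period T1 until
            fewer than the upper limit number of images is retrieved."""
            results = search(period, section_range)
            reduce_position = True                    # position reduction has priority
            while len(results) >= upper_limit:
                if reduce_position:
                    # shrink the section range T2, e.g. by 50 m on both sides of the designated section
                    section_range = (section_range[0] + 50.0, section_range[1] - 50.0)
                else:
                    # shrink the period T1 on the past side only, e.g. by 2 minutes
                    period = (period[0] + timedelta(minutes=2), period[1])
                reduce_position = not reduce_position  # alternate position and time reduction
                results = search(period, section_range)
            return results, period, section_range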
  • the search unit 44 may set a period T1 extended from a specified time zone to a past time zone as a new range condition.
  • the output unit 45 outputs the search result to the access source.
  • For example, the output unit 45 transmits road condition screen information containing the images of the retrieved image data and the numbers of uses to the access source, causing it to display the road condition screen.
  • FIG. 9 is a diagram showing an example of a flow for searching for road conditions.
  • the image search device 20 provides a road condition providing service, and accepts road condition search conditions from the condition designation screen 100.
  • The condition designation screen 100 includes a map area 101 in which a road map is displayed, a time zone selection area 102 for selecting the time zone used as a search condition, a travel direction selection area 103 for selecting the travel direction of the road to be searched, and a search button 104 for instructing the start of a search.
  • In the map area 101, selecting a road puts the road section corresponding to the designated position into a selected state. In the example of FIG. 9, the section 105 indicated by diagonal lines is selected.
  • the time zone selection area 102 is a combo box, and by selecting it, for example, the time zone can be specified in units of 5 minutes. In the example of FIG. 9, the time zone from 10:20 to 10:25 is designated.
  • The traveling direction selection area 103 is a combo box that allows, for example, the up or down traveling direction of the road to be specified. When a road whose traveling direction is defined is selected in the map area 101, the traveling direction of the selected road may be selected automatically in the traveling direction selection area 103.
  • When a road section is selected in the map area 101 of the condition designation screen 100, a time zone is selected in the time zone selection area 102, a travel direction is selected in the travel direction selection area 103, and the search button 104 is selected, the image search device 20 performs a search using the selected road section, time zone, and traveling direction as the search conditions.
  • the image search device 20 displays a road situation screen 120 showing the search results.
  • On the road situation screen 120, the image of each retrieved image data is displayed in association with its imaging position.
  • The road situation screen 120 also displays the number of uses in association with service areas and parking areas. In the example of FIG. 9, since a section other than a service area or a parking area is designated, the images of the retrieved image data are displayed on the road situation screen 120 in association with their imaging positions.
  • FIG. 10 is a diagram illustrating another example of a flow for searching for road conditions.
  • In the example of FIG. 10, a range can be designated in the map area 101 of the condition designation screen 100 by dragging or the like.
  • In the illustrated example, a range 110 is designated.
  • When a range is designated in the map area 101 of the condition designation screen 100, a time zone is selected in the time zone selection area 102, a travel direction is selected in the travel direction selection area 103, and the search button 104 is selected, the image search device 20 performs a search using the road sections included in the designated range of the map area 101, the time zone, and the traveling direction as the search conditions. The image search device 20 then displays a road situation screen 120 showing the search results.
  • On the road situation screen 120, the images of the retrieved image data are displayed for each road in association with their imaging positions.
  • the image search device 20 displays an enlarged image when each image is selected.
  • the road management company that manages the road and the service provider that operates the business vehicle 16 can grasp the road condition.
  • FIG. 11 is a flowchart illustrating an example of an image search procedure. This image search is executed at a predetermined timing, for example, when the search condition is specified on the condition specifying screen 100 and the search button 104 is selected.
  • the search unit 44 searches the image data 30 stored in the storage unit 22 for image data that satisfies the specified search condition (S10).
  • the search unit 44 determines whether the number of searched image data 30 is equal to or less than a predetermined number (S11).
  • When the number of retrieved image data 30 is equal to or less than the predetermined number (Yes in S11), the search unit 44 searches again after extending either the time zone or the position range of the range condition (S12), and returns to S11.
  • Otherwise (No in S11), the search unit 44 determines whether the number of retrieved image data 30 is equal to or greater than the upper limit number (S13). When it is (Yes in S13), the search unit 44 searches again after reducing either the time zone or the position range of the range condition (S14), and returns to S13.
  • When the number of retrieved image data 30 is less than the upper limit number (No in S13), the output unit 45 transmits road situation screen information showing the images of the retrieved image data to the access source (S15), and the process ends.
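  • Tying the steps of FIG. 11 together, a minimal driver could look as follows (expand_and_research and reduce_and_research are the assumed helpers sketched earlier; outputting the road situation screen is reduced to returning the retrieved image IDs):

        def image_search_procedure(search, period, section_range,
                                   predetermined_number=10, upper_limit=15):
            # S10: search for images that satisfy the designated range condition
            results = search(period, section_range)
            # S11/S12: too few hits, so expand the range condition and search again
            if len(results) <= predetermined_number:
                results, period, section_range = expand_and_research(
                    search, period, section_range, predetermined_number)
            # S13/S14: too many hits, so reduce the range condition and search again
            if len(results) >= upper_limit:
                results, period, section_range = reduce_and_research(
                    search, period, section_range, upper_limit)
            # S15: transmit the road situation screen (here simply return the hits)
            return results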
  • the image search device 20 accepts designation of a range condition including a time zone and a position area.
  • The image search device 20 searches the storage unit 22, which stores images associated with position information and time information, and when the number of images stored in the storage unit 22 that satisfy the received range condition is equal to or less than a predetermined number, it searches again after expanding either the time zone or the position range of the range condition.
  • the image search device 20 can provide an image that makes it easy to grasp the road condition even on a road where the traffic frequency of the business vehicle 16 is low and there are few images to be searched.
  • When the number of images stored in the storage unit 22 that satisfy the received range condition is equal to or greater than a predetermined upper limit number that is larger than the predetermined number, the image search device 20 searches again after reducing either the time zone or the position range of the range condition. As a result, the image search device 20 can provide images that make it easy to grasp the road condition even on roads where business vehicles 16 pass frequently and many images would otherwise be retrieved.
  • the image search device 20 accepts specification of a range condition including the traveling direction of the road to be searched.
  • The image search device 20 sets a position range extended ahead in the designated traveling direction of the road as the new range condition.
  • Thereby, the image search device 20 can provide information that makes it easy to predict changes in the road condition, such as congestion, in the area at the designated position.
  • the image search device 20 sets time zone information extended to a past time zone as a new range condition.
  • Thereby, the image search device 20 can easily provide the change in the road condition, such as congestion, leading up to the designated time zone.
  • the image search device 20 gives priority to the extension of time, and expands the position range when the number of images satisfying the range condition including the extended time zone is less than a predetermined number. Thereby, even when the traffic frequency of the business vehicle 16 in the area at the designated position is low, the image search device 20 can provide an image so that the road condition in the area at the designated position can be easily grasped.
  • In the embodiment described above, the vehicle-mounted device 15 captures the images of the road.
  • However, an information processing apparatus with a camera, such as a smartphone or a mobile phone, may instead be fixed to the vehicle to capture the images of the road.
  • the imaging device for imaging is not limited to the vehicle-mounted device 15 and may be an information processing device with a camera.
  • each component of each illustrated apparatus is functionally conceptual and does not necessarily need to be physically configured as illustrated.
  • The specific form of distribution and integration of each device is not limited to that illustrated; all or part of each device may be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
  • the storage unit 40, the specifying unit 41, the totaling unit 42, the receiving unit 43, the search unit 44, and the output unit 45 may be integrated as appropriate.
  • the processing of each processing unit may be appropriately separated into a plurality of processing units.
  • All or any part of the processing functions performed by each processing unit can be realized by a CPU and a program analyzed and executed by the CPU, or can be realized as hardware using wired logic.
  • FIG. 12 is a diagram illustrating an example of the configuration of a computer that executes an image search program.
  • The computer 400 includes a CPU (Central Processing Unit) 410, an HDD (Hard Disk Drive) 420, and a RAM (Random Access Memory) 440. These units 410 to 440 are connected via a bus 500.
  • The HDD 420 stores in advance an image search program 420A that performs the same functions as the storage unit 40, the specifying unit 41, the totaling unit 42, the reception unit 43, the search unit 44, and the output unit 45. Note that the image search program 420A may be divided as appropriate.
  • the HDD 420 stores various information.
  • the HDD 420 stores various data used for OS and operation support.
  • the CPU 410 reads out and executes the image search program 420A from the HDD 420, thereby executing the same operation as each processing unit of the embodiment. That is, the image search program 420A executes the same operations as the storage unit 40, the specifying unit 41, the totaling unit 42, the receiving unit 43, the search unit 44, and the output unit 45.
  • The image search program 420A is not necessarily stored in the HDD 420 from the beginning. For example, the image search program 420A may be stored in a "portable physical medium" such as a flexible disk (FD), a CD-ROM, a DVD, a magneto-optical disk, or an IC card inserted into the computer 400, and the computer 400 may read and execute the program from such a medium.
  • Alternatively, the image search program 420A may be stored in another computer (or server) connected to the computer 400 via a public line, the Internet, a LAN, a WAN, or the like, and the computer 400 may read and execute the program from there.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A reception unit (43) receives designation of a range condition that includes a position area and a time zone. A search unit (44) searches a storage unit (22) that stores images associated with position information and time information and, when the number of image data (30) stored in the storage unit (22) that satisfy the range condition received by the reception unit (43) is equal to or less than a prescribed number, searches again after expanding either the position range or the time-zone range of the range condition.
PCT/JP2016/066073 2016-05-31 2016-05-31 Dispositif, procédé et programme de recherche d'image WO2017208371A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018520263A JP6693560B2 (ja) 2016-05-31 2016-05-31 画像検索プログラム、画像検索方法および画像検索装置
PCT/JP2016/066073 WO2017208371A1 (fr) 2016-05-31 2016-05-31 Dispositif, procédé et programme de recherche d'image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/066073 WO2017208371A1 (fr) 2016-05-31 2016-05-31 Dispositif, procédé et programme de recherche d'image

Publications (1)

Publication Number Publication Date
WO2017208371A1 true WO2017208371A1 (fr) 2017-12-07

Family

ID=60479216

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/066073 WO2017208371A1 (fr) 2016-05-31 2016-05-31 Dispositif, procédé et programme de recherche d'image

Country Status (2)

Country Link
JP (1) JP6693560B2 (fr)
WO (1) WO2017208371A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200192932A1 (en) * 2018-12-13 2020-06-18 Sap Se On-demand variable feature extraction in database environments

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07295875A (ja) * 1994-04-22 1995-11-10 Sharp Corp 情報検索装置
JP2004220420A (ja) * 2003-01-16 2004-08-05 Fuji Photo Film Co Ltd 画像検索方法および装置並びにプログラム
JP2012133526A (ja) * 2010-12-21 2012-07-12 Panasonic Corp 位置履歴認証システム、サーバ装置及びプログラム
JP2013045319A (ja) * 2011-08-25 2013-03-04 Sony Corp 情報処理装置、情報処理方法、およびプログラム
WO2013099002A1 (fr) * 2011-12-28 2013-07-04 楽天株式会社 Dispositif de recherche, procédé de recherche, programme de recherche et support d'enregistrement

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6398501B2 (ja) * 2014-09-10 2018-10-03 株式会社デンソー 車載カメラ診断装置

Also Published As

Publication number Publication date
JP6693560B2 (ja) 2020-05-13
JPWO2017208371A1 (ja) 2019-01-31

Similar Documents

Publication Publication Date Title
EP3335210B1 (fr) Procédé et appareil de fourniture de détection de disponibilité de stationnement sur la base d'informations de trajectoire de véhicule
JP5243896B2 (ja) 情報表示システム、情報表示方法、および、コンピュータプログラム
US20220018674A1 (en) Method, apparatus, and system for providing transportion logistics based on estimated time of arrival calculation
JP6175846B2 (ja) 車両追跡プログラム、サーバ装置および車両追跡方法
EP3410348A1 (fr) Procédé et appareil pour construire un modèle d'occupation de parkings
Kwak et al. Seeing is believing: Sharing real-time visual traffic information via vehicular clouds
US9495813B2 (en) Vehicle data collection system, vehicle data collection method, vehicle-mounted device, program, and recording medium
CN106384512B (zh) 车辆违章信息查询方法以及系统
US10745010B2 (en) Detecting anomalous vehicle behavior through automatic voting
US20150177001A1 (en) System and method for providing a dynamic telematics dashboard
US20160169702A1 (en) Method for providing traffic conditions data using a wireless communications device, and a navigation device in which this method is employed
US20230228578A1 (en) Multi-Computer System for Dynamically Detecting and Identifying Hazards
JP6729701B2 (ja) 特定プログラム、特定方法および特定装置
JP7274840B2 (ja) データ収集装置、データ収集システムおよびデータ収集方法
JP6467847B2 (ja) 移動経路の出力制御プログラム、移動経路の出力制御方法および情報処理装置
WO2017208371A1 (fr) Dispositif, procédé et programme de recherche d'image
US20140244170A1 (en) Adaptive route proposals based on prior rides
JP2013134155A (ja) カーナビゲーションシステム、カーナビゲーション方法、経路探索装置、プログラム、及び記録媒体
CN107204113A (zh) 确定道路拥堵状态的方法、装置和系统
JP6272929B2 (ja) 情報処理システム、情報処理装置、情報処理方法、および情報処理プログラム
WO2017208372A1 (fr) Programme de traitement d'enregistrement d'images, procédé de traitement d'enregistrement d'images, et dispositif de traitement d'enregistrement d'images
JP6383063B1 (ja) 算出装置、算出方法及び算出プログラム
JP2014099016A (ja) 情報処理システム、情報処理装置、サーバ、端末装置、情報処理方法、およびプログラム
JP6517486B2 (ja) 地図表示システム
JP5964158B2 (ja) 情報処理システム、情報処理装置、サーバ、端末装置、情報処理方法、及びプログラム

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2018520263

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16903997

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16903997

Country of ref document: EP

Kind code of ref document: A1