WO2014192248A1 - Person search system, person search method, and in-vehicle device (人物探索システム、人物探索方法及び車載装置) - Google Patents

Person search system, person search method, and in-vehicle device (人物探索システム、人物探索方法及び車載装置)

Info

Publication number
WO2014192248A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driver
face image
image information
database
Prior art date
Application number
PCT/JP2014/002639
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
卓矢 辰己
Original Assignee
DENSO CORPORATION (株式会社デンソー)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DENSO CORPORATION (株式会社デンソー)
Publication of WO2014192248A1 publication Critical patent/WO2014192248A1/ja

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components, by matching or filtering
    • G06V 10/449 Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C 5/0841 Registering performance data
    • G07C 5/085 Registering performance data using electronic data carriers
    • G07C 5/0866 Registering performance data using electronic data carriers, the electronic data carrier being a digital video recorder in combination with video camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N 21/44213 Monitoring of end-user related data
    • H04N 21/44218 Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present disclosure relates to a person search system, a person search method, and an in-vehicle device using a driver's face image photographed by an in-vehicle camera.
  • In Patent Document 1, a technique has been proposed in which the driver's drowsiness level or inattentive driving is detected from the driver's face image and a warning sound is output as necessary.
  • In Patent Document 2, a technique has been proposed for detecting a sign of failure from a slight change in the driver's facial expression before the vehicle breaks down.
  • In Patent Document 3, a technique has been proposed that prevents the vehicle from being stolen by determining whether the driver's face image matches a registered face image.
  • The in-vehicle camera has not only the feature that a clear face image can be obtained, but also the major advantage that a face image can be obtained from the driver's front direction under the same conditions in any vehicle.
  • However, none of the proposed techniques fully utilizes this feature.
  • It is therefore an object of the present disclosure to provide a person search system, a person search method, and an in-vehicle device that effectively use the feature of an in-vehicle camera that a clear and uniform face image can be obtained with any vehicle.
  • The person search system according to the first aspect of the present disclosure includes a vehicle and a database connected to the vehicle via a predetermined communication line.
  • The vehicle includes an in-vehicle camera provided in front of the driver's seat, a generation unit that generates driver face image information by capturing the driver's face image with the in-vehicle camera, a position information acquisition unit that acquires vehicle position information, and a transmission unit that transmits the driver face image information and the position information to the database.
  • The database includes a search target person storage unit that stores searcher face image information based on the search target person's face image, a collation unit that collates the driver face image information transmitted by the transmission unit with the searcher face image information, and a position information storage unit that stores the position information transmitted by the transmission unit in association with the search target person when the collation unit finds that the driver face image information matches the searcher face image information.
  • In the person search method according to the second aspect of the present disclosure, which uses a vehicle and a database connected to the vehicle via a predetermined communication line:
  • driver face image information is generated by photographing the driver's face image with an in-vehicle camera provided in front of the driver's seat of the vehicle,
  • vehicle position information is acquired by the vehicle,
  • the driver face image information and the position information are transmitted from the vehicle to the database,
  • in the database, the transmitted driver face image information is collated with searcher face image information based on the face image of the search target person, and
  • if the driver face image information matches the searcher face image information, the database stores the position information in association with the search target person.
  • The in-vehicle device according to the third aspect of the present disclosure is mounted on a vehicle and communicably connected via a line to a database that stores searcher face image information based on the face image of the search target person.
  • It includes a generation unit that generates driver face image information based on the driver's face image taken by an in-vehicle camera provided in front of the driver's seat of the vehicle,
  • a position information acquisition unit that acquires vehicle position information, and
  • a transmission unit that transmits the driver face image information and the position information to the database.
  • If the driver face image information transmitted by the transmission unit matches the searcher face image information stored in the database, the position information transmitted by the transmission unit is stored in the database in association with the search target person.
  • With this in-vehicle device, it is possible to effectively utilize the feature of the in-vehicle camera that a clear and uniform face image can be obtained for any vehicle.
  • FIG. 1 is an explanatory diagram illustrating a configuration of a person search system according to an embodiment of the present disclosure.
  • FIG. 2 is a flowchart of driver information transmission processing executed by the control unit of the embodiment.
  • FIG. 3 is a flowchart of search target person detection processing executed by the database of the embodiment.
  • FIG. 4 is an explanatory diagram conceptually showing the stored contents of the database of this embodiment.
  • FIG. 5 is a flowchart of driver information transmission processing executed by the control unit according to the first modification of the present disclosure.
  • FIG. 6 is a flowchart of driver information transmission processing executed by the control unit according to the second modification of the present disclosure.
  • FIG. 7 is a flowchart of search target person detection processing executed by the database according to the second modification of the present disclosure.
  • FIG. 8 is a flowchart of a driver information transmission process executed by the control unit of the third modification of the present disclosure.
  • FIG. 1 shows a configuration of a person search system 1 of the present embodiment.
  • The person search system 1 includes an in-vehicle camera 100 that is mounted on a vehicle 10 and captures the driver's face image, a control device 110 that performs processing such as generating facial feature amounts from the driver's face image, and a database 20 that is provided at a location separate from the vehicle 10 and registers information on search target persons.
  • The in-vehicle camera 100 is provided on the upper part of the instrument panel, in front of the driver's seat and at a predetermined distance (for example, about 1 meter) from it. It emits near-infrared light that is invisible to humans and captures the driver's face image by detecting the reflected light.
  • The control device 110 includes a control unit 111, a feature amount generation unit 112, a position information acquisition unit 113, a communication unit 114, and a storage unit 115.
  • The control unit 111 outputs a request signal that requests the in-vehicle camera 100 to capture a face image.
  • When the feature amount generation unit 112 receives a face image captured by the in-vehicle camera 100, it generates facial feature amounts (the shape, size, and so on of the eyes, nose, mouth, etc.) from the face image and outputs them to the control unit 111.
  • The position information acquisition unit 113 acquires the position information of the vehicle 10 based on GPS signals received by the GPS device 130 from GPS satellites (not shown), and outputs the position information to the control unit 111.
  • The communication unit 114 receives feature amounts and position information from the control unit 111 and transmits them to the database 20 connected via a predetermined wireless communication line.
  • The storage unit 115 stores the vehicle number of the vehicle 10 (information that can identify the vehicle 10); the vehicle number is transmitted to the database 20 together with the feature amounts and the position information.
  • In the database 20, feature amounts obtained from the face images of persons whose whereabouts need to be searched for, such as criminals for whom arrest warrants have been issued and missing persons for whom search requests have been filed, are registered in advance. When the database 20 receives the driver's feature amounts from the vehicle 10, it collates those feature amounts with the feature amounts of the search target persons registered in the database 20 and judges whether the driver of the vehicle 10 corresponds to a search target person.
  • FIG. 2 shows a flowchart of driver information transmission processing executed by the control unit 111 of this embodiment.
  • When the engine is started, the control unit 111 starts the driver information transmission process and determines whether the vehicle 10 has started traveling (S100). Whether the vehicle 10 has started traveling is determined by the control unit 111 detecting that the speed detected by the vehicle speed sensor 120 is not zero. If the vehicle 10 has not started traveling (S100: no), this determination is repeated and the process waits until the vehicle 10 starts traveling.
  • When the vehicle 10 starts traveling (S100: yes), the control unit 111 outputs a request signal for a face image to the in-vehicle camera 100 and causes it to capture the driver's face image (S102).
  • Because the in-vehicle camera 100 photographs using near-infrared light invisible to humans, it can capture a clear face image even at night while keeping the driver from being dazzled.
  • The face image captured by the in-vehicle camera 100 is output to the feature amount generation unit 112, which generates facial feature amounts from it (S104).
  • A known method used for so-called face recognition can be used to generate the feature amounts. For example, feature points of facial parts, such as the outer and inner corners of the eyes, are extracted from the captured face image, and feature amounts such as the shape and size of the facial parts (eyes, nose, mouth, etc.) are generated based on those feature points.
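As a concrete illustration of the feature-amount generation just described, the following sketch derives a scale-invariant feature vector from hypothetical facial landmark coordinates. The landmark names, the pairwise-distance measure, and the matching tolerance are all illustrative assumptions; the patent only requires that feature amounts be generated from facial feature points by a known face-recognition method.

```python
import math

def face_feature_amount(landmarks):
    """Build a scale-invariant feature vector from named landmark points.

    Normalising every pairwise distance by the inter-eye distance makes the
    vector independent of how far the driver sits from the camera (assumed
    measure; the patent does not specify one).
    """
    names = sorted(landmarks)
    (ex0, ey0), (ex1, ey1) = landmarks["left_eye"], landmarks["right_eye"]
    eye_dist = math.hypot(ex1 - ex0, ey1 - ey0)
    feats = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            (xa, ya), (xb, yb) = landmarks[names[i]], landmarks[names[j]]
            feats.append(math.hypot(xb - xa, yb - ya) / eye_dist)
    return feats

def feature_match(f1, f2, tol=0.05):
    """Two feature vectors match if every normalised distance agrees within tol."""
    return len(f1) == len(f2) and all(abs(a - b) <= tol for a, b in zip(f1, f2))
```

Because the distances are normalised, the same face photographed at twice the distance (all coordinates halved) still yields a matching vector.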
  • Next, the control unit 111 reads the vehicle number of the vehicle 10 from the storage unit 115 and transmits it from the communication unit 114 to the database 20 together with the generated feature amounts (S106). Since the vehicle number is unique to the vehicle 10, the database 20 that receives the feature amounts and the vehicle number can identify the vehicle on which the driver having those feature amounts rides.
  • After transmitting the feature amounts and the vehicle number, the control unit 111 determines whether a coincidence signal is received from the database 20 within a first predetermined time (for example, 15 seconds) (S108).
  • If the coincidence signal is received (S108: yes), the position information acquisition unit 113 acquires position information indicating the driver's whereabouts from the GPS device 130 (S110). Subsequently, the control unit 111 transmits the vehicle number of the vehicle 10 together with the position information from the communication unit 114 to the database 20 (S112). That is, by performing S110 and S112, the position information and vehicle number of the vehicle 10 that the search target person is driving are transmitted to the database 20. The transmission of this information is repeated every second predetermined time (for example, 10 seconds) (S114). Therefore, even if the vehicle 10 that the search target person is driving moves, the database 20 can acquire the latest position of the search target person.
  • If the coincidence signal is not received within the first predetermined time (S108: no), it is considered that the database 20 has determined that the driver of the vehicle 10 is not a search target person (the feature amounts transmitted from the vehicle 10 did not match those of any search target person). In this case, the process returns to S100 at the beginning, and when the vehicle 10 starts traveling again, the series of processes from S102 is repeated.
  • The driver of the vehicle 10 may hand over driving to another person while the engine is running, and the vehicle 10 is presumably stopped temporarily at the time of the handover. Therefore, by repeating the series of processes from S102 each time the vehicle 10 starts traveling (from a stopped state) (S100: yes), it is possible to determine whether the new driver is a search target person even when the driver changes while the engine is running.
  • As described above, when the vehicle 10 of the present embodiment receives the coincidence signal after transmitting the feature amounts, it is determined that the driver having those feature amounts is the search target person, and the position information and vehicle number of the vehicle 10 are periodically transmitted to the database 20 in order to track that person.
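The vehicle-side flow of FIG. 2 (S100 to S114) could be sketched as follows. All of the injected callables (camera, feature generator, transmitter, coincidence-signal wait, GPS reader) are hypothetical stand-ins; only the control flow mirrors the flowchart.

```python
# Timing constants named in the text (seconds).
FIRST_PREDETERMINED_TIME = 15   # wait for a coincidence signal (S108)
SECOND_PREDETERMINED_TIME = 10  # interval between position reports (S114)

def driver_info_transmission(capture_face, make_feature, vehicle_number,
                             send, wait_for_match, get_position, max_reports=3):
    """Run one pass of the FIG. 2 flow; returns the number of position reports.

    max_reports bounds the S110-S114 loop so the sketch terminates; in the
    real system the reports would continue while tracking is needed.
    """
    face = capture_face()                        # S102: photograph the driver
    feature = make_feature(face)                 # S104: generate feature amounts
    send(("feature", feature, vehicle_number))   # S106: transmit to database 20
    if not wait_for_match(FIRST_PREDETERMINED_TIME):  # S108: coincidence signal?
        return 0  # not a search target person; caller returns to S100
    reports = 0
    while reports < max_reports:                 # S110-S114: repeat every 10 s
        send(("position", get_position(), vehicle_number))
        reports += 1
    return reports
```

Stub callables make the flow easy to exercise: a `send` that appends to a list records exactly one feature message followed by the periodic position messages.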
  • FIG. 3 shows a flowchart of the search target person detection process executed by the database 20.
  • The database 20 determines whether the driver of the vehicle 10 is a search target person by performing the following processing on the various information transmitted from the vehicle 10, and if the driver is a search target person, detects the driver's position.
  • The database 20 first determines whether feature amounts and a vehicle number have been received from the vehicle 10 (S200). As described above with reference to FIG. 2, the feature amounts and the vehicle number are transmitted from the vehicle 10 when the vehicle 10 starts traveling.
  • If the feature amounts and the vehicle number have not been received (S200: no), it is determined whether position information and a vehicle number have been received (S202).
  • As described above, the position information and the vehicle number are transmitted from the vehicle 10 after the database 20 has determined that the driver of the vehicle 10 is a search target person (after the coincidence signal has been received).
  • The position information and the vehicle number are transmitted repeatedly from the vehicle 10 in order to track the vehicle 10 driven by the search target person.
  • If the feature amounts and the vehicle number have been received (S200: yes), the received feature amounts are collated with the feature amounts of the search target persons registered in the database 20 (S204).
  • FIG. 4 conceptually shows the information stored in the database 20.
  • As illustrated, the name of each search target person, the file name of the person's face image (face image file), and the feature amounts extracted from that face image are registered in advance in the database 20. In addition, a vehicle number and position information can be stored in association with each search target person.
  • In the process of S204, the database 20 collates the feature amounts received from the vehicle 10 with the feature amounts of the search target persons registered in advance.
  • If the received feature amounts match those of a search target person (S206: yes), the vehicle number is stored in association with the matched search target person (S208).
  • That is, the vehicle number of the vehicle 10 driven by the search target person is stored.
  • In the example shown in FIG. 4, the feature amounts received from the vehicle 10 match those of search target person "C", so the vehicle number "nnnnn" of the vehicle 10 is stored in association with search target person "C".
  • The database 20 then transmits a coincidence signal indicating that the driver of the vehicle 10 is a search target person to the vehicle 10 (S210).
  • After that, the database 20 returns to S200 at the beginning and repeats the series of processes described above. Even when it is determined in S206 that the driver of the vehicle 10 is not a search target person (S206: no), the process likewise returns to S200 and the series of processes is repeated.
  • The vehicle 10 that has received the coincidence signal from the database 20, that is, the vehicle 10 determined to be driven by a search target person, transmits its position information and vehicle number to the database 20 as described above (S112 in FIG. 2). Therefore, when the database 20 receives the position information and the vehicle number (S202: yes), it identifies the search target person whose vehicle number was stored in S208 and stores the position information in association with that person (S212). In this way, the position information of (the vehicle 10 driven by) the search target person is stored. In the example shown in FIG. 4, by receiving the vehicle number of search target person "C", the position information "La, Lo" is stored in association with search target person "C".
  • As described above, the position information and the vehicle number are transmitted from the vehicle 10 to the database 20 every second predetermined time (see S114 in FIG. 2), so the position information stored in the database 20 is also updated every second predetermined time.
  • When the position information of the search target person is stored in this way (S212), the position information and the vehicle number, together with identification information (such as the name) of the search target person, are notified to the person in charge of the search (S214).
  • When the search target person is a criminal or a missing person, an example of the person in charge of the search is a staff member of a police station (related organization) searching for such persons.
  • The person in charge of the search can then track down the search target person based on the position information and the vehicle number of the vehicle 10.
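The database-side flow of FIG. 3 (S200 to S214) might be sketched like this. The in-memory dictionaries standing in for database 20 and the exact-equality collation are simplifying assumptions; the patent leaves the matching criterion to known face-recognition methods.

```python
class SearchTargetDatabase:
    """Sketch of database 20 and the FIG. 3 / FIG. 4 stored contents."""

    def __init__(self):
        self.targets = {}        # name -> registered feature amounts (FIG. 4)
        self.vehicle_of = {}     # name -> vehicle number (stored in S208)
        self.position_of = {}    # name -> latest position (stored in S212)
        self.notifications = []  # messages to the person in charge (S214)

    def register(self, name, feature):
        """Pre-register a search target person's feature amounts."""
        self.targets[name] = feature

    def on_feature(self, feature, vehicle_number):
        """S204-S210: collate, store the vehicle number, return the coincidence signal."""
        for name, registered in self.targets.items():
            if registered == feature:                   # S204/S206 collation
                self.vehicle_of[name] = vehicle_number  # S208
                return True                             # coincidence signal (S210)
        return False

    def on_position(self, position, vehicle_number):
        """S212-S214: attach the position to the tracked person and notify."""
        for name, vn in self.vehicle_of.items():
            if vn == vehicle_number:
                self.position_of[name] = position
                self.notifications.append((name, vehicle_number, position))
```

Registering person "C" and replaying the two message types reproduces the FIG. 4 example: the vehicle number "nnnnn" and then the position are stored against "C".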
  • As described above, in the person search system 1 of the present embodiment, a face image is captured from the front of the driver while the vehicle 10 is traveling, by the in-vehicle camera 100 provided in front of the driver's seat and within a predetermined distance of it.
  • Feature amounts are generated from the face image and transmitted to the database 20.
  • The database 20 stores the feature amounts of the search target persons in advance; if the driver's feature amounts transmitted from the vehicle 10 match those of a search target person, the driver of the vehicle 10 is determined to be that search target person, and the position information transmitted from the vehicle 10 is stored as the position information of the search target person.
  • Conventionally, face images photographed under various conditions were used, for example, face images photographed from the side or from a distance. Such face images required heavy correction processing before their feature amounts could be collated with those of the search target persons, or accurate feature amounts could not be extracted from them at all, so it was difficult to find a search target person.
  • In contrast, in the person search system 1, the in-vehicle camera 100 is provided in front of the driver's seat and at a predetermined distance (for example, about 1 meter) from it, so a clear and homogeneous face image can be obtained for any vehicle.
  • Since the feature amounts are generated from such clear, homogeneous face images and collated with the feature amounts of the search target persons, a search target person can be found easily and with high accuracy.
  • Moreover, since the vehicle 10 driven by the search target person transmits its position information to the database 20, the position of the search target person can be specified.
  • In the present embodiment, the control device 110 corresponds to the in-vehicle device.
  • The control unit 111 and the feature amount generation unit 112 generate the driver's feature amounts (driver face image information) by capturing the driver's face image with the in-vehicle camera 100 when the vehicle 10 starts traveling. Therefore, the control unit 111 and the feature amount generation unit 112 of the present embodiment correspond to the "generation unit" in the present disclosure.
  • Since the position information acquisition unit 113 of the present embodiment acquires the position information of the vehicle 10, it corresponds to the "position information acquisition unit" in the present disclosure.
  • Since the communication unit 114 transmits the driver's feature amounts (driver face image information) and the position information to the database, it corresponds to the "transmission unit" in the present disclosure.
  • The database of the present embodiment stores feature amounts (searcher face image information) based on the face images of the search target persons; it collates the driver's feature amounts (driver face image information) transmitted by the communication unit 114 (transmission unit) of the vehicle 10 with the feature amounts (searcher face image information) of the search target persons; and, if the collation finds that the driver's feature amounts match those of a search target person, it stores the position information transmitted by the communication unit 114 (transmission unit). Therefore, the database of the present embodiment corresponds to the "search target person storage unit", the "collation unit", and the "position information storage unit" in the present disclosure.
  • In the embodiment described above, the driver's feature amounts are transmitted from the communication unit 114 of the vehicle 10, but the driver's face image may be transmitted instead of the feature amounts.
  • In that case, the database 20 may generate the feature amounts from the received face image.
  • The vehicle 10 often starts traveling without the driver having changed, such as after stopping temporarily at a traffic light or sign. Accordingly, it may be determined whether the driver of the vehicle 10 has changed, and the driver's feature amounts may be transmitted to the database 20 only when it is determined that the driver has changed.
  • FIG. 5 shows a flowchart of the driver information transmission process of the first modification.
  • In the first modification as well, the control unit 111 starts the driver information transmission process when the engine is started and, as in the embodiment, determines from the speed detected by the vehicle speed sensor 120 whether the vehicle 10 has started traveling (S300). When the vehicle 10 starts traveling (S300: yes), the in-vehicle camera 100 captures the driver's face image (S302), and the feature amount generation unit 112 generates feature amounts from the face image (S304).
  • Next, the feature amounts generated from the driver's face image are stored at a predetermined address in the storage unit 115 (S306). The control unit 111 then determines whether the feature amounts generated this time match the feature amounts generated previously (S308). That is, in the first modification as well, as in the embodiment described above, the driver's face image is photographed and feature amounts are generated from it each time the vehicle 10 starts traveling.
  • Specifically, the feature amounts generated immediately before (from the face image captured immediately before) are read from the storage unit 115, and it is determined whether they match the feature amounts newly generated this time (S308).
  • If they do not match (S308: no), the control unit 111 transmits the feature amounts generated this time, together with the vehicle number of the vehicle 10, from the communication unit 114 to the database 20 (S310). In this way, it can be determined whether the new driver is a search target person at the timing when the driver changes.
  • Conversely, if the feature amounts match, the feature amounts and the vehicle number are not transmitted to the database 20, so the amount of information transmitted from the vehicle 10 to the database 20 can be suppressed.
  • When the engine is started, the control unit 111 initializes (clears) the feature amounts stored in the storage unit 115. Therefore, when the feature amounts generated this time come from the first face image captured after the engine is started, there are no previously generated feature amounts to compare against in S308. In this case, the feature amounts generated this time (the first generated after the engine start) are of course transmitted to the database 20 together with the vehicle number (S310).
  • As described above, in the first modification, the newly generated driver feature amounts (driver face image information) are compared with the driver feature amounts generated before them, and the newly generated feature amounts are transmitted to the database 20 only when the two differ. Accordingly, the communication unit 114 of the first modification corresponds to the "transmission unit" in the present disclosure.
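The change-gated transmission of the first modification (S306 to S310) could be sketched as follows. The numeric tolerance used to decide that two feature amounts "match" is an assumption for illustration.

```python
class ChangeGatedTransmitter:
    """Sketch of FIG. 5: transmit feature amounts only when the driver changes."""

    def __init__(self, send, tol=0.05):
        self.send = send
        self.tol = tol
        self.last = None  # storage unit 115 slot; cleared at engine start

    def on_travel_start(self, feature, vehicle_number):
        """Return True if the feature amounts were transmitted (S310)."""
        changed = (self.last is None or                      # first run after start
                   len(feature) != len(self.last) or
                   any(abs(a - b) > self.tol
                       for a, b in zip(feature, self.last)))  # S308 comparison
        self.last = feature                                   # S306: store
        if changed:
            self.send((feature, vehicle_number))              # S310: transmit
        return changed
```

Replaying three travel starts shows the gating: the first transmission always happens, an unchanged driver is suppressed, and a changed feature vector triggers a new transmission.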
  • FIG. 6 shows a flowchart of the driver information transmission process of the second modification.
  • In the second modification, the control unit 111 determines whether the vehicle speed of the vehicle 10 exceeds the legal speed limit (S400). This is determined by the control unit 111 detecting that the speed detected by the vehicle speed sensor 120 exceeds the legal speed limit stored in the storage unit 115. If the limit is not exceeded (S400: no), this determination is repeated.
  • If the legal speed limit is exceeded, the control unit 111 outputs a request signal for a face image to the in-vehicle camera 100 and causes it to capture the driver's face image (S402).
  • When the feature amount generation unit 112 of the second modification receives a face image from the in-vehicle camera 100, it outputs the face image (driver face image information) to the control unit 111 as it is.
  • This face image serves as a clue for finding the driver of the vehicle 10 that violated the speed limit.
  • Next, the position information acquisition unit 113 is caused to acquire the position information of the vehicle 10 from the GPS device 130 (S404). Then, the driver's face image is transmitted to the database 20 together with the position information and the vehicle number (S406).
  • When the database 20 receives the face image of the driver of the vehicle 10, it performs processing for searching for the driver regardless of whether the driver is registered (stored) in advance.
  • FIG. 7 shows a flowchart of the search target person detection process according to the second modification.
  • The database 20 determines whether a face image, position information, and a vehicle number have been received (S500). If these pieces of information have not been received (S500: no), this determination is repeated and the process waits until they are received.
  • If they have been received, the face image received from the vehicle 10 is notified, together with the position information and the vehicle number, to a related organization such as a police station (S510).
  • A member of the related organization that performs the search can then find the driver of the vehicle 10 using the face image, position information, and vehicle number received by the database 20 from the vehicle 10 as clues.
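The speed-violation trigger of the second modification (S400 to S406) might look like the following sketch. The speed units and the report tuple format are illustrative assumptions; note that the raw face image is sent as-is, with no feature-generation step.

```python
def speed_violation_report(speed_kmh, legal_limit_kmh, capture_face,
                           get_position, vehicle_number):
    """Return a (face, position, vehicle_number) report when the limit is
    exceeded, else None.

    Mirrors FIG. 6: S400 gates on the stored legal speed; S402 captures the
    face image (sent as-is); S404 reads the position; S406 would transmit the
    report to database 20.
    """
    if speed_kmh <= legal_limit_kmh:   # S400: no violation, keep monitoring
        return None
    face = capture_face()              # S402: raw image, no feature amounts
    position = get_position()          # S404
    return (face, position, vehicle_number)  # S406 payload
```

A caller would invoke this on every speed-sensor reading and forward any non-None result to the database.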
  • the present invention is not limited to such a case.
  • a criminal such as when the driver steals the vehicle 10 and drives the vehicle, It is good also as searching for a driver.
  • whether or not the driver of the vehicle 10 is a criminal may be determined as follows.
  • a feature amount generated from a face image of the owner of the vehicle 10 or a member of the owner's family is stored in the storage unit 115 in advance. When the control unit 111 determines that the vehicle 10 has started traveling and that the driver's feature amount does not match any stored feature amount, the control unit 111 transmits the driver's face image captured by the in-vehicle camera 100, together with the position information and the vehicle number of the vehicle 10, from the communication unit 114 to the database 20, in the same manner as the process of S406 described above.
  • the staff of the related organization that performs the search uses the face image, the position information, and the vehicle number received by the database 20 from the vehicle 10 as clues. Can find a driver.
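A minimal sketch of the theft check just described, assuming Euclidean distance between face feature vectors and an arbitrary matching threshold; neither the metric, the threshold, nor the function names are specified in the text.

```python
import math

def euclidean(a, b):
    """Distance between two face feature vectors (illustrative metric)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_known_driver(driver_features, stored_features, threshold=0.5):
    """True if the driver matches the owner or a family member whose
    feature amounts were stored in the storage unit 115 in advance."""
    return any(euclidean(driver_features, f) <= threshold for f in stored_features)

def on_start_of_travel(driver_features, stored_features, send_to_database):
    """When the vehicle 10 starts traveling, report an unknown driver
    (cf. S406); the transport callback is an assumption of this sketch."""
    if not is_known_driver(driver_features, stored_features):
        send_to_database(driver_features)   # face image info would go with it
        return True
    return False
```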
  • FIG. 8 shows a flowchart of the driver information transmission process of Modification 3. In Modification 3 as well, the control unit 111 determines whether or not the vehicle 10 has started traveling based on the speed detected by the vehicle speed sensor 120 (S600). If the vehicle 10 has not started traveling (S600: no), this determination process is repeated and the process remains in a standby state until the vehicle 10 starts traveling.
  • the control unit 111 then determines whether the steering angle of the steering wheel is within a predetermined range including the straight traveling direction of the vehicle 10 (for example, within the range of -10 to +10 degrees, with the straight traveling direction of the vehicle 10 taken as 0 degrees) (S602).
  • the steering angle of the steering wheel is detected by a steering angle sensor (not shown) provided in the vehicle 10 and output to the control unit 111.
  • if so, a request signal requesting a face image is output to the in-vehicle camera 100, causing the in-vehicle camera 100 to capture the driver's face image (S604).
  • the feature amount generation unit 112 generates a feature amount from the photographed face image (S606).
  • in Modification 3, a face image is captured and a feature amount is generated when the steering angle of the steering wheel is within the predetermined range, that is, when the traveling direction of the vehicle is close to straight. In this way, a clear face image can be captured from the front of the driver with high probability, and the feature amount can be generated from it.
  • this is because, when the steering wheel is turned largely to the left or right, the driver is likely to be driving while looking in the left or right direction rather than straight ahead.
  • thereafter, it is determined based on the vehicle speed detected by the vehicle speed sensor 120 whether or not the vehicle 10 is traveling (S608).
  • since the steering angle sensor of Modification 3 detects the steering angle of the steering wheel of the vehicle 10, it corresponds to the “steering angle detection unit” in the present disclosure.
  • the control unit 111 and the feature amount generation unit 112 of Modification 3 generate the driver's feature amount (driver face image information) when the steering angle is within a predetermined range including the straight traveling direction of the vehicle 10. Therefore, the control unit 111 and the feature amount generation unit 112 of Modification 3 correspond to the “generation unit” in the present disclosure.
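The capture gating of Modification 3 (S600/S602) reduces to a simple predicate. The ±10-degree range is the example given above; treating any nonzero speed as "traveling" is an assumption of this sketch.

```python
def should_capture(speed_kmh, steering_deg, straight_range=(-10.0, 10.0)):
    """S600/S602: request a face image from the in-vehicle camera 100 only
    while the vehicle 10 is traveling and the steering angle reported by
    the steering angle sensor is within the near-straight range."""
    lo, hi = straight_range
    return speed_kmh > 0.0 and lo <= steering_deg <= hi
```

Gating on this predicate means the camera is triggered only when the driver is likely facing forward, which is exactly why Modification 3 yields frontal, clear face images with high probability.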
  • the above disclosure includes the following aspects.
  • in the person search system 1 described above, the in-vehicle camera 100 provided in front of the driver's seat photographs the driver's face while the vehicle 10 is traveling.
  • driver face image information based on the captured face image is generated and, together with the position information of the vehicle 10, transmitted to the database 20.
  • the database 20 stores in advance searcher face image information based on the face image of a search target person. When the driver face image information matches the searcher face image information, the database 20 stores the position information of the vehicle 10 transmitted from the vehicle 10 in association with the search target person.
  • since the in-vehicle camera 100 is provided in front of the driver's seat and at a relatively short distance from it (for example, 1 meter), a clear face image of the driver can be obtained, and therefore highly accurate driver face image information can be obtained. Further, since the in-vehicle camera 100 captures the driver's face at roughly the same angle and distance in every vehicle 10, homogeneous driver face image information can be obtained. Because the database 20 collates such accurate and homogeneous face image information against the searcher face image information, the search target person can be found easily and with high accuracy. In addition, since the vehicle 10 transmits not only the driver face image information but also the position information (simultaneously or separately) to the database 20, the position of the search target person can be specified when that person is found.
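The database-side collation can be sketched as follows. Cosine similarity and the 0.9 threshold are illustrative assumptions; the patent does not specify the matching algorithm, only that matched positions are stored in association with the search target person.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face-feature vectors (illustrative measure)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def match_and_record(driver_info, position, registry, records, threshold=0.9):
    """Compare received driver face image information against each stored
    searcher face image information entry; on a match, store the vehicle's
    position in association with the search target person."""
    for person_id, searcher_info in registry.items():
        if cosine_similarity(driver_info, searcher_info) >= threshold:
            records.setdefault(person_id, []).append(position)
            return person_id
    return None
```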
  • in the person search system 1, the steering angle of the steering wheel may be detected, and the driver face image information may be generated when the steering angle is within a predetermined range including the straight traveling direction of the vehicle 10.
  • when the traveling direction of the vehicle 10 is almost straight, the driver is likely to be driving while looking forward. Therefore, when the steering angle of the steering wheel is within the predetermined range and the traveling direction of the vehicle 10 is almost straight, a face image is captured to generate the driver face image information. In this way, a clear face image can be captured from the front of the driver with high probability, and the driver face image information can be generated from it.
  • the driver face image information may be generated when the vehicle 10 starts to travel.
  • in the person search system 1, when newly generated driver face image information differs from the driver face image information generated before it, the newly generated driver face image information may be transmitted to the database 20.
  • the newly generated driver face image information differs from earlier driver face image information when the driver has changed. Since the driver face image information is transmitted to the database 20 whenever the driver changes, it becomes possible to determine, at the timing of the change, whether or not the new driver is a search target person.
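A minimal sketch of the driver-change behaviour described above: newly generated driver face image information is transmitted only when it differs from the previously generated information. Comparing feature vectors by exact inequality is an illustrative simplification; a real system would use a similarity threshold.

```python
class DriverChangeNotifier:
    """Transmit newly generated driver face image information to the
    database only when it differs from the previously generated one,
    i.e. when the driver appears to have changed."""

    def __init__(self, send_to_database):
        self._send = send_to_database   # transport callback (assumption)
        self._last = None               # last driver face image information

    def on_new_face_info(self, face_info):
        if face_info == self._last:
            return False          # same driver: nothing to transmit
        self._send(face_info)     # driver changed (or first observation)
        self._last = face_info
        return True
```

This keeps traffic to the database low while still letting it check, at the moment of the change, whether the new driver is a search target person.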
  • the in-vehicle device 110 is mounted on the vehicle 10 and is communicably connected, via a line, to the database 20 in which searcher face image information based on the face image of a search target person is stored.
  • the in-vehicle device 110 includes generation units 111 and 112 that generate driver face image information based on the driver's face image captured by the in-vehicle camera 100 provided in front of the driver's seat of the vehicle 10, a position information acquisition unit 113 that acquires the position information of the vehicle 10, and a transmission unit 114 that transmits the driver face image information and the position information to the database 20.
  • when the driver face image information matches the searcher face image information, the position information transmitted by the transmission unit 114 is stored in the database 20 in association with the search target person.
  • because the in-vehicle device 110 transmits homogeneous driver face images captured by the in-vehicle camera to the database 20, and the database 20 collates this accurate and homogeneous face image information against the searcher face image information, a search target person can be found easily and with high accuracy.
  • further, since the in-vehicle device 110 transmits not only the driver face image information but also the position information (simultaneously or separately) to the database 20, the position of the search target person can be specified when that person is found.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Time Recorders, Drive Recorders, Access Control (AREA)
  • Collating Specific Patterns (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Alarm Systems (AREA)
  • Traffic Control Systems (AREA)
PCT/JP2014/002639 2013-05-29 2014-05-20 Person search system, person search method, and in-vehicle device WO2014192248A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-112816 2013-05-29
JP2013112816A JP6052062B2 (ja) Person search system

Publications (1)

Publication Number Publication Date
WO2014192248A1 true WO2014192248A1 (ja) 2014-12-04

Family

ID=51988301

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/002639 WO2014192248A1 (ja) 2013-05-29 2014-05-20 Person search system, person search method, and in-vehicle device

Country Status (2)

Country Link
JP (1) JP6052062B2
WO (1) WO2014192248A1

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106355675A (zh) * 2016-08-31 2017-01-25 重庆市朗信智能科技开发有限公司 一种obd隐藏式汽车行车记录设备
JP6402787B2 (ja) * 2017-01-27 2018-10-10 日本電気株式会社 監視システム、監視センター装置、搭載装置、監視方法、処理方法、プログラム
JP6933161B2 (ja) * 2018-03-02 2021-09-08 日本電気株式会社 画像処理装置、画像処理方法、プログラム
US12079319B2 (en) 2019-04-18 2024-09-03 Nec Corporation Person specifying device, person specifying method, and recording medium
JP2021119436A (ja) * 2020-01-30 2021-08-12 荘太郎 林 交通事故と逃げ得が急激に大幅に激減する自動車。
JP7581972B2 (ja) 2021-03-04 2024-11-13 株式会社Jvcケンウッド 検知機能制御装置、検知機能制御方法、及びプログラム

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006260483A (ja) * 2005-03-18 2006-09-28 Toshiba Corp 顔照合システム及び顔照合方法
JP2009059259A (ja) * 2007-09-03 2009-03-19 Masahiro Watanabe 車両運行管理システム

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107466223A (zh) * 2015-04-09 2017-12-12 宝马股份公司 用于电子多功能装置的控制
WO2017002240A1 (ja) * 2015-07-01 2017-01-05 株式会社日立国際電気 監視システム、撮影側装置、及び照合側装置
JPWO2017002240A1 (ja) * 2015-07-01 2018-04-19 株式会社日立国際電気 監視システム、撮影側装置、及び照合側装置
US10198802B2 (en) 2015-07-01 2019-02-05 Hitachi Kokusai Electric Inc. Monitoring system, photography-side device, and verification-side device
CN110113540A (zh) * 2019-06-13 2019-08-09 广州小鹏汽车科技有限公司 一种车辆拍摄方法、装置、车辆和可读介质
CN110113540B (zh) * 2019-06-13 2021-06-04 广州小鹏汽车科技有限公司 一种车辆拍摄方法、装置、车辆和可读介质
CN110443213A (zh) * 2019-08-12 2019-11-12 北京比特大陆科技有限公司 面部检测方法、目标检测方法和装置
CN111078927A (zh) * 2019-12-19 2020-04-28 罗普特科技集团股份有限公司 基于家谱数据识别驾驶员身份的方法、装置、存储介质

Also Published As

Publication number Publication date
JP2014232421A (ja) 2014-12-11
JP6052062B2 (ja) 2016-12-27

Similar Documents

Publication Publication Date Title
JP6052062B2 (ja) 人物探索システム
CN109080641B (zh) 驾驶意识推定装置
US9047721B1 (en) Driver log generation
US20170009509A1 (en) Device and method for opening trunk of vehicle, and recording medium for recording program for executing method
US11587442B2 (en) System, program, and method for detecting information on a person from a video of an on-vehicle camera
JP6509361B2 (ja) 駐車支援装置及び駐車支援方法
JP2019091255A (ja) 情報処理装置、運転者モニタリングシステム、情報処理方法、及び情報処理プログラム
CN108327719A (zh) 辅助车辆行驶的方法及装置
KR20170051197A (ko) 운전 패턴 분석을 통한 운전자 상태 감시 방법 및 장치
JP6448880B1 (ja) 危険情報収集装置
JP6048246B2 (ja) 車間距離計測装置及び車間距離計測方法
CN111325088B (zh) 信息处理系统、记录介质以及信息处理方法
JP2009237897A (ja) 画像認識装置
JP2019088522A (ja) 情報処理装置、運転者モニタリングシステム、情報処理方法、及び情報処理プログラム
JP6288204B1 (ja) 車両用制限速度検出装置
JP2002342883A (ja) 危険運転抑制装置
CN111464736A (zh) 服务器、服务器控制方法、车辆、车辆控制方法以及存储程序的存储介质
JP2019067201A (ja) 車両捜索システム、車両捜索方法、ならびに、それに用いられる車両およびプログラム
US12052563B2 (en) System for data communication using vehicle camera, method therefor and vehicle for the same
JP6331232B2 (ja) 車両用制限速度検出装置
CN116895058A (zh) 物体信息取得方法以及用于实现该方法的系统
JP7110254B2 (ja) 捜索システム
CN112639902B (zh) 队列车辆判别装置和车辆
JP7030000B2 (ja) 情報処理方法、情報処理システム、および、プログラム
JP2023166227A (ja) 情報処理装置、情報処理システム、情報処理方法、及び情報処理プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14805017

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14805017

Country of ref document: EP

Kind code of ref document: A1