JP2007025767A - Image recognition system, image recognition method, and image recognition program - Google Patents

Image recognition system, image recognition method, and image recognition program

Info

Publication number
JP2007025767A
Authority
JP
Japan
Prior art keywords
image
recognition
image recognition
detection
person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP2005202875A
Other languages
Japanese (ja)
Inventor
Kajiro Ushio
嘉次郎 潮
Original Assignee
Nikon Corp
株式会社ニコン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp (株式会社ニコン)
Priority to JP2005202875A
Publication of JP2007025767A
Legal status: Withdrawn

Abstract

[Problem]
When a face region is extracted from a camera image of a person entering or leaving a room and then recognized, recognition is very difficult because the facial features differ greatly depending on the orientation of the photographed face.
[Solution]
The image recognition system comprises a camera equipped with a lens, image position detection means for detecting the position of a recognition object in an image captured by the camera, image recognition means for recognizing the recognition object at the position detected by the image position detection means, and system control means for controlling the entire system, wherein the image recognition means changes the image recognition processing method according to the position of the recognition object detected by the image position detection means.
[Selection] Figure 1

Description

  The present invention relates to an image recognition system and method, and an image recognition program for executing them on a computer.

  In recent years, from the viewpoint of security, entrance/exit management systems that manage people entering and exiting specific places and rooms have been widely introduced. However, many conventional entrance/exit management systems use ID cards: although the ID card information remains in the entry record, a person can enter using someone else's ID card, so personal authentication is needed to identify the person actually entering the room. Fingerprint authentication, iris authentication, vein authentication, and the like are known techniques for authenticating people, but with these methods each person must stop to be authenticated on entry and exit, which makes them difficult to use in places where many people come and go frequently.

Therefore, as an alternative, technology for recognizing who is entering and leaving by processing camera images has been actively developed. However, when a passing person is recognized by identifying facial features in a camera image, recognition is very difficult because the features differ greatly depending on the orientation of the photographed face.
Patent Document 1 introduces a technique that improves collation accuracy by performing image recognition on portions whose feature amounts vary little, so that stable recognition is possible even when the image capture environment, such as illumination or the shooting direction of the face image, varies.

Further, Patent Document 2 introduces a technique in which a face image is recorded in addition to the ID card entry information so that a person entering the room can be reliably identified: from the image sequence captured by a camera monitoring entry and exit in a predetermined area, the face region image facing most directly toward the monitoring camera is automatically determined, and only face images close to that frontal image, from which a person is easily identified, are recorded.
JP 2001-307090 A
JP 2002-304651 A

In a system, method, or program that recognizes a person by extracting a face region from an image of someone entering or leaving a room or the like, there has been the problem that recognition is very difficult because the features vary greatly depending on the orientation of the photographed face.
For the purpose of merely recording an image, as in Patent Document 2, it is sufficient to select from the captured image sequence only images close to a frontal face image, from which a person is easily identified. When recognizing the image of a person passing through a gate, however, an image close to frontal is not necessarily obtained for a person passing through the edge of the camera's shooting range.

In addition, performing stable image recognition despite changes in the capture environment, such as illumination or the shooting direction of the face image, requires complicated image processing as in Patent Document 1, so a fast and expensive device is needed and the cost rises.
An object of the present invention is to provide an image recognition system, method, and program capable of easily and accurately recognizing an object to be recognized, such as a human face, even when its imaging direction is not frontal, while still using conventional image recognition technology.

  An image recognition system according to the present invention comprises a camera equipped with a lens, image position detection means for detecting the position of a recognition object in an image captured by the camera, image recognition means for recognizing the recognition object at the position detected by the image position detection means, and system control means for controlling the entire system, and is characterized in that the image recognition means changes the image recognition processing method according to the position of the recognition object detected by the image position detection means.

  The image recognition method of the present invention is an image recognition method using an image recognition system comprising a camera equipped with a lens, image position detection means for detecting the position of a recognition object in an image captured by the camera, image recognition means for recognizing the recognition object at the position detected by the image position detection means, and system control means for controlling the entire system, and is characterized in that the image recognition means changes the image recognition processing method according to the position of the recognition object detected by the image position detection means.

  The image recognition program of the present invention causes a computer that receives image data captured by a camera equipped with a lens to function as image position detection means for detecting the position of a recognition object in the input image data, and as image recognition means for switching the image recognition processing method according to the position of the recognition object detected by the image position detection means.

  According to the image recognition system, the image recognition method, and the image recognition program of the present invention, the image recognition means switches to an appropriate image recognition process based on the position information detected by the image position detection means, which detects the position of the recognition object in the image captured by the camera. The recognition object can therefore be recognized easily and accurately even when the shooting direction of the recognition object, such as a person's face, is not frontal.

(First embodiment)
A first embodiment of the image recognition system of the present invention will be described in detail with reference to FIG. 1. FIG. 1 is a block diagram of a system that performs entrance/exit management using the image recognition system of the present invention: a camera is installed at the entrance of a building, the admitted person is recognized from the captured image, and who entered on what day and at what time is recorded and managed in a database.

  In FIG. 1, 101 is a wide-angle lens, 102 is a camera, 103 is an infrared sensor, 104 is a captured image buffer, 105 is image recognition means, 106 is image position detection means, 107 is database switching means, 108 is a right-side database, 109 is a front database, 110 is a left-side database, 111 is entrance/exit management system control means, 112 is an entrance/exit management database, 113 is a front registration image of person A, 114 is a right registration image of person A, 115 is a left registration image of person A, 116 is a front registration image of person B, 117 is a right registration image of person B, 118 is a left registration image of person B, 119 is a front registration image of person C, 120 is a right registration image of person C, and 121 is a left registration image of person C.

  Here, the right-side database 108, the front database 109, and the left-side database 110 hold face images photographed in advance from the right side, from the front, and from the left side, respectively, so that three types of face image are registered for each person. For example, in FIG. 1, 113 is a front registration image obtained by photographing the face of person A from the front, 114 is a right registration image obtained by photographing the face of the same person A from the right side, and 115 is a left registration image obtained by photographing the face of the same person A from the left side; similarly, 116 is the front registration image of person B, 117 is B's right registration image, 118 is B's left registration image, 119 is the front registration image of person C, 120 is C's right registration image, and 121 is C's left registration image. Face images of people other than the three persons A, B, and C who use the building 201 are also registered, but their description is omitted.

Next, the state of the entrance of the building will be described with reference to FIG. 2. FIG. 2 is an explanatory diagram depicting the entrance of the building: 201 is the building, 202 is the entrance door, 203, 204, and 205 are arriving people, 206 and 207 are the shooting ranges of the camera 102, and 208 is the infrared detection position of the infrared sensor 103. Reference numerals that are the same as in FIG. 1 denote the same elements.
Now, when a person approaches the door 202 of the building 201 and reaches the infrared detection position 208, the infrared sensor 103 detects the arrival and notifies the entrance/exit management system control means 111 of FIG. 1. In response, the entrance/exit management system control means 111 instructs the image recognition means 105 to recognize who has arrived. The image recognition means 105 captures into the captured image buffer 104 an image of the shooting ranges 206 and 207 of FIG. 2 taken by the camera 102 through the wide-angle lens 101, and commands the image position detection means 106 to detect the position of the human face within the captured image. Receiving this, the image position detection means 106 searches the image held in the captured image buffer 104 for a portion in which a human face appears, informs the image recognition means 105 of its position within the captured image, and has the database switching means 107 switch the selected database according to the detected position. The database switching means 107 selects the right-side database 108 when the image position detection means 106 detects the face in the right position range, the front database 109 when it is detected in the front position range, and the left-side database 110 when it is detected in the left position range, and the image recognition means 105 reads the registered image data of the selected database.
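As a purely illustrative aid (not part of the original disclosure), this database-switching step could be sketched in Python as follows; the function names, the coordinate convention, and the mapping of image thirds to the left, front, and right position ranges are all assumptions made for the example.

```python
# Minimal illustrative sketch of the database switching performed by the
# database switching means 107. All names, thresholds, and the left/front/right
# mapping are assumptions, not taken from the specification.

def select_database(face_x, image_width, databases):
    """Return the registered-image database whose position range contains face_x."""
    third = image_width / 3.0
    if face_x < third:                 # assumed left position range (303)
        return databases["left"]       # left-side database 110
    if face_x < 2 * third:             # assumed front position range (302)
        return databases["front"]      # front database 109
    return databases["right"]          # right-side database 108 (assumed range 304)


# Usage sketch: one database per shooting direction, keyed by registered person.
databases = {
    "left":  {"A": "A_left.png",  "B": "B_left.png",  "C": "C_left.png"},
    "front": {"A": "A_front.png", "B": "B_front.png", "C": "C_front.png"},
    "right": {"A": "A_right.png", "B": "B_right.png", "C": "C_right.png"},
}
selected = select_database(face_x=120, image_width=640, databases=databases)
```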

  Here, assume that the image obtained when the infrared sensor 103 detects the arrival of people and notifies the entrance/exit management system control means 111 of FIG. 1 is as shown in FIG. 3. FIG. 3 is an explanatory diagram of an image photographed by the camera 102 through the wide-angle lens 101: 301 is the photographed image, 302 is the front position range, 303 is the left position range, and 304 is the right position range. Of the people in FIG. 2, the arriving person 203 is in the right position range 304, the arriving person 204 is in the front position range 302, and the arriving person 205 is in the left position range 303. The front position range 302, the left position range 303, and the right position range 304 are the ranges within which the image position detection means 106 detects the position of a human face, and the image position detection means 106 detects the face position by the method shown in FIG. 4.

  In FIG. 4, it is checked whether anything resembling a template 402 that models a human face is present in the detection range 401. Whether something similar is present can be determined, for example, by a generally known method: after image processing such as noise removal and edge extraction, the detection range is correlated with the template 402, and if the correlation value exceeds a certain value the contents are judged to be similar. In this way, the image position detection means 106 detects the position of the human face by examining the correlation with the template 402 while shifting the detection range 401 little by little along the scanning direction 403 indicated by the arrow in FIG. 4.
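A minimal sketch of this template-scanning step, assuming grayscale images held as NumPy arrays; the step size, the correlation threshold, and the use of normalized cross-correlation are illustrative choices, not values taken from the specification.

```python
import numpy as np

def find_face_position(image, template, threshold=0.6, step=4):
    """Slide the detection range over the image and return the top-left corner
    of the window with the highest normalized correlation to the template,
    or None if no window exceeds the threshold."""
    th, tw = template.shape
    t = (template.astype(float) - template.mean()) / (template.std() + 1e-8)
    best_score, best_pos = threshold, None
    for y in range(0, image.shape[0] - th + 1, step):      # scanning direction 403
        for x in range(0, image.shape[1] - tw + 1, step):
            window = image[y:y + th, x:x + tw].astype(float)
            w = (window - window.mean()) / (window.std() + 1e-8)
            score = float((w * t).mean())                   # normalized correlation
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos
```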

FIG. 5 is an explanatory diagram showing the result of the image position detection means 106 scanning the detection range 401 as in FIG. 4 and extracting the portions having a high correlation with the template 402: 501 is a face image detected in the front position range 302, 502 is a face image detected in the left position range 303, and 503 is a face image detected in the right position range 304.
If a face image is detected on the boundary between detection position ranges, it is treated as belonging to the detection position range that contains the larger portion of the template 402, or alternatively to the range in which the left end of the template 402 falls.

  Next, the operation of the image recognition means 105 when a person comes to the front position in FIG. 1 will be described. The image position detection means 106 informs the image recognition means 105 that the face image 501 has been detected in the front position range 302, together with the coordinate data of the template 402 at the time of detection, and at the same time has the database switching means 107 switch to the front database 109. Based on the position information from the image position detection means 106, the image recognition means 105 sequentially reads the front registration images registered in advance in the front database 109, accessed through the database switching means 107, compares them with the portion of the captured image buffer 104 corresponding to the face image 501, and identifies the arriving person 204 by recognizing who that person is.

  Various methods of identifying a person by image recognition processing are known. For example, correlations can be taken over the contour, the eyes, and the nose, and the person identified from the magnitude of the correlation value: the correlation value is high for the same person and low for a different person, so the person can be specified by comparing correlation values. In the present embodiment, it is assumed that such an image recognition method is used.
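A minimal sketch of this correlation-based identification, again assuming NumPy image arrays; the acceptance threshold and all names are assumptions for illustration.

```python
import numpy as np

def identify_person(face_image, registered_images, accept=0.8):
    """Return the name of the registered person whose image correlates most
    strongly with face_image, or None if no correlation exceeds the threshold."""
    def ncc(a, b):
        a = (a.astype(float) - a.mean()) / (a.std() + 1e-8)
        b = (b.astype(float) - b.mean()) / (b.std() + 1e-8)
        return float((a * b).mean())

    best_name, best_score = None, accept
    for name, registered in registered_images.items():   # e.g. A, B, C in turn
        score = ncc(face_image, registered)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```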

  Continuing with the case where a person comes to the front position in FIG. 1, the image recognition means 105 reads, for example, the front registration image 113 of A from the front database 109 through the database switching means 107, compares it with the face image 501 of the arriving person 204, and checks whether they are the same person. If they are not, it reads the front registration image 116 of B and compares it with the face image 501 of the arriving person 204 to check whether they are the same person. If they are again not the same person, it reads the front registration image 119 of C and compares it with the face image 501 of the arriving person 204 to check whether they are the same person.

Now, assuming that the correlation between the front registration image 119 of C and the face image 501 of the arriving person 204 is high, the image recognition means 105 recognizes that person C has visited and notifies the entrance/exit management system control means 111, which records the arrival of person C in the entrance/exit management database 112 together with the date and time.
Next, the operation when a person comes to the left position will be described. The image position detection means 106 likewise notifies the image recognition means 105 that the face image 502 has been detected in the left position range 303, together with the coordinate data of the template 402 at the time of detection, and at the same time has the database switching means 107 switch to the left-side database 110. Based on this information from the image position detection means 106, the image recognition means 105 compares the portion of the captured image buffer 104 corresponding to the face image 502 with the left registration images of the left-side database 110 read through the database switching means 107, and recognizes who the arriving person 205 is.

Specifically, the image recognition means 105 sequentially reads, for example, the left registration image 115 of A, the left registration image 118 of B, and the left registration image 121 of C from the left-side database 110 through the database switching means 107, compares each with the face image 502 of the arriving person 205, and checks whether they are the same person.
Now, assuming that the correlation between the left registration image 118 of B and the face image 502 of the arriving person 205 is high, the image recognition means 105 recognizes that person B has visited and notifies the entrance/exit management system control means 111, which records the arrival of person B in the entrance/exit management database 112 together with the date and time.

  The operation when a person arrives at the right position is similar to the cases described above. The image position detection means 106 informs the image recognition means 105 of the face image 503 of the arriving person 203, and at the same time the database switching means 107 switches to the right-side database 108 corresponding to the detected right position range 304; the image recognition means 105 then compares the right registration image 114 of A, the right registration image 117 of B, and the right registration image 120 of C registered in advance in the right-side database 108 with the face image 503 of the arriving person 203 to check whether they are the same person. As with the arriving persons 204 and 205, when the face image 503 of the arriving person 203 is recognized as matching the right registration image 114 of A, the entrance/exit management system control means 111 is notified and records in the entrance/exit management database 112 that person A has visited, together with the date and time.

To make the configuration easy to understand, three separate databases are provided here for the front, the right side, and the left side; however, the registered images may instead be stored at separate addresses within a single database, or stored on a memory card.
In this way, the face image of an arriving person can be recognized, the person identified, and the result recorded in the entrance/exit management database 112 together with the date and time. Alternatively, depending on how the entrance/exit management system is operated, a person other than a registrant can be prevented from entering, by sounding an alarm for an unregistered person or by automatically controlling the opening and closing of the door 202.

(Second Embodiment)
Next, a second embodiment in the case where the image recognition system of the first embodiment is realized by a software program using a computer will be described using a flowchart. FIG. 6 is a flowchart showing a computer program process for recognizing a person by switching databases, and the process starts in step S601.

Step S602 is a branching process in which the arrival of a person is checked by a sensor corresponding to the infrared sensor 103 in FIG. 1. If no person has arrived, the process branches to No and returns to step S602 to wait for an arrival; if a person has arrived, the process branches to Yes and proceeds to step S603.
Step S603 corresponds to the process of capturing an image taken by the camera 102 of FIG. 1 into the captured image buffer 104; the captured image data is stored in the computer's memory, and the process proceeds to step S604.

In step S604, the position of the portion corresponding to the human face is detected from the image data stored in the memory, and the process proceeds to step S605. The face position detection method is processed by a computer in the same manner as the image position detection means 106 described in the first embodiment.
In step S605, it is checked whether or not the face position detected in step S604 is in the front position range 302 in FIG. 3. If the position is the front position, the process proceeds to step S607. If the position is not the front, the process proceeds to step S606.

In step S606, it is checked whether or not the face position detected in step S604 is in the right position range 304 in FIG. 3. If the position is the right position, the process proceeds to step S608. If the position is the left position, the process proceeds to step S609. In this flowchart, exception processing such as when there is no match is not described.
In step S607, the file name or memory address on the computer corresponding to the front database 109 of FIG. 1 in which only registered images taken from the front registered in advance are stored is set, and the process proceeds to step S610.

In step S608, the file name or memory address on the computer corresponding to the right-side database 108 of FIG. 1, in which only registered images taken from the right side are stored in advance, is set, and the process proceeds to step S610.
In step S609, the file name or memory address on the computer corresponding to the left-side database 110 of FIG. 1, in which only registered images taken from the left side are stored in advance, is set, and the process proceeds to step S610.

  In step S610, the computer performs image recognition processing that compares the face image detected in step S604 with the registered images read from the database whose file name or memory address was set, identifies whose face it is, and proceeds to step S611. The image recognition processing is performed by the computer in the same manner as by the image recognition means 105 described in the first embodiment.

In step S611, the arrival of the person specified in step S610 is recorded in the database on the computer corresponding to the entrance / exit management database 112, and the process returns to step S602.
In this way, an entrance / exit management system using image recognition can be realized by executing a program that repeats the above steps on a computer.
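A skeleton of such a program, following the order of steps S602 to S611 in FIG. 6, is sketched below; the sensor, camera, face detection, database access, and recognition are passed in as hypothetical callables rather than implemented, since the specification does not define those interfaces.

```python
import datetime

def entrance_management_loop(visitor_present, capture_image, detect_face,
                             load_registered_images, identify_person,
                             record_entry):
    """Repeat the steps of FIG. 6; every argument is a caller-supplied stand-in."""
    while True:
        if not visitor_present():                        # S602: sensor check
            continue                                     # branch "No": keep waiting
        image = capture_image()                          # S603: into the image buffer
        face, position_range = detect_face(image)        # S604: face position detection
        if position_range == "front":                    # S605
            registered = load_registered_images("front")     # S607: front database 109
        elif position_range == "right":                  # S606
            registered = load_registered_images("right")     # S608: right-side database 108
        else:
            registered = load_registered_images("left")      # S609: left-side database 110
        person = identify_person(face, registered)       # S610: image recognition
        if person is not None:
            record_entry(person, datetime.datetime.now())    # S611: entrance/exit database 112
```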

(Third embodiment)
Next, a third embodiment of the image recognition system of the present invention will be described in detail with reference to FIG. 7. FIG. 7 is a block diagram of a system that performs entrance/exit management using the image recognition system of the present invention, as in the first embodiment: a camera is installed at the entrance of a building, visitors are recognized from the captured images, and who entered on what day and at what time is recorded and managed.

The difference from the first embodiment is that, based on the position information output by the image position detection means 106, it is not the database used by the image recognition means that is switched but the processing method of the image recognition process, that is, the algorithm.
In FIG. 7, 705 is an image recognition means different from that of the first embodiment, 707 is algorithm switching means, 710 is a left-side algorithm, 709 is a front algorithm, 708 is a right-side algorithm, and 712 is a facial feature database in which, instead of face images, parameter values representing the facial features of each person are registered in advance. Reference numerals that are the same as those described in the first embodiment denote the same elements, and their description is omitted.

As in FIG. 2 described for the first embodiment, a portion corresponding to a human face is extracted from the image photographed by the camera 102 equipped with the wide-angle lens 101 mounted at the entrance gate; the operation up to the point where the image position detection means 106 detects the face position is the same as in the first embodiment.
In the third embodiment, the image position detection means 106 outputs the detected position information to the image recognition means 705 and the algorithm switching means 707. The algorithm switching means 707 selects, based on the position information output by the image position detection means 106, the algorithm, that is, the image recognition processing method, used by the image recognition means 705. An algorithm here is a processing method for identifying a person from a face image: the processing method suited to a front image is the front algorithm 709, the processing method suited to a right-side image is the right-side algorithm 708, and the processing method suited to a left-side image is the left-side algorithm 710, each stored as a subprogram that can be called by the image recognition means 705.

  That is, to recognize a frontal face image, a processing method suited to it is selected, for example one that identifies a person from the features of the face outline, the distance between the eyes, and the position of the nose; to recognize a left-side face image, a processing method suited to it is selected, for example one that identifies a person from features such as the heights of the eye and nose on one side; and similarly, a processing method suited to a right-side face image is selected for that case.

It should be noted that parameter values representing facial features of people using the entrance / exit management system are registered in advance in the facial feature database 712 so as to match the processing results obtained by these processing methods.
If the face position detected by the image position detection means 106 is in the left position range 303 of the captured image 301 of FIG. 5, the algorithm switching means 707 selects the left-side algorithm 710, and the image recognition means 705 processes the face image at the position detected by the image position detection means 106 with the subprogram read from the left-side algorithm 710 to detect the facial features. When the facial features have been detected, the image recognition means 705 compares them with the facial feature data registered in advance in the facial feature database 712 and, by an image recognition method such as described above, identifies whose face the captured face image is.

In order to make the configuration easy to understand, three independent algorithms are provided for the front side, the right side, and the left side, respectively, but this can be realized by simply changing the subroutine call destination in one program.
In this way, by selecting an algorithm suited to the position of the face, the feature parameters of the photographed face image are calculated and compared with the feature parameters of the face images registered in advance in the database, and the registered person with the closest parameters can be identified as the person in question.
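An illustrative sketch of this algorithm switching and closest-match comparison follows; the three feature extractors are crude stand-ins (the specification does not define the actual feature calculations), and the structure of the facial feature database is an assumption.

```python
import numpy as np

def front_features(face):
    # Stand-in for the front algorithm 709 (outline, eye spacing, nose position).
    return np.array([face.mean(), face.std()])

def left_features(face):
    # Stand-in for the left-side algorithm 710 (one-sided eye/nose heights).
    return np.array([face[:, : face.shape[1] // 2].mean(), face.std()])

def right_features(face):
    # Stand-in for the right-side algorithm 708.
    return np.array([face[:, face.shape[1] // 2 :].mean(), face.std()])

ALGORITHMS = {"front": front_features, "left": left_features, "right": right_features}

def recognize(face, position_range, feature_db):
    """Compute feature parameters with the algorithm selected for the position
    range, then return the registered person whose stored parameters are closest."""
    params = ALGORITHMS[position_range](face)
    best_name, best_dist = None, float("inf")
    for name, registered_params in feature_db[position_range].items():
        dist = float(np.linalg.norm(params - registered_params))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name
```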

When a person is identified, the image recognition means 705 notifies the entrance/exit management system control means 111 as in the first embodiment, and the entrance/exit management system control means 111 records the visitor in the entrance/exit management database 112 together with the date and time, so that entry and exit can be managed.
(Fourth embodiment)
Next, a fourth embodiment in the case where the image recognition system of the third embodiment is realized by a software program using a computer will be described using a flowchart. FIG. 8 is a flowchart showing computer program processing for recognizing a person by switching algorithms, and the processing is started in step S801.

Processing from step S602 to step S604 is performed in the same manner as in the second embodiment.
In step S805, it is checked whether or not the face position detected in step S604 is in the front position range 302 in FIG. 5. If the position is the front position, the process proceeds to step S807. If the position is not the front, the process proceeds to step S806.

In step S806, it is checked whether or not the face position detected in step S604 is in the right position range 304 in FIG. 5. If the position is the right position, the process proceeds to step S808. If the position is the left position, the process proceeds to step S809.
In step S807, the program address on the computer containing the front algorithm, that is, the front image recognition processing program is set, and the process proceeds to step S810.

In step S808, the program address on the computer containing the right algorithm, that is, the right image recognition processing program, is set, and the process proceeds to step S810.
In step S809, the program address on the computer that contains the algorithm for the left side, that is, the image recognition processing program for the left side is set, and the process proceeds to step S810.

  In step S810, the program at the set address is called; following that program, the computer calculates the feature parameters of the photographed face image, identifies the person whose facial feature data registered in the database on the computer corresponding to the facial feature database 712 of FIG. 7 matches those parameters, and proceeds to step S811. The feature parameter calculation is performed by the computer in the same manner as described in the third embodiment.

In step S811, the arrival of the identified person is recorded in a database on the computer corresponding to the entrance / exit management database 112 together with date and time information, and the process returns to step S602.
In this way, an entrance / exit management system using image recognition can be realized by executing a program that repeats the above steps on a computer.

(Fifth embodiment)
Techniques that correct image distortion to make an image easier to recognize are known, but with the image recognition system of the present invention it is only necessary to switch the database used for comparison according to where the object appears in the image, so the image can be recognized at high speed without correcting the distortion.

  For example, image recognition systems are used in FA (Factory Automation) applications where recognition objects must be recognized at high speed, such as the inspection of circuit boards on which components have been mounted. When the circuit board is small and the overhead camera can be placed sufficiently far from it, the difference between the image of a three-dimensional component located at the center of the photographed image and the image of the same component located at the edge is not large, so the components are easy to recognize; but in the inspection of a relatively large circuit board or of an electrical product with large three-dimensional parts, the difference between the image of a component at the center of the photographed image and the image of the same component at the edge is large, and it is not easy to recognize the component instantaneously.

  Therefore, an example in which the image recognition system, image recognition method, and image recognition program of the present invention are applied to FA will be described as a fifth embodiment with reference to FIG. 9 and FIG. 10. FIG. 9 shows a system for inspecting circuit boards on which electronic components have been mounted: 901 is a camera with a wide-angle lens, 902 is a belt conveyor, 903 is a circuit board, 904, 905, and 906 are electronic components of the same type to be inspected, and 907 is the imaging range of the camera 901.

  In FIG. 9, circuit boards 903 on which electronic components have been mounted are carried one after another by the belt conveyor 902, and when a board enters the imaging range 907 it is photographed from above by the camera 901. The image photographed by the camera 901 is shown in FIG. 10, which is an image of the circuit board 903 taken from above. Reference numerals 908 and 909 denote detection position ranges: 908 corresponds to the front position range 302 of the first embodiment, and 909 to the right position range 304 and the left position range 303.

Now, the electronic components 904, 905, and 906 look the same when viewed from the side as in FIG. 9, but in the image taken from above by the camera 901 they appear with different shapes, as shown in FIG. 10, and it is difficult to recognize that they are the same component from their planar shapes alone.
However, the image recognition system, image recognition method, and image recognition program according to the present invention make such recognition easy. That is, the image recognition processing is switched between the case where the component lies within the dotted line of the detection position range 908, that is, in the central portion, and the case where it lies in the detection position range 909 excluding the range 908, that is, in the peripheral portion. By selecting, as in the first embodiment, a registered image database suited to the shooting direction, or, as in the third embodiment, an image recognition algorithm suited to the shooting direction, a simple, fast, and highly accurate FA image recognition system can be realized.
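An illustrative sketch of this center/periphery switching follows; the margin defining the central rectangle and the two recognizer callables are assumptions for the example, not values from the specification.

```python
def in_central_range(x, y, image_w, image_h, margin=0.25):
    """True if (x, y) lies inside the assumed central rectangle (range 908)."""
    return (margin * image_w <= x <= (1.0 - margin) * image_w and
            margin * image_h <= y <= (1.0 - margin) * image_h)

def recognize_component(component_image, x, y, image_w, image_h,
                        recognize_center, recognize_periphery):
    # Central portion: the part is seen almost from directly above.
    if in_central_range(x, y, image_w, image_h):
        return recognize_center(component_image)
    # Peripheral portion (range 909): the same part appears obliquely, so a
    # database or algorithm prepared for the oblique view is used instead.
    return recognize_periphery(component_image)
```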

  In the description of the first to fifth embodiments, examples have been given in which the photographed image is divided into three or two regions, but the same effect is obtained even if the image is divided into four or more regions, depending on the application and the angle of view of the lens. There is also the method of dividing the image into a central portion and both sides, as in the first embodiment, and the method of dividing it into a central portion and a periphery, as in the fifth embodiment; the division can be chosen according to the position and nature of the object to be recognized, and the image recognition system, image recognition method, and image recognition program of the present invention work effectively in either case, without being limited to the embodiments described.

  In the case of the present invention, since the position of the recognition object within the image is used, it is effective to cover the entrance gate with a single camera using a wide-angle lens. In general, when the shooting direction of a human face exceeds roughly 70 degrees from the front, it becomes difficult (with individual differences) for both eyes to appear in the photographed image, which then differs greatly from a frontal image; accordingly, when the angle of view of the wide-angle lens is about 70 degrees or more, such images begin to appear and the effect of the image recognition processing system of the present invention is exhibited. Needless to say, the invention is also effective with a lens that is not wide-angle, depending on the camera arrangement and the shooting distance.

  Further, the present invention can be applied effectively when the wide-angle lens is a fisheye lens capable of photographing a 360-degree range. Applications using a fisheye lens include, for example, a camera attached to the ceiling of a conference room or office that automatically recognizes who is attending a meeting, or which person is working where in the office. In particular, when a fisheye lens is used the image distortion is concentric, so the present invention can be realized effectively by dividing the captured image into a plurality of concentric regions and switching to the registered image database and algorithm suited to each region. If such a system is installed in a public place such as a park, it can also be used for the automatic detection of wanted criminals, or it may be applied to security purposes such as managing who enters the safe-deposit vault of a bank.
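An illustrative sketch of dividing the fisheye image into concentric regions; the number of rings is an assumption, and the returned ring index would simply select the database or algorithm, just as the position ranges do in the earlier embodiments.

```python
import math

def select_ring(x, y, image_w, image_h, n_rings=3):
    """Return the index (0 = innermost) of the concentric region containing (x, y)."""
    cx, cy = image_w / 2.0, image_h / 2.0
    r = math.hypot(x - cx, y - cy)
    r_max = math.hypot(cx, cy)          # distance from the centre to a corner
    return min(int(n_rings * r / r_max), n_rings - 1)
```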

  As described above, in the image recognition system, the image recognition method, and the image recognition program of the present invention, the image recognition processing method is switched according to the position of the face or component in the captured image, so the influence of the shooting direction is reduced and highly accurate image recognition can be realized easily.

FIG. 1 is a block diagram illustrating an image recognition system according to a first embodiment of the present invention.
FIG. 2 is an explanatory diagram showing an overview of the embodiment of the present invention.
FIG. 3 is an explanatory diagram showing the relationship between the captured image and the position ranges used for detection by the image position detection means 106.
FIG. 4 is an explanatory diagram showing the processing of the image position detection means 106.
FIG. 5 is an explanatory diagram showing the processing result of the image position detection means 106.
FIG. 6 is a flowchart of the image recognition program of the second embodiment of the present invention.
FIG. 7 is a block diagram showing the image recognition system of the third embodiment of the present invention.
FIG. 8 is a flowchart of the image recognition program of the fourth embodiment of the present invention.
FIG. 9 is an explanatory diagram illustrating the fifth embodiment of the present invention.
FIG. 10 is an explanatory diagram illustrating the fifth embodiment of the present invention.

Explanation of symbols

101 ... Wide-angle lens
102 ... Camera
105 ... Image recognition means
106 ... Image position detection means
107 ... Database switching means
108 ... Right-side database
109 ... Front database
110 ... Left-side database
707 ... Algorithm switching means

Claims (9)

  1. An image recognition system comprising:
    a camera equipped with a lens;
    image position detection means for detecting the position of a recognition object in an image captured by the camera;
    image recognition means for recognizing the recognition object at the position detected by the image position detection means; and
    system control means for controlling the entire system,
    wherein the image recognition means changes the image recognition processing method according to the position of the recognition object detected by the image position detection means.
  2. The image recognition system according to claim 1, further comprising:
    a plurality of image databases in which images of the recognition objects captured in advance from each of a plurality of imaging directions are registered; and
    database switching means for switching, according to the position of the recognition object detected by the image position detection means, to the database to be used among the plurality of image databases,
    wherein the image recognition means performs image recognition processing using the image database selected by the database switching means.
  3. The image recognition system according to claim 1, further comprising:
    a plurality of processing algorithms; and
    algorithm switching means for switching, according to the position of the recognition object detected by the image position detection means, to the algorithm to be used among the plurality of processing algorithms,
    wherein the image recognition means performs image recognition processing using the algorithm selected by the algorithm switching means.
  4. The image recognition system according to any one of claims 1 to 3, wherein the lens is a wide-angle lens.
  5. The image recognition system according to claim 4, wherein the angle of view of the wide-angle lens is 70 degrees or more.
  6. The image recognition system according to claim 4, wherein the recognition object is a human face.
  7. The image recognition system according to claim 6, wherein the system control means is entrance/exit management system control means, and the entrance/exit management system control means performs entrance/exit management based on the result recognized by the image recognition means.
  8. An image recognition method using an image recognition system comprising a camera equipped with a lens, image position detection means for detecting the position of a recognition object in an image captured by the camera, image recognition means for recognizing the recognition object at the position detected by the image position detection means, and system control means for controlling the entire system,
    wherein the image recognition means changes the image recognition processing method according to the position of the recognition object detected by the image position detection means.
  9. An image recognition program that causes a computer which receives image data captured by a camera equipped with a lens to function as:
    image position detection means for detecting the position of a recognition object in the input image data;
    image recognition means for switching the image recognition processing method according to the position of the recognition object detected by the image position detection means; and
    system control means for controlling the entire system.
JP2005202875A 2005-07-12 2005-07-12 Image recognition system, image recognition method, and image recognition program Withdrawn JP2007025767A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005202875A JP2007025767A (en) 2005-07-12 2005-07-12 Image recognition system, image recognition method, and image recognition program

Publications (1)

Publication Number Publication Date
JP2007025767A true JP2007025767A (en) 2007-02-01

Family

ID=37786485

Country Status (1)

Country Link
JP (1) JP2007025767A (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008309701A (en) * 2007-06-15 2008-12-25 Sony Computer Entertainment Inc User interface device and control method
JP2009071517A (en) * 2007-09-12 2009-04-02 Mitsubishi Electric Corp Camera system and recording system
JP2009157766A (en) * 2007-12-27 2009-07-16 Nippon Telegr & Teleph Corp <Ntt> Face recognition apparatus, face recognition method, face recognition program and recording medium recording the program
US8149280B2 (en) 2008-02-15 2012-04-03 Sony Corporation Face detection image processing device, camera device, image processing method, and program
JP2009193421A (en) * 2008-02-15 2009-08-27 Sony Corp Image processing device, camera device, image processing method, and program
JP4539729B2 (en) * 2008-02-15 2010-09-08 ソニー株式会社 Image processing apparatus, camera apparatus, image processing method, and program
JP2011091523A (en) * 2009-10-21 2011-05-06 Victor Co Of Japan Ltd Shape recognition method and shape recognition device
KR101240043B1 (en) 2010-03-15 2013-03-07 오므론 가부시키가이샤 Verification apparatus, digital image processing system, verification apparatus controlling program, computer readable recording medium, and controlling method for verification apparatus
JP2012104964A (en) * 2010-11-08 2012-05-31 Canon Inc Image processing device, image processing method, and program
KR101374643B1 (en) * 2011-04-26 2014-03-14 가부시키가이샤 히타치 죠호 츠우신 엔지니어링 Object recognition method and recognition apparatus
JP2012230546A (en) * 2011-04-26 2012-11-22 Hitachi Information & Communication Engineering Ltd Object recognition method and recognition device
WO2013001941A1 (en) * 2011-06-27 2013-01-03 日本電気株式会社 Object detection device, object detection method, and object detection program
JPWO2013001941A1 (en) * 2011-06-27 2015-02-23 日本電気株式会社 Object detection apparatus, object detection method, and object detection program
CN102955933A (en) * 2011-08-24 2013-03-06 苏州飞锐智能科技有限公司 Household access control method based on face recognition
CN102956049A (en) * 2011-08-24 2013-03-06 苏州飞锐智能科技有限公司 Switch control method of intelligent entrance guard
US9694764B2 (en) 2011-12-09 2017-07-04 Flextronics Automotive, Inc. Vehicle electromechanical systems triggering based on image recognition and radio frequency
CN104968532B (en) * 2012-12-10 2017-10-27 伟创力汽车有限公司 Vehicle Mechatronic Systems triggering based on image recognition and radio frequency

Similar Documents

Publication Publication Date Title
US10225518B2 (en) Secure nonscheduled video visitation system
CA2692424C (en) System and process for detecting, tracking and counting human objects of interest
CN1235395C (en) Moving object monitoring surveillance apparatus
US7400744B2 (en) Stereo door sensor
EP0989517B1 (en) Determining the position of eyes through detection of flashlight reflection and correcting defects in a captured frame
US8331674B2 (en) Rule-based combination of a hierarchy of classifiers for occlusion detection
DE602004002180T2 (en) Object recognition
JP5326527B2 (en) Authentication apparatus and authentication method
Fleuret et al. Coarse-to-fine face detection
EP2024900B1 (en) Method for identifying a person and acquisition device
US6760467B1 (en) Falsification discrimination method for iris recognition system
US6583723B2 (en) Human interface system using a plurality of sensors
EP1814061B1 (en) Method and device for collating biometric information
US7801330B2 (en) Target detection and tracking from video streams
JP4675660B2 (en) Multiple simultaneous biometrics authentication device
US6553131B1 (en) License plate recognition with an intelligent camera
Hampapur et al. Smart video surveillance: exploring the concept of multiscale spatiotemporal tracking
KR101724658B1 (en) Human detecting apparatus and method
JP5370927B2 (en) Behavior monitoring system and behavior monitoring method
JP6268960B2 (en) Image recognition apparatus and data registration method for image recognition apparatus
US6345105B1 (en) Automatic door system and method for controlling automatic door
US7336297B2 (en) Camera-linked surveillance system
JP5517858B2 (en) Image processing apparatus, imaging apparatus, and image processing method
JP5390322B2 (en) Image processing apparatus and image processing method
US7397929B2 (en) Method and apparatus for monitoring a passageway using 3D images

Legal Events

Date Code Title Description
A300 Withdrawal of application because of no request for examination

Free format text: JAPANESE INTERMEDIATE CODE: A300

Effective date: 20081007