US20140341427A1 - Surveillance camera system and surveillance camera control apparatus - Google Patents
Surveillance camera system and surveillance camera control apparatus
- Publication number
- US20140341427A1 (application US14/279,020)
- Authority
- US
- United States
- Prior art keywords
- information
- recognition processing
- recognition
- processing
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G06K9/00771—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/415—Identification of targets based on measurements of movement associated with the target
-
- G06K9/00255—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/53—Recognition of crowd images, e.g. recognition of crowd congestion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4044—Direction of movement, e.g. backwards
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
Definitions
- aspects of the present invention generally relate to a surveillance camera system and a surveillance camera control apparatus, and more particularly to a technology suitably used for efficiently recognizing a specific person from an image captured by photographing a surveillance region.
- the surveillance camera system is a system that includes a plurality of surveillance cameras, a server configured to process a video of the surveillance camera, and a viewer configured to check the video.
- a wide angle camera for photographing a wide region or a pan-tilt-zoom (PTZ) camera for performing pan-tilt-zoom control by a user's operation is used.
- Japanese Patent Application Laid-Open No. 2008-85832 discusses a surveillance camera system that performs moving object detection, unmoving object detection, or sensor input detection by a fixed camera and photographs an abnormal region by the PTZ camera.
- Japanese Patent Application Laid-Open No. 2007-249298 discusses a face recognition apparatus that includes a first camera for photographing an entire region and a second camera for recognizing a face of one person selected as a recognition target from an overall image captured by the first camera through zooming.
- a priority order of face recognition is determined from a face direction or brightness at the point in time when the person is detected by an infrared sensor.
- information obtained at only one point in time may not satisfy a condition suited for face recognition. Consequently, efficient face recognition is not always guaranteed.
- An aspect of the present invention is directed to efficient execution of face recognition even when a plurality of abnormal regions are simultaneously present in a surveillance region.
- a surveillance system includes an acquisition unit configured to acquire motion information of an object detected from an image captured by an image capturing unit, an association unit configured to associate a recognition processing result of the object with the motion information, and a determination unit configured to determine, from among a plurality of objects detected from the image captured by the image capturing unit, an object to be subjected to recognition processing based on the recognition result associated with the motion information.
- FIG. 1 is a diagram illustrating an example of a surveillance camera system according to a first exemplary embodiment.
- FIG. 2 is a block diagram illustrating a configuration example of the surveillance camera system according to the first exemplary embodiment.
- FIG. 3 is a diagram illustrating an example of tracking managing information according to the first exemplary embodiment.
- FIG. 4 is a diagram illustrating an example of a method for determining a moving direction and a moving speed according to the first exemplary embodiment.
- FIG. 5A is a diagram illustrating a point example with respect to the moving direction according to the first exemplary embodiment.
- FIG. 5B is a diagram illustrating a point example with respect to the moving speed according to the first exemplary embodiment.
- FIG. 6 is a flowchart illustrating a processing procedure performed in a wide angle camera according to the first exemplary embodiment.
- FIG. 7 is a flowchart illustrating an example of processing performed in a PTZ camera according to the first exemplary embodiment.
- FIG. 8 is a flowchart illustrating a procedure of setting a priority order in a server according to the first exemplary embodiment.
- FIG. 9 is a diagram illustrating a surveillance camera system of a single camera according to a second exemplary embodiment.
- FIG. 10 is a block diagram illustrating a configuration example of the surveillance camera system of the single camera according to the second exemplary embodiment.
- FIG. 11 is a diagram illustrating examples of tracking managing information and processing history information according to the second exemplary embodiment.
- FIG. 12 is a flowchart illustrating an example of processing performed by a PTZ camera in the surveillance camera system using the single camera according to the second exemplary embodiment.
- FIG. 13 is a flowchart illustrating a processing procedure of a server of the surveillance camera system using the single camera according to the second exemplary embodiment.
- FIG. 1 is a diagram illustrating an example of a surveillance camera system according to a first exemplary embodiment.
- the surveillance camera system includes a wide angle camera 101 for photographing a wide region, a camera (PTZ camera) 103 capable of performing pan-tilt-zoom control, and a server 105 for controlling the wide angle camera 101 and the PTZ camera 103.
- a photographing region of the wide angle camera 101 is a wide angle camera photographing region 102
- a photographing region of the PTZ camera 103 is a PTZ camera photographing region 104 .
- FIG. 2 is a block diagram illustrating a configuration example of the surveillance camera system according to the present exemplary embodiment.
- the wide angle camera 101 includes an image capturing unit 201 , a human body detection/tracking unit 202 , and a tracking information transmission unit 203 .
- the PTZ camera 103 includes an image capturing unit 204 , a face recognition database (DB) unit 205 , a face detection/recognition unit 206 , a recognition information transmission unit 207 , a control information reception unit 208 , a recognition processing control unit 209 , and a camera control unit 210 .
- the server 105 includes a tracking information reception unit 211, a recognition information reception unit 212, a tracking information management unit 213, a priority order determination unit 214, a PTZ camera management unit 215, and a control information transmission unit 216, and is configured to perform surveillance camera control.
- the image capturing unit 201 captures an image of a surveillance region to create a continuous region image, and adds image identification (ID) for identifying each image to the created region image to output the region image to the human body detection/tracking unit 202 .
- the human body detection/tracking unit 202 performs pattern matching processing to detect a human body from the captured image of the surveillance region created by the image capturing unit 201 , and adds unique human body tracking ID to a human body identified from a positional relationship between frames to perform human body tracking processing.
- after completion of the processing, the human body detection/tracking unit 202 outputs tracking information including human body tracking ID unique to each human body, center point coordinates in an image, a width/height and a size of a bounding box, and a moving speed, for each processing execution.
- the processing of the human body detection/tracking unit 202 is not limited to the pattern matching processing. Any processing may be performed as long as the human body can be detected from the image, for example, human body detection processing using a feature amount based on a luminance gradient/intensity.
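For illustration, the tracking information enumerated above can be pictured as a small record. The following is a minimal Python sketch; the field names are assumptions, not the patent's actual data format:

```python
from dataclasses import dataclass

@dataclass
class TrackingInfo:
    """One tracking record output by the human body detection/tracking unit per processing pass."""
    human_body_tracking_id: int  # unique ID assigned when the body is first detected
    image_id: int                # ID of the captured frame the record refers to
    center_x: float              # center point coordinates of the body in the image
    center_y: float
    bbox_width: float            # bounding-box width, height and size (area)
    bbox_height: float
    bbox_size: float
    moving_speed: float          # moving speed estimated by the tracking processing
```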
- the tracking information transmission unit 203 transmits the tracking information output from the human body detection/tracking unit 202 to the server 105 via a network. Through the processing of the tracking information transmission unit 203 , the tracking information is transmitted from the wide angle camera 101 to the server.
- the processing is not limited to any specific communication method such as local area network (LAN) or serial communication.
- the image capturing unit 204 captures an image of a part of the wide angle camera photographing region 102 to generate an image including a plurality of frames. Image ID for identifying each image is added to the generated image. The image ID does not need to be synchronized with image ID added by the other camera, i.e., the image capturing unit 201 of the wide angle camera 101 .
- the face recognition DB unit 205 is a database for managing an individual feature amount necessary for recognizing a face as a specific individual.
- the feature amount and a name of a person corresponding to the feature amount are acquired beforehand from the outside, for example, from the server 105 , and set.
- the face detection/recognition unit 206 detects a face from the image created by the image capturing unit 204 by pattern matching processing, and extracts a feature amount of the face.
- the face detection/recognition unit 206 compares the feature amount with the feature amount owned by the face recognition DB unit 205 to perform individual face detection recognition processing, and outputs an authentication result.
- Recognition information including the person tracking ID, a moving direction, a moving speed, and a recognition success/failure result, and, in the case of a success, a position of the face, a name of the recognized individual, and person ID, is output for each execution of processing.
- the face detection recognition processing of the face detection/recognition unit 206 is performed only when a command to execute recognition processing is received from the recognition processing control unit 209 , and is not carried out when there is no recognition processing execution command. No processing is performed during a period from reception of a PTZ control start command from the camera control unit 210 to reception of a PTZ control end command.
- the processing of the face detection/recognition unit 206 which detects and recognizes the face from the image, is not limited to the pattern matching processing.
- a high-speed detection method such as that using a Haar-like feature amount can also be used.
- the recognition information transmission unit 207 transmits the recognition information output from the face detection/recognition unit 206 to the server 105 via the network.
- the processing of the recognition information transmission unit 207, which transmits the recognition information from the PTZ camera 103 to the server 105, is not limited to any specific communication method such as LAN or serial communication.
- the control information reception unit 208 receives, from the server 105 , pan-tilt-zoom control information, an execution command of face detection recognition processing, processing target human body tracking ID, a moving direction, and a moving speed.
- the recognition processing control unit 209 outputs the recognition processing execution command to the face detection/recognition unit 206 and the pan-tilt-zoom control command to the camera control unit 210 based on the information received by the control information reception unit 208 .
- the camera control unit 210 performs camera pan-tilt-zoom control based on a pan-tilt-zoom control command from the recognition processing control unit 209 .
- a PTZ control start command is transmitted to the face detection/recognition unit 206 .
- a PTZ control end command is transmitted to the face detection/recognition unit 206 .
- the tracking information reception unit 211 receives tracking information from the wide angle camera 101
- the recognition information reception unit 212 receives recognition information from the PTZ camera 103 .
- the tracking information management unit 213 manages the tracking information received by the tracking information reception unit 211 and the recognition information received by the recognition information reception unit 212 in association with each other.
- the tracking information management unit 213 generates authentication processing history including motion information of a person when authentication processing is performed and a result of the authentication processing. Tracking management information 301 illustrated in FIG. 3 is updated.
- FIG. 3 illustrates an example of the tracking management information.
- the tracking management information 301 manages person tracking information 302 for each person tracking ID.
- the person tracking information 302 includes person tracking ID, time when a person appears for the first time, a name and person ID of the person, a current moving direction and a moving speed, a moving direction and a moving speed when recognition processing fails, and one or more pieces of positional information 303 .
- Other information such as acceleration, a moving pattern, and a moving amount can also be used as motion information.
- the positional information 303 includes a position of the person on the image captured by the wide angle camera 101 , a width/height and a size of a bounding box, and time of acquiring the information.
- the tracking information management unit 213 updates the tracking management information 301 when acquiring each tracking information from the tracking information reception unit 211 . Specifically, when tracking information having person tracking ID not managed by the tracking management information 301 is present, new person tracking information 302 is created, and the time of first appearance, the moving direction and the moving speed of the present time, and the positional information 303 are updated. When tracking information having person tracking ID managed by the tracking management information 301 is present, the moving direction and the moving speed of the present time and the positional information 303 are updated.
- the tracking information management unit 213 updates the tracking management information 301 when acquiring each recognition information from the recognition information reception unit 212 .
- recognition is successful, the name and the person ID of the person are updated.
- recognition is a failure, the moving direction and the moving speed at the time of a recognition failure are updated.
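For illustration, the tracking management information 301 and the update rules above could be sketched as follows, assuming a dictionary keyed by person tracking ID (the names, types, and positional-tuple layout are assumptions):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PersonTrackingInfo:
    """One entry of the tracking management information 301 (FIG. 3); field names are illustrative."""
    person_tracking_id: int
    first_appearance_time: float
    name: Optional[str] = None                 # set when face recognition succeeds
    person_id: Optional[str] = None
    current_direction: float = 0.0             # current moving direction (degrees)
    current_speed: float = 0.0
    failed_direction: Optional[float] = None   # motion information at the time recognition failed
    failed_speed: Optional[float] = None
    positions: List[tuple] = field(default_factory=list)  # (x, y, width, height, size, time) entries

tracking_management = {}  # person tracking ID -> PersonTrackingInfo

def on_tracking_info(pid, time, direction, speed, position):
    """Create a new entry for an unknown ID; otherwise refresh its motion and position history."""
    entry = tracking_management.get(pid)
    if entry is None:
        entry = PersonTrackingInfo(pid, first_appearance_time=time)
        tracking_management[pid] = entry
    entry.current_direction, entry.current_speed = direction, speed
    entry.positions.append(position)

def on_recognition_info(pid, success, name=None, person_id=None, direction=None, speed=None):
    """On success record the recognized identity; on failure record the motion at the failure."""
    entry = tracking_management.get(pid)
    if entry is None:
        return
    if success:
        entry.name, entry.person_id = name, person_id
    else:
        entry.failed_direction, entry.failed_speed = direction, speed
```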
- the priority order determination unit 214 determines a priority order (a score Point_human) of persons to be subjected to face recognition processing by the PTZ camera 103.
- the priority order is determined by referring to the tracking management information 301 and computing, for each person tracking ID, a total score of points based on the moving direction (direction) and the moving speed (speed) and a penalty based on the failure history among the authentication processing results.
- the following expression is a calculation formula for point determination. The higher a score, the higher a priority order.
- Point_human = Point_direction + Point_speed + Point_penalty
- FIG. 4 illustrates an example of a method for determining a moving direction and a moving speed.
- a line segment connecting a past position 402 and a current position 403 of a person 401 on an image is set as a person moving vector 404 in a moving direction.
- past positional information Position(n−4) is positional information four frames before current positional information Position(n).
- a frame interval is not limited to five frames.
- a moving speed is determined based on a length of the line segment on the image and required time between two points.
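A small sketch of the determination of FIG. 4, assuming image coordinates with y growing downward and a convention of 0° for the downward direction so that the result lines up with the point ranges of FIG. 5A (the angle convention itself is an assumption):

```python
import math

def motion_from_positions(past, current, past_time, current_time):
    """Moving direction (degrees, 0 = straight down the image) and moving speed from two
    tracked positions, e.g. Position(n-4) and Position(n)."""
    dx = current[0] - past[0]
    dy = current[1] - past[1]
    direction = math.degrees(math.atan2(dx, dy))   # 0 deg = downward, +/-180 deg = upward
    distance = math.hypot(dx, dy)                  # length of the moving vector on the image
    speed = distance / (current_time - past_time)  # required time between the two points
    return direction, speed
```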
- FIG. 5A illustrates a point example with respect to a moving direction, and a point is determined based on the moving direction.
- a point of 10 is given when the moving direction is within a range of 45° to −45° of the downward direction in the image.
- a point of 8 is given in a range of 45° to 90° or −45° to −90°.
- a point of 6 is given in a range of 90° to 135° or −90° to −135°.
- a point of 2 is given in a range of 135° to 180° or −135° to −180°.
- FIG. 5B illustrates a point example with respect to a moving speed, and a point is determined based on the moving speed.
- a penalty of −3 is given when the current moving direction is within a range of ±3 of a moving direction recorded in the failure history.
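Putting the pieces together, the score Point_human could be computed as below. This sketch reuses the PersonTrackingInfo fields from the earlier snippet, takes the direction points of FIG. 5A as listed above, and substitutes illustrative speed thresholds because the values of FIG. 5B are not reproduced in the text:

```python
def direction_point(direction_deg):
    """Point for the moving direction per the ranges of FIG. 5A (0 deg = downward in the image)."""
    a = abs(direction_deg)
    if a <= 45:
        return 10
    if a <= 90:
        return 8
    if a <= 135:
        return 6
    return 2

def speed_point(speed):
    """Point for the moving speed (FIG. 5B); these thresholds are illustrative assumptions."""
    if speed < 1.0:
        return 10  # slow movement, easier to capture a usable face image
    if speed < 3.0:
        return 6
    return 2

def penalty_point(current_direction, failed_direction, margin=3.0):
    """-3 penalty when the current direction is within the margin of a direction recorded in the
    failure history; wrap-around at +/-180 deg is ignored in this sketch."""
    if failed_direction is not None and abs(current_direction - failed_direction) <= margin:
        return -3
    return 0

def priority_score(entry):
    """Point_human = Point_direction + Point_speed + Point_penalty for one tracked person."""
    return (direction_point(entry.current_direction)
            + speed_point(entry.current_speed)
            + penalty_point(entry.current_direction, entry.failed_direction))
```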
- an unrecognized person having the highest score is subjected to recognition processing by the PTZ camera 103.
- Setting of a priority order is performed once in a fixed period or after the recognition processing is successful at the PTZ camera 103 .
- the priority order setting is performed by setting the penalty based on the failure history.
- the priority order setting is not limited to this method.
- the setting method of the priority order only needs to take a result of last processing into account.
- a method for setting a priority order based on stability of recognition information in a period where last authentication processing has been executed can be employed.
- a low priority order is set for a person whose recognition information has been acquired at a fixed rate or higher among the pieces of recognition information acquired from the frames in the period during which the last authentication processing was executed.
- a high priority order is set for a person whose recognition information has been acquired at a rate lower than the fixed rate. In other words, a high priority order is set for a person showing varied results from frame to frame.
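A sketch of this stability-based alternative, assuming one boolean per frame of the last authentication period that indicates whether recognition information was acquired in that frame (the 0.7 rate threshold is an illustrative assumption):

```python
def stability_priority(recognition_flags, rate_threshold=0.7):
    """Low priority when recognition information was acquired at the fixed rate or higher,
    high priority when the per-frame results varied (acquired at a lower rate)."""
    if not recognition_flags:
        return "high"
    rate = sum(recognition_flags) / len(recognition_flags)
    return "low" if rate >= rate_threshold else "high"
```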
- the PTZ camera management unit 215 manages person tracking ID of a recognition processing target person determined by the priority order determination unit 214 , and current positional information.
- the PTZ camera management unit 215 converts the current positional information into pan-tilt-zoom control information of the PTZ camera 103 .
- the conversion processing into the control information is performed by creating beforehand a conversion table of a position on the image of the wide angle camera 101 and a pan angle, a tilt angle, and a zoom magnification of the PTZ camera. Processing continuation time after outputting of an execution command of recognition processing is measured and managed.
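A sketch of such a pre-built conversion table, using a coarse grid over the wide-angle image; every value below is a placeholder, since a real table would come from calibration of the installed cameras:

```python
GRID_COLS, GRID_ROWS = 8, 6     # granularity of the conversion table (assumption)
IMAGE_W, IMAGE_H = 1920, 1080   # resolution of the wide angle camera image (assumption)

def build_dummy_table():
    """Placeholder table: pan/tilt proportional to the cell position, fixed zoom magnification."""
    table = {}
    for row in range(GRID_ROWS):
        for col in range(GRID_COLS):
            pan = -60 + 120 * col / (GRID_COLS - 1)   # degrees
            tilt = -20 + 40 * row / (GRID_ROWS - 1)   # degrees
            zoom = 4.0                                # magnification
            table[(col, row)] = (pan, tilt, zoom)
    return table

CONVERSION_TABLE = build_dummy_table()

def to_ptz_control(x, y):
    """Look up pan angle, tilt angle and zoom magnification for a position on the wide-angle image."""
    col = min(int(x * GRID_COLS / IMAGE_W), GRID_COLS - 1)
    row = min(int(y * GRID_ROWS / IMAGE_H), GRID_ROWS - 1)
    return CONVERSION_TABLE[(col, row)]
```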
- the control information transmission unit 216 transmits the pan-tilt-zoom control information created by the PTZ camera management unit 215 , the command to execute the face detection recognition processing, the human body tracking ID of the processing target person, the moving direction, and the moving speed, to the PTZ camera 103 to perform surveillance camera control.
- FIG. 6 is a flowchart illustrating a processing procedure performed in the wide angle camera 101 .
- step S 601 the image capturing unit 201 captures an image. Then, in step S 602 , the human body detection/tracking unit 202 performs detection and tracking of a human body based on the image.
- step S 603 the tracking information transmission unit 203 transmits tracking information that is a result of the human body detection and tracking processing performed by the human body detection/tracking unit 202 .
- step S 604 whether to continue the processing is determined. When the processing is continued (YES in step S 604 ), the processing returns to step S 601 . When not continued (NO in step S 604 ), the processing is ended.
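Steps S601 to S604 amount to a simple capture-detect-transmit loop; a sketch with the per-step work passed in as callables (the function names are illustrative):

```python
def wide_angle_camera_loop(capture, detect_and_track, transmit, keep_running):
    """FIG. 6 loop: capture (S601), detect/track human bodies (S602), transmit tracking
    information (S603), and repeat while processing should continue (S604)."""
    while keep_running():
        image = capture()                        # step S601
        tracking_info = detect_and_track(image)  # step S602
        transmit(tracking_info)                  # step S603
```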
- FIG. 7 is a flowchart illustrating a processing procedure performed in the PTZ camera 103 .
- the recognition processing control unit 209 checks presence of a recognition execution command.
- the camera control unit 210 performs pan-tilt-zoom control.
- the image capturing unit 204 captures an image.
- the face detection/recognition unit 206 performs face detection and face authentication processing.
- the recognition information transmission unit 207 transmits a result of the face detection and face authentication performed in step S 704 to the server 105 .
- step S 706 whether to continue the processing is determined.
- the processing is continued (YES in step S 706 )
- the processing returns to step S 701 .
- not continued (NO in step S 706 )
- the processing is ended.
- FIG. 8 is a flowchart illustrating the priority order setting procedure carried out in the server 105 .
- the PTZ camera management unit 215 acquires information as to whether the recognition processing is being performed by the PTZ camera 103 , whether the recognition processing has been successful, and the continuation time of the recognition processing, from the tracking information management unit 213 .
- step S 802 when it is determined that the recognition processing is being executed, the recognition processing has not been successful, and the recognition processing continuation time is shorter than elapsed time until the processing ends (YES in step S 802 ), the currently executed recognition processing is continued.
- step S 803 the PTZ camera management unit 215 performs pan-tilt-zoom control of the PTZ camera 103 according to positional information of a current recognition target of the PTZ camera 103 .
- step S 804 the priority order determination unit 214 sets a priority order.
- step S 805 whether to execute recognition processing is determined based on the priority order setting.
- the PTZ camera management unit 215 issues a command to execute pan-tilt-zoom control and face detection recognition processing in the PTZ camera 103 according to positional information of a recognition target.
- step S 807 whether to continue the processing is determined.
- the processing is continued (YES in step S 807 )
- the processing returns to step S 801 .
- not continued (NO in step S 807 )
- the processing is ended.
- the number of PTZ cameras 103 is one. However, the number of PTZ cameras is not limited to one. Even in a system including more than one PTZ camera, processing based on a similar priority order can be performed.
- the control of the PTZ camera 103 is performed by way of the pan-tilt-zoom. However, the control method of the camera is not limited to this. Digital pan-tilt-zoom control based on rotation, vertical and horizontal movement, and partial segmentation of an image can be performed.
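A sketch of such digital pan-tilt-zoom by partial segmentation of the captured image, where the crop window is derived from an assumed zoom factor:

```python
def digital_ptz(image, center_x, center_y, zoom):
    """Digital pan-tilt-zoom: crop a window of 1/zoom of the original size around
    (center_x, center_y). `image` is expected to behave like a NumPy array of shape (H, W, C)."""
    h, w = image.shape[:2]
    crop_w, crop_h = max(1, int(w / zoom)), max(1, int(h / zoom))
    x1 = min(max(int(center_x - crop_w / 2), 0), w - crop_w)
    y1 = min(max(int(center_y - crop_h / 2), 0), h - crop_h)
    return image[y1:y1 + crop_h, x1:x1 + crop_w]
```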
- FIG. 9 is a diagram illustrating an example of a surveillance camera system of a single camera according to a second exemplary embodiment.
- the surveillance camera system of the present exemplary embodiment includes a camera (PTZ camera) 901 capable of performing pan-tilt-zoom control, and a server 904 for controlling the PTZ camera 901 .
- a photographing region of the PTZ camera 901 at the time of wide angle setting is a wide angle photographing region 902
- a photographing region at the time of zoom setting is a zoom photographing region 903 .
- the PTZ camera 901 normally performs photographing at the wide angle setting, and at the zoom setting when face detection recognition processing is executed.
- FIG. 10 is a block diagram illustrating a configuration example of the surveillance camera system according to the present exemplary embodiment.
- the PTZ camera 901 includes an image capturing unit 1001 , a human body detection/tracking unit 1002 , and a tracking information transmission unit 1003 .
- the PTZ camera 901 further includes a face recognition DB unit 1004, a face detection/recognition unit 1005, a recognition information transmission unit 1006, a control information reception unit 1007, a recognition processing control unit 1008, and a camera control unit 1009.
- the server 904 includes a tracking information reception unit 1010 , a recognition information reception unit 1011 , a tracking recognition information management unit 1012 , a processing target determination unit 1013 , a PTZ camera management unit 1014 , and a control information transmission unit 1015 .
- the image capturing unit 1001 captures an image of a surveillance region to create a continuous region image, and adds image ID for identifying each image to the created region image to output the region image to the human body detection/tracking unit 1002 and the face detection/recognition unit 1005 .
- the human body detection/tracking unit 1002 is similar to the human body detection/tracking unit 202 of the first exemplary embodiment.
- human body detection/tracking by the human body detection/tracking unit 1002 is performed only when a command to execute human body processing is received from the recognition processing control unit 1008 .
- the human body detection/tracking is not performed when no command to execute the human body processing is received.
- the tracking information transmission unit 1003 is similar to the tracking information transmission unit 203 of the first exemplary embodiment.
- the face recognition DB unit 1004 is similar to the face recognition DB unit 205 of the first exemplary embodiment.
- the face detection/recognition unit 1005 is similar to the face detection/recognition unit 206 of the first exemplary embodiment.
- face detection recognition by the face detection/recognition unit 1005 is performed only when a command to execute the face processing is received from the recognition processing control unit 1008.
- the face detection recognition is not performed when no face processing execution command is received. No processing is performed during a period from reception of a PTZ control start command from the camera control unit 1009 until reception of a PTZ control end command.
- the recognition information transmission unit 1006 transmits recognition information output from the face detection/recognition unit 1005 to the server 904 via a network.
- the processing of the recognition information transmission unit 1006 which transmits the recognition information from the PTZ camera 901 to the server 904 , is not limited to any specific communication method such as LAN or serial communication.
- the control information reception unit 1007 receives PTZ control information, and a command to execute human body processing or a command to execute face recognition processing from the server 904 .
- the recognition processing control unit 1008 outputs the human body processing execution command to the human body detection/tracking unit 1002 when the control information reception unit 1007 receives the human body processing execution command, and outputs a face processing execution command to the face detection/recognition unit 1005 when the control information reception unit 1007 receives the face processing execution command.
- the recognition processing control unit 1008 outputs the PTZ control command to the camera control unit 1009 when the control information reception unit 1007 acquires PTZ control information.
- the camera control unit 1009 is similar to the camera control unit 210 of the first exemplary embodiment.
- the tracking information reception unit 1010 receives tracking information from the PTZ camera 901
- the recognition information reception unit 1011 receives recognition information from the PTZ camera 901 .
- the tracking recognition information management unit 1012 manages tracking management information 1101 and processing history information 1111 .
- FIG. 11 illustrates an example of the tracking management information 1101 and the processing history information 1111 .
- the tracking management information 1101 manages person tracking information 1102 for each person tracking ID.
- the person tracking information 1102 includes person tracking ID, time when a person appears for the first time, a current moving direction and a moving speed, and one or more pieces of positional information 1103 illustrated in FIG. 11 .
- Other information such as acceleration, a moving pattern, and a moving amount can also be used as motion information.
- the positional information 1103 includes a position of the person on the image captured by the PTZ camera 901 at the time of wide angle setting, a width/height and a size of a bounding box, and time of acquiring the information.
- the tracking management information 1101 is updated for each time tracking information is acquired from the tracking information reception unit 1010 . Specifically, when tracking information having person tracking ID that is not managed by the tracking management information 1101 is present, new person tracking information 1102 is created, and the time of first appearance, the current moving direction and moving speed, and the positional information 1103 are updated.
- when tracking information having person tracking ID that is managed by the tracking management information 1101 is present, the current moving direction and moving speed, and the positional information 1103 are updated. On the other hand, when no tracking information having person tracking ID managed by the tracking management information 1101 is present, the corresponding person tracking information 1102 is deleted. When a tracking information deletion command is received from the PTZ camera management unit 1014, the tracking management information 1101 is deleted.
- the processing history information 1111 includes one or more pieces of processing information 1112 .
- the processing information 1112 includes the time of executing face authentication processing, rectangular coordinates of a processing region on the image captured at the wide angle setting of the PTZ camera 901 , a pan-tilt-zoom setting value, and a success/failure of a face authentication.
- processing information 1112 is added each time face authentication processing is executed under pan-tilt-zoom control of the PTZ camera 901 .
- in the recognition information acquired from the recognition information reception unit 1011, a success is set to the face authentication result when the face detection recognition processing is successful and a person is specified.
- a failure is set to the face authentication result when a person is not specified.
- the processing target determination unit 1013 determines a target to be subjected to face detection recognition processing by controlling the pan-tilt-zoom operation of the PTZ camera 901 after a point (Point_human) has been set for each person managed by the tracking management information 1101.
- the point on each person is set as follows. The point is determined referring to the tracking management information 1101 by a total score of points based on a moving direction (direction) and a moving speed (speed) for each person tracking ID, and a penalty based on the tracking management information 1101 and the processing history information 1111 .
- the following expression is a calculation formula for point determination. In this case, a person of a highest point is set as a processing target.
- Point_human = Point_direction + Point_speed + Point_penalty
- a method for determining a moving direction and a moving speed and a method for determining a point with respect to the moving direction and the moving speed are similar to those of the first exemplary embodiment.
- a penalty of −5 is given when the current center coordinates of a human body are within a region where face recognition processing has been performed immediately before.
- a penalty of −2 is given when the current center coordinates of the human body are within a region where face recognition processing was performed two to five steps before. According to the score calculation result, the person with the highest total value is subjected to face recognition processing under pan-tilt-zoom control of the PTZ camera 901.
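A sketch of this history-based penalty, assuming the processed regions are kept as axis-aligned rectangles ordered oldest to newest:

```python
def history_penalty(center, processed_regions):
    """-5 if the person's current center lies inside the region processed immediately before,
    -2 if it lies inside a region processed two to five steps before, otherwise 0.
    `processed_regions` is a list of (x1, y1, x2, y2) rectangles, newest last."""
    def inside(rect):
        x1, y1, x2, y2 = rect
        return x1 <= center[0] <= x2 and y1 <= center[1] <= y2

    if processed_regions and inside(processed_regions[-1]):
        return -5
    for rect in processed_regions[-5:-1]:  # regions processed two to five steps before
        if inside(rect):
            return -2
    return 0
```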
- the processing target is determined by setting the penalty based on the processing history information.
- the processing target determination is not limited to this method.
- the processing target determination method may be employed as long as past processing target history is taken into account.
- priority may be given to a region where a probability of success is higher as a result of past face recognition processing.
- the PTZ camera management unit 1014 manages a processing state of the PTZ camera 901 .
- a human body processing execution command and PTZ control information for capturing an image at the wide angle setting are output to the control information transmission unit 1015.
- the PTZ camera management unit 1014 outputs, to the control information transmission unit 1015, a face processing execution command and PTZ control information in which pan-tilt-zoom is controlled such that the rectangle of the person occupies 80% of the photographing region, based on the position and rectangle of the positional information 1103 of the target person. Further, processing continuation time after determination of the processing target by the processing target determination unit 1013 is measured and managed.
- when the processing continuation time exceeds a preset end time, the PTZ camera management unit 1014 outputs a human body processing execution command and PTZ control information for capturing an image at the wide angle setting to the control information transmission unit 1015. Further, the PTZ camera management unit 1014 outputs a tracking information deletion command to the tracking recognition information management unit 1012.
- the pan-tilt-zoom control is performed so that, when the target person is photographed, the rectangular region of the person occupies 80% or more of the photographing region.
- the control is not limited to this.
- the pan-tilt-zoom control may be performed so as to acquire resolution sufficient for execution of face detection authentication processing.
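A sketch of choosing a zoom magnification so that the person's rectangle covers roughly 80% of the photographing region, under the simplifying assumption that apparent size scales linearly with the zoom magnification:

```python
def zoom_for_target(person_w, person_h, frame_w, frame_h, current_zoom,
                    coverage=0.8, max_zoom=20.0):
    """Return a zoom magnification at which the person's rectangle spans about `coverage`
    of the frame along its tighter axis; lens distortion and PTZ mechanics are ignored."""
    scale = coverage * min(frame_w / person_w, frame_h / person_h)
    return min(current_zoom * scale, max_zoom)
```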
- the control information transmission unit 1015 transmits the PTZ control information created by the PTZ camera management unit 1014 , and the face processing execution command, or the human body processing execution command to the PTZ camera 901 .
- FIG. 12 is a flowchart illustrating a processing procedure performed in the PTZ camera 901 .
- step S 1201 human body processing and face processing are switched over according to a processing execution command output from the recognition processing control unit 1008 .
- the recognition processing control unit 1008 outputs a human body processing execution command
- step S 1202 an image is acquired from the image capturing unit 1001 .
- step S 1203 the human body detection/tracking unit 1002 performs detection and tracking processing of a human body based on the image.
- step S 1204 the tracking information transmission unit 1003 transmits tracking information that is a tracking result of the human body detection and tracking processing performed by the human body detection/tracking unit 1002 .
- the recognition processing control unit 1008 outputs a face processing execution command
- step S 1205 pan-tilt-zoom control is performed according to a PTZ control command. However, when no PTZ control command has been received, pan-tilt-zoom control is not performed.
- step S 1206 the face detection/recognition unit 1005 acquires an image from the image capturing unit 1001 .
- the face detection/recognition unit 1005 performs face detection and face authentication processing.
- step S 1208 the recognition information transmission unit 1006 transmits a result of the face detection and face authentication processing performed in step S 1207 .
- step S 1209 whether to continue the processing is determined.
- the processing is continued (YES in step S 1209 )
- the processing returns to step S 1201 .
- not continued (NO in step S 1209 )
- the processing is ended.
- FIG. 13 is a flowchart illustrating the processing procedure carried out in the server 904 .
- the PTZ camera 901 performs video photographing at wide angle setting.
- step S 1301 the PTZ camera management unit 1014 creates a human body processing execution command, and the control information transmission unit 1015 transmits the human body processing execution command to the PTZ camera 901 .
- step S 1302 the processing waits for reception of tracking information from the PTZ camera 901 .
- the tracking recognition information management unit 1012 updates the tracking management information 1101 .
- step S 1303 the processing target determination unit 1013 performs processing target determination.
- step S 1304 whether a processing target has been determined in step S 1303 is decided.
- the PTZ camera management unit 1014 creates a PTZ control command to perform pan-tilt-zoom control to zoom in on the processing target determined in the determination processing of step S 1303 .
- step S 1306 the control information transmission unit 1015 transmits the PTZ control command to the PTZ camera 901 .
- step S 1307 the PTZ camera management unit 1014 measures processing continuation time after the outputting of the face processing execution command, and determines whether to end the processing.
- the processing proceeds to step S 1308 , and the control information transmission unit 1015 transmits the face processing execution command to the PTZ camera 901 .
- step S 1309 the processing waits for reception of recognition information from the PTZ camera 901 .
- the recognition information is received by the recognition information reception unit 1011
- the tracking recognition information management unit 1012 updates the processing history information 1111 .
- step S 1310 whether to continue the processing is determined.
- the processing proceeds to step S 1311 .
- the PTZ camera management unit 1014 creates PTZ control information for photographing a video at wide angle setting.
- step S 1312 the control information transmission unit 1015 transmits the PTZ control command to the PTZ camera 901 .
- step S 1313 whether to end the processing is determined.
- the processing is continued (NO in step S 1313 )
- the processing returns to step S 1301 .
- the processing is ended.
- when it is determined that the face detection recognition processing has failed according to the recognition information, the processing returns to step S 1307 , and the processing is continued.
- the control of the PTZ camera 901 is the pan-tilt-zoom control.
- the control method of the camera is not limited to this method. Digital pan-tilt-zoom control based on rotation, vertical and horizontal movement, and partial segmentation of an image can also be performed.
- exemplary embodiments have been described in detail.
- other exemplary embodiments can be implemented in forms such as a system, an apparatus, a method, and a program or a recording medium (storage medium).
- the exemplary embodiments can be applied to a system including a plurality of devices (e.g., host computer, interface device, image capturing device, and web application), and can also be applied to an apparatus including one device.
- Additional embodiments can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions recorded on a storage medium (e.g., computer-readable storage medium) to perform the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s).
- the computer may comprise one or more of a central processing unit (CPU), micro processing unit (MPU), or other circuitry, and may include a network of separate computers or separate computer processors.
- the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
- the storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)TM), a flash memory device, a memory card, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Oral & Maxillofacial Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
- Closed-Circuit Television Systems (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/706,601 US20180005045A1 (en) | 2013-05-17 | 2017-09-15 | Surveillance camera system and surveillance camera control apparatus |
US17/328,834 US12118792B2 (en) | 2013-05-17 | 2021-05-24 | Surveillance camera system and surveillance camera control apparatus |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-105226 | 2013-05-17 | ||
JP2013105226 | 2013-05-17 | ||
JP2014-032158 | 2014-02-21 | ||
JP2014032158A JP6316023B2 (ja) | 2013-05-17 | 2014-02-21 | カメラシステム及びカメラ制御装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/706,601 Continuation US20180005045A1 (en) | 2013-05-17 | 2017-09-15 | Surveillance camera system and surveillance camera control apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140341427A1 true US20140341427A1 (en) | 2014-11-20 |
Family
ID=51895811
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/279,020 Abandoned US20140341427A1 (en) | 2013-05-17 | 2014-05-15 | Surveillance camera system and surveillance camera control apparatus |
US15/706,601 Abandoned US20180005045A1 (en) | 2013-05-17 | 2017-09-15 | Surveillance camera system and surveillance camera control apparatus |
US17/328,834 Active 2034-08-05 US12118792B2 (en) | 2013-05-17 | 2021-05-24 | Surveillance camera system and surveillance camera control apparatus |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/706,601 Abandoned US20180005045A1 (en) | 2013-05-17 | 2017-09-15 | Surveillance camera system and surveillance camera control apparatus |
US17/328,834 Active 2034-08-05 US12118792B2 (en) | 2013-05-17 | 2021-05-24 | Surveillance camera system and surveillance camera control apparatus |
Country Status (2)
Country | Link |
---|---|
US (3) | US20140341427A1 (en)
JP (1) | JP6316023B2 (ja)
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104809440A (zh) * | 2015-04-29 | 2015-07-29 | 姜振宇 | 一种异常视线停驻的识别与提醒方法及装置 |
CN104809439A (zh) * | 2015-04-29 | 2015-07-29 | 姜振宇 | 一种异常眨眼动作的识别与提醒方法及装置 |
CN104820495A (zh) * | 2015-04-29 | 2015-08-05 | 姜振宇 | 一种异常微表情识别与提醒方法及装置 |
EP3156940A1 (en) * | 2015-10-15 | 2017-04-19 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and program |
CN107452027A (zh) * | 2017-07-29 | 2017-12-08 | 安徽博威康信息技术有限公司 | 一种基于多摄像头监控的目标人物安全保护方法 |
EP3282388A1 (en) * | 2016-08-08 | 2018-02-14 | Toshiba TEC Kabushiki Kaisha | Authentication apparatus for carrying out authentication based on captured image, authentication method and server |
US20180163700A1 (en) * | 2014-08-21 | 2018-06-14 | Identiflight International, Llc | Imaging array for bird or bat detection and identification |
EP3346694A1 (en) * | 2017-01-09 | 2018-07-11 | Samsung Electronics Co., Ltd. | Electronic device and image capture method thereof |
CN108921874A (zh) * | 2018-07-04 | 2018-11-30 | 百度在线网络技术(北京)有限公司 | 人体跟踪处理方法、装置及系统 |
CN108921773A (zh) * | 2018-07-04 | 2018-11-30 | 百度在线网络技术(北京)有限公司 | 人体跟踪处理方法、装置、设备及系统 |
EP3531340A3 (en) * | 2018-07-02 | 2019-12-04 | Baidu Online Network Technology (Beijing) Co., Ltd. | Human body tracing method, apparatus and device, and storage medium |
CN110771143A (zh) * | 2018-09-13 | 2020-02-07 | 深圳市大疆创新科技有限公司 | 手持云台的控制方法及手持云台、手持设备 |
US10961825B2 (en) * | 2014-11-14 | 2021-03-30 | National Oilwell Vargo Norway As | Drilling rig |
CN113243015A (zh) * | 2018-12-19 | 2021-08-10 | 浙江大华技术股份有限公司 | 一种视频监控系统和方法 |
US11296942B1 (en) * | 2021-03-11 | 2022-04-05 | International Business Machines Corporation | Relative device placement configuration |
US11495264B2 (en) * | 2019-10-28 | 2022-11-08 | Shanghai Bilibili Technology Co., Ltd. | Method and system of clipping a video, computing device, and computer storage medium |
US11544490B2 (en) | 2014-08-21 | 2023-01-03 | Identiflight International, Llc | Avian detection systems and methods |
US12148176B2 (en) | 2019-03-27 | 2024-11-19 | Nec Corporation | Comparison apparatus, control method, and program |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6531435B2 (ja) * | 2015-03-11 | 2019-06-19 | カシオ計算機株式会社 | 情報処理装置、情報処理方法、及びプログラム |
JP2019068234A (ja) * | 2017-09-29 | 2019-04-25 | 富士通株式会社 | 処理プログラム、処理方法、及び処理装置 |
CN110716803A (zh) * | 2018-07-13 | 2020-01-21 | 中强光电股份有限公司 | 电脑系统、资源分配方法及其影像辨识方法 |
JP2021141470A (ja) * | 2020-03-06 | 2021-09-16 | 株式会社Wds | Aiカメラシステム、特定人物通知方法、及び特定人物通知プログラム |
US11496671B2 (en) | 2021-02-19 | 2022-11-08 | Western Digital Technologies, Inc. | Surveillance video streams with embedded object data |
US20250124712A1 (en) * | 2023-10-16 | 2025-04-17 | Motorola Solutions, Inc. | System and method for reconfiguring a second camera based on a first camera |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7336297B2 (en) * | 2003-04-22 | 2008-02-26 | Matsushita Electric Industrial Co., Ltd. | Camera-linked surveillance system |
US8199208B2 (en) * | 2008-09-22 | 2012-06-12 | Sony Corporation | Operation input apparatus, operation input method, and computer readable medium for determining a priority between detected images |
US20130050395A1 (en) * | 2011-08-29 | 2013-02-28 | DigitalOptics Corporation Europe Limited | Rich Mobile Video Conferencing Solution for No Light, Low Light and Uneven Light Conditions |
US8730375B2 (en) * | 2007-05-18 | 2014-05-20 | Casio Computer Co., Ltd. | Imaging apparatus having focus control function |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6215519B1 (en) * | 1998-03-04 | 2001-04-10 | The Trustees Of Columbia University In The City Of New York | Combined wide angle and narrow angle imaging system and method for surveillance and monitoring |
JP2006086707A (ja) * | 2004-09-15 | 2006-03-30 | Sony Corp | 画像処理装置および画像処理方法、プログラムおよびプログラム記録媒体、並びにデータ構造およびデータ記録媒体 |
JP2007249298A (ja) * | 2006-03-13 | 2007-09-27 | Toshiba Corp | 顔認証装置および顔認証方法 |
US8599267B2 (en) * | 2006-03-15 | 2013-12-03 | Omron Corporation | Tracking device, tracking method, tracking device control program, and computer-readable recording medium |
JP5111343B2 (ja) * | 2008-12-02 | 2013-01-09 | キヤノン株式会社 | 再生装置 |
CN101604446B (zh) * | 2009-07-03 | 2011-08-31 | 清华大学深圳研究生院 | 用于疲劳检测的嘴唇图像分割方法及系统 |
JP5523027B2 (ja) * | 2009-09-02 | 2014-06-18 | キヤノン株式会社 | 情報送信装置及び情報送信方法 |
JP5457985B2 (ja) * | 2010-09-17 | 2014-04-02 | 株式会社日立製作所 | カメラ管理装置、ネットワークカメラシステム、ネットワークカメラ制御方法、ネットワーク機器制御方法 |
US8879789B1 (en) * | 2011-01-20 | 2014-11-04 | Verint Americas Inc. | Object analysis using motion history |
US8542879B1 (en) * | 2012-06-26 | 2013-09-24 | Google Inc. | Facial recognition |
US20170017833A1 (en) * | 2014-03-14 | 2017-01-19 | Hitachi Kokusai Electric Inc. | Video monitoring support apparatus, video monitoring support method, and storage medium |
-
2014
- 2014-02-21 JP JP2014032158A patent/JP6316023B2/ja active Active
- 2014-05-15 US US14/279,020 patent/US20140341427A1/en not_active Abandoned
-
2017
- 2017-09-15 US US15/706,601 patent/US20180005045A1/en not_active Abandoned
-
2021
- 2021-05-24 US US17/328,834 patent/US12118792B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7336297B2 (en) * | 2003-04-22 | 2008-02-26 | Matsushita Electric Industrial Co., Ltd. | Camera-linked surveillance system |
US8730375B2 (en) * | 2007-05-18 | 2014-05-20 | Casio Computer Co., Ltd. | Imaging apparatus having focus control function |
US8199208B2 (en) * | 2008-09-22 | 2012-06-12 | Sony Corporation | Operation input apparatus, operation input method, and computer readable medium for determining a priority between detected images |
US20130050395A1 (en) * | 2011-08-29 | 2013-02-28 | DigitalOptics Corporation Europe Limited | Rich Mobile Video Conferencing Solution for No Light, Low Light and Uneven Light Conditions |
Non-Patent Citations (1)
Title |
---|
Wheeler et al., Face Recognition at a Distance System for Surveillance Applications, 27-29 Sept. 2010 [retrieved 11/16/16], 2010 Fourth IEEE International Conference on Biometrics: Theory Applications and Systems, 8 total pages. Retrieved from the Internet: http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=5634523 *
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10920748B2 (en) * | 2014-08-21 | 2021-02-16 | Identiflight International, Llc | Imaging array for bird or bat detection and identification |
US20210324832A1 (en) * | 2014-08-21 | 2021-10-21 | Identiflight International, Llc | Imaging Array for Bird or Bat Detection and Identification |
US10519932B2 (en) * | 2014-08-21 | 2019-12-31 | Identiflight International, Llc | Imaging array for bird or bat detection and identification |
US11751560B2 (en) * | 2014-08-21 | 2023-09-12 | Identiflight International, Llc | Imaging array for bird or bat detection and identification |
US11544490B2 (en) | 2014-08-21 | 2023-01-03 | Identiflight International, Llc | Avian detection systems and methods |
US11555477B2 (en) | 2014-08-21 | 2023-01-17 | Identiflight International, Llc | Bird or bat detection and identification for wind turbine risk mitigation |
US12048301B2 (en) | 2014-08-21 | 2024-07-30 | Identiflight International, Llc | Bird or bat detection and identification for wind turbine risk mitigation |
US20180163700A1 (en) * | 2014-08-21 | 2018-06-14 | Identiflight International, Llc | Imaging array for bird or bat detection and identification |
US12399933B2 (en) | 2014-08-21 | 2025-08-26 | Identiflight International, Llc | Avian detection systems and methods |
US10961825B2 (en) * | 2014-11-14 | 2021-03-30 | National Oilwell Vargo Norway As | Drilling rig |
US11885204B2 (en) | 2014-11-14 | 2024-01-30 | National Oilwell Varco Norway As | Drilling rig |
CN104809440A (zh) * | 2015-04-29 | 2015-07-29 | 姜振宇 | 一种异常视线停驻的识别与提醒方法及装置 |
CN104809439A (zh) * | 2015-04-29 | 2015-07-29 | 姜振宇 | 一种异常眨眼动作的识别与提醒方法及装置 |
CN104820495A (zh) * | 2015-04-29 | 2015-08-05 | 姜振宇 | 一种异常微表情识别与提醒方法及装置 |
US10181075B2 (en) | 2015-10-15 | 2019-01-15 | Canon Kabushiki Kaisha | Image analyzing apparatus,image analyzing, and storage medium |
EP3156940A1 (en) * | 2015-10-15 | 2017-04-19 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and program |
US10230860B2 (en) | 2016-08-08 | 2019-03-12 | Kabushiki Kaisha Toshiba | Authentication apparatus for carrying out authentication based on captured image, authentication method and server |
EP3282388A1 (en) * | 2016-08-08 | 2018-02-14 | Toshiba TEC Kabushiki Kaisha | Authentication apparatus for carrying out authentication based on captured image, authentication method and server |
US10871798B2 (en) | 2017-01-09 | 2020-12-22 | Samsung Electronics Co., Ltd. | Electronic device and image capture method thereof |
US10423194B2 (en) | 2017-01-09 | 2019-09-24 | Samsung Electronics Co., Ltd. | Electronic device and image capture method thereof |
EP3346694A1 (en) * | 2017-01-09 | 2018-07-11 | Samsung Electronics Co., Ltd. | Electronic device and image capture method thereof |
CN107452027A (zh) * | 2017-07-29 | 2017-12-08 | 安徽博威康信息技术有限公司 | 一种基于多摄像头监控的目标人物安全保护方法 |
US11348354B2 (en) * | 2018-07-02 | 2022-05-31 | Baidu Online Network Technology (Beijing) Co., Ltd. | Human body tracing method, apparatus and device, and storage medium |
EP3531340A3 (en) * | 2018-07-02 | 2019-12-04 | Baidu Online Network Technology (Beijing) Co., Ltd. | Human body tracing method, apparatus and device, and storage medium |
CN108921874A (zh) * | 2018-07-04 | 2018-11-30 | 百度在线网络技术(北京)有限公司 | 人体跟踪处理方法、装置及系统 |
US11263445B2 (en) * | 2018-07-04 | 2022-03-01 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method, apparatus and system for human body tracking processing |
EP3531342A3 (en) * | 2018-07-04 | 2019-11-20 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method, apparatus and system for human body tracking processing |
CN108921773A (zh) * | 2018-07-04 | 2018-11-30 | 百度在线网络技术(北京)有限公司 | 人体跟踪处理方法、装置、设备及系统 |
CN110771143A (zh) * | 2018-09-13 | 2020-02-07 | 深圳市大疆创新科技有限公司 | 手持云台的控制方法及手持云台、手持设备 |
CN113243015A (zh) * | 2018-12-19 | 2021-08-10 | 浙江大华技术股份有限公司 | 一种视频监控系统和方法 |
US12148176B2 (en) | 2019-03-27 | 2024-11-19 | Nec Corporation | Comparison apparatus, control method, and program |
US11495264B2 (en) * | 2019-10-28 | 2022-11-08 | Shanghai Bilibili Technology Co., Ltd. | Method and system of clipping a video, computing device, and computer storage medium |
US11296942B1 (en) * | 2021-03-11 | 2022-04-05 | International Business Machines Corporation | Relative device placement configuration |
Also Published As
Publication number | Publication date |
---|---|
JP2014241578A (ja) | 2014-12-25 |
US20180005045A1 (en) | 2018-01-04 |
JP6316023B2 (ja) | 2018-04-25 |
US20210279474A1 (en) | 2021-09-09 |
US12118792B2 (en) | 2024-10-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12118792B2 (en) | Surveillance camera system and surveillance camera control apparatus | |
US10438360B2 (en) | Video processing apparatus, video processing method, and storage medium | |
US8314854B2 (en) | Apparatus and method for image recognition of facial areas in photographic images from a digital camera | |
KR101687530B1 (ko) | 촬상 시스템에 있어서의 제어방법, 제어장치 및 컴퓨터 판독 가능한 기억매체 | |
US9823331B2 (en) | Object detecting apparatus, image capturing apparatus, method for controlling object detecting apparatus, and storage medium | |
US7423669B2 (en) | Monitoring system and setting method for the same | |
US20160142680A1 (en) | Image processing apparatus, image processing method, and storage medium | |
JP6555906B2 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
JP5730095B2 (ja) | 顔画像認証装置 | |
JP4800073B2 (ja) | 監視システム、監視方法、及び監視プログラム | |
KR101513215B1 (ko) | 객체 행동패턴 cctv 영상 분석서버 | |
JP2011070576A (ja) | 画像処理装置、及び画像処理方法 | |
US10878222B2 (en) | Face authentication device having database with small storage capacity | |
US20240048672A1 (en) | Adjustment of shutter value of surveillance camera via ai-based object recognition | |
JP6157165B2 (ja) | 視線検出装置及び撮像装置 | |
KR101372860B1 (ko) | 영상 검색 시스템 및 영상 분석 서버 | |
US10762355B2 (en) | Information processing apparatus, information processing method, and storage medium to estimate queue waiting time | |
US10733423B2 (en) | Image processing apparatus, image processing method, and storage medium | |
JP7448043B2 (ja) | 撮影制御システム | |
US10127424B2 (en) | Image processing apparatus, image processing method, and image processing system | |
KR20150112712A (ko) | 객체 행동패턴 cctv 영상 분석서버 | |
EP3929861B1 (en) | Image processing device, method, and system, and computer-readable medium | |
JP2018006910A (ja) | 撮像装置、撮像装置の制御方法およびプログラム | |
JP2018005574A (ja) | 画像処理装置、画像処理装置の制御方法およびプログラム | |
US20110267463A1 (en) | Image capturing device and method for controlling image capturing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CANON KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KAWANO, ATSUSHI;REEL/FRAME:033587/0331 Effective date: 20140428 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |