JP2006229322A - Automatic tracking controller and automatic tracking control method, program, and automatic tracking system


Info

Publication number
JP2006229322A
Authority
JP
Japan
Prior art keywords
image
target
tracking
adjustment mechanism
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2005037815A
Other languages
Japanese (ja)
Other versions
JP4699040B2 (en)
Inventor
Yasuyuki Domoto
Takahiro Ike
Manabu Yada
隆宏 池
学 矢田
泰之 道本
Original Assignee
Matsushita Electric Ind Co Ltd
松下電器産業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Ind Co Ltd (松下電器産業株式会社)
Priority to JP2005037815A
Publication of JP2006229322A
Application granted
Publication of JP4699040B2
Application status: Active

Abstract

PROBLEM TO BE SOLVED: To provide an automatic tracking controller capable of automatically photographing an area suitable for monitoring an intruder or the like even when processing and control delays occur.

SOLUTION: The object area tracking unit 102 of an automatic tracking apparatus 100 processes image information received from an image input section 101 to locate the position of the tracking target, and a camera control section 103 controls a camera 106 so that the target does not leave the angle of view of the camera 106 and is displayed at a position on the shooting screen suitable for monitoring. In doing so, the camera control section 103 uses the time series of positions detected from the images to determine the moving direction of the target and the direction in which a person is paying attention, and automatically controls the shooting direction so that the area ahead of the moving direction and ahead of the attention direction is shown more widely. The camera control section 103 automatically adjusts the zoom magnification toward the wide-angle side when the target is likely to leave the shooting range, and toward the telephoto side when the display position of the target is stable.

COPYRIGHT: (C)2006,JPO&NCIPI

Description

  The present invention relates to an automatic tracking control device and an automatic tracking control method that track a specific target by controlling an adjustment mechanism, using an imaging unit that captures an image including the target and an adjustment mechanism that changes the range captured by the imaging unit through zoom adjustment or orientation adjustment.

  For surveillance camera systems installed for crime prevention and similar purposes, a television camera or an industrial camera with a fixed shooting direction was conventionally used as the imaging device, but recently a camera equipped with pan, tilt, and zoom adjustment functions is used in many cases. This makes it possible to acquire a more detailed image of, for example, an intruder's facial features.

  When a camera equipped with pan, tilt, and zoom adjustment functions is used, it is common for an observer to operate it manually through an operation unit such as a joystick that remotely controls pan, tilt, and zoom. However, the task of tracking an intruder or other target by operating the joystick so that the target stays at a desired position in the screen while watching the camera image on a monitor requires advanced skill and sustained concentration, and is therefore a very heavy burden on the observer.

  Therefore, devices for automatically tracking an intruding object using image recognition technology have been proposed in, for example, Patent Document 1, Non-Patent Document 1, and Non-Patent Document 2.

  Patent Document 1 proposes a technique for obtaining a reference image of only a target area from which the background is removed even when the shape or size of the target changes.

  In the technique disclosed in Non-Patent Document 1, when the object moves out of a region (CC) defined near the center of the image as shown in FIG. 9, the pan and tilt angles of the camera turntable are controlled so that the object returns to the region CC.

  In the technique disclosed in Non-Patent Document 2, the turntable is controlled so that the tracking object is photographed at the center of the image. Further, the control amount of the pan / tilt angle is adjusted by feeding back the deviation of the object position from the image center position.

JP-A-5-103249
Ushita, Fujiwara, Tadami, "Swinging and Tracking Intrusion Monitoring System Using a Zoom Camera", IEICE, PRMU99-67 (pp. 23-30), September 1999
Morita, "Motion Detection and Tracking by Local Correlation", IEICE Transactions D-II, Vol. J84-D-II, No. 2, pp. 299-309, February 2001

  When a target is automatically tracked by the conventional techniques described above, the shooting direction and so on are controlled so that the tracked target is photographed near the center of the image. In reality, however, camera control tends to lag because of control processing delays and control signal transmission delays, with the result that a moving target is photographed shifted toward the front of its moving direction rather than at the center of the screen.

  That is, in the image displayed on the monitor screen, the area in front of the target tends to be narrow and the area behind the target tends to be wide. For example, when the target moves from left to right, it is photographed shifted toward the right side of the image. As a result, it is difficult to see what lies ahead of the target on the screen, so it takes the observer longer to recognize, for example, what the target is moving toward.

  For example, in order to recognize an intruder's facial features, it is desirable to increase the zoom magnification of the camera and photograph the intruder as large as possible while tracking. However, increasing the zoom magnification also increases the intruder's apparent movement speed on the screen, so camera control cannot keep up, or higher control accuracy is required, and the target easily moves out of the shooting range.

  Furthermore, when monitoring the behavior of an intruder, it is important to predict that behavior as early as possible, so it matters to confirm the location to which the intruder is paying attention. However, the direction in which the intruder is paying attention may not appear on the screen.

  The present invention has been made in view of the above circumstances, and an object thereof is to provide an automatic tracking control device and an automatic tracking control method capable of automatically photographing an area suitable for monitoring an intruder or the like even when processing or control delays occur.

  The automatic tracking control device of the present invention is an automatic tracking control device that tracks a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes the range captured by the imaging means by adjusting the shooting direction. It comprises object area tracking means that detects the target in each of a plurality of images sequentially input from the imaging means in time series and holds, as tracking information, the position information of the region in which the target is detected across a plurality of image frames in association with its change over time; and camera control means that detects the moving direction of the target based on the past tracking information detected and held by the object area tracking means, and controls the adjustment mechanism to reflect the detected moving direction so that the position of the target in the image is controlled such that the imaging area ahead of the moving direction is at least larger than that behind the moving direction of the target.

  With the above configuration, the shooting range is automatically controlled so that the area ahead of the target in its moving direction is displayed more widely on the screen than the area behind it, so when an observer monitors the captured image, a more useful image for monitoring an intruder or the like is obtained. That is, it becomes possible to quickly recognize what lies at the intruder's destination, which is useful for predicting the intruder's behavior. As the adjustment mechanism, for example, a mechanism that adjusts the pan angle and the tilt angle using a turntable or similar mechanism supporting the imaging means is assumed.

  The automatic tracking control device of the present invention is also an automatic tracking control device that tracks a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes the range captured by the imaging means by adjusting the shooting direction. It comprises object region tracking means that detects the position of a person in the image based on images input from the imaging means; and camera control means that detects the attention direction of the person in the image based on the images input from the imaging means, and controls the adjustment mechanism to reflect the detected attention direction so that the position of the person in the image is controlled such that the shooting area ahead of the attention direction is at least larger than that behind the person's attention direction.

  With the above configuration, the shooting range is automatically controlled so that the shooting area ahead of the tracked person's attention direction becomes larger than the area behind it, so when an observer monitors the captured image, a more useful video for monitoring an intruder or the like is obtained. In other words, it becomes possible to quickly recognize the object the intruder is aiming for, which is useful for recognizing the intruder's behavior and intentions.

  The automatic tracking control device of the present invention is also an automatic tracking control device that tracks a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes the range captured by the imaging means by adjusting the zoom magnification. It comprises object region tracking means that detects the target in each of a plurality of images sequentially input from the imaging means in time series and holds, as tracking information, the position information of the region in which the target is detected across a plurality of image frames in association with its change over time; and camera control means that acquires at least one of the acceleration, moving speed, and on-screen position of the target as a measurement value based on the tracking information detected by the object region tracking means, and controls the adjustment mechanism in accordance with the measurement value to change the zoom magnification.

  With the above configuration, since the zoom magnification is changed based on the detected acceleration, moving speed, or on-screen position, the tracked target can be followed automatically so that it does not leave the angle of view. For example, if the target is likely to move out of the angle of view because of changes in its moving speed or moving direction, or because of processing or control delays, changing the zoom magnification toward the wide-angle side keeps the target within the angle of view.

  Further, in the automatic tracking control device described above, when the camera control means recognizes that the predicted position of the target, predicted based on the measurement value, lies in the peripheral portion of the image or protrudes from the image, it controls the adjustment mechanism to change the zoom magnification toward the wide-angle side.

  With the above configuration, for example when the movement of the tracked target changes suddenly, it can be predicted from the target's acceleration or moving speed that the target will protrude from the angle of view, and by changing the zoom magnification toward the wide-angle side the target can be kept within the angle of view.

  Further, in the automatic tracking control device described above, when the camera control means detects, based on the measurement value, that the moving speed or moving acceleration of the target has decreased, it controls the adjustment mechanism to change the zoom magnification toward the telephoto side.

  With the above configuration, when the moving speed of the target becomes slow, the camera automatically zooms in on the target, so the target can be observed in more detail. Moreover, when the target moves slowly, the influence of processing and control delays is small even if the zoom magnification is changed toward the telephoto side, so the target does not protrude from the angle of view.

  Further, in the automatic tracking control device described above, the camera control means identifies, based on the measurement value, whether the position of the target on the image lies in the peripheral portion of the image, and when it recognizes that the position of the target on the image is in the peripheral portion, it controls the adjustment mechanism to change the zoom magnification toward the wide-angle side.

  With the above configuration, for example when the movement of the tracked target changes suddenly, the detected position of the target on the screen approaches the periphery of the image, from which it can be recognized that the target may protrude from the image; the target can then be kept within the angle of view by changing the zoom magnification toward the wide-angle side.

  Further, the automatic tracking control method of the present invention is an automatic tracking control method for tracking a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes the range captured by the imaging means by adjusting the shooting direction. It comprises an object area tracking step of detecting the target in each of a plurality of images sequentially input from the imaging means in time series and holding, as tracking information, the position information of the region in which the target is detected across a plurality of image frames in association with its change over time; and a camera control step of detecting the moving direction of the target based on the past tracking information detected and held in the object area tracking step, and controlling the adjustment mechanism to reflect the detected moving direction so that the position of the target in the image is controlled such that the photographing area ahead of the moving direction is at least larger than that behind the moving direction of the target.

  According to the above procedure, even if processing or control delays occur, an observer monitoring the captured image obtains video that is more useful for monitoring an intruder or the like.

  Further, the automatic tracking control method of the present invention is an automatic tracking control method for tracking a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes the range captured by the imaging means by adjusting the shooting direction. It comprises an object region tracking step of detecting the position of a person in the image based on images input from the imaging means; and a camera control step of detecting the attention direction of the person in the image based on the images input from the imaging means, and controlling the adjustment mechanism to reflect the detected attention direction so that the position of the person in the image is controlled such that the shooting area ahead of the attention direction is at least larger than that behind the person's attention direction.

  According to the above procedure, even if processing or control delays occur, an observer monitoring the captured image obtains video that is more useful for monitoring an intruder or the like.

  Further, the automatic tracking control method of the present invention is an automatic tracking control method for tracking a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes the range captured by the imaging means by adjusting the zoom magnification. It comprises an object region tracking step of detecting the target in each of a plurality of images sequentially input from the imaging means in time series and holding, as tracking information, the position information of the region in which the target is detected across a plurality of image frames in association with its change over time; and a camera control step of acquiring at least one of the acceleration, moving speed, and on-screen position of the target as a measurement value based on the tracking information detected in the object region tracking step, and controlling the adjustment mechanism in accordance with the measurement value to change the zoom magnification.

  According to the above procedure, by changing the zoom magnification based on the detected acceleration, moving speed, or on-screen position, the tracked target can be followed automatically so that it does not leave the angle of view.

  In addition, the present invention provides a program for tracking a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes the range captured by the imaging means by adjusting the shooting direction or the zoom magnification, the program causing a computer to execute any one of the procedures described above.

  By executing the above program on a computer, even if processing or control delay occurs, it is possible to automatically photograph an area more suitable for monitoring an intruder or the like.

  The present invention also provides an automatic tracking system including any of the automatic tracking control devices described above, an imaging means that captures an image including a target, a turntable mechanism capable of controlling the shooting direction of the imaging means, and a lens mechanism capable of controlling the zoom magnification at which the imaging means shoots.

  With the above configuration, even when processing or control delay occurs, it is possible to automatically capture an area suitable for monitoring an intruder or the like.

  According to the present invention, it is possible to provide an automatic tracking control device and an automatic tracking control method capable of automatically capturing an area suitable for monitoring an intruder or the like even when processing or control delay occurs.

  The automatic tracking control device of this embodiment is applied to, for example, an automatic tracking camera or an automatic tracking device in a remote monitoring system. In the present embodiment, a specific target is automatically tracked using image information obtained by continuously or periodically photographing a subject, including an intruder or the like, with a camera unit such as a television camera serving as the imaging means. The camera unit used in this embodiment has, as adjustment mechanisms for tracking a specific target, a pan/tilt adjustment mechanism that adjusts the shooting direction and a zoom mechanism that adjusts the shooting range (the angle of view, by optical or digital signal processing).

(First embodiment)
FIG. 1 is a block diagram showing the configuration of an automatic tracking system according to an embodiment of the present invention. FIG. 2 is a diagram showing an object region in an image frame to be tracked in the present embodiment, FIG. 3 is a diagram showing a specific example of a correlation map generated by the object region tracking unit in the present embodiment, FIG. 4 is a schematic diagram illustrating the operation of the camera control unit in the present embodiment, and FIG. 5 is a diagram illustrating a specific example of an image frame processed in the first embodiment.

  The automatic tracking system 110 of the present embodiment includes a camera 106 and an automatic tracking device 100 for photographing a subject such as an intruder. An image display device 105 is connected to the output of the automatic tracking device 100.

  The camera 106 is supported at a predetermined position, with its orientation adjustable via the turntable 107 so that the shooting direction can be changed, and also includes a zoom lens 109. The PTZ control unit 108 provided on the turntable 107 electrically controls the drive mechanism of the turntable 107 and the zoom lens 109, thereby panning, tilting, and zooming the camera 106 (hereinafter, these operations are collectively referred to as PTZ adjustment). The drive mechanism is configured by an electric motor (not shown).

  The camera 106 sequentially outputs, as a video signal, information of a two-dimensional image frame obtained by photographing a subject continuously or periodically like, for example, a general television camera or an industrial television camera. The image display device 105 is a device capable of displaying a two-dimensional image, such as a monitor television, and inputs and displays an image signal captured by the camera 106 via the automatic tracking device 100.

  The automatic tracking device 100 includes an image input unit 101, an object area tracking unit 102, a camera control unit 103, and an image output unit 104. The image input unit 101 periodically inputs images taken by the camera 106.

  The object region tracking unit 102 processes the information of the image input by the image input unit 101 and specifies the position of the tracking target object on the image. Specifically, the position of the tracking target is specified by searching for an area having a high correlation with the registered image (template image) in the input image. More detailed operation will be described later.

  Based on the position of the tracking target detected by the object region tracking unit 102, the camera control unit 103 controls the camera so that the tracking target does not leave the angle of view of the camera 106 and, moreover, is displayed on the screen at a position suitable for monitoring. To this end, it outputs a control signal for controlling the pan, tilt, and zoom of the camera 106 to the PTZ control unit 108. The image output unit 104 outputs the image signal input from the camera 106 to the external image display device 105.

  Next, a specific operation of the object region tracking unit 102 will be described with reference to FIG. 2. For each frame input by the image input unit 101, the object region tracking unit 102 searches the input image for the region having the highest correlation with the template image T. This region is the corresponding region 201. The template image T is registered image information serving as a reference for the object (for example, a person) to be tracked by the camera 106; for example, as shown in FIG. 2, image information containing the main part of the object extracted from the frame immediately before the input image currently being processed can be used.

  By repeating the search for the corresponding region 201 at a cycle of, for example, several tens to several hundreds of milliseconds, time-series position information over a plurality of frames relating to the position of the corresponding region 201 is obtained. For example, when the object moves while the orientation and zoom magnification of the camera 106 are fixed, time-series position information representing the movement locus of the object is obtained as time-series information representing the change in position of the corresponding region 201.

  Accordingly, by using a plurality of time-series position information obtained by the processing of the object region tracking unit 102, it is possible to track the moving object by pan / tilt control or zoom control of the camera 106. If there are a plurality of template images T, the correlation with the input image is examined for each of all target template images.

  In the present embodiment, taking as a reference the position of the region that had the highest correlation with the template image T in the processing of the immediately preceding image frame, the object region tracking unit 102 assigns, in the input image I of the latest frame, a rectangular image range extending several tens of pixels in each of the horizontal and vertical directions as the search area 202. The correlation with the template image T is then obtained sequentially while scanning the search area 202.

  It is possible to make the search area 202 the entire input image. However, because the amount of computation would become too large and mis-correspondence would be more likely, it is desirable to limit the search range in advance and examine the correlation only within it. Depending on the movement of the object, a width of roughly several to several tens of pixels in each of the horizontal and vertical directions is appropriate for the search area 202.
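
  As a concrete illustration of the restricted-area search described above, the following is a minimal sketch using OpenCV's normalized correlation matcher; the function name, the margin value, and the data layout are illustrative assumptions and are not taken from the patent.

```python
import cv2

def find_corresponding_region(frame, template, prev_pos, margin=30):
    """Search the neighbourhood of the previous position for the region that
    best matches the template (hypothetical helper; names are assumptions).

    frame    : current grayscale frame (2D numpy array)
    template : registered template image T (2D numpy array)
    prev_pos : (x, y) top-left corner of the corresponding region in the
               previous frame
    margin   : half-width of the search area in pixels (a few tens of pixels,
               as suggested in the text)
    """
    h, w = template.shape
    # Clamp the search window to the frame while keeping it at least as
    # large as the template.
    x0 = min(max(prev_pos[0] - margin, 0), frame.shape[1] - w)
    y0 = min(max(prev_pos[1] - margin, 0), frame.shape[0] - h)
    x1 = min(prev_pos[0] + margin + w, frame.shape[1])
    y1 = min(prev_pos[1] + margin + h, frame.shape[0])
    search = frame[y0:y1, x0:x1]

    # Normalized cross-correlation over the restricted search area only,
    # which keeps the computation small and reduces mis-correspondence.
    corr_map = cv2.matchTemplate(search, template, cv2.TM_CCORR_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(corr_map)

    # Convert the best match back to full-frame coordinates.
    best_pos = (x0 + max_loc[0], y0 + max_loc[1])
    return best_pos, max_val, corr_map
```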

  When the object region tracking unit 102 examines the correlation of the search region 202, for example, normalized correlation can be used as an index of the correlation value. Processing when using normalized correlation will be described below.

  When the size of the template image T, that is, its number of pixels, is denoted (K(H) × L(V)), the normalized correlation value NRML(xv, yv) between the template image T(p, q), where (p, q) is a pixel position, and a rectangular region (of K(H) × L(V) pixels) in the current frame image I(x, y) can be expressed by the following equation (1).
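
  The equation itself is not reproduced in this text. The expression below is the standard normalized cross-correlation form consistent with the surrounding description (values approach 1.0 for high correlation and 0 for none), offered only as a plausible reconstruction of equation (1), not as the patent's exact formula.

```latex
\mathrm{NRML}(x_v, y_v) =
\frac{\displaystyle\sum_{p=0}^{K-1}\sum_{q=0}^{L-1} T(p,q)\, I(x_v+p,\, y_v+q)}
     {\sqrt{\displaystyle\sum_{p=0}^{K-1}\sum_{q=0}^{L-1} T(p,q)^2}\;
      \sqrt{\displaystyle\sum_{p=0}^{K-1}\sum_{q=0}^{L-1} I(x_v+p,\, y_v+q)^2}}
\tag{1}
```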

  The normalized correlation value NRML approaches 1.0 as the correlation between the two compared images becomes higher, and is 0 when there is no correlation. A two-dimensional array representing the correlation value at each position in the search area is called a correlation map. The correlation map created by the processing of the object region tracking unit 102 is used later to determine the position of the template image T; at this stage, the movement destination of the template has not yet been determined.

  Here, an example has been described in which the correlation value between the input image and the template image is obtained by normalized correlation, but other methods requiring less computation, such as a difference method or a sum-of-absolute-differences method, can also be used. In the present embodiment it is assumed that the luminance values of the image are used, but in the actual correlation calculation, information such as color and edges may be used as necessary. The template size is generally chosen so as to include the entire object region, but only a part of the object region, such as the area around the head, may be used as the template image.

  When automatic tracking of an object such as a person by turntable control or zoom control of the camera 106 is started, no appropriate template image T exists yet, so a template image T to be used first needs to be created as an initial template. For example, the following methods are conceivable for creating the initial template.

  1. The position on the image displayed on the screen of the image display device 105 is designated manually by the operator using a pointing device such as a mouse, and an initial template is created based on this position.

  2. The input image data is processed to recognize a region having a high probability that a desired object exists, and an initial template is automatically created based on the image of this region. As a region where the probability that a desired object exists is high, for example, a region where a motion occurs, that is, a region where a difference occurs between frames of an image can be considered.

  3. One or more pieces of standard template information relating to the object to be tracked are registered in advance in a predetermined storage device, and when tracking of the object is started, the image of the region in the latest frame that is most similar to the pre-registered information is automatically made into the initial template.
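
  As an illustration of method 2 above (an initial template taken from a region where an inter-frame difference occurs), the following is a minimal sketch under our own assumptions; the threshold, the minimum area, and the function name are illustrative and not taken from the patent.

```python
import cv2

def make_initial_template(prev_frame, cur_frame, min_area=200):
    """Create an initial template from inter-frame motion (method 2 above).
    A sketch with assumed thresholds, not the patent's procedure."""
    # A region where a difference occurs between frames is likely to contain
    # the moving object of interest.
    diff = cv2.absdiff(cur_frame, prev_frame)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.medianBlur(mask, 5)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < min_area:
        return None
    x, y, w, h = cv2.boundingRect(largest)
    return cur_frame[y:y + h, x:x + w]   # registered as template image T
```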

  Any other method may also be used to create the template. Further, in this example it is assumed that a plurality of template images with different acquisition and update times are held for the same object, but a single template may be used instead. When the correlation between the templates and the input image becomes low, it is desirable to add new templates up to the maximum number that can be stored, and once the maximum is reached, to replace the oldest data with the new data.

  FIG. 3 shows a specific example of the correlation map created by the processing of the object region tracking unit 102. The object region tracking unit 102 recognizes the position on the image corresponding to the peak position having the highest correlation in such a correlation map as the movement destination of the template image. In FIG. 3, since the correlation value is shown upside down, the downward convex portion represents the peak of the correlation value.

  In this way, the object region tracking unit 102 recognizes the position of the object on the image for each of the sequentially input image frames, so that time-series position information on the object's position on the image, that is, the movement locus of the object, is obtained. Further, when photographing is repeated at a constant period, the difference in shooting time between image frames is known, so the past moving speed of the object and its acceleration during movement can also be obtained from the movement locus.
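
  The following sketch shows one way to derive the moving speed and acceleration from the held time-series positions, assuming a fixed shooting period between frames as described above; the function name and data layout are our own assumptions.

```python
def motion_from_trajectory(positions, dt):
    """Estimate velocity and acceleration from the time-series positions of
    the corresponding region (a sketch; assumes a fixed period dt between
    frames).

    positions : list of (x, y) object positions in pixels, newest last
    dt        : time between successive frames in seconds
    """
    if len(positions) < 3:
        return None, None
    (x0, y0), (x1, y1), (x2, y2) = positions[-3], positions[-2], positions[-1]
    v_prev = ((x1 - x0) / dt, (y1 - y0) / dt)   # velocity one frame ago
    v_cur = ((x2 - x1) / dt, (y2 - y1) / dt)    # latest velocity
    accel = ((v_cur[0] - v_prev[0]) / dt, (v_cur[1] - v_prev[1]) / dt)
    return v_cur, accel
```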

  The camera control unit 103 grasps the moving direction of the object from the time-series movement information obtained by the processing of the object region tracking unit 102, and performs turntable control and zoom control to track the object in accordance with that moving direction. Furthermore, instead of tracking so that the object is displayed at the center of the screen, turntable control of the camera 106 is performed so that the area ahead of the object in its moving direction is photographed more widely than the area behind it.

  When performing automatic tracking control so that the object is photographed at the center of the screen, as in conventional control, the camera control unit 103 determines the pan and tilt angle control speeds according to the following equations (2) and (3), so that the horizontal and vertical deviations from the screen center position are fed back to the shooting direction of the camera. That is, when the deviation is large, the pan/tilt angle of the camera is controlled at a high angular velocity, and when the deviation is small, the camera is controlled slowly (see FIG. 4A). The pan/tilt angle control speed is almost stepless, and once commanded, the pan/tilt mechanism operates at that angular velocity until it is stopped or another control speed signal is given.

Pan control speed CCNTP = SPDTBLp(x - IMGWIDTH/2)   (2)
Tilt control speed CCNTT = SPDTBLt(y - IMGHEIGHT/2)   (3)
where
x, y: horizontal and vertical positions of the object on the image
IMGWIDTH/2: half the width of the input image, i.e., the horizontal center position of the image
IMGHEIGHT/2: half the height of the input image, i.e., the vertical center position of the image
SPDTBLp, SPDTBLt: speed tables used to determine the pan and tilt control speeds according to the position of the object (see FIG. 4B)

  To estimate the current moving direction of the object, it is conceivable to hold the camera control amounts for a certain past period obtained while tracking the object by turntable control of the camera 106, and to use the median value of those control amounts as the moving direction.

  Assume that the camera control amount at time t is Ccmdt(p, t, z), where p and t denote the control angular velocities of the pan angle and tilt angle, respectively, and z is the control speed of the zoom magnification. The moving direction can be estimated from a representative value, such as the median, of each control angular velocity over a period TA extending back into the past.

  When control amounts covering a period TA sufficient to estimate the moving direction cannot yet be obtained, as at the start of tracking, the direction of the camera is controlled so that the tracking target object 501 is photographed at the center of the screen as shown in FIG. 5, as described above; once control amounts for the period TA have been accumulated, the moving direction can be determined and the result reflected in the turntable control of the camera.
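
  A minimal sketch of the direction estimation just described, using the median of the pan/tilt control amounts held for the period TA; the data layout and names are assumptions, not part of the patent.

```python
import statistics

def estimate_moving_direction(control_history, period_ta):
    """Estimate the current moving direction from the pan/tilt control
    amounts held for the past period TA (a sketch; layout is assumed).

    control_history : list of (pan_speed, tilt_speed, zoom_speed) tuples,
                      newest last, one entry per control cycle
    period_ta       : number of recent entries that make up period TA
    """
    if len(control_history) < period_ta:
        return None   # not enough history yet: keep centering the target
    recent = control_history[-period_ta:]
    pan_med = statistics.median(c[0] for c in recent)
    tilt_med = statistics.median(c[1] for c in recent)
    return pan_med, tilt_med   # sign gives the direction, magnitude the rate
```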

  Once the moving direction can be determined, the camera control unit 103 can perform, for example, the following control. When the tracking target object 501 is moving toward the lower right of the image, as shown in FIG. 5, the direction of the camera is controlled so that the object is positioned toward the upper left of the image. As a result, the lower right of the image, that is, the area ahead of the object in its direction of travel, can be viewed widely on the image.

  In the case of performing such control, for example, control may be performed using the following formulas (4) and (5) obtained by modifying the above formulas (2) and (3).

Pan control speed CCNTP = SPDTBLp(x - IMGWIDTH/2 - cntx)   (4)
Tilt control speed CCNTT = SPDTBLt(y - IMGHEIGHT/2 - cnty)   (5)
where cntx, cnty are the horizontal and vertical shift amounts from the center of the screen. The shift amounts cntx and cnty may be predetermined constants, or variables that can be increased or decreased by an input operation of the observer.
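
  The following sketch combines equations (2) to (5): with cntx = cnty = 0 it reproduces the conventional centering control of (2) and (3), and nonzero shift amounts give the offset placement of (4) and (5). The speed tables SPDTBLp and SPDTBLt are modelled here as simple dead-band proportional functions, which is our own assumption; the patent only states that they map a positional deviation to a control speed (see FIG. 4B).

```python
def pan_tilt_control_speed(x, y, img_width, img_height,
                           spd_table_p, spd_table_t, cntx=0, cnty=0):
    """Compute pan/tilt control speeds per equations (2)-(5).
    A sketch: the speed tables are modelled as plain functions mapping a
    pixel deviation to an angular speed."""
    dev_x = x - img_width / 2 - cntx
    dev_y = y - img_height / 2 - cnty
    ccntp = spd_table_p(dev_x)   # pan control speed
    ccntt = spd_table_t(dev_y)   # tilt control speed
    return ccntp, ccntt

def make_speed_table(gain=0.2, dead_band=10, max_speed=30.0):
    """One plausible shape for SPDTBLp / SPDTBLt: proportional control with
    a dead band and a cap (values are illustrative assumptions)."""
    def table(deviation):
        if abs(deviation) < dead_band:
            return 0.0
        speed = gain * deviation
        return max(-max_speed, min(max_speed, speed))
    return table
```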

  In the first embodiment, by performing the control described above, an object such as an intruder is not only tracked by camera turntable control; the traveling direction of the tracking target is also reflected in the control, and the direction of the camera is controlled so that the area ahead in the traveling direction is photographed widely. Thereby, when the observer judges, for example, where the intruder is heading and what the intruder is trying to do, information useful for a faster judgment can be provided.

  Each component of the automatic tracking device 100 shown in FIG. 1 can be configured by dedicated hardware (electric circuits), or each function can be realized by executing a predetermined program on a computer such as a microprocessor. When configuring the automatic tracking system 110 shown in FIG. 1, a plurality of cameras 106 and a plurality of image display devices 105 can be connected to a single automatic tracking device 100 as necessary.

(Second Embodiment)
FIG. 6 is a diagram illustrating a specific example of an image frame processed in the second embodiment. The second embodiment is a modification of the first embodiment, and is the same as the first embodiment except for the operation of the camera control unit 103 described above. Only the changed operation will be described below.

  In the second embodiment, the tracking target is assumed to be limited to a person. The camera control unit 103 detects the direction in which the person being tracked by the turntable control of the camera 106 is paying attention, and controls the pan and tilt angles and the zoom magnification so that the area ahead of the attention direction is photographed more widely than the area behind it.

  The direction in which the person is paying attention can be detected by eye gaze detection. That is, processing as described below is performed.

  First, a skin-colored area with a shape close to a circle is extracted from the image as the face. Next, the center of gravity of the extracted skin-color area is calculated. In addition, an image containing features such as the eyes and the corners of the mouth is prepared in advance as a template image, the matching between the skin-color region in the image and this template image is examined, and the exact positions of the eyes and mouth are detected based on the result.

  The camera control unit 103 then calculates which direction the face is facing from the detected positional relationship among the eyes and mouth of the person 601 and the camera, and detects this direction as the line of sight shown in FIG. 6A. In other words, the direction of the face is taken as the direction of attention.
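
  A rough sketch of the attention-direction detection described above, using OpenCV: a roughly circular skin-coloured region is extracted as the face, the eyes and mouth are located by template matching, and the face orientation is inferred from their layout relative to the face centre. The HSV skin range, thresholds, and the left/right decision rule are illustrative assumptions rather than values from the patent.

```python
import cv2

def estimate_attention_direction(frame_bgr, eye_template, mouth_template):
    """Infer a coarse face direction from the face region and the positions
    of the eyes and mouth (a sketch under our own assumptions)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))   # assumed skin range
    contours, _ = cv2.findContours(skin, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    face = max(contours, key=cv2.contourArea)
    m = cv2.moments(face)
    if m["m00"] == 0:
        return None
    face_cx = m["m10"] / m["m00"]            # centre of gravity of the face

    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Locate eyes and mouth by correlating with prepared feature templates.
    _, _, _, eye_loc = cv2.minMaxLoc(
        cv2.matchTemplate(gray, eye_template, cv2.TM_CCORR_NORMED))
    _, _, _, mouth_loc = cv2.minMaxLoc(
        cv2.matchTemplate(gray, mouth_template, cv2.TM_CCORR_NORMED))
    feature_cx = (eye_loc[0] + mouth_loc[0]) / 2

    # If the facial features sit left of the face centre, the face is taken
    # to be turned to the left, and vice versa (simplified decision rule).
    return "left" if feature_cx < face_cx else "right"
```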

  Then, for example, the person 601 is arranged on the image as shown in FIG. 6B. In other words, the camera control unit 103 controls the pan and tilt angles and the zoom magnification of the camera so that the area ahead of the direction in which the person 601 is paying attention is photographed more widely than the area behind it. The actual control may be performed by replacing the moving direction in the first embodiment with the attention direction detected in this embodiment.

  In the second embodiment, performing the control described above provides information useful for the observer to quickly judge where the tracked person is heading and what the person is trying to do. In other words, since the object that is the goal of the person's action should lie ahead of the person's line of sight, the observer can recognize that object on the screen before the intruder acts, and can thus predict the intruder's behavior.

(Third embodiment)
The third embodiment is a modification of the first embodiment, and is the same as the first embodiment except for the operation of the camera control unit 103 described above. Only the changed operation will be described below.

  In the third embodiment, the camera control unit 103 first calculates the moving speed or acceleration of the object based on the past time-series movement information (movement trajectory) of the tracking target detected by the object region tracking unit 102. The camera control unit 103 then performs zoom magnification control, in addition to pan/tilt angle control, for the camera 106 according to the detected moving speed or acceleration of the object. The pan and tilt angle control is the same as in the previous embodiments.

  For example, when the camera 106 is stationary, the moving speed can be obtained by dividing the moving amount of the object in units of pixels on the image by the required time (difference in time when each image frame was shot). The acceleration can be obtained as a differential value of the moving speed.

  On the other hand, when the camera 106 is controlling (tracking) the pan and tilt angles in accordance with the movement of the object, it is difficult to determine the moving speed accurately, but the acceleration can still be calculated. For example, if the camera is controlled so as to capture the tracking target object 701, moving at a constant speed, at the center of the image, then the displacement of the object position from the image center when the tracking target object 701 starts to accelerate or decelerate can be regarded as equivalent to an acceleration parameter.

  Therefore, the camera control unit 103 can predict the position of the tracking target object 701 in the image at a future time, using the detected speed or acceleration when the camera is stationary, and the detected acceleration while pan/tilt angle control is operating. The camera control unit 103 then determines whether the position of the tracking target object 701 will be outside the angle of view of the camera, or whether it will be in the peripheral portion of the image.

  If a sudden change in moving speed occurs from which it can be predicted that the tracking target object 701 will leave the angle of view of the camera 106 or end up at the periphery of the image, it is difficult to keep the tracking target object 701 within the angle of view by pan and tilt angle control alone.

  Therefore, in the third embodiment, when the predicted on-screen position of the tracking target object 701, predicted from its moving speed or acceleration, is out of the screen or located at the periphery of the screen, the camera control unit 103 automatically adjusts the zoom magnification of the camera 106 toward the wide-angle side from its present value. As a result, the apparent movement of the tracking target object 701, that is, its moving speed on the screen, is reduced, so the influence of control and processing delays can be reduced and the display position of the tracking target object 701 can be maintained near the center of the screen by pan and tilt angle control.
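
  A minimal sketch of the zoom decision of the third embodiment: the future position is predicted with a simple constant-acceleration model from the measured speed and acceleration, and the zoom is moved toward the wide-angle side when that prediction falls in the peripheral band or outside the image. The names, the prediction horizon, and the margin are assumptions.

```python
def zoom_command_from_prediction(pos, vel, acc, horizon,
                                 img_width, img_height, margin=0.1):
    """Predict where the tracking target will be `horizon` seconds ahead
    and decide whether to move the zoom toward the wide-angle side.

    pos, vel, acc : (x, y) tuples for position [px], velocity [px/s] and
                    acceleration [px/s^2] measured from the trajectory
    margin        : fraction of the image treated as the peripheral band
    """
    # Constant-acceleration prediction of the on-screen position.
    px = pos[0] + vel[0] * horizon + 0.5 * acc[0] * horizon ** 2
    py = pos[1] + vel[1] * horizon + 0.5 * acc[1] * horizon ** 2

    x_lo, x_hi = img_width * margin, img_width * (1 - margin)
    y_lo, y_hi = img_height * margin, img_height * (1 - margin)
    in_periphery_or_out = not (x_lo <= px <= x_hi and y_lo <= py <= y_hi)

    # Wide-angle when the target is predicted to reach the periphery or to
    # leave the image; otherwise keep the current zoom.
    return "zoom_wide" if in_periphery_or_out else "hold"
```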

  That is, according to the third embodiment, even when the tracking target object moves rapidly or changes its moving direction and is therefore likely to leave the screen, the movement of the object can still be followed by automatically adjusting the zoom magnification toward the wide-angle side, so the object is prevented in advance from leaving the screen and high tracking performance is obtained.

(Fourth embodiment)
FIG. 7 is a diagram showing a specific example of an image frame processed in the fourth embodiment. The fourth embodiment is a modification of the first embodiment, and is the same as the first embodiment except for the operation of the camera control unit 103 described above. Only the changed operation will be described below.

  In the fourth embodiment, the camera control unit 103 determines the on-screen position of the tracking target object detected by the object region tracking unit 102, and carries out zoom magnification control together with pan/tilt angle control according to the result. The pan and tilt angle control is the same as in the embodiments described above.

  For example, as shown in FIG. 7, when the on-screen position of the tracking target object is in the peripheral portion of the image, it can be considered that a sudden acceleration or deceleration of the object has caused a delay in camera turntable control (object tracking). In such a case, the camera control unit 103 automatically changes the zoom magnification toward the wide-angle side, while the pan and tilt angles continue to be controlled at the same time. That is, by comparing the position of the detected object with the coordinates of the peripheral area of the image, it identifies whether there is a control delay, and controls the zoom magnification according to the result.

  On the other hand, when the camera control unit 103 recognizes that the on-screen position of the tracking target object detected by the object region tracking unit 102 has been captured stably near the center of the image (or near the control target position) for a predetermined time, it automatically changes the zoom magnification toward the telephoto side. As a result, the tracked object can be enlarged and observed in more detail.
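
  A minimal sketch of the zoom control of the fourth embodiment: zoom toward the wide-angle side when the detected position lies in the peripheral band of the image, and toward the telephoto side when the position has stayed near the image centre (or the control target position) for a while. All thresholds and names are illustrative assumptions.

```python
def zoom_command_from_position(history, img_width, img_height,
                               margin=0.1, stable_frames=30,
                               center_tolerance=0.15):
    """Decide the zoom direction from the recent detected positions.

    history : list of recent (x, y) detected positions, newest last
    """
    x, y = history[-1]
    x_lo, x_hi = img_width * margin, img_width * (1 - margin)
    y_lo, y_hi = img_height * margin, img_height * (1 - margin)
    if not (x_lo <= x <= x_hi and y_lo <= y <= y_hi):
        return "zoom_wide"        # control is likely lagging: widen the view

    cx, cy = img_width / 2, img_height / 2
    tol_x = img_width * center_tolerance
    tol_y = img_height * center_tolerance
    recent = history[-stable_frames:]
    stable = (len(recent) == stable_frames and
              all(abs(px - cx) <= tol_x and abs(py - cy) <= tol_y
                  for px, py in recent))
    return "zoom_tele" if stable else "hold"
```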

  According to the fourth embodiment, even when the tracking target object moves rapidly or changes its moving direction and is likely to leave the screen, its movement can be followed by automatically adjusting the zoom magnification toward the wide-angle side. Also, when the object can be tracked stably, such as when it is stationary or moving relatively slowly, it can be automatically enlarged so that its details are captured. High tracking performance is thereby obtained.

  Incidentally, as described above, each component of the automatic tracking device 100 shown in FIG. 1 can also be realized by executing a predetermined program on a predetermined computer. In addition, the program can be read from a predetermined recording medium and executed, or can be downloaded and executed via a communication network such as the Internet.

  When the image input unit 101, the object region tracking unit 102, the camera control unit 103, and the image output unit 104 shown in FIG. 1 are realized by a single computer, a common method is to execute the processing corresponding to each unit sequentially.

  When such processing is performed, each process may be carried out according to the procedure shown in FIG. 8. FIG. 8 is a flowchart showing the processing of the automatic tracking system according to the embodiment of the present invention. This processing consists of an image input step 801 for inputting an image captured by the imaging device, an object region tracking step 802 for searching the input image for a region having a high correlation with a registered template image, a camera control step 803 for causing the imaging device to perform a PTZ operation so that the image of the tracked object is displayed at the center of the image, and an image output step 804 for outputting the image to an external image display device. Steps 801 to 804 correspond to the components 101 to 104 in FIG. 1, respectively, and the content of each process is the same as in the embodiments described above.
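
  The per-frame loop below sketches how steps 801 to 804 might be executed sequentially on a single computer, reusing the helper sketches shown earlier in this description; `camera`, `ptz`, and `display` are hypothetical interfaces, not part of the patent.

```python
def automatic_tracking_loop(camera, ptz, display, template):
    """Sequential execution of steps 801-804 (a sketch with assumed
    interfaces and the helpers defined in the earlier sketches)."""
    frame = camera.read()
    prev_pos = (frame.shape[1] // 2, frame.shape[0] // 2)  # initial guess
    spd_p, spd_t = make_speed_table(), make_speed_table()
    while True:
        frame = camera.read()                     # 801: image input step
        pos, _, _ = find_corresponding_region(    # 802: object region tracking step
            frame, template, prev_pos)
        pan, tilt = pan_tilt_control_speed(       # 803: camera control step
            pos[0], pos[1], frame.shape[1], frame.shape[0], spd_p, spd_t)
        ptz.set_speed(pan, tilt)
        display.show(frame)                       # 804: image output step
        prev_pos = pos
```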

  Even when the processing shown in FIG. 8 is performed, the camera control step 803 detects the traveling direction of the tracking target object and controls the shooting direction of the camera so that the area ahead in the traveling direction is photographed widely, so it can provide information useful for the observer to determine more quickly where the object is heading and what it is trying to do.

  In addition, in the camera control step 803, the direction of the line of sight or face of the person being tracked is detected and the shooting direction is controlled so that the area ahead of that direction is photographed widely, which can provide information that helps the observer determine earlier where the tracked person is heading and what the person is trying to do.

  In the camera control step 803, the moving speed and acceleration of the tracking target object are calculated, and when the object is predicted to leave the angle of view or to be photographed at the periphery of the image, not only the shooting direction of the camera but also the zoom magnification is controlled, so the camera can be controlled so that the tracked object does not leave the angle of view.

  Similarly, in the camera control step 803, when it is detected that the tracking target object is captured at the periphery of the image, controlling not only the shooting direction of the camera but also the zoom magnification makes it possible to control the camera so that the tracking target object does not leave the angle of view.

  As described above, according to the present embodiment, even when processing and control are delayed because of the movement of the tracking target, control is performed automatically so that the region ahead in the moving direction is imaged more widely, and automatic tracking can therefore be carried out so that a target such as an intruder does not go off the screen. For this reason, when an observer monitors the object while viewing the image, image information useful for faster judgment and recognition can be provided. It is also possible to photograph the scene so that the area ahead of the intruder's moving direction and the area to which the intruder is paying attention, which are important for monitoring, can be seen clearly.

  In addition, by detecting the direction in which the tracked person is paying attention, the area ahead of that direction can be photographed more widely. Furthermore, based on measurements of the moving speed and acceleration of the tracking target object, it can be predicted that the object will leave the angle of view of the camera or be photographed at the periphery of the image, and by controlling the zoom magnification toward the wide-angle side in such cases the object can be prevented from leaving the angle of view. Likewise, when the position of the tracking target object on the image is at the periphery and camera turntable control is considered to be delayed, the zoom magnification is controlled toward the wide-angle side; this makes pan and tilt control easier and prevents the object from leaving the angle of view.

  The present invention has the effect of automatically capturing a region suitable for monitoring an intruder or the like even when processing or control delays occur, and is useful for an automatic tracking control device and an automatic tracking control method that track a specific target by controlling an adjustment mechanism, using an imaging unit that captures an image including the target and an adjustment mechanism that changes the range captured by the imaging unit through zoom adjustment or orientation adjustment.

FIG. 1 is a block diagram showing the configuration of the automatic tracking system according to an embodiment of the present invention.
FIG. 2 is a diagram showing an object region in an image frame tracked in the present embodiment.
FIG. 3 is a diagram showing a specific example of a correlation map generated by the object region tracking unit in the present embodiment.
FIG. 4 is a schematic diagram showing the operation of the camera control unit in the present embodiment.
FIG. 5 is a diagram showing a specific example of an image frame processed in the first embodiment.
FIG. 6 is a diagram showing a specific example of an image frame processed in the second embodiment.
FIG. 7 is a diagram showing a specific example of an image frame processed in the fourth embodiment.
FIG. 8 is a flowchart showing the processing of the automatic tracking system according to the embodiment of the present invention.
FIG. 9 is a schematic diagram showing regions in prior-art camera control.

Explanation of symbols

100 Automatic tracking apparatus
101 Image input unit
102 Object area tracking unit
103 Camera control unit
104 Image output unit
105 Image display apparatus
106 Camera
107 Turntable
108 PTZ control unit
109 Zoom lens
110 Automatic tracking system
201 Corresponding area
202 Search area
801 Image input step
802 Object region tracking step
803 Camera control step
804 Image output step

Claims (11)

  1. An automatic tracking control device for tracking a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes a range captured by the imaging means by adjusting a shooting direction, the device comprising:
    object region tracking means that detects the target in each of a plurality of images sequentially input from the imaging means in time series, and holds, as tracking information, position information of a plurality of image frames relating to the region in which the target is detected, in association with its change over time; and
    camera control means that detects the moving direction of the target based on past tracking information detected and held by the object region tracking means, and controls the adjustment mechanism to reflect the detected moving direction, thereby controlling the position of the target in the image so that the imaging region ahead of the moving direction of the target is at least larger than that behind the moving direction.
  2. An automatic tracking control device for tracking a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes a range captured by the imaging means by adjusting a shooting direction, the device comprising:
    object region tracking means for detecting the position of a person in the image based on the image input from the imaging means; and
    camera control means that detects the attention direction of the person in the image based on the image input from the imaging means, and controls the adjustment mechanism to reflect the detected attention direction, thereby controlling the position of the person in the image so that the shooting area ahead of the attention direction is at least larger than that behind the person's attention direction.
  3. An automatic tracking control device for tracking a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes a range captured by the imaging means by adjusting a zoom magnification, the device comprising:
    object region tracking means that detects the target in each of a plurality of images sequentially input from the imaging means in time series, and holds, as tracking information, position information of a plurality of image frames relating to the region in which the target is detected, in association with its change over time; and
    camera control means that acquires, as a measurement value, at least one of the acceleration, moving speed, and on-screen position of the target based on the tracking information detected by the object region tracking means, and controls the adjustment mechanism in accordance with the measurement value to change the zoom magnification.
  4.   The automatic tracking control device according to claim 3, wherein the camera control means controls the adjustment mechanism to change the zoom magnification when it recognizes that the predicted position of the target on the image, predicted based on the measurement value, lies in the peripheral portion of the image or protrudes from the image.
  5.   The automatic tracking control device according to claim 3, wherein the camera control means controls the adjustment mechanism to change the zoom magnification toward the telephoto side when it detects, based on the measurement value, that the moving speed or moving acceleration of the target has become small.
  6.   The automatic tracking control device according to claim 3, wherein the camera control means identifies, based on the measurement value, whether the position of the target on the image is in the peripheral portion of the image, and controls the adjustment mechanism to change the zoom magnification toward the wide-angle side when it recognizes that the position of the target on the image is in the peripheral portion of the image.
  7. An automatic tracking control method for tracking a specific target by controlling an adjustment mechanism, using an imaging means that captures an image including the target and an adjustment mechanism that changes a range captured by the imaging means by adjusting a shooting direction, the method comprising:
    an object region tracking step of detecting the target in each of a plurality of images sequentially input from the imaging means in time series, and holding, as tracking information, position information of a plurality of image frames relating to the region in which the target is detected, in association with its change over time; and
    a camera control step of detecting the moving direction of the target based on past tracking information detected and held in the object region tracking step, and controlling the adjustment mechanism to reflect the detected moving direction, thereby controlling the position of the target in the image so that the imaging region ahead of the moving direction of the target is at least larger than that behind the moving direction.
  8. An automatic tracking control method for tracking a specific target by controlling an adjustment mechanism, using imaging means for capturing an image including a target and the adjustment mechanism for changing, by adjustment of a shooting direction, the range captured by the imaging means, the method comprising:
    an object region tracking step of detecting the position of a person in the image based on the image input from the imaging means; and
    a camera control step of detecting an attention direction of the person in the image based on the image input from the imaging means, controlling the adjustment mechanism to reflect the detected attention direction, and controlling the position of the person in the image so that, at least compared with the region behind the person's attention direction, the photographing area ahead of the attention direction becomes larger.
  9. An automatic tracking control method for tracking a specific target by controlling an adjustment mechanism, using imaging means for capturing an image including a target and the adjustment mechanism for changing, by adjustment of a zoom magnification, the range captured by the imaging means, the method comprising:
    an object region tracking step of detecting the target in each of a plurality of images sequentially input from the imaging means in time series, and holding, as tracking information, position information of a plurality of image frames relating to the region in which the target is detected, in association with changes over time; and
    a camera control step of acquiring, as a measured value, at least one of the acceleration, the moving speed, and the on-screen position of the target based on the tracking information detected in the object region tracking step, and controlling the adjustment mechanism according to the measured value to change the zoom magnification.
  10. A program for causing a computer to execute each procedure according to claim 7, using imaging means for capturing an image including a target and an adjustment mechanism for changing, by adjustment of a shooting direction or a zoom magnification, the range captured by the imaging means, and controlling the adjustment mechanism to track a specific target.
  11. An automatic tracking system comprising: the automatic tracking control device according to any one of claims 1 to 6; imaging means for capturing an image including a target; a turntable mechanism capable of controlling an imaging direction of the imaging means; and a lens mechanism capable of controlling a zoom magnification at the time when the imaging means captures an image.
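
The object region tracking means and the corresponding step in claims 1, 3, 7, and 9 hold, for each frame in which the target is detected, the position information of the detected region associated with the time at which it was observed. The claims do not fix a data representation; the Python sketch below is only an assumed illustration of such tracking information kept as a time-stamped history of bounding boxes, and the names Observation, TrackHistory, and centroids are inventions of this sketch, not terms from the specification.

    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class Observation:
        timestamp: float   # capture time of the frame
        box: tuple         # (x, y, w, h) of the detected target region

    class TrackHistory:
        """Recent detections kept in time order so that moving direction,
        speed, and acceleration can later be derived from the positions."""
        def __init__(self, max_len=100):
            self._obs = deque(maxlen=max_len)

        def add(self, timestamp, box):
            self._obs.append(Observation(timestamp, box))

        def centroids(self):
            """Centre points of the stored regions, oldest first."""
            return [(o.box[0] + o.box[2] / 2.0,
                     o.box[1] + o.box[3] / 2.0) for o in self._obs]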
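
Claims 1, 2, 7, and 8 bias the framing so that more of the picture lies ahead of the target's moving direction, or ahead of a person's attention direction, than behind it. No concrete algorithm is prescribed in the claims; the sketch below, which assumes a list of recent centroid positions such as the TrackHistory.centroids() sketch above returns, shows one plausible way to turn a detected direction into a direction-biased framing target and then into pan/tilt corrections. All names and parameters (lead_fraction, deg_per_pixel, the sign convention of the pan/tilt command) are illustrative assumptions, not values from the specification.

    import math

    def estimate_direction(centroids, n=5):
        """Unit vector of the target's recent motion in image coordinates,
        or None if the target is effectively stationary."""
        pts = centroids[-n:]
        if len(pts) < 2:
            return None
        dx = pts[-1][0] - pts[0][0]
        dy = pts[-1][1] - pts[0][1]
        norm = math.hypot(dx, dy)
        if norm < 1e-6:
            return None
        return (dx / norm, dy / norm)

    def framing_offset(direction, width, height, lead_fraction=0.25):
        """Desired on-screen position of the target: shifted from the image
        centre opposite to the direction of travel, so that more of the
        frame lies ahead of the target than behind it."""
        cx, cy = width / 2.0, height / 2.0
        if direction is None:
            return cx, cy
        dx, dy = direction
        return (cx - dx * lead_fraction * width,
                cy - dy * lead_fraction * height)

    def pan_tilt_correction(current_pos, desired_pos, deg_per_pixel):
        """Pan/tilt increments (in degrees, sign convention assumed) that
        move the target from its current image position toward the
        direction-biased desired position."""
        pan = (current_pos[0] - desired_pos[0]) * deg_per_pixel
        tilt = (current_pos[1] - desired_pos[1]) * deg_per_pixel
        return pan, tilt

For claims 2 and 8, the same framing_offset computation would be fed a gaze or attention direction estimated from the person's appearance instead of a motion direction; in either case the bias strength (lead_fraction here) would in practice be tuned against the processing and control delays the invention is concerned with.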
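
Claims 3 to 6 and 9 drive the zoom magnification from measured values of the target's speed, acceleration, and on-screen position: widen toward the wide-angle side when the (predicted) position reaches the image periphery or would leave the frame, and move toward the telephoto side once the motion has become small and the target sits stably inside the frame. The following is a minimal, hypothetical decision routine along those lines; the thresholds, the linear one-step prediction, and the zoom_step factor are assumptions made for illustration, not values from the specification.

    import math

    def predict_position(centroids, dt=1.0):
        """Linear one-step prediction of the next centroid from the last two samples."""
        if len(centroids) < 2:
            return centroids[-1]
        (x0, y0), (x1, y1) = centroids[-2], centroids[-1]
        return (x1 + (x1 - x0) * dt, y1 + (y1 - y0) * dt)

    def near_periphery(pos, width, height, margin=0.15):
        """True if a position lies in the outer margin of the image or outside it."""
        x, y = pos
        return (x < width * margin or x > width * (1 - margin) or
                y < height * margin or y > height * (1 - margin))

    def choose_zoom(centroids, width, height, current_zoom,
                    speed_threshold=5.0, zoom_step=1.1,
                    min_zoom=1.0, max_zoom=10.0):
        """Widen when the predicted position is at the periphery or would
        leave the frame (claims 4 and 6); move toward telephoto when the
        motion has become small and the target is well inside the frame
        (claim 5); otherwise keep the current magnification."""
        predicted = predict_position(centroids)
        speed = 0.0
        if len(centroids) >= 2:
            speed = math.hypot(centroids[-1][0] - centroids[-2][0],
                               centroids[-1][1] - centroids[-2][1])
        if near_periphery(predicted, width, height):
            return max(min_zoom, current_zoom / zoom_step)   # wide-angle side
        if speed < speed_threshold and not near_periphery(centroids[-1], width, height):
            return min(max_zoom, current_zoom * zoom_step)   # telephoto side
        return current_zoom
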
JP2005037815A 2005-02-15 2005-02-15 Automatic tracking control device, automatic tracking control method, program, and automatic tracking system Active JP4699040B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005037815A JP4699040B2 (en) 2005-02-15 2005-02-15 Automatic tracking control device, automatic tracking control method, program, and automatic tracking system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005037815A JP4699040B2 (en) 2005-02-15 2005-02-15 Automatic tracking control device, automatic tracking control method, program, and automatic tracking system

Publications (2)

Publication Number Publication Date
JP2006229322A true JP2006229322A (en) 2006-08-31
JP4699040B2 JP4699040B2 (en) 2011-06-08

Family

ID=36990323

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005037815A Active JP4699040B2 (en) 2005-02-15 2005-02-15 Automatic tracking control device, automatic tracking control method, program, and automatic tracking system

Country Status (1)

Country Link
JP (1) JP4699040B2 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07114642A (en) * 1993-10-19 1995-05-02 Oyo Keisoku Kenkyusho:Kk Measuring instrument for mobile object
JPH09214946A (en) * 1996-02-07 1997-08-15 Fujitsu General Ltd Mobile object tracking camera system
JP2001268425A (en) * 2000-03-16 2001-09-28 Fuji Photo Optical Co Ltd Automatic tracking device
JP2003219225A (en) * 2002-01-25 2003-07-31 Nippon Micro Systems Kk Device for monitoring moving object image
JP2003319386A (en) * 2002-04-22 2003-11-07 Gen Tec:Kk Photographed image transmission system for mobile terminal

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008141700A (en) * 2006-12-05 2008-06-19 Fujifilm Corp Monitoring system and method, and program
WO2008108088A1 (en) 2007-03-05 2008-09-12 Panasonic Corporation Automatic tracking device and automatic tracking method
JP2009033450A (en) * 2007-07-26 2009-02-12 Casio Comput Co Ltd Camera apparatus, object tracking zooming method and object tracking zooming program
JP2009033672A (en) * 2007-07-30 2009-02-12 Sogo Keibi Hosho Co Ltd Surveillance control apparatus and method
JP2009088860A (en) * 2007-09-28 2009-04-23 Canon Inc Imaging device
JP2009195271A (en) * 2008-02-19 2009-09-03 Fujifilm Corp Capsule endoscope system
JP2009272882A (en) * 2008-05-07 2009-11-19 Ntt Docomo Inc Video distribution server, video distribution system, and video distribution method
JP4615035B2 * 2008-05-07 2011-01-19 Ntt Docomo Inc Video distribution server, video distribution system, and video distribution method
JP2010028158A (en) * 2008-07-15 2010-02-04 Canon Inc Imaging apparatus
US8594371B2 (en) 2009-04-08 2013-11-26 Nikon Corporation Subject tracking device and camera
CN102075674A (en) * 2009-11-25 2011-05-25 奥林巴斯映像株式会社 Imaging apparatus
CN102075674B (en) * 2009-11-25 2013-08-28 奥林巴斯映像株式会社 Imaging apparatus
WO2011142480A1 (en) 2010-05-14 2011-11-17 Ricoh Company, Ltd. Imaging apparatus, image processing method, and recording medium for recording program thereon
US9057932B2 (en) 2010-05-14 2015-06-16 Ricoh Company, Ltd. Imaging apparatus, image processing method, and recording medium for recording program thereon
JP2012117833A (en) * 2010-11-29 2012-06-21 Equos Research Co Ltd Image recognition device, mounting type robot, and image recognition program
JP2013042386A (en) * 2011-08-17 2013-02-28 Hitachi Kokusai Electric Inc Monitoring system
JP2013146032A (en) * 2012-01-16 2013-07-25 Denso Corp Driver monitor system and processing method thereof
JP2013192184A (en) * 2012-03-15 2013-09-26 Casio Comput Co Ltd Subject tracking display controller, subject tracking display control method, and program
JP2014143681A (en) * 2012-12-26 2014-08-07 Canon Inc Automatic tracking photographing system
JP2013258748A (en) * 2013-08-07 2013-12-26 Canon Inc Imaging apparatus
WO2015072166A1 (en) * 2013-11-18 2015-05-21 オリンパスイメージング株式会社 Imaging device, imaging assistant method, and recoding medium on which imaging assistant program is recorded
US9628700B2 (en) 2013-11-18 2017-04-18 Olympus Corporation Imaging apparatus, imaging assist method, and non-transitory recoding medium storing an imaging assist program
JP5886479B2 * 2013-11-18 2016-03-16 Olympus Corporation Imaging device, imaging assist method, and recording medium containing imaging assist program
JP2015130612A (en) * 2014-01-08 2015-07-16 キヤノン株式会社 Imaging apparatus and control method of the same
JP2015192242A (en) * 2014-03-27 2015-11-02 キヤノン株式会社 Imaging apparatus, control method of imaging apparatus, control program for imaging apparatus, and storage medium
CN106464784A (en) * 2014-06-30 2017-02-22 奥林巴斯株式会社 Image capturing device and image capturing method
JP2016059014A (en) * 2014-09-12 2016-04-21 沖電気工業株式会社 Monitoring system, video analyzer, video analyzing method, and program
WO2018074045A1 * 2016-10-17 2018-04-26 Sony Corporation Information processing device, information processing method, and program
EP3528024A4 (en) * 2016-10-17 2019-11-06 Sony Corp Information processing device, information processing method, and program

Also Published As

Publication number Publication date
JP4699040B2 (en) 2011-06-08

Similar Documents

Publication Publication Date Title
US7742077B2 (en) Image stabilization system and method for a video camera
US5745175A (en) Method and system for providing automatic focus control for a still digital camera
DE112005000929B4 (en) Automatic imaging method and device
EP1998567B1 (en) Tracking device, tracking method, tracking device control program, and computer-readable recording medium
US7336297B2 (en) Camera-linked surveillance system
US7385626B2 (en) Method and system for performing surveillance
US10250799B2 (en) Enhanced image capture
US20100013917A1 (en) Method and system for performing surveillance
US9876993B2 (en) Video tracking system and method
EP1914682B1 (en) Image processing system and method for improving repeatability
KR960005204B1 (en) Video camera having focusing and image-processing function
EP0366136B1 (en) Image sensing and processing device
JP4140567B2 (en) Object tracking device and object tracking method
US20050084179A1 (en) Method and apparatus for performing iris recognition from an image
US7382400B2 (en) Image stabilization system and method for a video camera
JP2004519955A (en) How to assist an automatic video tracking system in target reacquisition
JP2008124787A (en) Camera shake correcting device and method, and imaging device
US7920161B2 (en) Method for forming combined digital images
US7548269B2 (en) System for autofocusing a moving object
US20050018879A1 (en) Object tracking method and object tracking apparatus
US7561790B2 (en) Auto focus system
JPWO2006082967A1 (en) Imaging device
EP1601189A2 (en) Autofocus system
DE60123534T2 (en) Device for tracking a moving object
US8068639B2 (en) Image pickup apparatus, control method therefor, and computer program for detecting image blur according to movement speed and change in size of face area

Legal Events

Date Code Title Description
RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20071113

RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20071120

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080212

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20101116

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20101124

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110113

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20110201

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20110302

R150 Certificate of patent or registration of utility model

Ref document number: 4699040

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313111