WO2002103649A2 - Method and apparatus for detecting objects - Google Patents
Method and apparatus for detecting objects
- Publication number
- WO2002103649A2 WO2002103649A2 PCT/US2002/001120 US0201120W WO02103649A2 WO 2002103649 A2 WO2002103649 A2 WO 2002103649A2 US 0201120 W US0201120 W US 0201120W WO 02103649 A2 WO02103649 A2 WO 02103649A2
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- monitored area
- pattern
- image
- reference image
- live image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/254—Analysis of motion involving subtraction of images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19604—Image analysis to detect motion of the intruder, e.g. by frame subtraction involving reference image or background adaptation with time to compensate for changing conditions, e.g. reference image update on detection of light level change
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/19686—Interfaces masking personal details for privacy, e.g. blurring faces, vehicle license plates
Definitions
- the present invention relates to object detection, and more specifically, to object intrusion and/or presence detection within a monitored area or region.
- Such systems monitor a user-defined area to detect when an object enters or passes through a monitored area.
- Such systems typically include an image capture device (typically a video camera or still camera) capable of capturing an image of the monitored area and, if required, a device for digitizing the captured images. The digitized images are analyzed in an attempt to detect whether an object has entered the monitored area.
- There are many different known methods and algorithms for analyzing digitized images for determining when an object has entered a monitored area. One of the most common methods is generally referred to as a change detection method.
- Change detection is often accomplished by examining the difference between a current live image and a reference image, where the reference image contains only the static background of the monitored area.
- a reference image can be thought of as a representation of the monitored area as it would appear if no transitory objects were in view.
- Change detection algorithms often take two digitized images as input and return the locations in the field of view where differences between the images are identified.
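For illustration only, a minimal change detection routine along these lines might subtract the two digitized images and threshold the result to recover the locations where they differ. This is a hypothetical NumPy sketch; the image format and threshold value are assumptions, not taken from the patent:

```python
import numpy as np

def detect_changes(reference: np.ndarray, live: np.ndarray, threshold: float = 30.0):
    """Return (row, col) locations where the live image differs from the reference.

    Both images are assumed to be single-channel arrays of the same shape with
    pixel values on a 0-255 scale; the threshold of 30 grey levels is assumed.
    """
    diff = np.abs(live.astype(np.float32) - reference.astype(np.float32))
    changed = diff > threshold             # boolean mask of changed pixels
    rows, cols = np.nonzero(changed)       # locations in the field of view
    return list(zip(rows.tolist(), cols.tolist()))
```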
- Object detection systems are commonly used in environments that have dynamic lighting conditions. For example, in industrial settings, moving shadows can be cast on a monitored area or region, which can cause significant changes in ambient lighting conditions. Many existing object detection systems, including those that use change detection algorithms to detect objects, can be challenged by such shadows and/or other dynamic lighting conditions.
- the present invention overcomes many of the disadvantages of the prior art by providing an object detection system that is less susceptible to dynamic lighting conditions, and/or more sensitive to three-dimensional object motion and/or presence. This is preferably accomplished by projecting one or more static or dynamic patterns onto the monitored area, capturing one or more live images of the monitored area including the one or more patterns, and detecting objects in the monitored area by detecting changes in the one or more patterns in selected captured images.
- a single pattern is projected onto the monitored area.
- One or more live images of the monitored area are then captured at selected times, and analyzed to detect changes in the projected pattern.
- the changes in the pattern may indicate a topographical change in the monitored area, and thus the entry or movement of an object in the monitored area. Because the pattern is projected onto the monitored area, changes in the ambient lighting conditions may have less effect on the efficacy of the object detection system.
- a moire interference pattern is used to help detect objects in the monitored area.
- Moire interference patterns are particularly sensitive to relative motion between the two or more underlying patterns that are used to create the moire interference pattern. As such, the use of moire interference patterns can be highly effective in detecting objects that intrude into a monitored area.
- a moire interference pattern may be created in any number of ways. For example, two or more similar patterns may be projected onto the monitored area from offset illumination positions. Alternatively, or in addition, one pattern may be projected onto the monitored area, while another may be imposed by a patterned grating positioned in the image plane of the image capture device. Yet another way of creating a moire interference pattern is to capture two images of the same area having a single projected pattern, and digitally or optically rotating or otherwise changing the position of one of the images relative to the other to create a moire interference pattern. Any number of other methods may also be used to create a moire interference pattern, as desired.
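As a rough illustration of the last approach, the sketch below superimposes a stripe image on a slightly rotated copy of itself; the product of the two exhibits low-frequency moire fringes. This is a hypothetical NumPy/SciPy sketch under assumed parameters, not code from the patent:

```python
import numpy as np
from scipy.ndimage import rotate

def stripe_pattern(size: int = 512, period: int = 8) -> np.ndarray:
    """Vertical stripe pattern with the given period, values in {0.0, 1.0}."""
    stripes = ((np.arange(size) // (period // 2)) % 2).astype(np.float32)
    return np.tile(stripes, (size, 1))

def moire_from_rotation(angle_deg: float = 3.0) -> np.ndarray:
    """Superimpose a stripe pattern on a copy of itself rotated by angle_deg.

    The multiplicative superposition of the two patterns produces low-frequency
    moire fringes whose spacing depends on the (assumed) rotation angle.
    """
    base = stripe_pattern()
    rotated = rotate(base, angle_deg, reshape=False, order=1, mode="nearest")
    return base * rotated  # moire fringes appear in the product image
```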
- one or more live images of the monitored area may be captured at selected times. The images may then be analyzed to detect changes in the moire interference pattern. Changes in the moire interference pattern may indicate a topographical change in the monitored area and thus the entry or movement of an object in the monitored area.
- An advantage of using moire interference patterns is that shadows and/or other changes in ambient lighting conditions may have little or no effect on the position, frequency or other characteristics of the moire interference pattern. To help prevent the projected pattern from being overwhelmed by ambient light, it is contemplated that a portion of the spectrum or lighting frequency that is not typically found in the ambient lighting conditions may be used to project the one or more patterns on the monitored area, such as near infrared.
- the present invention may be used to monitor a user-defined safety zone for the intrusion of people or other objects.
- numerous other applications are also contemplated including security, recording, and other monitoring and/or detection applications.
- Figures 1A-1B are schematic diagrams showing one illustrative object detection system in accordance with the present invention.
- FIGS. 2A-2B are schematic diagrams showing other illustrative object detection systems in accordance with the present invention.
- Figures 3A-3B depict two example patterns that can be used in accordance with some embodiments of the present invention.
- Figures 4A-4B depict two examples of patterns that can be used in accordance with other embodiments of the present invention.
- Figures 5A-5C depict an example of moire interference phenomena in accordance with the present invention.
- Figures 6A-6C depict an illustrative reference image, live image and comparison image, respectively, in accordance with one embodiment of the present invention.
- Figure 7 is a flow diagram showing an illustrative method in accordance with the present invention.
- Figure 8 is a flow diagram showing another illustrative method in accordance with the present invention.
- Figure 9 is a flow diagram showing yet another illustrative method in accordance with the present invention.
- the present invention provides an object detection system that may be less susceptible to dynamic lighting conditions, and/or may be more sensitive to object motion and/or presence than prior art systems.
- Figures 1A-1B are schematic diagrams showing a first illustrative object detection system in accordance with the present invention.
- the illustrative object detection system of Figure 1A includes an illumination source 2, an image capture device 4, an image storage device 6, and a processing device 8.
- the illumination source 2, image capture device 4, image storage device 6, and processing device 8 are integrated into a common device, while in other embodiments, separate devices are provided, as desired.
- the illumination source 2 is located above a monitored area 14, such as near a ceiling.
- the illumination source 2 illuminates the monitored area 14 with a desired pattern.
- the pattern may be generated by, for example, projecting through a patterned grating, projecting interference patterns where the interference fringes are formed by a phasor or wavelength shifting, projecting a pattern using a scanning mechanism, or any other suitable method.
- the pattern may be static or dynamic.
- a dynamic pattern is one where the spatial position of the light areas and dark areas is moving, and in general the movement is periodic in nature.
- The frequency of movement of the mirror may determine the frequency of interference fringe movement. It is contemplated that the monitored area 14 may be rectangular, round, or any other shape, as desired. As shown in Figure 1B, the illustrative monitored area 14 extends around three sides of a machine 18.
- the illumination source 2 may be any type of suitable illumination source.
- a suitable illumination source 2 may be an infrared source. Using a portion of the spectrum not ordinarily found in the ambient lighting conditions, such as the near infrared, may help keep the projected pattern from being overwhelmed by the ambient lighting conditions, and may also help enhance the differentiation between the projected pattern and other ambient light.
- the image capture device may be a sensor (CCD or the like) that is attuned to a desired spectrum, such as the spectrum of the illumination source.
- the illumination source 2 preferably projects at least one pattern on the monitored area 14.
- the pattern used may depend on the particular application at hand.
- the pattern may be any pattern that has transitions between areas that have illumination (e.g. light areas) and areas that lack illumination (e.g. dark areas).
- the average distance between transitions should be approximately the same size as the smallest object for which detection is desired, although this is not required.
- suitable patterns include striped or checkerboard patterns where illuminated and non-illuminated areas alternate. Some illustrative patterns are shown in Figures 3A-3B and Figures 4A-4B, but any suitable pattern may be used.
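For example, a striped or checkerboard pattern whose period matches the smallest object of interest could be generated as follows (an illustrative sketch; the object size is assumed to be expressed in pixels of the projected image):

```python
import numpy as np

def striped_pattern(height: int, width: int, min_object_px: int) -> np.ndarray:
    """Alternating light/dark stripes whose width matches the smallest object size."""
    cols = (np.arange(width) // min_object_px) % 2
    return np.tile((cols * 255).astype(np.uint8), (height, 1))

def checkerboard_pattern(height: int, width: int, min_object_px: int) -> np.ndarray:
    """Checkerboard whose square size matches the smallest object size."""
    rows = (np.arange(height) // min_object_px) % 2
    cols = (np.arange(width) // min_object_px) % 2
    board = (rows[:, None] + cols[None, :]) % 2
    return (board * 255).astype(np.uint8)
```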
- the reference image is preferably stored, at least temporarily, in the image storage device 6.
- a new reference image may be captured periodically, if desired. Once a reference image is captured, the image capture device 4 may capture successive live images of the monitored area 14.
- These images are preferably stored, at least temporarily, in the image storage device 6.
- the image storage device 6 may provide the reference image and the live images to the processing device 8 for processing.
- the processing device 8 preferably analyzes the live images to detect changes in the projected pattern.
- the monitored area is divided into a number of image segments called mask windows.
- the size of each mask window is preferably chosen so that it is no bigger than the approximate size of the smallest object for which detection is desired. While objects smaller than the mask window may be detected, the probability of detecting such objects decreases with object size.
- the position of the various mask windows may be chosen so that the entire area to be monitored is covered by overlapping mask windows.
- the image area that corresponds to each mask window may be analyzed separately for object detection, if desired.
- the analysis method that is used to analyze the various mask windows may differ across the image, and the triggered response may vary depending on which mask window detects an object, if desired.
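A sketch of how the monitored image could be carved into overlapping mask windows for independent analysis is shown below (hypothetical helper; the half-window step that produces the overlap is an assumption):

```python
import numpy as np

def mask_windows(image: np.ndarray, window: int, step: int = 0):
    """Yield (row, col, sub_image) for overlapping square mask windows.

    `window` is chosen no bigger than the smallest object of interest; a step
    of window // 2 (used when step is 0) makes adjacent windows overlap so the
    entire monitored area is covered.
    """
    step = step if step > 0 else max(1, window // 2)
    height, width = image.shape[:2]
    for r in range(0, max(1, height - window + 1), step):
        for c in range(0, max(1, width - window + 1), step):
            yield r, c, image[r:r + window, c:c + window]
```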
- the comparison between a reference image and a live image can be accomplished in any number of ways.
- One method is to simply do a pixel-by-pixel comparison of the images, such as by subtracting one image from the other. If there is no entry or movement of objects in the monitored area, the projected pattern in the two images will substantially cancel out. However, if there is entry or movement of an object in the monitored area, part of the projected pattern shown in one image may be shifted or otherwise deformed relative to the pattern shown in the other image.
- a threshold value may be used to help determine if there is sufficient difference between the reference image and a live image to indicate a detected object, as further described below.
- Another method for comparing one image to another is to calculate a difference "gref" between the value of the brightness levels corresponding to the light areas of the pattern (such as in a mask window), and the value of the brightness levels corresponding to the dark areas in the mask window, of the reference image.
- a similar calculation may be made for the mask windows of a live image. Whenever the second calculation is different from the first calculation by a specified amount, a change may be inferred.
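A minimal sketch of this "gref" comparison for a single mask window is given below. It is hypothetical: it assumes a boolean mask identifying the pattern's light areas is available (for example from the reference image or the projected grating), and the change margin is an assumed value:

```python
import numpy as np

def gref(window_img: np.ndarray, light_mask: np.ndarray) -> float:
    """Mean brightness over the pattern's light areas minus that over its dark areas."""
    light = float(window_img[light_mask].mean())
    dark = float(window_img[~light_mask].mean())
    return light - dark

def gref_changed(ref_window: np.ndarray, live_window: np.ndarray,
                 light_mask: np.ndarray, margin: float = 20.0) -> bool:
    """Infer a change when the live gref differs from the reference gref by `margin`."""
    return abs(gref(live_window, light_mask) - gref(ref_window, light_mask)) > margin
```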
- Yet another method for comparing one image to another is to measure a correlation between each pixel and some neighboring pixels and/or a correlation between selected features, and then compare the correlation values. Whenever the correlation values are different by a specified amount, a change may be inferred.
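One way to realize such a correlation comparison is a normalized (Pearson) correlation between corresponding windows of the reference and live images, with a drop in correlation beyond a specified amount suggesting a change. This is an illustrative sketch rather than the patent's exact method:

```python
import numpy as np

def normalized_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two equally sized image windows."""
    a = a.astype(np.float64).ravel() - a.mean()
    b = b.astype(np.float64).ravel() - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 1.0

def correlation_changed(ref_win: np.ndarray, live_win: np.ndarray,
                        drop: float = 0.3) -> bool:
    """Infer a change when the correlation falls by more than `drop` (assumed value)."""
    return normalized_correlation(ref_win, live_win) < 1.0 - drop
```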
- the image analysis may extract the moire spatial frequency and phase using a Fourier transform.
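A minimal sketch of such a Fourier-based extraction is shown below; it assumes the moire fringes dominate the non-DC spectrum, and it is illustrative rather than the specific transform the patent uses:

```python
import numpy as np

def moire_frequency_and_phase(image: np.ndarray):
    """Estimate the dominant (non-DC) spatial frequency and its phase.

    With moire fringes present, the strongest non-DC peak of the 2-D FFT
    approximates the fringe frequency; the argument of that spectral
    coefficient gives the fringe phase.
    """
    spectrum = np.fft.fftshift(np.fft.fft2(image.astype(np.float64)))
    magnitude = np.abs(spectrum)
    cy, cx = magnitude.shape[0] // 2, magnitude.shape[1] // 2
    magnitude[cy, cx] = 0.0                       # suppress the DC component
    py, px = np.unravel_index(np.argmax(magnitude), magnitude.shape)
    freq = ((py - cy) / image.shape[0], (px - cx) / image.shape[1])
    phase = float(np.angle(spectrum[py, px]))
    return freq, phase
```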
- Other image analysis techniques may also be used including, for example, unsharp masking, thresholding, contrast segmentation, filtering processing, skeletonization processing, multi-resolution analysis, deformable contour modeling, image clustering, morphology, etc.
- the reference and/or live images may be preprocessed before they are compared.
- the reference image and/or live images may be provided to a filter that helps remove speckle, provides smoothing, changes overall intensity, or otherwise cleans up the images.
- Changes that are detected in the illuminated pattern may indicate a topographical change within the monitored area 14, and thus entry or movement of an object in the monitored area 14.
- the processing device 8 may sound an alarm, shut down the machine 18, and/or provide some other alarm or action. Images of the monitored area with the detected object present may be retained for subsequent analysis, and/or sent to a monitoring station if desired.
- Figures 2A-2B are schematic diagrams showing yet other illustrative object detection systems in accordance with the present invention.
- the illumination source 2 includes a radiation source 20, an objective projection lens 22, and a patterned grating 24.
- the patterned grating 24 provides the desired pattern on the monitored area 14. It is contemplated that any pattern suitable for creating a moire interference pattern may be used. Some suitable patterns are shown in Figures 4A-4B. Rather than providing a separate patterned grating, the illumination source 2 itself may be configured to provide the desired pattern, if desired.
- the image capture device 4 captures a first image of the monitored area 14 through a second grating 26.
- the second grating 26 may be any type of grating, filter or mask that produces the desired pattern.
- the second grating 26 may be provided by the CCD pixel array of the image capture device 4 itself.
- the first grating 24 and the second grating 26 preferably collectively produce a live image that includes a moire interference pattern therein.
- the second pattern may be imposed digitally on the live image after the live image has been captured, if desired.
- the image capture device 4 transmits the live image to an image storage device 28, which in the embodiment shown, may also include a processing device.
- the illumination source 2 and the image capture device 4 are preferably spaced from each other by a distance, which may be small or large. This spacing may help encourage a change in the pattern that reaches the image capture device 4 when a topographical change occurs in the monitored area 14. Typically, the spacing should be larger as the resolution of the projected pattern is increased.
- a warning device 30 may be attached to the image storage device 28, and may be activated upon detection of an object.
- two illumination sources 36 and 38 are provided for projecting two separate patterns on the monitored area 14.
- Illumination source 36 includes a radiation source 40, an objective projection lens 42, and a patterned grating 44.
- the illumination source 36 preferably projects a first pattern on the monitored area 14.
- illumination source 38 includes a radiation source 46, an objective projection lens 48, and a patterned grating 50. Illumination source 38 preferably projects a second pattern on the monitored area 14. The first pattern and the second pattern preferably collectively cause a moire interference pattern to be formed on the monitored area 14.
- Image capture device 4 is used to capture a live image of the monitored area 14. Like above, the image capture device 4 may include an image storage device and a processing device. The live images are preferably analyzed to detect changes in the moire interference pattern over time. Changes in the moire interference pattern may indicate a topographical change in the monitored area 14 and thus the entry or movement of an object in the monitored area 14.
- the illustrative embodiments shown in Figures 2A-2B both cause a moire interference pattern to be provided in the resulting live image.
- the image analysis may be similar to that discussed above with respect to Figures 1A-1B.
- the image analysis may extract the moire spatial frequency and phase using a Fourier transform.
- Other image analysis techniques may also be used including, for example, unsharp masking, thresholding, contrast segmentation, filtering processing, skeletonization processing, multi-resolution analysis, deformable contour modeling, image clustering, morphology, etc.
- a radon filter may be oriented perpendicular to the moire interference bands, and any loss of correlation between the filtered results from the reference image and the live image may indicate a change in the moire interference bands.
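One possible reading of this step, sketched with scikit-image's Radon transform, is to project both images along a single angle assumed to be perpendicular to the moire bands and then compare the correlation of the two projections; the library choice and angle handling here are assumptions:

```python
import numpy as np
from skimage.transform import radon

def radon_correlation(reference: np.ndarray, live: np.ndarray,
                      angle_deg: float) -> float:
    """Correlate the Radon projections of two images taken along one angle.

    `angle_deg` is assumed to be perpendicular to the moire bands; a loss of
    correlation between the two projections suggests the bands have changed.
    """
    ref_proj = radon(reference.astype(np.float64), theta=[angle_deg]).ravel()
    live_proj = radon(live.astype(np.float64), theta=[angle_deg]).ravel()
    ref_proj = ref_proj - ref_proj.mean()
    live_proj = live_proj - live_proj.mean()
    denom = np.sqrt((ref_proj ** 2).sum() * (live_proj ** 2).sum())
    return float((ref_proj * live_proj).sum() / denom) if denom else 1.0
```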
- Figures 5A-5C depict an example of moire interference phenomena in accordance with the present invention.
- Figure 5 A shows an image of a first pattern projected onto a monitored area.
- the first pattern includes a number of parallel lines extending in the vertical direction.
- Figure 5B shows an image of the monitored area with a second pattern superimposed on the first pattern.
- the second pattern includes a number of parallel lines extending in a direction that is radially offset relative to the vertical direction. In the illustrative embodiment shown, the surface of the monitored area is spherical in shape, resulting in a number of curved moire interference bands 50.
- Figure 5C shows an image of the monitored area with an object 52 positioned in the monitored area.
- the object 52 causes a change in the moire interference bands relative to Figure 5B.
- the moire interference bands are highly sensitive to topographical changes in the monitored area, which in the embodiment shown, is illustrated by the introduction of the three-dimensional object 52 into the monitored area.
- Figures 6A-6C depict an illustrative reference image, live image and comparison image, respectively, where only a single pattern is projected onto the monitored area and no grating is positioned between the monitored area and the image capture device.
- the image shown in Figure 6A is a reference image of the monitored area.
- the image shown in Figure 6B is an image of the monitored area when a hand is placed in the monitored area.
- the image shown in Figure 6C is an image that results from the comparison (e.g. subtraction) of the image shown in Figure 6A and the image shown in Figure 6B.
- the image shown in Figure 6C highlights the object, including its boundary, within the monitored area.
- virtual interference bands appear in and around the object.
- FIG. 7 is a flow diagram showing an illustrative method in accordance with the present invention.
- the illustrative method is entered at step 70, wherein a threshold value is selected.
- Control is then passed to step 72.
- Step 72 illuminates a monitored area with a pattern.
- the pattern may be any suitable pattern.
- Step 74 then captures a reference image of the monitored area.
- the reference image preferably contains only the static background of the monitored area, with no transitory objects present.
- the reference image can be captured/updated upon manual initiation, as shown at 88.
- Step 76 then captures a live image of the monitored area.
- Step 78 compares the reference image and the live image to determine a difference parameter.
- the comparison of the reference image and the live image can be accomplished in any number of ways. One method is to simply do a pixel-by-pixel comparison of the images, such as by subtracting one image from the other. If there is no entry or movement of objects in the monitored area, the projected pattern in the two images will substantially cancel out. However, if there is entry or movement of an object in the monitored area, part of the projected pattern shown in one image may be shifted or otherwise deformed relative to the pattern shown in the other image.
- Another method is to calculate a difference "gref" between the value of the brightness levels corresponding to the light areas of the pattern (such as in a mask window), and the value of the brightness levels corresponding to the dark areas in the mask window of the reference image. A similar calculation may be made for the mask windows of the live image.
- Yet another method is to measure a correlation between each pixel and some neighboring pixels and/or a correlation between selected features, and then compare the correlation values.
- Other illustrative methods include extracting the moire spatial frequency and phase using a Fourier transform, unsharp masking, thresholding, contrast segmentation, filtering processing, skeletonization processing, multi-resolution analysis, deformable contour modeling, image clustering, morphology, etc. These comparison methods are meant to be only illustrative, and any suitable method may be used to compare the images or selected characteristics of the images, depending on the application.
- the reference and/or live images may be preprocessed before they are compared.
- the reference image and/or live images may be provided to a filter that helps remove speckle, provides smoothing, changes overall intensity, or otherwise cleans up the images.
- Step 80 determines if the differences identified in step 78 exceed the threshold value specified in step 70. If the differences exceed the threshold value, control is passed to step 82. Step 82 signals that an object is present in the monitored area. In some embodiments, an action is then taken, such as sounding an alarm, shutting down a machine, and/or providing some other alarm or action. If the differences do not exceed the threshold value, control is passed to step 84. Step 84 signals that an object is not present in the monitored area, and control is passed to step 86. Step 86 determines if an updated reference image is desirable. Under some circumstances, such as when the lighting conditions are dynamic in or around the monitored area, it may be advantageous to periodically update the reference image.
- Step 74 updates the reference image with the previous live image. Alternatively, a new reference image may be captured, if desired, so long as no objects have entered the monitored area. If it is determined that an updated reference image is not needed, control is passed to step 76, wherein a new live image is captured.
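Pulling the steps of Figure 7 together, the detection loop might be organized roughly as follows. This is a skeletal, hypothetical sketch: `capture_image`, `compare`, and `signal_object` stand in for hardware- and application-specific pieces the patent does not spell out, and the update period is assumed:

```python
def monitor(capture_image, compare, signal_object,
            threshold: float, update_every: int = 100):
    """Skeleton of the Figure 7 flow: capture a reference, then loop on live images.

    capture_image() returns the current image of the illuminated monitored area,
    compare(ref, live) returns a scalar difference parameter (step 78), and
    signal_object(present) sounds an alarm / stops a machine as appropriate.
    All three callables, the threshold and the update period are assumptions.
    """
    reference = capture_image()                 # step 74: reference image
    frame = 0
    while True:
        live = capture_image()                  # step 76: live image
        difference = compare(reference, live)   # step 78: compare images
        present = difference > threshold        # step 80: threshold test
        signal_object(present)                  # step 82 / step 84
        frame += 1
        # step 86: periodically refresh the reference, but only when the
        # monitored area is believed to be clear of objects.
        if not present and frame % update_every == 0:
            reference = live
```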
- FIG. 8 is a flow diagram showing another illustrative method in accordance with the present invention.
- the illustrative method is entered at step 90, wherein a threshold value is selected.
- Control is then passed to step 92.
- Step 92 illuminates a monitored area with a first pattern.
- Control is then passed to step 94.
- Step 94 imposes a second pattern relative to the first pattern.
- Step 94 may, for example, illuminate the monitored area with the second pattern, or a grating may be placed between the monitored area and an image capture device, as desired.
- Step 96 then captures a reference image of the monitored area.
- the reference image preferably contains only the static background of the monitored area, with no transitory objects present.
- the reference image can be captured/updated upon manual initiation, as shown at 110.
- Step 98 captures a live image of the monitored area.
- Step 100 compares the reference image and the live image to determine selected differences.
- Step 102 determines if the differences identified in step 100 exceed the threshold value specified in step 90. If the differences exceed the threshold value, control is passed to step 104.
- Step 104 signals that an object is present in the monitored area. In some embodiments, an action is then taken, such as sounding an alarm, shutting down a machine, and/or providing some other alarm or action. If the differences do not exceed the threshold value, control is passed to step 106.
- Step 106 signals that an object is not present in the monitored area, and control is passed to step 108.
- Step 108 determines if an updated reference image is desirable. Under some circumstances, such as when the lighting conditions are dynamic in or around the monitored area, it may be advantageous to periodically update the reference image. If it is determined that an updated reference image is desirable, control is passed to step 96. Step 96 updates the reference image with the previous live image. Alternatively, a new reference image may be captured, if desired, so long as no objects have entered the monitored area. If it is determined that an updated reference image is not needed, control is passed to step 98, wherein a new live image is captured.
- FIG. 9 is a flow diagram showing yet another illustrative method in accordance with the present invention.
- the illustrative method is entered at step 120, wherein a threshold value is selected. Control is then passed to step 122.
- Step 122 illuminates a monitored area with at least one pattern.
- Step 124 captures a reference image of two or more mask windows of the monitored area. Each mask window preferably corresponds to a sub-area or region within the monitored area.
- the reference image can be captured/updated upon manual initiation, as shown at 140.
- Step 126 captures a live image of each mask window within the monitored area. Then, for each mask window, step 128 compares the reference image and the live image to determine differences therebetween. In some embodiments, selected mask windows of the reference image and the live image are compared using different compare algorithms. Step 130 determines if any of the differences identified in step 128 exceed the threshold value specified in step 120. In some embodiments, each mask window or group of mask windows has a different threshold value.
- Step 132 signals that an object is present in the monitored area.
- an action is then taken, such as sounding an alarm, shutting down a machine, and/or providing some other alarm or action.
- the action taken may depend on which mask window detects an object. For example, for one mask window, the action may include sounding an alarm, while for another mask window the action may include shutting down a machine within the monitored area.
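A small sketch of tying a different threshold and response to each mask window is shown below (illustrative only; the window identifiers and action callables are invented for the example):

```python
def check_mask_windows(reference_windows, live_windows, compare, window_config):
    """Evaluate each mask window with its own threshold and trigger its own action.

    window_config maps a window id to a (threshold, action) pair, e.g.
    {"walkway": (15.0, sound_alarm), "machine_side": (10.0, stop_machine)};
    both the ids and the action callables here are hypothetical.
    """
    detections = []
    for window_id, (threshold, action) in window_config.items():
        difference = compare(reference_windows[window_id], live_windows[window_id])
        if difference > threshold:
            action()          # e.g. alarm for one window, machine stop for another
            detections.append(window_id)
    return detections
```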
- control is passed to step 134.
- Step 134 signals that an object is not present in the monitored area, and control is passed to step 136.
- Step 136 determines if an updated reference image is desirable. If it is determined that an updated reference image is desirable, control is passed to step 124. Step 124 updates the reference image with the previous live image. Alternatively, a new reference image may be captured, if desired, so long as no objects have entered the monitored area. If it is determined that an updated reference image is not needed, control is passed to step 126, wherein a new live image is captured of the mask windows of the monitored area.
- While the invention is susceptible to various modifications and alternative forms, specific examples thereof have been shown in the drawings and are herein described in detail. It should be understood, however, that the invention is not to be limited to the particular forms or methods disclosed, but to the contrary, the invention is to cover all modifications, equivalents and alternatives falling within the spirit and scope of the appended claims.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Human Computer Interaction (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
- Burglar Alarm Systems (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Emergency Alarm Devices (AREA)
- Image Processing (AREA)
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE60204280T DE60204280T2 (de) | 2001-01-19 | 2002-01-18 | Verfaren und vorrichtung zur detektion von objekten |
| AT02768271T ATE296471T1 (de) | 2001-01-19 | 2002-01-18 | Verfaren und vorrichtung zur detektion von objekten |
| AU2002330840A AU2002330840A1 (en) | 2001-01-19 | 2002-01-18 | Method and apparatus for detecting objects |
| JP2003505889A JP2005508027A (ja) | 2001-01-19 | 2002-01-18 | 対象物を検出するための方法及び装置 |
| EP02768271A EP1354303B1 (en) | 2001-01-19 | 2002-01-18 | Method and apparatus for detecting objects |
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US26292501P | 2001-01-19 | 2001-01-19 | |
| US60/262,925 | 2001-01-19 | ||
| US10/052,953 US6841780B2 (en) | 2001-01-19 | 2002-01-17 | Method and apparatus for detecting objects |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2002103649A2 true WO2002103649A2 (en) | 2002-12-27 |
| WO2002103649A3 WO2002103649A3 (en) | 2003-04-03 |
Family
ID=26731290
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2002/001120 Ceased WO2002103649A2 (en) | 2001-01-19 | 2002-01-18 | Method and apparatus for detecting objects |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US6841780B2 (en) |
| JP (2) | JP2005508027A (en) |
| AU (1) | AU2002330840A1 (en) |
| WO (1) | WO2002103649A2 (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2011073888A3 (en) * | 2009-12-14 | 2011-09-29 | Montel Inc. | Entity detection system and method for monitoring an area |
| US8823951B2 (en) | 2010-07-23 | 2014-09-02 | Leddartech Inc. | 3D optical detection system and method for a mobile storage system |
Families Citing this family (119)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7176440B2 (en) | 2001-01-19 | 2007-02-13 | Honeywell International Inc. | Method and apparatus for detecting objects using structured light patterns |
| JP2003087772A (ja) * | 2001-09-10 | 2003-03-20 | Fujitsu Ltd | 画像制御装置 |
| US7298866B2 (en) * | 2001-10-15 | 2007-11-20 | Lockheed Martin Corporation | Two dimensional autonomous isotropic detection technique |
| US7154531B2 (en) * | 2001-10-26 | 2006-12-26 | The Chamberlain Group, Inc. | Detecting objects by digital imaging device |
| JP3829729B2 (ja) * | 2002-02-14 | 2006-10-04 | オムロン株式会社 | 個人認証装置 |
| JP3704706B2 (ja) * | 2002-03-13 | 2005-10-12 | オムロン株式会社 | 三次元監視装置 |
| KR20050057223A (ko) * | 2002-09-05 | 2005-06-16 | 솔비젼 인코포레이티드 | 무영 3d/2d 측정장치 및 방법 |
| DE10244719A1 (de) * | 2002-09-25 | 2004-04-01 | Delphi Technologies, Inc., Troy | Verfahren und System zur Innenraumüberwachung |
| DE10253501A1 (de) * | 2002-11-16 | 2004-05-27 | Robert Bosch Gmbh | Bildgeber |
| US7746379B2 (en) * | 2002-12-31 | 2010-06-29 | Asset Intelligence, Llc | Sensing cargo using an imaging device |
| EP1649314B1 (fr) * | 2003-07-04 | 2008-03-05 | Vincent Lauer | Dispositif d'imagerie a balayage pour microscopie confocale a soustraction d'images |
| KR100601933B1 (ko) * | 2003-11-18 | 2006-07-14 | 삼성전자주식회사 | 사람검출방법 및 장치와 이를 이용한 사생활 보호방법 및 시스템 |
| DE10360761A1 (de) * | 2003-12-23 | 2005-07-28 | Airbus Deutschland Gmbh | Beleuchtungseinrichtung für eine Überwachungskamera |
| US7421112B2 (en) * | 2004-03-12 | 2008-09-02 | General Electric Company | Cargo sensing system |
| US20050207616A1 (en) * | 2004-03-17 | 2005-09-22 | The Chamberlain Group, Inc. | Movable barrier operator with an obstacle detector |
| NZ550905A (en) * | 2004-04-30 | 2009-09-25 | Utc Fire & Security Corp | ATM security system |
| US7440620B1 (en) * | 2004-05-21 | 2008-10-21 | Rockwell Automation B.V. | Infrared safety systems and methods |
| JP2006098252A (ja) * | 2004-09-30 | 2006-04-13 | Brother Ind Ltd | 3次元情報取得方法 |
| EP1929333A1 (en) * | 2005-08-18 | 2008-06-11 | Datasensor S.p.A. | Vision sensor for security systems and its operating method |
| CN101288105B (zh) | 2005-10-11 | 2016-05-25 | 苹果公司 | 用于物体重现的方法和系统 |
| US7627170B2 (en) | 2005-10-11 | 2009-12-01 | Northrop Grumman Corporation | Process for the identification of objects |
| US9330324B2 (en) | 2005-10-11 | 2016-05-03 | Apple Inc. | Error compensation in three-dimensional mapping |
| US7230722B2 (en) * | 2005-10-19 | 2007-06-12 | University Of Maryland | Shadow moire using non-zero talbot distance |
| US7489408B2 (en) * | 2005-11-15 | 2009-02-10 | General Electric Company | Optical edge break gage |
| US7499830B2 (en) * | 2005-11-15 | 2009-03-03 | General Electric Company | Computer-implemented techniques and system for characterizing geometric parameters of an edge break in a machined part |
| EP1956360B2 (en) | 2005-11-30 | 2020-12-02 | Nikon Corporation | An appartus for observing a specimen and a method of providing the same |
| WO2007065259A1 (en) * | 2005-12-06 | 2007-06-14 | March Networks Corporation | System and method for automatic camera health monitoring |
| DE102006003228A1 (de) * | 2006-01-24 | 2007-08-02 | Sick Ag | Vorrichtung zur Überwachung eines Schutzfeldes |
| KR101331543B1 (ko) * | 2006-03-14 | 2013-11-20 | 프라임센스 엘티디. | 스페클 패턴을 이용한 3차원 센싱 |
| JP5592070B2 (ja) * | 2006-03-14 | 2014-09-17 | プライム センス リミティド | 三次元検知のために深度変化させる光照射野 |
| DE202006008112U1 (de) * | 2006-05-20 | 2006-08-10 | Sick Ag | Optoelektronische Schutzeinrichtung |
| US7925117B2 (en) | 2006-06-27 | 2011-04-12 | Honeywell International Inc. | Fusion of sensor data and synthetic data to form an integrated image |
| DE102006034926A1 (de) * | 2006-07-28 | 2008-01-31 | Sick Ag | Entfernungsmessgerät |
| US7466628B2 (en) * | 2006-08-15 | 2008-12-16 | Coda Octopus Group, Inc. | Method of constructing mathematical representations of objects from reflected sonar signals |
| WO2008036354A1 (en) * | 2006-09-19 | 2008-03-27 | Braintech Canada, Inc. | System and method of determining object pose |
| US7986336B2 (en) * | 2006-11-27 | 2011-07-26 | Eastman Kodak Company | Image capture apparatus with indicator |
| EP1927957A1 (de) * | 2006-11-29 | 2008-06-04 | Sick Ag | Vorrichtung und Verfahren zur Überwachung eines Überwachungsbereichs |
| WO2008076942A1 (en) * | 2006-12-15 | 2008-06-26 | Braintech Canada, Inc. | System and method of identifying objects |
| WO2008087652A2 (en) * | 2007-01-21 | 2008-07-24 | Prime Sense Ltd. | Depth mapping using multi-beam illumination |
| NL1033589C2 (nl) * | 2007-03-26 | 2008-09-29 | Maasland Nv | Samenstel van een melkrobot met een melkrobotvoerplaats, en inrichting voor het grijpen en verplaatsen van materiaal. |
| WO2008120217A2 (en) * | 2007-04-02 | 2008-10-09 | Prime Sense Ltd. | Depth mapping using projected patterns |
| US7925075B2 (en) * | 2007-05-07 | 2011-04-12 | General Electric Company | Inspection system and methods with autocompensation for edge break gauging orientation |
| HUP0700391A2 (en) * | 2007-06-05 | 2008-12-29 | Zsombor Lazar | Method of determining an intrusion into a monitored area and system for generating output signals in response to such intrusion |
| WO2008149923A1 (ja) * | 2007-06-07 | 2008-12-11 | The University Of Electro-Communications | 物体検出装置とそれを適用したゲート装置 |
| US8494252B2 (en) * | 2007-06-19 | 2013-07-23 | Primesense Ltd. | Depth mapping using optical elements having non-uniform focal characteristics |
| DE102007036129B3 (de) * | 2007-08-01 | 2008-09-25 | Sick Ag | Vorrichtung und Verfahren zur dreidimensionalen Überwachung eines Raumbereichs mit mindestens zwei Bildsensoren |
| US7957583B2 (en) * | 2007-08-02 | 2011-06-07 | Roboticvisiontech Llc | System and method of three-dimensional pose estimation |
| DE102007062949A1 (de) * | 2007-12-21 | 2009-06-25 | Robert Bosch Gmbh | Werkzeugmaschinenvorrichtung |
| DE202008017729U1 (de) * | 2008-06-10 | 2010-06-10 | Sick Ag | Dreidimensionale Überwachung und Absicherung eines Raumbereichs |
| US8456517B2 (en) * | 2008-07-09 | 2013-06-04 | Primesense Ltd. | Integrated processor for 3D mapping |
| US8233734B2 (en) * | 2008-09-22 | 2012-07-31 | Microsoft Corporation | Image upsampling with training images |
| EP2166304A1 (de) * | 2008-09-23 | 2010-03-24 | Sick Ag | Beleuchtungseinheit und Verfahren zur Erzeugung eines selbstunähnlichen Musters |
| US8559699B2 (en) * | 2008-10-10 | 2013-10-15 | Roboticvisiontech Llc | Methods and apparatus to facilitate operations in image based systems |
| US7932838B2 (en) * | 2008-11-17 | 2011-04-26 | Honeywell International, Inc. | Aircraft collision avoidance system |
| US8194916B2 (en) * | 2008-12-24 | 2012-06-05 | Weyerhaeuser Nr Company | Method and apparatus for monitoring tree growth |
| US8462207B2 (en) * | 2009-02-12 | 2013-06-11 | Primesense Ltd. | Depth ranging with Moiré patterns |
| US9147324B2 (en) * | 2009-02-23 | 2015-09-29 | Honeywell International Inc. | System and method to detect tampering at ATM machines |
| US8786682B2 (en) * | 2009-03-05 | 2014-07-22 | Primesense Ltd. | Reference image techniques for three-dimensional sensing |
| US8717417B2 (en) * | 2009-04-16 | 2014-05-06 | Primesense Ltd. | Three-dimensional mapping and imaging |
| US8427649B2 (en) * | 2009-05-15 | 2013-04-23 | Michigan Aerospace Corporation | Range imaging lidar |
| US8797550B2 (en) | 2009-04-21 | 2014-08-05 | Michigan Aerospace Corporation | Atmospheric measurement system |
| WO2011079323A2 (en) * | 2009-12-24 | 2011-06-30 | Michigan Aerospace Corporation | Light processing system and method |
| WO2011013079A1 (en) * | 2009-07-30 | 2011-02-03 | Primesense Ltd. | Depth mapping based on pattern matching and stereoscopic information |
| DE102009028212A1 (de) * | 2009-08-04 | 2011-02-10 | Robert Bosch Gmbh | Verfahren zum Überwachen eines Bereichs |
| JP5355316B2 (ja) * | 2009-09-10 | 2013-11-27 | キヤノン株式会社 | テンプレート画像の評価方法及び生体運動検出装置 |
| CN102656543A (zh) * | 2009-09-22 | 2012-09-05 | 泊布欧斯技术有限公司 | 计算机装置的远程控制 |
| DE102009055623B4 (de) * | 2009-11-25 | 2016-01-28 | Technische Universität Braunschweig | Verfahren und Einrichtung zur optischen Aktivitätenerkennung |
| US8830227B2 (en) * | 2009-12-06 | 2014-09-09 | Primesense Ltd. | Depth-based gain control |
| WO2011085420A1 (de) * | 2010-01-18 | 2011-07-21 | Stefan Wieser | Vorrichtung und verfahren zum überwachen einer gebäudeöffnung |
| US8982182B2 (en) * | 2010-03-01 | 2015-03-17 | Apple Inc. | Non-uniform spatial resource allocation for depth mapping |
| EP2558886B1 (de) * | 2010-04-16 | 2014-03-05 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Einrichtung zur überwachung mindestens eines dreidimensionalen sicherheitsbereichs |
| CN103053167B (zh) | 2010-08-11 | 2016-01-20 | 苹果公司 | 扫描投影机及用于3d映射的图像捕获模块 |
| US9870068B2 (en) | 2010-09-19 | 2018-01-16 | Facebook, Inc. | Depth mapping with a head mounted display using stereo cameras and structured light |
| JP5983409B2 (ja) | 2010-09-22 | 2016-08-31 | 日本電気株式会社 | 撮影装置、画像転送方法、及びプログラム |
| US20120106778A1 (en) * | 2010-10-28 | 2012-05-03 | General Electric Company | System and method for monitoring location of persons and objects |
| WO2012066501A1 (en) | 2010-11-19 | 2012-05-24 | Primesense Ltd. | Depth mapping using time-coded illumination |
| JP2012133759A (ja) * | 2010-11-29 | 2012-07-12 | Canon Inc | 侵入物体の検知を行うことができる物体追尾装置、物体追尾方法及び記憶媒体 |
| US9131136B2 (en) | 2010-12-06 | 2015-09-08 | Apple Inc. | Lens arrays for pattern projection and imaging |
| US9030528B2 (en) | 2011-04-04 | 2015-05-12 | Apple Inc. | Multi-zone imaging sensor and lens array |
| US8989437B2 (en) | 2011-05-16 | 2015-03-24 | Microsoft Corporation | Salient object detection by composition |
| US8831287B2 (en) * | 2011-06-09 | 2014-09-09 | Utah State University | Systems and methods for sensing occupancy |
| US8908024B2 (en) | 2011-06-29 | 2014-12-09 | Honeywell International Inc. | System for detecting an item within a specified zone |
| KR101446902B1 (ko) * | 2011-08-19 | 2014-10-07 | 한국전자통신연구원 | 사용자 인터랙션 장치 및 방법 |
| CN102360510A (zh) * | 2011-09-29 | 2012-02-22 | 沈阳体育学院 | 莫阿条纹动态图 |
| US20140333763A1 (en) * | 2011-11-22 | 2014-11-13 | Schneider Electric Buildings, Llc | Method and system for controlling access using a smart optical sensor |
| CN103226060B (zh) * | 2012-01-31 | 2016-08-24 | 通用电气公司 | 风力涡轮叶片的检测系统和方法 |
| US9157790B2 (en) | 2012-02-15 | 2015-10-13 | Apple Inc. | Integrated optoelectronic modules with transmitter, receiver and beam-combining optics for aligning a beam axis with a collection axis |
| US8823930B2 (en) * | 2012-08-07 | 2014-09-02 | Carl Zeiss Industrielle Messtechnik Gmbh | Apparatus and method for inspecting an object |
| AU2012268882B2 (en) * | 2012-12-24 | 2015-07-09 | Canon Kabushiki Kaisha | Estimating phase for phase-stepping algorithms |
| EP2801958B1 (en) * | 2013-05-08 | 2016-09-14 | Axis AB | Monitoring method and camera |
| BR112016009202A8 (pt) | 2013-10-23 | 2020-03-24 | Oculus Vr Llc | aparelhos e método para gerar um padrão de luz estruturada |
| US9224044B1 (en) | 2014-07-07 | 2015-12-29 | Google Inc. | Method and system for video zone monitoring |
| US9544636B2 (en) | 2014-07-07 | 2017-01-10 | Google Inc. | Method and system for editing event categories |
| US10140827B2 (en) | 2014-07-07 | 2018-11-27 | Google Llc | Method and system for processing motion event notifications |
| US9501915B1 (en) | 2014-07-07 | 2016-11-22 | Google Inc. | Systems and methods for analyzing a video stream |
| US10127783B2 (en) | 2014-07-07 | 2018-11-13 | Google Llc | Method and device for processing motion events |
| US9449229B1 (en) | 2014-07-07 | 2016-09-20 | Google Inc. | Systems and methods for categorizing motion event candidates |
| USD782495S1 (en) | 2014-10-07 | 2017-03-28 | Google Inc. | Display screen or portion thereof with graphical user interface |
| DE102014226691A1 (de) | 2014-12-19 | 2016-06-23 | Carl Zeiss Industrielle Messtechnik Gmbh | Verfahren zur Überwachung eines Koordinatenmessgeräts |
| US9435635B1 (en) * | 2015-02-27 | 2016-09-06 | Ge Aviation Systems Llc | System and methods of detecting an intruding object in a relative navigation system |
| WO2016154218A1 (en) | 2015-03-22 | 2016-09-29 | Oculus Vr, Llc | Depth mapping with a head mounted display using stereo cameras and structured light |
| US9361011B1 (en) | 2015-06-14 | 2016-06-07 | Google Inc. | Methods and systems for presenting multiple live video feeds in a user interface |
| GB201511334D0 (en) * | 2015-06-29 | 2015-08-12 | Nokia Technologies Oy | A method, apparatus, computer and system for image analysis |
| DE102015215234A1 (de) * | 2015-08-10 | 2017-02-16 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Einrichtung zum Absichern eines Sicherheitsbereichs um mindestens eine automatisch arbeitende Maschine |
| CA3006483A1 (en) * | 2015-11-25 | 2017-06-01 | VHS IP Pty Ltd | Worksite safety device using lidar |
| JP6668763B2 (ja) * | 2016-01-13 | 2020-03-18 | セイコーエプソン株式会社 | 画像認識装置、画像認識方法および画像認識ユニット |
| KR101779590B1 (ko) * | 2016-03-09 | 2017-09-19 | 주식회사 에스원 | 패턴무늬를 이용한 면 감시 시스템 및 이를 이용한 면 감시 방법 |
| US10506237B1 (en) | 2016-05-27 | 2019-12-10 | Google Llc | Methods and devices for dynamic adaptation of encoding bitrate for video streaming |
| US10380429B2 (en) | 2016-07-11 | 2019-08-13 | Google Llc | Methods and systems for person detection in a video feed |
| US11783010B2 (en) | 2017-05-30 | 2023-10-10 | Google Llc | Systems and methods of person recognition in video streams |
| CN107323114B (zh) * | 2017-06-22 | 2019-08-16 | 珠海汇金科技股份有限公司 | 印控仪的入侵检测方法、系统及印控仪 |
| WO2019005916A1 (en) * | 2017-06-28 | 2019-01-03 | Schneider Electric It Corporation | SYSTEMS AND METHODS FOR INTRUSION DETECTION FOR BAY SPEAKERS |
| US10664688B2 (en) | 2017-09-20 | 2020-05-26 | Google Llc | Systems and methods of detecting and responding to a visitor to a smart home environment |
| US10948578B2 (en) * | 2018-10-10 | 2021-03-16 | International Business Machines Corporation | Distinguishing holographic objects from physical objects |
| US11232685B1 (en) * | 2018-12-04 | 2022-01-25 | Amazon Technologies, Inc. | Security system with dual-mode event video and still image recording |
| JP6841297B2 (ja) * | 2019-05-31 | 2021-03-10 | 株式会社デンソー | ビジュアルサーボシステム |
| US11388775B2 (en) | 2019-09-11 | 2022-07-12 | Honeywell International Inc. | Systems and methods for identifying blockages of emergency exists in a building |
| DE102020214251A1 (de) * | 2020-11-12 | 2022-05-12 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Bereitstellen von Überwachungsdaten zur Detektion eines beweglichen Objekts, Verfahren zur Detektion eines beweglichen Objekts, Verfahren zum Herstellen zumindest eines vordefinierten punktsymmetrischen Bereichs und Vorrichtung |
| DE102021115280A1 (de) * | 2021-06-14 | 2022-12-15 | Agtatec Ag | Automatische Türanordnung mit Sensorvorrichtung und Verfahren zum Betreiben einer solchen automatischen Türanordnung |
Family Cites Families (92)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5845000A (en) | 1992-05-05 | 1998-12-01 | Automotive Technologies International, Inc. | Optical identification and monitoring system using pattern recognition for use with vehicles |
| US4589140A (en) | 1983-03-21 | 1986-05-13 | Beltronics, Inc. | Method of and apparatus for real-time high-speed inspection of objects for identifying or recognizing known and unknown portions thereof, including defects and the like |
| US4923066A (en) | 1987-10-08 | 1990-05-08 | Elor Optronics Ltd. | Small arms ammunition inspection system |
| FR2665317B1 (fr) | 1990-07-27 | 1996-02-09 | Thomson Surveillance Video | Camera de surveillance a support integre. |
| EP0484076B1 (en) | 1990-10-29 | 1996-12-18 | Kabushiki Kaisha Toshiba | Video camera having focusing and image-processing function |
| KR930010843B1 (ko) | 1990-12-15 | 1993-11-12 | 삼성전자 주식회사 | 이동감시 카메라장치 |
| US5359363A (en) | 1991-05-13 | 1994-10-25 | Telerobotics International, Inc. | Omniview motionless camera surveillance system |
| US5479021A (en) | 1991-06-10 | 1995-12-26 | Picker International, Inc. | Transmission line source assembly for spect cameras |
| JP3151921B2 (ja) | 1991-06-17 | 2001-04-03 | 松下電器産業株式会社 | テレビジョンカメラ装置 |
| CA2062620C (en) | 1991-07-31 | 1998-10-06 | Robert Paff | Surveillance apparatus with enhanced control of camera and lens assembly |
| US5164827A (en) | 1991-08-22 | 1992-11-17 | Sensormatic Electronics Corporation | Surveillance system with master camera control of slave cameras |
| CA2068022C (en) | 1991-09-17 | 2002-07-09 | Norbert M. Stiepel | Surveillance device with eyeball assembly and pivotably mountable carriage assembly |
| USD349713S (en) | 1991-11-18 | 1994-08-16 | Elmo Company Ltd. | Surveillance camera |
| USD349714S (en) | 1991-11-18 | 1994-08-16 | Elmo Company Ltd. | Surveillance camera |
| FR2686720A1 (fr) | 1992-01-28 | 1993-07-30 | Bidault Louis | Dispositif de surveillance a camera mobile. |
| JP3214704B2 (ja) | 1992-03-06 | 2001-10-02 | ローズ コーポレーション | ラインマーキングマシン用制御装置 |
| US5835613A (en) | 1992-05-05 | 1998-11-10 | Automotive Technologies International, Inc. | Optical identification and monitoring system using pattern recognition for use with vehicles |
| FR2692423B1 (fr) | 1992-06-16 | 1995-12-01 | Thomson Csf | Camera d'observation multistandard et systeme de surveillance utilisant une telle camera. |
| JPH0667266A (ja) | 1992-08-21 | 1994-03-11 | Ngk Insulators Ltd | 防犯カメラ装置および警報システム |
| USD354973S (en) | 1992-10-07 | 1995-01-31 | Sony Corporation | Surveillance video camera |
| USD347442S (en) | 1992-11-06 | 1994-05-31 | Falconer Leonard S | Combined imitation surveillance camera and support therefor |
| US5657076A (en) * | 1993-01-12 | 1997-08-12 | Tapp; Hollis M. | Security and surveillance system |
| USD349913S (en) | 1993-01-22 | 1994-08-23 | Morris Bryan W | Surveillance video camera security box |
| US5418567A (en) | 1993-01-29 | 1995-05-23 | Bayport Controls, Inc. | Surveillance camera system |
| USD349911S (en) | 1993-04-05 | 1994-08-23 | Koyo Electronics Industries Co., Ltd. | Surveillance camera |
| GB9308952D0 (en) | 1993-04-30 | 1993-06-16 | Philips Electronics Uk Ltd | Tracking objects in video sequences |
| EP0631430A3 (en) | 1993-06-22 | 1995-02-22 | Nippon Electric Co | Color image processing device for removing moiré. |
| JP3169196B2 (ja) | 1993-08-10 | 2001-05-21 | オリンパス光学工業株式会社 | カメラ |
| JPH07104362A (ja) | 1993-10-01 | 1995-04-21 | Canon Inc | カメラの制御装置 |
| JP3212777B2 (ja) * | 1993-10-28 | 2001-09-25 | 三菱電機株式会社 | 画像処理装置 |
| JPH07159892A (ja) | 1993-12-08 | 1995-06-23 | Canon Inc | カメラ |
| JPH07175128A (ja) | 1993-12-20 | 1995-07-14 | Canon Inc | カメラ |
| US5436462A (en) * | 1993-12-21 | 1995-07-25 | United Technologies Optical Systems | Video contour measurement system employing moire interferometry having a beat frequency pattern |
| JPH07191390A (ja) | 1993-12-27 | 1995-07-28 | Canon Inc | カメラ |
| CA2113399C (en) * | 1994-01-13 | 2002-03-05 | Greg Vaillancourt | Remote fuel station |
| JPH07222039A (ja) | 1994-01-31 | 1995-08-18 | Mitsubishi Electric Corp | ビデオカメラの電源装置 |
| DE4405376C1 (de) * | 1994-02-19 | 1995-02-16 | Leuze Electronic Gmbh & Co | Verfahren zum Erfassen von Objekten in einem Überwachungsbereich |
| JP3293308B2 (ja) | 1994-03-10 | 2002-06-17 | 三菱電機株式会社 | 人物状態検出装置 |
| JP3462899B2 (ja) | 1994-03-14 | 2003-11-05 | 松下電器産業株式会社 | 三板式テレビカメラ装置 |
| JPH07281276A (ja) | 1994-04-04 | 1995-10-27 | Konica Corp | カメラ |
| FR2719670B1 (fr) | 1994-05-03 | 1996-07-05 | Sopha Medical | Gamma caméra à plans d'approche et de sécurité. |
| US5613013A (en) * | 1994-05-13 | 1997-03-18 | Reticula Corporation | Glass patterns in image alignment and analysis |
| US5627616A (en) | 1994-06-22 | 1997-05-06 | Philips Electronics North America Corporation | Surveillance camera system |
| GB9413413D0 (en) | 1994-07-04 | 1994-08-24 | At & T Global Inf Solution | Apparatus and method for testing bank-notes |
| US5477212A (en) | 1994-07-18 | 1995-12-19 | Rumpel; David C. | Surveillance camera simulator apparatus |
| JP3531231B2 (ja) | 1994-09-07 | 2004-05-24 | Nikon Corp. | Power supply circuit, load device, and camera equipped with a power supply circuit |
| USD378095S (en) | 1994-09-26 | 1997-02-18 | Elmo Company Limited | Surveillance camera |
| JPH08140941A (ja) | 1994-11-25 | 1996-06-04 | Canon Inc | Ophthalmologic imaging device |
| USD365834S (en) | 1995-01-06 | 1996-01-02 | Dozier Charles W | Housing for a surveillance camera |
| US6151065A (en) | 1995-06-20 | 2000-11-21 | Steed; Van P. | Concealed integrated vehicular camera safety system |
| KR19990029064A (ko) * | 1995-07-18 | 1999-04-15 | 낸시 엘. 후체슨 | Moiré interference system and method with extended image depth |
| US5691765A (en) | 1995-07-27 | 1997-11-25 | Sensormatic Electronics Corporation | Image forming and processing device and method for use with no moving parts camera |
| WO1997005744A1 (en) | 1995-07-27 | 1997-02-13 | Sensormatic Electronics Corporation | Image splitting, forming and processing device and method for use with no moving parts camera |
| JP3350296B2 (ja) | 1995-07-28 | 2002-11-25 | Mitsubishi Electric Corp. | Face image processing device |
| JP3220626B2 (ja) | 1995-09-20 | 2001-10-22 | Sharp Corp. | Vehicle-mounted surveillance camera device |
| US5649255A (en) | 1995-09-25 | 1997-07-15 | Sensormatic Electronics Corporation | Video surveillance camera release and removal mechanism |
| JP3497929B2 (ja) * | 1995-09-29 | 2004-02-16 | Fuji Heavy Industries Ltd. | Intruder monitoring device |
| JPH09130780A (ja) * | 1995-10-27 | 1997-05-16 | Toshiba Corp | Monitoring device |
| US5745170A (en) | 1995-11-01 | 1998-04-28 | Itt Corporation | Mounting device for joining a night vision device to a surveillance camera |
| SG87750A1 (en) | 1995-11-01 | 2002-04-16 | Thomson Consumer Electronics | Surveillance system for a video recording camera |
| DE69635101T2 (de) | 1995-11-01 | 2006-06-01 | Canon K.K. | Method for extracting objects and image pickup device using this method |
| US5793900A (en) | 1995-12-29 | 1998-08-11 | Stanford University | Generating categorical depth maps using passive defocus sensing |
| IL116703A (en) * | 1996-01-08 | 2001-01-11 | Israel State | System and method for detecting an intruder |
| US5818519A (en) | 1996-01-17 | 1998-10-06 | Wren; Clifford T. | Surveillance camera mounting apparatus |
| JPH09193078A (ja) | 1996-01-22 | 1997-07-29 | Hitachi Constr Mach Co Ltd | Camera direction control device for a remotely operated machine |
| US5752100A (en) | 1996-01-26 | 1998-05-12 | Eastman Kodak Company | Driver circuit for a camera autofocus laser diode with provision for fault protection |
| JP3146150B2 (ja) | 1996-04-01 | 2001-03-12 | Star Micronics Co., Ltd. | Surveillance camera system |
| JPH1031256A (ja) | 1996-07-16 | 1998-02-03 | Fuji Photo Film Co Ltd | Camera |
| JPH1042231A (ja) | 1996-07-19 | 1998-02-13 | Canon Inc | Digital camera and digital camera system |
| US5953055A (en) | 1996-08-08 | 1999-09-14 | Ncr Corporation | System and method for detecting and analyzing a queue |
| JP3800617B2 (ja) * | 1996-09-25 | 2006-07-26 | Sony Corp. | Image collation device, image collation method, fingerprint collation device, and fingerprint collation method |
| DE19643018B4 (de) * | 1996-10-18 | 2010-06-17 | Isra Surface Vision Gmbh | Method and device for measuring the profile of reflective surfaces |
| US6509967B1 (en) * | 1996-10-18 | 2003-01-21 | Innomess Gesellschaft Fur Messtechnik Mbh | Method for detecting optical errors in large surface panels |
| DE19644278A1 (de) | 1996-10-24 | 1998-05-07 | Ines Elektronik Systementwickl | Optical barrier and monitoring device constructed therefrom |
| US5731832A (en) | 1996-11-05 | 1998-03-24 | Prescient Systems | Apparatus and method for detecting motion in a video signal |
| USD399517S (en) | 1996-12-06 | 1998-10-13 | Elmo Co., Ltd. | Surveillance television camera |
| JP3103931B2 (ja) * | 1997-02-19 | 2000-10-30 | Kanebo Ltd. | Indoor monitoring device |
| DE19709992C1 (de) * | 1997-03-11 | 1998-10-01 | Betr Forsch Inst Angew Forsch | Method for measuring the surface geometry of hot strip |
| US6727938B1 (en) | 1997-04-14 | 2004-04-27 | Robert Bosch Gmbh | Security system with maskable motion detection and camera with an adjustable field of view |
| WO1998046116A2 (en) | 1997-04-16 | 1998-10-22 | Charles Jeffrey R | Solid catadioptric omnidirectional optical system having central coverage means which is associated with a camera, projector, medical instrument, or similar article |
| US5857032A (en) * | 1997-04-25 | 1999-01-05 | General Electric Company | System and method for measuring and monitoring three-dimensional shaped objects |
| US6456320B2 (en) * | 1997-05-27 | 2002-09-24 | Sanyo Electric Co., Ltd. | Monitoring system and imaging system |
| US5790910A (en) | 1997-08-04 | 1998-08-04 | Peerless Industries, Inc. | Camera mounting apparatus |
| US5852754A (en) | 1997-08-27 | 1998-12-22 | Videolarm, Inc. | Pressurized housing for surveillance camera |
| DE19809210A1 (de) | 1998-03-04 | 1999-09-16 | Siemens Ag | Method and device for monitoring a scene |
| JP3531512B2 (ja) * | 1998-12-25 | 2004-05-31 | Matsushita Electric Works Ltd. | Sleep state monitoring device |
| JP4168518B2 (ja) * | 1999-03-09 | 2008-10-22 | Konica Minolta Holdings Inc. | Image processing system |
| IL130465A0 (en) * | 1999-06-14 | 2000-06-01 | Prolaser Ltd | Method and apparatus for measuring power of an optical element for its mapping |
| DE29911390U1 (de) * | 1999-06-30 | 1999-08-12 | Sick AG, 79183 Waldkirch | Optoelectronic monitoring system |
| US6564166B1 (en) * | 1999-10-27 | 2003-05-13 | Georgia Tech Research Corporation | Projection moiré method and apparatus for dynamic measuring of thermal induced warpage |
| DE10026710A1 (de) | 2000-05-30 | 2001-12-06 | Sick Ag | Optoelectronic protective device |
| US6456384B1 (en) * | 2000-11-09 | 2002-09-24 | Tropel Corporation | Moiré interferometer with overlapping illumination and imaging systems |
- 2002
  - 2002-01-17: US US10/052,953 published as US6841780B2 (en); status: Expired - Lifetime (not active)
  - 2002-01-18: JP JP2003505889A published as JP2005508027A (ja); status: Pending (active)
  - 2002-01-18: WO PCT/US2002/001120 published as WO2002103649A2 (en); status: Ceased (not active)
  - 2002-01-18: AU AU2002330840A published as AU2002330840A1 (en); status: Abandoned (not active)
- 2010
  - 2010-03-25: JP JP2010069909A published as JP5264815B2 (ja); status: Expired - Fee Related (not active)
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2011073888A3 (en) * | 2009-12-14 | 2011-09-29 | Montel Inc. | Entity detection system and method for monitoring an area |
| US9507050B2 (en) | 2009-12-14 | 2016-11-29 | Montel Inc. | Entity detection system and method for monitoring an area |
| US8823951B2 (en) | 2010-07-23 | 2014-09-02 | Leddartech Inc. | 3D optical detection system and method for a mobile storage system |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2002103649A3 (en) | 2003-04-03 |
| JP5264815B2 (ja) | 2013-08-14 |
| AU2002330840A1 (en) | 2003-01-02 |
| US6841780B2 (en) | 2005-01-11 |
| US20020125435A1 (en) | 2002-09-12 |
| JP2005508027A (ja) | 2005-03-24 |
| JP2010211805A (ja) | 2010-09-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US6841780B2 (en) | | Method and apparatus for detecting objects |
| US7176440B2 (en) | | Method and apparatus for detecting objects using structured light patterns |
| US6469734B1 (en) | | Video safety detector with shadow elimination |
| KR101808587B1 (ko) | | Intelligent integrated monitoring and control system using object recognition, tracking surveillance, and abnormal-situation detection technology |
| US7167575B1 (en) | | Video safety detector with projected pattern |
| US10102427B2 (en) | | Methods for performing biometric recognition of a human eye and corroboration of same |
| JP3827426B2 (ja) | | Fire detection device |
| US10595014B2 (en) | | Object distance determination from image |
| WO2005122094A1 (en) | | Method for detecting desired objects in a highly dynamic environment by a monitoring system |
| CA2275893C (en) | | Low false alarm rate video security system using object classification |
| EP1068588A1 (en) | | Method for rejection of flickering lights in an imaging system |
| EP1354303B1 (en) | | Method and apparatus for detecting objects |
| JP2003121556A (ja) | | Object detection device and method |
| CN110782495A (zh) | | Device and method for generating and monitoring a safety zone in a workspace |
| JP6093270B2 (ja) | | Image sensor |
| JP6133700B2 (ja) | | Image sensor |
| JP6155106B2 (ja) | | Image sensor |
| JP7262412B2 (ja) | | Image processing device and image processing program |
| WO2012065241A1 (en) | | System and method for video recording device detection |
| Hamza et al. | | Virtual moire interference approach for an industrial safety monitoring system |
| JP2012212239A (ja) | | Moving object monitoring system |
| HK1245750A1 (zh) | | Moving handrail monitoring system for a passenger conveyor, passenger conveyor, and monitoring method thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AK | Designated states | Kind code of ref document: A2. Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZM ZW |
| | AL | Designated countries for regional patents | Kind code of ref document: A2. Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| | WWE | Wipo information: entry into national phase | Ref document number: 2002768271; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2003505889; Country of ref document: JP |
| | WWP | Wipo information: published in national office | Ref document number: 2002768271; Country of ref document: EP |
| | REG | Reference to national code | Ref country code: DE; Ref legal event code: 8642 |
| | WWG | Wipo information: grant in national office | Ref document number: 2002768271; Country of ref document: EP |