WO2015170962A1 - Method for providing multiple perspective views and multiple pan-tilt-zoom tracking using a single camera - Google Patents
- Publication number
- WO2015170962A1 (PCT/MY2015/000031)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
- G01S3/7864—T.V. type tracking systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19641—Multiple cameras having overlapping views on a single scene
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
Definitions
- FIGURE 1 illustrates an overall system for generating multiple perspective views and Pan-Tilt-Zoom (PTZ) tracking using a single camera in accordance with an embodiment of the present invention
- FIGURE 2A shows the image capturing means in accordance with an embodiment of the present invention
- FIGURE 2B, 2C, 2D show samples of the multiple perspective views, multiple PTZ views and simplified floor plan which can be generated using the image capturing means in accordance with an embodiment of the present invention
- FIGURE 3 shows the overall flow of the system and method in accordance with an embodiment of the present invention
- FIGURE 4 shows one aspect of the initialization routine in accordance with an embodiment of the present invention
- FIGURE 5 shows the process for generating the corrected 360° table of the initialization routine in accordance with an embodiment of the present invention
- FIGURE 6 shows the process for generating simplified 360° floor plan of the initialization routine in accordance with an embodiment of the present invention
- FIGURE 7 shows the process for generating perspective view lookup table of the initialization routine in accordance with an embodiment of the present invention
- FIGURE 8 shows a flowchart generalizing the entire process for the second routine being the process routine of the method of the present invention
- FIGURE 9A shows a flowchart generalizing the process of selectively updating the 360° corrected view in accordance with an embodiment of the present invention.
- FIGURE 9B shows the steps involved in determining the corresponding edge points of enclosing bounding box in motion map and performing image intensity interpolation on missing pixels within the corresponding edge points in accordance with an embodiment of the present invention
- FIGURE 10 shows the object tracking process in accordance with an embodiment of the present invention
- FIGURE 11 shows the process flow in PTZ tracking of the process routine in accordance with an embodiment of the present invention
- FIGURE 12A shows the process of updating perspective view in accordance with an embodiment of the present invention
- FIGURE 12B illustrates the realigning process in accordance with an embodiment of the present invention
- FIGURE 13 shows the display routine in accordance with an embodiment of the present invention.
- the term "input” in this specification is used to mean the original image input captured by the image capturing means;
- the term “corrected view” or “corrected image” refers to any form of image or view which has been edited and prior to generating a final view or image for display.
- the overall system for providing a wider coverage of view contemplated in accordance with one embodiment of the present invention is shown in FIGURE 1.
- the system includes an image capturing means 100, a display means 102 and a processor 103.
- the processor 103 is connected to the image capturing means 100 and display means 102 and configured to perform image-processing functionalities to be elucidated herein.
- in one embodiment and as shown in FIGURE 2A, the image capturing means 100 is positioned in a manner such that it can provide a top-down overall view of the area under surveillance.
- the image capturing means 100 is equipped with a fish-eye lens in order to capture a wide coverage of view within the area.
- the image capturing means 100 is configured to capture multiple perspective views, multiple pan-tilt-zoom (PTZ) views and a simplified floor plan as shown in FIGURE 2B, 2C and 2D respectively. Accordingly, the image capturing means 100 is further configured to continuously track and maintain each object of interest within the area under surveillance without having to install multiple PTZ-based cameras.
- the processor 103 is configured to perform two main functionalities based on the captured images; these are the initialization routine 300 and the process routine 310. Results or output from the processor 103 are displayed based on the display routine 320 with the display means 102.
- a system incorporating the method of the present invention is shown in FIGURE 3 as a flowchart generalizing the entire process in accordance with an embodiment of the present invention.
- the first routine being the initialization routine 300 includes generating lookup tables, perspective view lookup table and generating simplified 360° floor plan of the originally captured view.
- the second routine is the processing routine 310 and generally includes performing object tracking, selectively updating 360° corrected view and updating the perspective view.
- the third routine is the display routine 320 whereby the resulting output or view obtained is wider and less blocked than that of the original view.
- FIGURE 4 illustrates an initialization routine 300 adapted in FIGURE 3 in accordance with one embodiment of the present invention.
- the initialization routine 300 includes three main processes to be performed on every input image captured by the single image capturing means 100. It is presumed, for the purpose of elucidating the operational steps of the present invention, that the input image or view captured has a fish-eye view, a view truncated at its edges, or an otherwise distorted view.
- the next process is generating corrected 360° lookup tables at 401.
- the system proceeds to generate the simplified 360° floor plan, whereby the output from 402 is used to generate 360° floor plan at 402A.
- the dynamic perspective view lookup table is generated at 403, which is then used for generating the perspective view at 403A.
- a 360° corrected view is maintained, whereby in this view, the distortion at the edge of the wide view image is corrected.
- FIGURE 5 shows the process for generating the corrected 360° table of the initialization routine 300 in accordance with an embodiment of the present invention.
- the process starts at 501, having a fisheye image as input and calibrating the camera.
- the process is followed by transformation of fish-eye view to 360° corrected image at 502.
- the 360° lookup table is generated, at which this lookup table will be used to determine which pixel coordinates in 360° corrected image, based on any pixel coordinate in the original distorted input.
- a mapping (u, v) → (x, y) is used, wherein (u, v) is the coordinate in the fisheye view, while (x, y) is the coordinate in the 360° corrected image.
- the process ends at 504, at which a 360° lookup table is generated.
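The fisheye-to-360° unwrapping and its lookup table can be sketched as below. This is a minimal illustration assuming an equidistant fisheye model with the optical centre at the image centre; the specification does not fix a camera model, so the geometry here is an assumption, and a real implementation would use calibrated parameters from step 501.

```python
import numpy as np

def build_360_lookup(side, out_w, out_h):
    """Build a lookup table mapping every (x, y) of the 360-degree
    corrected (unwrapped) image to a source pixel (u, v) of the
    square fisheye frame: column -> azimuth, row -> radius."""
    cx = cy = side / 2.0                      # assumed optical centre
    max_r = side / 2.0                        # assumed fisheye circle radius
    xs = np.arange(out_w)
    ys = np.arange(out_h)
    theta = 2.0 * np.pi * xs / out_w          # azimuth per output column
    r = max_r * (1.0 - ys / out_h)            # radius per output row
    u = np.clip((cx + np.outer(r, np.cos(theta))).astype(int), 0, side - 1)
    v = np.clip((cy + np.outer(r, np.sin(theta))).astype(int), 0, side - 1)
    return u, v                               # each of shape (out_h, out_w)

def unwrap(fisheye, lut):
    """Apply the lookup table: nearest-neighbour remap of a fisheye frame
    into the 360-degree corrected view."""
    u, v = lut
    return fisheye[v, u]
```

Built once during the initialization routine, the table lets every subsequent frame be unwrapped with a single indexing operation, which is what makes the selective per-pixel updates in the process routine cheap.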
- FIGURE 6 shows the process for generating simplified 360° floor plan 402 of the initialization routine 300 in accordance with an embodiment of the present invention.
- the process is initiated at 601, whereby ROI points are used as input.
- mapping of Region of Interest (ROI) points to the corrected image is performed.
- the list of ROI based on the fisheye view is used to indicate and thus determine the ROI within the simplified 360° floor plan.
- ROI points are then mapped onto the 360° corrected image using the 360° lookup table obtained in the earlier process referred as 401.
- the corrected view is used as the basis to generate the simplified floor plan at 603, which shows the correct relative distance between these ROIs as compared to the fisheye image or view.
- the process ends at 604 at which a finalized floor plan is generated.
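Mapping an individual ROI point onto the corrected view at 602 is the inverse of the unwrapping described for step 502. A sketch under the same assumed equidistant model with the optical centre at the image centre; the function name and geometry are illustrative, not taken from the specification.

```python
import math

def fisheye_to_corrected(u, v, side, out_w, out_h):
    """Map an ROI point (u, v) in the fisheye frame to its (x, y) in the
    360-degree corrected view: azimuth -> column, radius -> row."""
    cx = cy = side / 2.0                          # assumed optical centre
    du, dv = u - cx, v - cy
    theta = math.atan2(dv, du) % (2.0 * math.pi)  # azimuth in [0, 2*pi)
    r = math.hypot(du, dv)                        # distance from centre
    x = int(theta / (2.0 * math.pi) * out_w) % out_w
    y = int((1.0 - r / (side / 2.0)) * out_h)
    return x, min(max(y, 0), out_h - 1)
```

Because azimuth maps linearly to the horizontal axis, points mapped this way keep their relative angular separation, which is why the corrected view gives correct relative distances between ROIs for the floor plan.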
- FIGURE 7 shows the process for generating perspective view lookup table 403 of the initialization routine 300 in accordance with an embodiment of the present invention.
- the process is initiated at 701, whereby the user may provide the preferred configurations.
- at the next step 702, the number of perspective views is determined. Four perspective views can be generated, whereby the image is divided into four quadrants. Each of the four quadrants is then transformed to simulate a corrected perspective view. For this step, the user can navigate the perspective view, whereby during navigation, the user simulates the pan-tilt-zoom control of a PTZ camera.
- the image also can be divided into multiple perspective views based on user inputs on the ROI points. The input image can be divided, in such a way that each perspective view contains at least one ROI.
- each view will focus on one of the ROIs.
- the method then proceeds to 703, to compute rotation angle for each view.
- the corresponding region with respect to the corrected 360° view is determined at 704.
- realignment of the views is performed, based on the rotation angles obtained at step 703. The realignment is to maintain the views in an upright position regardless of the position in the original fisheye view. Accordingly, the corresponding areas in the 360° corrected view with respect to each of the areas in the fish eye view are determined. And then at 706, a perspective view lookup table is generated.
- a lookup table is generated which maps the coordinate point (u, v) in the fisheye image to a quadrant, and to the coordinates within that respective quadrant. Accordingly, this lookup table can later be used to directly map any coordinate in the fisheye view to any of the quadrant views.
- the perspective views are then displayed at 707. The process ends at 708.
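The quadrant split and rotation-based realignment of steps 702-705 can be sketched roughly as follows. The 90-degree realignment is a deliberate simplification; a real implementation would apply a full affine rotation by the computed angle, and the sector-centre angles are an assumption about how the rotation is derived.

```python
import numpy as np

def split_into_quadrants(img):
    """Step 702: divide the frame into four quadrants, one per view."""
    h2, w2 = img.shape[0] // 2, img.shape[1] // 2
    return [img[:h2, :w2], img[:h2, w2:], img[h2:, :w2], img[h2:, w2:]]

def rotation_angle(view_index, n_views=4):
    """Step 703: rotation (degrees) for a view, taken here as the azimuth
    of its sector centre with respect to the fisheye optical centre."""
    step = 360.0 / n_views
    return view_index * step + step / 2.0

def realign(quadrant, angle_deg):
    """Step 705: keep the view upright by rotating by the nearest multiple
    of 90 degrees (a coarse stand-in for an affine rotation)."""
    k = int(round(angle_deg / 90.0)) % 4
    return np.rot90(quadrant, k)
```

The realignment is what keeps each perspective view upright regardless of where its sector sits in the original fisheye frame.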
- FIGURE 8 shows a flowchart generalizing the entire process for the process routine 310 as adapted in FIGURE 3 in accordance with an embodiment of the present invention.
- the process routine 310 starts at 801, and proceeds to update the selected 360° corrected view at 802.
- the system performs object tracking, whereby a new object is tracked using the PTZ view at 804.
- the output from 804 is used as input in updating the perspective view at 805, at which any form of user configuration 805A can be used as input.
- the process proceeds to updating the 360° floor plan.
- the process ends at 807, whereby all output will be displayed.
- the next process is selectively updating 360° corrected view at 802.
- the selectively updating 360° corrected view process starts with determining the corresponding motion pixels in the 360° corrected view by referring to the lookup table, at 901. It should be noted that only pixels within which changes occur are to be corrected. In this process, the motion area detected by way of motion tracking is used to guide the update subroutine on which pixels need to be updated. Then in 902, these pixels are updated accordingly, based on the actual values in the top-down wide view image. Next in 903, the corresponding edge points of the enclosing bounding box in the motion map in the 360° corrected view are determined. This is to further determine which intermediate pixels require updating. Accordingly, the edges of the bounding box are then mapped onto the corrected view. Finally in 904, intensity interpolation on missing pixels within the corresponding edge points is performed. The steps of 903 and 904 are accordingly illustrated in FIGURE 9B.
- for the object tracking process 803 of the process routine 310 as shown in FIGURE 10, the process starts at 1001, for resampling of video images.
- the video images being the input image are resampled to a lower resolution image.
- Next step is 1002, which is detecting motion within the image.
- any standard or known methods can be utilized, such as a standard background subtraction method. Such detection may include motion pixels within the detected motion area.
- the object is identified and tracked at 1003 by finding the correlation between all detected objects across all subsequent frames and providing the same object label or identifier to the same object across frames of different times. By performing tracking, the historical trajectory information of each object can be obtained and maintained. Such information may include, but is not limited to, object location, dimension and identifier.
- any specific event analysis can be applied at 1004, and thus generating an alarm 1005 to the user on any event occurred.
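The tracking loop at 1001-1003 can be illustrated with a minimal detector and matcher. The specification only calls for "any standard or known" method, so both pieces here are simplified stand-ins: frame differencing in place of full background subtraction, and greedy nearest-centroid matching in place of a full correlation scheme.

```python
import numpy as np

def detect_motion(prev, curr, thresh=25):
    """Step 1002 stand-in: boolean motion map by frame differencing
    on the low-resolution (resampled) input."""
    return np.abs(curr.astype(np.int16) - prev.astype(np.int16)) > thresh

def correlate_objects(prev_centroids, new_centroids, max_dist=20.0):
    """Step 1003 stand-in: give each new detection the identifier of the
    nearest previous object within max_dist, or a fresh identifier."""
    labels = {}
    next_id = max(prev_centroids, default=-1) + 1
    for c in new_centroids:
        best, best_d = None, max_dist
        for oid, p in prev_centroids.items():
            d = ((c[0] - p[0]) ** 2 + (c[1] - p[1]) ** 2) ** 0.5
            if d < best_d:
                best, best_d = oid, d
        if best is None:
            best, next_id = next_id, next_id + 1
        labels[best] = c       # same object keeps the same label over time
    return labels
```

Carrying identifiers across frames this way is what yields the per-object trajectory history used later for event analysis and alarms.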
- the next process flow, being the PTZ tracking 804 of the process routine 310, is generalised in FIGURE 11. The resampled input image and the output from the motion tracking process will be used to invoke the PTZ camera tracking process.
- at the first step 1101, if an event is detected and the event comprises a new object at 1102 (whereby the new object tracking is performed earlier in the motion tracking process, at tracking process 802), a new virtual PTZ view is initialized at 1102A.
- the process proceeds to 1103, to determine the next position of PTZ view with respect to 360° corrected view based on current object location.
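Two of the PTZ-tracking steps lend themselves to short sketches: moving the virtual view centre toward the tracked object (1103), and the colour-similarity consistency check the specification names as tracking feedback. The smoothing gain and the intensity-histogram form of the similarity are illustrative assumptions, not details from the specification.

```python
import numpy as np

def next_ptz_position(current, target, gain=0.3):
    """Step 1103 sketch: move the virtual PTZ view centre a fraction of the
    way toward the object's current location, so the crop follows it
    without jitter."""
    (cx, cy), (tx, ty) = current, target
    return (cx + gain * (tx - cx), cy + gain * (ty - cy))

def color_similarity(view_a, view_b, bins=16):
    """Tracking-consistency feedback: correlation between the intensity
    histograms of the current and previous PTZ views."""
    ha, _ = np.histogram(view_a, bins=bins, range=(0, 255), density=True)
    hb, _ = np.histogram(view_b, bins=bins, range=(0, 255), density=True)
    return float(np.corrcoef(ha, hb)[0, 1])
```

A similarity near 1.0 suggests the PTZ crop is still locked onto the same object; a drop signals that the track may have drifted and should be re-seeded from the motion tracker.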
- FIGURE 12A shows the updating perspective view process 805 of the process routine 310 in accordance with an embodiment of the present invention. This process starts at 1201 to determine whether there is any user navigation input.
- the method proceeds to 1202, to determine the corresponding pixels in perspective views that require update using a lookup table. Further, all motion pixel coordinates detected from motion tracking at process 802 are used to indicate which coordinates in which respective quadrants are to be updated. The perspective views lookup table generated during initialization process 300 and the updated 360° corrected view can be used to map these coordinates.
- a new perspective view center point with respect to the corrected view is determined at step 1203.
- the relative rotation angle for this new center point is computed with respect to previous center point at 1204.
- the new cropped area for the new perspective view is determined based on the previous cropped area, suitably translated by the horizontal and vertical shift from the previous center point.
- the translation is applied to the new cropped area at 1205.
- the rotation angle difference between the current and previous center points is computed. If the angle difference is less than a threshold, e.g. 5 degrees, the cropped view is not realigned; otherwise, it is realigned to ensure that the object appears upright.
- the step of realigning based on the angle difference at 1207 is further illustrated in FIGURE 12B.
- the element referenced as 50 is the corrected perspective view while the element referenced as 51 is the input corrected 360° view.
- the 'X' marking 50a indicates the previous perspective view center point and the dotted rectangular 51a defines the cropping area.
- the second 'X' marking 50b indicates a new center point in the perspective view. As the rotation angle difference between this new center point and the previous center point is not significant, the new cropping area, indicated as 51b, undergoes the translation only.
- the third 'X' marking 50c, having cropping area 51c, indicates that a new rotation is applied to realign the view.
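The translate-then-maybe-rotate decision illustrated by markings 50a-50c can be sketched as follows, assuming the centre coordinates are expressed relative to the fisheye optical centre (the 5-degree threshold follows the example given for step 1207; the function shape is illustrative).

```python
import math

def update_crop(prev_crop, prev_center, new_center, threshold_deg=5.0):
    """Translate the previous cropped area (x, y, w, h) by the centre
    shift, and report whether the rotation-angle difference between the
    old and new centre points calls for a realignment rotation."""
    x, y, w, h = prev_crop
    dx = new_center[0] - prev_center[0]
    dy = new_center[1] - prev_center[1]
    crop = (x + dx, y + dy, w, h)                    # translation, as for 51b
    ang_prev = math.degrees(math.atan2(prev_center[1], prev_center[0]))
    ang_new = math.degrees(math.atan2(new_center[1], new_center[0]))
    diff = ang_new - ang_prev
    needs_realign = abs(diff) >= threshold_deg       # rotation, as for 51c
    return crop, needs_realign, diff
```

Skipping the rotation for small angle differences keeps the view stable between frames while still realigning whenever the object would otherwise appear noticeably tilted.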
- the third routine in accordance with an embodiment of the present invention is the display routine 320 as adapted in FIGURE 3.
- the process is initiated at 1301; technically, it is initiated when the process routine 310 is initiated. All outputs generated from previous routines will be displayed.
- an output image which is wider and less blocked than that of the original input image is displayed.
- the routine is configured to display individual PTZ view for each object and tracking path.
- the process ends at 1304. It should be noted that the final views are accordingly displayed for the user; these are the perspective view, simplified floor plan and views from PTZ object tracking, which can be based on the duration of time (n, n+1, n+2, n+3..).
- the method and system of the present invention provides a wider and less blocked view with a single camera, as opposed to using multiple cameras.
Abstract
The present invention provides a method for generating multiple perspective views and providing multiple pan-tilt-zoom tracking based on a captured image having at least one object of interest as input, using a single camera, comprising: an initialization routine 300 which generally includes generating 360° lookup tables, generating a perspective view lookup table and perspective view, and generating a simplified 360° floor plan based on the input; a processing routine 310 which generally includes performing object tracking and object pan-tilt-zoom (PTZ) tracking, selectively updating the 360° corrected view and updating the perspective view; and a display routine 320, whereby the output image obtained upon being processed by the first two routines 300, 310 is wider and less blocked than the input image. A system thereof is also provided.
Description
METHOD FOR PROVIDING MULTIPLE PERSPECTIVE VIEWS AND MULTIPLE PAN-TILT-ZOOM TRACKING USING A SINGLE CAMERA
FIELD OF INVENTION
[0001] The present invention generally relates to image processing, and more particularly to a method for providing a wide coverage of views based on an area under surveillance using a single camera.
BACKGROUND OF INVENTION
[0002] A wide coverage view is prevalent and highly desirable, especially for surveillance purposes, to enable thorough image analysis of every single space in the area under surveillance. Typically, cameras to be deployed for surveillance purposes are primarily selected based on cost, structure and their capacity to provide a wide coverage view.
[0003] Most available cameras can provide a certain degree of coverage, whereby the cameras are built with the ability to orientate at variable angles within an area under surveillance. However, a problem arises when the area under surveillance is obstructed or hindered by objects; installation of a single camera within the area then becomes decidedly ineffective. In order to resolve such inconvenience, users resort to installing more than one camera within an area in order to obtain a wider coverage. Conventional systems require a minimal deployment of two cameras, at least one being steerable, so as to attain a wider coverage within the monitored area. Expectedly, every additional camera entails increased cost and maintenance constraints. In another exemplary situation, the views obtained are distorted, particularly towards the edge of the image. Accordingly, the resulting visual distortion is not favourable when it comes to viewing as well as understanding and analyzing the image.
[0004] Thus, there remains a considerable need for systems and methods that can conveniently address the above-discussed drawbacks.
SUMMARY
[0005] In accordance with one aspect of the present invention, there is provided a method for generating multiple perspective views and providing multiple pan-tilt-zoom tracking based on a captured image having at least one object of interest as input, using a single camera, comprising: an initialization routine which generally includes generating 360° lookup tables, generating a perspective view lookup table and perspective view, and generating a simplified 360° floor plan based on the input; a processing routine which generally includes performing object tracking and object pan-tilt-zoom (PTZ) tracking, selectively updating the 360° corrected view and updating the perspective view; and a display routine, whereby the output image obtained upon being processed by the first two routines is wider and less blocked than the input image.
[0006] In one embodiment of the present invention, generating 360° lookup tables of the initialization routine includes: transforming the input into 360° corrected view; and generating 360° lookup table for determining pixel coordinates in 360° corrected view based on original pixel coordinate of the input.
[0007] In another embodiment of the present invention, generating perspective view lookup table of the initialization routine includes: determining the number of perspective views; dividing each perspective view into quadrant; transforming each quadrant into a simulation of a corrected perspective view; computing rotation angle for each perspective view; and determining the corresponding region based on the 360° corrected view and realigning the views.
[0008] In another embodiment of the present invention, generating simplified 360° floor plan of the initialization routine includes: mapping of ROI points to the corrected view; and mapping the ROI points onto the 360° corrected view to generate the simplified floor plan.
[0009] In a further embodiment, selectively updating the 360° corrected view of the process routine includes: determining the corresponding motion pixels in the 360° corrected view based on the lookup table generated in an earlier routine; updating the pixels based on actual values of the input; determining the corresponding edge points within the input; mapping the edge points into the corrected view; and performing interpolation on missing pixels based on the edge points.
[0010] In yet another embodiment, object tracking of the process routine includes: resampling the input to a lower resolution image; detecting motion within the image; identifying objects by finding the correlation between all detected objects; and analysing any occurrence of an event and generating an alarm for the event if detected.
[0011] In yet a further embodiment, updating the perspective view of the process routine includes: determining if there is user input; if there is no user navigation input, updating the coordinates in the quadrants based on the motion pixels detected from the motion tracking; if there is user input, determining a new perspective view center point with respect to the corrected view; determining the relative rotation angle based on the new center point, which includes determining a new cropped area for the new perspective view based on the previous cropped area suitably translated by the horizontal and vertical shift from the previous center point; applying the translation to the new cropped area; computing the rotation angle difference between the current and previous center points; and realigning the new cropped area based on the rotation angle difference computation.

[0012] In a further embodiment, object PTZ tracking of the process routine includes: determining the presence of an event; determining whether the event is a new event so as to initialize a virtual PTZ view; determining the next position of the PTZ view with respect to the 360° corrected view based on the current object location; determining the cropping area within the image for maximizing the coverage of the object detected within the PTZ view; extracting the image from the full resolution image and updating the PTZ view; and computing tracking consistency by computing the color similarity between the current and previous PTZ views as feedback to track the moving object.
[0013] In a further embodiment, the display routine, which provides the output image obtained upon processing by the first two routines, includes displaying the finalized multiple perspective views, the simplified floor plan and the outcome of object PTZ tracking.
[0014] In another aspect, there is provided a system for providing a wider and less blocked view of a monitored area based on a top-down view, comprising: an image capturing means configured to capture a top-down image input; and at least one means connected to the image capturing means and configured, based on the top-down image input, to generate multiple perspective views and multiple pan-tilt-zoom (PTZ) tracking.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The invention will be better understood by reference to the description below taken in conjunction with the accompanying drawings herein:
[0016] FIGURE 1 illustrates an overall system for generating multiple perspective views and Pan-Tilt-Zoom (PTZ) tracking using a single camera in accordance with an embodiment of the present invention;
[0017] FIGURE 2A shows the image capturing means in accordance with an embodiment of the present invention;
[0018] FIGURES 2B, 2C and 2D show samples of the multiple perspective views, multiple PTZ views and simplified floor plan, respectively, which can be generated using the image capturing means in accordance with an embodiment of the present invention;
[0019] FIGURE 3 shows the overall flow of the system and method in accordance with an embodiment of the present invention;
[0020] FIGURE 4 shows one aspect of the initialization routine in accordance with an embodiment of the present invention;

[0021] FIGURE 5 shows the process for generating the corrected 360° lookup table of the initialization routine in accordance with an embodiment of the present invention;
[0022] FIGURE 6 shows the process for generating simplified 360° floor plan of the initialization routine in accordance with an embodiment of the present invention;
[0023] FIGURE 7 shows the process for generating the perspective view lookup table of the initialization routine in accordance with an embodiment of the present invention;

[0024] FIGURE 8 shows a flowchart generalizing the entire process for the second routine, being the process routine of the method of the present invention;
[0025] FIGURE 9A shows a flowchart generalizing the process of selectively updating the 360° corrected view in accordance with an embodiment of the present invention;
[0026] FIGURE 9B shows the steps involved in determining the corresponding edge points of enclosing bounding box in motion map and performing image intensity interpolation on missing pixels within the corresponding edge points in accordance with an embodiment of the present invention;
[0027] FIGURE 10 shows the object tracking process in accordance with an embodiment of the present invention;

[0028] FIGURE 11 shows the process flow in PTZ tracking of the process routine in accordance with an embodiment of the present invention;
[0029] FIGURE 12A shows the process of updating perspective view in accordance with an embodiment of the present invention;
[0030] FIGURE 12B illustrates the realigning process in accordance with an embodiment of the present invention;
[0031] FIGURE 13 shows the display routine in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION
[0032] In line with the above summary, the following description of a number of specific and alternative embodiments is provided for understanding the inventive features of the present invention. It shall be apparent to one skilled in the art, however, that this invention may be practiced without such specific details. Some of the details may not be described at length so as not to obscure the invention. For ease of reference, common reference numerals will be used throughout the figures when referring to the same or similar features common to the figures.
[0033] For the avoidance of doubt, the term "input" in this specification means the original image input captured by the image capturing means. Similarly, the term "corrected view" or "corrected image" refers to any form of image or view which has been edited prior to generating a final view or image for display.
[0034] The overall system for providing a wider coverage of view contemplated in accordance with one embodiment of the present invention is shown in FIGURE 1. The system includes an image capturing means 100, a display means 102 and a processor 103. The processor 103 is connected to the image capturing means 100 and the display means 102 and is configured to perform the image-processing functionalities elucidated herein.
[0035] In one embodiment and as shown in FIGURE 2A, the image capturing means
100 is positioned in a manner such that it can provide a top-down overall view of the area under surveillance. The image capturing means 100 is equipped with a fish-eye lens in order to capture a wide coverage of view within the area. The image capturing means 100 is configured to capture multiple perspective views, multiple pan-tilt-zoom (PTZ) views and a simplified floor plan as shown in FIGURES 2B, 2C and 2D respectively. Accordingly, the image capturing means 100 is further configured to continuously track and maintain each object of interest within the area under surveillance without having to install multiple PTZ-based cameras.
[0036] The processor 103 is configured to perform two main functionalities based on the captured images: the initialization routine 300 and the process routine 310.
Results or output from the processor 103 are displayed based on the display routine 320 with the display means 102.
[0037] A system incorporating the method of the present invention is shown in FIGURE 3 as a flowchart generalizing the entire process in accordance with an embodiment of the present invention. There are three main routines: the first is the initialization routine 300, the second is the processing routine 310 and the third is the display routine 320. The initialization routine 300 includes generating lookup tables, a perspective view lookup table and a simplified 360° floor plan of the originally captured view. The processing routine 310 generally includes performing object tracking, selectively updating the 360° corrected view and updating the perspective view. The display routine 320 provides a resulting output or view that is wider and less blocked than the original view.

[0038] FIGURE 4 illustrates the initialization routine 300 adapted in FIGURE 3 in accordance with one embodiment of the present invention. The initialization routine 300 includes three main processes to be performed on every input image captured by the single image capturing means 100. It is presumed, for the purpose of elucidating the operational steps of the present invention, that the input image or view captured has a fish-eye view, is truncated at its edges, or is otherwise distorted. Upon completion of camera calibration at 400, and inputting the user's configurations at 400A, the next process is generating the corrected 360° lookup tables at 401. Then, at 402, the system proceeds to generate the simplified 360° floor plan, whereby the output from 402 is used to generate the 360° floor plan at 402A. Upon generating the floor plan, at 403, the correct dynamic perspective view lookup table is generated, which is then used for generating the perspective view lookup table at 403A.
In 401, there can be a plurality of lookup tables generated, whereby the tables directly map coordinates from one image to the coordinates of other image. During the initialization routine 300, a 360° corrected view is maintained, whereby in this view, the distortion at the edge of the wide view image is corrected.
[0039] FIGURE 5 shows the process for generating the corrected 360° lookup table of the initialization routine 300 in accordance with an embodiment of the present invention. The process starts at 501, having a fisheye image as input and calibrating the camera. The process is followed by transformation of the fish-eye view into the 360° corrected image at 502. At 503, the 360° lookup table is generated; this lookup table will be used to determine the pixel coordinates in the 360° corrected image based on any pixel coordinate in the original distorted input. In determining the coordinates, the mapping (u,v) → (x,y) is used, wherein (u,v) is the coordinate in the fisheye view, while (x,y) is the coordinate in the 360° corrected image. The process ends at 504, at which a 360° lookup table is generated.
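The (u,v) → (x,y) mapping behind the 360° lookup table can be sketched in a few lines. The code below is a minimal, hypothetical illustration only: the patent does not fix a projection model, so the equidistant fisheye assumption, the polar unwrapping, and the function name `build_360_lookup` are all assumptions.

```python
import math

def build_360_lookup(fisheye_size, pano_w, pano_h):
    """Build a lookup table mapping each panorama pixel (x, y) back to a
    fisheye pixel (u, v), assuming an equidistant fisheye whose circular
    image fills a square input of side fisheye_size.

    The corrected 360-degree view can then be filled by sampling the
    distorted input once per output pixel.
    """
    cx = cy = fisheye_size / 2.0       # fisheye image centre
    max_r = fisheye_size / 2.0         # radius of the circular image
    table = {}
    for y in range(pano_h):
        for x in range(pano_w):
            theta = 2.0 * math.pi * x / pano_w     # angle around the centre
            r = max_r * (1.0 - y / pano_h)         # panorama top = image rim
            u = min(max(int(cx + r * math.cos(theta)), 0), fisheye_size - 1)
            v = min(max(int(cy + r * math.sin(theta)), 0), fisheye_size - 1)
            table[(x, y)] = (u, v)
    return table
```

Because the table is computed once at initialization, the per-frame cost of correcting the view reduces to a table lookup per pixel, which is what makes the selective update in the process routine cheap.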
[0040] FIGURE 6 shows the process for generating the simplified 360° floor plan 402 of the initialization routine 300 in accordance with an embodiment of the present invention. The process is initiated at 601, whereby ROI points are used as input. Then in 602, mapping of the Region of Interest (ROI) points to the corrected image is performed. Further during this step, the list of ROIs based on the fisheye view is used to indicate and thus determine the ROIs within the simplified 360° floor plan. These ROI points are then mapped onto the 360° corrected image using the 360° lookup table obtained in the earlier process 401. The corrected view is used as the basis to generate the simplified floor plan at 603, which shows the correct relative distances between these ROIs as compared to the fisheye image or view. The process ends at 604, at which a finalized floor plan is generated.
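Mapping an individual ROI point from the fisheye input onto the corrected view can be sketched as follows. This again assumes an equidistant fisheye model, which the patent does not specify; the function name and parameters are hypothetical.

```python
import math

def fisheye_point_to_pano(u, v, fisheye_size, pano_w, pano_h):
    """Map one ROI point from the distorted fisheye input onto the
    unwrapped 360-degree corrected view (the basis of the floor plan),
    assuming an equidistant fisheye model."""
    c = fisheye_size / 2.0
    max_r = fisheye_size / 2.0
    theta = math.atan2(v - c, u - c) % (2.0 * math.pi)   # angle from centre
    r = min(math.hypot(u - c, v - c), max_r)             # radial distance
    x = int(theta / (2.0 * math.pi) * pano_w) % pano_w
    y = int((1.0 - r / max_r) * (pano_h - 1))            # image rim maps to row 0
    return x, y
```

Because this undoes the fisheye compression near the rim, distances between mapped ROI points on the corrected view reflect their true relative spacing, which is why the corrected view is a better basis for the floor plan than the raw fisheye image.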
[0041] FIGURE 7 shows the process for generating the perspective view lookup table 403 of the initialization routine 300 in accordance with an embodiment of the present invention. The process is initiated at 701, whereby the user may provide the preferred configurations. In the next step 702, the number of perspective views is determined. There can be four perspective views generated, in which case the image is divided into four quadrants. Each of the four quadrants is then transformed to simulate a corrected perspective view. For this step, the user can navigate the perspective view, whereby during navigation the user simulates the pan-tilt-zoom control of a PTZ camera. It should be understood that the image can also be divided into multiple perspective views based on user inputs on the ROI points. The input image can be divided in such a way that each perspective view contains at least one ROI. It is preferable that multiple views can be generated, wherein each view will focus on one of the ROIs. The method then proceeds to 703, to compute the rotation angle for each view. Upon completion of this step, the corresponding region with respect to the corrected 360° view is determined at 704. In 705, realignment of the views is performed based on the rotation angles obtained at step 703. The realignment is to maintain the views in an upright position regardless of their position in the original fisheye view. Accordingly, the corresponding areas in the 360° corrected view with respect to each of the areas in the fisheye view are determined. Then at 706, a perspective view lookup table is generated. Upon determining the corresponding coordinates in the fisheye view based on the perspective view, a lookup table is generated which maps a coordinate point (u, v) in the fisheye image to its quadrant, and to its coordinates within that quadrant. Accordingly, this lookup table can later be used to directly map any coordinate in the fisheye view to any of the quadrant views. The perspective views are then displayed at 707. The process ends at 708.
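The quadrant assignment and the realignment rotation described above can be sketched together. This is a hedged illustration under the assumption that the views correspond to equal angular sectors around the fisheye centre; the patent allows arbitrary ROI-driven partitions, and the function name is hypothetical.

```python
import math

def perspective_lookup_entry(u, v, size, n_views=4):
    """For a fisheye pixel (u, v), return (view index, rotation angle in
    degrees), assuming the n_views perspective views cover equal angular
    sectors around the image centre.  The rotation is the realignment
    (step 705) that keeps each view upright regardless of its position
    in the original fisheye image."""
    c = size / 2.0
    theta = math.degrees(math.atan2(v - c, u - c)) % 360.0
    step = 360.0 / n_views
    view = int(theta // step)                  # which sector / quadrant
    rotation = -(view * step + step / 2.0)     # undo the sector's centre angle
    return view, rotation
```

Precomputing one such entry per fisheye pixel yields exactly the table described at 706: any (u, v) maps directly to a quadrant and the rotation that keeps that quadrant upright.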
[0042] FIGURE 8 shows a flowchart generalizing the entire process for the process routine 310 as adapted in FIGURE 3 in accordance with an embodiment of the present invention. The process routine 310 starts at 801 and proceeds to selectively update the 360° corrected view at 802. At 803 the system performs object tracking, whereby a new object is tracked using the PTZ view at 804. The output from 804 is used as input in updating the perspective view at 805, at which any form of user configuration 805A can be used as input. At 806, the process proceeds to updating the 360° floor plan. The process ends at 807, whereby all output will be displayed.

[0043] The next process is selectively updating the 360° corrected view at 802. With reference to FIGURE 9A, based on the top-down wide view image input and the motion map from previous processes, the selective update starts with determining the corresponding motion pixels in the 360° corrected view by referring to the lookup table, at 901. It should be noted that only pixels within changed regions are to be corrected. In this process, the motion area detected by way of motion tracking is used to guide the update subroutine on which pixels need to be updated. Then in 902, these pixels are updated accordingly, based on the actual values in the top-down wide view image. Next in 903, the corresponding edge points of the enclosing bounding box in the motion map in the 360° corrected view are determined. This is to further determine which intermediate pixels require updating. Accordingly, the edges of the bounding box are mapped onto the corrected view. Finally in 904, intensity interpolation on missing pixels within the corresponding edge points is performed. The steps of 903 and 904 are accordingly illustrated in FIGURE 9B.
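The selective update followed by interpolation of missing pixels can be illustrated with a one-dimensional stand-in: one row of the corrected view, sparse updates from the lookup table, and linear intensity interpolation across the gaps. The patent does not specify the interpolation scheme, so linear interpolation and the function name are assumptions.

```python
def update_and_interpolate(row, updates):
    """Apply sparse pixel updates to one row of the corrected view, then
    linearly interpolate any still-missing (None) pixels between known
    neighbours -- a one-dimensional stand-in for steps 902-904, where
    only motion pixels are copied directly and the gaps inside the
    bounding box are filled by intensity interpolation."""
    out = list(row)
    for x, value in updates.items():
        out[x] = value                       # copy actual input values
    for x, value in enumerate(out):
        if value is not None:
            continue
        left = next(((i, out[i]) for i in range(x - 1, -1, -1)
                     if out[i] is not None), None)
        right = next(((i, out[i]) for i in range(x + 1, len(out))
                      if out[i] is not None), None)
        if left and right:                   # interpolate between edges
            (xl, vl), (xr, vr) = left, right
            out[x] = vl + (x - xl) / (xr - xl) * (vr - vl)
        elif left or right:
            out[x] = (left or right)[1]      # extend the single known edge
    return out
```

Restricting the update to motion pixels plus an interpolated fill keeps the per-frame cost proportional to the moving region rather than the full panorama, which is the point of the selective update.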
[0044] For the object tracking process 803 of the process routine 310 as shown in FIGURE 10, the process starts at 1001 with the resampling of video images. In this step, the video images, being the input images, are resampled to a lower resolution. The next step is 1002, which is detecting motion within the image. To detect motion in the image, any standard or known method can be utilized, such as a standard background subtraction method. Such detection may include motion pixels within the detected motion area. Based on the detected motion area, the object is identified and tracked at 1003 by finding the correlation between all detected objects across subsequent frames and assigning the same object label or identifier to the same object across frames at different times. By performing tracking, the historical trajectory information of each object can be obtained and maintained. Such information may include, but is not limited to, object location, dimensions and identifier. Upon completion of tracking, any specific event analysis can be applied at 1004, thus generating an alarm 1005 to the user on any event that occurred.
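The frame-to-frame correlation at 1003 can be sketched with a greedy nearest-centroid matcher. The patent leaves the correlation measure open, so centroid distance, the `max_dist` gate, and the function name are assumptions of this sketch.

```python
import math

def match_detections(prev_tracks, detections, max_dist=30.0):
    """Greedy correlation of current-frame detections with previous
    tracks by centroid distance: each detection takes the closest
    unmatched track within max_dist, otherwise it starts a new track
    with a fresh label, so the same object keeps the same identifier
    across frames.

    prev_tracks: {label: (x, y)} centroids from the previous frame.
    detections:  [(x, y), ...] centroids from the motion map.
    Returns {label: (x, y)} for the current frame.
    """
    next_label = max(prev_tracks, default=-1) + 1
    unmatched = dict(prev_tracks)
    current = {}
    for det in detections:
        best, best_d = None, max_dist
        for label, pos in unmatched.items():
            d = math.hypot(det[0] - pos[0], det[1] - pos[1])
            if d < best_d:
                best, best_d = label, d
        if best is not None:
            current[best] = det              # same object, same label
            del unmatched[best]
        else:
            current[next_label] = det        # new object enters the scene
            next_label += 1
    return current
```

Accumulating the per-frame results keyed by label yields the historical trajectory (location over time) that the event analysis at 1004 consumes.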
[0045] The next process flow, being the PTZ tracking 804 of the process routine 310, is generalised in FIGURE 11. The resampled input image and the output from motion tracking are used to invoke the PTZ camera tracking process. In the first step 1101, if an event is detected and if the event comprises a new object at 1102 (whereby the new object tracking is performed earlier, in the object tracking process 803), a new virtual PTZ view is initialized at 1102A. At 1101A, in the event that there is no new object or motion of a previously tracked object within the scene, the process proceeds to 1103, to determine the next position of the PTZ view with respect to the 360° corrected view based on the current object location. The process then proceeds to determine the cropping area for maximizing the coverage of the object within the PTZ view at 1104. In 1105, the process extracts the image from the full resolution image and then updates the PTZ view at 1106. Next, the system computes tracking consistency by computing the color similarity between the current and previous PTZ views at 1107. In this process, the tracking consistency can be computed based on the consistency of the color properties (e.g. color similarity across image frames) or the trajectory consistency. The output obtained can be used as feedback to perform object tracking. The process ends at 1108.
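The colour-consistency check at 1107 can be sketched with histogram intersection between two PTZ crops. The patent does not name a similarity measure, so histogram intersection on intensity values, the bin count, and the function name are assumptions of this sketch.

```python
def color_similarity(pixels_a, pixels_b, bins=8):
    """Histogram-intersection similarity between two pixel collections
    (intensities 0-255), a simple stand-in for the colour-consistency
    check between the current and previous PTZ views.  Returns a value
    in [0, 1]; 1 means identical intensity distributions."""
    def normalised_hist(pixels):
        h = [0] * bins
        for p in pixels:
            h[min(p * bins // 256, bins - 1)] += 1
        n = float(len(pixels))
        return [c / n for c in h]
    ha = normalised_hist(pixels_a)
    hb = normalised_hist(pixels_b)
    return sum(min(a, b) for a, b in zip(ha, hb))
```

A low similarity score between consecutive PTZ crops suggests the view has drifted off the tracked object, which is exactly the feedback signal the tracker can use to re-acquire it.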
[0046] FIGURE 12A shows the updating perspective view process 805 of the process routine 310 in accordance with an embodiment of the present invention. This process starts at 1201 by determining whether there is any user navigation input (such as the user panning, tilting or zooming the perspective view). If there is no user navigation input, the method proceeds to 1202, to determine the corresponding pixels in the perspective views that require updating, using a lookup table. Further, all motion pixel coordinates detected from motion tracking are used to indicate which coordinates in which respective quadrants are to be updated. The perspective view lookup table generated during the initialization routine 300 and the updated 360° corrected view can be used to map these coordinates.
[0047] In the event that a user navigation input is available, a new perspective view center point with respect to the corrected view is determined at step 1203. Next, the relative rotation angle for this new center point is computed with respect to the previous center point at 1204. Accordingly, the new cropped area for the new perspective view is determined based on the previous cropped area, suitably translated by the horizontal and vertical shift from the previous center point. The translation is applied to the new cropped area at 1205. Then at 1206 the rotation angle difference between the current and previous center points is computed. If the angle difference is less than a threshold, e.g. 5 degrees, the method concludes that the cropped view does not need to be realigned for the object to appear upright. However, if the angle difference is more than the threshold, the new cropped area will need to be realigned at 1207 so that the object in the cropped view remains in an upright position. All output obtained from steps 1202, 1206 and 1207 is used to update the perspective views at 1208, prior to ending the process at 1209.
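The translate-then-decide logic of steps 1203 to 1207 can be sketched as below. The angular measure (angle of the centre point about the corrected-view origin) and the function name are assumptions; the 5-degree threshold is the example value given above.

```python
import math

def update_crop(prev_center, new_center, prev_crop, angle_threshold=5.0):
    """Translate the previous cropped area to follow a new centre point
    and decide whether rotational realignment is needed, using the
    5-degree threshold described above.  Centres are (x, y) relative to
    the corrected-view origin; the crop is (x, y, w, h).

    Returns (translated crop, realign?) where realign is True only when
    the rotation angle difference exceeds the threshold."""
    dx = new_center[0] - prev_center[0]
    dy = new_center[1] - prev_center[1]
    x, y, w, h = prev_crop
    crop = (x + dx, y + dy, w, h)                          # step 1205: translate
    prev_angle = math.degrees(math.atan2(prev_center[1], prev_center[0]))
    new_angle = math.degrees(math.atan2(new_center[1], new_center[0]))
    diff = abs(new_angle - prev_angle)                     # step 1206
    return crop, diff > angle_threshold                    # step 1207 decision
```

Skipping the rotation for small angle differences avoids resampling the whole view on every small pan, which keeps interactive navigation cheap while still keeping objects visually upright.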
[0048] The step of realigning based on the angle difference at 1207 is further illustrated in FIGURE 12B. Referring to FIGURE 12B, the element referenced as 50 is the corrected perspective view, while the element referenced as 51 is the input corrected 360° view. The 'X' marking 50a indicates the previous perspective view center point and the dotted rectangle 51a defines the cropping area. The second 'X' marking 50b indicates a new center point in the perspective view. As the rotation angle difference between this new center point and the previous center point is not significant, the new cropping area, indicated as 51b, only undergoes translation. The third 'X' marking 50c, having cropping area 51c, indicates that a new rotation is applied to realign the view.
[0049] The third routine in accordance with an embodiment of the present invention is the display routine 320 as adapted in FIGURE 3. As shown in FIGURE 13, the process is initiated at 1301; technically, it is initiated when the process routine 310 is initiated. All outputs generated from the previous routines will be displayed. In one step 1302, an output image which is wider and less blocked than the original input image is displayed. In the next step 1303, the routine is configured to display an individual PTZ view for each object together with its tracking path. The process ends at 1304. It should be noted that the final views are accordingly displayed for the user; these are the perspective views, the simplified floor plan and the views from PTZ object tracking, which can be based on the duration of time (n, n+1, n+2, n+3, ...).

[0050] As previously elucidated, the method and system of the present invention provide a wider and less blocked view with a single camera, as opposed to using multiple cameras.
[0051] As would be apparent to a person having ordinary skill in the art, the afore-described methods and components may be provided in many variations, modifications or alternatives to existing camera systems. The principles and concepts disclosed herein may also be implemented in various manners or forms in conjunction with the hardware or firmware of the systems which may not have been specifically described herein but which are to be understood as encompassed within the scope and letter of the following claims.
Claims
1. A method for generating multiple perspective views and providing multiple pan-tilt-zoom tracking based on a captured image having at least one object of interest as input, using a single camera, comprising: generating corrected 360° lookup tables 401;
generating simplified 360° floor plan based on the input 402; generating correct dynamic perspective view lookup table and perspective view 403;
selectively updating 360° corrected view 802;
performing object tracking 803;
performing object pan-tilt-zoom (PTZ) tracking 804 on any new object;
updating the perspective view and 360° floor plan 805, 806;
displaying an output image which is wider and less blocked than that of the input image 1302.
2. The method as claimed in Claim 1, wherein generating the 360° lookup tables 401 includes:
transforming the input into 360° corrected view 502; and generating 360° lookup table for determining pixel coordinates in 360° corrected view based on original pixel coordinate of the input 503.
3. The method as claimed in Claim 1, wherein generating the simplified 360° floor plan based on the input 402 includes:
mapping the Region of Interest (ROI) points to the corrected view 602; and
mapping the ROI points onto the 360° corrected view to generate the simplified floor plan 603.
4. The method as claimed in Claim 1, wherein generating the perspective view lookup table and perspective view 403 includes:
determining the number of perspective views 702, which includes: dividing each perspective view into quadrants; and transforming each quadrant into a simulation of a corrected perspective view;
wherein each perspective view contains one ROI point; computing rotation angle for each perspective view 703 ; determining the corresponding region based on the corrected 360° view 704;
realigning the views 705; and
generating the perspective view lookup table 706.
5. The method as claimed in Claim 1, wherein selectively updating the 360° corrected view 802 includes:
determining the corresponding motion pixels in 360° corrected view based on the lookup table generated in an earlier routine 901;
updating the pixels based on actual values of the input and determining the corresponding edge points within the input 902;
mapping the edge points into the corrected view 903; and
performing interpolation on missing pixels based on the edge points 904.
6. The method as claimed in Claim 1, wherein object tracking 803 includes:
resampling input to a lower resolution image 1001;
detecting motion within image 1002;
tracking and identifying object by finding the correlation between all detected objects 1003;
analysing any occurrence of event 1004 and generating an alarm for the event if detected 1005.
7. The method as claimed in Claim 1, wherein the object PTZ tracking 804 includes: determining the presence of an event 1101;
determining whether the event comprises a new object to initialize virtual PTZ view 1101A;
initializing PTZ tracking if there is new object detected 1102A;
if the object is not new, determining the next position of PTZ view with respect to 360° corrected view based on current object location 1103;
determining the cropping area within image for maximizing the coverage of the object detected within the PTZ view 1104;
extracting the image from full resolution image and updating PTZ view 1105; computing tracking consistency by computing the color similarity between current and previous PTZ views as a feedback to track moving object 1107.
8. The method as claimed in Claim 1, wherein updating the perspective view 805 includes:
determining if there is user's input 1201;
if there is no user input, determining the corresponding pixels in perspective views that require update using look-up table 1202;
if there is user's input, determining a new perspective view center point with respect to the corrected view 1203;
determining the relative rotation angle based on the new center point, including determining a new cropped area based on the new perspective view 1204;
applying translation to the new cropped area 1205;
computing the rotation angle difference between the current and previous center points 1206;
realigning the new cropped area based on the rotation angle difference computation 1207; and updating the perspective views.
9. The method as claimed in Claim 1, wherein displaying an output image includes displaying the finalized multiple perspective views, the simplified floor plan and the outcome of object PTZ tracking.
10. A system for providing a wider coverage view and less blocked view of a monitored area based on a single camera 100 configured to capture a top-down view of the monitored area, the system comprising: at least one processing means 103 connected to the single camera 100, the processing means 103 being configured to generate multiple perspective views and multiple pan-tilt-zoom (PTZ) tracking based on the top-down single view input captured by the single camera 100 and to generate an output with a wider coverage view and less blocked view of the monitored area; wherein in generating the multiple perspective views and multiple pan-tilt-zoom (PTZ) views, the processing means 103 is configured to: generate 360° lookup tables; generate a simplified 360° floor plan based on the input; generate a perspective view lookup table and perspective view; perform object tracking; selectively update the 360° corrected view; provide object pan-tilt-zoom (PTZ) tracking using the single camera 100; and update the perspective view; and a display means 102 configured to display the output as generated by the processing means 103.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MYPI2014001296A MY168810A (en) | 2014-05-05 | 2014-05-05 | Method for providing multiple perspective views and multiple pan-tilt-zoom tracking using a single camera |
MYPI2014001296 | 2014-05-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015170962A1 true WO2015170962A1 (en) | 2015-11-12 |
Family
ID=54392734
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/MY2015/000031 WO2015170962A1 (en) | 2014-05-05 | 2015-04-21 | Method for providing multiple perspective views and multiple pan-tilt-zoom tracking using a single camera |
Country Status (2)
Country | Link |
---|---|
MY (1) | MY168810A (en) |
WO (1) | WO2015170962A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3817360A4 (en) * | 2018-05-30 | 2022-03-16 | Arashi Vision Inc. | Method for tracking target in panoramic video, and panoramic camera |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070236570A1 (en) * | 2006-04-05 | 2007-10-11 | Zehang Sun | Method and apparatus for providing motion control signals between a fixed camera and a ptz camera |
KR20100129125A (en) * | 2009-05-29 | 2010-12-08 | 주식회사 영국전자 | Intelligent panorama camera, circuit and method for controlling thereof, and video monitoring system |
US20120098927A1 (en) * | 2009-06-29 | 2012-04-26 | Bosch Security Systems Inc. | Omni-directional intelligent autotour and situational aware dome surveillance camera system and method |
US8212837B1 (en) * | 2007-10-02 | 2012-07-03 | Grandeye, Ltd. | Color processing pipelines generating a lower color bit-depth image compared to an unprocessed image from a wide-angle camera |
WO2012158017A1 (en) * | 2011-05-13 | 2012-11-22 | Mimos, Berhad | Method and system for multiple objects tracking and display |
Also Published As
Publication number | Publication date |
---|---|
MY168810A (en) | 2018-12-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15789028; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15789028; Country of ref document: EP; Kind code of ref document: A1 |