KR101789520B1 - Device and method for tracking group-based multiple object - Google Patents
- Publication number: KR101789520B1 (application KR1020150182820A)
- Authority
- KR
- South Korea
- Prior art keywords
- group
- information
- objects
- image
- camera
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06K9/00369
- G06K9/00771
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10024—Color image
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
An object tracking method according to an embodiment of the present invention includes: identifying a plurality of objects from a first image obtained from a first camera having a color sensor and a depth sensor; grouping some or all of the plurality of objects into an object group based on the three-dimensional distance and the overlap ratio between the objects; tracking the object group as a single object; and ungrouping some or all of the objects in the object group based on at least one of the three-dimensional distance between the object group and each object in the object group and the overlap ratio between the objects in the object group.
Description
This specification relates to the field of object tracking. More particularly, it relates to techniques for identifying and tracking multiple objects.
With the development of sensor and image processing technologies, many techniques have been proposed for identifying objects from an image acquired by a camera sensor or the like and tracking the identified objects. In particular, technology for identifying a person from three-dimensional images and tracking the identified person is of interest in fields such as robotics and crime prevention.
Among these person identification and tracking techniques, the conventional "three-dimensional position-based tracking technique" disclosed in prior art 1 continuously tracks each identified person by associating, between consecutive frames, the persons whose three-dimensional positions are closest.
In addition, the "conventional Kalman filter-based tracking technique" disclosed in prior art 2 estimates, with a Kalman filter, the three-dimensional position of each person in the current frame based on the three-dimensional position of each person identified in the previous frame, calculates the three-dimensional distance between each estimated position and each person identified in the current frame, and continuously tracks the identified persons by associating those with the closest three-dimensional distance. However, this tracking technique also has the problem that if a person suddenly changes direction, it may lose the person's identity or miss the track.
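The Kalman filter-based related art described above can be sketched as a prediction step followed by nearest-neighbor association. This is a minimal illustration, not the prior art's actual implementation: a constant-velocity update stands in for the full Kalman filter, and the function names and `max_dist` gate are assumptions.

```python
import math

def predict_positions(tracks, dt=1.0):
    """Constant-velocity prediction (a simplified stand-in for a Kalman
    filter): each track carries a 3D position and a 3D velocity."""
    return {name: tuple(p + v * dt for p, v in zip(pos, vel))
            for name, (pos, vel) in tracks.items()}

def associate(predicted, detections, max_dist=1.0):
    """Greedily match each predicted track to the nearest unused detection
    within max_dist; this is where a sudden direction change can cause a
    missed or wrong association."""
    matches, used = {}, set()
    for name, ppos in predicted.items():
        best, best_d = None, max_dist
        for i, det in enumerate(detections):
            if i in used:
                continue
            dist = math.dist(ppos, det)
            if dist < best_d:
                best, best_d = i, dist
        if best is not None:
            matches[name] = best
            used.add(best)
    return matches
```

A person who reverses direction between frames lands far from the predicted position, so no detection falls within `max_dist` and the track is dropped — the failure mode the specification describes.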
As such, conventional person identification and tracking techniques have difficulty identifying and tracking a person when multiple persons move very close to each other and form a group, when a person suddenly changes moving direction, or when a person is occluded by another person.
Accordingly, the present invention intends to provide a novel group-based object tracking apparatus and method capable of continuously tracking an object in a group, an object rapidly changing its moving direction, and an object obscured by another object.
An object tracking apparatus according to an embodiment of the present invention includes: an object identification module for identifying a plurality of objects from a first image acquired from a first camera having a color sensor and a depth sensor; a group formation module for forming an object group by grouping some or all of the plurality of objects based on the three-dimensional distance and the overlap ratio between the objects; a group tracking module for tracking the object group as a single object; and an ungrouping module for ungrouping some or all of the objects in the object group based on at least one of the three-dimensional distance between the object group and each object in the object group and the overlap ratio between the objects in the object group.
As an embodiment, if the three-dimensional distance between a first object and a second object identified from the first image is less than a first threshold distance, and the overlap ratio between the first object and the second object is equal to or greater than a threshold ratio, the group formation module may group the first object and the second object to form a first object group.
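The pairwise grouping test above can be sketched as follows. The specification does not fix how the overlap ratio is computed or what the thresholds are, so the intersection-over-smaller-ROI definition, the threshold values, and the function names are illustrative assumptions.

```python
import math

def overlap_ratio(a, b):
    """Intersection area over the smaller box area, for (x, y, w, h) ROIs."""
    ix = max(0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = ix * iy
    smaller = min(a[2] * a[3], b[2] * b[3])
    return inter / smaller if smaller else 0.0

def should_group(pos1, pos2, roi1, roi2, d_th=1.0, r_th=0.3):
    """Group two objects when their 3D distance is below the first threshold
    distance AND their ROI overlap ratio meets the threshold ratio."""
    return math.dist(pos1, pos2) < d_th and overlap_ratio(roi1, roi2) >= r_th
```

Requiring both conditions keeps two people who merely pass each other at different depths (high image overlap, large 3D distance) from being grouped.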
As an embodiment, if the three-dimensional distance between a third object identified from the first image and the first object group is less than the first threshold distance, and at least one of the overlap ratios between the third object and the objects in the first object group is equal to or greater than the threshold ratio, the group formation module may further group the third object into the first object group.
As an embodiment, if the three-dimensional distance between the first object group and the first object is greater than or equal to the first threshold distance and less than a second threshold distance, or if all of the overlap ratios between the first object and the other objects in the group are less than the threshold ratio, the de-grouping module may determine that the first object is ungrouped from the first object group.
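The release condition in the paragraph above — a group-to-object distance in the band between the two threshold distances, or every member overlap below the threshold ratio — can be sketched as a single predicate. Threshold values and the function name are assumptions for illustration.

```python
import math

def should_ungroup(group_pos, obj_pos, overlaps, d1=1.0, d2=2.0, r_th=0.3):
    """Release an object from its group when the group-to-object 3D distance
    falls in [d1, d2), OR when every overlap ratio between the object and
    the other group members is below the threshold ratio."""
    d = math.dist(group_pos, obj_pos)
    return (d1 <= d < d2) or all(r < r_th for r in overlaps)
```

The `d < d2` upper bound mirrors the "less than the second threshold distance" wording: an implausibly large jump is treated as something other than a clean group exit.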
As an embodiment, the ungrouping module may determine whether to ungroup some or all of the objects in the object group based on the three-dimensional distance between the object group in a first frame and each object in the object group in a second frame (the frame after the first frame), and the overlap ratio between the objects in the object group in the second frame.
In an embodiment, the object identification module obtains object information and object name information for each object, the group formation module obtains group information and group name information for the object group, and, when ungrouping is determined, the ungrouping module reacquires the object information and the object name information for the ungrouped object and reacquires the group information and the group name information for the object group that contained the ungrouped object.
In an embodiment, the object information includes at least one of position information on individual objects, region-of-interest information, speeded-up robust features (SURF) information, and color histogram information, and the group information includes group location information.
As an embodiment, the object identification module may obtain the object information based on the first image obtained from the first camera, and obtain the object name information based on a second image in which a specific area of the identified object is enlarged and captured by a second camera different from the first camera.
As an embodiment, the group formation module may obtain the group information and the group name information based on the obtained object information and object name information.
In an embodiment, the ungrouping module acquires object information for the ungrouped object based on the first image acquired from the first camera, and may assign, as the object name of the ungrouped object, one of the object names already obtained, selected based on the acquired object information.
In an embodiment, the first camera may be an RGB-D camera, and the second camera may be a PTZ camera.
An object tracking method according to an embodiment of the present invention includes the steps of: identifying a plurality of objects from a first image obtained from a first camera having a color sensor and a depth sensor; grouping some or all of the plurality of objects to form an object group based on the three-dimensional distance and the overlap ratio between the objects; tracking the object group as a single object; and ungrouping some or all of the objects in the object group based on at least one of the three-dimensional distance between the object group and each object in the object group and the overlap ratio between the objects in the object group.
According to the present disclosure, an object tracking device can continuously track a number of objects in a group environment.
Further, according to the present specification, the object tracking apparatus can accurately track the group by determining whether to release and maintain the group based on the three-dimensional distance as well as the overlap ratio between the respective objects.
In addition, according to the present specification, the object tracking apparatus can accurately identify the names of ungrouped objects through minimal information and processing at the time of ungrouping.
FIG. 1 is a block diagram of an object tracking apparatus according to an embodiment of the present invention.
FIG. 2 is a detailed block diagram of the control unit of FIG. 1 according to an embodiment of the present invention.
FIG. 3 is an exemplary diagram illustrating a process of identifying and tracking objects on a group basis, per image frame, according to an embodiment of the present invention.
FIG. 4 is an exemplary diagram illustrating a process of identifying and tracking objects on a group basis according to another embodiment of the present invention.
FIG. 5 is a flowchart illustrating an object tracking method according to an embodiment of the present invention.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings, but the scope of the claims is not limited or restricted by the embodiments.
The terms used in the present specification are selected from general terms that are currently widely used, in consideration of their functions, but these may vary depending on the intention of those skilled in the art, established practice, or the emergence of new techniques. In certain cases, a term may be arbitrarily selected by the applicant, in which case its meaning is described in the corresponding part of the specification. Therefore, the terminology used herein should be interpreted based on the meaning of each term and the entire contents of the specification, rather than on the name of the term alone.
In addition, the embodiments described herein may be wholly hardware, partially hardware and partially software, or entirely software. In this specification, a "unit", "module", "device", "robot", or "system" refers to a computer-related entity such as hardware, a combination of hardware and software, or software. For example, a unit, module, device, robot, or system may refer to hardware constituting part or all of a platform and/or software such as an application for driving the hardware.
In this specification, an object tracking device refers to a device that identifies and tracks a plurality of objects on a group-based basis. For example, the object tracking device may be a device that groups some or all of a plurality of objects identified in an image acquired from one or more cameras to form an object group, and tracks the object group as a single object. Here, the object may be an identifiable object, for example, a person.
FIG. 1 is a block diagram of an object tracking apparatus according to an embodiment of the present invention. Referring to FIG. 1, the object tracking apparatus may include a camera unit, a control unit, a storage unit, and a display unit.
The
In an embodiment, the
As an embodiment, the
The
Here, the object information may include at least one of object position information of an individual object, object interest area information, speed-up robust feature (SURF) feature information, and color histogram information, for example. Also, the object name information is information indicating the name of an individual object, for example, information about a real or virtual name representing an individual object.
The object position information indicates the position of an individual object, and may be, for example, information about a three-dimensional position (e.g., the three-dimensional position of the upper-left corner of the region of interest associated with the individual object). The object region-of-interest information may be information about a region of interest associated with an individual object, for example, information about the height of the region of interest. The SURF feature information indicates SURF features for an individual object, for example, in the form of SURF feature points within the object's region of interest. The color histogram information may be information indicating a color histogram for an individual object.
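As one concrete illustration of the color histogram component of the object information, a per-channel histogram over an object's ROI pixels can be computed and normalized so histograms of differently sized ROIs remain comparable across frames. The bin count and flat-pixel-list input format are assumptions; the specification does not define them.

```python
def color_histogram(pixels, bins=8):
    """Normalized per-channel color histogram of an ROI.
    `pixels` is a flat list of (r, g, b) tuples with values in 0..255."""
    hist = [[0] * bins for _ in range(3)]
    for px in pixels:
        for c in range(3):
            # Map 0..255 into `bins` equal-width buckets.
            hist[c][min(px[c] * bins // 256, bins - 1)] += 1
    n = max(len(pixels), 1)
    return [[v / n for v in ch] for ch in hist]
```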
Further, the group information is information on an object group composed of a plurality of individual objects, and may include, for example, at least one of group position information (e.g., a three-dimensional position) and group region-of-interest information of the object group. The group name information indicates the name of the object group, for example, information about a real or virtual name representing the object group.
The group position information indicates the position of the object group; for example, it may be information about a three-dimensional position (e.g., the three-dimensional position of the upper-left corner of the region of interest associated with the object group). Further, the group region-of-interest information may be information on a region of interest associated with the object group, for example, information on the height of the region of interest.
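The group region of interest and the upper-left-corner group position described above can be sketched as the union bounding box of the member ROIs, with depth looked up from the depth sensor. The `depth_at` lookup callback and the (x, y, w, h) box convention are assumptions for illustration.

```python
def group_roi(rois):
    """Union bounding box of member ROIs, each given as (x, y, w, h)."""
    x0 = min(r[0] for r in rois)
    y0 = min(r[1] for r in rois)
    x1 = max(r[0] + r[2] for r in rois)
    y1 = max(r[1] + r[3] for r in rois)
    return (x0, y0, x1 - x0, y1 - y0)

def group_position(rois, depth_at):
    """Group position as the 3D position of the group ROI's upper-left
    corner, where `depth_at(x, y)` queries the depth sensor."""
    x0, y0, _, _ = group_roi(rois)
    return (x0, y0, depth_at(x0, y0))
```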
The
The
FIG. 1 is a configuration diagram according to one embodiment of the present invention, and the separated configurations are logically distinguishable from the components of the apparatus. Thus, the components of the apparatus described above can be mounted as one chip or as a plurality of chips, depending on the design of the apparatus. Hereinafter, the
2 is a detailed block diagram of the control unit of FIG. 1 according to an embodiment of the present invention.
Referring to FIG. 2, the control unit may include an object identification module, a group formation module, a group tracking module, and an ungrouping module.
The
In addition, the
In one embodiment, the
In one embodiment, the
The
In one embodiment, the
In one embodiment, the
In one embodiment, the
The
The
In one embodiment, the
The
For example, if the three-dimensional distance between the object group and a first object in the object group is greater than or equal to the first threshold distance and less than the second threshold distance, or if every overlap ratio between the first object and the other objects in the object group is less than the threshold ratio, the first object may be ungrouped from the object group.
In one embodiment, the
In one embodiment, the
In addition, the
In the reference-mining approach, the
For example, when the grouping of the first object is determined, the
Accordingly, unlike when assigning an object name for the first time, the object tracking apparatus can identify the exact object name of the ungrouped object without acquiring a second image in which a specific region of the object is enlarged by the second camera. That is, after a group is formed, the object tracking apparatus can quickly and accurately reassign the object name of a released object through this minimized referencing.
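This reassignment — matching the ungrouped object's appearance features from the first image against the per-person information already stored — can be sketched with the color histogram component. Histogram intersection as the similarity measure is an assumption; the specification only says that SURF and/or color histogram information already obtained is used.

```python
def hist_similarity(h1, h2):
    """Histogram intersection across 3 normalized channels:
    1.0 for identical histograms, 0.0 for disjoint ones."""
    return sum(min(a, b)
               for ch1, ch2 in zip(h1, h2)
               for a, b in zip(ch1, ch2)) / 3.0

def reassign_name(query_hist, stored):
    """Pick the stored object name whose saved histogram best matches the
    ungrouped object's histogram, avoiding a second PTZ capture."""
    return max(stored, key=lambda name: hist_similarity(query_hist, stored[name]))
```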
In one embodiment, the
FIG. 3 is an exemplary diagram illustrating a process of identifying and tracking objects on a group basis, according to an image frame, according to an embodiment of the present invention.
Referring to the left side of FIG. 3, the object tracking apparatus identifies a first object P1 and a second object P2 in the t-1 frame of a first image acquired by the first camera.
Referring to the center of FIG. 3, the object tracking apparatus tracks the first object P1 and the second object P2 in the t frame of the first image and determines whether to form an object group. For example, as shown, if the three-dimensional distance between the first object P1 and the second object P2 in the t frame is less than the first threshold distance, and the overlap ratio between the first object P1 and the second object P2 is equal to or greater than the threshold ratio, the object tracking device can determine to group the first object P1 and the second object P2 to form the first object group, and can determine the region of interest associated with the first object group.
Also, the object tracking device may obtain group location information and group name information for the first object group. For example, the object tracking device may assign the three-dimensional position of the upper-left corner of the region of interest associated with the first object group as the group position.
Referring to the right side of FIG. 3, the object tracking apparatus can track the first object group in the t+1 frame of the first image and determine whether to ungroup some or all of the objects in the first object group. At this time, the object tracking apparatus tracks the first object group as a single object.
For example, if the three-dimensional distance between the first object group of the t frame and each object of the t+1 frame is less than the first threshold distance, and the overlap ratio between the first object P1 and the second object P2 of the t+1 frame is equal to or greater than the threshold ratio, the object tracking apparatus can maintain the first object group.
As another example, if the three-dimensional distance between the first object group of the t frame and the first object P1 or the second object P2 of the t+1 frame is greater than or equal to the first threshold distance and less than the second threshold distance, or if the overlap ratio between the first object P1 and the second object P2 of the t+1 frame is less than the threshold ratio, the object tracking apparatus can determine to ungroup the first object group.
FIG. 4 is an exemplary diagram illustrating a process of identifying and tracking objects on a group-by-object basis according to another embodiment of the present invention. In Fig. 4, it is assumed that the object is a person, and an embodiment will be described.
As described above, the object tracking apparatus can display object images and object information through the display unit. For example, the object tracking apparatus can display, in a first display area, the first image photographed by the first camera unit combined with the regions of interest associated with the persons or groups identified in the first image, and can display information on the identified persons or groups in a second display area.
Referring to FIG. 4(a), when three persons are first identified in the first image, the object tracking apparatus composites the region of interest associated with each person onto the first image and displays it in the first display area, and displays the name and position information of each person in the second display area. In this case, the position information of each person can be obtained from the first image, and the name information can be obtained from the second image.
Referring to FIG. 4(b), when two of the three persons form a group, the object tracking apparatus composites the region of interest associated with the two-person group and the region of interest associated with the remaining person onto the first image, and displays the group name and group location information of the group, together with the name and location information of the remaining person, in the second display area. In this case, the group location and group name information can be obtained from the location and name information of each person already obtained as described above. For example, the object tracking device can assign the three-dimensional position of the upper-left corner of the region of interest associated with the group as the group position representing the group, and combine the names of the persons in order and assign the result as the group name.
Referring to FIG. 4(c), when the remaining person is added to the already formed group, the object identification device composites the region of interest associated with the three-person group onto the first image and displays it in the first display area, and displays the group name and group location information in the second display area. In this case, the group location information and the group name information can be obtained from the already obtained location and name information of each person and the group location and group name information of the object group. For example, the apparatus can assign the three-dimensional position of the upper-left corner of the updated region of interest of the enlarged group as the group position representing the group, and add the added person's name before or after the previous group name to form the new group name.
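The naming scheme above — a group name formed by combining member names in order, and a newly added member's name attached before or after the previous group name — can be sketched as follows. The "+" separator and the function names are illustrative assumptions, since the specification does not specify how names are combined.

```python
def form_group_name(names):
    """Combine member names in order to label a new group."""
    return "+".join(names)

def add_member(group_name, new_name, before=False):
    """Add a person's name before or after the previous group name."""
    return f"{new_name}+{group_name}" if before else f"{group_name}+{new_name}"
```

Keeping the previous group name as a prefix (or suffix) means the updated label can be derived without re-identifying the existing members.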
Referring to FIG. 4(d), when one person in the group is released from the group, the object tracking device composites the updated group region of interest of the remaining two-person group and the region of interest of the ungrouped person onto the first image and displays them in the first display area.
Referring to FIG. 4(e), when the group itself is released, the object identification apparatus composites the region of interest of each ungrouped person onto the first image and displays it in the first display area, and displays the name and location information of each person in the second display area. At this time, the name information of each ungrouped person can be obtained from the name information for each person already stored in the storage unit, based on at least one of the SURF feature information and the color histogram information for the ungrouped person obtained from the first image.
FIG. 5 is a flowchart illustrating an object tracking method according to an embodiment of the present invention. In FIG. 5, detailed description of parts identical or similar to those described with reference to FIGS. 1 to 4 is omitted.
Referring to FIG. 5, the object tracking apparatus can identify a plurality of objects from a first image obtained from a first camera having a color sensor and a depth sensor (S10). In addition, the object tracking apparatus can acquire object information and object name information for each object identified based on the image photographed by the camera unit. In addition, the object tracking apparatus can store all or a part of the obtained object information and object name information in the storage unit.
Next, the object tracking apparatus can form an object group by grouping some or all of the plurality of objects based on the three-dimensional distance and the overlap ratio between the objects (S20). For example, if the three-dimensional distance between the first object and the second object identified from the first image is less than the first threshold distance, and the overlap ratio between the first object and the second object is equal to or greater than the threshold ratio, the object tracking apparatus may group the first object and the second object to form a first object group.
In addition, the object tracking apparatus can obtain group information and group name information for each object group. In addition, the object tracking apparatus can store the obtained group information and group name information in a storage unit.
Next, the object tracking apparatus can track the object group as a single object (S30). In addition, the object tracking device can update object information and group information.
Next, the object tracking apparatus can ungroup some or all of the objects in the object group based on the three-dimensional distance between the object group and each object in the object group and the overlap ratio between the objects in the object group (S40). In addition, when ungrouping is determined, the object tracking device may reassign the object name of the ungrouped object and reassign the group name of the object group that contained the ungrouped object. For example, if the three-dimensional distance between the first object group and the first object is greater than or equal to the first threshold distance and less than the second threshold distance, or if all of the overlap ratios between the first object and the other objects in the object group are less than the threshold ratio, the apparatus may determine that the first object is ungrouped from the first object group.
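The four steps S10–S40 can be sketched as one per-frame loop. The state dictionary and the `group_fn`/`ungroup_fn` callbacks are illustrative assumptions standing in for the distance/overlap tests described above.

```python
def track_frame(state, detections, group_fn, ungroup_fn):
    """One iteration of the S10-S40 loop: identified objects come in (S10),
    grouping is evaluated (S20), each group is tracked as a single object
    (S30), and ungrouping is evaluated (S40)."""
    state["objects"] = detections                      # S10: identify
    state["groups"] = group_fn(detections)             # S20: form groups
    for g in state["groups"]:                          # S30: track as one
        g["tracked"] = True
    state["groups"] = [g for g in state["groups"]      # S40: ungroup
                       if not ungroup_fn(g)]
    return state
```

Tracking groups rather than individuals in S30 is what lets the method survive mutual occlusion inside a group; individual identities are only re-resolved at S40.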
Such an object tracking method may be implemented in an application or implemented in the form of program instructions that can be executed through various computer components and recorded in a computer-readable recording medium. The computer-readable recording medium may include program instructions, data files, data structures, and the like, alone or in combination. Program instructions that are recorded on a computer-readable recording medium may be those that are specially designed and constructed for the present invention and are known and available to those skilled in the art of computer software.
Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks and magnetic tape, optical recording media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, media, and hardware devices specifically configured to store and execute program instructions such as ROM, RAM, flash memory, and the like. Examples of program instructions include machine language code such as those generated by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. A hardware device may be configured to operate as one or more software modules to perform processing in accordance with the present invention, and vice versa.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments; on the contrary, it should be understood that various modifications may be made by those skilled in the art without departing from the spirit and scope of the present invention.
In this specification, both the apparatus invention and the method invention are described, and the descriptions of both may supplement each other as necessary.
Claims (12)
An object tracking apparatus comprising: an object identification module for identifying a plurality of objects from a first image acquired from a first camera having a color sensor and a depth sensor; a group forming module for forming an object group by grouping some or all of the plurality of objects based on a three-dimensional distance between the objects and an overlap ratio;
A group tracking module for tracking the object group as a single object; And
and a group releasing module for ungrouping some or all of the objects in the object group based on at least one of the three-dimensional distance between the object group and each object in the object group and the overlap ratio between each object in the object group, wherein
Wherein the object identification module comprises:
Acquiring object information for each object based on the first image acquired from the first camera,
Acquires object name information based on a second image obtained by enlarging a specific area of the identified object obtained from a second camera different from the first camera.
The group forming module includes:
when the three-dimensional distance between the first object and the second object identified from the first image is less than the first threshold distance and the overlap ratio between the first object and the second object is equal to or greater than the threshold ratio, groups the first object and the second object to form a first object group.
The group forming module includes:
when a three-dimensional distance between a third object identified from the first image and the first object group is less than the first threshold distance, and at least one of the overlap ratios between the third object and the objects in the first object group is equal to or greater than the threshold ratio, further groups the third object into the first object group.
Wherein the grouping module comprises:
determines that the first object is ungrouped from the first object group when the three-dimensional distance between the first object group and the first object is greater than or equal to the first threshold distance and less than the second threshold distance, or when all of the overlap ratios between the first object and the other objects in the object group are less than the threshold ratio.
Wherein the grouping module comprises:
determines whether to ungroup some or all of the objects in the object group based on a three-dimensional distance between the object group in a first frame and each object in the object group in a second frame, which is a frame after the first frame, and an overlap ratio between each object in the object group in the second frame.
Wherein the object identification module comprises:
Acquiring object information and object name information for each object,
The group forming module includes:
Acquiring group information and group name information for the object group,
Wherein the grouping module comprises:
Acquires object information and object name information for the ungrouped object again, and reacquires group information and group name information for the object group in which the ungrouped object was included.
Wherein the object information includes at least one of position information for individual objects, region of interest information, speed-up robust feature (SURF) feature information, and color histogram information,
Wherein the group information includes group location information for the object group.
The group forming module includes:
And obtains the group information and the group name information based on the obtained object information and object name information.
Wherein the grouping module comprises:
acquires object information for the ungrouped object based on the first image acquired from the first camera, and assigns, as the object name of the ungrouped object, one of the object names already obtained, selected based on the acquired object information.
Wherein the first camera is an RGB-D camera and the second camera is a PTZ camera.
An object tracking method comprising: identifying a plurality of objects from a first image obtained from a first camera having a color sensor and a depth sensor; grouping some or all of the plurality of objects to form an object group based on a three-dimensional distance and an overlap ratio between the objects;
Tracking the group of objects as a single object; And
ungrouping some or all of the objects in the object group based on at least one of a three-dimensional distance between the object group and each object in the object group and an overlap ratio between each object in the object group, wherein
Wherein identifying the plurality of objects comprises:
Obtaining object information for each object based on the first image; And
and obtaining object name information based on a second image, different from the first image, in which a specific area of the identified object is enlarged and photographed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150182820A KR101789520B1 (en) | 2015-12-21 | 2015-12-21 | Device and method for tracking group-based multiple object |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150182820A KR101789520B1 (en) | 2015-12-21 | 2015-12-21 | Device and method for tracking group-based multiple object |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170073963A KR20170073963A (en) | 2017-06-29 |
KR101789520B1 true KR101789520B1 (en) | 2017-10-26 |
Family
ID=59280220
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150182820A KR101789520B1 (en) | 2015-12-21 | 2015-12-21 | Device and method for tracking group-based multiple object |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101789520B1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101868103B1 (en) * | 2017-07-12 | 2018-06-18 | 군산대학교 산학협력단 | A video surveillance apparatus for identification and tracking multiple moving objects and method thereof |
KR102050898B1 (en) * | 2017-09-15 | 2019-12-03 | 고려대학교 산학협력단 | Method and apparatus for tracking multiple curling stones using two cameras |
WO2020085526A1 (en) * | 2018-10-23 | 2020-04-30 | 주식회사 인에이블와우 | Terminal and control method thereof |
KR102029140B1 (en) * | 2019-04-30 | 2019-10-07 | 배경 | Apparatus for generating monitoring image |
KR20220073444A (en) | 2020-11-26 | 2022-06-03 | 삼성전자주식회사 | Method and apparatus for tracking object and terminal for performing the method |
KR102544492B1 (en) * | 2021-06-30 | 2023-06-15 | 롯데정보통신 주식회사 | Apparatus and method of managing safety of swimming pool |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101508310B1 (en) | 2014-04-10 | 2015-04-07 | 군산대학교산학협력단 | Apparatus and method for tracking multiple moving objects in video surveillance system |
- 2015-12-21 KR KR1020150182820A patent/KR101789520B1/en active IP Right Grant
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101508310B1 (en) | 2014-04-10 | 2015-04-07 | 군산대학교산학협력단 | Apparatus and method for tracking multiple moving objects in video surveillance system |
Non-Patent Citations (1)
Title |
---|
Munaro et al., "Tracking people within groups with RGB-D data," IEEE/RSJ IROS 2012, pp. 2101-2107.* |
Also Published As
Publication number | Publication date |
---|---|
KR20170073963A (en) | 2017-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101789520B1 (en) | Device and method for tracking group-based multiple object | |
JP3781370B2 (en) | Mobile device | |
US11037325B2 (en) | Information processing apparatus and method of controlling the same | |
JP2022036143A (en) | Object tracking system, object tracking device, and object tracking method | |
US11308347B2 (en) | Method of determining a similarity transformation between first and second coordinates of 3D features | |
Koehler et al. | Stationary detection of the pedestrian's intention at intersections |
US11017588B2 (en) | Image processing apparatus that generates a virtual view image from multiple images captured from different directions and method controlling the same | |
US20150199562A1 (en) | Scale independent tracking pattern | |
US20120086778A1 (en) | Time of flight camera and motion tracking method | |
US20180120106A1 (en) | Map generating device, map generating method, and program recording medium | |
US20100103266A1 (en) | Method, device and computer program for the self-calibration of a surveillance camera | |
KR20160106514A (en) | Method and apparatus for detecting object in moving image and storage medium storing program thereof | |
US20160275695A1 (en) | System and a method for tracking objects | |
US10861185B2 (en) | Information processing apparatus and method of controlling the same | |
US10181075B2 (en) | Image analyzing apparatus, image analyzing method, and storage medium |
CN108629799B (en) | Method and equipment for realizing augmented reality | |
JP2018113021A (en) | Information processing apparatus and method for controlling the same, and program | |
CN108369739B (en) | Object detection device and object detection method | |
CN113362441A (en) | Three-dimensional reconstruction method and device, computer equipment and storage medium | |
JP2018205870A (en) | Object tracking method and device | |
WO2020046203A1 (en) | Device and method for tracking human subjects | |
Mohedano et al. | Robust 3d people tracking and positioning system in a semi-overlapped multi-camera environment | |
US20180350216A1 (en) | Generating Representations of Interior Space | |
Zhou et al. | The chameleon-like vision system | |
Naser et al. | Infrastructure-free NLoS obstacle detection for autonomous cars |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E90F | Notification of reason for final refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |