AU2013398544B2 - A method of determining the location of a point of interest and the system thereof - Google Patents
- Publication number
- AU2013398544B2, AU2013398544A
- Authority
- AU
- Australia
- Prior art keywords
- optical detector
- point
- interest
- location
- coordinates
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Automation & Control Theory (AREA)
- Computer Networks & Wireless Communication (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
Abstract
A method of determining the location of a point of interest is disclosed. The method comprises the steps of providing an optical detector; capturing an image by the optical detector; obtaining the position and orientation of the optical detector; identifying at least one coordinates of the point of interest within the captured image; mapping the position of the optical detector and the coordinates of the point of interest in a digital elevation model; and determining the location of the point of interest by projecting at least one path from the position of the optical detector through the coordinates of the point of interest in the digital elevation model. A system for determining the location of a point of interest by utilizing the aforesaid method is also disclosed herein.
Description
BACKGROUND OF INVENTION

[0002] Automatic/intelligent monitoring systems nowadays can only detect the occurrence of a particular event but cannot specifically identify the location of the event. One possible solution to this technical problem is to manually pre-input the geographic information of the monitoring area and the corresponding monitoring devices. However, this kind of system cannot provide an accurate location of a particular point of interest or object of interest.
[0003] An alternative solution is to utilize multiple cameras to identify the location of the event. The resolution of this kind of system depends on the separation of the cameras. As such, a bulky system is required in order to provide accurate location information, especially when the location of the event is far away from the system. Another limitation of such a system is that regular maintenance is required, as mechanical errors accumulate over time.
SUMMARY OF INVENTION

[0004] In the light of the foregoing background, embodiments of the present invention seek to provide an alternative method of identifying the location of the event and a system thereof, or to at least provide the public with a useful choice.
P1349AUPC
2013398544 12 Jun 2018

[0005] Accordingly, the present invention, in one aspect, is a method of determining the location of a point of interest. The method comprises providing an optical detector; capturing an image by the optical detector; obtaining the position and the orientation of the optical detector; identifying at least one coordinates of the point of interest within the captured image; mapping the position of the optical detector and the coordinates of the point of interest in a digital elevation model; generating the digital elevation model (DEM) by first providing a point cloud comprising a plurality of points and linking the plurality of points together, thereby generating a three-dimensional model of a terrain's surface, wherein each of the points correlates to the altitude of the terrain's surface at a specific latitude and longitude; and determining the location of the point of interest by projecting at least one path from the position of the optical detector through the coordinates of the point of interest in the digital elevation model.
[0006] The term 'comprises' and its grammatical variants have a meaning that is determined by the context in which they appear. Accordingly, the term should not be interpreted exhaustively unless the context dictates so.
[0007] In an exemplary embodiment of the present invention, the step of determining the location of the point of interest further comprises iterative steps of: extending the path in the direction away from the optical detector; and determining whether the path collides with any surface of said digital elevation model. The iterative steps terminate when (a) a collision surface is identified between the path and said digital elevation model or (b) said path has extended over a predetermined value. In one embodiment, the predetermined value is the range of the optical detector.
[0008] In another embodiment, the method further comprises a step of continuously rotating and tilting the optical detector within a predetermined angle while capturing the image.
[0009] According to another aspect of the present invention, a system for determining the location of a point of interest is provided. The system comprises an optical detector for capturing an image comprising the point of interest; a movable platform coupled to the optical detector and configured to rotate and tilt the optical detector; a microprocessor coupled to the optical detector and the movable platform; and a non-transitory computer-readable storage
medium coupled to the microprocessor. The non-transitory computer-readable storage medium is encoded with computer readable instructions that cause said microprocessor to execute: capturing said image; obtaining a position and an orientation of said optical detector; identifying at least one two-dimensional Cartesian coordinates of said point of interest within said image;
mapping said position of said optical detector and said coordinates of said point of interest in a digital elevation model; and determining said location of said point of interest by projecting at least one path from said position of said optical detector through said coordinates of said point of interest in said digital elevation model, wherein said optical detector is an infrared image sensor for capturing a heat distribution or light intensity with a range of said optical detector;
said non-transitory computer-readable storage medium is further encoded with computer-readable instructions that cause said microprocessor to determine said coordinates of said point of interest based on said heat distribution.
[0009a] In one embodiment, the non-transitory computer-readable storage medium is encoded with computer-readable instructions for causing said microprocessor to execute the aforesaid method.
[0010] In one embodiment of the present invention, the optical detector is an infrared image sensor configured to capture the heat distribution.
[0011] In another embodiment, the system further comprises a mist-fog-penetrating detector. The mist-fog-penetrating detector comprises a high dynamic range image sensor and a first light filter unit, coupled to the high dynamic range image sensor, which is configured to filter out the visible spectrum.
[0012] In another embodiment, the system further comprises a sensing device coupled to the microprocessor and configured to detect the position and orientation of the optical detector and to feed back the position and orientation to the microprocessor. In one embodiment, the sensing device comprises a Global Positioning System receiver and an attitude and heading reference system (AHRS) configured to detect the position and the orientation of the optical detector respectively.
[0013] In a further aspect of the present invention, there is provided a non-transitory computer readable storage medium encoded with instructions that, when executed by one or more processors, cause: capturing an image by an optical detector; obtaining the position and orientation of the optical detector; identifying at least one coordinates of the point of interest within the image; mapping the position of the optical detector and the coordinates of the point of interest in a digital elevation model; generating said digital elevation model by: providing a point cloud comprising a plurality of points, and linking said plurality of points thereby generating a three-dimensional model of a terrain's surface, wherein each of said points correlates to the altitude of said terrain's surface at a specific latitude and longitude; and determining the location of the point of interest by projecting at least one path from the position of the optical detector through the coordinates of the point of interest in the digital elevation model.
[0014] There are many advantages to the present invention. Compared with existing monitoring systems, the present invention utilizes only one optical detector. As such, the present invention has a smaller form factor and is more cost effective when compared with other existing systems. Another advantage of the present invention is that the mechanical error of the movable platform will not affect the detection accuracy, as (1) the detection algorithm is conducted using the DEM and (2) the system comprises a plurality of sensors configured to feed back the current position and orientation of the optical sensor.
WO 2015/025195
PCT/IB2013/056837
BRIEF DESCRIPTION OF FIGURES

[0015] Figure 1 is a flow chart of a method of determining the location of a point of interest according to one embodiment of the present invention.
[0016] Figure 2 shows a projection path in a digital elevation model illustrating the algorithm of determining the location of a point of interest according to one embodiment of the present invention.
[0017] Figure 3 is a schematic diagram of an apparatus according to one embodiment of the present invention.
[0018] Figure 4 is a schematic diagram of an apparatus for detecting wildfire according to another embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0019] As used herein and in the claims, "comprising" means including the following elements but not excluding others.
[0020] Referring now to figure 1, the first aspect of the present invention is a method of determining the location of a point of interest. In step 20, an image comprising the point of interest is captured by an optical detector 30. Then, the position and orientation of the optical detector 30 at the time the image was taken are obtained in step 22. The point of interest is selected from the image and at least one coordinates of the point of interest in the image are identified in step 24. In one embodiment, the point can be represented by two-dimensional coordinates (i.e. x coordinate and y coordinate) as in a Cartesian coordinate system. In one specific embodiment, a particular point in the Cartesian coordinate system represents a pixel of the image. In another specific embodiment, a particular point in the Cartesian coordinate system represents a plurality of pixels of the image. Having a particular point represent a plurality of pixels can reduce the computational demand of the method. In step 26, the position of the optical detector 30 and the coordinates of the point of interest as determined in step 24 are mapped in a digital elevation model (DEM). In one specific embodiment, the DEM represents the terrain of the Earth. The position of the optical detector 30 is obtained in the form of geographic coordinates, which include the longitude, latitude and altitude. In one embodiment, the position of the optical detector 30 is obtained by a Global Positioning System. In another embodiment, global coordinates are manually inputted to provide the position of the optical detector 30. The coordinates of the point of interest are determined based on the relationship among the field of view, the position and the orientation of the optical detector 30 at the time the optical detector 30 captured the image. In one embodiment, the orientation of the optical detector 30 is obtained by using an attitude and heading reference system.
In one specific embodiment, the attitude and heading reference system comprises a gyroscope, an accelerometer and a digital compass. In yet another embodiment, the orientation of the optical detector 30 is obtained by detecting the degree to which a moveable platform (which can be a human
hand or provided as a form of mechanical moveable platform) is offset. In step 28, the location of the point of interest is determined using the DEM.
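Paragraph [0020] determines the image coordinates of the point of interest from the field of view, position and orientation of the optical detector 30. The geometry can be illustrated with a minimal Python sketch; it assumes a simple pinhole-style camera with no roll and a linear angle-per-pixel mapping, and the parameter names (`hfov_deg`, `yaw_deg`, etc.) are hypothetical, not taken from the patent:

```python
import math

def pixel_to_ray(px, py, width, height, hfov_deg, vfov_deg, yaw_deg, pitch_deg):
    """Convert a pixel coordinate in the captured image into a unit ray
    direction in world coordinates (east, north, up).

    Illustrative sketch only: assumes no roll, square pixels, and a
    linear mapping between pixel offset and viewing angle.
    """
    # Angular offset of the pixel from the image centre.
    az_off = (px - width / 2) / width * hfov_deg
    el_off = (height / 2 - py) / height * vfov_deg
    az = math.radians(yaw_deg + az_off)    # azimuth, clockwise from north
    el = math.radians(pitch_deg + el_off)  # elevation above the horizon
    return (math.sin(az) * math.cos(el),   # east component
            math.cos(az) * math.cos(el),   # north component
            math.sin(el))                  # up component
```

For example, the centre pixel of the image maps to the detector's boresight direction, so with yaw 0° and pitch 0° the ray points due north along the horizon.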
[0021] In one embodiment, the DEM is a substantially spherical point cloud which comprises a plurality of points. Each of the points represents the altitude of a terrain's surface at a specific latitude and longitude. In another embodiment, any three adjacent points are linked together to form a plurality of triangular surfaces, thereby providing a three-dimensional model of the terrain's surface. In yet another embodiment, any four adjacent points are linked together to form a plurality of rectangular surfaces, thereby providing a three-dimensional model of the terrain's surface. In one embodiment, the posting interval of the DEM is 30 m and the accuracy is 7-14 m (standard deviation).
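The linking of adjacent DEM points into triangular surfaces described in paragraph [0021] can be sketched as follows, under the illustrative assumption that the point cloud is stored as a row-major grid of posts:

```python
def grid_to_triangles(rows, cols):
    """Link a rows x cols grid of DEM posts into triangular surfaces.

    Each grid cell is split into two triangles; vertices are indexed
    row-major.  Illustrative sketch of the surface-linking step, not
    the patented implementation.
    """
    tris = []
    for i in range(rows - 1):
        for j in range(cols - 1):
            a = i * cols + j   # top-left post of the cell
            b = a + 1          # top-right post
            c = a + cols       # bottom-left post
            d = c + 1          # bottom-right post
            tris.append((a, b, c))
            tris.append((b, d, c))
    return tris
```

A 2 x 2 grid yields a single cell split into two triangles; rectangular surfaces (the four-point variant) would simply keep each cell as one quad instead of splitting it.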
[0022] The algorithm for determining the location of the point of interest using the DEM in step 28 is illustrated in figure 2. As shown in figure 2, the optical detector 30, the captured image 32 and the coordinates 34 of the point of interest with respect to the captured image 32 are mapped in the DEM 36 according to their positions and orientations in the real world respectively. To determine the location of the point of interest, a path 40 passing through the optical detector 30 and the coordinates 34 is first projected. The path 40 illustrates the trajectory of light originating from the point of interest and entering the optical detector 30. After the path 40 is projected, it extends incrementally away from the optical detector 30 in the direction of the path 40 passing through the optical detector 30 and the coordinates 34. In one embodiment, the path 40 extends by one pixel along the direction of the path 40 in each increment. For each increment, a collision detection algorithm is executed to determine if the path collides with any surface of the DEM. In one embodiment, the path keeps extending until 1) the path collides with a particular surface of the DEM; or 2) the length of the path has extended over a predetermined value. In one specific embodiment, the predetermined value is the effective detection range of the optical detector system. In a specific embodiment, the effective detection range is 5 km for an optical detector system with a 320 x 240 pixel sensor.
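The incremental path-extension and collision-detection loop of paragraph [0022] can be sketched in Python as below. The fixed `step` length, the `dem_height` lookup function and the local Cartesian coordinates are illustrative assumptions — the patent extends the path one pixel per increment and tests the path against the DEM's linked surfaces:

```python
def locate_point_of_interest(origin, direction, dem_height, max_range, step=30.0):
    """March a ray from the detector through the DEM until it meets terrain.

    origin     -- (x, y, z) position of the optical detector
    direction  -- unit vector through the point-of-interest coordinates
    dem_height -- callable returning the terrain altitude at (x, y)
    max_range  -- effective detection range of the optical detector

    Returns the collision point, or None if the ray leaves the range
    first (the error case of paragraph [0024]).  Illustrative sketch.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    t = 0.0
    while t <= max_range:
        t += step
        x, y, z = ox + dx * t, oy + dy * t, oz + dz * t
        if z <= dem_height(x, y):        # path collides with a DEM surface
            return (x, y, dem_height(x, y))
    return None                          # extended beyond range: report error
```

With flat terrain at altitude 0 and a detector 100 m up looking 45° downward, the loop terminates at the first increment whose altitude falls below the surface; an upward-pointing ray instead exhausts the range and returns None.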
[0023] If a collision between the path 40 and a particular surface of the DEM is detected, that collision surface 41 refers to the location of the point of interest. In one embodiment, the xyz-coordinates of the collision surface 41 with respect to the DEM 36 correlate to the latitude, longitude and altitude of the point of interest respectively. In one specific embodiment, the geographic coordinates (i.e. the latitude, longitude and altitude) of the point of interest are obtained by checking a predefined database. In yet another specific embodiment, the latitude and longitude of that particular surface are provided as the latitude and longitude of the point of interest. In a further specific embodiment, the altitude of the point of interest, which is equivalent to the altitude of that particular surface, is also provided.
[0024] On the other hand, if no collision is detected even when the path 40 is extended beyond the predetermined value, an error message will be provided. In one embodiment, the error message indicates that the point of interest is at or beyond the edge of the DEM.
[0025] The second aspect of the present invention, as shown in figure 3, is a system 50 for determining the location of a point of interest by applying the aforesaid method. The system 50 comprises an optical detector 30 for capturing images, a sensing device 49 which includes a locating system 42 for obtaining the geographic coordinates of the optical detector 30 and an orientation system 44 for obtaining the orientation of the optical detector 30, a microprocessor 46 and a non-transitory computer-readable storage medium 48. The sensing device 49 is configured to detect the position and the orientation of the optical detector 30 and to feed back the position and the orientation of the optical detector to the microprocessor 46. The microprocessor 46 is coupled to the optical detector 30, the locating system 42, the orientation system 44 and the non-transitory computer-readable storage medium 48. The non-transitory computer-readable storage medium 48 is encoded with instructions that, when performed by the microprocessor 46, cause performance of the steps of the aforesaid method.
[0026] In one embodiment, the locating system 42 is configured to receive a manual input of the global coordinates of the optical detector 30 from the user of the system 50. In another embodiment, the orientation system 44 detects the degrees to which a
moveable platform (which can be a human hand or provided as a form of mechanical moveable platform) is offset. In one specific embodiment, the orientation system 44 is a mechanism configured to report the offset angles of the platform.
[0027] In one embodiment, the system 50 is an electronic device which includes a camera, a Global Positioning System receiver (i.e. locating system 42), an attitude and heading reference system (AHRS) (i.e. orientation system 44), and a micro-computer (i.e. microprocessor 46 and non-transitory computer-readable storage medium 48). In one specific embodiment, the attitude and heading reference system (AHRS) comprises a gyroscope, an accelerometer and a digital compass. In another specific embodiment, the electronic device is a smartphone, tablet, laptop computer, binoculars, camera and/or handheld video camcorder.
[0028] To better illustrate the present invention, a specific realization of applying the aforesaid method and system to detecting wildfire is shown in figure 4. A system 52 for detecting wildfire comprises an optical detector 54 for capturing images, a movable platform 56 coupled to the optical detector 54 for rotating and tilting the optical detector 54, and a microcomputer 58 coupled to the optical detector 54 and the movable platform 56. In one embodiment, the microcomputer 58 further comprises a microprocessor (not shown) and a non-transitory computer-readable storage medium (not shown). In the context of detecting wildfire, the optical detector 54 is an infrared image sensor configured to capture the heat distribution within a predefined range. In one embodiment, the infrared image sensor comprises a plurality of cooled photodetectors configured to detect far infrared emitted by the wildfire. In yet another embodiment, the infrared image sensor comprises a plurality of uncooled photodetectors configured to detect far infrared emitted by the wildfire. In one specific embodiment, the infrared image sensor can output 14-bit signals to the microcomputer 58 for further processing.
[0029] The infrared image sensor is further coupled to a movable platform 56. In one embodiment, the movable platform 56 is a numerical control head with a positioning accuracy of ±0.2°. In another embodiment, the movable platform 56 is configured to continuously rotate and tilt upon receiving instructions from the microcomputer 58. In
another embodiment, the optical detector 54 is configured to continuously rotate and tilt within a predetermined angle while capturing images of the environment. In one specific embodiment, the movable platform 56 is freely rotatable along its axial axis. In another specific embodiment, the tilt range of the movable platform 56 is ±40°. In another embodiment, the movable range of the movable platform 56 is adjustable according to the user's preferences without affecting the performance of the system 52.
[0030] Accordingly, the microcomputer 58 is coupled to the optical detector 54 and the movable platform 56. In the context of detecting wildfire, the microcomputer 58 is configured to analyze the heat distribution as captured by the infrared image sensor, thereby determining the coordinates of the place which is on fire with respect to the image (i.e. selecting the point of interest from the captured image by identifying an object that emits an infrared signal over a predetermined value). In one embodiment, the microcomputer 58 is further configured to retrieve the position and orientation of the movable platform 56. After the coordinates of the point of interest and the position and orientation of the movable platform 56 are determined, the microcomputer 58 is then configured to execute the aforesaid method to determine the location of the place that is on fire.
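The selection step of paragraph [0030] — identifying the point of interest as the object whose infrared signal exceeds a predetermined value — might be sketched as a simple threshold-and-argmax over the heat distribution. The 2-D list representation and the `threshold` parameter are illustrative assumptions:

```python
def find_hotspot(heat_image, threshold):
    """Pick the point of interest from a captured heat distribution.

    heat_image -- 2-D list of intensity readings (e.g. 14-bit values)
    threshold  -- predetermined value an object must exceed

    Returns the (x, y) pixel of the strongest reading above the
    threshold, or None if nothing exceeds it.  Illustrative sketch.
    """
    best, best_xy = threshold, None
    for y, row in enumerate(heat_image):
        for x, value in enumerate(row):
            if value > best:       # hotter than anything seen so far
                best, best_xy = value, (x, y)
    return best_xy
```

The returned pixel coordinates would then feed the mapping and projection steps of the aforesaid method.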
[0031] To further facilitate the detection of wildfire, the system 52 further comprises a mist-fog-penetrating detector (not shown). The mist-fog-penetrating detector is essential for the wildfire detection application. Due to an inherent limitation of the infrared image sensor 54, objects emitting similar amounts of infrared (i.e. with similar temperature) within a region cannot be identified distinctively merely based on the heat distribution as captured by the infrared image sensor. For instance, if there is a burning tree surrounded by a pile of rocks with a similar temperature, the system 52 may misinterpret the fire area and wrongly determine the coordinates of the point on fire. As such, a mist-fog-penetrating detector, which can provide a clear grayscale image of the actual forest environment, can greatly improve the efficiency and accuracy of the system 52 as proposed by the present invention.
[0032] In one embodiment, the mist-fog-penetrating detector comprises a fixed-focal lens, a high dynamic range image sensor and a first light filter coupled to the high dynamic range image sensor, which is configured to filter out the visible spectrum before the light is captured by the image sensor. In the environment of a wildfire, the visible spectrum of the environment light will diffract and scatter when it hits the surrounding water vapors or dust particles. The intensity of these diffracted/scattered lights is usually higher than that of the lights reflected by surrounding objects. The first light filter is configured to filter out these high intensity lights.
[0033] In another embodiment, the mist-fog-penetrating detector further comprises a second light filter which allows only the near infrared spectrum to reach the high dynamic range image sensor. The penetration power of the near infrared spectrum is higher than that of visible light. Another characteristic of the near infrared spectrum is that dark-colored objects absorb more near infrared than light-colored objects. Therefore, by analyzing the intensity of the near infrared spectrum as captured by the high dynamic range image sensor of the mist-fog-penetrating detector, different objects can be identified based on their color. In one embodiment, the grayscale image captured by the mist-fog-penetrating detector is compared with the heat distribution captured by the infrared image sensor to reduce false alarms of the wildfire detection system.
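The false-alarm reduction of paragraph [0033] — comparing the grayscale image with the heat distribution — can be sketched as below. Treating burning material as dark in the near-infrared image, and both threshold values, are hypothetical assumptions for illustration, not details given by the patent:

```python
def confirm_fire(heat_image, gray_image, heat_threshold, dark_threshold):
    """Cross-check hot pixels against the mist-fog-penetrating image.

    A hot reading is kept only where the near-infrared grayscale image
    is dark (dark-colored objects absorb more near infrared).  Both
    thresholds are hypothetical tuning parameters; this is a sketch of
    the false-alarm reduction step, not the patented implementation.
    """
    confirmed = []
    for y, row in enumerate(heat_image):
        for x, heat in enumerate(row):
            if heat > heat_threshold and gray_image[y][x] < dark_threshold:
                confirmed.append((x, y))   # hot AND dark: plausible fire
    return confirmed
```

Pixels that are hot in the infrared image but bright in the near-infrared grayscale image (e.g. sun-warmed rocks) would be rejected, which is the cross-check the paragraph describes.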
[0034] The exemplary embodiments of the present invention are thus fully described. Although the description referred to particular embodiments and realizations, it will be clear to one skilled in the art that the present invention may be practiced with variations of these specific details. Hence this invention should not be construed as limited to the embodiments set forth herein.
[0035] For example, a wildfire detection system and method are disclosed as one realization of the present invention. The method and system as taught by the present invention can be applied to other applications, for instance security monitoring systems and line-of-sight analysis.
[0036] The optical detector 30 is selected from a group consisting of an infrared image sensor, a mist-fog-penetrating detector and combinations thereof according to the embodiments described above. However, the optical detector 30 can be other detectors according to the requirements of a particular application, for instance a complementary metal-oxide-semiconductor (CMOS) image sensor, a charge-coupled device (CCD) sensor or an ultraviolet (UV) image sensor.
Claims (18)
- What is claimed is:1. A method of determining a location of a point of interest comprising:providing an optical detector;capturing an image by said optical detector;5 obtaining a position and an orientation of said optical detector;identifying at least one coordinates of said point of interest within said image; mapping said position of said optical detector and said coordinates of said point of interest in a digital elevation model;generating said digital elevation model by: providing a point cloud comprising a 10 plurality of points; and linking said plurality of points thereby generating a threedimensional model of a terrain's surface; wherein each of said points correlates to the altitude of said terrain's surface at a specific latitude and longitude; and determining said location of said point of interest by projecting at least one path from said position of said optical detector through said coordinates of said point of interest 15 in said digital elevation model.
- 2. The method of claim 1, wherein said step of determining said location of said point of interest further comprises iterative steps of:extending said path in the direction away from said optical detector; and 20 determining whether said path collides with any surface of said digital elevation model;wherein said iterative steps terminate when at least one of termination conditions is satisfied, said termination conditions including:P1349AUPC2013398544 12 Jun2018 a collision surface is identified between said path and said digital elevation model;and said path has extended over a predetermined value.5 3. The method of claim 2, wherein said determining said location of said point of interest further includes calculating and obtaining said location in a form of geographic coordination based on said collision surface.4. The method of claim 1 further comprising continuously rotating and tilting said optical 10 detector within a predetermined angle while capturing said image.5. The method of claim 1, wherein said obtaining said position of said optical detector further includes gathering a geographic coordination of said optical detector through a Global Positioning System.6. The method of claim 1, wherein said step of obtaining said orientation further comprises assessing a degree to which an attitude and heading reference system is offset.7. The method of claim 1 further comprising selecting said point of interest from said 20 captured image by identifying an object on said captured image.8. The method of claim 7, wherein said object emits an infrared signal over a predetermined value.P1349AUPC2013398544 12 Jun 20189. The method of claim 1 further comprising selecting said point of interest by a user input.10. 
A system for determining a location of a point of interest comprising:an optical detector for capturing an image including said point of interest;5 a movable platform coupled to said optical detector and configured to rotate and tilt said optical detector;a microprocessor coupled to said optical detector and said movable platform; and a non-transitory computer-readable storage medium coupled to said microprocessor, said non-transitory computer-readable storage medium encoded with10 computer-readable instructions that cause said microprocessor to execute: capturing said image;obtaining a position and an orientation of said optical detector; identifying at least one two-dimensional Cartesian coordinates of said point of interest within said image;15 mapping said position of said optical detector and said coordinates of said point of interest in a digital elevation model; and determining said location of said point of interest by projecting at least one path from said position of said optical detector through said coordinates of said point of interest in said digital elevation model,20 wherein said optical detector is an infrared image sensor for capturing a heat distribution or light intensity with a range of said optical detector; said nontransitory computer-readable storage medium is further encoded with computerreadable instructions that cause said microprocessor to determine said coordinates of said point of interest based on said heat distribution.P1349AUPC2013398544 12 Jun201811. The system of claim 10 further comprising a mist-fog-penetrating detector, wherein said mist-fog-penetrating detector comprises:a high dynamic range image sensor; and5 a first light filter unit coupled to said high dynamic range image sensor; said first light filter unit filters out visible spectrum from light reaching said high dynamic range image sensor.12. 
The system of claim 11 further comprising a second light filter unit coupled to said first light filter unit; said second light filter unit allows only a specific portion of the infrared spectrum to reach said high dynamic range image sensor.
- 13. The system of claim 11 further comprising a sensing device coupled to said microprocessor; said sensing device detects said position and said orientation of said optical detector and sends said position and said orientation of said optical detector to said microprocessor.
- 14. The system of claim 13, wherein said sensing device comprises a Global Positioning System receiver that detects said position of said optical detector and an attitude and heading reference system (AHRS) that detects said orientation of said optical detector.
- 15. A non-transitory computer readable storage medium encoded with instructions that, when executed by one or more processors, cause: capturing an image by an optical detector; obtaining a position and an orientation of said optical detector; identifying at least one coordinates of said point of interest within said image; mapping said position of said optical detector and said coordinates of said point of interest in a digital elevation model; generating said digital elevation model by: providing a point cloud comprising a plurality of points; and linking said plurality of points thereby generating a three-dimensional model of a terrain's surface; wherein each of said points correlates to the altitude of said terrain's surface at a specific latitude and longitude; and determining said location of said point of interest by: projecting at least one path from said position of said optical detector through said coordinates of said point of interest in said digital elevation model.
- 16.
The non-transitory computer readable storage medium of claim 15, wherein said step of determining said location of said point of interest further comprises iterative steps of: extending said path in the direction away from said optical detector; and determining whether said path collides with any surface of said digital elevation model; wherein said iterative steps terminate when at least one of the termination conditions is satisfied, said termination conditions including: a collision surface is identified between said path and said digital elevation model; and said path has extended over a predetermined value.
- 17. The non-transitory computer readable storage medium of claim 15, wherein said step of determining said location further comprises a step of calculating and obtaining said location in the form of geographic coordinates based on said collision surface.
- 18. The non-transitory computer readable storage medium of claim 15, wherein said instructions further include continuously rotating and tilting said optical detector within a predetermined angle while capturing said image.
- Figure 1
- Figure 2
- Figure 3
- Figure 4
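Claim 15 recites generating the digital elevation model from a point cloud in which each point carries the terrain altitude at a specific latitude and longitude. One simple way to realise that step is to bin the points into a regular grid and keep one surface altitude per cell. This is a hypothetical gridding scheme, not taken from the patent: the function name, the metric x/y frame, the cell size, and the keep-the-highest-sample aggregation rule are all illustrative assumptions.

```python
def point_cloud_to_dem(points, rows, cols, cell=1.0):
    """Bin a point cloud into a rows x cols grid of surface altitudes.

    points -- iterable of (x, y, altitude) tuples in a local metric frame
              (standing in for the patent's latitude/longitude/altitude points)
    cell   -- grid cell size in the same units as x and y
    Cells that received no point are left as None.
    """
    dem = [[None] * cols for _ in range(rows)]
    for x, y, alt in points:
        i, j = int(x // cell), int(y // cell)
        if 0 <= i < rows and 0 <= j < cols:
            # keep the highest sample per cell as the terrain surface
            if dem[i][j] is None or alt > dem[i][j]:
                dem[i][j] = alt
    return dem
```

A real implementation would also interpolate the empty cells (the claim's "linking said plurality of points" into a continuous surface); the binning above only shows how each point correlates to an altitude at one grid location.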
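The location-determination step the claims describe (claims 2, 3, 16, and 17) amounts to ray marching against the terrain model: a path is projected from the detector's position through the point of interest's image coordinates, extended stepwise away from the detector, and terminated either when it collides with a surface of the digital elevation model or when it has extended over a predetermined value. A minimal sketch of that loop follows, under stated assumptions: the DEM is a uniform 1-unit x/y grid of altitudes in a local metric frame, and the unit direction vector through the selected pixel has already been derived from the detector's position and orientation. The function name, step size, and range limit are illustrative, not the patent's.

```python
import math

def locate_point_of_interest(dem, origin, direction, step=5.0, max_range=20000.0):
    """March a ray from the detector through the DEM until it hits terrain.

    dem       -- 2D list of surface altitudes on a uniform 1-unit x/y grid
    origin    -- (x, y, z) detector position in the same local frame
    direction -- vector from the detector through the point-of-interest pixel
    Returns the (x, y, altitude) collision point, or None when the path
    leaves the model or exceeds max_range without striking a surface.
    """
    norm = math.sqrt(sum(c * c for c in direction))
    dx, dy, dz = (c / norm for c in direction)
    x, y, z = origin
    travelled = 0.0
    while travelled < max_range:  # termination: path extended over a predetermined value
        # extend the path in the direction away from the detector
        x, y, z = x + dx * step, y + dy * step, z + dz * step
        travelled += step
        i, j = math.floor(x), math.floor(y)  # DEM cell under the current point
        if not (0 <= i < len(dem) and 0 <= j < len(dem[0])):
            return None  # ray left the modelled terrain
        if z <= dem[i][j]:  # termination: collision surface identified
            return (x, y, dem[i][j])
    return None
```

On a collision, mapping the returned x/y back to the grid's latitude and longitude yields the location "in the form of geographic coordinates based on said collision surface" of claims 3 and 17; a `None` result corresponds to the path running out of range without intersecting the model.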
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/IB2013/056837 WO2015025195A1 (en) | 2013-08-23 | 2013-08-23 | A method of determining the location of a point of interest and the system thereof |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| AU2013398544A1 (en) | 2016-03-03 |
| AU2013398544B2 true AU2013398544B2 (en) | 2018-07-05 |
Family
ID=52483122
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| AU2013398544A Ceased AU2013398544B2 (en) | 2013-08-23 | 2013-08-23 | A method of determining the location of a point of interest and the system thereof |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US9714833B2 (en) |
| CN (1) | CN105637322A (en) |
| AU (1) | AU2013398544B2 (en) |
| CA (1) | CA2921662A1 (en) |
| MX (1) | MX361868B (en) |
| WO (1) | WO2015025195A1 (en) |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10444006B2 (en) | 2015-08-19 | 2019-10-15 | Faro Technologies, Inc. | Three-dimensional imager |
| US9589358B1 (en) | 2015-09-29 | 2017-03-07 | International Business Machines Corporation | Determination of point of interest views from selected vantage points |
| US10984640B2 (en) * | 2017-04-20 | 2021-04-20 | Amazon Technologies, Inc. | Automatic adjusting of day-night sensitivity for motion detection in audio/video recording and communication devices |
| CN107392778B (en) * | 2017-07-26 | 2020-12-22 | 深圳畅博通科技有限公司 | Self-authentication method and storage medium for plot and shooting information |
| CN108536980B (en) * | 2018-04-18 | 2021-11-12 | 中国石油大学(华东) | Gas detector discrete site selection optimization method considering reliability factor |
| TWI690718B (en) * | 2018-08-02 | 2020-04-11 | 李宏富 | Target position tracking system |
| CN110807831B (en) * | 2019-09-18 | 2023-03-28 | 重庆大学 | Sensor coverage area calculation method based on minimum unit collision detection |
| CN111352112B (en) * | 2020-05-08 | 2022-11-29 | 泉州装备制造研究所 | Target detection method based on vision, lidar and millimeter wave radar |
| US11170476B1 (en) * | 2020-10-15 | 2021-11-09 | Aeva, Inc. | Techniques for fast point cloud filtering using a series cascaded filter |
| FR3151690A1 (en) | 2023-07-28 | 2025-01-31 | Centre National De La Recherche Scientifique | Method and device for locating a point of interest on a terrain from an image |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1880918A (en) * | 2005-06-14 | 2006-12-20 | LG Electronics Inc. | Matching camera-photographed image with map data in portable terminal and travel route guidance method |
| US20110137561A1 (en) * | 2009-12-04 | 2011-06-09 | Nokia Corporation | Method and apparatus for measuring geographic coordinates of a point of interest in an image |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| FR2706345B1 (en) * | 1993-06-11 | 1995-09-22 | Bertin & Cie | Method and device for locating in space a mobile object such as a sensor or a tool carried by a robot. |
| US7295925B2 (en) * | 1997-10-22 | 2007-11-13 | Intelligent Technologies International, Inc. | Accident avoidance systems and methods |
| US7426437B2 (en) * | 1997-10-22 | 2008-09-16 | Intelligent Technologies International, Inc. | Accident avoidance systems and methods |
| US7418346B2 (en) * | 1997-10-22 | 2008-08-26 | Intelligent Technologies International, Inc. | Collision avoidance methods and systems |
| US7629899B2 (en) * | 1997-10-22 | 2009-12-08 | Intelligent Technologies International, Inc. | Vehicular communication arrangement and method |
| US8634993B2 (en) * | 2003-03-20 | 2014-01-21 | Agjunction Llc | GNSS based control for dispensing material from vehicle |
| US7728869B2 (en) * | 2005-06-14 | 2010-06-01 | Lg Electronics Inc. | Matching camera-photographed image with map data in portable terminal and travel route guidance method |
| ATE467817T1 (en) * | 2005-09-12 | 2010-05-15 | Trimble Jena Gmbh | SURVEYING INSTRUMENT AND METHOD FOR PROVIDING SURVEYING DATA USING A SURVEYING INSTRUMENT |
| US9285459B2 (en) * | 2008-05-09 | 2016-03-15 | Analog Devices, Inc. | Method of locating an object in 3D |
| KR101663669B1 (en) | 2008-06-16 | 2016-10-07 | Eyefi Pty Ltd | Spatial predictive approximation and radial convolution |
| IL193906A (en) * | 2008-09-04 | 2012-06-28 | Pro Track Ltd | Methods and systems for creating an aligned bank of images with an iterative self-correction technique for coordinate acquisition and object detection |
| US9020753B2 (en) * | 2010-05-12 | 2015-04-28 | Telefonaktiebolaget L M Ericsson (Publ) | Method, computer program and apparatus for determining an object in sight |
| CN102646311B (en) * | 2012-05-04 | 2014-06-11 | 中国科学院长春光学精密机械与物理研究所 | Intelligent smoke and fire detecting system using real-time dynamic cruising images |
| CN102784451B (en) * | 2012-08-06 | 2014-12-10 | 西安硅光电子科技有限公司 | Automatic positioning flame detection system for three-dimensional space |
| CN103065412B (en) * | 2012-12-06 | 2014-12-17 | 广东省林业科学研究院 | Interference source intelligent shielding method and device thereof applied to forest fire monitoring system |
| CN103106766B (en) * | 2013-01-14 | 2014-12-17 | 广东赛能科技有限公司 | Forest fire identification method and forest fire identification system |
| CN103400463B (en) * | 2013-06-21 | 2016-08-10 | 广东省林业科学研究院 | A kind of forest fires localization method based on two dimensional image and device |
- 2013
- 2013-08-23 CA CA2921662A patent/CA2921662A1/en not_active Abandoned
- 2013-08-23 AU AU2013398544A patent/AU2013398544B2/en not_active Ceased
- 2013-08-23 WO PCT/IB2013/056837 patent/WO2015025195A1/en not_active Ceased
- 2013-08-23 MX MX2016002231A patent/MX361868B/en active IP Right Grant
- 2013-08-23 CN CN201380078947.8A patent/CN105637322A/en active Pending
- 2013-08-23 US US14/913,378 patent/US9714833B2/en active Active
Also Published As
| Publication number | Publication date |
|---|---|
| MX2016002231A (en) | 2016-11-18 |
| US20160202071A1 (en) | 2016-07-14 |
| CA2921662A1 (en) | 2015-02-26 |
| WO2015025195A1 (en) | 2015-02-26 |
| CN105637322A (en) | 2016-06-01 |
| AU2013398544A1 (en) | 2016-03-03 |
| US9714833B2 (en) | 2017-07-25 |
| MX361868B (en) | 2018-12-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| AU2013398544B2 (en) | A method of determining the location of a point of interest and the system thereof | |
| US10687022B2 (en) | Systems and methods for automated visual surveillance | |
| CN104902246A (en) | Video monitoring method and device | |
| JP2008527806A (en) | Night monitoring system and method | |
| DK2690459T3 (en) | Device and method for identifying and documenting at least one object passing through a radiation field | |
| CN103400463B (en) | A kind of forest fires localization method based on two dimensional image and device | |
| US20100265505A1 (en) | Laser beam image contrast enhancement | |
| CA3112187C (en) | Optics based multi-dimensional target and multiple object detection and tracking method | |
| CN110067274A (en) | Apparatus control method and excavator | |
| CN105141912B (en) | A kind of method and apparatus of signal lamp reorientation | |
| WO2019198076A1 (en) | Real-time raw data- and sensor fusion | |
| WO2020105527A1 (en) | Image analysis device, image analysis system, and control program | |
| CN104330075B (en) | Rasterizing polar coordinate system object localization method | |
| KR101889051B1 (en) | Method for increasing reliability in monitoring systems | |
| CN106524995B (en) | Detect the localization method of target object space length in real time based on visible images | |
| CN102622845A (en) | Background interference elimination device and elimination method based on forest flash point de-disturbance point positioning device | |
| EP3015839B1 (en) | Laser pointing system for monitoring stability of structures | |
| KR102660024B1 (en) | Method for detection road event and apparatus for the same | |
| Shahbazi et al. | Vehicle Tracking and Speed Estimation from Unmanned Aerial Videos | |
| HK1225099A1 (en) | A method of determining the location of a point of interest and the system thereof | |
| CN120747800B (en) | A method for inspecting airport navigation lights based on UAV technology | |
| US20250182320A1 (en) | Automatic range and geo-referencing for image processing systems and methods | |
| CN114002751B (en) | Abnormal position identification method, system and device | |
| US20170188018A1 (en) | Method for generating a depth map using a camera | |
| KR101150563B1 (en) | Apparatus and method for monitoring using beam splitter |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| FGA | Letters patent sealed or granted (standard patent) | ||
| MK14 | Patent ceased section 143(a) (annual fees not paid) or expired |