WO2010089738A2 - Controlling an imaging apparatus over a delayed communication link - Google Patents
Controlling an imaging apparatus over a delayed communication link
- Publication number
- WO2010089738A2 (PCT application PCT/IL2010/000095)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- imaging apparatus
- identified target
- command
- image
- Prior art date
Links
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
Definitions
- the present invention relates to the field of remote controlling, and more particularly, to remote controlling over a delayed communication link via a vision display.
- the term "remotely piloted aircraft" or "unmanned aerial vehicle" (UAV/RPA), as used herein in this application, refers to an aircraft flying without a human pilot.
- a UAV/RPA may be remotely controlled or fly autonomously based on preprogrammed flight plans or more complex dynamic automation systems.
- UAVs/RPAs are currently used in a number of military roles, including reconnaissance. They are also used in a small but growing number of civil applications such as firefighting when a human observer would be at risk, police observation of civil disturbances and crime scenes, and reconnaissance support in natural disasters.
- the term "payload” as used herein in this application is the load carried by an UAWRPA exclusive of what is necessary for its operation.
- the payload may comprise, inter alia, an imaging apparatus that provides the user of the UAWRPA with a dynamic vision display (e.g. a video sequence).
- the vision display may comprise a predefined point that corresponds with the general pointing point of the payload.
- the pointing point may be indicated in a particular graphic manner (e.g., a cross) so that the user will be informed of the current pointing direction of the payload.
- FIG. 1 is a high level schematic diagram showing a communication link between a user and a remote controlled unmanned aerial vehicle (UAV/RPA).
- a user (not shown) is in operative association with a control station 10 that is in direct communication with a transponder such as a communication satellite 20.
- Communication satellite 20 is in direct communication with UAV/RPA 30 that carries a payload such as an imaging apparatus 35. Between imaging apparatus 35 and a potential target 40 there is a direct line of sight. In operation, imaging apparatus 35 repeatedly captures images that may contain potential target 40. These images are transmitted to communication satellite 20 which in turn, transmits them to control station 10 thereby providing the user with a dynamic vision display (e.g. video sequence) associated with the pointing direction of imaging apparatus 35.
- the delay is constituted of two parts.
- the first part is an uplink delay which is the delay from the time a control command is given (and transmitted) by the user until the control command reaches the payload.
- the second part is a downlink delay which is a delay from the time of a particular image of the video sequence is captured until the time that particular image reaches the user.
- a method of enabling a user to control a pointing direction of an imaging apparatus over a delayed communication link comprises: enabling the user to track a user-identified target on a currently presented image of periodically transmitted images from the imaging apparatus; calculating a distance between the estimated location of the user-identified target, in view of the user's tracking, and the estimated location of the pointing point of the imaging apparatus at a future time, wherein the estimations relate to the future time by which a command control currently transmitted by the user reaches the imaging apparatus; and calculating a command control required for directing the pointing point of the imaging apparatus onto the user-identified target, based on said calculated distance and further based on all previous control commands that have already been transmitted by the user but have not yet affected the currently presented image due to the delay in the communication link.
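As an editorial illustration of the method summarized above, the sketch below computes a per-cycle correction from an estimated future target location, an estimated future pointing point, and the commands still in flight. All names and the two-axis tuple layout are assumptions made for the example, not the patent's implementation.

```python
# Minimal sketch, assuming positions and commands are (x, y) pairs in the same
# angular units; this is not the patent's implementation.

def command_for_cycle(target_future_xy, pointing_future_xy, commands_in_flight):
    """Distance the pointing point must still cover at the future time the
    currently transmitted command arrives, minus commands already in flight."""
    dx = target_future_xy[0] - pointing_future_xy[0]
    dy = target_future_xy[1] - pointing_future_xy[1]
    pending_x = sum(cx for cx, _ in commands_in_flight)   # transmitted, but not
    pending_y = sum(cy for _, cy in commands_in_flight)   # yet visible on screen
    return dx - pending_x, dy - pending_y

# Example: target expected 4 units right of the predicted pointing point,
# with 1 unit of correction already in flight on that axis.
print(command_for_cycle((10.0, 5.0), (6.0, 5.0), [(1.0, 0.0)]))  # -> (3.0, 0.0)
```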
- FIG. 1 is a high level schematic diagram of an unmanned aerial vehicle (UAV/RPA) controlled via a satellite according to the existing art;
- FIG. 2 is a high level flowchart showing an aspect of the method according to some embodiments of the invention.
- FIG. 3 is a timing diagram showing an aspect of the method according to some embodiments of the invention.
- FIG. 4 is a schematic diagram of a vision display according to some embodiments of the invention.
- FIG. 5 is a timing diagram showing an aspect of the method according to some embodiments of the invention.
- FIG. 6 and FIG. 7 show a high level flowchart illustrating an aspect of a method according to some embodiments of the invention.
- the present invention, in embodiments thereof, provides a method of enabling a user to effectively control a remotely located imaging apparatus over a communication link exhibiting a delay.
- Embodiments of the present invention take the delays into account when computing the optimal commands that need to be transmitted at any given time in order to direct the imaging device onto a target identified by the user.
- the user is provided with a visual display (e.g., a video sequence exhibiting consecutive images) and with an interface enabling him or her to track a target he or she identifies on the visual display.
- the tracking of the target is then used by the proposed method to estimate the location and velocity of the identified target on an image currently presented to the user, at a future time which corresponds with the time by which commands executed by the user at a current time will reach the imaging apparatus.
- the proposed method may calculate the required commands in order to direct the imaging apparatus onto the target.
- the calculated commands further take into account all previous commands that had been transmitted by the user but have not yet affected the image currently presented to the user.
- FIG. 2 is a high level flowchart showing an aspect of the method according to some embodiments of the invention.
- the flowchart shows a method of enabling a user to control a spatial direction of an imaging apparatus over a communication link exhibiting an uplink delay and a downlink delay.
- the method comprises: periodically transmitting a control command for spatially directing the imaging apparatus, wherein the imaging apparatus periodically transmits to the user an image, and wherein the user is presented with the transmitted image which contains a pointing point of the imaging apparatus 210; enabling the user to track a user-identified target on a currently presented image of the periodically transmitted images 220 in real time; estimating a location of the user-identified target in view of the user's tracking and the command control which directed the presented image, at a future time corresponding with the uplink delay, wherein the uplink delay is a time required for a command control currently transmitted by the user to reach the imaging apparatus 230; estimating a location of the pointing point of the imaging apparatus, at a future time related to the uplink delay 240; calculating a distance between the location of the user-identified target and the location of the pointing point of the imaging apparatus at said future time 250; and calculating a command control required for spatially directing the pointing point of the imaging apparatus onto the user-identified target,
- FIG. 3 is a timing diagram showing an aspect of the method according to some embodiments of the invention.
- Timing diagram 300 shows a time-scale exhibiting periods or cycles of operation 1-14. In each cycle, a new image from the imaging apparatus is presented to the user and further, a command control from the user may be transmitted to the imaging apparatus.
- as explained above, due to the delayed communication link there is a time difference between transmitting a command by the user 310 and receiving it by the imaging apparatus 312. This delay is denoted as the uplink delay 320. There is also a delay due to the time difference between transmitting the image by the imaging apparatus 312 and receiving it by the user 314. This delay is denoted as the downlink delay 340.
- receiving the command by the imaging apparatus and transmitting an image by the imaging apparatus occur at the same time.
- Embodiments of the present invention overcome these two types of delays by taking them into account while calculating, at any given time, the required command for directing the pointing point of the imaging apparatus onto the user-identified target.
- the position of the pointing point of the imaging apparatus is easily determined by summing up all the previous commands that have been already transmitted.
- the location of the user-identified target may be estimated by first calculating its momentary velocity and then its average velocity, under the assumption that its velocity (a vector incorporating speed and direction) does not change substantially during the uplink delay. The momentary velocity is calculated by comparing the locations of both the user-identified target and the pointing point of the imaging apparatus in a currently presented image to their locations in a previously presented image (one period/cycle earlier). An average velocity may then also be calculated by averaging several momentary velocities over a predefined time, such as the total delay (uplink and downlink added together).
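The following is a minimal sketch of the two estimates just described, assuming commands and image locations are expressed as (x, y) pairs in the same angular units; the function names and data layout are editorial assumptions.

```python
# Sketch only: pointing-point position from summed commands, and target velocity
# from two consecutive presented images.

def pointing_point_position(transmitted_commands):
    """Pointing direction obtained by summing all commands already transmitted."""
    return (sum(cx for cx, _ in transmitted_commands),
            sum(cy for _, cy in transmitted_commands))

def momentary_velocity(target_curr, point_curr, target_prev, point_prev):
    """Per-cycle velocity of the user-identified target, from its location
    relative to the pointing point in two consecutive presented images."""
    return ((target_curr[0] - point_curr[0]) - (target_prev[0] - point_prev[0]),
            (target_curr[1] - point_curr[1]) - (target_prev[1] - point_prev[1]))

def average_velocity(momentary_velocities):
    """Average of the momentary velocities collected over the total delay."""
    n = len(momentary_velocities)
    return (sum(v[0] for v in momentary_velocities) / n,
            sum(v[1] for v in momentary_velocities) / n)
```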
- the location of the user-identified target is determined by enabling the user to track it independently.
- the user determines at any given time and for each transmitted image, the location of the user-identified target.
- the tracking is enabled, by providing a graphical user interface as explained below.
- FIG. 4 is a schematic diagram of a vision display according to some embodiments of the invention.
- Vision display 400 comprises a dynamically changing image, on a cycle-by-cycle basis (period-by-period).
- Vision display 400 may be a video sequence exhibiting the optical image taken by the imaging apparatus or any other imaging technology, including radar, infrared (IR) and the like.
- Vision display 400 presents the images taken by the imaging apparatus which may contain a target 420 identifiable by the user.
- Vision display 400 also presents a pointing point which represents the pointing point of the imaging apparatus.
- a command cursor 430 is also presented to the user over vision display 400.
- the user is enabled to move command cursor 430 towards user-identified target 420.
- the user determines the location of user-identified target 420 in any given image.
- the location of user-identified target 420 in a currently presented image may be used for estimating its future location at a time corresponding to the current time plus the uplink delay.
- embodiments of the present invention enable the determination of the location of user-identified target 420 by assuming that the user will successfully track user-identified target 420 using command cursor 430 after a predefined time.
- the location of the user identified target may be determined automatically using machine vision techniques or by an external tracker.
- the user may be enabled to provide only an initial indication of the target upon identifying it, leaving the actual tracking to the aforementioned automatic tracking means.
- FIG. 5 is a timing diagram showing an aspect of the method according to some embodiments of the invention.
- timing diagram 500 shows a time- scale exhibiting periods or cycles of operation 1-14.
- a new image from the imaging apparatus is presented to the user and further, a command control from the user may be transmitted to the imaging apparatus.
- as in timing diagram 300, the uplink delay is denoted 320 and the downlink delay is denoted 340.
- receiving the command by the imaging apparatus and transmitting an image by the imaging apparatus occur at the same time.
- Second, a summation of all previous commands that had been already transmitted but have not yet affected the currently presented image needs to be taken into account.
- calculating a command control required for directing the pointing point of the imaging apparatus is followed by transmitting the calculated command to the imaging apparatus.
- each image comprises an array of pixels, and distances are calculated by calculating the difference in the locations of the corresponding pixels.
- the differences are calculated in angular terms.
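The patent does not specify how pixel differences are converted to angular terms; one common small-angle approximation, shown here purely as an assumed example, scales the pixel offset by the field of view per pixel.

```python
# Assumed conversion only: requires a known field of view and accepts the
# small-angle approximation.

def pixels_to_degrees(dx_pixels, dy_pixels, image_width, image_height,
                      fov_horizontal_deg, fov_vertical_deg):
    """Convert a pixel offset on the image to an approximate angular offset."""
    return (dx_pixels * fov_horizontal_deg / image_width,
            dy_pixels * fov_vertical_deg / image_height)

# Example: a 64-pixel offset on a 1280-pixel-wide image with a 20 degree FOV.
print(pixels_to_degrees(64, 0, 1280, 720, 20.0, 11.25))  # -> (1.0, 0.0)
```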
- the pointing point of the imaging apparatus is located in the center of the image of the visual display.
- enabling the user to track a user-identified target on a currently presented image of the periodically transmitted images is achieved and implemented by presenting a command cursor over the visual display, wherein the user is enabled to move the command cursor towards the user-identified target, thereby tracking it.
- the command cursor is located on the pointing point of the imaging apparatus.
- the proposed algorithm makes use of the aforementioned user interface of a command indicator that may be moved by the user at any given time.
- the algorithm starts with calculating the distance between the location of the command cursor and the pointing point at the current time t. It then measures the same distance in the previous cycle (period) t-1 and calculates the difference between the current and previous distances. Then the momentary velocity (per cycle) of the target is estimated in accordance with the following formula:
- $Velocity_{j,t} = Command_{j,t-N} + Difference_{j,t}$  (1)
- Velocity is a vector denoting the velocity of the user-identified target at time t for each axis j (X and Y); Command denotes all the commands in each j axis that were transmitted at time t-N; wherein N is the total delay (uplink and downlink summed up); and wherein Difference denotes the difference between the distance between the locations of the command cursor and the pointing point at time t and the respective distance at time t-1.
- the summation in formula (2) is over the number of cycles used for estimating the average velocity which is as noted, set to the number of cycles in the total delay.
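Formula (2) itself is not reproduced above, but it is described as an average of the momentary velocities over the total delay. A single-axis sketch of formulas (1) and (2), with argument names and buffer layout assumed for illustration, might look like this:

```python
# One axis at a time; names and buffer layout are editorial assumptions.

def momentary_velocity_axis(dist_t, dist_t_minus_1, command_t_minus_n):
    """Formula (1): Velocity_{j,t} = Command_{j,t-N} + Difference_{j,t}, where
    Difference is the change of the cursor-to-pointing-point distance."""
    return command_t_minus_n + (dist_t - dist_t_minus_1)

def estimated_average_velocity(recent_velocities, total_delay_cycles):
    """Formula (2) as described in the text: momentary velocities averaged over
    a number of cycles equal to the total (uplink + downlink) delay."""
    window = recent_velocities[-total_delay_cycles:]
    return sum(window) / len(window)
```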
- $ForecastDist_{j,t+uplink-1} = Dist_{j,t} + EstVelocity_{j,t} \cdot (N - 1)$
- ForecastDist is an estimated distance between the user identified target and the pointing point of the imaging apparatus at the time the current command reaches the imaging apparatus
- Dist is the current distance between the user-identified target and the pointing point
- EstVelocity is a vector denoting the estimated average velocity of the user-identified target at time t in each axis j.
- NotYetAffected denotes a summation of all commands that had been already transmitted and have not yet been affected in the currently presented image.
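A one-line, single-axis sketch of the forecast-distance formula as reconstructed above (names assumed):

```python
def forecast_distance(dist_t, est_velocity, total_delay_cycles):
    """Estimated cursor-to-pointing-point distance one cycle before the command
    currently being transmitted reaches the imaging apparatus."""
    return dist_t + est_velocity * (total_delay_cycles - 1)
```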
- the estimated distance between the estimated location of the pointing point of the imaging apparatus and the estimated location of the user-identified target, at time t+uplink-1, which represents one cycle before the time at which presently transmitted commands reach the imaging apparatus, is calculated in accordance with the following formula:
- $ForecastTotDist_{j,t+uplink-1} = ForecastDist_{j,t+uplink-1} - NotYetAffected_{j}$  (5)
- ForecastTotDist is an estimated distance between the estimated location of the pointing point of the imaging apparatus and the estimated location of the user-identified target
- ForecastDist is an estimated distance between the user identified target and the pointing point of the imaging apparatus one cycle before the time the current command reaches the imaging apparatus
- NotYetAffected denotes a summation of all commands that had been already transmitted by the user and have not yet been affected in the currently presented image.
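Correspondingly, a minimal sketch of formula (5), assuming the in-flight commands are available as a list in the same angular units:

```python
def forecast_total_distance(forecast_dist, commands_not_yet_affected):
    """Subtract the sum of commands transmitted but not yet visible in the
    currently presented image."""
    return forecast_dist - sum(commands_not_yet_affected)
```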
- $Command_{j,t} = ForecastTotDist_{j,t+uplink-1} / CyclesToOvertake + EstVelocity_{j,t}$  (6)
- ForecastTotDist is the estimated distance between the estimated location of the pointing point of the imaging apparatus and the estimated location of the user-identified target
- EstVelocity is a vector denoting the estimated average velocity of the user-identified target at time t in each axis j
- CyclesToOvertake is the number of cycles that is set for closure of the distance between the estimated location of the pointing point of the imaging apparatus and the estimated location of the user-identified target.
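And a single-axis sketch of formula (6) as reconstructed above, with names assumed for illustration:

```python
def control_command(forecast_total_dist, est_velocity, cycles_to_overtake):
    """Command for time t: close the remaining forecast distance over the chosen
    number of cycles while matching the target's estimated velocity."""
    return forecast_total_dist / cycles_to_overtake + est_velocity
```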
- FIG. 6 and FIG. 7 show a high level flowchart illustrating an implementation of the aforementioned algorithm according to some embodiments of the invention.
- the flowchart shows a computer implemented method of controlling an imaging apparatus over a delayed communication link, by periodically transmitting a control command to the imaging apparatus, the method comprises: presenting a user with a visual display operatively associated with images periodically obtained by the imaging apparatus, the visual display comprising a sequence of images, each image associated with a particular cycle, wherein each image contains a pointing point of the imaging apparatus, and a command cursor 600; enabling the user, in each particular cycle, to direct the command cursor towards a user-identified target contained within a particular image, thereby tracking the user-identified target 610; calculating, in each particular cycle, a first distance exhibiting a distance between the command cursor and the indicator of the pointing point of the imaging apparatus 620; calculating, in each particular cycle, a difference between the first distance at the particular cycle and the first distance in a previous cycle 630; estimating
- calculating, in each particular cycle, a control command required for directing the pointing point of the imaging apparatus is followed by transmitting the calculated command to the imaging apparatus.
- the command cursor is initially located on the pointing point of the imaging apparatus.
- the pointing point of the imaging apparatus is located in the center of each image.
- the velocity and distances are calculated in angular terms.
- the averaged estimated velocity is averaged over the total delay.
- the predefined time set for overtaking the user-identified target is set to the total delay.
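To show how the pieces fit together, the following self-contained, single-axis sketch composes formulas (1) through (6) under the parameter choices listed above (velocity averaged over the total delay, CyclesToOvertake equal to the total delay). The history buffer layout and the window used for the not-yet-affected commands are editorial assumptions about the timing convention, not the patent's implementation.

```python
def per_cycle_command(dist_history, command_history, total_delay_cycles):
    """One axis, one cycle. dist_history holds the cursor-to-pointing-point
    distance for each presented image (most recent last, at least N+1 entries);
    command_history holds previously transmitted commands (most recent last,
    at least 2N-1 entries). Returns the command to transmit this cycle."""
    n = total_delay_cycles

    # Formulas (1)-(2): momentary velocities over the last N cycles, averaged.
    velocities = []
    for i in range(n):
        difference = dist_history[-(i + 1)] - dist_history[-(i + 2)]
        velocities.append(command_history[-(i + n)] + difference)
    est_velocity = sum(velocities) / n

    # Forecast distance one cycle before the current command arrives.
    forecast_dist = dist_history[-1] + est_velocity * (n - 1)

    # Formula (5): commands transmitted but not yet visible in the presented
    # image (the exact window is an assumption about the timing convention).
    not_yet_affected = sum(command_history[-(n - 1):]) if n > 1 else 0.0

    # Formula (6), with CyclesToOvertake set to the total delay as stated above.
    return (forecast_dist - not_yet_affected) / n + est_velocity


# Example: total delay of 3 cycles, target drifting one unit per cycle,
# no corrective commands transmitted yet.
dists = [0.0, 1.0, 2.0, 3.0]
commands = [0.0, 0.0, 0.0, 0.0, 0.0]
print(per_cycle_command(dists, commands, 3))  # -> about 2.67
```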
- the present invention is aimed at the unmanned aerial vehicle (UAV/RPA) market.
- the necessary modifications may be performed in order to support any kind of remote controlling of a device that is equipped with an imaging apparatus, over a delayed communication link, be it manned or unmanned.
- Such devices may comprise, but are not limited to: remote controlled weaponry, aerospace related devices, submarines, surface vehicles and the like.
- the disclosed method may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations thereof.
- Suitable processors may be used to implement the aforementioned method.
- a processor will receive instructions and data from a read-only memory or a random access memory or both.
- the essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data.
- a computer will also include, or be operatively coupled to communicate with, one or more mass storage devices for storing data files.
- Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices.
- Methods of the present invention may be implemented by performing or completing manually, automatically, or a combination thereof, selected steps or tasks.
- method may refer to manners, means, techniques and procedures for accomplishing a given task including, but not limited to, those manners, means, techniques and procedures either known to, or readily developed from known manners, means, techniques and procedures by practitioners of the art to which the invention belongs.
- the present invention may be implemented in the testing or practice with methods and materials equivalent or similar to those described herein.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Studio Devices (AREA)
- User Interface Of Digital Computer (AREA)
- Communication Control (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
Claims
Priority Applications (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DK10710684.1T DK2286397T3 (en) | 2009-02-05 | 2010-02-03 | Controlling an imaging device over a delayed communication connection |
AT10710684T ATE531021T1 (en) | 2009-02-05 | 2010-02-03 | CONTROL OF AN IMAGING DEVICE VIA DELAYED COMMUNICATION CONNECTIONS |
ES10710684T ES2376298T3 (en) | 2009-02-05 | 2010-02-03 | CONTROL OF AN APPLICATION FOR OBTAINING IMAGES ON A DELAYED COMMUNICATION LINK. |
SG2011023983A SG172753A1 (en) | 2009-02-05 | 2010-02-03 | Controlling an imaging apparatus over a delayed communication link |
EP10710684A EP2286397B1 (en) | 2009-02-05 | 2010-02-03 | Controlling an imaging apparatus over a delayed communication link |
PL10710684T PL2286397T3 (en) | 2009-02-05 | 2010-02-03 | Controlling an imaging apparatus over a delayed communication link |
KR1020117008183A KR101790059B1 (en) | 2009-02-05 | 2010-02-03 | Controlling an imaging apparatus over a delayed communication link |
US12/937,433 US8144194B2 (en) | 2009-02-05 | 2010-02-03 | Controlling an imaging apparatus over a delayed communication link |
AU2010212020A AU2010212020B2 (en) | 2009-02-05 | 2010-02-03 | Controlling an imaging apparatus over a delayed communication link |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL196923A IL196923A (en) | 2009-02-05 | 2009-02-05 | Controlling an imaging apparatus over a delayed communication link |
IL196923 | 2009-02-05 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2010089738A2 true WO2010089738A2 (en) | 2010-08-12 |
WO2010089738A3 WO2010089738A3 (en) | 2010-09-30 |
Family
ID=42113528
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2010/000095 WO2010089738A2 (en) | 2009-02-05 | 2010-02-03 | Controlling an imaging apparatus over a delayed communication link |
Country Status (12)
Country | Link |
---|---|
US (1) | US8144194B2 (en) |
EP (1) | EP2286397B1 (en) |
KR (1) | KR101790059B1 (en) |
AT (1) | ATE531021T1 (en) |
AU (1) | AU2010212020B2 (en) |
DK (1) | DK2286397T3 (en) |
ES (1) | ES2376298T3 (en) |
IL (1) | IL196923A (en) |
PL (1) | PL2286397T3 (en) |
PT (1) | PT2286397E (en) |
SG (1) | SG172753A1 (en) |
WO (1) | WO2010089738A2 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9164506B1 (en) | 2014-07-30 | 2015-10-20 | SZ DJI Technology Co., Ltd | Systems and methods for target tracking |
EP3190789A1 (en) * | 2015-12-31 | 2017-07-12 | Wellen Sham | Facilitating location positioning service through a uav network |
US9800321B2 (en) | 2015-12-31 | 2017-10-24 | Wellen Sham | Facilitating communication with a vehicle via a UAV |
US9826256B2 (en) | 2015-12-31 | 2017-11-21 | Wellen Sham | Facilitating multimedia information delivery through a UAV network |
US9955115B2 (en) | 2015-12-31 | 2018-04-24 | Wellen Sham | Facilitating wide view video conferencing through a drone network |
US10454576B2 (en) | 2015-12-31 | 2019-10-22 | Wellen Sham | UAV network |
Families Citing this family (59)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB0718706D0 (en) | 2007-09-25 | 2007-11-07 | Creative Physics Ltd | Method and apparatus for reducing laser speckle |
US9335604B2 (en) | 2013-12-11 | 2016-05-10 | Milan Momcilo Popovich | Holographic waveguide display |
US11726332B2 (en) | 2009-04-27 | 2023-08-15 | Digilens Inc. | Diffractive projection apparatus |
US10795160B1 (en) | 2014-09-25 | 2020-10-06 | Rockwell Collins, Inc. | Systems for and methods of using fold gratings for dual axis expansion |
US8233204B1 (en) | 2009-09-30 | 2012-07-31 | Rockwell Collins, Inc. | Optical displays |
US11320571B2 (en) | 2012-11-16 | 2022-05-03 | Rockwell Collins, Inc. | Transparent waveguide display providing upper and lower fields of view with uniform light extraction |
US11300795B1 (en) | 2009-09-30 | 2022-04-12 | Digilens Inc. | Systems for and methods of using fold gratings coordinated with output couplers for dual axis expansion |
US8659826B1 (en) | 2010-02-04 | 2014-02-25 | Rockwell Collins, Inc. | Worn display system and method without requiring real time tracking for boresight precision |
US9274349B2 (en) | 2011-04-07 | 2016-03-01 | Digilens Inc. | Laser despeckler based on angular diversity |
WO2013027004A1 (en) | 2011-08-24 | 2013-02-28 | Milan Momcilo Popovich | Wearable data display |
WO2016020630A2 (en) | 2014-08-08 | 2016-02-11 | Milan Momcilo Popovich | Waveguide laser illuminator incorporating a despeckler |
US10670876B2 (en) | 2011-08-24 | 2020-06-02 | Digilens Inc. | Waveguide laser illuminator incorporating a despeckler |
US8634139B1 (en) | 2011-09-30 | 2014-01-21 | Rockwell Collins, Inc. | System for and method of catadioptric collimation in a compact head up display (HUD) |
US9715067B1 (en) | 2011-09-30 | 2017-07-25 | Rockwell Collins, Inc. | Ultra-compact HUD utilizing waveguide pupil expander with surface relief gratings in high refractive index materials |
US9366864B1 (en) | 2011-09-30 | 2016-06-14 | Rockwell Collins, Inc. | System for and method of displaying information without need for a combiner alignment detector |
US9599813B1 (en) | 2011-09-30 | 2017-03-21 | Rockwell Collins, Inc. | Waveguide combiner system and method with less susceptibility to glare |
US20150010265A1 (en) | 2012-01-06 | 2015-01-08 | Milan, Momcilo POPOVICH | Contact image sensor using switchable bragg gratings |
US9523852B1 (en) | 2012-03-28 | 2016-12-20 | Rockwell Collins, Inc. | Micro collimator system and method for a head up display (HUD) |
CN103562802B (en) | 2012-04-25 | 2016-08-17 | 罗克韦尔柯林斯公司 | Holographic wide angle display |
IL219639A (en) | 2012-05-08 | 2016-04-21 | Israel Aerospace Ind Ltd | Remote tracking of objects |
US9933684B2 (en) * | 2012-11-16 | 2018-04-03 | Rockwell Collins, Inc. | Transparent waveguide display providing upper and lower fields of view having a specific light output aperture configuration |
US10212396B2 (en) | 2013-01-15 | 2019-02-19 | Israel Aerospace Industries Ltd | Remote tracking of objects |
IL224273B (en) * | 2013-01-17 | 2018-05-31 | Cohen Yossi | Delay compensation while controlling a remote sensor |
US9674413B1 (en) | 2013-04-17 | 2017-06-06 | Rockwell Collins, Inc. | Vision system and method having improved performance and solar mitigation |
US9727772B2 (en) | 2013-07-31 | 2017-08-08 | Digilens, Inc. | Method and apparatus for contact image sensing |
US9244281B1 (en) | 2013-09-26 | 2016-01-26 | Rockwell Collins, Inc. | Display system and method using a detached combiner |
EP2879012A1 (en) * | 2013-11-29 | 2015-06-03 | The Boeing Company | System and method for commanding a payload of an aircraft |
US10732407B1 (en) | 2014-01-10 | 2020-08-04 | Rockwell Collins, Inc. | Near eye head up display system and method with fixed combiner |
US9519089B1 (en) | 2014-01-30 | 2016-12-13 | Rockwell Collins, Inc. | High performance volume phase gratings |
US9244280B1 (en) | 2014-03-25 | 2016-01-26 | Rockwell Collins, Inc. | Near eye display system and method for display enhancement or redundancy |
WO2016020632A1 (en) | 2014-08-08 | 2016-02-11 | Milan Momcilo Popovich | Method for holographic mastering and replication |
US10241330B2 (en) | 2014-09-19 | 2019-03-26 | Digilens, Inc. | Method and apparatus for generating input images for holographic waveguide displays |
US9715110B1 (en) | 2014-09-25 | 2017-07-25 | Rockwell Collins, Inc. | Automotive head up display (HUD) |
US10088675B1 (en) | 2015-05-18 | 2018-10-02 | Rockwell Collins, Inc. | Turning light pipe for a pupil expansion system and method |
CN107873086B (en) | 2015-01-12 | 2020-03-20 | 迪吉伦斯公司 | Environmentally isolated waveguide display |
US9632226B2 (en) | 2015-02-12 | 2017-04-25 | Digilens Inc. | Waveguide grating device |
US11366316B2 (en) | 2015-05-18 | 2022-06-21 | Rockwell Collins, Inc. | Head up display (HUD) using a light pipe |
US10126552B2 (en) | 2015-05-18 | 2018-11-13 | Rockwell Collins, Inc. | Micro collimator system and method for a head up display (HUD) |
US10247943B1 (en) | 2015-05-18 | 2019-04-02 | Rockwell Collins, Inc. | Head up display (HUD) using a light pipe |
US10108010B2 (en) | 2015-06-29 | 2018-10-23 | Rockwell Collins, Inc. | System for and method of integrating head up displays and head down displays |
US10690916B2 (en) | 2015-10-05 | 2020-06-23 | Digilens Inc. | Apparatus for providing waveguide displays with two-dimensional pupil expansion |
JP6515787B2 (en) * | 2015-11-02 | 2019-05-22 | 富士通株式会社 | Virtual desktop program, virtual desktop processing method, and virtual desktop system |
US10598932B1 (en) | 2016-01-06 | 2020-03-24 | Rockwell Collins, Inc. | Head up display for integrating views of conformally mapped symbols and a fixed image source |
EP3433659B1 (en) | 2016-03-24 | 2024-10-23 | DigiLens, Inc. | Method and apparatus for providing a polarization selective holographic waveguide device |
JP6734933B2 (en) | 2016-04-11 | 2020-08-05 | ディジレンズ インコーポレイテッド | Holographic Waveguide Device for Structured Light Projection |
US11513350B2 (en) | 2016-12-02 | 2022-11-29 | Digilens Inc. | Waveguide device with uniform output illumination |
US10545346B2 (en) | 2017-01-05 | 2020-01-28 | Digilens Inc. | Wearable heads up displays |
US10295824B2 (en) | 2017-01-26 | 2019-05-21 | Rockwell Collins, Inc. | Head up display with an angled light pipe |
CN116149058A (en) | 2017-10-16 | 2023-05-23 | 迪吉伦斯公司 | System and method for multiplying image resolution of pixellated display |
KR20200104402A (en) | 2018-01-08 | 2020-09-03 | 디지렌즈 인코포레이티드. | Systems and methods for manufacturing waveguide cells |
KR20200108030A (en) | 2018-01-08 | 2020-09-16 | 디지렌즈 인코포레이티드. | System and method for high throughput recording of holographic gratings in waveguide cells |
WO2019136476A1 (en) | 2018-01-08 | 2019-07-11 | Digilens, Inc. | Waveguide architectures and related methods of manufacturing |
US11402801B2 (en) | 2018-07-25 | 2022-08-02 | Digilens Inc. | Systems and methods for fabricating a multilayer optical structure |
KR20210138609A (en) | 2019-02-15 | 2021-11-19 | 디지렌즈 인코포레이티드. | Method and apparatus for providing a holographic waveguide display using an integral grating |
JP2022525165A (en) | 2019-03-12 | 2022-05-11 | ディジレンズ インコーポレイテッド | Holographic Waveguide Backlights and Related Manufacturing Methods |
CN114207492A (en) | 2019-06-07 | 2022-03-18 | 迪吉伦斯公司 | Waveguide with transmission grating and reflection grating and method for producing the same |
KR20220038452A (en) | 2019-07-29 | 2022-03-28 | 디지렌즈 인코포레이티드. | Method and apparatus for multiplying the image resolution and field of view of a pixelated display |
KR20220054386A (en) | 2019-08-29 | 2022-05-02 | 디지렌즈 인코포레이티드. | Vacuum Bragg grating and manufacturing method thereof |
WO2022244329A1 (en) * | 2021-05-20 | 2022-11-24 | ソニーグループ株式会社 | Information processing device, information processing method, and program |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2557971B1 (en) * | 1984-01-06 | 1988-05-27 | Thomson Csf | PILOTLESS AIRCRAFT MONITORING SYSTEM FOR OBJECTIVE LOCATION |
AU2317200A (en) * | 1999-02-03 | 2000-08-25 | Elbit Systems Ltd | Delayed video tracking |
US6245137B1 (en) | 1999-04-30 | 2001-06-12 | Hewlett-Packard Company | Surfactants for improved ink-jet performance |
US6532191B2 (en) * | 2000-08-08 | 2003-03-11 | Lockheed Martin Corporation | System and method for target tracking and motion analysis |
US20040006424A1 (en) * | 2002-06-28 | 2004-01-08 | Joyce Glenn J. | Control system for tracking and targeting multiple autonomous objects |
US20060058954A1 (en) * | 2003-10-08 | 2006-03-16 | Haney Philip J | Constrained tracking of ground objects using regional measurements |
WO2008018156A1 (en) * | 2006-08-10 | 2008-02-14 | Sanritz Automation Co., Ltd. | Fluctuation corrected image display method and mobile object remote control system using the method |
US7782247B1 (en) * | 2008-07-25 | 2010-08-24 | Rockwell Collins, Inc. | System and method for target location |
-
2009
- 2009-02-05 IL IL196923A patent/IL196923A/en active IP Right Grant
-
2010
- 2010-02-03 ES ES10710684T patent/ES2376298T3/en active Active
- 2010-02-03 AU AU2010212020A patent/AU2010212020B2/en active Active
- 2010-02-03 US US12/937,433 patent/US8144194B2/en active Active
- 2010-02-03 DK DK10710684.1T patent/DK2286397T3/en active
- 2010-02-03 AT AT10710684T patent/ATE531021T1/en not_active IP Right Cessation
- 2010-02-03 EP EP10710684A patent/EP2286397B1/en active Active
- 2010-02-03 KR KR1020117008183A patent/KR101790059B1/en active IP Right Grant
- 2010-02-03 SG SG2011023983A patent/SG172753A1/en unknown
- 2010-02-03 PL PL10710684T patent/PL2286397T3/en unknown
- 2010-02-03 WO PCT/IL2010/000095 patent/WO2010089738A2/en active Application Filing
- 2010-02-03 PT PT10710684T patent/PT2286397E/en unknown
Non-Patent Citations (1)
Title |
---|
None |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11106201B2 (en) | 2014-07-30 | 2021-08-31 | SZ DJI Technology Co., Ltd. | Systems and methods for target tracking |
JP2017503226A (en) * | 2014-07-30 | 2017-01-26 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | Goal tracking system, device and method |
CN107703963B (en) * | 2014-07-30 | 2020-12-01 | 深圳市大疆创新科技有限公司 | Target tracking system and method |
EP3060966A4 (en) * | 2014-07-30 | 2017-01-11 | SZ DJI Technology Co., Ltd. | Systems and methods for target tracking |
CN107291104A (en) * | 2014-07-30 | 2017-10-24 | 深圳市大疆创新科技有限公司 | Target tracking system and method |
US9567078B2 (en) | 2014-07-30 | 2017-02-14 | SZ DJI Technology Co., Ltd | Systems and methods for target tracking |
US11194323B2 (en) | 2014-07-30 | 2021-12-07 | SZ DJI Technology Co., Ltd. | Systems and methods for target tracking |
EP3862837A1 (en) * | 2014-07-30 | 2021-08-11 | SZ DJI Technology Co., Ltd. | Systems and methods for target tracking |
CN105518555A (en) * | 2014-07-30 | 2016-04-20 | 深圳市大疆创新科技有限公司 | Systems and methods for target tracking |
WO2016015251A1 (en) * | 2014-07-30 | 2016-02-04 | SZ DJI Technology Co., Ltd. | Systems and methods for target tracking |
CN107015572A (en) * | 2014-07-30 | 2017-08-04 | 深圳市大疆创新科技有限公司 | Target tracking system and method |
CN105518555B (en) * | 2014-07-30 | 2017-11-03 | 深圳市大疆创新科技有限公司 | Target tracking system and method |
US9164506B1 (en) | 2014-07-30 | 2015-10-20 | SZ DJI Technology Co., Ltd | Systems and methods for target tracking |
US9846429B2 (en) | 2014-07-30 | 2017-12-19 | SZ DJI Technology Co., Ltd. | Systems and methods for target tracking |
CN107577247A (en) * | 2014-07-30 | 2018-01-12 | 深圳市大疆创新科技有限公司 | Target tracking system and method |
CN107703963A (en) * | 2014-07-30 | 2018-02-16 | 深圳市大疆创新科技有限公司 | Target tracking system and method |
CN107577247B (en) * | 2014-07-30 | 2021-06-25 | 深圳市大疆创新科技有限公司 | Target tracking system and method |
US9826256B2 (en) | 2015-12-31 | 2017-11-21 | Wellen Sham | Facilitating multimedia information delivery through a UAV network |
US10354521B2 (en) | 2015-12-31 | 2019-07-16 | Wellen Sham | Facilitating location positioning service through a UAV network |
US10440323B2 (en) | 2015-12-31 | 2019-10-08 | Wellen Sham | Facilitating wide view video conferencing through a drone network |
US10454564B2 (en) | 2015-12-31 | 2019-10-22 | Wellen Sham | Facilitating communication with a vehicle via a UAV |
US10454576B2 (en) | 2015-12-31 | 2019-10-22 | Wellen Sham | UAV network |
US10097862B2 (en) | 2015-12-31 | 2018-10-09 | Wellen Sham | Facilitating multimedia information delivery through a UAV network |
US9955115B2 (en) | 2015-12-31 | 2018-04-24 | Wellen Sham | Facilitating wide view video conferencing through a drone network |
US9800321B2 (en) | 2015-12-31 | 2017-10-24 | Wellen Sham | Facilitating communication with a vehicle via a UAV |
US9786165B2 (en) | 2015-12-31 | 2017-10-10 | Wellen Sham | Facilitating location positioning service through a UAV network |
EP3190789A1 (en) * | 2015-12-31 | 2017-07-12 | Wellen Sham | Facilitating location positioning service through a uav network |
Also Published As
Publication number | Publication date |
---|---|
ES2376298T3 (en) | 2012-03-12 |
EP2286397B1 (en) | 2011-10-26 |
AU2010212020B2 (en) | 2014-10-30 |
EP2286397A2 (en) | 2011-02-23 |
US8144194B2 (en) | 2012-03-27 |
AU2010212020A1 (en) | 2010-08-12 |
KR20110134372A (en) | 2011-12-14 |
PT2286397E (en) | 2011-11-30 |
IL196923A (en) | 2014-01-30 |
PL2286397T3 (en) | 2011-12-30 |
ATE531021T1 (en) | 2011-11-15 |
WO2010089738A3 (en) | 2010-09-30 |
US20110026774A1 (en) | 2011-02-03 |
DK2286397T3 (en) | 2012-01-02 |
KR101790059B1 (en) | 2017-10-26 |
SG172753A1 (en) | 2011-08-29 |
IL196923A0 (en) | 2009-12-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2010212020B2 (en) | Controlling an imaging apparatus over a delayed communication link | |
US11867479B2 (en) | Interactive weapon targeting system displaying remote sensed image of target area | |
US8380362B2 (en) | Systems and methods for remotely collaborative vehicles | |
EP3077879B1 (en) | Imaging method and apparatus | |
US20200191556A1 (en) | Distance mesurement method by an unmanned aerial vehicle (uav) and uav | |
US20180022472A1 (en) | Autonomous system for taking moving images from a drone, with target tracking and improved target location | |
US10037041B2 (en) | System and apparatus for integrating mobile sensor platforms into autonomous vehicle operational control | |
IL258551A (en) | Target tracking method performed by a drone, related computer program, electronic system and drone | |
Farmani et al. | Tracking multiple mobile targets using cooperative unmanned aerial vehicles | |
CN107450586B (en) | Method and system for adjusting air route and unmanned aerial vehicle system | |
US20200180784A1 (en) | Method and system for detecting, positioning and capturing an intruder in-flight using a laser detection and ranging device | |
US10551474B2 (en) | Delay compensation while controlling a remote sensor | |
KR102125490B1 (en) | Flight control system and unmanned vehicle controlling method | |
EP3077760A1 (en) | Payload delivery | |
WO2018059398A1 (en) | Method, apparatus, and system for controlling multi-rotor aircraft | |
WO2023100187A2 (en) | Systems and methods for managing unmanned vehicle interactions with various payloads | |
RU2652329C1 (en) | Combat support multi-functional robotic-technical complex control system | |
US20220349677A1 (en) | Device for locating, sharing, and engaging targets with firearms | |
CN206323500U (en) | Electronic equipment tracks of device and system | |
EP2881697A1 (en) | Capturing and processing images | |
Yang et al. | Design, implementation, and verification of a low‐cost terminal guidance system for small fixed‐wing UAVs | |
KR102149494B1 (en) | Structure inspection system and method using dron | |
US12125230B2 (en) | Information processing device and information processing method for estimating locations or postures of devices based on acquired environment information | |
US20230343229A1 (en) | Systems and methods for managing unmanned vehicle interactions with various payloads |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10710684 Country of ref document: EP Kind code of ref document: A2 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12937433 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010710684 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010212020 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1460/KOLNP/2011 Country of ref document: IN |
|
ENP | Entry into the national phase |
Ref document number: 20117008183 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2010212020 Country of ref document: AU Date of ref document: 20100203 Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |