EP2778819A1 - Method for shooting a cinematographic performance using an unmanned aerial vehicle
Method for shooting a cinematographic performance using an unmanned aerial vehicle
- Publication number
- EP2778819A1 (application EP13305269.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- unmanned aerial
- aerial vehicle
- motion trajectory
- virtual
- scene
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/006—Apparatus mounted on flying objects
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
- G03B15/06—Special arrangements of screening, diffusing, or reflecting devices, e.g. in studio
- G03B15/07—Arrangements of lamps in studios
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/28—Mobile studios
Definitions
- This invention generally relates to a method for shooting a performance in which at least one actor interacts with a virtual element moving along a determined motion trajectory.
- The invention relies on a specific unmanned aerial vehicle, an apparatus and a film shooting studio.
- Computer-Generated Imagery is increasingly present in film and TV production. Dedicated techniques are needed to ensure seamless compositing and interaction between the virtual and real elements of a scene.
- The performance of real actors is composited with a virtual background.
- This is, for instance, the situation in a virtual TV studio, where the news presenter is filmed against a green background, and the furniture and background of the studio are inserted later as virtual elements.
- Chroma keying is used to matte out the silhouette of the journalist for compositing with the virtual elements in the scene.
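As an illustration of the keying step, here is a minimal Python sketch that derives a binary foreground matte from a green-dominance rule; the rule and the threshold value are assumptions for illustration, not the keyer actually used in a production studio.

```python
import numpy as np

def chroma_key_matte(rgb: np.ndarray, threshold: float = 40.0) -> np.ndarray:
    """Return a binary foreground matte for an RGB image (H x W x 3, uint8).

    A pixel is treated as green-screen background when its green channel
    dominates red and blue by more than `threshold`; everything else is
    kept as foreground (value 1).
    """
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    green_dominance = g - np.maximum(r, b)
    return (green_dominance < threshold).astype(np.uint8)
```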
- A TV or film shooting studio is usually equipped with an optical motion capture system which consists of a camera setup and an acquisition system.
- The camera setup consists of a set of calibrated cameras placed around a capture volume.
- The actors wear dedicated suits on which physical markers are placed at the location of the main body articulations.
- The actors play the role of the film characters or virtual creatures inside the capture volume, as defined by the scenario.
- The optical motion capture system tracks the locations of the physical markers in the images captured by the cameras. This data is fed into animation and rendering software that generates the appearance of the virtual characters or creatures at each frame of the target production.
- A misplacement of the presenter's hand relative to a virtual element could be fixed during the compositing phase by tweaking the viewpoint of the virtual camera.
- This solution, however, would not be applicable to multiple interactions occurring with elements of a rigid virtual layout, since the adjustments would need to be different for each interaction.
- The present invention solves the aforementioned drawbacks by using unmanned aerial vehicles, such as drones for example, to provide the physical markers that are needed to give the real actors indications on the positioning of virtual elements to be inserted later in the scene, and with which they need to interact.
- The invention concerns an unmanned aerial vehicle characterized in that a part of said unmanned aerial vehicle follows a determined motion trajectory of a contact location of a virtual element in a scene, which it materializes.
- Said part of the unmanned aerial vehicle then acts as a physical marker "floating in the air" that allows an interaction to occur between an actor and a virtual element of the scene.
- Multiple unmanned aerial vehicles may be used, and each of them may be controlled with different adjustments to reproduce interactions between real and/or virtual elements even when these elements move along different motion trajectories.
- The invention concerns a method for shooting a performance in which at least one actor interacts with a virtual element moving along a determined motion trajectory.
- The method is characterized in that it makes use of an unmanned aerial vehicle navigation control capability.
- The invention also concerns an apparatus comprising means to specify a 3D position of an unmanned aerial vehicle according to a determined motion trajectory.
- The apparatus is characterized in that said means are configured so that a part of the unmanned aerial vehicle follows the motion trajectory at a predefined speed, said motion trajectory being determined in order to allow interactions to occur between real and/or virtual elements of a scene.
- The invention further concerns a film shooting studio characterized in that it is equipped with at least one unmanned aerial vehicle as previously disclosed and an apparatus as previously disclosed.
- Fig. 1 shows an example of a TV or film shooting studio.
- The invention is not limited to this single example but extends to any indoor or outdoor environment adapted to capturing the optical motion of an object from images of physical markers.
- A TV or film shooting studio is a room equipped with an optical motion capture system which comprises a camera setup and an acquisition system.
- The camera setup comprises cameras, here four referenced C1 to C4, and light sources, here three referenced L1 to L3.
- The TV or film shooting studio is surrounded, at least partially, by walls painted in a uniform green or blue colour, so that actors or props filmed in the studio can easily be segmented out from the background of the studio using chroma keying.
- The studio needs to be large enough to hold the camera setup and to ensure that the volume captured by this setup, called the capture volume, leaves sufficient room for the props and the performance of the actors.
- The cameras are positioned all around the capture volume, which usually lies at the center of the room, in such a way that any point within this volume is seen by a minimum of 3 cameras, and preferably more.
- The cameras must be synchronized, typically from an external genlock signal, and operate at sufficiently high frame rates (to avoid motion blur) and with sufficient resolution to accurately estimate the motion trajectories of the physical markers used for motion capture.
- The cameras are calibrated, both with respect to their intrinsic and extrinsic parameters, so that the location on a camera image of the projection of any 3D point of the motion capture volume in its viewing frustum, referenced in some 3D coordinate system S_MC, can be accurately predicted.
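A minimal sketch of this predicted projection, assuming a standard pinhole model with an intrinsic matrix K and extrinsics (R, t) mapping S_MC to the camera frame; the names and conventions below are illustrative, not those of any specific calibration toolkit.

```python
import numpy as np

def project_point(K: np.ndarray, R: np.ndarray, t: np.ndarray,
                  X_mc: np.ndarray) -> np.ndarray:
    """Project a 3D point X_mc (expressed in S_MC) into a camera image.

    K: 3x3 intrinsic matrix; R (3x3), t (3,): extrinsics such that
    X_cam = R @ X_mc + t. Returns pixel coordinates (u, v).
    """
    X_cam = R @ X_mc + t      # world (S_MC) to camera frame
    uvw = K @ X_cam           # homogeneous image coordinates
    return uvw[:2] / uvw[2]   # perspective division
```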
- Lighting in the TV or film shooting studio relies on a set of fixed light sources, here L1 to L3, that provide ideally diffuse and uniform lighting within the capture volume.
- The time-stamped video signals captured by the camera setup are transferred from each of the cameras and recorded to a storage device, typically hard disk drives, thanks to the acquisition system (not represented in Fig. 1).
- The acquisition system also features a user interface and software for controlling the operation of the cameras and visualizing their outputs.
- The tracking method comprises detecting the locations of the physical markers in the images of the cameras. This is straightforward since the markers, owing to their high reflectivity, appear as bright spots in the images.
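A minimal detection sketch along these lines, assuming an 8-bit grayscale image in which the retro-reflective markers saturate; the threshold value and the use of scipy.ndimage are illustrative choices, not the system's actual implementation.

```python
import numpy as np
from scipy import ndimage

def detect_markers(gray: np.ndarray, threshold: float = 200.0):
    """Detect retro-reflective markers as bright blobs in a grayscale image.

    Returns a list of (row, col) centroids, one per detected blob.
    """
    mask = gray > threshold                  # markers appear as bright spots
    labels, n = ndimage.label(mask)          # connected-component labelling
    return ndimage.center_of_mass(mask, labels, np.arange(1, n + 1))
```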
- Spatial correspondences between the detected marker locations across camera images are then established.
- A 3D point in the 3D coordinate system S_MC having generated a detected location in a camera image lies on a viewing line going through this location in the camera image plane and the camera projection centre.
- Spatial correspondences between detected locations across camera views, corresponding to the projections in the views of the same physical marker, can be determined by the fact that the above-defined viewing lines for each considered camera intersect at the location of the physical marker in 3D space.
- The locations and orientations of the image plane and projection centre for each camera are known from the camera calibration data.
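Since measured viewing lines never intersect exactly, the intersection is typically computed in least-squares form; a sketch, assuming at least two cameras with known projection centres and unit viewing directions in S_MC.

```python
import numpy as np

def intersect_viewing_lines(centers, directions) -> np.ndarray:
    """Least-squares intersection of viewing lines.

    centers: iterable of camera projection centres c_i in S_MC.
    directions: iterable of unit direction vectors d_i of the viewing lines.
    Returns the 3D point minimizing the sum of squared distances to all
    lines (requires at least two non-parallel lines).
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(centers, directions):
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the line
        A += P
        b += P @ np.asarray(c)
    return np.linalg.solve(A, b)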
- The detected marker locations set in correspondence, and thus corresponding to the projections of physical markers, are tracked over time for each camera image.
- Temporal tracking typically relies on non-rigid point set registration techniques, wherein a global mapping is determined between the distributions of marker locations in two consecutive images of the same camera.
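Full non-rigid point set registration is beyond a short example; as a much simplified stand-in, the sketch below associates markers between consecutive frames by minimizing the total displacement with the Hungarian algorithm.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_markers(prev_pts: np.ndarray, curr_pts: np.ndarray):
    """Associate marker detections between two consecutive frames.

    prev_pts, curr_pts: (N, 2) arrays of image locations. Returns index
    pairs (i, j) meaning prev_pts[i] continues as curr_pts[j].
    """
    # Pairwise Euclidean distances as the assignment cost.
    cost = np.linalg.norm(prev_pts[:, None, :] - curr_pts[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)   # minimal total displacement
    return list(zip(rows, cols))
```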
- The marker tracks are then labeled. This can be performed manually, or alternatively the labels can be set automatically. Automatic labeling can benefit from a known initial layout of markers, for instance, in the case of body motion capture, the "T-stance" where the person stands with legs apart and both arms stretched away from the body.
- The captured data is post-processed, especially in order to fill holes caused by marker occlusion.
- A model of the captured object, e.g. an articulated human body model, is fitted to the 3D locations of the physical markers at each frame, thus providing data for animating a virtual character (possibly after retargeting if the anthropometric proportions of the actor and the virtual character are different).
- At least four non-planar physical markers M detectable by the optical motion capture system are located on an unmanned aerial vehicle UAV schematically represented in Fig. 1, where the unmanned aerial vehicle UAV is represented by the four ovals and the markers M by black filled disks.
- The non-coplanar physical markers define a 3D coordinate system S_UAV for the unmanned aerial vehicle UAV, whose relative translation and rotation with respect to the 3D coordinate system S_MC can be computed using straightforward 3D geometry, the locations of the markers in S_MC being determined by the optical motion capture system.
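This relative rotation and translation can be estimated, for instance, with the Kabsch algorithm; a minimal sketch, assuming the marker positions are available in both coordinate systems and at least four of them are non-coplanar.

```python
import numpy as np

def rigid_transform(p_uav: np.ndarray, p_mc: np.ndarray):
    """Estimate R, t such that p_mc ≈ R @ p_uav + t (Kabsch algorithm).

    p_uav: (N, 3) marker positions in S_UAV (N >= 4, non-coplanar);
    p_mc:  (N, 3) the same markers as measured in S_MC.
    """
    cu, cm = p_uav.mean(axis=0), p_mc.mean(axis=0)
    H = (p_uav - cu).T @ (p_mc - cm)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cm - R @ cu
```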
- A part of the unmanned aerial vehicle UAV follows a determined motion trajectory of a contact location of a virtual element in a scene, which it materializes.
- A stick S is rigidly attached to the unmanned aerial vehicle UAV, as represented on Fig. 1, in such a way that its extremity can be accessed without danger of being hurt by the unmanned aerial vehicle's propellers.
- The location of the extremity of the stick S mounted on the unmanned aerial vehicle UAV is fixed and known in the 3D coordinate system S_UAV, and can therefore easily be computed in the 3D coordinate system S_MC.
- The extremity of the stick S is then the part of the unmanned aerial vehicle which follows the determined motion trajectory of the contact location of the virtual element in the scene that it materializes.
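Reusing rigid_transform from the sketch above, the stick extremity can then be expressed in S_MC; all numeric values below are purely illustrative.

```python
import numpy as np

# Illustrative marker layout in S_UAV and the same markers as measured
# in S_MC (here the UAV is simply translated, for the sake of the example).
p_uav = np.array([[0.2, 0.0, 0.0], [-0.2, 0.0, 0.0],
                  [0.0, 0.2, 0.0], [0.0, 0.0, 0.1]])
p_mc = p_uav + np.array([1.0, 2.0, 1.5])

R, t = rigid_transform(p_uav, p_mc)      # from the sketch above

# Fixed, known position of the stick extremity in S_UAV (illustrative).
tip_uav = np.array([0.0, 0.0, -0.35])
tip_mc = R @ tip_uav + t                 # stick extremity expressed in S_MC
```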
- Complex scenes may require several unmanned aerial vehicles UAV, on each of which at least four physical markers are located.
- A minimal separation distance between these unmanned aerial vehicles UAV is maintained at all times, to avoid aerodynamic interference.
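A minimal check of this constraint might look as follows; the safety distance d_min is an assumed value, to be tuned for the actual aircraft.

```python
import numpy as np
from itertools import combinations

def separation_ok(positions, d_min: float = 1.5) -> bool:
    """True if every pair of UAV centres is at least d_min metres apart.

    positions: list of 3D centre-of-mass positions in S_MC.
    """
    return all(np.linalg.norm(np.asarray(a) - np.asarray(b)) >= d_min
               for a, b in combinations(positions, 2))
```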
- The unmanned aerial vehicle is, for instance, a drone.
- A drone is a lightweight unmanned aerial vehicle powered by multiple rotors, typically 4 to 8, running on batteries.
- The drone is equipped with onboard electronics including processing means, an Inertial Measurement Unit and additional position and velocity sensors for navigation, and with means for wireless communication with a remote apparatus.
- The navigation of a drone can be controlled by a so-called navigation control method, usually implemented on a remote station over a dedicated Application Programming Interface (API), which may provide access to low-level controls, such as the speeds of the rotors, and/or to higher-level features such as a target drone attitude, elevation speed or rotation speed around the vertical axis passing through the drone's center of mass.
- The navigation control method can be developed on top of this API in order to control the displacements of the drone in real time.
- The control can be performed manually from a user interface, for instance relying on graphical pads on a mobile device display.
- The navigation of the drone can also be constrained programmatically to follow a determined motion trajectory. This motion trajectory defines a target 3D position of the center of mass of the drone in some reference 3D coordinate system at each time instant after a reference start time.
- The navigation control method can benefit from the positional estimates of the drone provided by an optical motion capture system.
- Such a closed-loop feedback control of a drone using an optical motion capture system is described, for example, in the paper entitled "The GRASP Multiple Micro-UAV Testbed" by N. Michael et al., published in the September 2010 issue of the IEEE Robotics and Automation Magazine.
- The control of the drone relies on two nested feedback loops, as shown on Fig. 2.
- The purpose of the loops is to ensure that the actual attitude and position values of the drone, as computed from the IMU and positional sensor measurements, match the target values determined by a target trajectory.
- The Position Control module takes as input, at each time instant t, the target 3D position r_T(t) of the drone center of mass and its estimated position r(t) in the coordinate system S_MC of the motion capture volume.
- The accurate estimates of r(t) provided by the motion capture system, owing to the non-coplanar retro-reflective markers attached to the drone, can advantageously be fed into the navigation control method in order to improve the stability and accuracy of the motion trajectory following.
- A control loop within the Position Control module generates, as a function of the positional error r_T(t) - r(t), the desired values of the attitude angles φ_des(t), θ_des(t) and ψ_des(t) for the roll, pitch and yaw angles respectively, which stabilize the attitude of the drone and ensure the desired linear displacement that compensates for the positional error.
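A minimal near-hover sketch of such a loop, following the small-angle linearization used in the cited GRASP paper; the PD gains kp and kd are illustrative values, and in practice this computation would run inside the Position Control module at the motion-capture rate.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def position_control(r_t, r, v_t, v, yaw, kp: float = 4.0, kd: float = 2.5):
    """Near-hover PD position controller (simplified sketch).

    From the positional error r_t - r and velocity error v_t - v, compute
    desired accelerations, then the small-angle roll/pitch set points
    phi_des, theta_des that realize them for the current yaw angle.
    Returns (phi_des, theta_des, desired z-acceleration for altitude).
    """
    acc_des = kp * (np.asarray(r_t) - np.asarray(r)) \
            + kd * (np.asarray(v_t) - np.asarray(v))
    ax, ay = acc_des[0], acc_des[1]
    phi_des = (ax * np.sin(yaw) - ay * np.cos(yaw)) / G
    theta_des = (ax * np.cos(yaw) + ay * np.sin(yaw)) / G
    return phi_des, theta_des, acc_des[2]
```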
- The Attitude Control module is a second, inner, control loop that generates the increments Δω_φ, Δω_θ, Δω_ψ of the rotor speeds that produce moments along the roll, pitch and yaw axes respectively, in order to obtain the desired attitude values.
- The Position Control module feeds the Motor Dynamics module with an extra increment Δω_F that results in a net force along the vertical axis at the center of gravity of the drone, allowing the control of its altitude.
- The Motor Dynamics module translates Δω_φ, Δω_θ, Δω_ψ and Δω_F into set point values for the rotor speeds, which are transmitted to the drone via its communication means, so that the rotor speeds are updated over the API.
- The Motor Dynamics module also translates the updates of the rotor speeds into net forces T_i applied to the drone along the vertical axes at the location of each rotor, as well as into angular moments M_i along these same axes.
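For a quadrotor in "plus" configuration, the mapping from these increments to rotor speed set points can be sketched as below; the rotor ordering and sign convention are assumptions (one common choice), and omega_h denotes the nominal hover speed of each rotor.

```python
import numpy as np

def mix_rotor_speeds(omega_h: float, d_f: float,
                     d_phi: float, d_theta: float, d_psi: float) -> np.ndarray:
    """Map control increments to four rotor speed set points.

    omega_h: nominal hover speed; d_f: collective increment (altitude);
    d_phi, d_theta, d_psi: differential increments producing roll, pitch
    and yaw moments. Rotor order: front, right, back, left, with
    alternating spin directions (front/back vs. left/right).
    """
    return np.array([
        omega_h + d_f + d_theta + d_psi,   # front
        omega_h + d_f - d_phi   - d_psi,   # right
        omega_h + d_f - d_theta + d_psi,   # back
        omega_h + d_f + d_phi   - d_psi,   # left
    ])
```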
- A model of the drone dynamics allows the Rigid Body Dynamics module to compute the linear acceleration r̈(t) of the drone and its angular accelerations ṗ(t), q̇(t) and ṙ(t) in its body frame. These accelerations are fed back to the Position Control and Attitude Control modules, respectively, to provide the inputs to the control loops implemented in these two modules.
- The Position Control and Attitude Control loops use measurements, not represented on Fig. 2, from the Inertial Measurement Unit and the positional sensors mounted on the drone, in order to estimate the drone position and attitude at their inputs.
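The wiring of the two nested loops, with the attitude loop running at the higher IMU rate, might be sketched as follows; trajectory, mocap, imu, drone_api and attitude_control are hypothetical stand-ins rather than an actual drone API, and position_control and mix_rotor_speeds come from the sketches above.

```python
import time

OMEGA_H = 400.0   # assumed nominal hover rotor speed (illustrative)

def run_nested_loops(trajectory, mocap, imu, drone_api, duration: float = 10.0,
                     outer_hz: int = 100, inner_hz: int = 1000) -> None:
    """Illustrative wiring of the outer position loop and inner attitude loop."""
    t0 = time.time()
    while (t := time.time() - t0) < duration:
        # Outer loop: position control at the motion-capture rate.
        r_t = trajectory(t)                        # target r_T(t) in S_MC
        r, v = mocap.estimate()                    # estimated position/velocity
        phi_d, theta_d, az_d = position_control(r_t, r, (0, 0, 0), v, imu.yaw())
        # Inner loop: attitude control at the (higher) IMU rate.
        for _ in range(inner_hz // outer_hz):
            d_phi, d_theta, d_psi, d_f = attitude_control(   # hypothetical
                (phi_d, theta_d, az_d), imu.attitude(), imu.rates())
            drone_api.set_rotor_speeds(                      # hypothetical
                mix_rotor_speeds(OMEGA_H, d_f, d_phi, d_theta, d_psi))
```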
- The invention also concerns a method for shooting a performance in which at least one actor interacts with a virtual element moving along a determined motion trajectory. The method comprises two phases, both making use of the navigation control capability of an unmanned aerial vehicle UAV.
- In a first phase, prior to the start of the shooting, a part of the unmanned aerial vehicle UAV, such as the extremity of the stick S, is moved to the initial position of the determined motion trajectory of the contact location of the virtual element in the scene that it materializes.
- In a second phase, triggered by a signal synchronized with the action taking place during the shooting, which may for instance be provided by a member of the on-set staff, the unmanned aerial vehicle UAV is displaced, either manually from a control interface or programmatically, so that the part which materializes the contact location of the virtual element follows said determined motion trajectory.
- A 3D model of the virtual scene is assumed known and registered with the 3D coordinate system S_MC.
- The motion trajectories of all moving virtual elements within the 3D virtual scene model are predefined from the scenario of the performance to be captured. These motion trajectories are represented by a temporal sequence of 3D locations in the 3D coordinate system S_MC, defined with reference to a predefined start time t_ref, typically set to the starting time of the performance to be captured.
- The sampling frequency of this sequence is chosen, for example, so as to be compatible with the rate at which the target 3D position r_T(t) of the drone center of mass can be estimated.
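Between stored samples, the target position r_T(t) can be obtained by interpolation; a minimal sketch, assuming piecewise-linear interpolation of the time-stamped sequence.

```python
import numpy as np

def target_position(times: np.ndarray, points: np.ndarray,
                    t: float, t_ref: float = 0.0) -> np.ndarray:
    """Interpolate the target 3D position r_T(t) from a stored trajectory.

    times: (N,) increasing sample instants relative to t_ref;
    points: (N, 3) corresponding 3D locations in S_MC;
    t: current time on the same clock as t_ref.
    """
    tau = t - t_ref
    return np.array([np.interp(tau, times, points[:, k]) for k in range(3)])
```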
- The location of contact on each of the moving virtual elements of the performance, where for instance an actor should interact with the element by placing a hand, is materialized by a part of an unmanned aerial vehicle UAV such as, according to an embodiment, the extremity of a stick S.
- Since the 3D coordinate system S_UAV is registered with respect to the 3D coordinate system S_MC, the coordinates of this contact location on the unmanned aerial vehicle UAV can be expressed in S_MC via a straightforward change of coordinate system, and therefore matched at any time against the target location of the virtual element, also expressed in S_MC.
- Figure 3 shows an apparatus 300 that can be used in a Film or TV studio to control an unmanned aerial vehicle.
- The apparatus comprises the following components, interconnected by a digital data and address bus 30:
- Processing unit 33 can be implemented as a microprocessor, a custom chip, a dedicated (micro-) controller, and so on.
- Memory 35 can be implemented in any form of volatile and/or non-volatile memory, such as a RAM (Random Access Memory), hard disk drive, non-volatile random-access memory, EPROM (Erasable Programmable ROM), and so on.
- The processing unit 33, the memory 35 and the network interface 34 are configured to control the navigation of an unmanned aerial vehicle such as a drone, i.e. they are configured to specify a target position of the unmanned aerial vehicle at each time instant, corresponding to a determined motion trajectory in the 3D coordinate system S_UAV. It is then possible to control the unmanned aerial vehicle (a drone for example) in such a way that a part of it follows a motion trajectory in the 3D coordinate system S_MC at a predefined speed, said motion trajectory being determined in order to allow interactions to occur between real and/or virtual elements of a scene.
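A minimal sketch of how the apparatus might stream such target positions, reusing target_position from the sketch above; set_target_position is a hypothetical call, not an actual drone API.

```python
import time
import numpy as np

def stream_trajectory(drone_api, times: np.ndarray, points: np.ndarray,
                      rate_hz: float = 50.0) -> None:
    """Stream target positions so that a part of the UAV follows the
    determined motion trajectory (all interfaces are hypothetical).

    times, points: the time-stamped 3D sequence of the trajectory in S_MC.
    """
    t_ref = time.time()   # reference start time, e.g. the trigger signal
    while (tau := time.time() - t_ref) <= times[-1]:
        drone_api.set_target_position(target_position(times, points, tau))
        time.sleep(1.0 / rate_hz)
```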
- This form of control makes it possible to combine the navigation of the unmanned aerial vehicle UAV with other features, for instance related to the remote operation of a camera mounted on the unmanned aerial vehicle UAV.
- The apparatus comprises a Graphical User Interface 32 which is configured to allow a user to specify the target position of the unmanned aerial vehicle UAV at each time instant.
- The unmanned aerial vehicle UAV trajectory control is then operated from the Graphical User Interface 32, which can take the form, for example, of a joystick or a tactile interface, e.g. on a tablet.
- The modules are functional units, which may or may not correspond to distinguishable physical units. For example, these modules, or some of them, may be brought together in a unique component or circuit, or contribute to the functionalities of a piece of software. Conversely, some modules may be composed of separate physical entities.
- Apparatus compatible with the invention may be implemented either in pure hardware, for example using dedicated hardware such as an ASIC ("Application-Specific Integrated Circuit"), an FPGA ("Field-Programmable Gate Array") or VLSI ("Very Large Scale Integration"), or from several integrated electronic components embedded in a device, or from a blend of hardware and software components.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13305269.6A EP2778819A1 (fr) | 2013-03-12 | 2013-03-12 | Procédé de prise de vue d'une interprétation cinématographique utilisant un véhicule aérien sans pilote |
US14/203,471 US9621821B2 (en) | 2013-03-12 | 2014-03-10 | Method for shooting a performance using an unmanned aerial vehicle |
EP14158826.9A EP2784618A3 (fr) | 2013-03-12 | 2014-03-11 | Procédé de prise de vue cinematographique utilisant un véhicule aérien autonome |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13305269.6A EP2778819A1 (fr) | 2013-03-12 | 2013-03-12 | Procédé de prise de vue d'une interprétation cinématographique utilisant un véhicule aérien sans pilote |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2778819A1 (fr) | 2014-09-17 |
Family
ID=48049921
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13305269.6A Withdrawn EP2778819A1 (fr) | 2013-03-12 | 2013-03-12 | Procédé de prise de vue d'une interprétation cinématographique utilisant un véhicule aérien sans pilote |
EP14158826.9A Ceased EP2784618A3 (fr) | 2013-03-12 | 2014-03-11 | Procédé de prise de vue cinematographique utilisant un véhicule aérien autonome |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14158826.9A Ceased EP2784618A3 (fr) | 2013-03-12 | 2014-03-11 | Procédé de prise de vue cinematographique utilisant un véhicule aérien autonome |
Country Status (2)
Country | Link |
---|---|
US (1) | US9621821B2 (fr) |
EP (2) | EP2778819A1 (fr) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105045278A (zh) * | 2015-07-09 | 2015-11-11 | 沈阳卡迩特科技有限公司 | 一种微型无人机自主感知与规避方法 |
CN105173102A (zh) * | 2015-09-18 | 2015-12-23 | 西北农林科技大学 | 一种基于多图像的四旋翼飞行器增稳系统与方法 |
CN105282519A (zh) * | 2015-11-13 | 2016-01-27 | 杨珊珊 | 基于无人飞行器的安防系统及其安防方法 |
EP3015147A1 (fr) * | 2014-10-28 | 2016-05-04 | Thomson Licensing | Procédé permettant de générer une trajectoire cible d'une caméra embarquée sur un drone et système correspondant |
WO2016065623A1 (fr) * | 2014-10-31 | 2016-05-06 | SZ DJI Technology Co., Ltd. | Systèmes et procédés de surveillance doté de repère visuel |
CN105867400A (zh) * | 2016-04-20 | 2016-08-17 | 北京博瑞爱飞科技发展有限公司 | 无人机的飞行控制方法和装置 |
CN106527496A (zh) * | 2017-01-13 | 2017-03-22 | 平顶山学院 | 面向无人机航拍图像序列的空中目标快速跟踪方法 |
CN108885466A (zh) * | 2017-11-22 | 2018-11-23 | 深圳市大疆创新科技有限公司 | 一种控制参数配置方法及无人机 |
CN108920711A (zh) * | 2018-07-25 | 2018-11-30 | 中国人民解放军国防科技大学 | 面向无人机起降引导的深度学习标签数据生成方法 |
CN112327909A (zh) * | 2020-10-27 | 2021-02-05 | 一飞(海南)科技有限公司 | 一种无人机编队的贴图灯效控制方法、控制系统及无人机 |
WO2022216465A1 (fr) * | 2021-04-06 | 2022-10-13 | Sony Interactive Entertainment LLC | Robot réglable pour fournir une échelle d'actifs virtuels et identifier des objets dans une scène interactive |
CN115617079A (zh) * | 2022-12-14 | 2023-01-17 | 四川轻化工大学 | 一种可交互无人机系统 |
CN116431005A (zh) * | 2023-06-07 | 2023-07-14 | 安徽大学 | 一种基于改进移动端唇语识别的无人机控制方法及系统 |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10632740B2 (en) | 2010-04-23 | 2020-04-28 | Landa Corporation Ltd. | Digital printing process |
US10642198B2 (en) | 2012-03-05 | 2020-05-05 | Landa Corporation Ltd. | Intermediate transfer members for use with indirect printing systems and protonatable intermediate transfer members for use with indirect printing systems |
EP4019596A1 (fr) | 2012-03-05 | 2022-06-29 | Landa Corporation Ltd. | Procédé de fabrication d'une construction de film d'encre |
US9902147B2 (en) | 2012-03-05 | 2018-02-27 | Landa Corporation Ltd. | Digital printing system |
US9498946B2 (en) | 2012-03-05 | 2016-11-22 | Landa Corporation Ltd. | Apparatus and method for control or monitoring of a printing system |
US10434761B2 (en) | 2012-03-05 | 2019-10-08 | Landa Corporation Ltd. | Digital printing process |
US9643403B2 (en) | 2012-03-05 | 2017-05-09 | Landa Corporation Ltd. | Printing system |
US10569534B2 (en) | 2012-03-05 | 2020-02-25 | Landa Corporation Ltd. | Digital printing system |
US9381736B2 (en) | 2012-03-05 | 2016-07-05 | Landa Corporation Ltd. | Digital printing process |
CN104284850B (zh) | 2012-03-15 | 2018-09-11 | 兰达公司 | 打印系统的环形柔性皮带 |
US9367067B2 (en) * | 2013-03-15 | 2016-06-14 | Ashley A Gilmore | Digital tethering for tracking with autonomous aerial robot |
GB201401173D0 (en) | 2013-09-11 | 2014-03-12 | Landa Corp Ltd | Ink formulations and film constructions thereof |
US12007763B2 (en) | 2014-06-19 | 2024-06-11 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US9798322B2 (en) | 2014-06-19 | 2017-10-24 | Skydio, Inc. | Virtual camera interface and other user interaction paradigms for a flying digital assistant |
US9678506B2 (en) | 2014-06-19 | 2017-06-13 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
WO2016131005A1 (fr) | 2015-02-13 | 2016-08-18 | Unmanned Innovation, Inc. | Corrélation et activation de capteur de véhicule aérien sans pilote |
US9855658B2 (en) | 2015-03-19 | 2018-01-02 | Rahul Babu | Drone assisted adaptive robot control |
GB2536489B (en) | 2015-03-20 | 2018-08-29 | Landa Corporation Ltd | Indirect printing system |
GB2537813A (en) | 2015-04-14 | 2016-11-02 | Landa Corp Ltd | Apparatus for threading an intermediate transfer member of a printing system |
US9971355B2 (en) * | 2015-09-24 | 2018-05-15 | Intel Corporation | Drone sourced content authoring using swarm attestation |
GB201602877D0 (en) | 2016-02-18 | 2016-04-06 | Landa Corp Ltd | System and method for generating videos |
US9541633B2 (en) | 2016-04-29 | 2017-01-10 | Caterpillar Inc. | Sensor calibration system |
GB201609463D0 (en) | 2016-05-30 | 2016-07-13 | Landa Labs 2012 Ltd | Method of manufacturing a multi-layer article |
US10933661B2 (en) | 2016-05-30 | 2021-03-02 | Landa Corporation Ltd. | Digital printing process |
US10520943B2 (en) | 2016-08-12 | 2019-12-31 | Skydio, Inc. | Unmanned aerial image capture platform |
US20220264007A1 (en) * | 2016-09-02 | 2022-08-18 | Skyyfish Llc | Intelligent gimbal assembly and method for unmanned vehicle |
CN106488216B (zh) * | 2016-09-27 | 2019-03-26 | 三星电子(中国)研发中心 | 生成物体3d模型的方法、装置和系统 |
US11295458B2 (en) * | 2016-12-01 | 2022-04-05 | Skydio, Inc. | Object tracking by an unmanned aerial vehicle using visual sensors |
US10447995B1 (en) | 2017-03-16 | 2019-10-15 | Amazon Technologies, Inc. | Validation of camera calibration data using augmented reality |
US10554950B1 (en) | 2017-03-16 | 2020-02-04 | Amazon Technologies, Inc. | Collection of camera calibration data using augmented reality |
US9986233B1 (en) * | 2017-03-16 | 2018-05-29 | Amazon Technologies, Inc. | Camera calibration using fixed calibration targets |
JP7206268B2 (ja) | 2017-10-19 | 2023-01-17 | ランダ コーポレイション リミテッド | 印刷システム用の無端可撓性ベルト |
WO2019097464A1 (fr) | 2017-11-19 | 2019-05-23 | Landa Corporation Ltd. | Système d'impression numérique |
US11511536B2 (en) | 2017-11-27 | 2022-11-29 | Landa Corporation Ltd. | Calibration of runout error in a digital printing system |
US11707943B2 (en) | 2017-12-06 | 2023-07-25 | Landa Corporation Ltd. | Method and apparatus for digital printing |
JP7273038B2 (ja) | 2017-12-07 | 2023-05-12 | ランダ コーポレイション リミテッド | デジタル印刷処理及び方法 |
WO2020003088A1 (fr) | 2018-06-26 | 2020-01-02 | Landa Corporation Ltd. | Élément de transfert intermédiaire pour système d'impression numérique |
US10994528B1 (en) | 2018-08-02 | 2021-05-04 | Landa Corporation Ltd. | Digital printing system with flexible intermediate transfer member |
JP7305748B2 (ja) | 2018-08-13 | 2023-07-10 | ランダ コーポレイション リミテッド | デジタル画像にダミー画素を埋め込むことによるデジタル印刷における歪み補正 |
JP7246496B2 (ja) | 2018-10-08 | 2023-03-27 | ランダ コーポレイション リミテッド | 印刷システムおよび方法に関する摩擦低減手段 |
JP7462648B2 (ja) | 2018-12-24 | 2024-04-05 | ランダ コーポレイション リミテッド | デジタル印刷システム |
US11118948B2 (en) * | 2019-08-23 | 2021-09-14 | Toyota Motor North America, Inc. | Systems and methods of calibrating vehicle sensors using augmented reality |
US10723455B1 (en) * | 2019-09-03 | 2020-07-28 | Disney Enterprises, Inc. | Aerial show system with dynamic participation of unmanned aerial vehicles (UAVs) with distributed show systems |
WO2021105806A1 (fr) | 2019-11-25 | 2021-06-03 | Landa Corporation Ltd. | Séchage d'encre en impression numérique avec un rayonnement infrarouge absorbé par des particules incorporées à l'intérieur d'un itm |
US11321028B2 (en) | 2019-12-11 | 2022-05-03 | Landa Corporation Ltd. | Correcting registration errors in digital printing |
WO2021137063A1 (fr) | 2019-12-29 | 2021-07-08 | Landa Corporation Ltd. | Procédé et système d'impression |
CN112596536A (zh) * | 2020-11-19 | 2021-04-02 | 一飞(海南)科技有限公司 | 集群无人机表演画面的方法、产品、存储介质、电子设备 |
KR102584931B1 (ko) * | 2021-07-05 | 2023-11-09 | 주식회사 삼영기술 | 구조물 유지 보수용 드론 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008013648A2 (fr) * | 2006-07-24 | 2008-01-31 | The Boeing Company | Commande asservie à boucle fermée utilisant des systèmes de détection de mouvement |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7468778B2 (en) * | 2002-03-15 | 2008-12-23 | British Broadcasting Corp | Virtual studio system |
FR2912318B1 (fr) * | 2007-02-13 | 2016-12-30 | Parrot | Reconnaissance d'objets dans un jeu de tir pour jouets telecommandes |
JP2009212582A (ja) | 2008-02-29 | 2009-09-17 | Nippon Hoso Kyokai <Nhk> | バーチャルスタジオ用フィードバックシステム |
FR2939325B1 (fr) * | 2008-12-04 | 2015-10-16 | Parrot | Systeme de drones munis de balises de reconnaissance |
WO2012151395A2 (fr) * | 2011-05-03 | 2012-11-08 | Ivi Media Llc | Fourniture d'expérience multimédia adaptative |
2013
- 2013-03-12 EP EP13305269.6A patent/EP2778819A1/fr not_active Withdrawn
2014
- 2014-03-10 US US14/203,471 patent/US9621821B2/en active Active
- 2014-03-11 EP EP14158826.9A patent/EP2784618A3/fr not_active Ceased
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008013648A2 (fr) * | 2006-07-24 | 2008-01-31 | The Boeing Company | Commande asservie à boucle fermée utilisant des systèmes de détection de mouvement |
Non-Patent Citations (4)
Title |
---|
G.B. GUERRA-FILHO: "Optical Motion Capture: Theory and Implementation", JOURNAL OF THEORETICAL AND APPLIED INFORMATICS, 2005 |
MELLINGER DANIEL, KUMAR VIJAY: "Aggressive quadrotor II", 15 September 2010 (2010-09-15), XP002699899, Retrieved from the Internet <URL:http://www.youtube.com/watch?v=geqip_0Vjec&list=PL6E012F3385B68E08> [retrieved on 20130702] * |
MICHAEL NATHAN, MELLINGER DANIEL, KUMAR VIJAY: "The GRASP Multiple Micro-UAV Testbed", IEEE ROBOTICS & AUTOMATION MAGAZINE, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 17, no. 3, 9 September 2010 (2010-09-09), pages 56 - 65, XP011317945, ISSN: 1070-9932 * |
N. MICHAEL ET AL.: "The GRASP Multiple Micro-UAV Testbed", IEEE ROBOTICS AND AUTOMATION MAGAZINE, September 2010 (2010-09-01)
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3015147A1 (fr) * | 2014-10-28 | 2016-05-04 | Thomson Licensing | Procédé permettant de générer une trajectoire cible d'une caméra embarquée sur un drone et système correspondant |
EP3015146A1 (fr) * | 2014-10-28 | 2016-05-04 | Thomson Licensing | Procédé permettant de générer une trajectoire cible d'une caméra embarquée sur un drone et système correspondant |
US10698423B2 (en) | 2014-10-31 | 2020-06-30 | SZ DJI Technology Co., Ltd. | Systems and methods for surveillance with a visual marker |
WO2016065623A1 (fr) * | 2014-10-31 | 2016-05-06 | SZ DJI Technology Co., Ltd. | Systèmes et procédés de surveillance doté de repère visuel |
US20170031369A1 (en) | 2014-10-31 | 2017-02-02 | SZ DJI Technology Co., Ltd | Systems and methods for surveillance with a visual marker |
US11442473B2 (en) | 2014-10-31 | 2022-09-13 | SZ DJI Technology Co., Ltd. | Systems and methods for surveillance with a visual marker |
US10691141B2 (en) | 2014-10-31 | 2020-06-23 | SZ DJI Technology Co., Ltd. | Systems and methods for surveillance with a visual marker |
CN105045278A (zh) * | 2015-07-09 | 2015-11-11 | 沈阳卡迩特科技有限公司 | 一种微型无人机自主感知与规避方法 |
CN105173102A (zh) * | 2015-09-18 | 2015-12-23 | 西北农林科技大学 | 一种基于多图像的四旋翼飞行器增稳系统与方法 |
CN105282519A (zh) * | 2015-11-13 | 2016-01-27 | 杨珊珊 | 基于无人飞行器的安防系统及其安防方法 |
CN105282519B (zh) * | 2015-11-13 | 2018-11-09 | 杨珊珊 | 基于无人飞行器的安防系统及其安防方法 |
CN105867400A (zh) * | 2016-04-20 | 2016-08-17 | 北京博瑞爱飞科技发展有限公司 | 无人机的飞行控制方法和装置 |
WO2017181513A1 (fr) * | 2016-04-20 | 2017-10-26 | 高鹏 | Procédé et dispositif de commande de vol pour véhicule aérien sans pilote |
CN106527496B (zh) * | 2017-01-13 | 2019-07-02 | 平顶山学院 | 面向无人机航拍图像序列的空中目标快速跟踪方法 |
CN106527496A (zh) * | 2017-01-13 | 2017-03-22 | 平顶山学院 | 面向无人机航拍图像序列的空中目标快速跟踪方法 |
CN108885466A (zh) * | 2017-11-22 | 2018-11-23 | 深圳市大疆创新科技有限公司 | 一种控制参数配置方法及无人机 |
CN108920711A (zh) * | 2018-07-25 | 2018-11-30 | 中国人民解放军国防科技大学 | 面向无人机起降引导的深度学习标签数据生成方法 |
CN112327909A (zh) * | 2020-10-27 | 2021-02-05 | 一飞(海南)科技有限公司 | 一种无人机编队的贴图灯效控制方法、控制系统及无人机 |
WO2022216465A1 (fr) * | 2021-04-06 | 2022-10-13 | Sony Interactive Entertainment LLC | Robot réglable pour fournir une échelle d'actifs virtuels et identifier des objets dans une scène interactive |
CN115617079A (zh) * | 2022-12-14 | 2023-01-17 | 四川轻化工大学 | 一种可交互无人机系统 |
CN115617079B (zh) * | 2022-12-14 | 2023-02-28 | 四川轻化工大学 | 一种可交互无人机系统 |
CN116431005A (zh) * | 2023-06-07 | 2023-07-14 | 安徽大学 | 一种基于改进移动端唇语识别的无人机控制方法及系统 |
CN116431005B (zh) * | 2023-06-07 | 2023-09-12 | 安徽大学 | 一种基于改进移动端唇语识别的无人机控制方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
EP2784618A3 (fr) | 2015-09-23 |
US9621821B2 (en) | 2017-04-11 |
US20140267777A1 (en) | 2014-09-18 |
EP2784618A2 (fr) | 2014-10-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9621821B2 (en) | Method for shooting a performance using an unmanned aerial vehicle | |
US11347217B2 (en) | User interaction paradigms for a flying digital assistant | |
EP2849150A1 (fr) | Procédé permettant de capturer le mouvement tridimensionnel d'un objet, véhicule aérien sans pilote et système de capture de mouvement | |
US11797009B2 (en) | Unmanned aerial image capture platform | |
US11573562B2 (en) | Magic wand interface and other user interaction paradigms for a flying digital assistant | |
EP3312088B1 (fr) | Véhicule aérien sans pilote et procédé de commande de vol | |
CN108351654B (zh) | 用于视觉目标跟踪的系统和方法 | |
CN108399642B (zh) | 一种融合旋翼无人机imu数据的通用目标跟随方法和系统 | |
US20160194079A1 (en) | Method of automatically piloting a rotary-wing drone for performing camera movements with an onboard camera | |
JP6943988B2 (ja) | 移動可能物体の制御方法、機器およびシステム | |
CN109071034A (zh) | 切换云台工作模式的方法、控制器和图像增稳设备 | |
CN113741543A (zh) | 无人机及返航控制方法、终端、系统和机器可读存储介质 | |
US20180095469A1 (en) | Autonomous system for shooting moving images from a drone, with target tracking and holding of the target shooting angle | |
WO2020014987A1 (fr) | Procédé et appareil de commande de robot mobile, dispositif et support d'informations | |
CN105763790A (zh) | 用于以沉浸模式来驾驶无人机的视频系统 | |
CN108605098A (zh) | 用于卷帘快门校正的系统和方法 | |
Karakostas et al. | UAV cinematography constraints imposed by visual target tracking | |
CN108450032B (zh) | 飞行控制方法和装置 | |
CN204287973U (zh) | 飞行相机 | |
WO2022109860A1 (fr) | Procédé de suivi d'objet cible et cardan | |
US20220187828A1 (en) | Information processing device, information processing method, and program | |
CN105807783A (zh) | 飞行相机 | |
US12007763B2 (en) | Magic wand interface and other user interaction paradigms for a flying digital assistant | |
Kim et al. | Object location estimation from a single flying camera | |
TWI596043B (zh) | 無人航空器具及其校正定位方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20130312 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20150318 |