EP2119226A2 - Device and method for watching real-time augmented reality - Google Patents
Device and method for watching real-time augmented reality
- Publication number
- EP2119226A2 (application EP08761752A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- camera
- image sensor
- image
- orientation
- sight
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
- H04N21/44008—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
Definitions
- Real-time augmented reality observation device and method for implementing such a device
- the present invention relates to observation devices such as telescopes, and more particularly to augmented reality observation devices that add virtual objects in real time to the observed image, and to methods for implementing such devices.
- a large number of high-traffic sites such as tourist sites are equipped with telescopes or binoculars that allow their users to observe a panorama with a magnification factor, variable or not, giving the opportunity to better appreciate the view.
- a telescope refers to a device located on many tourist sites offering an interesting point of view to observe.
- the principle of operation of the telescope is as follows: the user puts a coin in the device and then has a predetermined time to observe the offered point of view. The observation is similar to what would be done with the use of a pair of binoculars.
- the main function is therefore to provide a "magnification" of the panorama observed via a dedicated optical system. The latter offers a restricted field of view around the line of sight.
- the limitation of the field of view of the optical system is compensated by the two degrees of freedom offered by the apparatus by rotation of the line of sight in the horizontal plane, that is to say around the main axis of the device, and by rotation of the line of sight in the vertical plane, that is to say around an axis perpendicular to the main axis of the device. Thanks to these movements, the user is able to scan a large part of the observable panorama and thus observe in detail, using magnification, the areas that interest him.
- Figure 1 illustrates an example of such a telescope.
- the user of the system has no other information than that naturally present in the image. However, for the development of a site, it is often important to provide additional information. This information may, for example, be of a cultural or technical nature, but may also be of an advertising or economic type, indicating, for example, the presence of restaurants or hotels.
- Augmented reality techniques are adapted to the addition of such information to enrich the observed image. They make it possible to show, in the same image, the real view and the added information. This information can take the form of symbols such as arrows, logos, special signage or text, but also of three-dimensional virtual objects, animated or not. It is thus possible, for example, to make an old building that has now disappeared appear above the ruin with which the user of a conventional telescope would have had to be satisfied.
- YDreams offers a solution called Virtual Sightseeing (Virtual Sightseeing is a registered trademark).
- This device has a degree of freedom.
- a disadvantage of this device is a drift in quality of service over time. Indeed, the spatial and temporal synchronization depends not only on the accuracy of the encoders used to determine the motions but also on the accuracy and stability of the mechanical links used. In this respect, it should be noted that an error of one degree in the horizontal plane results in an error of more than four meters on the X coordinate of an object located two hundred and fifty meters from the observation point. Any mechanical linkage of this nature, consisting of the displacement of heavy parts, however precise it may be, degrades over time.
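- A quick check of the figure quoted above (pure arithmetic, not part of the patent):

```python
import math

# Lateral error produced by a 1-degree heading error at 250 m.
distance_m = 250.0
error_deg = 1.0
lateral_error_m = distance_m * math.tan(math.radians(error_deg))
print(f"{lateral_error_m:.2f} m")  # ~4.36 m, i.e. "more than four meters"
```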
- the invention solves at least one of the problems discussed above.
- the invention thus relates to a method for a real-time augmented reality observation device comprising an image sensor, a display system and a control interface, this method being characterized in that it comprises the following steps: receiving a request comprising line-of-sight orientation information transmitted by said control interface;
- the method according to the invention makes it possible to distinguish the control commands of the user from the commands of the image sensor and thus to improve the accuracy and reliability of the observation device, with a slight drift over time.
- the device is thus more robust in time and in the face of external aggressions.
- the camera brings the actual shooting of an area of the panorama to which a computer adds information elements precisely superimposed on the different elements of the image.
- This method makes it possible to improve the quality of the spatial and temporal synchronization between the real and the virtual which only then depends on the parameters inherent to the image sensor.
- the method according to the invention also offers possibilities of correction of disturbances.
- the method according to the invention allows great freedom as to the form that can take the control interface.
- the method according to the invention advantageously comprises a calibration phase of said image sensor to take into account any imperfections thereof.
- said calibration step comprises the calibration of at least one of the parameters included in the set of parameters comprising the radial distortion correction of said image sensor, the roll correction of said image sensor, the heading and pitch corrections of the line of sight of said image sensor, and the offset between the optical center and the center of rotation of said image sensor.
- said image sensor comprises a zoom function and the calibration of said at least one parameter is performed for a plurality of zoom factors.
- the calibration of the image sensor and in particular the image sensor calibration for several zoom factors make it possible to consider the camera's defects as extrinsic parameters of the camera and thus optimize the calculations required on the images from this image sensor.
- the method further comprises a co-location step of said image sensor and the scene observed by said image sensor, to determine the pose, in said observed scene, of said at least one datum to be inserted into said received image.
- Said at least one piece of data to be inserted in said received image is advantageously a representation of a virtual three-dimensional model animated or not.
- said orientation of the line of sight is defined according to two degrees of freedom and said image sensor comprises a zoom function to enable the user to observe the point he wishes to see, with the desired magnification.
- the invention also relates to a computer program comprising instructions adapted to the implementation of each of the steps of the method described above.
- the invention also relates to a means for storing information, removable or not, partially or completely readable by a computer or a microprocessor comprising code instructions of a computer program for the execution of each of the steps of the previously described method.
- the invention also relates to an augmented reality observation device comprising means of connection to an image sensor, a display system and a control interface, this device being characterized in that it comprises the following means: means for receiving line-of-sight orientation information transmitted by said control interface;
- the device according to the invention makes it possible to separate the control commands of the user from the commands of the image sensor and thus to improve the accuracy and reliability of the observation device, with a small drift over time.
- the device is thus more robust in time and in the face of external aggressions.
- the camera brings the actual shooting of an area of the panorama to which a computer adds information elements precisely superimposed on the different elements of the image.
- the device makes it possible to improve the quality of the spatial and temporal synchronization between the real and the virtual which only then depends on the parameters inherent to the image sensor.
- the device according to the invention also offers possibilities of correction of disturbances.
- the device according to the invention allows great freedom as to the shape that can take the control interface.
- the device according to the invention advantageously comprises means for transmitting said received image comprising said at least one datum, to allow the user to view the augmented images from said image sensor on a suitable device.
- At least one of said image sensor and storage means is remote from said observation device.
- This embodiment allows the user of the observation device to benefit from a point of view that he cannot reach himself and protects the image sensor and/or the storage means against external aggressions such as vandalism.
- FIG. 1 schematically shows a conventional telescope as used in tourist places
- FIG. 2 diagrammatically illustrates the observation device according to the invention
- FIG. 3 comprising FIGS. 3a, 3b and 3c, shows an example of a part of the observation device according to the invention comprising a sphere housing a motorized camera on which are mounted a control interface and a display system.
- FIG. 4 illustrates the observation device presented in FIG. 3
- FIG. 5 illustrates an example of an apparatus that can be used to control the movements of the camera and to insert the virtual objects in the images coming from the camera
- FIG. 6 schematically represents certain operating steps of the observation device according to the invention.
- FIG. 7 shows another example of an observation device according to the invention, in which the motorized camera is remote.
- control of the image capture device is indirect, that is to say that the movement of the camera is not physically or mechanically coupled to the movement made by the user.
- the aiming control interface is separated from the positioning interface of the shooting module such as a camera.
- the coupling between the two interfaces is realized in a software way which also offers possibilities of correction of the disturbances.
- the movements of the user are reflected on the shooting module via the software interface. Since the virtual scene is co-located with the camera and not with the hardware physically manipulated by the user, any loss of precision on the motion sensors of the control interface has no impact on the quality of the integration of the virtual elements in the real image.
- the camera module, which is motorized, is preferably capable of producing lateral movements, also called heading or pan movements, and elevation movements, also called pitch or tilt movements. This type of shooting module is known as a PT (Pan, Tilt) camera.
- when the shooting module also makes it possible to zoom, it is called a PTZ (Pan, Tilt, Zoom) camera.
- the device according to the invention comprises a video acquisition system producing a video stream composed of images corresponding to the actual view observed, an image processing system for augmenting the video stream with virtual elements overlaid in real time according to the direction of the line of sight of the device, and a system for viewing the augmented video stream.
- the line of sight is determined in real time by motion sensors.
- the accuracy of the position of the line of sight of the camera is essential to the proper functioning of the assembly, in order to insert the virtual objects at the appropriate locations, that is to say to spatially synchronize the real and virtual environments.
- the separation between the command used to define the position of the line of sight desired by the user, that is to say the user interface, and the means used to reach this position, that is to say the internal mechanism controlling the movements of the camera, implies setting up a command interface capable not only of defining but also of permanently transmitting the requested position of the line of sight.
- the use of a motorized camera makes it possible to move its axis of sight according to the request.
- This interface is preferably performed by the computer used to insert the virtual objects in the images from the camera.
- This computer receives the orientation request for the line of sight from the command interface, transmits the orders to the motorized camera to modify the line of sight, and receives, from the camera, the exact position of the line of sight.
- the orientation information of the line of sight of the camera is received from it, for example, in the form of a data stream.
- Such a stream may transmit, for example, several orientation and zoom values per second.
- the video stream is augmented and displayed.
- One of the objectives of indirect control is to limit the propagation of errors within the system.
- the durability of the synchronization between the virtual and the real over time relies entirely on the quality and reliability of the data received from the motorized camera.
- if the precision of the command interface deteriorates over time, the impact will be limited: the line of sight may no longer correspond exactly to the request sent by the interface, but the real and the virtual will remain perfectly synchronized.
- the separation between the command interface and the rest of the system allows great freedom as to the form that this interface can take. It may be, for example, a keyboard, a joystick, or a dedicated pointing system such as a mechanical system with motion capture. Such a pointing system can mechanically couple the control interface and the display system.
- Figure 2 schematically illustrates the device 200 of the invention.
- the device 200 comprises a control interface 205 which transmits a position request of the line of sight to the computer 210.
- the computer 210 transmits a command, corresponding to the position request of the line of sight, to the motorized camera 215 having a line of sight 220.
- the camera 215 transmits to the computer 210 the orientation of its line of sight, for example in the form of a data stream, thus allowing the processing of images from the camera during the period of movement.
- the camera 215 also transmits an image stream, preferably continuous, to the computer 210, which integrates into the images of this video stream the virtual objects according to the wishes of the user, pre-established scenarios and/or the context, to form the augmented video stream that is transmitted to a display system 225.
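- As a sketch only (all class and method names below are hypothetical, not from the patent), the indirect-control loop described above can be summarized as follows: the computer forwards the user's request to the motorized camera, then augments each frame using the pose reported back by the camera rather than the raw interface reading.

```python
from dataclasses import dataclass

@dataclass
class SightRequest:
    pan_deg: float   # requested heading
    tilt_deg: float  # requested pitch
    zoom: float      # requested zoom factor

class AugmentationComputer:
    """Relays requests to the motorized camera and augments frames using
    the pose reported by the camera, so interface drift cannot
    desynchronize the real and virtual environments."""

    def __init__(self, camera, renderer, display):
        self.camera = camera      # hypothetical driver: set_target/read_pose/read_frame
        self.renderer = renderer  # renders virtual objects for a given pose
        self.display = display

    def step(self, request: SightRequest):
        # 1. Forward the user's request to the motorized camera.
        self.camera.set_target(request.pan_deg, request.tilt_deg, request.zoom)
        # 2. Read back the actual pose from the camera's own encoders.
        pan, tilt, zoom = self.camera.read_pose()
        frame = self.camera.read_frame()
        # 3. Render the virtual objects with the reported pose and composite.
        overlay = self.renderer.render(pan, tilt, zoom)
        self.display.show(frame, overlay)
```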
- the quality of the system depends on the characteristics of the motorized camera, which must be fast enough to respond to the requests from the control interface, sufficiently precise to best reach the requested positions, and sufficiently enduring to maintain these characteristics over time.
- the presented architecture concentrates the problem of the synchronization of the real and virtual environments on the camera, preferably a PT camera or a PTZ camera.
- Part of the coordinates of its line of sight is determined during the installation of the observation system while another part of the coordinates of its line of sight is determined in real time.
- the X, Y and Z coordinates as well as the roll along the line of sight are determined or calculated during the installation of the observation system, while the heading and pitch (pan, tilt) are provided in real time.
- if the zoom function on the line of sight is implemented, the zoom factor is determined in real time.
- the camera frame can be used as the reference frame.
- in that case, the X, Y and Z coordinates as well as the roll have zero values.
- the PTZ camera is preferably not accessible to the user. It is therefore necessary to integrate the latter within a suitable receptacle.
- a suitable receptacle must advantageously protect the camera from external aggression, voluntary or otherwise, such as misuse, vandalism and bad weather.
- This receptacle comprises, for example, a one-way-mirror portion masking the camera from the user while allowing the camera to film the panorama without loss of light or distortion.
- In order to avoid the deformation problems related to the shape of this receptacle, the camera is here placed at the center of a sphere. Thus, whatever the position of the line of sight, the lens of the camera always remains equidistant from the receptacle.
- the point of rotation of the line of sight of the camera is preferably located at eye level.
- the camera can be remote, in particular to provide a point of view that the user could not otherwise have.
- the control interface, or PTZ interface, and the display system are arranged on the sphere used as a receptacle for the camera.
- the shape of the sphere is then used as a guide for the PTZ interface and for the display system, the guide being able to assume the appearance of a meridian-shaped movable rail.
- FIG. 3, comprising FIGS. 3a, 3b and 3c, illustrates an example of a part of the observation device 300 comprising a sphere housing a PT camera or a PTZ camera, on which a control interface and a display system are mounted.
- Figure 3a shows a part of the device seen user side
- Figure 3b shows a part of the device seen in profile
- Figure 3c shows a part of the device seen camera side.
- the illustrated portion of the device comprises a sphere 305, a control interface 310 mounted on a guide 315 and a display system 320, here a binocular display.
- the sphere 305 preferably comprises two distinct parts: a first portion 325, transparent, semi-transparent or of the one-way-mirror type, located on the camera side, and a second portion 330, preferably opaque, located on the user side.
- the control interface 310 advantageously comprises two handles allowing the user to move this control interface 310 on the opaque portion 330 of the sphere 305.
- the control interface 310 can move along the guide 315, the guide 315 being rotatable through about 180 degrees about the vertical axis of the sphere 305. It is of course possible to limit or extend the displacement of the control interface 310, in particular around the vertical axis of the sphere 305.
- a motorized camera 335 is located in the center of the sphere 305 as shown in FIG. 3c.
- the movement of the camera 335 is controlled by the displacement of the interface 310.
- a movement along the vertical axis of the sphere of the control interface 310 causes a movement along the vertical axis of the camera 335 while a movement along the guide 315 of the control interface 310 causes a movement along an axis of the horizontal plane of the camera 335.
- the movement of the control interface 310 is detected by an optical encoder, whose accuracy corresponds to that of the servo-controlled movements of the camera 335, or by a set of sensors.
- a linear position sensor may be integrated with the control interface 310 to determine its position on the guide 315 while an angular sensor is placed on the link between the guide 315 and the sphere 305.
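- As an illustration, assuming two such sensors with simple linear mappings (a hypothetical sketch, not the patent's implementation), the raw readings could be converted into a pan/tilt request as follows:

```python
def interface_to_pan_tilt(angular_sensor_deg: float,
                          linear_sensor_pos: float,
                          guide_length: float,
                          tilt_range_deg: float = 90.0):
    """Map the two sensor readings to a (pan, tilt) request in degrees.

    angular_sensor_deg: rotation of the guide about the sphere's vertical
        axis (pan); linear_sensor_pos: position of the interface along the
        meridian guide, in the same unit as guide_length (tilt).
    """
    pan_deg = angular_sensor_deg  # direct 1:1 mapping assumed
    # Normalize the rail position to [0, 1] and map it to a tilt range
    # assumed symmetric around the horizontal.
    t = max(0.0, min(1.0, linear_sensor_pos / guide_length))
    tilt_deg = (t - 0.5) * tilt_range_deg
    return pan_deg, tilt_deg
```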
- FIG. 4 illustrates the observation device 300 presented in FIG. 3.
- the observation device here comprises a foot 400 mounted on a base 405, a step 410 and a coin mechanism 415. It should be noted that the coin mechanism 415 is not necessary for the implementation of the invention.
- the camera 335 may be a camera equipped with a CCD (Charge-Coupled Device) sensor having an HD 1080i resolution, that is to say a resolution of 1080 lines with interlaced scanning, a 60 fps refresh rate, a YUV-HD/HD-SDI type interface and an RS-232 port to control camera movement and receive data related to its position.
- the binocular display 320 may comprise two OLED (Organic Light-Emitting Diode) displays, one for each eye, each having a resolution of 800 × 600 pixels, a color depth of 24 bits, a refresh rate of 60 images per second, a brightness greater than 50 cd/m², a contrast greater than 200:1, and VGA (Video Graphics Array) and USB (Universal Serial Bus) interfaces.
- Figure 5 illustrates an example of an apparatus that can be used to control camera movements and to insert virtual objects into images from the camera.
- the device 500 is for example a microcomputer or a workstation.
- the device 500 preferably comprises a communication bus 502 to which are connected: a central processing unit or microprocessor 504 (CPU, Central Processing Unit);
- a read-only memory 506 (ROM, Read-Only Memory) that can comprise the "Prog" programs;
- a random access memory or cache 508 (Random Access Memory RAM) comprising registers adapted to record variables and parameters created and modified during the execution of the aforementioned programs;
- a video capture card 510 connected to a camera 335 ';
- the apparatus 500 may also have the following elements:
- a hard disk 520 which may comprise the aforementioned "Prog" programs and data processed or to be processed according to the invention
- the user can, according to a particular embodiment, insert a memory card so as to store images from the camera 335, real or augmented.
- the communication bus allows communication and interoperability between the various elements included in the device 500 or connected to it.
- the representation of the bus is not limiting and, in particular, the central unit is capable of communicating instructions to any element of the apparatus 500 directly or via another element of the apparatus 500.
- the executable code of each program enabling the programmable device to implement the processes according to the invention can be stored, for example, on the hard disk 520 or in the read-only memory 506. According to a variant, the executable code of the programs can be received via the communication network 528, through the interface 526, to be stored in the same way as described previously.
- program or programs may be loaded into one of the storage means of the device 500 before being executed.
- the central unit 504 will control and direct the execution of the instructions or portions of software code of the program or programs according to the invention, instructions which are stored in the hard disk 520 or in the read-only memory 506 or else in the other elements of aforementioned storage.
- the program or programs that are stored in a non-volatile memory, for example the hard disk 520 or the read-only memory 506, are transferred into the random access memory 508 which then contains the executable code of the program or programs according to the invention, as well as registers for storing the variables and parameters necessary for the implementation of the invention.
- the communication apparatus comprising the device according to the invention can also be a programmed apparatus.
- This device then contains the code of the computer program or programs, for example frozen in an application-specific integrated circuit (ASIC).
- the apparatus 500 includes an augmented reality application such as the D'Fusion software from Total Immersion (D'Fusion is a trademark of Total Immersion).
- the principle of real-time insertion of a virtual object in an image from a camera or other video acquisition means according to this software is described in the patent application WO 2004/012445.
- FIG. 6 schematically illustrates certain operating steps of the observation device according to the invention.
- the operation of the observation device includes an installation and initialization phase (phase I) and a phase of use (phase II).
- the installation and initialization phase comprises a step of calibrating the PTZ camera (step 600) and a step of loading the data used to enrich the real images (step 605).
- the loading of these data can be carried out during the installation of the observation device, when the device is switched on or at regular or programmed times.
- the user's sighting movement information is received from the control interface (step 610) and used to control the line of sight of the camera (step 615).
- the camera then transmits the position of its line of sight and the zoom factor if the zoom function is implemented (step 620).
- the zoom factor then makes it possible to retrieve the intrinsic parameters of the camera and the distortion parameters (step 625) by comparison with the values established during the calibration of the camera during the initialization phase.
- the heading and pitch data of the line of sight of the camera make it possible to retrieve the extrinsic data of the camera according to the current zoom level (step 630).
- a projection matrix is then determined from the data from steps 605, 625 and 630 (step 635).
- This projection matrix is used to determine the position of the elements, such as virtual objects, to be inserted into the image from the camera (step 640). These elements, for example a representation of virtual objects, are then inserted into the image from the camera (step 645) to form an augmented image. The augmented image is then presented to the user (step 650). Steps 610 to 650 are repeated for each image as indicated by the dotted arrow. It should be noted that steps 610 to 630 may not be repeated if there is no movement of the user.
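- The per-frame loop of steps 610 to 650 can be condensed into the following sketch; "camera", "calibration", "scene" and "display" are hypothetical stand-ins for the patent's camera driver, calibration tables, virtual-object database and display system, and the matrices are assumed to be numpy arrays:

```python
def augment_one_frame(camera, calibration, scene, display):
    pan, tilt, zoom = camera.read_pose()                  # step 620
    intrinsics, distortion = calibration.lookup(zoom)     # step 625 (interpolated)
    extrinsics = calibration.extrinsics(pan, tilt, zoom)  # step 630
    projection = intrinsics @ extrinsics                  # step 635 (3x3 @ 3x4)
    frame = camera.read_frame()
    for obj in scene.visible_objects(projection):         # step 640
        frame = obj.draw(frame, projection)               # step 645
    display.show(frame)                                   # step 650
```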
- the calibration of the camera (step 600) is intended to allow a good integration of the elements, such as virtual objects, in the images from the camera, by modeling the behavior of the camera for any type of environment in which it operates, that is to say by modeling the transformation of a point of space into a point of the image.
- the main calibration steps of the PTZ type camera are preferably the following,
- the following notation is used, the image plane corresponding to the shooting module, for example a CCD sensor:
- O is the position of the camera and k is the line of sight;
- (O, i, j, k) is the reference frame linked to the camera, in space;
- (Ω, u, v) is the reference frame in the image;
- O' is the center of the image; the coordinates of O' in the frame (Ω, u, v) are (u0, v0);
- Δ is the line perpendicular to the image plane and passing through the point O; Δ thus represents the optical axis of the camera;
- f is the focal length, that is to say the distance between the point O and the image plane;
- M is a point in space having coordinates (x, y, z) in the frame (O, i, j, k);
- the projection matrix Pr, making it possible to go from the point M to its image point m, can be written in the following canonical form:
Pr = [ 1 0 0 0 ; 0 1 0 0 ; 0 0 1 0 ]
- (u0 v0 1) are the homogeneous coordinates of the point O' in the frame (Ω, u, v), expressed in pixels; ku is the horizontal scale factor and kv is the vertical scale factor, expressed in pixels per millimeter.
- the intrinsic parameters of the camera are the internal characteristics of the camera.
- the geometric model of the camera is expressed by the matrix product A · Pr, which gives the relation between the coordinates, in the frame (O, i, j, k), of the point M (x, y, z) and the coordinates, in the frame (Ω, u, v), of the point q (u, v), the projection of M onto the image plane.
- the coordinates of the point q can therefore be expressed in the following form, where s is a non-zero scale factor:
s · (u, v, 1)^T = A · Pr · (x, y, z, 1)^T
- the matrix A of intrinsic parameters can be expressed in the following form:
A = [ αu 0 u0 ; 0 αv v0 ; 0 0 1 ]
where αu = f · ku and αv = f · kv.
- the intrinsic parameters must be related to other information more generally used in the video world, such as the resolution in pixels, the size of the image sensor in millimeters, and the focal length, according to the known relations.
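- Written out, these known relations simply convert the metric focal length into pixel units using the sensor size and the image resolution; the numeric values below are illustrative only:

```python
def pixel_focal_lengths(f_mm, sensor_w_mm, sensor_h_mm, width_px, height_px):
    """Convert a metric focal length to the pixel-unit focal lengths
    (alpha_u, alpha_v) using the scale factors ku, kv defined above."""
    k_u = width_px / sensor_w_mm   # horizontal scale factor, pixels per mm
    k_v = height_px / sensor_h_mm  # vertical scale factor, pixels per mm
    return f_mm * k_u, f_mm * k_v  # alpha_u = f*ku, alpha_v = f*kv

# Illustrative values only: 8 mm lens on a 6.4 x 4.8 mm sensor at 1920x1080.
alpha_u, alpha_v = pixel_focal_lengths(8.0, 6.4, 4.8, 1920, 1080)
```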
- the extrinsic parameters correspond to the rigid spatial transformation of the camera.
- the transformation matrix D, taking into account three degrees of freedom linked to the rotation R and three degrees of freedom linked to the translation T, can be written in the following form:
D = [ R T ; 0 0 0 1 ]
where R is a 3 × 3 rotation matrix and T a 3 × 1 translation vector.
- the distortion parameters are interpolated for intermediate focal length values; it is therefore unnecessary to take the distortion into account in this formulation.
- The solution is to consider the camera used as a simple ideal camera for which the corrections are applied outside the camera model. Thus the distortion is pre-compensated, and other residual errors, such as the optical center offset, are compensated not at projection time but by considering that these errors come from the positioning (position and orientation) of the camera. Even if this approach can theoretically prove to be inexact, it nevertheless makes it possible to obtain good results.
- these compensations should not be considered constant because they may be a function of the variation of the user's aim control parameters.
- most of the compensations are calibrated for a set of focal values and interpolated when the focal length lies between two focal values for which the compensations to be applied have been calculated.
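- A minimal sketch of this interpolation, assuming the compensations are stored as parameter vectors keyed by zoom-encoder value:

```python
import bisect

def interpolate_compensation(table: dict, encoder_value: float):
    """Linearly interpolate a parameter vector between the two nearest
    calibrated zoom-encoder values; clamp outside the calibrated range."""
    keys = sorted(table)
    if encoder_value <= keys[0]:
        return table[keys[0]]
    if encoder_value >= keys[-1]:
        return table[keys[-1]]
    hi = bisect.bisect_right(keys, encoder_value)
    k0, k1 = keys[hi - 1], keys[hi]
    t = (encoder_value - k0) / (k1 - k0)
    return [(1 - t) * a + t * b for a, b in zip(table[k0], table[k1])]
```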
- This approach makes it possible to correctly overlay the virtual objects on the images of a real scene by displaying the compensated images of the real scene and by rendering the virtual objects with a virtual camera simply offset in position and orientation.
- the real-time calculation cost of the virtual PTZ camera model is then almost the same as that of a perfect virtual camera (pin-hole).
- the approach for calibrating a PTZ type camera differs, however, from that used for a conventional camera. Indeed, it is necessary to perform the radial distortion and field-of-view calibration steps for several zoom levels. The corresponding parameters are thus associated with a precise zoom level given in real time by the camera. Then, during the use of the device, these parameters are interpolated according to the current zoom level of the camera.
- Each calibration phase uses the compensations from the previous calibration phases. Thus, to be able to consider the camera as a pin-hole model, it is necessary to start by compensating the distortion. When the radial distortion is corrected, it is then possible to calibrate the roll of the image sensor, which has the effect of a rotation of the image around the center of the distortion. To this end, it is necessary to determine whether, during a heading movement (horizontal rotation), the trajectory of the points in the image is horizontal and, if necessary, to compensate for this defect. Once this compensation is established, it is possible to measure the horizontal field of view simply by making a heading movement. The focal length of the camera is thus calibrated, and a sufficiently complete camera model is then available to compare the theoretical projection of a point on the optical axis with its actual position as the camera zooms.
- the previous formulation of the distortion shows that it is necessary to know the position of the optical center to estimate it.
- the calibration uses a pattern consisting of a set of coplanar points placed in a regular and known manner.
- one solution consists in comparing the theoretical projection of the set of points with the observed effective projection, which implies the knowledge of the extrinsic parameters of position and orientation of the test pattern with respect to the camera, as well as the intrinsic parameters of horizontal and vertical focal lengths.
- the distortion is estimated for several focal lengths of the camera.
- the distortion does not depend on the orientation of the camera which is preferably left in central orientation during the distortion calibration step.
- the first phase of the calibration, for a given focal length expressed not by its metric value but by the value of the camera's encoder associated with the zoom factor, consists in placing a test pattern in front of the camera lens and analyzing the projection of the points of the test pattern in the image.
- the points of the test pattern are a set of coplanar points placed in a regular way (fixed and known horizontal and vertical spacings).
- the test pattern is characterized by choosing one of its points as a reference. This reference point defines the frame of the test pattern, in which all the other points of the test pattern can be expressed.
- expressing the configuration of the test pattern with respect to the camera thus amounts to expressing the configuration (position and orientation) of this reference point with respect to the optical center of the camera.
- Any point N of the test pattern can therefore be expressed, in the local coordinate system M of the test pattern, by the following relation,
- this point of the test pattern, projected onto the image plane of the camera, has image coordinates (u, v) that generate a squared error En = (u − un)² + (v − vn)² with respect to the measurement (un, vn).
- the calibration steps are as follows,
- This calibration phase of the distortion can be automated.
- the test pattern, controllable by computer, is placed on a mechanical rail so that it can be moved away from the camera.
- a set of test patterns is placed in front of the camera, each at an angle and a distance chosen to ensure that, for each zoom factor used for the calibration, there is a heading configuration from which one test pattern, and only one, is observed at a suitable distance.
- the distortion compensation cannot compensate for a roll defect of the image sensor.
- such a defect corresponds to a roll angle which must be estimated. This angle is considered to be independent of the zoom factor.
- to estimate the roll angle, it is necessary to determine a reference point situated in the observed scene, substantially at the center of the image coming from the camera, and to make a heading movement of the camera. If the angle is zero, the reference point must remain permanently on the mid-height line, or horizon line, of the image coming from the camera; otherwise, it passes below or above this mid-height line. The roll angle is then estimated so that the reference point has a perfectly horizontal movement in the image. This step only needs to be performed for a single zoom value, for example the lowest zoom value.
- this step can be automated by following a point of the scenery during a camera heading movement and calculating the compensation to be applied so that the movement is horizontal in the image. For more robustness, this calculation can be performed on a set of points, the roll angle being evaluated as an average.
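- One possible automation of this roll estimate (a sketch under the assumption that pure heading sweeps are available and points can be tracked reliably):

```python
import math

def estimate_roll_deg(trajectories):
    """Estimate the sensor roll from image trajectories of scenery points
    recorded during a pure heading (pan) movement; each trajectory is a
    list of (u, v) image points with at least two samples. A perfectly
    horizontal trajectory gives a zero angle."""
    angles = []
    for track in trajectories:
        (u0, v0), (u1, v1) = track[0], track[-1]
        angles.append(math.atan2(v1 - v0, u1 - u0))
    return math.degrees(sum(angles) / len(angles))  # average over points
```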
- the zoom factor is controlled by indicating a given value to the encoder in charge of the zoom in order to calibrate the distortion. The focal value is therefore not directly available.
- the principle is to create a mapping between the values of the zoom encoder and the field-of-view angle θu. The use of the field of view makes it possible to determine the value of the focal length in pixels according to the following relation:
fu = FoVW / (2 · tan(θu / 2))
- where FoVW is the width of the field of view in the image, expressed in pixels (Field Of View Width), and θu is the horizontal field-of-view angle.
- a solution consists in aiming at a point or a vertical line of the scenery located at the center of the image when the heading is zero, and changing the heading angle until the point or line is precisely on the right or left edge of the image. This makes it possible to measure the half-angle of the horizontal field of view and thus to calculate the value of the horizontal focal length.
- the value of the horizontal focal length makes it possible to deduce that of the vertical focal length from the known ratio of the focal lengths. This solution is applicable if the relation between the heading encoder and the heading is linear, known and symmetrical with respect to the heading center.
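- The corresponding computation is elementary trigonometry; the values used below are illustrative only:

```python
import math

def focal_px_from_half_fov(half_fov_deg: float, image_width_px: int) -> float:
    """Horizontal focal length in pixels from the measured half field of view."""
    return (image_width_px / 2.0) / math.tan(math.radians(half_fov_deg))

f_u = focal_px_from_half_fov(22.5, 1920)  # illustrative values
```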
- This procedure could be automated by following a point or line of scenery during a heading movement, until the tracked item disappears.
- the heading encoder then directly gives the value of the half horizontal field of view.
- the decentering can be corrected by a rotation combining heading and pitch, to avoid inserting this decentering into the intrinsic parameters.
- the heading and pitch errors with respect to the optical axis of the camera are thus compensated not by complexifying the camera model but by means of an offset on the position and the orientation of the camera.
- This heading and pitch offset is preferably measured for all zoom factors. The measurement is made, for example, by aiming at a relatively distant point of the scene while using the lowest zoom factor. For each zoom factor increase, the heading and pitch offset is set manually on the virtual camera so that the virtual point associated with the target point remains superimposed on it.
- this procedure can be automated by following a point of the scene during a zooming motion and offsetting the pixel error by a shift in heading and pitch.
- the calibration of the offset between the optical center and the center of rotation of the camera is the last phase of the calibration step.
- This calibration of the offset between the optical center and the center of rotation of the camera is not necessary if the filmed scene and the position of the virtual objects in the filmed scene are always far from the camera. Indeed, in the other calibration steps, it was considered that the optical center and the physical center of rotation of the camera were at the same place. This can be false but has almost no impact as soon as the visual elements are a few tens of centimeters away. However, for close points, the previous compensations are not sufficient, and taking this offset of the optical center into account may be necessary. It is considered here that the offset exists only along the optical axis, which is consistent with the fact that the previous calibrations are intended to compensate for the offset, except for its optical-axis component.
- one solution consists in aiming at a point of the scenery located physically close to the camera and whose physical distance with respect to the camera has previously been measured. This point is preferably chosen in the middle of the image when the line of sight of the camera is oriented with zero heading and pitch. For each zoom factor increase, the camera heading is changed until the target point is at the edge of the image. The offset is then adjusted manually so that the virtual point associated with the real point is superimposed on it. It is possible to automate the calibration of the shift between the optical center and the center of rotation of the camera by automatically following a point close to the scene by varying the zoom factor and compensating for the pixel error by a shift in translation. along the optical axis.
- while the calibration of the camera is necessary to obtain a good integration of the virtual objects in the images coming from the camera, it is also necessary to know precisely the position and the orientation of the object with respect to the camera or, equivalently, the configuration of the real camera with respect to the real world that it films.
- This calculation is called a "pose" calculation.
- To perform this pose calculation it is necessary to relate an object of the three-dimensional scene with its two-dimensional image. For this, it is necessary to have the coordinates of several points of the object in the real scene (three-dimensional coordinates), all expressed with respect to a point of the object considered as object reference during the calculation of the pose. For this set of points, it is also necessary to have coordinates of their projection in the image (two-dimensional coordinates).
- a multi-level pose evaluation is then implemented.
- a first level consists in performing a quick calculation to obtain an approximate pose, while a second level, using a longer calculation, starts from the approximate pose and improves the pose estimate iteratively.
- the first-level estimate is based on a first-order approximation of the perspective projection model, known as "weak perspective".
- This method is fast and robust as long as the points chosen in real space are distributed over the entire surface of the object and are not coplanar. In addition, to converge, the object must be visible roughly at the center of the image and located relatively far from the camera.
- This known method is presented, for example, in the article "Model-Based Object Pose in 25 Lines of Code", D. DeMenthon and L. S. Davis, International Journal of Computer Vision, 15, pp. 123-141, June 1995, and in the article "Object Pose: The Link between Weak Perspective, Paraperspective, and Full Perspective", R. Horaud, F. Dornaika, B. Lamiroy and S. Christy, International Journal of Computer Vision, Volume 22, No. 2, 1997.
- This method performs a pose calculation by choosing one of the points of the object, in three-dimensional space, as a reference frame.
- the quality of the pose estimate is not the same depending on the chosen reference point, and it is sometimes useful to remove some points to obtain a better result, while keeping at least five non-coplanar points.
- if the object comprises r points in three-dimensional space, it is possible to perform r · (r − 5) pose calculations by taking each of the r points as a reference point and then deleting, at each iteration, among the other points, the point furthest from the reference point. For each of these pose calculations, an average of the reprojection error in the image plane is calculated. The final pose retained is the one corresponding to the smallest error.
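- The following sketch illustrates this multi-start strategy, using OpenCV's solvePnP as a stand-in for the DeMenthon-Davis first-level algorithm (the patent does not prescribe OpenCV; array shapes and names are assumptions, and the iterative point pruning is omitted for brevity):

```python
import numpy as np
import cv2

def best_pose(points_3d, points_2d, camera_matrix, dist_coeffs):
    """Try every model point as the reference and keep the pose with the
    smallest mean reprojection error. points_3d: Nx3, points_2d: Nx2."""
    best = None
    for ref in range(len(points_3d)):
        # Express all model points relative to the chosen reference point.
        shifted = (points_3d - points_3d[ref]).astype(np.float32)
        ok, rvec, tvec = cv2.solvePnP(shifted, points_2d.astype(np.float32),
                                      camera_matrix, dist_coeffs)
        if not ok:
            continue
        proj, _ = cv2.projectPoints(shifted, rvec, tvec,
                                    camera_matrix, dist_coeffs)
        err = float(np.mean(np.linalg.norm(proj.reshape(-1, 2) - points_2d,
                                           axis=1)))
        if best is None or err < best[0]:
            best = (err, rvec, tvec, ref)
    return best  # (error, rvec, tvec, reference index)
```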
- the second-level estimation uses a stochastic exploration, guided by an error criterion, of the configuration space, that is to say the six-dimensional space corresponding to the set of pairs (position, orientation) close to the current pose.
- the idea is to start from the current pose and then to apply a small random offset in the configuration space. If the new configuration better satisfies the error criterion, it becomes the new reference; otherwise, another random offset is tried.
- the steps of this method are as follows,
- the error is preferably calculated by summing, for each point, the squared orthogonal distance between the point P of the object, in three-dimensional space, placed in the considered pose, and the viewing ray in three-dimensional space associated with that point, that is to say the straight line defined by the projection center of the camera and the unit vector u whose direction is given by the camera center and the projection of the considered point in the two-dimensional image.
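- A sketch of this second-level stochastic exploration with the point-to-ray error criterion, assuming distortion has already been compensated and the viewing rays have been back-projected beforehand (the rotation update is first-order; re-orthonormalization is omitted for brevity):

```python
import numpy as np

def point_to_ray_error(points_3d, rays, R, t):
    """Sum of squared orthogonal distances between the posed 3D points and
    the viewing rays (unit vectors through the camera center)."""
    err = 0.0
    for P, u in zip(points_3d, rays):
        p_cam = R @ P + t                # model point in the camera frame
        along = float(np.dot(p_cam, u))  # component along the ray
        err += float(np.sum((p_cam - along * u) ** 2))
    return err

def refine_pose(points_3d, rays, R, t, iters=2000, step=0.01, rng=None):
    """Random walk in the 6-D configuration space: keep an offset only if
    it improves the error criterion."""
    rng = rng or np.random.default_rng(0)
    best = point_to_ray_error(points_3d, rays, R, t)
    for _ in range(iters):
        dt = rng.normal(0.0, step, 3)    # small random translation offset
        w = rng.normal(0.0, step, 3)     # small random rotation vector
        Wx = np.array([[0, -w[2], w[1]],
                       [w[2], 0, -w[0]],
                       [-w[1], w[0], 0]])
        R2, t2 = R @ (np.eye(3) + Wx), t + dt
        e = point_to_ray_error(points_3d, rays, R2, t2)
        if e < best:
            R, t, best = R2, t2, e
    return R, t, best
```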
- When the camera is calibrated and its configuration (position/orientation) relative to the environment is known, it is possible to insert virtual objects, or any other element such as a secondary video stream, into the images coming from the camera.
- the insertion of the virtual objects in the video stream can be realized in real time by the D'Fusion software of the company Total Immersion, as indicated above.
- the choice of virtual objects to be inserted into the video stream can be made by geolocation, by building a database. This database can be created manually or from existing databases via, for example, a connection to a network such as the Internet.
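- As a hypothetical illustration of such geolocation-based selection (crude flat-earth bearings, field names assumed), one could filter the database by the camera's current heading and horizontal field of view:

```python
import math

def visible_pois(pois, observer_lat, observer_lon, heading_deg, hfov_deg):
    """Keep the entries whose bearing from the observation point falls
    inside the current horizontal field of view. 'pois' is an iterable of
    dicts with 'lat', 'lon' and 'label' keys (assumed schema)."""
    selected = []
    for poi in pois:
        dy = poi["lat"] - observer_lat
        dx = poi["lon"] - observer_lon
        # Crude flat-earth bearing, adequate only for nearby points.
        bearing = math.degrees(math.atan2(dx, dy)) % 360.0
        delta = (bearing - heading_deg + 180.0) % 360.0 - 180.0
        if abs(delta) <= hfov_deg / 2.0:
            selected.append(poi)
    return selected
```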
- the camera of the observation device can be remote, as shown in FIG. 7. According to this embodiment, one or more cameras are installed at locations that are not directly accessible to users, such as at the top of a building, in a cave or under water.
- FIG. 7 illustrates an observation device 300' comprising a sphere 305' on which are movably attached a control interface 310' and a display system 320'.
- the sphere 305' is preferably coupled to a foot 400' mounted on a base 405' comprising a computer 500', which can also be remote.
- a step 410' allows the user to position themselves appropriately facing the control interface 310' and the display system 320'.
- the camera 335 is here remote, fixed on a chimney 700 of a house 705.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Astronomy & Astrophysics (AREA)
- Optics & Photonics (AREA)
- Studio Devices (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0752641A FR2911463B1 (en) | 2007-01-12 | 2007-01-12 | REAL-TIME REALITY REALITY OBSERVATION DEVICE AND METHOD FOR IMPLEMENTING A DEVICE |
PCT/FR2008/000030 WO2008099092A2 (en) | 2007-01-12 | 2008-01-10 | Device and method for watching real-time augmented reality |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2119226A2 true EP2119226A2 (en) | 2009-11-18 |
Family
ID=38372367
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP08761752A Withdrawn EP2119226A2 (en) | 2007-01-12 | 2008-01-10 | Device and method for watching real-time augmented reality |
Country Status (4)
Country | Link |
---|---|
US (1) | US20100045700A1 (en) |
EP (1) | EP2119226A2 (en) |
FR (1) | FR2911463B1 (en) |
WO (1) | WO2008099092A2 (en) |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8817045B2 (en) | 2000-11-06 | 2014-08-26 | Nant Holdings Ip, Llc | Interactivity via mobile image recognition |
KR101019569B1 (en) | 2005-08-29 | 2011-03-08 | 에브릭스 테크놀로지스, 인코포레이티드 | Interactivity via mobile image recognition |
US8239132B2 (en) | 2008-01-22 | 2012-08-07 | Maran Ma | Systems, apparatus and methods for delivery of location-oriented information |
JP2011530708A (en) * | 2008-08-13 | 2011-12-22 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Measurement and correction of lens distortion in a multi-spot scanner |
FR2944932B1 (en) * | 2009-04-27 | 2012-06-01 | Scutum | METHOD AND SYSTEM FOR DISSEMINATION OF INFORMATION THROUGH A COMMUNICATION NETWORK |
KR20150008840A (en) | 2010-02-24 | 2015-01-23 | 아이피플렉 홀딩스 코포레이션 | Augmented reality panorama supporting visually imparired individuals |
US9514654B2 (en) | 2010-07-13 | 2016-12-06 | Alive Studios, Llc | Method and system for presenting interactive, three-dimensional learning tools |
USD654538S1 (en) | 2011-01-31 | 2012-02-21 | Logical Choice Technologies, Inc. | Educational card |
USD675648S1 (en) | 2011-01-31 | 2013-02-05 | Logical Choice Technologies, Inc. | Display screen with animated avatar |
USD647968S1 (en) | 2011-01-31 | 2011-11-01 | Logical Choice Technologies, Inc. | Educational card |
USD648391S1 (en) | 2011-01-31 | 2011-11-08 | Logical Choice Technologies, Inc. | Educational card |
USD648390S1 (en) | 2011-01-31 | 2011-11-08 | Logical Choice Technologies, Inc. | Educational card |
USD648796S1 (en) | 2011-01-31 | 2011-11-15 | Logical Choice Technologies, Inc. | Educational card |
US8711186B2 (en) * | 2011-05-02 | 2014-04-29 | Microvision, Inc. | Scanning projection apparatus with tangential compensation |
US9600933B2 (en) | 2011-07-01 | 2017-03-21 | Intel Corporation | Mobile augmented reality system |
EP2798847B1 (en) * | 2011-12-30 | 2018-08-22 | Barco NV | Method and system for determining image retention |
FR2998680B1 (en) * | 2012-11-26 | 2015-01-16 | Laurent Desombre | NAVIGATION METHOD IN AN ENVIRONMENT ASSOCIATED WITH AN INTERACTIVE PERISCOPE WITH A VIRTUAL REALITY |
US10262462B2 (en) | 2014-04-18 | 2019-04-16 | Magic Leap, Inc. | Systems and methods for augmented and virtual reality |
WO2015054387A1 (en) * | 2013-10-10 | 2015-04-16 | Selverston Aaron | Outdoor, interactive 3d viewing apparatus |
KR101592740B1 (en) * | 2014-07-24 | 2016-02-15 | 현대자동차주식회사 | Apparatus and method for correcting image distortion of wide angle camera for vehicle |
US10196005B2 (en) | 2015-01-22 | 2019-02-05 | Mobileye Vision Technologies Ltd. | Method and system of camera focus for advanced driver assistance system (ADAS) |
US20180255285A1 (en) * | 2017-03-06 | 2018-09-06 | Universal City Studios Llc | Systems and methods for layered virtual features in an amusement park environment |
RU2682588C1 (en) * | 2018-02-28 | 2019-03-19 | Федеральное государственное автономное научное учреждение "Центральный научно-исследовательский и опытно-конструкторский институт робототехники и технической кибернетики" (ЦНИИ РТК) | Method of high-precision calibration of digital video channel distortion |
WO2021101524A1 (en) * | 2019-11-19 | 2021-05-27 | Hewlett-Packard Development Company, L.P. | Determining a preferred region of a scanner |
US11145117B2 (en) | 2019-12-02 | 2021-10-12 | At&T Intellectual Property I, L.P. | System and method for preserving a configurable augmented reality experience |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060238861A1 (en) * | 2005-04-20 | 2006-10-26 | Baun Kenneth W | High definition telescope |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2001070A1 (en) * | 1988-10-24 | 1990-04-24 | Douglas B. George | Telescope operating system |
US5448053A (en) * | 1993-03-01 | 1995-09-05 | Rhoads; Geoffrey B. | Method and apparatus for wide field distortion-compensated imaging |
US5815411A (en) * | 1993-09-10 | 1998-09-29 | Criticom Corporation | Electro-optic vision system which exploits position and attitude |
US5745387A (en) * | 1995-09-28 | 1998-04-28 | General Electric Company | Augmented reality maintenance system employing manipulator arm with archive and comparison device |
US6665454B1 (en) * | 1997-07-15 | 2003-12-16 | Silverbrook Research Pty Ltd | Dot adjacency compensation in optical storage systems using ink dots |
EP1125156A4 (en) * | 1998-10-26 | 2006-06-28 | Meade Instruments Corp | Fully automated telescope system with distributed intelligence |
US20020082498A1 (en) * | 2000-10-05 | 2002-06-27 | Siemens Corporate Research, Inc. | Intra-operative image-guided neurosurgery with augmented reality visualization |
US6765569B2 (en) * | 2001-03-07 | 2004-07-20 | University Of Southern California | Augmented-reality tool employing scene-feature autocalibration during camera motion |
US20020171924A1 (en) * | 2001-05-15 | 2002-11-21 | Varner Jerry W. | Telescope viewing system |
US8688833B1 (en) * | 2001-11-08 | 2014-04-01 | Oceanit Laboratories, Inc. | Autonomous robotic telescope system |
US7138963B2 (en) * | 2002-07-18 | 2006-11-21 | Metamersion, Llc | Method for automatically tracking objects in augmented reality |
US7427996B2 (en) * | 2002-10-16 | 2008-09-23 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
WO2004107012A1 (en) * | 2003-05-30 | 2004-12-09 | Vixen Co., Ltd. | Automatic introduction system of celestial body |
US7443404B2 (en) * | 2003-10-17 | 2008-10-28 | Casio Computer Co., Ltd. | Image display apparatus, image display controlling method, and image display program |
US8521411B2 (en) * | 2004-06-03 | 2013-08-27 | Making Virtual Solid, L.L.C. | En-route navigation display method and apparatus using head-up display |
US20060103926A1 (en) * | 2004-11-12 | 2006-05-18 | Imaginova Corporation | Telescope system and method of use |
US7339731B2 (en) * | 2005-04-20 | 2008-03-04 | Meade Instruments Corporation | Self-aligning telescope |
EP1931945B1 (en) * | 2005-09-12 | 2011-04-06 | Trimble Jena GmbH | Surveying instrument and method of providing survey data using a surveying instrument |
US20080018995A1 (en) * | 2006-07-21 | 2008-01-24 | Baun Kenneth W | User-directed automated telescope alignment |
-
2007
- 2007-01-12 FR FR0752641A patent/FR2911463B1/en not_active Expired - Fee Related
-
2008
- 2008-01-10 US US12/522,948 patent/US20100045700A1/en not_active Abandoned
- 2008-01-10 EP EP08761752A patent/EP2119226A2/en not_active Withdrawn
- 2008-01-10 WO PCT/FR2008/000030 patent/WO2008099092A2/en active Application Filing
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060238861A1 (en) * | 2005-04-20 | 2006-10-26 | Baun Kenneth W | High definition telescope |
Also Published As
Publication number | Publication date |
---|---|
FR2911463A1 (en) | 2008-07-18 |
WO2008099092A3 (en) | 2008-10-02 |
WO2008099092A2 (en) | 2008-08-21 |
FR2911463B1 (en) | 2009-10-30 |
US20100045700A1 (en) | 2010-02-25 |
WO2008099092A8 (en) | 2010-05-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2119226A2 (en) | Device and method for watching real-time augmented reality | |
EP2715662B1 (en) | Method for localisation of a camera and 3d reconstruction in a partially known environment | |
EP2923330B1 (en) | Method of 3d reconstruction and 3d panoramic mosaicing of a scene | |
EP0661672B1 (en) | Picture processing process and device for building a target image from a source image with perspective change | |
EP0661671B1 (en) | Picture processing process and device for building a target image from a source image with perspective change | |
EP2385405B1 (en) | Panomaric projection device and method implemented by said device | |
EP4254333A2 (en) | Augmented reality method and devices using a real time automatic tracking of marker-free textured planar geometrical objects in a video stream | |
WO2011144408A1 (en) | Method and system for fusing data arising from image sensors and from motion or position sensors | |
EP2791914A1 (en) | System for filming a video movie | |
EP0971319A2 (en) | Modelisation process for image capturing system and process and system for combining real images with synthetic pictures | |
EP2111605A2 (en) | Method and device for creating at least two key images corresponding to a three-dimensional object | |
EP1168831B1 (en) | Method for camera calibration | |
FR3027144A1 (en) | METHOD AND DEVICE FOR DETERMINING MOVEMENT BETWEEN SUCCESSIVE VIDEO IMAGES | |
EP3008693A1 (en) | System for tracking the position of the shooting camera for shooting video films | |
WO2018229358A1 (en) | Method and device for constructing a three-dimensional image | |
FR3052287B1 (en) | CONSTRUCTION OF A THREE-DIMENSIONAL IMAGE | |
EP3072110A1 (en) | Method for estimating the movement of an object | |
FR3047831A1 (en) | METHOD FOR DETERMINING THE SPEED OF MOVING OBJECTS IN A SCENE | |
FR3030843A1 (en) | METHOD FOR PROJECTING IMAGES ON A THREE-DIMENSIONAL HYBRID SCENE | |
FR2964203A1 (en) | Image acquiring device e.g. camera, for use in photograph field of two-dimensional effects, has target determining unit determining part of panoramic, where viewing axis of device is passed through optical center of lens and sensor | |
FR2737071A1 (en) | Matrix array camera movement system for site surveillance and e.g. being mounted on platform on boat - has random position camera with position picture and coordinate memorisation and display, and is provided with gyroscope | |
FR2962871A3 (en) | Image acquisition device for e.g. camera in video-conferencing, has scenery elements seen from device, and photographic objective with optical axis made positioned with respect to device when no correction by line of sight of device is made | |
WO2012093209A1 (en) | Method and device for assisting in the shooting of a digital photo using a wide-angle lens |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20090729 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR |
|
DAX | Request for extension of the european patent (deleted) | ||
17Q | First examination report despatched |
Effective date: 20110127 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06F 3/01 20060101ALI20130628BHEP Ipc: H04N 5/232 20060101ALI20130628BHEP Ipc: G02B 23/00 20060101AFI20130628BHEP Ipc: H04N 5/445 20110101ALI20130628BHEP Ipc: H04N 5/272 20060101ALI20130628BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20131125 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20140408 |