EP1828928A1 - Method and system for identifying an object in a photo; program, recording medium, terminal and server for the implementation of the system - Google Patents
Method and system for identifying an object in a photo; program, recording medium, terminal and server for the implementation of the system
- Publication number
- EP1828928A1 (application EP05802703A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- geographical position
- module
- extracted
- photo
- objective
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
Definitions
- the present invention relates to a method and system for identifying an object in a photo, as well as a program, a recording medium, a terminal and a server for the implementation of the system.
- the object of the invention is to remedy this drawback by proposing a method of automatic identification of an object in a photo.
- the subject of the invention is therefore a method of automatically identifying an object in a photograph taken from a camera equipped with an objective, this method comprising:
- the above method allows automatic identification of at least one object in the photo.
- this method takes advantage of the fact that, once the geographical position and the aiming direction of the objective are known, it is possible to select, in a cartographic database, at least one object corresponding to one of those photographed. Information about the selected object can then be used to identify that object in the photo.
- the embodiments of this method may include one or more of the following features:
- the invention also relates to a consultation process and a selection process adapted to be implemented in the identification method described above.
- the invention also relates to a computer program and an information recording medium comprising instructions for executing an identification method, a consultation process or a selection process such as those described above, when the instructions are executed by an electronic calculator.
- the invention also relates to a system for the automatic identification of an object in a photo taken with a camera equipped with a lens, this system comprising:
- the selection module is also able to select only the object closest to the extracted geographical position among the objects selected as being closest to the determined oriented line; the selection module is also able to select the object or objects according to an angle of view of the objective.
- the invention also relates to a consultation terminal and a computer server adapted to be implemented in the system described above.
- FIG. 1 is a schematic illustration of the general architecture of a system for automatic identification of an object in a photo
- FIG. 2 is a schematic illustration of the architecture of a particular embodiment of the system of FIG. 1;
- Figure 3 is a flowchart of a method of automatically identifying an object in a photo;
- FIG. 4 is a diagram illustrating a method for correcting a direction as a function of the position of a point on a photograph.
- Figure 1 shows a system, designated by the general reference 40, for identifying an object visible in a photo.
- each photo is associated with data subsequently called "metadata", such as, for example, the data encountered in the EXIF photo recording format.
- This metadata includes: the geographical position of the lens of the camera used to take the picture, at the moment this picture was taken,
- geographical position refers to coordinates in a three-dimensional reference frame, these coordinates being representative of the latitude, longitude and altitude of the position.
- the geographical position and the direction of aiming of the objective are measured at the moment when the photo is taken and then recorded in the metadata associated with this photo.
- the angle of view or the focal length and the format of the photo are read and recorded in the metadata associated with this photo.
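As a concrete illustration of the kind of metadata involved, the sketch below reads the geographical position, the aiming direction and the angle of view from an already-parsed, EXIF-like record. The dictionary layout and the field names (gps_latitude, aim_azimuth_deg, angle_of_view_deg, ...) are assumptions made for this sketch, not actual EXIF tag names; only the use of the EXIF format is stated in the text. The degree/minute/second conversion reflects how EXIF stores GPS coordinates.

```python
# Minimal sketch, assuming the metadata has already been parsed into a plain
# dictionary; the key names below are hypothetical, not real EXIF tag names.

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert a degree/minute/second triple plus hemisphere reference to signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

def extract_shot_metadata(meta):
    """Return the fields used by the identification system for one photo."""
    return {
        "latitude": dms_to_decimal(*meta["gps_latitude"], meta["gps_latitude_ref"]),
        "longitude": dms_to_decimal(*meta["gps_longitude"], meta["gps_longitude_ref"]),
        "altitude_m": meta.get("gps_altitude_m", 0.0),
        "aim_azimuth_deg": meta["aim_azimuth_deg"],       # direction of aim, relative to magnetic north
        "aim_elevation_deg": meta.get("aim_elevation_deg", 0.0),
        "angle_of_view_deg": meta["angle_of_view_deg"],   # horizontal angle of view of the lens
    }

example = {
    "gps_latitude": (43, 39, 30.0), "gps_latitude_ref": "N",
    "gps_longitude": (6, 55, 45.0), "gps_longitude_ref": "E",
    "aim_azimuth_deg": 45.0, "angle_of_view_deg": 40.0,
}
print(extract_shot_metadata(example))
```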
- the metadata and photos are stored in a memory 42.
- the system 40 includes a unit 44 for processing the metadata stored in the memory 42.
- the unit 44 includes a module 48 for extracting the geographical position of the objective, the direction of aim of the objective and the angle of view of the objective from the metadata recorded in the memory 46.
- the unit 44 also comprises a module 50 for acquiring the coordinates of a point on a photo and a module 52 for correcting the direction extracted by the module 48.
- the module 50 is able to acquire the coordinates of a point on a photo in a two-dimensional orthonormal frame whose origin, for example, coincides with the center of the photo.
- This module has an output connected to the module 52 to transmit to the module 52 the coordinates acquired.
- the module 52 is able to correct the direction extracted by the module 48 to produce a corrected direction that passes through the geographical position of the point of view and through a geographical position corresponding to the point of the photo whose coordinates have been acquired.
- for this, the module 52 uses the angle of view of the camera.
- the angle of view data is extracted from the metadata contained in the memory 46.
- the angle of view defines the limits of the scene visible through the lens of the camera.
- the unit 44 also has two outputs connected to a database engine 60 for transmitting to the latter the position extracted by the module 48 and the corrected direction.
- the engine 60 is adapted to select an object in a map database 62 stored in a memory 64.
- the database 62 contains the geographical position of a large number of objects, each associated with an identifier of the object. These objects are, for example, historical monuments, mountains or place names. Here, each of these objects is one that can be seen and identified with the naked eye by a human being.
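A minimal sketch of how one entry of such a cartographic database could be represented; the only fields required by the text are an identifier and a geographical position, so everything else here (field names, sample values) is an assumption for illustration.

```python
from dataclasses import dataclass

@dataclass
class MapObject:
    """One entry of the cartographic database: an identifier and a geographical position."""
    identifier: str
    latitude: float    # degrees
    longitude: float   # degrees
    altitude_m: float  # metres

# A tiny in-memory stand-in for the database (sample values are purely illustrative).
MAP_DATABASE = [
    MapObject("bell_tower_01", 43.6585, 6.9240, 350.0),
    MapObject("mountain_peak_02", 43.7500, 6.9900, 1250.0),
]
```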
- the engine 60 comprises a module 66 for determining a straight line and a module 68 for selecting an object near the determined straight line.
- the module 66 determines the equation of the line passing through the extracted geographical position and having as its direction the direction corrected by the module 52.
- the module 68 is able to select in the database 62 the object or objects closest to the straight line determined by the module 66 and which are visible in the photo. This module 68 will be described in more detail with reference to FIG.
- the engine 60 has an output through which the identifiers of the objects selected by the module 68 are transmitted. This output is connected to a unit 70 for presenting information on the selected object or objects.
- the engine 60 is preferably made in the form of a computer program comprising instructions for executing a selection method as described with reference to FIG. 3, when these instructions are executed by an electronic calculator.
- the unit 70 comprises a module 72 for creating a legend from complementary information contained in a database 74 stored in a memory 76.
- the database 74 associates with each object identifier additional information such as, for example, the name of the object, its intrinsic characteristics and its history. This information is saved in a format appropriate for presentation. For example, here the names of the objects are saved as alphanumeric strings while the history of an object is saved as an audio file.
- the unit 70 also comprises a man / machine interface 78.
- this man/machine interface 78 is equipped with a loudspeaker 80 capable of playing audio files to a user and a screen 82 able to display the photograph taken by the camera, in which the legend created by the module 72 is, for example, overlaid.
- FIG. 2 represents a particular embodiment of the system 40.
- the elements already described with reference to Figure 1 bear the same reference numerals in Figure 2.
- the system 40 comprises a computer server 86 connected via a network 84 for transmitting information to a terminal 88 for viewing photos.
- FIG. 2 also shows a camera 90 equipped with a lens 92.
- the lens 92 has a viewing direction 94 which corresponds to the optical axis of this lens.
- This apparatus 90 is able to record in the memory 42 of the system 40 the photos as well as the corresponding metadata including in particular the geographical position, the direction of view and the angle of view for each of these photos.
- the apparatus 90 is equipped with a unit 96 for measuring the geographic position and the aiming direction of the objective 92.
- this unit 96 is produced using a geographical position sensor 97 and an orientation sensor 98.
- the sensor 97 is, for example, a GPS (Global Positioning System) sensor and the sensor 98 is, for example, made using three gyroscopes arranged perpendicularly to each other.
- the unit 96 is also able to read the settings of the device 90 such as the angle of view of the lens, the date, the time and the brightness.
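Before the aiming direction can be used for geometry, the measured azimuth and elevation must be turned into a direction vector. The sketch below does this in a local east-north-up frame; this conversion is standard trigonometry and is not taken from the patent, which does not detail how the orientation measurements are combined.

```python
import math

def aiming_direction(azimuth_deg, elevation_deg):
    """Unit vector (east, north, up) for an azimuth measured clockwise from north
    and an elevation measured from the horizontal."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (math.cos(el) * math.sin(az),   # east component
            math.cos(el) * math.cos(az),   # north component
            math.sin(el))                  # up component

# A lens aimed due north-east and level with the horizon:
print(aiming_direction(45.0, 0.0))   # approximately (0.707, 0.707, 0.0)
```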
- the apparatus 90 is capable of recording the photos and the corresponding metadata in the memory 42 via an information transmission link 99 such as, for example, a wireless link.
- the device 90 is, for example, a digital camera or a mobile phone equipped with a camera.
- the server 86 is equipped with a modem 100 for exchanging information with the terminal 88 via the network 84.
- the database engine 60 and the module 72 for creating a legend are located in the server 86.
- the databases 62 and 74 of the system 40 have been grouped into a single database 104 stored in a memory 105 associated with the server 86.
- the database 104 groups for each object its identifier, its geographical position as well as the additional information concerning it.
- the memory 105 also includes, for example, the instructions of the computer program corresponding to the engine 60 and the module 72, the server 86 then fulfilling the role of the electronic calculator able to execute these instructions.
- the terminal 88 is, for example, made from a conventional computer equipped with a central unit 110 and the man / machine interface 78.
- the unit 110 is provided with a modem 112 for exchanging information with the computer server 86 via the network 84.
- the modules 48, 50 and 52 are located in the central unit 110.
- This central unit 110 is associated with the memory 42 containing the photos and the metadata.
- the memory 46 includes the instructions of a computer program corresponding to the modules 48, 50 and 52, and the central unit 110 then plays the role of the electronic calculator able to execute these instructions.
- the screen and a speaker of the computer correspond respectively to the screen 82 and the speaker 80 of the interface 78.
- in this embodiment, the interface 78 also comprises a mouse 120 and a keyboard 122.
- a user of the apparatus 90 takes a picture during a step 140.
- the metadata associated with the picture that has just been taken are created during a step 144. More specifically, during an operation 146, the sensor 97 measures the position of the apparatus 90 and the sensor 98 measures the orientation of the direction 94 relative to the horizontal and relative to the magnetic north. The inclination of the apparatus 90 relative to the horizontal is also measured during this operation 146 to determine the inclination of the photo relative to the horizontal.
- the unit 96 also records, during an operation 152, the settings of the camera used to take the picture.
- the camera 90 records the angle of view of the lens at the moment the picture is taken.
- other information such as, for example, the date, the time, the brightness and the exposure time is also recorded during this operation 152.
- the metadata is associated, during a step 154, with the photograph taken during the step 140.
- the photo as well as the metadata are recorded in a format
- the metadata and the photo are transmitted via the link 99 and then recorded, in a step 156, in the memory 42.
- a user of the terminal 88 can, if he wishes, proceed to a phase 162 of automatic creation of a legend for one of the photos recorded in the memory 42. In this phase 162, the terminal 88 transmits to the engine 60, during a step 164, the geographical position, the direction of view and the angle of view associated with one of the photos stored in the memory 42.
- the engine 60 receives the data transmitted in step 164.
- the engine 60 then selects, during a step 166, at least one object in the database 104 according to the received data. More specifically, during the step 166, the module 66 determines, during an operation 168, the oriented line passing through the received geographical position and having as its direction the received direction of sight. Then, during an operation 170, the module 68 selects from the database 104 the object or objects whose geographical position is closest to the oriented line determined during the operation 168. For this, for example, the module 68 calculates the shortest distance separating each object from the oriented line and selects only the object or objects separated from the oriented line by a distance less than a threshold. This threshold is set by the module 68 as a function of the value of the received angle of view, so as to eliminate all the objects that are not visible in the photo. In addition, this threshold is determined so as to select only the objects lying along the received direction.
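The following sketch illustrates one possible form of the selection performed during operation 170, under simplifying assumptions: positions are expressed in a local planar east-north frame in metres, the oriented line is treated as a ray starting at the camera, and the distance threshold grows with the received angle of view. The exact threshold rule and the maximum range are assumptions; the text only requires that objects not visible in the photo, or not along the received direction, be eliminated.

```python
import math

def select_objects(camera_en, direction_en, angle_of_view_deg, objects, max_range_m=20_000.0):
    """Return the identifiers of objects closest to the oriented line (a ray from the
    camera along the aiming direction) that plausibly lie inside the photo.

    camera_en     -- camera position (east, north) in metres, in a local planar frame
    direction_en  -- unit aiming direction (east, north)
    objects       -- iterable of (identifier, (east, north)) pairs
    """
    selected = []
    for identifier, pos in objects:
        dx, dy = pos[0] - camera_en[0], pos[1] - camera_en[1]
        along = dx * direction_en[0] + dy * direction_en[1]        # projection on the ray
        if along <= 0 or along > max_range_m:
            continue                                               # behind the camera or too far away
        across = abs(dx * direction_en[1] - dy * direction_en[0])  # shortest distance to the line
        # Threshold proportional to the angle of view: objects outside the field
        # of the photo are eliminated (assumed rule: a half-angle cone).
        threshold = along * math.tan(math.radians(angle_of_view_deg) / 2.0)
        if across <= threshold:
            selected.append((across, identifier))
    selected.sort()                                                # closest to the line first
    return [identifier for _, identifier in selected]
```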
- the module 72 creates a caption for the photo according to the additional information associated with the objects selected by the engine 60. For example, it creates the following caption: "photo taken in the direction (north-east) of the bell tower of the Plan of Grace, Saturday, February 14 at 8:48".
- this example of a legend is constructed using information about the object located along the direction of sight, as well as the date and time extracted from the metadata associated with the photo.
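As a rough sketch of the string assembly performed by module 72, the function below combines an object's name with the direction, date and time taken from the metadata; the caption template and parameter names are assumptions chosen to reproduce the example above.

```python
def make_caption(direction_label, object_name, date_str, time_str):
    """Assemble a caption from the selected object's name and the photo's metadata."""
    return f"photo taken in the direction ({direction_label}) of {object_name}, {date_str} at {time_str}"

print(make_caption("north-east", "the bell tower of the Plan of Grace",
                   "Saturday, February 14", "8:48"))
```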
- the created legend is transmitted to the terminal 88, during a step 182, and recorded in the metadata associated with this photo.
- phase 200 begins with the display, in a step 202, of a geographical map on the screen 82, on which shooting points are placed, each point being representative of the geographical position recorded in the metadata associated with a photo.
- the user selects with the mouse 120, during a step 204, one of these points of view.
- the terminal 88 then automatically displays, during a step 206, the picture taken from this shooting point on the screen 82. If a caption has already been created for this picture, the picture displayed on the screen 82 preferably also includes the legend created by the module 72.
- the user then proceeds to a step 208 of identifying an object visible in the photo. For this, he selects, with the mouse for example, a particular point of the photo corresponding to the object to be identified.
- the module 50 acquires, during an operation 210, the coordinates of the point selected by the user in the coordinate system linked to the center of the photo. These coordinates are noted (a, b).
- the module 48 extracts, during an operation 214, the geographical position of the point of view and the aiming direction from the metadata recorded in the memory 46.
- the module 52 then corrects, during an operation 216, the direction extracted from the metadata to deduce a corrected direction.
- the corrected direction coincides with that of a straight line passing through the extracted geographical position and the geographical position of an object corresponding to the point selected in the photo.
- the module 52 uses the angle of view β stored in the metadata associated with the photo. This angle of view β is shown in FIG. 4. In this same FIG. 4, the position of the point of view is represented by a point 218.
- an angle x represents the angle between the direction 94 and the magnetic north direction indicated by an arrow 220.
- the correction of the angle x will be described here in the particular case of a picture 222 taken horizontally so that it is not necessary to take into account the tilting of the photograph or camera 90 relative to the horizontal.
- the position of the point selected by the user is represented by a cross 224 while the center of the marker linked to the photo is represented by a cross 226.
- the distance between these two crosses 224 and 226 corresponds to the value of the abscissa "a".
- the known length of a horizontal edge of the photo is denoted here by d.
- the angle α that the corrected direction makes relative to the direction 94 is calculated using the following relationship: α = β × a / d. Once this angle α is calculated, it is added to the angle x. We thus obtain the angle x' that the corrected direction makes with respect to magnetic north.
- the module 52 also calculates an angle y' that the corrected direction makes relative to the horizontal.
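A minimal sketch of the horizontal part of this correction, under the assumptions stated above (photo taken horizontally, angular offset varying linearly across the frame); the relation α = β × a / d is the one reconstructed from the variables defined here, and the vertical angle y' could be obtained the same way from the ordinate b and the photo's vertical edge.

```python
def corrected_azimuth(x_deg, angle_of_view_deg, a, d):
    """Azimuth x' of the corrected direction, relative to magnetic north.

    x_deg             -- azimuth x of the optical axis (direction 94)
    angle_of_view_deg -- horizontal angle of view (beta)
    a                 -- abscissa of the selected point in the frame centred on the photo
    d                 -- length of the photo's horizontal edge, in the same unit as a
    """
    alpha = angle_of_view_deg * a / d   # assumed linear relation: alpha = beta * a / d
    return x_deg + alpha

# Point selected a quarter of the frame width to the right of centre, 40 degree angle of view:
print(corrected_azimuth(45.0, 40.0, a=0.25, d=1.0))   # 55.0
```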
- the position extracted from the metadata and the corrected direction are then transmitted, in a step 230, to the engine 60 via the network 84.
- the engine 60 selects, during a step 232 and based on the received data, the object or objects close to the oriented line passing through the extracted position and having the corrected direction.
- this step 232 comprises an operation 234 for determining the oriented line, identical to the operation 168, and an operation 236 for selecting the objects closest to the oriented line.
- the engine 60 selects from the database 104 the object which:
- is close to the oriented line,
- is included in the frame of the photo, and
- is also closest to the geographical position of the point of view.
- this last condition ensures that only an object visible in the photo is selected.
- an object is considered to be close to the straight line if, for example, the shortest distance separating it from this line is less than a preset threshold.
- the metadata is associated with the photo using the EXIF format.
- the EXIF format is replaced by the MPEG7 format.
- many other embodiments of the system 40 are possible. For example, instead of distributing the elements of the system 40 between, on the one hand, one or more local consultation terminals and, on the other hand, a computer server, it is possible to implement all the elements of the system 40 in the consultation terminal. Conversely, it is also possible to place the processing unit 44 in the remote computer server, which is then associated with the memory 42. In this last embodiment, the consultation terminal comprises only the information presentation unit.
- the legend creation module 72 and the phase 162 are deleted.
- the presentation unit is reduced to a man / machine interface.
- the operations 210 and 216 are deleted. The system is then only able to identify the object in the center of the photo on the line of sight.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Databases & Information Systems (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Library & Information Science (AREA)
- Manufacturing & Machinery (AREA)
- Radar, Positioning & Navigation (AREA)
- Studio Devices (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Instructional Devices (AREA)
- Processing Or Creating Images (AREA)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| FR0409769A FR2875320A1 (fr) | 2004-09-15 | 2004-09-15 | Procede et systeme d'identification d'un objet dans une photo, programme, support d'enregistement, terminal et serveur pour la mise en oeuvre du systeme |
| PCT/FR2005/002280 WO2006030133A1 (fr) | 2004-09-15 | 2005-09-14 | Procede et systeme d'identification d'un objet dans une photo, programme, support d'enregistrement, terminal et serveur pour la mise en œuvre du systeme |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP1828928A1 true EP1828928A1 (fr) | 2007-09-05 |
Family
ID=34952202
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP05802703A Ceased EP1828928A1 (fr) | 2004-09-15 | 2005-09-14 | Procédé et système d'identification d'un objet dans une photo, programme, support d'enregistrement, terminal et serveur pour la mise en oeuvre du système |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US20080140638A1 (en) |
| EP (1) | EP1828928A1 (en) |
| JP (1) | JP2008513852A (en) |
| KR (1) | KR20070055533A (en) |
| FR (1) | FR2875320A1 (en) |
| WO (1) | WO2006030133A1 (en) |
Families Citing this family (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9104694B2 (en) | 2008-01-10 | 2015-08-11 | Koninklijke Philips N.V. | Method of searching in a collection of data items |
| US8611592B2 (en) * | 2009-08-26 | 2013-12-17 | Apple Inc. | Landmark identification using metadata |
| JP2011055250A (ja) * | 2009-09-02 | 2011-03-17 | Sony Corp | 情報提供方法及び装置、情報表示方法及び携帯端末、プログラム、並びに情報提供システム |
| US20110109747A1 (en) * | 2009-11-12 | 2011-05-12 | Siemens Industry, Inc. | System and method for annotating video with geospatially referenced data |
| US20110137561A1 (en) * | 2009-12-04 | 2011-06-09 | Nokia Corporation | Method and apparatus for measuring geographic coordinates of a point of interest in an image |
| KR100975128B1 (ko) | 2010-01-11 | 2010-08-11 | (주)올라웍스 | 뷰잉 프러스텀을 이용하여 객체에 대한 정보를 제공하기 위한 방법, 시스템 및 컴퓨터 판독 가능한 기록 매체 |
| JP5789982B2 (ja) * | 2010-12-29 | 2015-10-07 | 株式会社ニコン | 撮影方向決定プログラム及び表示装置 |
| US9041819B2 (en) | 2011-11-17 | 2015-05-26 | Apple Inc. | Method for stabilizing a digital video |
| US8611642B2 (en) | 2011-11-17 | 2013-12-17 | Apple Inc. | Forming a steroscopic image using range map |
| US20130129192A1 (en) * | 2011-11-17 | 2013-05-23 | Sen Wang | Range map determination for a video frame |
| JP5788810B2 (ja) * | 2012-01-10 | 2015-10-07 | 株式会社パスコ | 撮影対象検索システム |
| KR101942288B1 (ko) * | 2012-04-23 | 2019-01-25 | 한국전자통신연구원 | 위치 보정 장치 및 방법 |
Family Cites Families (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2811501B2 (ja) * | 1990-08-30 | 1998-10-15 | インターナショナル・ビジネス・マシーンズ・コーポレーション | カーソル移動制御方法及び装置 |
| US5913078A (en) * | 1994-11-01 | 1999-06-15 | Konica Corporation | Camera utilizing a satellite positioning system |
| JPH0981361A (ja) * | 1995-09-12 | 1997-03-28 | Toshiba Corp | 画像表示方法、データ収集方法及び対象物特定方法 |
| JP3156646B2 (ja) * | 1997-08-12 | 2001-04-16 | 日本電信電話株式会社 | 検索型景観ラベリング装置およびシステム |
| US6208353B1 (en) * | 1997-09-05 | 2001-03-27 | ECOLE POLYTECHNIQUE FEDéRALE DE LAUSANNE | Automated cartographic annotation of digital images |
| JP4216917B2 (ja) * | 1997-11-21 | 2009-01-28 | Tdk株式会社 | チップビーズ素子およびその製造方法 |
| JP4296451B2 (ja) * | 1998-06-22 | 2009-07-15 | 株式会社日立製作所 | 画像記録装置 |
| US6690883B2 (en) * | 2001-12-14 | 2004-02-10 | Koninklijke Philips Electronics N.V. | Self-annotating camera |
| US6885371B2 (en) * | 2002-04-30 | 2005-04-26 | Hewlett-Packard Development Company, L.P. | System and method of identifying a selected image object in a three-dimensional graphical environment |
| JP2003323440A (ja) * | 2002-04-30 | 2003-11-14 | Japan Research Institute Ltd | 携帯端末を用いた撮影画像の情報提供システム、撮影画像の情報提供方法、およびその方法をコンピュータに実行させるプログラム |
| US20040021780A1 (en) * | 2002-07-31 | 2004-02-05 | Intel Corporation | Method and apparatus for automatic photograph annotation with contents of a camera's field of view |
| US7234106B2 (en) * | 2002-09-10 | 2007-06-19 | Simske Steven J | System for and method of generating image annotation information |
| US20040114042A1 (en) * | 2002-12-12 | 2004-06-17 | International Business Machines Corporation | Systems and methods for annotating digital images |
| JP3984155B2 (ja) * | 2002-12-27 | 2007-10-03 | 富士フイルム株式会社 | 被写体推定方法および装置並びにプログラム |
2004
- 2004-09-15 FR FR0409769A patent/FR2875320A1/fr active Pending
2005
- 2005-09-14 US US11/662,470 patent/US20080140638A1/en not_active Abandoned
- 2005-09-14 KR KR1020077005846A patent/KR20070055533A/ko not_active Ceased
- 2005-09-14 JP JP2007530746A patent/JP2008513852A/ja active Pending
- 2005-09-14 WO PCT/FR2005/002280 patent/WO2006030133A1/fr not_active Ceased
- 2005-09-14 EP EP05802703A patent/EP1828928A1/fr not_active Ceased
Non-Patent Citations (2)
| Title |
|---|
| None * |
| See also references of WO2006030133A1 * |
Also Published As
| Publication number | Publication date |
|---|---|
| US20080140638A1 (en) | 2008-06-12 |
| JP2008513852A (ja) | 2008-05-01 |
| FR2875320A1 (fr) | 2006-03-17 |
| WO2006030133A1 (fr) | 2006-03-23 |
| KR20070055533A (ko) | 2007-05-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5200780B2 (ja) | 撮影装置および方法、並びにプログラム | |
| US8212784B2 (en) | Selection and display of media associated with a geographic area based on gesture input | |
| US7518640B2 (en) | Method, apparatus, and recording medium for generating album | |
| US20060155761A1 (en) | Enhanced organization and retrieval of digital images | |
| EP1828928A1 (fr) | Procédé et système d'identification d'un objet dans une photo, programme, support d'enregistrement, terminal et serveur pour la mise en oeuvre du système | |
| US20100053371A1 (en) | Location name registration apparatus and location name registration method | |
| US20130128059A1 (en) | Method for supporting a user taking a photo with a mobile device | |
| FR2827984A1 (fr) | Dispositif de capture d'image | |
| CN101874195B (zh) | 地图显示设备、地图显示方法和图像摄取设备 | |
| JP2004289825A (ja) | 周知の写真撮影場所で撮影された画像からの拡張写真製品の製造 | |
| WO2005124594A1 (en) | Automatic, real-time, superimposed labeling of points and objects of interest within a view | |
| CN101911072B (zh) | 在数据项集合中搜索的方法 | |
| FR2913803A1 (fr) | Procede de furetage a vitesse variable pour images numeriques | |
| KR20100079833A (ko) | 전자지도상에 촬영정보를 표시하기 위한 이미지 처리장치 및 이미지 처리방법 | |
| KR102010318B1 (ko) | 지피에스를 이용한 수치지도 수정시스템 | |
| EP2542862B1 (fr) | Système de navigation routière et procédé d'activation automatique d'une application de navigation routière | |
| FR2871257A1 (fr) | Moteur de base de donnees, procede de selection, systeme et procede d'identification d'une vue, et appareil, serveur informatique, programme et support d'enregistrement mis en oeuvre dans le systeme | |
| JP2009177611A (ja) | デジタルカメラ | |
| JP2007164534A (ja) | 電子機器および撮像装置 | |
| EP2192501B1 (fr) | Procédé d'acquisition de données et procédé de construction d'un produit multimédia de visite virtuelle | |
| KR20180113944A (ko) | Vr 컨텐츠 생성 시스템 | |
| WO2010020624A1 (fr) | Procédé de télémesure métrique | |
| JP6362735B2 (ja) | 撮像装置、撮像装置の制御方法及び制御プログラム | |
| WO2006040455A2 (fr) | Procede, systeme, terminal et module logiciel de vision avec realite augmentee | |
| EP2230816A1 (en) | System and method for managing file catalogs on a wireless handheld device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | 17P | Request for examination filed | Effective date: 20070227 |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR |
| | DAX | Request for extension of the european patent (deleted) | |
| | 17Q | First examination report despatched | Effective date: 20120814 |
| | RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: ORANGE |
| | REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| | 18R | Application refused | Effective date: 20170705 |