EP3785229A1 - Verfahren zur bestimmung von augmentierungsinformationen für ein bild - Google Patents
Verfahren zur Bestimmung von Augmentierungsinformationen für ein Bild
- Publication number
- EP3785229A1 (application EP19714156.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- information
- mobile device
- image
- current image
- augmentation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
Definitions
- The subject matter relates to a method for determining augmentation information for an image, to a device having a processor set up to carry out the method, to a computer program configured to execute the method on a processor, and to a computer program product with such a computer program stored thereon.
- Augmented reality (AR) makes it possible to capture images of real objects and to enrich them with additional information displayed in those images.
- A user can use a mobile device, such as a smartphone, to display a virtual object as augmentation information in an image captured by the mobile device and shown on a display.
- The image can represent a real object, and the augmentation information can be displayed in the image relative to this real object.
- A reference image can first be created. For this purpose, an angle of the recording of the real object, a distance to the real object and/or an absolute position of the mobile device in the area and/or space can be determined. This can serve as reference information. If, during the same AR session, the user moves the mobile device close to the real object and directs the camera at it, the augmented image can be displayed based on the previously created reference information. Depending on the absolute position of the mobile device, the shooting angle and/or the distance from the real object, the augmentation information can be displayed differently in the image. After the mobile device has been switched off, however, the reference information is lost and the augmentation data is no longer associated with the real object.
- For mass deployment, it is necessary to provide each mobile device with the same set of augmentation information. Moreover, it is necessary to detect the exact position of the user or the mobile device in space so as to represent the augmentation information in an image captured by a mobile device as accurately as possible and, in particular, to display the relation between the augmentation information and the image of the real object correctly.
- A mobile device, which may be, for example, a smartphone, a tablet computer, a laptop computer, smart glasses or the like, is to be enabled by means of the present method to display augmentation information.
- Based on its location information, the mobile device loads at least two items of reference information. To perform AR, it is necessary that the augmentation information (augmentation data) be linked to real data in advance of the actual augmentation.
- For this purpose, reference images are recorded and stored together with their position information as well as the augmentation information of the reference images.
- A reference image together with position information can be stored as reference information.
- It can first be checked whether augmentation information exists for the current location. If this is the case, a number of previously created items of reference information can be loaded for the corresponding location. At least one reference image may be included in the loaded reference information. Thus, depending on its current location, a set of reference images is preloaded in the mobile device. The loaded reference images were previously created in an augmentation session.
- The reference information is thus loaded and not first created in the mobile device.
- The reference information can be stored in a server remote from the mobile device. Once the location information is known, the mobile device can load the reference information.
- An application can run on the mobile device which detects the location information and, depending on the detected location information, asks a remote server for existing reference information. If such information exists, it can be downloaded through the application to the mobile device; the loaded reference information then serves for the subsequent comparison.
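As a minimal sketch of this load-by-location flow, with an in-memory stand-in for the remote server (the function names, the cell-based lookup key and all values are illustrative assumptions, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class ReferenceInfo:
    """One item of reference information: a reference image id plus the
    position at which the reference image was captured."""
    image_id: str
    position: tuple  # (x, y) coordinates

# Stand-in for the remote server's store: reference information keyed by a
# coarse location cell (purely illustrative).
SERVER_STORE = {
    "cell_42": [ReferenceInfo("ref_a", (2.0, 3.0)),
                ReferenceInfo("ref_b", (4.5, 3.2))],
}

def location_to_cell(location):
    """Illustrative coarse binning of the device location into a lookup key."""
    return "cell_42" if location is not None else None

def load_reference_information(location):
    """The application detects the location and asks the 'server' whether
    reference information exists for it; if so, it is downloaded."""
    return SERVER_STORE.get(location_to_cell(location), [])

refs = load_reference_information((2.1, 3.1))
```

In a real deployment the lookup would be a network request; here it is only a dictionary access so the control flow stays visible.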
- The comparison of the current image with the reference images can include a comparison of colour information, for example by means of a histogram, and/or a comparison of edge information, for example by edge detection.
- The comparison of the images may employ similarity mappings and include, for example, zooming in, zooming out, tilting or rotating the current image in order to subsequently perform an edge comparison with the reference image.
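A minimal sketch of the colour comparison by histogram, in pure Python with flat grayscale pixel lists standing in for real image data; histogram intersection is used here as one plausible similarity measure (the text does not prescribe a particular one):

```python
def grayscale_histogram(pixels, bins=8):
    """Relative frequency of grayscale values (0-255) per bin."""
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // 256, bins - 1)] += 1
    return [c / len(pixels) for c in counts]

def histogram_similarity(pixels_a, pixels_b, bins=8):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    ha = grayscale_histogram(pixels_a, bins)
    hb = grayscale_histogram(pixels_b, bins)
    return sum(min(a, b) for a, b in zip(ha, hb))

# A current image is compared against a reference image; nearly identical
# colour distributions yield a score close to 1.0.
current = [10, 10, 200, 200, 200, 90]
reference = [12, 11, 198, 205, 201, 95]
score = histogram_similarity(current, reference)
```

In practice this colour test would be combined with an edge comparison, as the text suggests, since very different scenes can share similar colour statistics.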
- Augmentation information associated with the reference information is created.
- Reference images and the augmentation information associated with these reference images are created in advance.
- The augmentation information can be loaded onto the mobile device and optionally displayed by the mobile device in the current image.
- An augmentation of image information is preferably carried out by means of graphical objects. Therefore, it is suggested that the augmentation information include graphical objects displayed in the current image.
- The display of the objects can be semi-transparent or opaque in the current image.
- Position data can be acquired in the same way as the location information. It is also possible to detect the position data with a higher resolution, for example by using higher-resolution positioning methods while the reference information is being acquired. It is proposed that position data for the respective reference images be stored in the reference information.
- The position data can in particular be absolute position data of the reference image, in particular coordinates.
- The position data correspond in particular to the location information of the device with which the reference images were acquired.
- Relative position data of the reference images can also be stored, i.e. for example the relative distances of individual reference images to each other.
- Based on the position data, it is possible to determine whether or not reference information matches the location information of the mobile device.
- All the position data of all reference images of a session can be considered collectively; they thus define a spatial area in which the reference images were captured. If the location information indicates that the mobile device is within the spatial area spanned by the position data, it can be concluded that the reference information acquired in the corresponding session is suitable for the mobile device to load.
- A transmission of the reference images determined in this way can then take place.
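The spatial-area test described above can be sketched as a simple bounding-box check over a session's position data (the coordinates and the margin parameter are illustrative assumptions):

```python
def spatial_area(positions):
    """Axis-aligned area spanned by all reference-image positions of a session."""
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    return min(xs), min(ys), max(xs), max(ys)

def location_matches_session(location, positions, margin=0.0):
    """True if the device's (coarser) location lies inside the area spanned by
    the session's position data, i.e. the reference information is worth loading."""
    x_min, y_min, x_max, y_max = spatial_area(positions)
    x, y = location
    return (x_min - margin <= x <= x_max + margin and
            y_min - margin <= y <= y_max + margin)

session_positions = [(0.0, 0.0), (4.0, 1.0), (2.0, 5.0)]  # e.g. positions 18a-e
inside = location_matches_session((1.5, 2.0), session_positions)
```

A margin larger than zero would account for the coarser resolution of the device's location information compared with the stored position data.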
- The orientation of the device, e.g. angle information, in particular a camera angle, an optical axis or a shooting angle of the camera with which the reference image was captured, can also be stored in the reference information as a reference angle. At one position, images can be captured with different viewing directions, so that the position alone does not necessarily reveal what a reference image looks like. Rather, the angle information is also relevant to determine the reference image.
- Reference information can be saved as a map set.
- A map set may contain a plurality of items of reference information.
- For each item of reference information, augmentation information may be stored in the map set.
- The map set is thus a set of reference information and respective augmentation information. With the help of the map set, a wide variety of reference images can be stored with their corresponding position information and the augmentation information created for them.
- The map set may be provided in a server and loaded by the mobile device over a wide area network, a wireless network or the like. As previously explained, in order to speed up the processing in the mobile device, only the reference information is initially loaded. After a match has been detected, the map set which contains at least the matching reference image can be loaded.
- The augmentation information belonging to the current image can then be determined from the loaded map set.
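One way to picture the map set is as a container pairing each item of reference information with its augmentation information; this structure and its field names are illustrative assumptions, not the patent's data model:

```python
from dataclasses import dataclass, field

@dataclass
class ReferenceInformation:
    image: bytes      # reference image data
    position: tuple   # high-resolution position data
    angle: float      # reference (recording) angle

@dataclass
class Augmentation:
    overlay: bytes    # graphical object to display
    anchor: tuple     # placement relative to the reference image

@dataclass
class MapSet:
    # ref_id -> (ReferenceInformation, Augmentation)
    entries: dict = field(default_factory=dict)

    def reference_only(self):
        """What the device loads first: reference information without augmentations."""
        return {rid: ref for rid, (ref, _aug) in self.entries.items()}

    def augmentation_for(self, ref_id):
        """After a match, look up the augmentation belonging to that reference."""
        return self.entries[ref_id][1]

map_set = MapSet({"ref_18c": (ReferenceInformation(b"...", (2.0, 5.0), 30.0),
                              Augmentation(b"label", (10, 20)))})
aug = map_set.augmentation_for("ref_18c")
```

The split between `reference_only` and `augmentation_for` mirrors the two-stage loading the text describes: lightweight reference data first, augmentation data only after a match.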
- The position of the mobile device, which was previously derived from the location information, can be supplemented or replaced by the position data of the matching reference image.
- The position data may be stored at a higher resolution than the location information acquired by the mobile device.
- If the augmentation information were shown in the current image exactly as it was created in the reference image, it could happen that the relative position of the augmentation information to the real objects in the displayed current image is not correct. Therefore, it is necessary to know the angle information of the current image relative to the angle information of the reference image. For this reason, a shift of the angle information of the current image relative to the angle information of the reference angle can be determined.
- This calculation can be complex, so that it need not be carried out only in the mobile device; according to one embodiment, it is proposed that the current image be sent to a server. In the server, the current image is compared with the matching reference image such that an angular displacement is calculated.
- Mapping functions and shift functions can be applied to the current image and/or the reference image such that a difference between the angle information of the reference image and that of the current image can be determined.
- Not only a shift of the recording angle but also a shift of the position between the matching reference image and the current image can be determined.
- If the relative displacement between the current image and the reference image has been calculated in a server, it is suggested that the relative displacement be received from the server.
- The position data of the reference image may not correspond exactly to the position of the mobile device. Therefore, a further adjustment of the position of the mobile device based on the relative displacement can be performed.
- The display position and/or size of the augmentation information in the current image can be changed depending on the relative displacement.
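A sketch of that adjustment step, assuming the relative displacement arrives as a 2D pixel offset plus a scale factor (the exact parametrization is not specified in the text):

```python
def adjust_augmentation(pos, size, displacement, scale=1.0):
    """Shift the display position of an augmentation by the relative
    displacement (dx, dy) and rescale its size, e.g. when the current image
    was captured closer to or farther from the real object than the
    reference image was."""
    dx, dy = displacement
    x, y = pos
    w, h = size
    return (x + dx, y + dy), (w * scale, h * scale)

# Overlay placed at (100, 50) in reference coordinates, moved by the computed
# displacement (-12, 4) and shrunk to 80 % for the current view.
new_pos, new_size = adjust_augmentation((100, 50), (40, 20), (-12, 4), scale=0.8)
```

A full implementation would also account for rotation from the angular displacement; the sketch keeps only the translation and scaling the surrounding text names explicitly.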
- The relative positioning of the augmentation information to the real object in the current image can be adjusted to match the arrangement in which the augmentation information is present in the reference image.
- Time may elapse between the time the current image was acquired and the time the displacement information is received.
- In the meantime, the mobile device can move.
- The movement can be, on the one hand, a twisting or pivoting of the mobile device, so that the angle information changes, and/or, on the other hand, an actual translation of the coordinates of the mobile device in space.
- These movements can be detected and recorded by corresponding acceleration sensors and / or tilt sensors in the mobile device.
- In order to adjust the augmentation information in the current image, the position of the mobile device is, according to an embodiment, adapted depending on movement information of the mobile device.
- It is proposed that the current image be captured together with a time stamp. Starting from this time stamp, the movement information can be recorded. When the relative displacement is received, the time of reception can be compared with the time stamp, and the motion information recorded since the time stamp can be used to adjust the relative displacement.
- The relative displacement is characterized, on the one hand, by the position and angle information of the mobile device at the time of capturing the current image and, on the other hand, by the motion information since that time and the timing of adjusting the relative displacement.
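The timestamp mechanism can be sketched as follows; the sign convention (device motion is subtracted from the server-computed displacement) and the log format are illustrative assumptions:

```python
def compensate_displacement(displacement, motion_log, image_timestamp, receive_time):
    """Adjust the server-computed relative displacement by the device motion
    recorded between the capture time stamp and the reception of the result.
    motion_log: list of (time, (dx, dy)) increments from the motion sensors."""
    dx, dy = displacement
    for t, (mx, my) in motion_log:
        # Only motion after the image was captured and before reception counts.
        if image_timestamp < t <= receive_time:
            dx -= mx
            dy -= my
    return dx, dy

# Displacement computed for an image captured at t=0.0 but received at t=0.5;
# the motion at t=0.9 happened after reception and is ignored.
motion_log = [(0.1, (1.0, 0.0)), (0.3, (0.5, 0.5)), (0.9, (2.0, 0.0))]
adjusted = compensate_displacement((10.0, 4.0), motion_log,
                                   image_timestamp=0.0, receive_time=0.5)
```

This is the latency compensation the text describes: without it, a displacement computed for a stale frame would be applied to a scene the device has already moved away from.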
- The current image can be determined from a moving image, i.e. from a so-called video feed.
- Fig. 1 shows a schematic structure of a system for carrying out the method;
- Fig. 2 the creation of a map set with reference information;
- Fig. 4a a current image;
- Fig. 4b an augmented current image;
- Fig. 5a a current image;
- Fig. 5b an augmented current image.
- Fig. 1 shows a mobile device 2, a server 4, and map sets 6a, 6b stored in the server 4.
- the mobile device 2 has a camera 10 and a screen 12.
- The camera 10 has a recording angle 10a.
- The recording angle 10a may be an optical axis resulting from the orientation of the mobile device 2 in space.
- In the mobile device 2, a sensor, for example a gyro sensor, can be arranged with which the orientation of the mobile device 2 in space can be detected, so that the recording angle 10a can be determined from it.
- The current images of the camera 10 can be frames or single images of a video feed.
- Each map set 6a, b is formed from reference information 14 and augmentation information 16.
- It is possible for only one item of reference information 14 and one item of augmentation information 16 to be present, or else a plurality of items of reference information 14 and augmentation information 16.
- The communication between the mobile device 2 and the server 4 takes place via the communication network 8.
- The communication network 8 can provide radio connections and/or wired connections.
- The communication network 8 is at least partially formed by the Internet.
- Mobile radio links may also be present in the communication network 8.
- A mobile device 2 is moved through a room in a "design" session, as shown in Fig. 2.
- The mobile device 2 is thereby moved to different positions 18a-e, and at the positions 18a-e at least reference images and position data are detected as reference information.
- It is also possible to use a mobile device which is different from the mobile device 2 with which the augmentation information is subsequently to be displayed.
- The creator of the map set 6a, b, for example, initially moves to position 18a and captures a reference image with a camera. Together with the reference image, the position 18a of the mobile device 2 is detected.
- This position detection can be done by means of a very accurate positioning method, for example an indoor positioning method or a differential GPS method, so that the position data describe the positions of the mobile device 2 at the time of capture with high resolution, with an accuracy of up to 20 cm or less.
- In addition to the position data and the reference image, it is also possible to detect a recording angle 10a.
- When detecting the recording angle, for example, an azimuth angle and a polar angle can be detected.
- The azimuth angle can be used to determine the angular positions of different reference images relative to each other, as described below. The recording angle, in particular both angles, can nevertheless be stored in the reference information.
- Augmentation information may be added in the mobile device 2 with which the reference image has been created, or in the server 4, or in any other computer.
- The augmentation information may in particular be image information, which may be displayed semi-transparently or opaquely in the reference image.
- Together with the augmentation information, the reference information, i.e. in particular the reference image, the position data and the angle information, is stored.
- The image information and the absolute positioning of the image information in the reference image are stored as the augmentation information.
- Subsequently, the creator of a map set 6a, b can move to position 18b.
- There, reference information can again be acquired.
- In addition, relative position data between the positions 18a, 18b can be detected. This can be done, for example, by evaluating movement information from motion sensors in the mobile device 2. Successively, the positions 18a-e are reached and reference information is recorded there.
- The detected angle information at the positions 18a-e may also be relative to one another, so that at the position 18a an azimuth angle is assumed as the zero angle and the relative angular deviations from it are detected or determined at the positions 18b-e.
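Taking the azimuth at the first position as the zero angle can be sketched like this (the compass readings are illustrative values, not from the patent):

```python
def relative_azimuths(azimuths):
    """Express azimuth angles captured at successive positions relative to the
    first position, whose angle is taken as the zero angle."""
    zero = azimuths[0]
    return [(a - zero) % 360.0 for a in azimuths]

# Absolute compass readings at five capture positions, e.g. 18a-e.
angles = relative_azimuths([310.0, 350.0, 20.0, 90.0, 170.0])
# angles[0] is 0.0; the rest are deviations from the angle at the first position
```

The modulo keeps the deviations in [0, 360) even when the absolute readings wrap past north, as between the first and third positions here.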
- After the reference images have been acquired, they can be enriched with augmentation data.
- The reference information and the associated augmentation information are stored in a map set 6a.
- In this way, a variety of map sets 6a, 6b can be created and stored in the server 4. This information can then be used to enrich current images of users with augmentation information.
- In FIGS. 3-5, real objects are drawn with solid lines in the images, and objects of the augmentation data are drawn with dashed lines.
- FIGS. 3a-e show the map set 6a.
- At the position 18a, a reference image according to FIG. 3a has been captured. Image information is added to this reference image as augmentation information.
- At the position 18b, the reference image according to FIG. 3b was captured. Augmentation information is supplemented in this reference image.
- At the position 18c, the reference image according to FIG. 3c was captured. Augmentation information is supplemented in this reference image.
- At the position 18d, the reference image according to FIG. 3d was captured. Augmentation information is supplemented in this image.
- At the position 18e, the reference image according to FIG. 3e was captured. Augmentation information is supplemented in this reference image.
- To display augmentation information in a current image, an application on the mobile device 2 is initially started.
- This application will first determine the location information of the mobile device, for example by means of a GPS receiver. The resolution of this location information may be lower than the resolution of the position information and, for example, have an accuracy of about 5 m, 2 m, 1 m or 0.5 m.
- The location information is used to communicate with the server 4. If a map set 6a, b matching the location information is determined, the reference information 14 of the corresponding map set 6a, b is transmitted to the mobile device 2.
- A map set 6a, b has only a limited number of items of reference information 14.
- Subsequently, a current image is captured.
- This current image can be, for example, an image from a live stream, i.e. a moving image.
- The current image of the camera 10 is compared with the loaded reference images of the reference information 14 of the transmitted map set 6a, b.
- Assume the user is at position 18c and captures the current image as shown in Fig. 4a.
- By comparing the current image according to FIG. 4a with the reference image according to FIG. 3c using a similarity mapping, it can be determined that with high probability the same real object is imaged in the current image 4a as in the reference image 3c; only the shooting angle is obviously different.
- The augmentation information 16 matching the corresponding reference image is loaded and, as shown in Fig. 4a, displayed in the current image. However, it can be seen in FIG. 4a that the display of the augmentation information relative to the real object differs from that in FIG. 3c.
- The recording angle 10a can be transmitted to the server 4. From this, a relative displacement can be calculated in the server 4: displacement information with which the augmentation information 16 can be adjusted, so that, as shown in Fig. 4b, the augmentation information 16 in the image displayed on the display 12 is displayed correctly relative to the real object.
- Alternatively, the calculation and/or the adaptation of the augmentation information 16 to the relative shift with regard to the recording angle can take place in the mobile device 2. The augmentation information 16 is then displayed accordingly.
- The reference information stores position data with a higher resolution than the location information. If a match between a reference image and the current image has been found, the position data can be used to determine the position of the mobile device 2 more accurately.
- The location information is thus improved by the position data.
- If the mobile device 2 subsequently moves, this movement can be sensed by means of motion sensors. This movement information can be used to further adapt the location information of the mobile device 2.
- The augmentation information 16 is loaded and displayed in the image as shown in FIG. 5a.
- However, this display of the augmentation information 16 does not correspond to the display in Fig. 3a; in particular, the augmentation information 16 is positioned incorrectly relative to the real object. To optimize this positioning, a relative shift in the position information between the current image and the reference image is calculated.
- It can be seen in FIG. 5a that the real objects were captured from a different position than the real objects in FIG. 3a. Using an image comparison, a relative shift between the objects can be calculated. This relative displacement can be used to shift the augmentation information 16 in the displayed image.
- By shifting the augmentation information 16, a displayed image according to FIG. 5b is made possible in which the augmentation information 16 is displayed correctly relative to the real objects.
- The calculation of the relative displacement can be done in the mobile device 2 or in the server 4. If the calculation of the relative shift takes place in the server 4, movement information of the mobile device 2 can be detected in the meantime.
- Once the relative displacement has been calculated, it can be adjusted by the movement of the mobile device 2 detected since the time of the time stamp, and an adjusted relative displacement can be determined. Based on this adjusted relative displacement, the augmentation information 16 can be displayed in the current image.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Studio Devices (AREA)
- Navigation (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018109756.5A DE102018109756A1 (de) | 2018-04-24 | 2018-04-24 | Verfahren zur Bestimmung von Augmentierungsinformationen für ein Bild |
PCT/EP2019/057369 WO2019206539A1 (de) | 2018-04-24 | 2019-03-25 | Verfahren zur bestimmung von augmentierungsinformationen für ein bild |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3785229A1 true EP3785229A1 (de) | 2021-03-03 |
Family
ID=65955204
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19714156.7A Pending EP3785229A1 (de) | 2018-04-24 | 2019-03-25 | Verfahren zur bestimmung von augmentierungsinformationen für ein bild |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3785229A1 (de) |
DE (1) | DE102018109756A1 (de) |
WO (1) | WO2019206539A1 (de) |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2004015369A2 (en) * | 2002-08-09 | 2004-02-19 | Intersense, Inc. | Motion tracking system and method |
KR101329111B1 (ko) * | 2012-05-02 | 2013-11-14 | 한국과학기술연구원 | 실내 네비게이션 시스템 및 방법 |
WO2013180320A1 (ko) * | 2012-05-31 | 2013-12-05 | 인텔 코오퍼레이션 | 증강 현실 서비스 제공 방법, 서버 및 컴퓨터 판독 가능한 기록매체 |
KR102077305B1 (ko) * | 2013-05-09 | 2020-02-14 | 삼성전자 주식회사 | 증강 현실 정보를 포함하는 콘텐츠 제공 방법 및 장치 |
-
2018
- 2018-04-24 DE DE102018109756.5A patent/DE102018109756A1/de active Pending
-
2019
- 2019-03-25 EP EP19714156.7A patent/EP3785229A1/de active Pending
- 2019-03-25 WO PCT/EP2019/057369 patent/WO2019206539A1/de unknown
Also Published As
Publication number | Publication date |
---|---|
WO2019206539A1 (de) | 2019-10-31 |
DE102018109756A1 (de) | 2019-10-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102008034979B4 (de) | Verfahren und Einrichtung zur Erzeugung von fehlerreduzierten hochauflösenden und kontrastverbesserten Bildern | |
DE102016224095A1 (de) | Verfahren zum Kalibrieren einer Kamera und Kalibriersystem | |
EP2880853B1 (de) | Vorrichtung und verfahren zur bestimmung der eigenlage einer bildaufnehmenden kamera | |
DE112016004079T5 (de) | Sensorvorrichtung, Sensorsystem und Informationsverarbeitungsvorrichtung | |
WO2012076274A1 (de) | Verfahren und vorrichtung zum verarbeiten von bildinformationen zweier zur bilderfassung geeigneter sensoren eines stereo-sensor-systems | |
DE202016007867U1 (de) | Steuerung des Sichtlinienwinkels einer Bildverarbeitungsplattform | |
EP2381207B1 (de) | 3D-Zielvermessung und Zieleinweisung aus IR-Daten | |
EP2369296A2 (de) | Navigationsverfahren für einen Flugkörper | |
EP3347878A2 (de) | Verfahren und vorrichtung zum überlagern eines abbilds einer realen szenerie mit einem virtuellen bild und mobiles gerät | |
DE102018222169A1 (de) | Bordeigenes visuelles Ermitteln kinematischer Messgrößen eines Schienenfahrzeugs | |
DE60019464T2 (de) | Verspätete videospurverfolgung | |
DE102016212650B4 (de) | Verfahren und Vorrichtung zur Erzeugung von verorteten Sensordaten eines Koordinatenmessgeräts | |
EP3867796A1 (de) | Verfahren und vorrichtung zur bestimmung einer umgebungskarte | |
DE4416557A1 (de) | Verfahren und Vorrichtung zur Stützung der Trägheitsnavigation eines ein entferntes Ziel autonom ansteuernden Flugkörpers | |
WO2021233718A1 (de) | Computerimplementiertes verfahren zur bestimmung von zentrierparametern für mobile endgeräte, mobiles endgerät und computerprogramm | |
WO2019206539A1 (de) | Verfahren zur bestimmung von augmentierungsinformationen für ein bild | |
EP2831839A1 (de) | Verfahren zum automatischen betreiben einer überwachungsanlage | |
DE10340023B3 (de) | Verfahren zur Selbstkalibrierung eines Kamerasystems | |
WO2017198441A1 (de) | Verfahren zum einstellen einer blickrichtung in einer darstellung einer virtuellen umgebung | |
EP2940624B1 (de) | Dreidimensionales virtuelles Modell einer Umgebung für Anwendungen zur Positionsbestimmung | |
EP3200154B1 (de) | Verfahren zum bestimmen einer position eines objekts | |
EP3200149B1 (de) | Verfahren zum erkennen eines objekts in einem suchbild | |
DE102017120741A1 (de) | Vorrichtung, System und Verfahren zur Entkopplung eines VR-Systems von Infrastruktur und ortsgebundener Hardware | |
WO2019096459A1 (de) | Verfahren und vorrichtung zur bestimmung von objekten in einem bild | |
WO2022069424A1 (de) | Verfahren zur räumlichen bilderfassung mit hilfe einer zwei kameras aufweisenden stereokamera sowie verfahren zur erzeugung einer redundanten abbildung eines messobjektes und vorrichtung zur durchführung der verfahren |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20201118 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: INPIXON |
|
RAP3 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: INPIXON |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20230228 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: DESIGN REACTOR, INC. |