WO2015044051A1 - Positioning method for positioning a mobile device relative to a security feature of a document - Google Patents

Positioning method for positioning a mobile device relative to a security feature of a document

Info

Publication number
WO2015044051A1
WO2015044051A1 (PCT/EP2014/070020)
Authority
WO
WIPO (PCT)
Prior art keywords
document
image
mobile device
security feature
feature
Prior art date
Application number
PCT/EP2014/070020
Other languages
German (de)
English (en)
French (fr)
Inventor
Andreas Hartl
Jens GRUBERT
Gerhard Reitmayr
Olaf Dressel
Original Assignee
Bundesdruckerei GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bundesdruckerei GmbH
Priority to EP14780428.0A (EP3053098A1)
Publication of WO2015044051A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/60 Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V40/67 Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07D HANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
    • G07D7/00 Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency
    • G07D7/003 Testing specially adapted to determine the identity or genuineness of valuable papers or for segregating those which are unacceptable, e.g. banknotes that are alien to a currency using security elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/10 Integrity
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03H HOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00 Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/0005 Adaptation of holography to specific applications
    • G03H1/0011 Adaptation of holography to specific applications for security or authentication
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03H HOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00 Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/22 Processes or apparatus for obtaining an optical image from holograms
    • G03H1/2202 Reconstruction geometries or arrangements
    • G03H2001/2244 Means for detecting or recording the holobject
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/12 Applying verification of the received information
    • H04L63/123 Applying verification of the received information received data contents, e.g. message integrity
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06 Authentication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/12 Detection or prevention of fraud
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/60 Context-dependent security
    • H04W12/63 Location-dependent; Proximity-dependent

Definitions

  • the present invention relates to the field of positioning a mobile device.
  • Positioning methods for positioning a mobile device are of interest for a variety of applications.
  • an absolute position or a relative position to an object in space is usually determined and positioning is carried out on this basis.
  • the accuracy and reproducibility of the positioning are usually of importance.
  • a positioning method is described, for example, in Y.-C. Cheng et al., "AR-based Positioning for Mobile Devices", International Conference on Parallel Processing Workshops, pp. 63-70, 2011.
  • a visualization of a coordinate system on an object is described, for example, in K. Chintamani et al., "Improved Telemanipulator Navigation During Display-Control Misalignments Using Augmented Reality Cues", Systems, Man and Cybernetics, Part A: Systems and Humans, 40 (1), pp. 29-39, 2010.
  • the positioning of a mobile device relative to a security feature of a document places particularly high demands on the accuracy and reproducibility of the positioning.
  • the positioning of the mobile device can, for example, be done visually by means of illustrated instructions, whereby the relative position is not exactly known.
  • as a result, the positioning of the mobile device can be performed only with insufficient accuracy and reproducibility.
  • an actual position of the mobile device with respect to the document can be determined.
  • an actual position of the mobile device with respect to the security feature can be determined.
  • the actual position and a target position of the mobile device with regard to the security feature can then be displayed on the display of the mobile device. This enables accurate and reproducible relative positioning of the mobile device relative to the security feature of the document.
  • the invention relates to a positioning method for the positioning of a mobile device relative to a security feature of a document, wherein the mobile device comprises a display. The method comprises capturing an image of the document by the mobile device to obtain a document image, determining an actual position of the mobile device with respect to the document based on a perspective distortion of a document feature in the document image, determining a position of the security feature in the document image, and displaying the actual position and a target position of the mobile device with respect to the position of the security feature on the display of the mobile device. This ensures that the mobile device can be positioned accurately and reproducibly relative to the security feature of the document.
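  • a minimal sketch of the pose-determination step, assuming OpenCV, a document of known physical dimensions, and four already-detected document corners (all function and variable names are illustrative, not the patent's implementation):

```python
import numpy as np
import cv2

def estimate_actual_pose(corners_px, doc_w_mm, doc_h_mm, K, dist_coeffs):
    """Estimate the camera pose relative to the planar document from the
    perspective distortion of its four corners (a known document feature).
    corners_px: 4x2 pixel coordinates (top-left, top-right, bottom-right,
    bottom-left); K, dist_coeffs: camera intrinsics."""
    object_pts = np.array([[0.0, 0.0, 0.0],
                           [doc_w_mm, 0.0, 0.0],
                           [doc_w_mm, doc_h_mm, 0.0],
                           [0.0, doc_h_mm, 0.0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_pts,
                                  np.asarray(corners_px, dtype=np.float32),
                                  K, dist_coeffs,
                                  flags=cv2.SOLVEPNP_IPPE)  # planar-object solver
    if not ok:
        raise RuntimeError("pose estimation failed")
    return rvec, tvec  # document pose in camera coordinates
```

  • the returned rotation and translation can then be compared against a stored target pose to drive the guidance display.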
  • the mobile device may be, for example, a mobile phone or a smartphone.
  • the position of the mobile device can be changed by a user.
  • the actual position and the target position of the mobile device may each be described by a pose of the mobile device.
  • the pose of the mobile device is understood to mean an arrangement and an inclination of the mobile device in space.
  • the display of the mobile device may be, for example, a liquid crystal display (LC display) or a thin film transistor display (TFT display).
  • the display of the mobile device may also be touch-sensitive.
  • the security feature can be optically detectable.
  • the security feature may have viewing angle dependent or illumination angle dependent properties.
  • the security feature may be, for example, a hologram.
  • the security feature may further include ink, which has optically variable or color iridescent properties.
  • the document may be, for example, a banknote.
  • the document may also be one of the following identification documents: an identity document such as an identity card, passport, access control card, authorization card, company card, tax stamp or ticket, a birth certificate, a driver's license or vehicle registration document, or a means of payment, for example a bank card or credit card.
  • the document can be single-layered or multi-layered and paper- and/or plastic-based.
  • the document may be constructed of plastic-based films which are assembled into a body by means of gluing and / or lamination, the films preferably having similar material properties.
  • the image of the document may be a black and white image, a grayscale image, or a color image.
  • the image of the document may also be a still image of a video.
  • the document feature may be a characteristic feature of the document, such as a previously known geometric pattern.
  • the perspective distortion of the document feature may be based on a perspective view of the spatial arrangement of the document feature.
  • the actual position or the desired position of the mobile device is determined by an arrangement and an inclination of the mobile device relative to the security feature.
  • the arrangement of the mobile device may be determined by a spatial coordinate of a point, for example a center of gravity of the mobile device.
  • the tilt of the mobile device may be determined by an angular coordinate of an axis, for example a longitudinal axis of the mobile device.
  • the document feature comprises a corner of the document, an edge of the document, a geometric arrangement of points in the document, or a geometric arrangement of lines in the document. This provides the advantage that the perspective distortion of the document feature can be efficiently determined.
  • the document feature can be detected and extracted by means of a feature detector.
  • the feature detector may be, for example, a BRISK (Binary Robust Invariant Scalable Keypoint) feature detector.
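  • a minimal sketch of such feature detection, assuming OpenCV's BRISK implementation (the file name is illustrative):

```python
import cv2

# detect and describe BRISK keypoints in a grayscale document image
image = cv2.imread("document.png", cv2.IMREAD_GRAYSCALE)
detector = cv2.BRISK_create()
keypoints, descriptors = detector.detectAndCompute(image, None)
print(f"{len(keypoints)} BRISK keypoints extracted")
```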
  • determining the position of the security feature in the document image comprises determining a document type, wherein the position of the security feature in the document image is determined based on the document type. This provides the advantage that the position of the security feature in the document image can be efficiently determined.
  • the type of document may identify different types of documents, for example different banknotes of different denominations or different identification documents.
  • the position of the security feature in the document may be previously known for different document types. By determining the document type, the position of the security feature in the document image can thus be efficiently determined.
  • the document type is determined based on a size of the document in the document image, a shape of the document in the document image, or a comparison of a document type feature in the document image with a predetermined document type feature. This provides the advantage that the document type can be efficiently determined.
  • the size of the document in the document image can be determined, for example, by determining the absolute dimensions of the document in the document image.
  • the absolute dimensions can be for example 5 cm x 10 cm.
  • the shape of the document in the document image can be rectangular, square, or circular.
  • the form of the document in the document image may be distorted in perspective.
  • the document type feature may include a geometric arrangement of points in the document, or a geometric arrangement of lines in the document.
  • the document type feature can be detected and extracted by means of a feature detector.
  • the feature detector may be, for example, a SURF (Speeded Up Robust Feature) feature detector.
  • the comparison of the document type feature with the predetermined document type feature may be performed by means of a similarity measure, for example a normalized cross-correlation.
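  • a minimal sketch of such a comparison, assuming OpenCV and one grayscale template patch per document type (the helper and its names are illustrative, not the patent's method):

```python
import cv2

def classify_document_type(document_img, type_templates):
    """Determine the document type by comparing a document-type feature
    against predetermined templates with normalized cross-correlation.
    type_templates: dict mapping a document-type name to a grayscale patch."""
    best_type, best_score = None, -1.0
    for doc_type, template in type_templates.items():
        scores = cv2.matchTemplate(document_img, template, cv2.TM_CCORR_NORMED)
        score = float(scores.max())
        if score > best_score:
            best_type, best_score = doc_type, score
    return best_type, best_score
```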
  • the actual position and the target position are superimposed on the document image. This provides the advantage that the positioning of the mobile device can be carried out efficiently.
  • the target position of the mobile device is displayed by means of a sight beam, a beam base circle, a beam sub-circle, or a target horizon line. This provides the advantage that the target position can be displayed efficiently.
  • the sight beam may indicate the target position of the mobile device in elevation and azimuth with respect to the position of the security feature.
  • the sight beam can pass through the center of the beam base circle or the center of the beam sub-circle.
  • the beam base circle may indicate an inclination of the mobile device with respect to the position of the security feature.
  • the beam sub-circle may indicate a distance of the target position of the mobile device with respect to the position of the security feature.
  • the target horizon line may indicate a rotation of the target position of the mobile device around the sight beam.
  • the sight beam, the beam base circle, the beam sub-circle, or the target horizon line is displayed perspectively distorted relative to the actual position.
  • the perspective distortion can be realized by means of a projection of the spatial arrangement and inclination of the mobile device at the target position onto the display plane of the display at the actual position.
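  • a minimal sketch of such a perspective projection, assuming OpenCV, the current pose from the pose determination, and a circle defined in the document plane (all names are illustrative):

```python
import numpy as np
import cv2

def draw_target_circle(frame, rvec, tvec, K, dist_coeffs,
                       center_xy, radius, n_points=64):
    """Project a circle lying in the document plane (e.g. a beam base circle
    around the security feature) into the current camera view, so that the
    target position appears perspectively distorted relative to the actual pose."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_points)
    circle_3d = np.stack([center_xy[0] + radius * np.cos(angles),
                          center_xy[1] + radius * np.sin(angles),
                          np.zeros(n_points)], axis=1).astype(np.float32)
    pts_2d, _ = cv2.projectPoints(circle_3d, rvec, tvec, K, dist_coeffs)
    cv2.polylines(frame, [pts_2d.astype(np.int32)], isClosed=True,
                  color=(0, 255, 0), thickness=2)
```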
  • the actual position of the mobile device is indicated by means of a base sphere, a beam upper circle, or an actual horizon line. This provides the advantage that the actual position can be displayed efficiently.
  • the base sphere may indicate an inclination of the mobile device at the actual position.
  • the base sphere can simply be displayed in the form of a dot.
  • the base sphere may be displayed at a fixed point, such as the center, of the display of the mobile device.
  • the beam upper circle can be scaled in relation to the beam sub-circle.
  • the beam upper circle can be displayed at a fixed location on the display of the mobile device.
  • the actual horizon line may indicate a rotation of the mobile device at the actual position. The actual horizon line can be displayed at a fixed location on the display of the mobile device.
  • the positioning method further comprises displaying a difference between the actual position and the target position of the mobile device with respect to the position of the security feature on the display of the mobile device.
  • the difference between the actual position and the target position can be displayed, for example, by means of a difference vector between the actual position and the target position.
  • the mobile device comprises an image camera, wherein the capture of the image of the document is performed by means of the image camera. This provides the advantage that the image can be captured efficiently.
  • the image camera may include an optical image sensor.
  • the image camera may further comprise imaging optics.
  • the actual position and/or the target position of the mobile device can be determined in relation to the image camera.
  • the invention relates to a verification method for verifying the authenticity of a document comprising a security feature using a mobile device, comprising performing the positioning method for positioning the mobile device relative to the security feature of the document, capturing an image of the security feature by the mobile device to obtain a security feature image, and comparing a feature of the security feature image with a feature of a reference image to verify the authenticity of the document. This ensures that the authenticity of the document can be verified efficiently.
  • the security feature image may be a black and white image, a grayscale image, or a color image.
  • the security feature image may also be a still image of a video.
  • the security feature image can be displayed on the display of the mobile device.
  • the reference image may be a black and white image, a grayscale image, or a color image.
  • the reference image may be stored in the mobile device.
  • the reference image may be displayed perspectively distorted or perspectively rectified on the display of the mobile device.
  • the feature of the security feature image may include a geometric arrangement of points or a geometric arrangement of lines in the security feature image.
  • the feature of the reference image may include a geometric arrangement of points, or a geometric arrangement of lines in the reference image.
  • the feature of the security feature image and / or the feature of the reference image can be detected and extracted by means of a feature detector.
  • the comparison of the feature of the security feature image with the feature of the reference image can be performed by means of a similarity measure, for example a normalized cross-correlation.
  • the authenticity of the document can be verified, for example, if the similarity measure exceeds a predetermined threshold.
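  • a minimal sketch of this decision step, assuming OpenCV (TM_CCOEFF_NORMED implements a zero-mean normalized cross-correlation; the 0.8 threshold is an illustrative value, not one given in the text):

```python
import cv2

def verify_security_feature(feature_img, reference_img, threshold=0.8):
    """Compare the captured security-feature image with the reference image
    using a zero-mean normalized cross-correlation and accept the document
    as genuine only if the similarity exceeds a predetermined threshold."""
    scores = cv2.matchTemplate(feature_img, reference_img, cv2.TM_CCOEFF_NORMED)
    similarity = float(scores.max())
    return similarity >= threshold, similarity
```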
  • the security feature has viewing angle dependent or illumination angle dependent properties.
  • the security feature may be, for example, a hologram.
  • the security feature may further include ink, which has optically variable or color iridescent properties.
  • the viewing angle dependent and / or illumination angle dependent properties may be represented by a spatially varying bidirectional reflectance distribution function (SVBRDF).
  • capturing the image of the security feature by the mobile device comprises extracting the image of the security feature from the image of the document. This provides the advantage that the image of the document can be used to capture the image of the security feature.
  • extracting the image of the security feature may be performed based on the determined position of the security feature in the document image.
  • capturing the image of the security feature by the mobile device comprises a perspective equalization of the image of the security feature relative to the current position. This provides the advantage that an equalized image of the security feature can be displayed and provided.
  • the security feature can be displayed and provided in a flat form.
  • the perspective equalizing can be carried out, for example, by means of geometric transformation matrices.
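  • a minimal sketch of such an equalization, assuming OpenCV and the four corner points of the security-feature region in the captured image (names are illustrative):

```python
import numpy as np
import cv2

def equalize_security_feature(image, quad_px, out_w, out_h):
    """Perspectively equalize the security-feature region: map its four
    corner points in the captured image onto an axis-aligned rectangle
    using a geometric transformation matrix (a homography)."""
    src = np.asarray(quad_px, dtype=np.float32)        # detected corners
    dst = np.array([[0, 0], [out_w, 0],
                    [out_w, out_h], [0, out_h]], dtype=np.float32)
    H = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, H, (out_w, out_h))
```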
  • the verification method comprises performing a further positioning procedure for positioning the mobile device relative to the security feature of the document at a further target position, capturing a further image of the security feature by the mobile device to obtain a further security feature image, and comparing a feature of the further security feature image with a feature of a further reference image to verify the authenticity of the document.
  • the mobile device comprises an illumination, wherein, during the capture of the image of the security feature by the mobile device, the security feature is illuminated by the illumination.
  • the illumination can emit white light, infrared light or ultraviolet light.
  • the illumination can be realized for example by means of a light-emitting diode (LED) or a halogen lamp.
  • the illumination can collimate and/or focus the emitted light.
  • the invention relates to a mobile device for the capture of a document which comprises a security feature, with an image camera for capturing an image of the document to obtain a document image, a processor which is configured to determine an actual position of the mobile device with respect to the document based on a perspective distortion of a document feature in the document image, wherein the processor is further configured to determine a position of the security feature in the document image, and a display for displaying the actual position and a target position of the mobile device with respect to the position of the security feature. This ensures that the mobile device can be positioned accurately and reproducibly relative to the security feature of the document.
  • the image camera may include an optical image sensor.
  • the image camera may further comprise imaging optics.
  • the actual position and/or the target position of the mobile device can be determined in relation to the image camera.
  • the processor may be configured to execute a computer program.
  • the processor can also be set up programmatically.
  • the display of the mobile device may be, for example, a liquid crystal display (LC display) or a thin film transistor display (TFT display).
  • the display of the mobile device may also be touch-sensitive.
  • the positioning method may be performed by the mobile device. Other features of the mobile device arise directly from the functionality of the positioning.
  • the image camera is further configured to capture an image of the security feature to obtain a security feature image
  • the processor is further configured to compare a feature of the security feature image to a feature of a reference image to verify the authenticity of the document.
  • the verification process may be performed by the mobile device. Other features of the mobile device arise directly from the functionality of the verification process.
  • the invention relates to a computer program having a program code for carrying out the positioning method or the verification method when the program code is executed on a computer. It is thereby achieved that the positioning method and / or the verification method can be carried out in an automated and repeatable manner.
  • the computer may include a processor, a memory, an input interface, and / or an output interface.
  • the computer can be realized in the form of the mobile device.
  • the computer program may be in the form of machine-readable program code.
  • the computer program can be executed, for example, by the processor of the mobile device.
  • the invention can be implemented in hardware and / or in software.
  • Fig. 1 is a schematic diagram of a positioning method for positioning a mobile device relative to a security feature of a document according to an embodiment;
  • Fig. 2 is a schematic diagram of a verification method for verifying the authenticity of a document that includes a security feature, according to an embodiment;
  • Fig. 3 is a schematic representation of a capture scenario for capturing an image of a document by a mobile device according to an embodiment;
  • Fig. 4 is a schematic representation of a capture scenario for capturing an image of a security feature of a document using an image camera and an illumination, according to an embodiment;
  • Fig. 5 is a schematic representation of a positioning scenario of a mobile device with a display with respect to a document with a security feature according to an embodiment;
  • Fig. 6 is a schematic representation of a positioning scenario of a mobile device with a display with respect to a document with a security feature according to an embodiment;
  • Fig. 7 is a schematic representation of a comparison of an actual position with a target position on a display of a mobile device according to an embodiment;
  • Fig. 8 is a schematic representation of a comparison of an actual position with a target position on a display of a mobile device according to an embodiment.
  • FIG. 1 shows a schematic diagram of a positioning method 100 for positioning a mobile device relative to a security feature of a document according to an embodiment.
  • the mobile device in this case comprises a display, which is not shown in Fig. 1.
  • the positioning method 100 includes capturing 101 an image of the document by the mobile device to obtain a document image, determining 103 an actual position of the mobile device with respect to the document based on a perspective distortion of a document feature in the document image, determining 105 a position of the security feature in the document image, and displaying 107 the actual position and a target position of the mobile device with respect to the position of the security feature on the display of the mobile device.
  • FIG. 2 shows a schematic diagram of a verification method 200 for verifying the authenticity of a document that includes a security feature, according to one embodiment.
  • the verification method 200 is carried out using a mobile device, which is not shown in Fig. 2.
  • the verification method 200 includes performing 201 the positioning method for positioning the mobile device relative to the security feature of the document, capturing 203 an image of the security feature by the mobile device to obtain a security feature image, and comparing 205 a feature of the security feature image with a feature of a reference image to verify the authenticity of the document.
  • FIG. 3 shows a schematic representation of a capture scenario 300 for capturing an image of a document 301 by a mobile device 305 according to one embodiment.
  • the document 301 includes a security feature 303.
  • the mobile device 305 includes a display 307, an image camera 309, an illumination 311, and a processor 313.
  • the mobile device 305 is configured to detect the document 301 that includes the security feature 303.
  • the mobile device 305 includes the image camera 309 for capturing an image of the document 301 to obtain a document image, the processor 313, which is configured to determine an actual position of the mobile device 305 with respect to the document 301 based on a perspective distortion of a document feature in the document image, wherein the processor 313 is further configured to determine a position of the security feature 303 in the document image, and the display 307 for displaying the actual position and a target position of the mobile device 305 with respect to the position of the security feature 303.
  • the image camera 309 may further be configured to capture an image of the security feature 303 to obtain a security feature image.
  • the processor 313 may further be configured to compare a feature of the security feature image with a feature of a reference image to verify the authenticity of the document 301.
  • the document 301 may be, for example, a banknote.
  • the document 301 may also be one of the following identification documents: an identity document such as an identity card, passport, access control card, authorization card, company card, tax stamp or ticket, a birth certificate, a driver's license or vehicle registration document, or a means of payment, for example a bank card or credit card.
  • the document 301 may be single-layered or multi-layered and paper- and/or plastic-based.
  • the document 301 can be constructed of plastic-based films which are joined together to form a body by means of adhesive bonding and / or lamination, the films preferably having similar material properties.
  • the security feature 303 may be optically detectable.
  • the security feature 303 may have viewing angle dependent or illumination angle dependent properties.
  • the security feature 303 may be, for example, a hologram.
  • the security feature 303 may further comprise ink, which has optically variable or color iridescent properties.
  • the mobile device 305 may be, for example, a mobile phone or a smartphone.
  • the location of the mobile device 305 may be changed by a user.
  • the actual position and the target position of the mobile device 305 may each be described by a pose of the mobile device 305.
  • the pose of the mobile device 305 is understood to mean an arrangement and an inclination of the mobile device 305 in space.
  • the display 307 of the mobile device 305 may be, for example, a liquid crystal display (LC display) or a thin film transistor display (TFT display).
  • the display 307 of the mobile device 305 may also be touch-sensitive.
  • the image camera 309 may include an optical image sensor.
  • the image camera 309 may further include image optics. The actual position and / or the target position of the mobile device 305 may be determined relative to the image camera 309.
  • the illumination 311 can emit white light, infrared light or ultraviolet light.
  • the illumination 311 can be realized for example by means of a light-emitting diode (LED) or a halogen lamp.
  • the illumination 311 can collimate and/or focus the emitted light.
  • the processor 313 may be configured to execute a computer program.
  • the processor 313 can also be set up by programming.
  • the capture of an image of the document 301 by the mobile device 305 is shown schematically.
  • the document 301 with the security feature 303 is located below the mobile device 305.
  • the upper right part of the document 301 and the security feature 303 are covered by the mobile device 305 and shown schematically on the display 307 of the mobile device 305.
  • the image of the document 301 with the security feature 303 can be captured by means of the image camera 309.
  • the security feature 303 can be illuminated by means of the illumination 311.
  • the image of the document 301 with the security feature 303 can be detected continuously during a movement of the mobile device 305.
  • the display 307 can be updated in real time.
  • the image of the document 301 with the security feature 303 may be a still image of a video.
  • the processor 313 may be configured to determine the actual position of the mobile device 305 with respect to the document 301 or the security feature 303 based on the captured image.
  • the current position and the target position of the mobile device 305 may be displayed on the display 307.
  • FIG. 4 shows a schematic illustration of a capture scenario 400 for capturing an image of a security feature 303 of a document 301 using an image camera 309 and an illumination 311 according to an embodiment.
  • the position of the image camera 309 relative to the security feature 303 may be represented by a vector d.
  • the position of the illumination 311 relative to the security feature 303 may be represented by a vector l.
  • the relative position of the image camera 309 to the illumination 311 can be represented by means of a vector o.
  • the vector o can be described by means of a rotation R of the image camera 309.
  • the viewing-angle-dependent and/or illumination-angle-dependent properties of the security feature 303 can be represented by means of a spatially varying bidirectional reflectance distribution function (SVBRDF).
  • if the illumination 311, for example an LED illumination, is in a fixed position with respect to the image camera 309, the illumination direction follows from the camera pose; one degree of freedom of the SVBRDF is thereby eliminated, and, in addition to the location on the document, only the 3 degrees of freedom of the camera rotation remain as input to the SVBRDF.
  • the positioning scenario 500 includes a sight beam 501, a beam base circle 503, a beam sub-circle 505, a first target horizon line 507A, a second target horizon line 507B, a base sphere 509, a beam upper circle 511, a first actual horizon line 513A, and a second actual horizon line 513B.
  • the sight beam 501 may indicate the target position of the mobile device in elevation and azimuth relative to the position of the security feature 303.
  • the sight beam 501 may pass through the center of the beam base circle 503 or the center of the beam sub-circle 505.
  • the beam base circle 503 may indicate an inclination of the mobile device with respect to the position of the security feature 303.
  • the beam sub-circle 505 may indicate a distance of the target position of the mobile device with respect to the position of the security feature 303.
  • the first target horizon line 507A and/or the second target horizon line 507B may indicate a rotation of the target position of the mobile device around the sight beam 501.
  • the base sphere 509 may indicate an inclination of the mobile device at the actual position.
  • the base sphere 509 can simply be displayed in the form of a dot.
  • the base sphere 509 may be displayed at a fixed point, such as the center, of the display 307 of the mobile device.
  • the beam upper circle 511 may be scaled with respect to the beam sub-circle 505.
  • the beam upper circle 511 can be displayed at a fixed location on the display 307 of the mobile device.
  • the first actual horizon line 513A and/or the second actual horizon line 513B may indicate a rotation of the mobile device at the actual position.
  • the first actual horizon line 513A and/or the second actual horizon line 513B may be displayed at a fixed location on the display 307 of the mobile device.
  • the positioning scenario 500 illustrates the positioning of a mobile device with a display 307 regarding a security feature 303 of a document 301.
  • the target position is represented by means of the sight beam 501, the beam base circle 503, the beam sub-circle 505, the first target horizon line 507A and the second target horizon line 507B.
  • the actual position is represented by the base sphere 509, the beam upper circle 511, the first actual horizon line 513A and the second actual horizon line 513B.
  • positioning of the mobile device relative to the security feature 303 of the document 301 can be effected by first placing the base sphere 509 within the beam base circle 503. Subsequently, the beam upper circle 511 can be brought into agreement with the beam sub-circle 505. Finally, the first actual horizon line 513A may be matched with the first target horizon line 507A. In addition, the second actual horizon line 513B may be matched with the second target horizon line 507B.
  • the positioning scenario 500 illustrates the geometric relationships of the positioning or alignment.
  • matching the actual position or current view with a target position or reference view can, according to an embodiment, be performed by matching the sight-beam direction, the position, for example by bringing the base sphere 509 on the display 307 into agreement with the beam base circle 503 and the beam upper circle 511 into agreement with the beam sub-circle 505, and the orientation, for example by bringing the target horizon lines 507A, 507B, i.e. the virtual horizon of the target view, into agreement with the actual horizon lines 513A, 513B, i.e. the virtual horizon on the display 307.
  • the diameter of the beam upper circle 511 may, according to an embodiment, be adaptively scaled with respect to the diameter of the beam sub-circle 505.
  • the sight beam 501 is not displayed, according to one embodiment.
  • FIG. 6 shows a schematic representation of a positioning scenario 600 of a mobile device 305 with a display 307 with respect to a document 301 with a security feature 303 according to an embodiment.
  • the mobile device 305 further comprises an image camera 309, an illumination 311 and a processor 313.
  • the target position is spatially represented by means of the sight beam 501, the beam base circle 503, the beam sub-circle 505, the first target horizon line 507A and the second target horizon line 507B with respect to the actual position of the mobile device 305.
  • the target position is displayed in perspective with the sight beam 501, the beam base circle 503, the beam sub-circle 505, the first target horizon line 507A and the second target horizon line 507B on the display 307 of the mobile device 305 and superimposed on the captured image of the document 301.
  • the actual position is displayed on the display 307 of the mobile device 305 by means of the base sphere 509, the beam upper circle 511, the first actual horizon line 513A and the second actual horizon line 513B.
  • the actual position can be brought into agreement with the target position.
  • the displayed actual position and the displayed nominal position can be updated in real time.
  • an image of the security feature 303 can be captured by the image camera 309.
  • the security feature 303 can be illuminated by the illumination 311.
  • the processor 313 may be configured to compare a feature of the security feature image with a feature of a reference image to verify the authenticity of the document 301.
  • FIG. 7 shows a schematic representation of a comparison of an actual position with a desired position on a display 307 of a mobile device according to an embodiment.
  • the mobile device is at a first actual position relative to the desired position.
  • the mobile device is at a second actual position relative to the desired position.
  • the mobile device is at a third actual position relative to the desired position.
  • the mobile device is at a fourth actual position relative to the desired position. In the first actual position, the mobile device is not positioned. In the second actual position and the third actual position, the mobile device is partially positioned. In the fourth actual position, the mobile device is positioned.
  • the actual position is displayed on the display 307 of the mobile device by means of a base sphere, a beam upper circle, a first actual horizon line and a second actual horizon line.
  • the actual position is displayed in the form of solid lines.
  • the target position is displayed in perspective on the display 307 of the mobile device by means of a sight beam, a beam base circle, a beam sub-circle, a first target horizon line and a second target horizon line.
  • the target position is displayed in the form of dashed lines.
  • a comparison of the actual position with the target position can be carried out, for example, as follows. Starting from the first actual position of the mobile device in diagram 701, initially the base sphere can be arranged in the beam base circle. In addition, the mobile device can be positioned directly in the sight beam. Thus, a positioning of the mobile device in azimuth and elevation relative to the security feature can be achieved. This positioning corresponds to the second actual position of the mobile device in diagram 703.
  • subsequently, the beam upper circle can be brought into agreement with the beam sub-circle.
  • a positioning of the mobile device with respect to the distance to the security feature can be achieved. This positioning corresponds to the third actual position of the mobile device in diagram 705.
  • the first actual horizon line can then be brought into agreement with the first target horizon line.
  • the second actual horizon line may be brought into agreement with the second target horizon line. This positioning corresponds to the fourth actual position of the mobile device in diagram 707.
  • diagram 701 corresponds to a start position, diagram 703 to a positioning of the base point, diagram 705 to a positioning of the viewpoint, and diagram 707 to a positioning of the rotation about the sight beam.
  • this may provide an interactive system for the verification of viewing-angle-dependent security features that captures an SVBRDF using an illumination, such as a built-in light-emitting diode (LED) of the mobile device.
  • the user can obtain an overview of relevant target positions or viewing directions for a verification, which can be color-coded with respect to the decisions of the user.
  • the system may allow the user to accurately match given target positions or reference views and compare changes or changes in holographic or other security features with corresponding reference images or references.
  • diagram 701 illustrates a non-aligned actual position, diagram 703 an alignment of the direction, diagram 705 an adjustment of the distance, and diagram 707 an alignment of the rotation.
  • FIG. 8 shows a schematic representation of a comparison of an actual position with a target position on a display 307 of a mobile device according to an embodiment.
  • the mobile device is at a first actual position relative to the desired position.
  • the mobile device is at a second actual position relative to the desired position. In the first actual position, the mobile device is not positioned. In the second actual position, the mobile device is positioned.
  • the actual position is indicated on the display 307 of the mobile device by means of a base sphere, a tolerance range 805 of the beam upper circle, a tolerance range 807A of the first actual horizon line and a tolerance range 807B of the second actual horizon line.
  • the actual position of the mobile device is displayed in the form of solid lines.
  • the target position is displayed in perspective on the display 307 of the mobile device by means of a sight beam, a beam base circle, a beam sub-circle, a first target horizon line and a second target horizon line.
  • the target position is displayed in the form of dashed lines.
  • a matching of the actual position with the target position can be carried out in accordance with FIG. 7, taking into account the tolerance ranges 805, 807A, 807B.
  • a positioning of the mobile device relative to the security feature within predetermined accuracy tolerance values can be achieved.
  • a first display field 809 and/or a second display field 811 can be displayed on the display 307.
  • a reference image may be displayed in the first display field 809.
  • Diagram 801 corresponds to a non-positioned or non-aligned actual position.
  • Diagram 803 corresponds to a positioned or aligned actual position.
  • the reference image or the reference patch can each be displayed in the upper part of the display 307.
  • Other elements may embody areas for positioning or orientation.
  • Viewing-angle-dependent security features, e.g. holograms, change their appearance depending on the viewing angle or viewing direction d and on the existing illuminations or light sources.
  • typically, illustrated instructions are used to test such security features or elements.
  • the respective person can compare different desired views or reference views, which can each be given by a reference image or example image, with the security feature or element to be tested.
  • the lighting situation is usually not fixed. Also, there is no guarantee that the person will view the security feature or element from the target direction, or that all given target views or reference views will be checked.
  • it can be assumed that the document type or class of document, e.g. a 50 Euro note, including its reference layout, is known.
  • it can further be assumed that the position of the document relative to the camera of the mobile device or recording device, i.e. the extrinsic parameters, is known and can be accurately tracked. For example, a recognition and/or a planar tracking of images can be performed with mobile devices or mobile phones.
  • the camera can be described by its intrinsic parameters, e.g. principal point, focal length, etc.
  • the tracking system may provide extrinsic parameters as a rotation and a translation, which may be used to place additional information, e.g. according to the reference layout, over the document.
  • the appearance of each pixel (x, y) on the document may be dependent on the type and distribution of the illuminations or light sources in the environment.
  • the appearance can be largely determined by a dominant illumination or light source with direction of incidence l.
  • the intensity I of each color channel can thus be dependent on the pixel, the direction l of the illumination or light source, and the viewing direction d.
  • I = I(x, y; l, d), with 6 degrees of freedom. This may correspond to a spatially varying bidirectional reflectance distribution function (SVBRDF).
  • an illumination or light source, for example a light-emitting diode (LED), can be used for this purpose.
  • the lighting or light source can be attached to the camera.
  • the relative position between the camera and the illumination or light source, such as the light-emitting diode (LED), may be described by a translation vector o.
  • the direction of the illumination or light source is thus proportional to the position of the camera plus the translation vector o rotated by the camera rotation R.
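  • expressed as a short formula sketch (using P for the camera position, R for the camera rotation, o for the fixed light offset, and d for the viewing direction):

```latex
\mathbf{l} \propto \mathbf{P} + R\,\mathbf{o}, \qquad \mathbf{d} \propto \mathbf{P}
\quad\Longrightarrow\quad
I(x, y;\, \mathbf{l}, \mathbf{d}) \;\longrightarrow\; I(x, y;\, R)
```

  • since l is tied to the camera pose in this way, the directional inputs of the SVBRDF are no longer independent, which is what enables the degree-of-freedom reduction described next.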
  • the camera position itself is just a rotated vector; accordingly, one degree of freedom can be omitted. Since one is interested in the verification of strong changes, a continuous representation can be dispensed with, and only corresponding target image areas or reference patches need to be recorded. If it can be ensured that the mobile device or recording device assumes the respective target position or reference position, the appearance of the image area or patch is also reproducible. This makes it possible to test the security feature or element.
  • the position or spatial pose, for example in the form of extrinsic parameters, still comprises 6 degrees of freedom. The user should therefore be guided to the respective target positions or reference positions. For this, an intuitive approach for positioning or aligning to an arbitrary 3D target position or reference pose can be used. The approach is based on the principles of a rear sight and front sight (iron sights) as well as a virtual horizon.
  • the existing tracking information for the current view can be used to place specific alignment elements, such as a sight line, circles for rear sight and front sight, or horizon lines or bars for the virtual horizon, directly over the document and track them in real time.
  • the other elements used for positioning or alignment may remain static on the display or screen.
  • the adjustment can be done in 3 steps.
  • the user may first match the camera center with the base point of the target view or reference view, i.e. the static red dot in the middle of the display or screen.
  • then the camera rotation and the distance can be adapted, for example by means of green circles serving as rear sight and front sight.
  • finally, the rotation around the sight line can be adjusted, for example by means of blue/black horizon lines or bars forming a virtual horizon. Now an image can be captured and the relevant image area or patch can be extracted using the existing position or pose information.
  • the approach to detecting or capturing viewing-angle-dependent security features or elements can achieve high accuracy in the alignment with the target views or reference views despite purely manual positioning or alignment.
  • the comparison of the image area or patch can be performed by means of a Normalized Cross Correlation (NCC).
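  • for reference, the NCC of an image patch A and a reference patch B can be written as follows (a standard definition; the ZNCC variant mentioned below subtracts the patch means from A and B first):

```latex
\mathrm{NCC}(A, B) =
\frac{\sum_{i,j} A(i,j)\,B(i,j)}
     {\sqrt{\sum_{i,j} A(i,j)^{2}}\,\sqrt{\sum_{i,j} B(i,j)^{2}}}
```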
  • an automatic recording when approaching the respective target position or reference pose is also possible.
  • an implementation is possible due to the position or pose information provided by the tracking system.
  • the invention enables a reproducible detection or recording of viewing angle-dependent security features or elements as a basis for manual or automatic verification or testing of security features or elements on mobile devices.
  • a clear link between viewing direction and appearance can be guaranteed. There is no need for a stationary arrangement or special equipment. Deviations of the actual position or recording position from the target position or ideal position can also be determined and displayed. In addition, an automation of the detection or recording can be performed by evaluating the deviation between the nearest or most similar target position or reference position and the current actual position or pose of the mobile device.
  • a workflow can include the following steps. First, a determination or recognition of the document type or document class can be performed. Subsequently, a tracking of the document and a visualization of the associated layout can be performed. Thereafter, a selection of the security feature or element to be tested can take place. Available target views or reference views can then be displayed. Thereafter, an approach of the mobile device by the user and a triggering of the detection or recording process can be performed. Subsequently, a manual comparison of the reference image or reference patch and the captured image or patch can be performed. Finally, the user can make a local decision per view; the global decision regarding authenticity can likewise be made by the user.
  • an extension can include the following options. First, a continuous extraction and display of the rectified hologram image area or hologram patch can be performed. In addition, an automatic comparison of the rectified image area or patch with an associated target view or reference view can be performed, for example by means of a zero-mean normalized cross-correlation (ZNCC). Furthermore, an edge image of the image area or patch can, for example, be compared using an adequate similarity measure, e.g. a Hausdorff distance, or a combination of several similarity measures can be used.
  • an automatic detection or recording can be performed once there is sufficient agreement with a target view or reference view.
  • an indication of the last acquired or recorded image area or patch can be made when approaching a target view or reference view, and a possibility for re-evaluation can be provided.
  • the workflow can lean on the procedure of an illustrated manual, in which typically a continuous comparison takes place.
  • a preliminary decision can be made by the system, which can be corrected by the user at any time.
  • the automatic comparison can be repeated and the visualization of the decision made updated.
  • the system can thus decide locally with regard to the current view, and globally with regard to the security feature or element, based on a similarity, for example by means of a similarity measure.
  • the invention relates to a method for mobile interactive hologram verification.
  • Verification of paper documents is an important part of verifying a person's identity, access authorization, or means of payment.
  • Many documents, such as passports or banknotes, include holograms or other viewing-angle-dependent security features or elements that are difficult to forge and are therefore used to verify the authenticity of these documents.
  • Viewing-angle-dependent security features or elements change their appearance based on both the viewing direction and the dominant illuminations or light sources, requiring specialized knowledge and training to distinguish original security features or elements from counterfeits.
  • a mobile interactive method may be used that combines or integrates the recognition of the documents with the interactive verification of the viewing angle dependent security features or elements.
  • the system may recognize and track the paper document, provide user guidance for visual positioning or visual alignment, display a stored image of the security feature or element appearance, and also record user decisions based on the current view of the document.
  • the underlying spatially varying BRDF representation of viewing-angle-dependent security features or elements can be modeled and captured.
  • the feasibility of verifying such security features or elements with a mobile augmented reality (AR) system using information from a real-time tracking system running on hardware may be considered.
  • the system can assist in efficiently detecting these security features or elements and provide a comparison for the user between the expected appearance of a viewing-angle-dependent security feature and the one actually observed at the current viewing direction.
  • an approach to actively positioning or aligning a mobile device with a 3D target position or reference pose using rear sight and front sight as well as a virtual horizon may be employed.
  • the user can then decide whether the security feature or element is accepted or rejected as genuine. While a comparison from a single point of view may suffice for a rejection, for an acceptance it is desirable to check all important viewing directions representing the different appearances. Consequently, the system can guide the user to the different directions and record the progress along the way.
  • Viewing-angle-dependent security features produce appearances that vary significantly depending on the viewing direction as well as on the dominant illumination or light direction. Therefore, a simple single texture image usually cannot capture the full appearance of such security features.
  • Viewing-angle-dependent security features or elements can be described using a spatially varying BRDF (SVBRDF) representation, which captures both the dependence on viewing and illumination angles and the spatial variation of the appearance.
  • planar, thin surfaces, such as printed documents, are of interest here, and therefore accurate models of self-shadowing or sub-surface scattering effects are not important.
  • a 6D appearance model per color channel may be used, where the radiance I is a function of both the location (x, y) on the document and the incident illumination direction l and the viewing direction d, according to: I = I(x, y; l, d).
  • the direction vectors l and d have unit length and therefore have only 2 degrees of freedom each.
  • a representation of a representative image of the viewpoint dependent security feature or element is primarily of interest to the user.
  • several simplifying assumptions can be made. It can be assumed that all radiation from a point on the security feature or element is dominated by a single main illumination direction or main light source direction. Consequently, integration over all incident illumination directions or directions of light can be dispensed with, and a single snapshot or image acquisition can be sufficient given the dominant direction.
  • full radiometric calibration and control of auto-exposure or white balance in the camera can likewise be dispensed with. The appearance can then simply be captured as a set of images indexed by the viewing direction d and the illumination or light direction l.
  • an illumination or light source, such as a light-emitting diode (LED), may be used.
  • LED light emitting diode
  • the illumination or light source may be, for example, the light-emitting diode (LED) of the mobile device.
  • LED light emitting diode
  • the illumination or light direction l is then a function of the camera position or camera pose with respect to the document.
  • the illumination or light direction is then proportional to the camera position P plus an offset vector o rotated by the camera rotation into world coordinates, i.e. l ∝ P + R·o, where R denotes the camera rotation.
  • the representation can be reduced to a 5D model indexed by the full 3D camera rotation and the location (x; y) on the document.
  • a mobile augmented reality (AR) device can be used to verify a hologram.
  • the system can estimate the current viewing direction and / or camera pose.
  • the camera rotation indexes the plurality of appearances of a reference security feature or reference element.
  • the reference image is displayed for comparison with an image acquired, for example, from the live video stream.
  • the system may include an active user-guidance component instructing the user to compare a specific set of primary views.
  • a mobile visual search pipeline can be used, which runs independently on the mobile device.
  • SURF features can be calculated and arranged in a hierarchical k-means tree, and geometric verification can be performed using robust homography estimation to reintroduce spatial information (see the recognition sketch following this list). This provides adequate recognition performance and scales to a large number of documents.
  • a natural feature tracker may be configured with a representative example of the recognized document class or the recognized document type.
  • the natural feature tracking implementation can run in real time directly on the mobile device.
  • the tracker may be initialized by estimating a position or pose from matched BRISK features extracted from a template selected during the visual search. Both detection and tracking can be based on the assumption that the objects are planar. This often does not hold exactly for paper documents; however, since this leads in most cases not to a tracking failure but to position or pose jitter, the positions or poses can be smoothed in a ring buffer to improve stability (see the smoothing sketch following this list). Averaging the position or pose over 2-3 frames or images stabilizes the view, while the delay introduced by this particular arrangement is small.
  • the system guides the user to capture a frame or image from the same viewing direction and under the same direction of illumination or direction of light as captured in the reference image set.
  • with an illumination or light source, such as the light-emitting diode (LED) of the mobile device or mobile phone, as the dominant illumination or light source, the task simplifies to positioning or aligning the current actual pose of the user's mobile device or mobile phone with several nominal positions or reference views, each having 6 degrees of freedom.
  • LED light emitting diode
  • a visual guidance approach based on two instruments, namely a notch-and-bead sight and a virtual horizon, can be used to align a device with high accuracy to any desired 3-dimensional target position or reference pose.
  • the notch and the bead can be used to align the user's viewing direction with the direction of the device.
  • shaped alignment marks arranged at a predetermined distance on the device can be used. Whether distance or scaling must be considered depends on the task and may require a calibration procedure. This principle is used in aiming mechanisms.
  • the virtual horizon may be a level indicator used when aiming a device relative to the ground.
  • the instrument indicates the attitude of the object relative to earth's gravity. Implementations range from simple spirit levels for mechanical tasks to electronic instruments in aircraft.
  • a visual orientation can be divided into three steps.
  • the direction of the sighting beam (i.e., notch and bead), the position along the beam, and the in-plane rotation (i.e., the virtual horizon) can be adjusted in turn. It is desirable to guide the user through these steps so that an accurate alignment can be performed (see the alignment sketch following this list).
  • the guidance approach can be realized by means of an interactive mobile device.
  • the notch-and-bead positioning can be realized by two large circles marking the start and end points of the sighting beam.
  • the lower beam circle can be scaled so that it completely overlaps the upper circle once direction and distance are correct.
  • a small beam base circle can additionally be used, which is intended to overlap with a small sphere fixed on the device display.
  • the virtual-horizon assembly may comprise two lines located at the top of the beam and two corresponding lines fixed on the display. Using a different color for each pair of lines resolves a possible ambiguity in the rotation about the optical axis. The following color scheme can support the 3-step alignment or positioning approach: red for the base or display sphere and for the small beam base circle, green for the large beam base circle as well as for the upper beam circle, and blue/yellow for the lines of the virtual horizon.
  • an automatic pre-selection can be performed by the system, whereby the full notch-and-bead and virtual-horizon visualization is drawn only for the selected target position or reference pose.
  • the similarity can be determined, for example, by means of a similarity measure or a similarity metric.
  • the color of the sighting beam or reference beam can be adjusted accordingly. This gives the user an overview of the decisions already made when viewing the arrangement from farther away, and also shows where no decision has yet been taken.
  • the last captured beam associated with the current desired position or reference pose may also be drawn, so that the user can get an idea of how well the captured views fit.
  • desired positions or reference poses may depend on the hologram, for example on the number of appearance transitions, and may be restricted by the particular arrangement used. For each view, stable tracking and a reproducible appearance are desirable. In practice it is reasonable to operate at an approximately constant distance from the hologram, resulting in a hemispherical sensing space. For the holograms, for example, 2 to 6 views with stable appearance can be captured at a distance of approximately 10 cm.
  • image capture may be initiated by the user as soon as the current actual position or pose is judged, from the visual feedback, to be close enough to a desired position or reference pose.
  • an autofocus action may be initiated and the tracking position or pose checked for stability before the current image or frame and pose are recorded (see the stability sketch following this list). This is intended to prevent capturing pose jitter or blurred regions.
  • the hologram is planar, and its bounding box can be projected into the image using the current position or pose. An image transformation between the hologram region and the undistorted original can then be estimated, and the subimage containing the hologram can consequently be rectified (see the comparison sketch following this list). The appearance of the rectified region corresponds to the selected viewing direction, which allows an efficient comparison.
  • This region can be displayed next to a reference image region. The user can assess the similarity and express acceptance, uncertainty, or rejection of the verification.
  • an area-based SVBRDF representation may be captured with a mobile device using a dominant illumination or light source in a fixed location relative to a camera of the mobile device. Then, a user may verify a perspective-dependent security feature or element by comparing the captured live image with a stored SVBRDF representation.
  • a guidance approach based on a notch-and-bead sight and a virtual horizon can be used. This allows the user to match the position or pose of the device or phone with pre-recorded views of the hologram.
  • This approach can be implemented in a mobile interactive augmented reality (AR) system that performs document recognition and tracking and supports hologram verification.
  • AR augmented reality
  • viewing-angle-dependent security features or elements, such as holograms or elements printed with optically variable or color-changing ink, change their appearance depending on the viewing angle and/or the type of lighting or light sources in the environment.
  • detection of the entire representation of the security feature is dispensed with and only a few selected nominal positions or poses are used.
  • the invention relates to a method for capturing an SVBRDF of perspective-dependent security features or elements with mobile devices.
  • the invention relates to a method for recording viewing-angle-dependent security features or elements on documents by means of mobile devices or mobile phones and representing them as an SVBRDF in the form of detail images for document verification.
  • Detecting perspective-dependent security features or elements may be performed by capturing a spatially varying BRDF (SVBRDF).
  • SVBRDF spatially-varying BRDF
  • This 6D function characterizes the amount of radiance reflected from each surface point when viewed from direction d and illuminated from direction l.
  • reproducible image-acquisition results for viewing-angle-dependent security features or elements are achieved. This allows sharp or significant changes in appearance to be captured.
  • a relatively small amount of image data and pose information can be stored.
  • recording of perspective-dependent security features or elements with mobile devices can be achieved, which is suitable for document verification.
  • a semi-automatic approach is performed with automatic capture and matching during runtime.
  • the invention relates to a method for aligning a mobile device at any position or pose with 6 degrees of freedom for augmented reality (AR) applications.
  • AR augmented reality
  • the invention relates to an approach for actively aligning a device, for example a mobile device, to a 3D target position or reference pose by means of a notch-and-bead sight and a virtual horizon.
  • the positioning can be divided into three steps. First, the direction of the sighting beam can be adjusted, for example by means of the notch and bead. Then the distance along the beam can be adjusted, for example via scaling. Finally, the rotation about the beam can be adjusted, for example by means of the virtual horizon.
  • the alignment process is divided into 3 steps and offers visual feedback during the process, high alignment accuracy, the ability to visualize deviations, and the ability to provide additional hints to the user.
  • This realizes an intuitive alignment of a device to any position or pose with 6 degrees of freedom, with visual feedback.
  • a high alignment accuracy can be realized by means of a simple or unrestricted arrangement, which is suitable for small augmented reality (AR) areas, and can be used for mobile applications.
  • AR augmented reality
  • a semi-automatic approach is used which improves the final positioning or orientation by automatically selecting positions or poses that better match the given target position or reference pose. This can reduce the cognitive load on a user.
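As a worked summary of the appearance model described above (a reconstruction from the statements in this list; the exact notation of the original disclosure is not reproduced here), the per-channel radiance and its reduction to a 5D model can be written as:

```latex
% Per color channel c: radiance as a function of the document location
% and of the unit-length illumination and viewing directions
% (2 degrees of freedom each, so the model is 6-dimensional overall).
I_c = f_c(x, y, \mathbf{l}, \mathbf{d}), \qquad \mathbf{l}, \mathbf{d} \in S^2
% With the LED rigidly mounted at offset o from the camera, the light
% direction follows from the camera pose (position P, rotation R):
\mathbf{l} \propto \mathbf{P} + R\,\mathbf{o}
% l is therefore no longer a free variable, and the appearance can be
% indexed by the 3D camera rotation and the document location alone:
I_c = g_c(x, y, R) \quad \text{(5D model)}
```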
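The document-recognition step (SURF features, hierarchical k-means tree, robust homography verification) could be realized along the following lines. This is a minimal sketch using OpenCV; the database layout (`db_descriptors`, `db_points`), thresholds, and parameter values are assumptions for illustration, not part of the original disclosure:

```python
import cv2
import numpy as np

# SURF is available in opencv-contrib (xfeatures2d); threshold assumed.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)

# FLANN's k-means index approximates the hierarchical k-means tree used
# for fast descriptor lookup (algorithm=2 is FLANN_INDEX_KMEANS).
flann = cv2.FlannBasedMatcher(dict(algorithm=2, branching=32),
                              dict(checks=64))

def recognize(query_img, db_descriptors, db_points, min_inliers=10):
    """Return a homography to the best-matching document, or None."""
    kp, des = surf.detectAndCompute(query_img, None)
    if des is None:
        return None
    matches = flann.knnMatch(des, db_descriptors, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]
    if len(good) < min_inliers:
        return None
    src = np.float32([kp[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([db_points[m.trainIdx] for m in good]).reshape(-1, 1, 2)
    # Geometric verification: robust homography estimation (RANSAC)
    # reintroduces the spatial information discarded by the tree lookup.
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None or int(mask.sum()) < min_inliers:
        return None
    return H
```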
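The ring-buffer pose smoothing over 2-3 frames could look as follows; a minimal sketch assuming poses are given as quaternion/translation pairs (the original implementation details are not disclosed):

```python
from collections import deque
import numpy as np

class PoseSmoother:
    """Average the last few tracked poses to damp pose jitter."""

    def __init__(self, window=3):        # 2-3 frames, as in the text
        self.buf = deque(maxlen=window)  # the ring buffer

    def push(self, quat, trans):
        q = np.asarray(quat, dtype=float)
        # Keep quaternions in one hemisphere so averaging is meaningful.
        if self.buf and np.dot(q, self.buf[-1][0]) < 0.0:
            q = -q
        self.buf.append((q, np.asarray(trans, dtype=float)))

    def smoothed(self):
        # Plain averaging is adequate for small frame-to-frame changes.
        q = np.mean([q for q, _ in self.buf], axis=0)
        t = np.mean([t for _, t in self.buf], axis=0)
        return q / np.linalg.norm(q), t
```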
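The three-step guidance (beam direction, position along the beam, in-plane rotation) amounts to splitting the 6-DoF pose difference into three scalar errors. A sketch of one way to do this, assuming poses are expressed in document coordinates and the camera's optical axis is the third column of the rotation matrix:

```python
import numpy as np

def alignment_errors(cur_pos, cur_rot, ref_pos, ref_rot):
    """Return (direction error [deg], distance error, roll error [deg])."""
    # 1) Notch and bead: angle between current and reference view rays.
    d_cur, d_ref = cur_rot[:, 2], ref_rot[:, 2]
    dir_err = np.degrees(np.arccos(np.clip(d_cur @ d_ref, -1.0, 1.0)))
    # 2) Position along the beam: difference in distance to the document.
    dist_err = np.linalg.norm(cur_pos) - np.linalg.norm(ref_pos)
    # 3) Virtual horizon: residual rotation about the optical axis.
    rel = ref_rot.T @ cur_rot
    roll_err = np.degrees(np.arctan2(rel[1, 0], rel[0, 0]))
    return dir_err, dist_err, roll_err
```

Each of the three errors can drive one of the visual cues described above (circle overlap, circle scaling, horizon lines).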
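The stability check before triggering a capture could be a simple heuristic over the recent poses; the tolerances below are assumed values for illustration only:

```python
import numpy as np

def pose_stable(recent_poses, trans_tol=0.005, rot_tol_deg=1.0):
    """True if the buffered (quaternion, translation) poses barely move."""
    ts = np.array([t for _, t in recent_poses])
    if np.linalg.norm(ts - ts.mean(axis=0), axis=1).max() > trans_tol:
        return False                      # translation jitter too large
    q_ref = recent_poses[-1][0]
    for q, _ in recent_poses:
        dot = abs(float(np.dot(q, q_ref)))
        angle = 2.0 * np.degrees(np.arccos(np.clip(dot, -1.0, 1.0)))
        if angle > rot_tol_deg:
            return False                  # rotation jitter too large
    return True
```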
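Rectifying the hologram region and comparing it with the stored reference view could be sketched as follows, assuming a homography `H_template_to_image` that maps the undistorted template into the live frame; normalized cross-correlation is only one possible similarity metric:

```python
import cv2

def rectify_hologram(frame, H_template_to_image, out_size=(160, 160)):
    # Warping with the inverse map pulls the hologram region of the live
    # frame back onto the undistorted template geometry.
    return cv2.warpPerspective(frame, H_template_to_image, out_size,
                               flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)

def patch_similarity(live_patch, ref_patch):
    # Equal-size patches: matchTemplate returns a single NCC score.
    score = cv2.matchTemplate(live_patch, ref_patch, cv2.TM_CCOEFF_NORMED)
    return float(score.max())
```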

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)
PCT/EP2014/070020 2013-09-30 2014-09-19 Positionierverfahren für die positionierung eines mobilgerätes relativ zu einem sicherheitsmerkmal eines dokumentes WO2015044051A1 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP14780428.0A EP3053098A1 (de) 2013-09-30 2014-09-19 Positionierverfahren für die positionierung eines mobilgerätes relativ zu einem sicherheitsmerkmal eines dokumentes

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102013110785.0 2013-09-30
DE102013110785.0A DE102013110785A1 (de) 2013-09-30 2013-09-30 Positionierverfahren für die positionierung eines mobilgerätes relativ zu einem sicherheitsmerkmal eines dokumentes

Publications (1)

Publication Number Publication Date
WO2015044051A1 true WO2015044051A1 (de) 2015-04-02

Family

ID=51659613

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2014/070020 WO2015044051A1 (de) 2013-09-30 2014-09-19 Positionierverfahren für die positionierung eines mobilgerätes relativ zu einem sicherheitsmerkmal eines dokumentes

Country Status (3)

Country Link
EP (1) EP3053098A1 (it)
DE (1) DE102013110785A1 (it)
WO (1) WO2015044051A1 (it)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2644513C1 (ru) 2017-02-27 2018-02-12 Общество с ограниченной ответственностью "СМАРТ ЭНДЖИНС СЕРВИС" Способ детектирования голографических элементов в видеопотоке
DE102018209366B4 (de) * 2018-06-12 2020-07-02 Audi Ag Verfahren zur Bestimmung einer Position und/oder Orientierung einer Einrichtung
GB202111974D0 (en) * 2021-08-20 2021-10-06 Andrews & Wykeham Ltd Optical authentication structure with augmented reality feature


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8953037B2 (en) * 2011-10-14 2015-02-10 Microsoft Corporation Obtaining spatially varying bidirectional reflectance distribution function

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007087498A2 (en) * 2006-01-23 2007-08-02 Digimarc Corporation Methods, systems, and subcombinations useful with physical articles
EP1852752A2 (en) * 2006-04-28 2007-11-07 Kabushiki Kaisha Toshiba Verification apparatus and verification method for recording media
US20090154813A1 (en) * 2007-12-12 2009-06-18 Xerox Corporation Method and apparatus for validating holograms
EP2320390A1 (en) * 2009-11-10 2011-05-11 Icar Vision Systems, SL Method and system for reading and validation of identity documents
DE102013101587A1 (de) * 2013-02-18 2014-08-21 Bundesdruckerei Gmbh Verfahren zum überprüfen der echtheit eines identifikationsdokumentes

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
K. CHINTAMANI ET AL.: "Improved Telemanipulator Navigation During Display-Control Misalignments Using Augmented Reality Cues", SYSTEMS, MAN AND CYBERNETICS, PART A: SYSTEMS AND HUMANS, vol. 40, no. 1, 2010, pages 29 - 39, XP011344797, DOI: doi:10.1109/TSMCA.2009.2030166
Y.-C. CHENG ET AL.: "AR-based Positioning for Mobile Devices", INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING WORKSHOPS, 2011, pages 63 - 70, XP032460956, DOI: doi:10.1109/ICPPW.2011.48

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017055277A1 (de) * 2015-10-01 2017-04-06 Bundesdruckerei Gmbh Dokument und verfahren zum verifizieren eines dokuments
CN106682912A (zh) * 2015-11-10 2017-05-17 艾普维真股份有限公司 3d结构的认证方法
WO2017080975A1 (en) * 2015-11-10 2017-05-18 Alpvision S.A. Method and apparatus for authentication of a 3d structure
CN106682912B (zh) * 2015-11-10 2021-06-15 艾普维真股份有限公司 3d结构的认证方法
CN108780595A (zh) * 2016-03-14 2018-11-09 凸版印刷株式会社 识别装置、识别方法、识别程序以及包含识别程序的计算机可读介质
EP3432278A4 (en) * 2016-03-14 2019-11-27 Toppan Printing Co., Ltd. IDENTIFICATION DEVICE, IDENTIFICATION METHOD, IDENTIFICATION PROGRAM, AND COMPUTER-READABLE MEDIUM CONTAINING IDENTIFICATION PROGRAM
US10891818B2 (en) 2016-03-14 2021-01-12 Toppan Printing Co., Ltd. Identification devices, identification methods, identification programs and computer readable media including identification programs
US11443559B2 (en) 2019-08-29 2022-09-13 PXL Vision AG Facial liveness detection with a mobile device
US11669607B2 (en) 2019-08-29 2023-06-06 PXL Vision AG ID verification with a mobile device

Also Published As

Publication number Publication date
DE102013110785A1 (de) 2015-04-02
EP3053098A1 (de) 2016-08-10

Similar Documents

Publication Publication Date Title
EP3053098A1 (de) Positionierverfahren für die positionierung eines mobilgerätes relativ zu einem sicherheitsmerkmal eines dokumentes
US9940515B2 (en) Image analysis for authenticating a product
EP2956915B1 (de) Verfahren zum überprüfen der echtheit eines identifikationsdokumentes
Hui et al. Reflectance capture using univariate sampling of brdfs
EP3025200A1 (de) Verfahren und gerät zur verifikation von diffraktiven elementen
DE112010002174T5 (de) Verfahren und vorrichtung für ein praktisches 3d-sehsystem
WO2016170041A1 (de) Verfahren zur identifikation eines sicherheitsmusters über eine artifizielle 3-d-rekonstruktion
DE102017114081B4 (de) Vorrichtung und Verfahren zum Rundum-Inspizieren von Behältnissen am Transportband
US20190114762A1 (en) Computer-Controlled 3D Analysis Of Collectible Objects
CH709003B1 (de) Distanzbestimmung aus Bildern mit Referenzobjekt.
Hartl et al. Mobile interactive hologram verification
WO2017042348A2 (de) Verfahren und vorrichtung zum überlagern eines abbilds einer realen szenerie mit einem virtuellen bild und mobiles gerät
EP3158543B1 (de) Verfahren zum detektieren eines blickwinkelabhängigen merkmals eines dokumentes
EP3357043B1 (de) Dokument und verfahren zum verifizieren eines dokuments
AT511460B1 (de) Verfahren zur bestimmung der position eines luftfahrzeugs
CN105679179A (zh) 防伪标签及其制作方法、识别方法
CH717006B1 (de) Verfahren zur Benutzeridentifikation.
DE102007033835B4 (de) Bildaufnahmetechnik zur direkten Objektsegmentierung in Bildern
DE102016117491A1 (de) Erfassungsvorrichtung zum Erfassen eines Gesichtsbildes einer Person
EP3069295B1 (en) Image analysis for authenticating a product
WO2016131812A1 (de) Mobilgerät zum erfassen eines textbereiches auf einem identifikationsdokument
EP3200154B1 (de) Verfahren zum bestimmen einer position eines objekts
EP2899699B1 (de) Verfahren zum Überprüfen der Echtheit eines Identifikationsdokumentes
CN113192008B (zh) 一种证件数字图像的光场防篡改采集装置及防篡改方法
CN110647948B (zh) 一种基于神经网络的图片拼接检测方法和系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14780428; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
REEP Request for entry into the european phase (Ref document number: 2014780428; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 2014780428; Country of ref document: EP)