WO2023148060A1 - Method for reading a tag based on retroreflectors, for example cholesteric spherical reflectors - Google Patents

Method for reading a tag based on retroreflectors, for example cholesteric spherical reflectors.

Info

Publication number
WO2023148060A1
WO2023148060A1 (PCT/EP2023/051776)
Authority
WO
WIPO (PCT)
Prior art keywords
pictures
tag
picture
flash
video
Prior art date
Application number
PCT/EP2023/051776
Other languages
English (en)
Inventor
Jan LAGERWALL
Hakam AGHA
Original Assignee
Universite Du Luxembourg
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Universite Du Luxembourg filed Critical Universite Du Luxembourg
Publication of WO2023148060A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10712Fixed beam scanning
    • G06K7/10722Photodetector array or CCD scanning
    • G06K7/10732Light sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K7/10712Fixed beam scanning
    • G06K7/10762Relative movement
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1439Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1447Methods for optical code recognition including a method step for retrieval of the optical code extracting optical codes from image or text carrying said optical code
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/146Methods for optical code recognition the method including quality enhancement steps
    • G06K7/1465Methods for optical code recognition the method including quality enhancement steps using several successive scans of the optical code
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/80Recognising image objects characterised by unique random patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/95Pattern authentication; Markers therefor; Forgery detection

Definitions

  • the invention generally relates to the detection and reading of machine-readable tags comprising retroreflectors, e.g., cholesteric spherical reflectors (CSRs).
  • CSRs: cholesteric spherical reflectors.
  • Illuminated with white light, CSRs reflect only a narrow wavelength band and only with one circular polarisation, which can be either right- or left-handed. Because of their spherical shape, they function as retroreflectors, i.e., they reflect light back to any light source that illuminates them, with the selectivity in wavelength and polarisation given by the cholesteric liquid crystal structure. Their PUF characteristics arise because their optical appearance changes dynamically with illumination and viewing conditions, and they depend sensitively on the exact arrangement of CSRs and on the particular characteristics of each CSR in a sample.
  • A newer version of CSRs is the “d-CSR”, where the “d” stands for “dye-doped”.
  • D-CSRs may be specifically designed to be used for serialization and authentication of mass-produced products, where the CSR-based tag can be printed with any size on the front of the product packaging, where it can easily be used for logistics and supply chain tracing. This is possible because the dye mixed into the CSRs gives them the function of a pigment, as in standard printed matter, and they can thus be printed (or deposited using other technologies) into any desired pattern on top of the packaging (without interfering with the design apparent to the human eye).
  • By combining CSRs with dyes giving them the appearance of (primary) colours of a colour model (e.g., red (R), green (G), blue (B) and black (K)), effectively any colour can be produced.
  • An additive colour generation principle as in display technology and/or a subtractive colour model could be used.
  • a serialization code such as a QR-code or other machine-readable code can be hidden from plain sight, allowing it to be printed across the entire front of the packaging, if desired, without impairing (significantly) the design.
  • While the human eye should ideally not detect the CSRs, a machine that carries out the authentication should.
  • the special optical properties of CSRs allow them to be distinguished no matter what the background is, and the equipment required is low-cost.
  • however, such detection equipment is not part of standard smartphone technology, causing a significant acceptance threshold for CSR-based tagging. To reduce the acceptance threshold, it is highly desirable to find a way to detect a pattern generated by CSRs that does not require any additional hardware beyond what is in a standard smartphone.
  • aspects of the present invention are drawn to addressing the problem of making it easier to read retroreflector-based, e.g., CSR-based, patterns, in particular retroreflector-based (e.g., CSR-based) machine-readable tags.
  • the present invention proposes a method for reading a retroreflector-based (e.g., CSR-based) machine-readable tag using a smart mobile device equipped with a camera and a flash.
  • the smart mobile device is used to take a first picture of the tag with the flash turned off and a second picture of the tag with the flash turned on.
  • the first and second pictures are then processed to bring out retroreflections of the retroreflectors (e.g., the CSRs or other wavelength-selective retroreflectors).
  • the processing may include differential image processing of the first and second pictures.
  • the processing may include highlighting and/or identifying the positions of the retroreflections in the differential image.
  • the processing may include application of a colour (or wavelength) filter on at least one of the first picture, the second picture and the differential image, the colour (or wavelength) filter being tuned to a colour of the retroreflections of the retroreflectors (in particular, the CSRs).
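  • By way of illustration only, and not as the patent's reference implementation, the differential processing and colour filtering just described could be sketched as follows in Python with OpenCV; the file names, the green HSV range and the use of OpenCV are assumptions made purely for the example:

```python
# Minimal sketch (assumed): subtract the flash-off picture from the flash-on
# picture, then apply a colour filter tuned to an assumed green retroreflection.
import cv2
import numpy as np

def bring_out_retroreflections(no_flash_path, flash_path,
                               hsv_lo=(35, 40, 40), hsv_hi=(85, 255, 255)):
    no_flash = cv2.imread(no_flash_path)   # first picture, flash turned off
    flash = cv2.imread(flash_path)         # second picture, flash turned on
    diff = cv2.subtract(flash, no_flash)   # differential image: flash-only light
    hsv = cv2.cvtColor(diff, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))  # green filter
    return diff, mask
```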
  • the expression “smart mobile device” designates an electronic device capable of connecting to other devices or networks (in particular the Internet) via different wireless protocols such as Bluetooth, Zigbee, NFC, Wi-Fi, LiFi, 5G, etc., and small enough to hold and operate in the hand.
  • Examples of most preferred smart mobile devices include smartphones and tablets (tablet computers) with a touchscreen.
  • the proposed method for reading CSR-tags makes use of the retroreflection of CSRs rather than of the circular polarisation of the reflection.
  • the method may comprise placing the smart mobile device in an off-normal position with respect to the tag to take the first and second pictures.
  • the off-normal position is preferably at an angle from 30° to 60° from the surface normal across the tag.
  • Such off-normal perspective is preferred because the specular reflection of light flashes emitted by the smart mobile device on the surface on which the tag is applied will in this case be directed away from the mobile device, whereas the retroreflections from the retroreflectors (e.g., the CSRs) will be directed back to the smart mobile device.
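  • The geometry behind this preference can be illustrated with a small, purely illustrative calculation (the vectors and the 45° angle are example values, not taken from the patent): the specular reflection of the flash leaves on the far side of the surface normal, whereas a retroreflector returns the light along the incident direction, straight back to the phone.

```python
# Illustrative sketch: specular reflection vs. retroreflection for a phone
# held 45 degrees off the surface normal. All values are assumed examples.
import numpy as np

def specular_direction(incident, normal):
    """Reflect a unit incident ray about the unit surface normal."""
    n = normal / np.linalg.norm(normal)
    d = incident / np.linalg.norm(incident)
    return d - 2.0 * np.dot(d, n) * n

normal = np.array([0.0, 0.0, 1.0])                          # surface normal
theta = np.radians(45.0)                                     # off-normal angle
incident = np.array([np.sin(theta), 0.0, -np.cos(theta)])    # flash towards surface

print(specular_direction(incident, normal))  # mirror direction, away from the phone
print(-incident)                             # retroreflection: back towards the phone
```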
  • the processing of the first and second pictures may comprise subtracting the first picture from the second picture.
  • the processing of the first and second pictures includes image registration of the first and second pictures.
  • Image registration (also called image alignment) involves spatially transforming one of the first and second pictures to align with the other one of the first and second pictures.
  • Feature-based image registration identifies correspondences between features such as points, lines, and contours. It will be appreciated that image registration may not be necessary if the smart mobile device is not moved (or hardly moved) with respect to the retroreflector-based (e.g., CSR-based) tag between the taking of the first and second pictures.
  • the method may include attempting to extract the machine-readable tag without carrying out image registration and making a further attempt, with image registration, only if the first attempt fails.
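  • One common way to implement such feature-based registration is with ORB keypoints and a RANSAC homography, as sketched below; this is an assumed example, not the registration method prescribed by the patent, and the feature budget and ratio are illustrative values:

```python
# Hedged sketch of feature-based registration (ORB + RANSAC homography),
# warping the "moving" picture onto the "fixed" one.
import cv2
import numpy as np

def register(moving, fixed, max_features=1000, keep_ratio=0.2):
    orb = cv2.ORB_create(max_features)
    k1, d1 = orb.detectAndCompute(cv2.cvtColor(moving, cv2.COLOR_BGR2GRAY), None)
    k2, d2 = orb.detectAndCompute(cv2.cvtColor(fixed, cv2.COLOR_BGR2GRAY), None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)
    matches = matches[:max(4, int(len(matches) * keep_ratio))]  # best matches only
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = fixed.shape[:2]
    return cv2.warpPerspective(moving, H, (w, h))
```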
  • the first and second pictures may be taken by recording a video and using a first and a second frame of the video as the first and the second picture, respectively.
  • the video may include a slow-motion video, i.e., a video recorded at a higher frame rate than the playback frame rate.
  • the flash of the smart mobile device is activated at least once to generate the second picture(s).
  • the retroreflector-based (e.g., CSR-based) machine-readable tag could comprise a one-dimensional barcode or a two-dimensional barcode, e.g., a QR code, an Aztec Code, a Data Matrix code, a PDF417 code, a DotCode, an EZcode, a ShotCode, a MaxiCode, etc.
  • the method may comprise extracting the machine-readable tag from the retroreflections of the retroreflectors (e.g., the CSRs).
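  • Assuming the tag is a QR code, the extraction step could, for example, rely on an off-the-shelf decoder applied to the processed image; the use of OpenCV's QRCodeDetector below is an assumption for illustration:

```python
# Illustrative only: decode the tag from the binary retroreflection mask.
import cv2

def decode_tag(mask):
    detector = cv2.QRCodeDetector()
    text, points, _ = detector.detectAndDecode(mask)
    return text or None  # None if no QR code could be recovered
```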
  • a further aspect of the invention relates to a computer-implemented method for reading a retroreflector-based (e.g., CSR-based) machine-readable tag with a smart mobile device including a camera and a flash.
  • the method comprises controlling the camera and the flash to take a first picture of the tag with the flash turned off and a second picture of the tag with the flash turned on, and processing the first and second pictures to bring out retroreflections of the retroreflectors (e.g., the CSRs).
  • the computer-implemented method may include providing visual and/or audio guidance to a user to hold the smart mobile device in an off-normal position with respect to the tag. Additionally, tactile feedback could be given to the user.
  • the method may include verifying whether the smart mobile device has reached an off-normal position and continuing to provide the guidance until an off-normal position has been reached.
  • the guidance given to the user may include taking into account further parameters such as the distance between the smart mobile device and the tag, the ambient illumination, the size of the tag relative to the size of the first and second pictures, etc.
  • the guidance given to the user may be based on readings of further sensors available on the smart mobile device, e.g., an accelerometer, a compass, an inclination sensor, etc.
  • the computer-implemented method may further include controlling camera settings, such as aperture, focal length, exposure time, etc. If the smart mobile device includes more than one camera (each comprising an optical system, an image sensor and dedicated software to control both), the computer-implemented method may control one or more of the cameras individually or jointly, depending on the possibilities offered by the API(s) (application programming interface(s)) of the camera(s).
  • the off-normal position may be from 30° to 60° from the normal. 45° may be preferred.
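  • As an assumed illustration of how such guidance could be computed on the device, the sketch below estimates the off-normal angle from the accelerometer's gravity reading, assuming the tagged surface is horizontal and the camera looks along the device z-axis; the actual sensor API is platform-specific and not shown:

```python
# Sketch (assumptions: horizontal tagged surface, camera along device z-axis).
import math

def off_normal_angle(ax, ay, az):
    """Angle between the camera axis and the surface normal, in degrees."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(min(1.0, abs(az) / g)))

def guidance(ax, ay, az, lo=30.0, hi=60.0):
    angle = off_normal_angle(ax, ay, az)
    if angle < lo:
        return "tilt the phone further away from the surface normal"
    if angle > hi:
        return "tilt the phone back towards the surface"
    return "hold still"
```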
  • the processing of the first and second pictures may be carried out entirely onboard the smart mobile device.
  • the processing of the first and second pictures could be outsourced, at least in part, to a remote location (e.g., a data centre).
  • the processing of the first and second pictures could comprise subtracting the first picture from the second picture.
  • processing of the first and second pictures may include image registration of the first and second pictures.
  • the method may comprise controlling the camera and the flash to record a video during which the first picture is taken as a first frame of the video while the flash is turned off and the second picture is taken as a second frame of the video while the flash is turned on.
  • the flash may be controlled to emit a series of light flashes, whereby a series of second pictures is taken when the light flashes are emitted and a series of first pictures is taken between the light flashes.
  • processing the first and second pictures to bring out retroreflections of the retroreflectors may include processing the series of first pictures and the series of second pictures.
  • the video could include a slow-motion video.
  • the camera could, for instance, be controlled to record the video at the maximum frame rate available on the camera (i.e., made available via the camera API).
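  • A rough, illustrative way to process such a recording is sketched below: flash frames are identified by a simple brightness heuristic (an assumption, not part of the patent text), each flash frame is subtracted from its preceding no-flash neighbour, and the differences are averaged to suppress the background:

```python
# Sketch (assumed heuristics): accumulate retroreflections from a short
# high-frame-rate video in which the flash was pulsed.
import cv2
import numpy as np

def accumulate_retroreflections(video_path):
    cap = cv2.VideoCapture(video_path)
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
    cap.release()
    bright = np.array([f.mean() for f in frames])
    flash_on = bright > bright.mean()              # crude flash-frame detection
    diffs = [cv2.subtract(frames[i], frames[i - 1])
             for i in range(1, len(frames))
             if flash_on[i] and not flash_on[i - 1]]
    return np.mean(diffs, axis=0).astype(np.uint8) if diffs else None
```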
  • the retroreflector-based (e.g., CSR-based) machine-readable tag may comprise a two-dimensional barcode.
  • the processing of the first and second pictures may comprise extracting the machine-readable tag from the retroreflections of the retroreflectors (e.g., the CSRs).
  • an app comprising instructions which, when executed on a smart mobile device including a camera and a flash, cause the smart mobile device to carry out the computer-implemented method as set out above.
  • Fig. 1 is a schematic illustration of a substrate carrying a CSR-based machine-readable tag under ambient illumination;
  • Fig. 2 is a schematic illustration of the substrate carrying a CSR-based machine-readable tag of Fig. 1 when illuminated by the flash of a mobile device;
  • Fig. 3 is a schematic illustration of the substrate carrying a CSR-based machine-readable tag of Fig. 1 when illuminated by the flash of a mobile device and under ambient illumination;
  • Fig. 4 shows (a.) a picture of a substrate with randomly deposited d-CSRs taken without flash, (b.) a picture of the same substrate taken with flash, (c.) the picture obtained by subtraction of the picture of Fig. 4a from the picture of Fig. 4b, and (d.) the picture obtained by conversion of the picture of Fig. 4c into a monochrome image.
  • the proposed solution for detecting and reading CSR-tags makes use of the retroreflection of CSRs rather than of the circular polarisation of the reflection.
  • the tag may be a CSR-based machine-readable tag such as, e.g., a QR-code (or a similar code).
  • the solution is based on distinguishing the retroreflection, which always travels along the direction of illumination, from the specular reflection of ordinary surfaces, which travels away from the light source.
  • a user films a substrate with a machine-readable code made using CSRs, illuminated by ambient light, with a standard smartphone (or other smart mobile device) running a dedicated software (app).
  • the user holds the phone as still as possible, such that the camera images the entire area containing the code, and the app guides the user to orient the camera viewing direction at an angle of about 30-60° (e.g. 45°) to the surface normal.
  • the app then starts a brief maximum-speed video recording, during which the app ignites the LED torch (the flash) so as to generate a few sequential light flashes.
  • the light from the flashes will be reflected forward by the background surface (specular reflection) and not back to the camera, due to the inclined imaging angle.
  • the light reflected by CSRs is, by virtue of them being retroreflectors, reflected back to the camera. Although scattering from the substrate means that some LED light also reaches the camera, the CSR reflections will be more intense. Moreover, in the case of d-CSRs, they also reflect with a red-shifted colour compared to their appearance under ambient light, since retroreflection appears at a longer wavelength than the mixed-colour reflection under ambient illumination.
  • the app then subtracts each frame with flash from the frame immediately before and/or after it in the video. With a high enough frame rate, the displacement between the frames can be sufficiently small that the background appearance is much reduced, because the background looks similar with and without flash, the ambient illumination being present in both frames.
  • the CSRs appear with increased contrast, since strong retroreflection is visible only in a frame with flash. By processing the sequence of subtractions, a final image is produced in which the machine-readable code pattern is clearly detectable. This image is then used for the extraction of the message encoded in the machine-readable tag.
  • the image processing may further include any processing required for identifying the unique PUF character and consequent authentication.
  • Fig. 1 shows how the diffuse ambient light (rays illustrated by dotted lines) makes the entire substrate 18, with and without CSRs 16, equally visible to the camera 12 of a mobile phone 10 filming at an angle away from the substrate normal.
  • Fig. 2 shows the response when no ambient light is present, and the only illumination is the light from the mobile phone torch 14 (rays illustrated by dashed lines with large gaps).
  • the specular reflection from the background surface reflects the rays (dotted lines) away from the mobile phone 10, in contrast to the CSRs 16, which retroreflect (rays illustrated by dashed lines with small gaps) the rays back to the phone 10 and its camera 12.
  • in Fig. 3, both ambient and torch illumination are present.
  • when the picture taken without flash is subtracted from the picture taken with flash, the background specular reflection is removed and only the retroreflection is retained. Since only the CSRs 16 provide retroreflection, this reveals the locations of CSRs 16 and thus produces an image corresponding to the encoded machine-readable code, e.g., a QR-code, even if this is undetectable to the human eye.
  • Figure 4 illustrates the principle on the basis of a concrete example.
  • D-CSRs were randomly deposited on a surface.
  • the CSRs were doped with (orange) dye that was not tuned to the CSR reflection colour (green).
  • a standard smartphone (an iPhone XS) was used to take the pictures.
  • Figure 4a shows a picture taken in the conditions of Fig. 1, with only ambient light.
  • the background has the same orange colour as the d-CSRs and therefore one cannot distinguish where the d-CSRs are.
  • Figure 4b shows a picture taken in the conditions of Fig. 3, and one may note that parts of the image appear brighter, with a different apparent colour than the background orange.
  • This is where the d-CSRs are, the different appearance being due to the green retroreflection of the torch light from the mobile phone.
  • the picture of Fig. 4a is subtracted from the picture of Fig. 4b using standard graphics software on a computer, yielding the image in Fig. 4c.
  • the d-CSRs here appear with their green retroreflection colour, but they still appear dark on the black background.
  • the image is converted to monochrome (Fig. 4d), where any colour present above a threshold strength is pictured in white, and now the randomly distributed d-CSRs are easy to detect.
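  • The monochrome conversion of Fig. 4d can be approximated, for illustration, by a simple grayscale threshold on the difference image; the threshold value below is an assumption:

```python
# Sketch of the last step of the Fig. 4 workflow: colour above an assumed
# threshold strength becomes white, everything else black.
import cv2

def to_monochrome(diff, threshold=40):
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    return binary
```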
  • the above-discussed example demonstrates the proof of the principle underlying the invention, which concerns the ability to detect CSRs and the pattern they produce using a regular smartphone without any additional hardware.
  • the d-CSRs will be laid out in the pattern of a machine-readable tag (e.g., a QR-code or the like), allowing the smartphone to decode the information encoded onto the surface, invisible to the naked human eye.
  • a key advantage of the invention is that it allows a smart mobile device, such as a standard smartphone, to distinguish CSRs hidden in a surface, such that they are undetectable to the human eye, and remove the background, such that any pattern produced by the CSRs becomes apparent to the mobile device software, which can then process it to decode the information contained in the pattern, e.g., a QR-code.
  • This allows CSRs to generate serialisation codes that can be made very large and cover the most visible face of product packaging, since they are not visible to the human eye.
  • a smartphone can read the code without necessitating any additional hardware, in order to perform authentication and supply chain tracing functions, of great value to confirm the authenticity of a product and/or to trace an item through a supply chain.
  • the details of the acquired image may not be as rich and sharp as when a dedicated device with opposite circular polarisers is used to detect the CSRs. While only the simplest of the PUF characteristics of CSR tags, primarily related to the physical location of each type of CSR and its reflection colour, can be tested with the presented method, a more complete PUF analysis may be carried out as a supplement. To carry out such supplementary analysis, dedicated equipment that uses circular polarisers and that can illuminate the sample from different directions may be useful.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Toxicology (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Image Input (AREA)

Abstract

According to a first aspect, the invention proposes a method for reading a machine-readable tag based on retroreflectors, for example cholesteric spherical reflectors (CSRs), using a smart mobile device (10) equipped with a camera (12) and a flash (14). The smart mobile device is used to take a first picture of the tag with the flash turned off and a second picture of the tag with the flash turned on. The first and second pictures are then processed to bring out retroreflections of the retroreflectors (16). Other aspects of the invention relate to a computer-implemented method for reading a retroreflector-based (e.g., CSR-based) machine-readable tag and to a corresponding application program (app).
PCT/EP2023/051776 2022-02-01 2023-01-25 Method for reading a tag based on retroreflectors, for example cholesteric spherical reflectors WO2023148060A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
LU501374A LU501374B1 (en) 2022-02-01 2022-02-01 Method for reading a tag based on cholesteric spherical reflectors
LULU501374 2022-02-01

Publications (1)

Publication Number Publication Date
WO2023148060A1 (fr) 2023-08-10

Family

ID=80461854

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/051776 WO2023148060A1 (fr) 2022-02-01 2023-01-25 Method for reading a tag based on retroreflectors, for example cholesteric spherical reflectors

Country Status (2)

Country Link
LU (1) LU501374B1 (fr)
WO (1) WO2023148060A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100200649A1 (en) * 2007-04-24 2010-08-12 Andrea Callegari Method of marking a document or item; method and device for identifying the marked document or item; use of circular polarizing particles
EP3504665A1 (fr) * 2016-08-23 2019-07-03 V. L. Engineering, Inc. Lecture de codes-barres invisibles et d'autres insignes invisibles à l'aide d'un téléphone intelligent non modifié physiquement
US20200311365A1 (en) * 2019-03-29 2020-10-01 At&T Intellectual Property I, L.P. Apparatus and method for identifying and authenticating an object

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
ARENAS, MONICA ET AL: "Cholesteric Spherical Reflectors as Physical Unclonable Identifiers in Anti-counterfeiting", CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, ACMPUB27, NEW YORK, NY, USA, 17 August 2021 (2021-08-17), pages 1 - 11, XP058807583, ISBN: 978-1-4503-9681-3, DOI: 10.1145/3465481.3465766 *
ARPPE, R.; SORENSEN, T. J.: "Physical unclonable functions generated through chemical methods for anti-counterfeiting", NATURE REVIEWS CHEMISTRY, vol. 1, 2017
GENG, Y. ET AL.: "High-fidelity spherical cholesteric liquid crystal Bragg reflectors generating unclonable patterns for secure authentication", SCI. REP., vol. 6, 2016, pages 26840
GENG, Y.; KIZHAKIDATHAZHATH, R.; LAGERWALL, J. P. F.: "Encoding Hidden Information onto Surfaces Using Polymerized Cholesteric Spherical Reflectors", ADV. FUNCT. MATER., vol. 31, 2021, pages 2100399, XP055961300, DOI: 10.1002/adfm.202100399
SCHWARTZ, M. ET AL.: "Cholesteric Liquid Crystal Shells as Enabling Material for Information-Rich Design and Architecture", ADV. MATER., vol. 30, 2018, pages 1707382, XP055946908, DOI: 10.1002/adma.201707382
SCHWARTZ, M. ET AL.: "Linking Physical Objects to Their Digital Twins via Fiducial Markers Designed for Invisibility to Humans", MULTIFUNCTIONAL MATERIALS, vol. 4, 2021, pages 022002

Also Published As

Publication number Publication date
LU501374B1 (en) 2023-08-02

Similar Documents

Publication Publication Date Title
US20210248338A1 (en) Systems, methods and apparatuses of a security device
US9483677B2 (en) Machine-readable symbols
US9195870B2 (en) Copy-resistant symbol having a substrate and a machine-readable symbol instantiated on the substrate
US9010638B2 (en) Integrated unit for reading identification information based on inherent disorder
JP6470646B2 (ja) Method for generating a secure barcode, barcode generation system, and two-dimensional barcode
CN205068462U (zh) Card with embedded machine-readable image, product with attached card, and product with connected card
US9094595B2 (en) System for authenticating an object
CA3092189A1 (fr) Method and system for optical authentication of products
US9691208B2 (en) Mechanisms for authenticating the validity of an item
US9123190B2 (en) Method for authenticating an object
US10282648B2 (en) Machine readable visual codes encoding multiple messages
LU501374B1 (en) Method for reading a tag based on cholesteric spherical reflectors
JP5784813B1 (ja) Barcode display device, method of operating a barcode display device, and program
US9983410B2 (en) Device and method for authentication of a document
CN111316305A (zh) System and method for authenticating consumer products
KR20200060858A (ko) Anti-counterfeiting electronic tag comprising a photonic crystal material and method of using the same
US11995510B2 (en) Optical authentication structure with augmented reality feature
US20230394856A1 (en) Method for determining a manipulation or forgery of an object and system therefor
JP6493974B2 (ja) Barcode display device, barcode server device, barcode reading device, methods of operating them, and program
CN117935669A (zh) Anti-counterfeiting label and methods and devices for producing and verifying it
KR101688207B1 (ko) Information provision method
KR20220089488A (ko) Method, system and computer-readable recording medium for processing fine data codes based on image information
RU2205451C1 (ru) Method for identifying objects of religious purpose
KR20200051537A (ko) Anti-counterfeiting QR code comprising a photonic crystal material and method of using the same
CN116911872A (zh) Article verification method, detection method for through-screen capture, and label printed with a graphic code

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23701744

Country of ref document: EP

Kind code of ref document: A1