WO2011044704A1 - Manufacturing security documents using 3d surface parameterization and halftone dithering - Google Patents

Manufacturing security documents using 3d surface parameterization and halftone dithering Download PDF

Info

Publication number
WO2011044704A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
printing
image
brightness
texture
Prior art date
Application number
PCT/CH2009/000330
Other languages
French (fr)
Inventor
Martin Eichenberger
Armin Waldhauser
Original Assignee
Orell Füssli Sicherheitsdruck Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Orell Füssli Sicherheitsdruck Ag filed Critical Orell Füssli Sicherheitsdruck Ag
Priority to PCT/CH2009/000330 priority Critical patent/WO2011044704A1/en
Priority to EP09744312A priority patent/EP2488371A1/en
Publication of WO2011044704A1 publication Critical patent/WO2011044704A1/en

Links

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B42 BOOKBINDING; ALBUMS; FILES; SPECIAL PRINTED MATTER
    • B42D BOOKS; BOOK COVERS; LOOSE LEAVES; PRINTED MATTER CHARACTERISED BY IDENTIFICATION OR SECURITY FEATURES; PRINTED MATTER OF SPECIAL FORMAT OR STYLE NOT OTHERWISE PROVIDED FOR; DEVICES FOR USE THEREWITH AND NOT OTHERWISE PROVIDED FOR; MOVABLE-STRIP WRITING OR READING APPARATUS
    • B42D25/00 Information-bearing cards or sheet-like structures characterised by identification or security features; Manufacture thereof
    • B42D25/40 Manufacture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B42 BOOKBINDING; ALBUMS; FILES; SPECIAL PRINTED MATTER
    • B42D BOOKS; BOOK COVERS; LOOSE LEAVES; PRINTED MATTER CHARACTERISED BY IDENTIFICATION OR SECURITY FEATURES; PRINTED MATTER OF SPECIAL FORMAT OR STYLE NOT OTHERWISE PROVIDED FOR; DEVICES FOR USE THEREWITH AND NOT OTHERWISE PROVIDED FOR; MOVABLE-STRIP WRITING OR READING APPARATUS
    • B42D25/00 Information-bearing cards or sheet-like structures characterised by identification or security features; Manufacture thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/40 Picture signal circuits
    • H04N1/405 Halftoning, i.e. converting the picture signal of a continuous-tone original into a corresponding signal showing only two levels
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/40 Picture signal circuits
    • H04N1/405 Halftoning, i.e. converting the picture signal of a continuous-tone original into a corresponding signal showing only two levels
    • H04N1/4055 Halftoning, i.e. converting the picture signal of a continuous-tone original into a corresponding signal showing only two levels producing a clustered dots or a size modulated halftone pattern
    • B42D2035/06
    • B42D2035/26


Abstract

In a method for manufacturing a security document, a three-dimensional surface model is mapped into a two-dimensional parameter space, thereby generating a color or brightness image in this parameter space. The image is rendered using halftone techniques and projected back, as a texture, onto the model, whereupon the model is applied to the document using three-dimensional projection. This technique makes it possible to create a two-dimensional rendering of a three-dimensional textured object where the texture not only serves to emphasize the curvature and orientation of the object's surface, but also serves to encode the color or brightness of the same.

Description

Manufacturing security documents using 3D surface parameterization and halftone dithering
Technical Field
The invention relates to a method for manufacturing a security document by applying an object thereon. In particular, it relates to a technique where a three-dimensional object is applied onto the security document using a three-dimensional projection. The invention also relates to a security document manufactured by using this technique.
Background Art
There exists a large number of techniques intended to make counterfeiting of security documents more difficult. Some of these are e.g. based on using specialized inks, etc., or on the application of safety devices such as holograms or other optically variable devices.
Another class of techniques includes the application of special rendering techniques, such as special halftone dithering mechanisms. An example of such a technique is described in US 6 198 545.
Disclosure of the Invention
The problem to be solved by the present invention is to provide another rendering technique for manufacturing security documents.
This problem is solved by the method of claim 1. Accordingly, the method comprises the following steps:
a) Providing a surface model of a two-dimensional surface in three-dimensional space: Such a surface model can e.g. represent a real object, such as a building or a person, or it may represent a virtual object, such as a torus. Typically, the surface model is represented by a mesh of nodes in three-dimensional space.
b) Attributing, to each point on said surface model, a brightness or color: The color or brightness of the surface model can e.g. correspond to the coloring of the object it represents, and/or it may be derived from a shading of the object, i.e. the play of shadow and light on the object. The brightness or color is typically non-uniform over the object.
c) Using a bijective mapping function for mapping said surface model into a two-dimensional parameter space, thereby defining a brightness or color image in said parameter space, wherein the color or brightness of each point in said image is given by the brightness or color of the point on the surface model corresponding to the point of the image: The bijective mapping function maps each point of the surface model into the two-dimensional parameter space, thereby defining the brightness or color image therein.
d) Rendering, in said parameter space, said image as a halftone screen pattern, which halftone screen pattern comprises an array of texture pixels, with each texture pixel being attributed to a point in said parameter space: In other words, the image created in step c) is "drawn" into an array of pixels, the so-called texture pixels, using a halftone screen pattern.
e) Using an inverse of the mapping function for mapping the texture pixels onto the surface model: The texture pixels obtained in step d) are now mapped back onto the original surface model. Thus, each point on the surface model now has a color or brightness defined by its corresponding texture pixel. In other words, the texture pixels are applied as a texture to the surface model.
f) Projecting said surface model into a two-dimensional printing plane using a three-dimensional projection, thereby projecting said texture pixels into said printing plane: The surface model is projected into two-dimensional space (i.e. onto the paper of the document). Hence, each texture pixel is projected into the two-dimensional printing plane.
g) Rendering the texture pixels in said printing plane into a two-dimensional array of printing pixels: The texture pixels as obtained in step f) are now rendered into the printing pixels (i.e. the pixels that will finally be printed onto the document).
h) Applying said printing pixels to said security document: The printing pixels of step g) are finally applied by suitable techniques to the document.
This procedure will create a unique textured appearance of the model in the applied image, with the texture being such that it depends on and represents not only the shape, but also the color or brightness of the original model.
The invention also relates to a security document manufactured by this method. Such a security document may e.g. be a banknote, check or identification document. The document need not necessarily be of paper, but may e.g. also be of plastics. For example, the document may be a credit card or identification card.
Brief Description of the Drawings
The invention will be better understood and objects other than those set forth above will become apparent when consideration is given to the following detailed description thereof. Such description makes reference to the annexed drawings, wherein:
Fig. 1 shows a banknote,
Fig. 2 shows an example of a surface model with shading,
Fig. 3 shows the surface model mapped into the parameter space,
Fig. 4 shows the halftone screen pattern in parameter space,
Fig. 5 shows an enlarged section of Fig. 4,
Fig. 6 shows the final print result, and
Fig. 7 shows an enlarged section of Fig. 6.
Modes for Carrying Out the Invention
Fig. 1 shows a banknote having a sheet-like carrier 1 of plastic and/or paper material and carrying various features 2 - 5 thereon, such as
- conventional security features 2, such as special prints or inks or devices applied thereto,
- other types of artwork 3,
- a denomination 4, and
- a rendering 5 of a three-dimensional object manufactured with the steps according to the present invention.
Even though the manufacturing of the rendering 5 is shown on a banknote in Fig. 1, it must be clearly understood that it can also be applied to other types of security documents as mentioned above.
In the following, the steps for manufacturing the rendering 5 will be explained in more detail by reference to Figs. 2ff.
Step a, providing a surface model:
In a first step, a model of a non-flat two-dimensional surface in three-dimensional space is provided. This surface model typically (but not necessarily) represents a real-world object. In the example of Fig. 2, it represents a cup, but it may e.g. also represent other types of objects, such as buildings, coins, animals or people. If the security document is a document of identification, such as a passport or identity card, it may also represent the face of the bearer as scanned by suitable machinery.
The model may be the same on each document of a series of identical documents, as in the example of the banknote shown in Fig. 1. Alternatively, it may be individual to each document, such as in the example of the document of identification mentioned above. An individual model for each document has the advantage that a counterfeiter cannot copy the document by scanning without applying the manufacturing steps described here.
The surface model is typically represented by a series of nodes (each of which is characterized by a three-dimensional coordinate) and by a series of faces spanned by the nodes. For example, a mesh of triangles or quadrilaterals can be used, where each corner of each triangle or quadrilateral is given by a node.
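For illustration, such a node/face mesh can be held in two arrays, one of 3D coordinates and one of index triples. A minimal sketch in Python/numpy follows; the representation and names are assumptions chosen for illustration, not prescribed by the patent:

```python
import numpy as np

# A tetrahedron as a minimal triangle mesh: each node is a 3D
# coordinate, each face an index triple into the node array.
nodes = np.array([
    [0.0, 0.0, 0.0],  # node 0
    [1.0, 0.0, 0.0],  # node 1
    [0.0, 1.0, 0.0],  # node 2
    [0.0, 0.0, 1.0],  # node 3
])
faces = np.array([
    [0, 1, 2],  # triangle spanned by nodes 0, 1 and 2
    [0, 3, 1],
    [0, 2, 3],
    [1, 3, 2],
])
```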
Step b, attributing color or brightness:
Each (visible) point of the surface should have a color or at least a brightness value attributed to it. A color may e.g. be expressed as a tuple of coordinates in a color space (such as RGB or CMYK), while a single coordinate (such as gray level) is typically sufficient for brightness.
The brightness or color may e.g. correspond to the brightness or color in real life. It can be a calculated value, such as in the example of Fig. 2, where the gray level for each (visible) point on the surface was calculated using conventional shading techniques, or it may have been recorded while scanning a real-world object.
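As an illustration of such a calculated brightness, the following sketch attributes a gray level to each mesh face by simple Lambertian shading. This is one conventional shading technique among many; the patent does not prescribe a specific one, and the function name and light model are assumptions:

```python
import numpy as np

def shade_faces(nodes, faces, light_dir):
    """Gray level in [0, 1] per face from diffuse (Lambertian) shading."""
    light = light_dir / np.linalg.norm(light_dir)
    grays = []
    for i, j, k in faces:
        # Face normal from the cross product of two edge vectors.
        n = np.cross(nodes[j] - nodes[i], nodes[k] - nodes[i])
        n = n / np.linalg.norm(n)
        # Diffuse term, clamped: 1 = facing the light, 0 = turned away.
        grays.append(max(0.0, float(np.dot(n, light))))
    return np.array(grays)

# Example: light shining from above and slightly to the side.
# gray = shade_faces(nodes, faces, np.array([0.3, 0.3, 1.0]))
```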
Step c, mapping to 2D parameter space:
In a next step, the model, including its color and brightness values, is to be mapped to a two- dimensional parameter space, i.e. a space having two coordinates, e.g. called u, v.
This process is typically called "parameterization" and is e.g. used for applying a texture to a three-dimensional object.
A mapping function must be provided to map each point of the surface model into parameter space. The mapping function must be bijective (for points that are visible on the surface, at least), so that its inverse can be used in step e below.
The mapping function should advantageously minimize distortions, although some distortions cannot be avoided for non-trivial surface models.
An overview of parameterization techniques can e.g. be found in K. Hormann, B. Levy, A. Sheffer, "Mesh Parameterization: Theory and Practice", ACM SIGGRAPH Course Notes 2007.
Particularly useful mappings in this context are conformal mappings, such as "least-squares conformal" mappings, or isometric mappings (length-preserving mappings), or equiareal mappings (mappings that leave areas invariant).
Examples of suitable mapping functions are known to the skilled person and e.g. described by:
- Least-squares conformal maps, see e.g. Levy, B., Petitjean, S., Ray, N., and Maillot, J., "Least squares conformal maps for automatic texture atlas generation", ACM Trans. Graph. 21, 3 (Jul. 2002), 362-371. DOI: http://doi.acm.org/10.1145/566654.566590
- Angle-based flattening, see e.g. Sheffer, A., Levy, B., Mogilnitsky, M., and Bogomyakov, A., "ABF++: fast and robust angle based flattening", ACM Trans. Graph. 24, 2 (Apr. 2005), 311-330. DOI: http://doi.acm.org/10.1145/1061347.1061354
- Most isometric parameterization, see e.g. U. Labsik, K. Hormann, G. Greiner, "Using Most Isometric Parametrizations for Remeshing Polygonal Surfaces", Proceedings of Geometric Modeling and Processing 2000, edited by R. Martin and W. Wang, IEEE Computer Society Press.
- Signal-specialized parameterization, see e.g. Pedro V. Sander, Steven J. Gortler, John Snyder and Hugues Hoppe, "Signal-Specialized Parameterization", Thirteenth Eurographics Workshop on Rendering (2002), edited by P. Debevec and S. Gibson, The Eurographics Association.
Other types of mapping functions are known to the skilled person.
Mapping into parameter space can be a fully automated or semi-automatic process. In particular when mapping complex, varying objects, it may e.g. be advantageous for a human operator to select certain parameters of the mapping function, such as the location of clipping lines, or to identify regions that should be stretched or compressed during mapping, in order to minimize visual distortions or artifacts. These techniques are known to the skilled person.
The mapping of the surface model with its color or brightness into parameter space defines a color or brightness image in parameter space, with the color or brightness of each point in said image given by the color or brightness of the point on the surface model corresponding to the point of the image.
Such a color or brightness image for the model of Fig. 2 is shown in Fig. 3. In the present example, as can be seen, the image comprises three distinct sections 10, 11, 12. A first section 10 corresponds to the inner and outer surfaces, and the upper rim of the cup's body. A second section 11 corresponds to the cup's handle, and a third section 12 to the cup's bottom side.
Even though, in the example of Fig. 3, all parts of the surface model have been mapped to the parameter space, it is actually not necessary to map those parts of the model that will, in the final rendering, be invisible.
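A production-grade parameterization would use one of the cited methods (least-squares conformal maps, ABF++, etc.). Purely for illustration, the sketch below unwraps a roughly cylindrical surface, such as the cup body of Fig. 2, by mapping the angle around the axis and the height to (u, v). The function and normalization are my assumptions, and this simple map is bijective only for such simple shapes:

```python
import numpy as np

def cylindrical_parameterization(nodes):
    """Map nodes (x, y, z) of a roughly cylindrical surface to (u, v)."""
    x, y, z = nodes[:, 0], nodes[:, 1], nodes[:, 2]
    u = (np.arctan2(y, x) + np.pi) / (2.0 * np.pi)  # angle, in [0, 1)
    v = (z - z.min()) / max(np.ptp(z), 1e-12)       # height, in [0, 1]
    return np.stack([u, v], axis=1)
```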
Step d, halftone screening in parameter space:
In a next step, the image obtained in step c is rendered as a halftone screen pattern into an array of texture pixels. If the image is a grayscale (brightness) image, the texture pixels can e.g. be an array of binary values (0 = black, 1 = white), with each value being attributed to a point in the parameter space. If the image is a color image, the texture pixels are e.g. an array of n-tuples of binary values (0 = off, 1 = on), with n corresponding to the number of color inks to be used in the final printing process, if a printing process is being used.
Numerous halftone screening (or dithering) techniques are known from the printing arts. They typically simulate a continuous tone image by means of pixels that can be either on or off.
In an advantageous first embodiment, the image can be divided into a plurality of halftone cells, with each halftone cell spanning a plurality of pixels. The pixels within each cell are switched on or off as a function of the color or brightness of the image in the given cell, in particular as a function of the average color or average brightness of the image in the given cell. For a grayscale image, for example, the average gray scale factor of the original image at a given cell is calculated and then a number of pixels proportional to this factor is switched on in the cell, while the others remain switched off.
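A minimal sketch of this first embodiment, assuming a grayscale image with values in [0, 1] and square cells; the cell size, names and the centre-outward clustering order are illustrative choices, not taken from the patent:

```python
import numpy as np

def halftone_cells(image, cell=8):
    """Cell-based halftoning: per cell, switch on a number of texture
    pixels proportional to the cell's average brightness."""
    h, w = image.shape
    out = np.zeros((h, w), dtype=np.uint8)
    # Fixed switching order shared by all cells: pixels ranked by
    # distance to the cell centre, so the 'on' pixels form one cluster
    # and equally bright cells look identical.
    yy, xx = np.mgrid[0:cell, 0:cell]
    order = np.argsort(((yy - cell / 2.0) ** 2 +
                        (xx - cell / 2.0) ** 2).ravel())
    for y0 in range(0, h - cell + 1, cell):
        for x0 in range(0, w - cell + 1, cell):
            mean = image[y0:y0 + cell, x0:x0 + cell].mean()
            n_on = int(round(mean * cell * cell))
            mask = np.zeros(cell * cell, dtype=np.uint8)
            mask[order[:n_on]] = 1  # 1 = white texture pixel
            out[y0:y0 + cell, x0:x0 + cell] = mask.reshape(cell, cell)
    return out
```

This reproduces the appearance described below for Figs. 4 and 5: a bright cell yields a large cluster of white pixels at the cell centre, while a dark cell leaves most of its pixels dark.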
In another advantageous, second embodiment, the image is divided into a plurality of vector-based symbols, with each symbol having a parameter that varies the coverage of the symbol. For example, the symbols can e.g. be stars drawn as outlines, and the parameter can e.g. be the line thickness of the star's outline. The parameter can e.g. be 0, in which case the star is not drawn at all, or 1, in which case the star is drawn with such a large outline thickness that it basically covers all its assigned area, or the parameter can be any value in between. The parameter is varied according to the average gray scale factor of the original image at the area assigned to each symbol, and then the symbols are rendered into the texture pixels. In the example of stars as mentioned above, the stars at dark regions are rendered with a thick line, while the stars at light regions are rendered with a thin line, thereby blackening more pixels in dark regions of the image.
In another example, the symbol may e.g. be a solid (filled) star whose size varies according to the average gray level of the image.
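A sketch of this symbol-based variant, with a filled disc standing in for the star purely to keep the rasterization short; the shape, cell size and names are my assumptions, not the patent's:

```python
import numpy as np

def symbol_halftone(image, cell=16):
    """Symbol-based halftoning: per cell, draw a filled disc whose
    coverage follows the average darkness of the image in that cell."""
    h, w = image.shape
    out = np.ones((h, w), dtype=np.uint8)             # start all white
    yy, xx = np.mgrid[0:cell, 0:cell] - (cell - 1) / 2.0
    dist = np.sqrt(xx ** 2 + yy ** 2)                 # distance to centre
    for y0 in range(0, h - cell + 1, cell):
        for x0 in range(0, w - cell + 1, cell):
            darkness = 1.0 - image[y0:y0 + cell, x0:x0 + cell].mean()
            radius = darkness * cell / 2.0            # darker = bigger symbol
            out[y0:y0 + cell, x0:x0 + cell] = (dist > radius).astype(np.uint8)
    return out
```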
As in the first embodiment, the second embodiment basically divides the image into a plurality of halftone cells, with each cell corresponding to the area assigned to a symbol. The texture pixels of a cell are again switched on or off as a function of a brightness or color level of said brightness or color image in said cell.
Examples of halftone screening or dithering techniques are e.g. described in US 6 198 545.
Advantageously, the algorithm for switching the pixels of a cell on or off as a function of the color or brightness of the image at said cell is the same for all cells. Hence, all cells representing a given brightness look the same, which makes it possible to generate a texture of high regularity that emphasizes the curvature of the model surface in the final document.
Figs. 4 and 5 show a representation of the texture pixels in parameter space after halftone screening. As can best be seen from the enlarged section in Fig. 5, the present example is based on substantially quadratic pixel cells, with each cell having a roughly star-shaped cluster of white pixels in the center and dark pixels at its edges. The number of dark pixels in a cell is a function of the average gray level of the image in the given cell: the darker the image is in the given cell, the higher the number of dark pixels.
Step e, inverse mapping:
In a next step, the inverse of the mapping function of step c is used for mapping the texture pixels of step d back onto the surface model.
This is illustrated in Figs. 6 and 7, where the texture pixels have been mapped back as a texture onto the model.
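In practice, this inverse mapping amounts to a texture lookup: each surface point carries the (u, v) coordinate assigned to it in step c, and its value is read from the halftoned texture-pixel array. A minimal sketch, with array names assumed:

```python
import numpy as np

def sample_texture(uv, texture):
    """Read, for each (u, v) in [0, 1] x [0, 1], the nearest texture pixel."""
    h, w = texture.shape
    ix = np.clip((uv[:, 0] * (w - 1)).round().astype(int), 0, w - 1)
    iy = np.clip((uv[:, 1] * (h - 1)).round().astype(int), 0, h - 1)
    return texture[iy, ix]
```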
In this process, each texture pixel is attributed to a point on the surface model, such that it can be projected as described in the following step.
Step f, projecting the surface model:
In this step, the surface model with its texture is projected into a two-dimensional printing plane, i.e. a space whose coordinates correspond to locations on the document to be manufactured.
Projection takes place by means of a three-dimensional projection, which is a mapping of the three-dimensional surface model into two-dimensional space by means of orthographic or perspective projection. This type of mapping corresponds to the mapping performed by a camera and generates an object in the plane that, when viewed, is perceived by the viewer as having depth.
Orthographic and perspective projections are well known to the skilled person and need not be explained in detail herein.
After step f, each point in the printing plane is attributed to one of the texture pixels.
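For completeness, a minimal perspective projection along the z axis; an orthographic projection would simply drop the z coordinate. The focal length and camera distance are illustrative values:

```python
import numpy as np

def perspective_project(points, focal=2.0, camera_z=5.0):
    """Project 3D points onto a 2D printing plane, camera on the z axis."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    depth = camera_z - z    # distance from the camera (must be > 0)
    u = focal * x / depth   # perspective divide: farther points
    v = focal * y / depth   # move towards the image centre
    return np.stack([u, v], axis=1)
```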
Step g, rendering in the printing plane:
In a next step, the printing plane obtained in step f is rendered into a two-dimensional array of printing pixels.
When rendering a grayscale image, each printing pixel is e.g. represented by a binary value that indicates if the given pixel is on or off. A suitable algorithm can e.g. calculate the average value (weighted by overlap) of all texture pixels overlapping the given printing pixel, and then switch the printing pixel on if the average value is above a given threshold. When the printing pixel is on, an ink dot is applied to the given location of the document.
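A simplified sketch of this resampling step: rather than computing exact overlap weights, each projected texture pixel votes in the printing pixel containing it, and the per-pixel average is thresholded. The grid size, threshold and names are assumptions:

```python
import numpy as np

def render_printing_pixels(uv, values, grid=(512, 512), threshold=0.5):
    """Average projected texture-pixel values per printing pixel,
    then switch the printing pixel on above the threshold."""
    h, w = grid
    # Normalize projected coordinates to integer grid indices.
    ix = np.clip(((uv[:, 0] - uv[:, 0].min()) / max(np.ptp(uv[:, 0]), 1e-12)
                  * (w - 1)).astype(int), 0, w - 1)
    iy = np.clip(((uv[:, 1] - uv[:, 1].min()) / max(np.ptp(uv[:, 1]), 1e-12)
                  * (h - 1)).astype(int), 0, h - 1)
    acc = np.zeros(grid)
    cnt = np.zeros(grid)
    np.add.at(acc, (iy, ix), values)  # sum of overlapping texture pixels
    np.add.at(cnt, (iy, ix), 1)       # number of overlapping texture pixels
    avg = np.divide(acc, cnt, out=np.zeros(grid), where=cnt > 0)
    return (avg > threshold).astype(np.uint8)
```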
When rendering a color image, each printing pixel is e.g. represented by an n-tuple of binary values indicating which of n differently colored inks is/are to be applied at the given location.
Step h, application to document:
In a final step, the printing pixels of step g are physically applied to the document using suitable techniques, such as:
- printing by means of suitable printer hardware and ink(s), e.g. using offset print;
- exposing a light-sensitive material to illumination patterned in accordance with the printing pixels;
- manufacturing a diffractive structure, such as a hologram, or any other type of structure, such as a structured metal layer, that is structured according to the printing pixels;
- manufacturing a micro-engraving according to the printing pixels;
- manufacturing an embossing according to the printing pixels.
The above techniques can be used to apply the printing pixels "directly" or "indirectly" to the document. In direct application, the printing pixels are applied directly to the document itself, e.g. by means of printing ink on the document. In indirect application, the printing pixels are applied to a carrier other than the document, such as a separate security element, which carrier is then attached to the document by suitable techniques, such as gluing.
The printing pixels can be applied to the document in such a manner that the rendered image can be verified by eye, or, alternatively, in such a manner that further tools are required to view them, such as optical polarizers, microlens-arrays or devices for making special inks visible.
Final result:
The result of this process is shown in Figs. 6 and 7. As can be seen, an apparently three-dimensional object is applied onto the document. The object carries a texture, namely the texture calculated in step d and mapped onto the object in step e. In the shown embodiment, the texture varies in such a manner that its brightness changes. This brightness is, in its turn, chosen such that it imitates light and shadow playing on the surface of the object.
The three-dimensional effect conveyed by the texture is strongest if the structure of the texture remains visible and crisp in the final result. This can be achieved if the resolution of the printing pixels (the printer resolution) is sufficiently fine. More precisely, the resolution of the printing pixels should be sufficiently large such that, for a majority of the halftone cells, each cell extends over a plurality of printing pixels. In particular, in order to resolve the individual cells in the final printed product optimally, for a majority of the halftone cells, each cell should extend over a number of printing pixels being at least equal to (in particular being larger than) the number of texture pixels in the cell.
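As a hypothetical numerical illustration of this condition (the figures are mine, not the patent's): if each halftone cell spans 16 × 16 = 256 texture pixels and the document is printed at 1200 dpi with a screen of about 50 cells per inch, each cell covers roughly 24 × 24 = 576 printing pixels, comfortably more than the 256 texture pixels per cell.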
Notes:
The present method makes it possible to create a two-dimensional rendering of a three-dimensional textured object where the texture not only serves to emphasize the curvature and orientation of the object's surface, but also serves to encode the color or brightness of the same. In contrast to this, in conventional printing techniques, the brightness of an object is not encoded in its texture, but overlaid with said texture, and halftone printing techniques need to be applied to the object after its three-dimensional projection into the printing plane, which affects the clarity and crispness of the result.
When manufacturing a series of security documents, the surface model can be common for all of them. For example, all banknotes of a given denomination and series may use an identical object seen from the same position and using the same texture.
Alternatively, as mentioned above, the surface model may be unique for each security document in a series of documents. In particular, when manufacturing passports, identification cards or other documents of identification, the surface model may be a scanned version of the face of the bearer of the document. This makes counterfeiting particularly difficult because a counterfeiter needs to be able to replicate the present mechanism.
The printing pixels may be applied to a blank or uniform background, or they may be applied to or superimposed with a structure, e.g. a patterned background.
While there are shown and described presently preferred embodiments of the invention, it is to be distinctly understood that the invention is not limited thereto but may be otherwise variously embodied and practiced within the scope of the following claims.

Claims

Claims
1. A method for manufacturing a security document by printing an object thereon, which method comprises the steps of
providing a surface model of a two-dimensional surface in three-dimensional space,
attributing, to each point on said surface model, a brightness or color,
using a bijective mapping function for mapping said surface model into a two-dimensional parameter space, thereby defining a color or brightness image in said parameter space, wherein the color or brightness of each point in said image is given by the color or brightness of the point on the surface model corresponding to the point of the image,
rendering, in said parameter space, said image as a halftone screen pattern, which halftone screen pattern comprises an array of texture pixels, with each texture pixel being attributed to a point in said parameter space,
using an inverse of said mapping function for mapping said texture pixels onto said surface model,
projecting said surface model into a two-dimensional printing plane using a three-dimensional projection, thereby projecting said texture pixels into said printing plane,
rendering the texture pixels in said printing plane into a two-dimensional array of printing pixels,
applying said printing pixels to said security document.
2. The method of claim 1 wherein said three- dimensional projection is an orthographic projection or a perspective projection.
3. The method of any of the preceding claims wherein said mapping function is a conformal mapping function.
4. The method of claim 3 wherein said mapping function is a least squares conformal mapping function.
5. The method of any of the claims 1 or 2 wherein said mapping function is an isometric mapping function.
6. The method of any of the claims 1 or 2 wherein said mapping function is an equiareal mapping function.
7. The method of any of the preceding claims wherein said image is rendered in said parameter space by dividing said image into a plurality of halftone cells, each halftone cell comprising a plurality of said texture pixels, wherein the texture pixels of a cell are switched on or off as a function of a brightness or color level of said brightness or color image in said cell.
8. The method of claim 7 wherein a resolution of said printing pixels is sufficiently large such that, for a majority of said cells, each cell extends over a plurality of printing pixels.
9. The method of claim 8 wherein, for a majority of the halftone cells, each cell extends over a number of printing pixels being at least equal to a number of texture pixels in said cell.
10. The method of any of the claims 7 or 9 wherein an algorithm for switching the pixels of a cell on or off as a function of the color or brightness of the image at said cell is the same for all cells.
11. A method for manufacturing a series of documents using the steps of the method of any of the preceding claims for manufacturing each document, wherein a unique surface model is provided for each document.
12. The method of claim 11 wherein the documents are documents of identification and wherein the surface model represents a face of the bearer.
13. The method of any of the preceding claims wherein said step of applying said printing pixels to said security document comprises a printing of said printing pixels onto said document or onto a carrier to be applied to said document.
14. A security document manufactured by the method of any of the preceding claims.
PCT/CH2009/000330 2009-10-15 2009-10-15 Manufacturing security documents using 3d surface parameterization and halftone dithering WO2011044704A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CH2009/000330 WO2011044704A1 (en) 2009-10-15 2009-10-15 Manufacturing security documents using 3d surface parameterization and halftone dithering
EP09744312A EP2488371A1 (en) 2009-10-15 2009-10-15 Manufacturing security documents using 3d surface parameterization and halftone dithering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CH2009/000330 WO2011044704A1 (en) 2009-10-15 2009-10-15 Manufacturing security documents using 3d surface parameterization and halftone dithering

Publications (1)

Publication Number Publication Date
WO2011044704A1 true WO2011044704A1 (en) 2011-04-21

Family

ID=41698079

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CH2009/000330 WO2011044704A1 (en) 2009-10-15 2009-10-15 Manufacturing security documents using 3d surface parameterization and halftone dithering

Country Status (2)

Country Link
EP (1) EP2488371A1 (en)
WO (1) WO2011044704A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013163287A1 (en) * 2012-04-25 2013-10-31 Visual Physics, Llc Security device for projecting a collection of synthetic images
CN104838304A (en) * 2012-09-05 2015-08-12 卢门科有限责任公司 Pixel mapping, arranging, and imaging for round and square-based micro lens arrays to achieve full volume 3D and multi-directional motion
WO2016110493A1 (en) * 2015-01-09 2016-07-14 Ovd Kinegram Ag Method for producing security elements, and security elements
EP3243668A1 (en) * 2016-05-10 2017-11-15 Agfa Graphics NV Manufacturing of a security document
US9873281B2 (en) 2013-06-13 2018-01-23 Visual Physics, Llc Single layer image projection film
US10173453B2 (en) 2013-03-15 2019-01-08 Visual Physics, Llc Optical security device
US10173405B2 (en) 2012-08-17 2019-01-08 Visual Physics, Llc Process for transferring microstructures to a final substrate
US10189292B2 (en) 2015-02-11 2019-01-29 Crane & Co., Inc. Method for the surface application of a security device to a substrate
US10434812B2 (en) 2014-03-27 2019-10-08 Visual Physics, Llc Optical device that produces flicker-like optical effects
US10766292B2 (en) 2014-03-27 2020-09-08 Crane & Co., Inc. Optical device that provides flicker-like optical effects
US10800203B2 (en) 2014-07-17 2020-10-13 Visual Physics, Llc Polymeric sheet material for use in making polymeric security documents such as banknotes
US10890692B2 (en) 2011-08-19 2021-01-12 Visual Physics, Llc Optionally transferable optical system with a reduced thickness
US11590791B2 (en) 2017-02-10 2023-02-28 Crane & Co., Inc. Machine-readable optical security device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107077063B (en) * 2014-10-29 2018-11-06 惠普发展公司,有限责任合伙企业 Three-dimensional halftone
WO2018052444A1 (en) * 2016-09-16 2018-03-22 Hewlett-Packard Development Company, L.P. Datasets representing aspects of 3d object

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6198545B1 (en) * 1994-03-30 2001-03-06 Victor Ostromoukhov Method and apparatus for generating halftone images by evolutionary screen dot contours

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6198545B1 (en) * 1994-03-30 2001-03-06 Victor Ostromoukhov Method and apparatus for generating halftone images by evolutionary screen dot contours

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
FREUDENBERG B ET AL: "REAL-TIME HALFTONING: A PRIMITIVE FOR NON-PHOTOREALISTIC SHADING", RENDERING TECHNIQUES 2002. EUROGRAPHICS WORKSHOP PROCEEDINGS. PISA, ITALY, JUNE 26 - 28, 2002; [PROCEEDINGS OF THE EUROGRAPHICS WORKSHOP], NEW YORK, NY : ACM, US, vol. WORKSHOP 13, 26 June 2002 (2002-06-26), pages 227 - 231, XP001232395, ISBN: 978-1-58113-534-3 *
HORMAN K ET AL: "Mesh parameterization: theory and practice", INTERNET CITATION, August 2007 (2007-08-01), XP002500927 *
JEROME THOMA: "Non-Photorealistic Rendering Techniques for Real-Time Character Animation", INTERNET CITATION, 10 December 2002 (2002-12-10), pages 136PP, XP007913433, Retrieved from the Internet <URL:http://www.gamecareerguide.com/education/theses/20030707/jerome_thoma_thesis.pdf> [retrieved on 20100611] *
OSTROMOUKHOV V ET AL: "ARTISTIC SCREENING", COMPUTER GRAPHICS PROCEEDINGS. LOS ANGELES, AUG. 6 - 11, 1995; [COMPUTER GRAPHICS PROCEEDINGS (SIGGRAPH)], NEW YORK, IEEE, US, 6 August 1995 (1995-08-06), pages 219 - 228, XP000546231, ISBN: 978-0-89791-701-8 *
OSTROMOUKHOV V: "DIGITAL FACIAL ENGRAVING", COMPUTER GRAPHICS PROCEEDINGS. ANNUAL CONFERENCE SERIES.SIGGRAPH, XX, XX, 8 August 1999 (1999-08-08), pages 417 - 424, XP001024741 *
PRAUN E ET AL: "REAL-TIME HATCHING", COMPUTER GRAPHICS. SIGGRAPH 2001. CONFERENCE PROCEEDINGS. LOS ANGELES, CA, AUG. 12 - 17, 2001; [COMPUTER GRAPHICS PROCEEDINGS. SIGGRAPH], NEW YORK, NY : ACM, US, 12 August 2001 (2001-08-12), pages 581 - 586, XP001049933, ISBN: 978-1-58113-374-5 *

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10890692B2 (en) 2011-08-19 2021-01-12 Visual Physics, Llc Optionally transferable optical system with a reduced thickness
EP4198612A1 (en) * 2012-04-25 2023-06-21 Visual Physics, LLC Security device for projecting a collection of synthetic images
WO2013163287A1 (en) * 2012-04-25 2013-10-31 Visual Physics, Llc Security device for projecting a collection of synthetic images
US9482792B2 (en) 2012-04-25 2016-11-01 Visual Physics, Llc Security device for projecting a collection of synthetic images
CN104582978A (en) * 2012-04-25 2015-04-29 光学物理有限责任公司 Security device for projecting collection of synthetic images
RU2640716C2 (en) * 2012-04-25 2018-01-11 Визуал Физикс, Ллс Protective device for projecting set of synthetic images
EP3734352A1 (en) * 2012-04-25 2020-11-04 Visual Physics, LLC Security device for projecting a collection of synthetic images
RU2640716C9 (en) * 2012-04-25 2019-03-25 Визуал Физикс, Ллс Security device for projecting a collection of synthetic images
US10899120B2 (en) 2012-08-17 2021-01-26 Visual Physics, Llc Process for transferring microstructures to a final substrate
US10173405B2 (en) 2012-08-17 2019-01-08 Visual Physics, Llc Process for transferring microstructures to a final substrate
CN104838304A (en) * 2012-09-05 2015-08-12 卢门科有限责任公司 Pixel mapping, arranging, and imaging for round and square-based micro lens arrays to achieve full volume 3D and multi-directional motion
US10173453B2 (en) 2013-03-15 2019-01-08 Visual Physics, Llc Optical security device
US10787018B2 (en) 2013-03-15 2020-09-29 Visual Physics, Llc Optical security device
US9873281B2 (en) 2013-06-13 2018-01-23 Visual Physics, Llc Single layer image projection film
US10434812B2 (en) 2014-03-27 2019-10-08 Visual Physics, Llc Optical device that produces flicker-like optical effects
US11446950B2 (en) 2014-03-27 2022-09-20 Visual Physics, Llc Optical device that produces flicker-like optical effects
US10766292B2 (en) 2014-03-27 2020-09-08 Crane & Co., Inc. Optical device that provides flicker-like optical effects
US10800203B2 (en) 2014-07-17 2020-10-13 Visual Physics, Llc Polymeric sheet material for use in making polymeric security documents such as banknotes
JP2021060599A (en) * 2015-01-09 2021-04-15 オーファウデー キネグラム アーゲー Method for producing security element, and security element
JP2018504634A (en) * 2015-01-09 2018-02-15 オーファウデー キネグラム アーゲー Method for forming a security element and security document
EP3750717A1 (en) 2015-01-09 2020-12-16 OVD Kinegram AG Method for manufacturing a security element and a security element
CN107107646A (en) * 2015-01-09 2017-08-29 Ovd基尼格拉姆股份公司 Method and safety element for production safety element
US10583680B2 (en) 2015-01-09 2020-03-10 Ovd Kinegram Ag Method for producing security elements, and security elements
US11472216B2 (en) 2015-01-09 2022-10-18 Ovd Kinegram Ag Method for producing security elements, and security elements
WO2016110493A1 (en) * 2015-01-09 2016-07-14 Ovd Kinegram Ag Method for producing security elements, and security elements
US10189292B2 (en) 2015-02-11 2019-01-29 Crane & Co., Inc. Method for the surface application of a security device to a substrate
EP3243668A1 (en) * 2016-05-10 2017-11-15 Agfa Graphics NV Manufacturing of a security document
US10471759B2 (en) 2016-05-10 2019-11-12 Agfa Nv Manufacturing of a security document
US11590791B2 (en) 2017-02-10 2023-02-28 Crane & Co., Inc. Machine-readable optical security device

Also Published As

Publication number Publication date
EP2488371A1 (en) 2012-08-22

Similar Documents

Publication Publication Date Title
EP2488371A1 (en) Manufacturing security documents using 3d surface parameterization and halftone dithering
CA2453456C (en) Images incorporating microstructures
US5371627A (en) Random dot stereogram and method for making the same
EP1690697A1 (en) Method to apply an invisible mark on a media
US11037038B2 (en) Artwork generated to convey digital messages, and methods/apparatuses for generating such artwork
US20030021437A1 (en) Images and security documents protected by micro-structures
AU2002345270A1 (en) Images incorporating microstructures
US20140334665A1 (en) System and method for creating an animation from a plurality of latent images encoded into a visible image
JP2005512846A (en) Certificate of value
WO2010032718A1 (en) Forgery preventive printed matter, method for producing same, and recording medium in which dot data creation software is stored
JP4844894B2 (en) Three-dimensional moire formation
EP2953796A1 (en) Multiple shade latent images
CN114746904A (en) Three-dimensional face reconstruction
JP2018504634A (en) Method for forming a security element and security document
JP6134927B1 (en) Image generation method, image generation apparatus, engraving manufacturing method, program, engraving and printed matter
KR101123648B1 (en) Printing method for expressing solid texture of oil painting
JPWO2020096009A1 (en) Moire visualization pattern generation method, moire visualization pattern generation device, and moire visualization pattern generation system
CN110869555B (en) Paper preparation method capable of controlling watermark brightness and paper product
JP7185876B2 (en) latent image print
JP4229270B2 (en) Method and apparatus for simulating printed matter using two-dimensional data
EP2862345B1 (en) Simulated embossing and imprinting
US20190389243A1 (en) Optical illusion device
EP4360901A1 (en) A security element and a data carrier
WO2003042923A1 (en) Logic arrangements, storage mediums, and methods for generating digital images using brush strokes
JP4487086B2 (en) Printed material having a set pattern for continuous tone expression and its authenticity determination method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09744312

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2009744312

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2009744312

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE