WO2016178206A1 - System and method of generating a visual code


Info

Publication number
WO2016178206A1
Authority
WO
WIPO (PCT)
Prior art keywords
visual code
lighting
visual
cells
values
Application number
PCT/IL2016/050342
Other languages
French (fr)
Inventor
Itamar FRIEDMAN
Original Assignee
Eyeconit Ltd.
Application filed by Eyeconit Ltd.
Publication of WO2016178206A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/146Methods for optical code recognition the method including quality enhancement steps
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K19/00Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K19/06Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K19/06009Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking
    • G06K19/06046Constructional details
    • G06K19/06103Constructional details the marking being embedded in a human recognizable image, e.g. a company logo with an embedded two-dimensional code

Definitions

  • the presently disclosed subject matter relates, in general, to the field of generating a visual code.
  • Visual codes such as one-dimensional barcodes and two-dimensional codes have been developed as machine-readable image representations of information.
  • Many two-dimensional codes represent data in a way of dots distribution or patterns in a certain grid, such as matrix code.
  • QR Quick Response Code
  • a QR Code comprises an array of black cells (square dark dots) and white cells (square light dots). The black cells are arranged in a square pattern on a white background. In some other cases, a negative option where the background is black and the cells are white, is valid as well.
  • three distinctive squares, known as finder patterns, are located at the corners of the matrix code. Image size, orientation, and angle of viewing can be normalized. Other functional patterns, such as the alignment and timing patterns, enhance this process.
  • Two-dimensional codes are used in many applications and presented on many different visual media.
  • visual media can include, but are not limited to, magazines, newspapers, TV, cinema, the Internet, etc. These media examples can be divided into two groups: the Printed Mediums (such as printed books, newspapers, magazines and billboards) and the Screen-Enabled Mediums (such as TV, Cinema, Digital Billboards, and the Internet through PC and mobile device screens).
  • a non-transitory computer readable storage medium tangibly embodying a program of instructions executable by a machine to generate a visual code at least in accordance with a lighting condition under which the visual code is expected to be displayed, such that the generated visual code is adapted to be machine-readable upon being scanned by a scanning device under the lighting condition.
  • a computerized method of generating a visual code comprising: i) obtaining the function patterns and decoded values of the cells; and ii) calculating lightness values for a plurality of the cells based on the decoded values thereof and at least one lighting parameter, the at least one lighting parameter being indicative of a lighting condition under which the visual code is expected to be displayed, giving rise to the visual code adapted to be machine-readable upon being scanned by a scanning device under the lighting condition.
  • a computerized system of generating a visual code the visual code including function patterns and cells
  • the system comprising a processor configured to: i) obtain the function patterns and decoded values of the cells; and ii) calculate lightness values for a plurality of the cells based on the decoded values thereof and at least one lighting parameter, the at least one lighting parameter being indicative of a lighting condition under which the visual code is expected to be displayed, giving rise to the visual code adapted to be machine-readable upon being scanned by a scanning device under the lighting condition.
  • a non-transitory computer readable storage medium tangibly embodying a program of instructions executable by a machine to generate a visual code, the visual code including function patterns and cells, comprising the steps of the following: i) obtaining the function patterns and decoded values of the cells; and ii) calculating lightness values for a plurality of the cells based on the decoded values thereof and at least one lighting parameter, the at least one lighting parameter being indicative of a lighting condition under which the visual code is expected to be displayed, giving rise to the visual code adapted to be machine-readable upon being scanned by a scanning device under the lighting condition.
  • the system comprising a processor configured to: for each lighting condition of the respective lighting conditions, i) obtain at least one lighting parameter indicative of the lighting condition; ii) generate one of the visual codes, including: a) obtain the function patterns and decoded values of the cells; b) calculate lightness values for a plurality of the cells based on the decoded values thereof and the at least one lighting parameter; giving rise to the plurality of visual codes such that at least one of the plurality of visual codes is machine-readable upon being scanned by a scanning device under each the lighting condition.
  • a non-transitory computer readable storage medium tangibly embodying a program of instructions executable by a machine to generate a plurality of visual codes in accordance with respective lighting conditions, each visual code of the plurality of visual codes to be superimposed on one or more frames of a video that is expected to be displayed under the respective lighting conditions, the visual code including function patterns and cells, comprising the steps of the following: for each lighting condition of the respective lighting conditions, i) obtaining at least one lighting parameter indicative of the lighting condition; ii) generating one of the visual codes, including: a) obtaining the function patterns and decoded values of the cells; b) calculating lightness values for a plurality of the cells based on the decoded values thereof and the at least one lighting parameter; giving rise to the plurality of visual codes such that at least one of the plurality of visual codes is machine-readable upon being scanned by a scanning device under each the lighting condition.
  • the lighting condition can include one or more lighting properties related to a medium that the visual code is expected to be displayed on.
  • the lighting condition can include a lighting condition of an environment in which the visual code is expected to be displayed.
  • the lighting condition can include one or more lighting properties of a background image on which the visual code is expected to be superimposed.
  • the lighting condition can be capable of causing color transform effect upon the visual code being scanned, and the generated visual code is machine-readable under the color transform effect.
  • the obtaining can comprise receiving the function patterns and decoded values of the cells.
  • the obtaining can comprise calculating the function patterns and decoded values of the cells.
  • the calculating the function patterns can be based on the at least one lighting parameter.
  • the calculating the function patterns can comprise changing structure of the function patterns based on the at least one lighting parameter.
  • the at least one lighting parameter can be selected from the following: a lightness parameter and a darkness parameter.
  • the calculating lightness values can further comprise obtaining the at least one lighting parameter.
  • the calculating lightness values can comprise assigning medium lightness values to cells with decoded values indicating the cells being light.
  • the lightness values can be calculated for image elements associated with each of the plurality of cells.
  • the visual code can be a two-dimensional code.
  • the decoded values of cells can be calculated based on an input message to be encoded in the visual code in accordance with a visual code specification.
  • the lightness values can be represented in RGB color model.
  • the method can further comprise superimposing each of the plurality of visual codes on the one or more frames of the video.
  • the video can include animation.
  • the changing structure of the function patterns can comprise decreasing size of light areas of the function patterns and increasing size of dark areas of the function patterns.
  • the generating method can further comprise decreasing size of cells with light decoded values and increasing size of cells with dark decoded values.
  • Fig. 1 schematically illustrates a functional block diagram of a system for generating a visual code at least in accordance with a lighting condition in accordance with certain embodiments of the presently disclosed subject matter
  • Fig. 2 illustrates a generalized flowchart of generating a visual code at least in accordance with a lighting condition in accordance with certain embodiments of the presently disclosed subject matter
  • Fig. 3 illustrates a generalized flowchart of generating a plurality of visual codes in accordance with respective lighting conditions, each visual code of the plurality of visual codes to be superimposed on one or more frames of a video in accordance with certain embodiments of the presently disclosed subject matter;
  • Fig. 4A illustrates an exemplified two-dimensional code with function patterns in accordance with certain embodiments of the presently disclosed subject matter
  • Fig. 4B illustrates an exemplified two-dimensional code with modified function patterns in accordance with certain embodiments of the presently disclosed subject matter
  • Figs. 5A and 5B respectively illustrate a visual code before and after assigning lightness values to some cells thereof in order to adapt to a given lighting condition in accordance with certain embodiments of the presently disclosed subject matter;
  • Fig. 6A illustrates a clipping effect that occurs when presenting a visual code on a dark background image of a TV screen in a dark room in accordance with certain embodiments of the presently disclosed subject matter
  • Fig. 6B illustrates that, after the light cells are assigned the calculated medium lightness values, the resulting visual code is rendered machine-readable, in accordance with certain embodiments of the presently disclosed subject matter.
  • Figs. 7A-7D show exemplified illustrations of different kinds of two-dimensional codes, each embedding an input image, in accordance with certain embodiments of the presently disclosed subject matter.
  • DSP digital signal processor
  • FPGA field programmable gate array
  • ASIC application specific integrated circuit
  • non-transitory is used herein to exclude transitory, propagating signals, but to otherwise include any volatile or non-volatile computer memory technology suitable to the presently disclosed subject matter.
  • the phrase “for example,” “such as”, “for instance” and variants thereof describe non-limiting embodiments of the presently disclosed subject matter.
  • Reference in the specification to “one case”, “some cases”, “other cases” or variants thereof means that a particular feature, structure or characteristic described in connection with the embodiment(s) is included in at least one embodiment of the presently disclosed subject matter.
  • the appearance of the phrase “one case”, “some cases”, “other cases” or variants thereof does not necessarily refer to the same embodiment(s). It is appreciated that, unless specifically stated otherwise, certain features of the presently disclosed subject matter, which are described in the context of separate embodiments, can also be provided in combination in a single embodiment. Conversely, various features of the presently disclosed subject matter, which are described in the context of a single embodiment, can also be provided separately or in any suitable subcombination.
  • one or more stages illustrated in the figures may be executed in a different order and/or one or more groups of stages may be executed simultaneously and vice versa.
  • FIG. 1 illustrating a schematic functional block diagram of a system for generating a visual code at least in accordance with a lighting condition in accordance with certain embodiments of the presently disclosed subject matter.
  • visual code used herein should be expansively construed to cover any kind of machine-readable optical label that uses standardized encoding modes to encode data and store information.
  • a visual code can be a one-dimensional barcode, or alternatively it can be a two-dimensional code.
  • two-dimensional code used herein should be expansively construed to cover any optical machine-readable representation of data in the form of a two-dimensional pattern of symbols.
  • matrix code which represents data in a way of dot distribution in a matrix grid, such as, for example, Quick Response (QR) code and EZcode, etc.
  • QR Quick Response
  • A visual code, such as a two-dimensional code, typically includes function patterns and cells, represented by image elements.
  • a scanning process of a visual code by a scanning device includes acquiring an image of the visual code (e.g., by an image acquisition module of the scanning device, such as the camera of a phone), detecting the visual code and decoding the detected visual code to obtain information encoded therein (including, e.g., numerical data, strings, pointers and/or any other digital data).
  • An image of a given visual code can be acquired with great differences under different conditions, such as different lighting conditions, although the original image of the visual code is the same. In some cases certain lighting conditions can cause a color transform effect upon the visual code being scanned.
  • color transform effect should be expansively construed to cover any color effect that can change the overall color (including, e.g., hue and lightness) of image elements.
  • it may include clipping effect (e.g., a burning effect) and any other color effects known in the art. Clipping can often occur as a result of an incorrect exposure when photographing or scanning an image. For instance it may cause the lightest areas of an image, such as the sky, or light sources, to clip and typically be completely white (or in some other cases to cause a few color channels to have their highest color values).
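  • As a rough, non-authoritative sketch (not taken from the patent text), the burning effect on an 8-bit lightness scale can be modeled as multiplication by an exposure gain followed by clipping to 255; the gain value below is an assumption:

```python
def simulate_clipping(lightness_values, gain=1.6):
    """Roughly simulate an over-exposure ("burning") effect: bright values are
    pushed past the top of the 0-255 range and clipped, so originally distinct
    light tones all collapse to pure white."""
    return [min(255, int(v * gain)) for v in lightness_values]

# A pure-white cell (255), a light cell (200), a medium-gray cell (128) and a
# dark cell (0) under strong exposure:
print(simulate_clipping([255, 200, 128, 0]))  # -> [255, 255, 204, 0]
```

  Note how the originally distinct light tones 255 and 200 collapse to the same clipped value while a medium gray survives; this is the behavior that the medium lightness values discussed below are intended to exploit.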
  • such an effect is especially frequent when the visual code is displayed or presented on the Screen-Enabled media (such as TV, Cinema, Digital Billboards, and the Internet through PC and mobile device screens) under certain lighting conditions as aforementioned.
  • the area of the visual code in an acquired image will, in high probability, endure a burning effect, making the acquired visual code not machine-readable by, e.g., a visual code reader or scanner on the smart phone.
  • Fig. 6A illustrating a clipping effect that occurs when presenting a visual code on a dark background image of a TV screen or a cinema screen in a dark room in accordance with certain embodiments of the presently disclosed subject matter.
  • the acquired image of the visual code appears to be clipped and the light areas, e.g., the light cells and the function patterns of the visual code, are burnt to be completely white.
  • the white areas even appear to expand to the neighboring cells, making the visual code blurred and obscure.
  • Such a clipped visual code is most likely not machine-readable by a visual code reader or scanner.
  • a visual code can be deemed “machine-readable” if the visual code can be scanned successfully by a scanning device, specifically, if the acquired visual code can be detected and decoded successfully.
  • the system 100 can comprise a processing unit 102 (such as, e.g. a digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc) that is configured to receive instructions and to manage, control and execute operations as specified by the instructions.
  • the processing unit 102 can include a lightness value calculator 108.
  • the processing unit 102 can further include a function pattern calculator 104 and a decoded value calculator 106.
  • the system 100 can further comprise an I/O interface 110, and a storage module 112 operatively coupled to the processing unit 102.
  • the I/O interface 110 can be configured to obtain the function patterns, e.g., to receive them from a remote computer through the Internet.
  • the function patterns can be received via the I/O interface 110 from a user's input.
  • the source of the function patterns can be local or remote and can be obtained in different ways and using different technologies.
  • The function patterns can be pre-generated by the source computer, in accordance with a visual code specification.
  • the function pattern calculator 104 is configured to obtain the function patterns, e.g., to calculate the function patterns in accordance with a visual code specification.
  • the function pattern calculator 104 can also change the structure of the function patterns in order to adapt to a given lighting condition, as will be described below in further detail with respect to Fig. 2.
  • function pattern should be expansively construed to cover a group of cells or a set of shapes that are defined in a corresponding visual code specification or a machine-readable image specification and which serve a predefined ancillary function which can be used in the imaging, scanning and/or decoding process of the data that is encoded in the visual code.
  • function patterns can be used for indicating the location of the visual code or to specify certain characteristics of the visual code.
  • the QR Code specification includes provisions for the following function patterns: finder patterns, timing patterns, and alignment patterns.
  • the I/O interface 110 can be configured to obtain decoded values of the cells, e.g., to receive them from a remote computer through the Internet, or from a user's input.
  • the source of the decoded values of the cells can be local or remote and can be obtained in different ways and using different technologies.
  • the decoded values can be pre-calculated by the source computer, in accordance with a visual code specification.
  • the decoded value calculator 106 can be configured to obtain the decoded values of the cells, e.g., to calculate the decoded values in accordance with a visual code specification.
  • cell also termed as “dot” or “dot module”, as used herein relates to a group of image elements, which include at least one image element in the visual code.
  • decoded value of the cell can be a binary value, i.e., the cell is required to be either dark or light.
  • the image element can be, for example, a pixel or a vector element.
  • the decoded values of the cells can be calculated by the decoded value calculator 106 based on certain input information (e.g., an input message) to be encoded in the visual code in accordance with a visual code specification.
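  • As an illustrative sketch only, assuming the third-party Python "qrcode" package (which is not a component of the disclosed system), decoded cell values for an input message could be obtained roughly as follows:

```python
import qrcode

# Encode a (hypothetical) input message according to the QR Code specification
# and read back the grid of cells; the grid includes both the function patterns
# and the data cells.
qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_M, border=0)
qr.add_data("https://example.com/offer")
qr.make(fit=True)

cells = qr.get_matrix()  # list of rows of booleans, True marking a dark cell
decoded_values = [[1 if dark else 0 for dark in row] for row in cells]
```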
  • the lightness value calculator 108 can be configured to calculate lightness values for a plurality of the cells based on the decoded values thereof as well as at least one lighting parameter.
  • the at least one lighting parameter can be indicative of a lighting condition under which the visual code is expected to be displayed.
  • the term “lightness value” is a representation of variation in the perception of a color or color space's brightness or luminance. In other words, it refers to a value that represents the relative lightness and darkness of a color.
  • Some color models have a separate channel to represent lightness or luminance, for example, the Y channel in YUV color model, the L channel in Lab color model, etc.
  • the lightness value that is calculated by the lightness value calculator 108 can also be presented in color models such as the RGB or CMYK color model.
  • the lightness value is the value of the image element itself.
  • the lightness value as used herein can have integer values ranging from 0 to 255, where 0 is associated with pure black, and 255 is associated with pure white.
  • the range between 0 and 255 represents gray values.
  • another kind of common representation can be a rational float value between 0 and 1.
  • a lighting condition can include one or more components each related to a medium, an environment, and a background image related to the display of the visual code.
  • the lighting condition can include one or more lighting properties related to a medium that the visual code is expected to be displayed or presented on.
  • the medium can be, for example, a cinema screen, a smart phone screen, a TV screen, etc.
  • the lighting condition can include a lighting condition of an environment in which the visual code is expected to be displayed or presented, such as a lightened room or a dark room.
  • the lighting condition can include one or more lighting properties of a background image on which the visual code is expected to be superimposed. For example, if the background image is dark, the probability that a burning effect may happen is high, as shown in Fig. 6.
  • the above possible components of a lighting condition, such as the medium, environment and background image, are illustrated for exemplary purposes only and should not limit the disclosure. Accordingly, the lighting condition can include additional or alternative components or elements besides the above.
  • the lighting condition can also include the lighting setting or configuration of a scanning device, e.g., the brightness setting of a display screen of a smart phone.
  • the at least one lighting parameter indicative of the lighting condition can in some cases include a plurality of separate parameters, each indicating a respective component of a lighting condition, or alternatively it can include one comprehensive parameter that takes into consideration all the possible components of a lighting condition.
  • the at least one parameter can include a medium parameter indicating the lighting properties related to a medium that the visual code is expected to be displayed or presented on.
  • the at least one parameter can include an environment parameter indicating the lighting condition of an environment in which the visual code is expected to be displayed or presented.
  • the at least one parameter can include a background image parameter and/or a scanning device parameter, etc.
  • the at least one parameter can include a comprehensive parameter that is a weighted parameter calculated based on all the separate parameters.
  • the at least one lighting parameter can be selected from the following: a lightness parameter and/or a darkness parameter which indicates the extent of lightness of the lighting condition.
  • The lightness parameter and darkness parameter are defined from different perspectives, e.g., the lightness parameter can express estimated color transformation effects for light areas of the image, and the darkness parameter can express estimated color transformation effects for dark areas of the image.
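  • A minimal sketch, under assumed scales and weights that the patent does not prescribe, of deriving one comprehensive lightness parameter from separate medium, environment and background parameters (see the weighted-parameter bullet above):

```python
def comprehensive_lightness_parameter(medium, environment, background,
                                      weights=(0.5, 0.3, 0.2)):
    """Combine separate lighting parameters (each assumed here to be on a
    0-255 scale) into one weighted comprehensive parameter. The weights and
    scales are illustrative assumptions only."""
    w_m, w_e, w_b = weights
    return round(w_m * medium + w_e * environment + w_b * background)

# Example: a harsh cinema medium, a dark room and a dark background image.
print(comprehensive_lightness_parameter(medium=100, environment=150, background=160))
# -> 127, close to the medium-gray level of 128 used in the examples below
```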
  • the lighting parameter can be predetermined based on a given lighting condition, or alternatively it can be received via the I/O interface 110.
  • lightness values can be calculated based on the decoded values and the at least one lighting parameter.
  • the visual code is generated which is adapted to the given lighting condition. Specifically, the generated visual code is machine-readable upon being scanned by a scanning device under the given lighting condition. Further details will be described below with respect to Figs. 2 and 3.
  • the I/O interface 110 can be configured to obtain, e.g., the function patterns and/or the decoded values of the cells of the visual code from a source computer.
  • the I/O interface 110 can also be configured to obtain the at least one lighting parameter or indications of a lighting condition, such as the above described possible components, from, e.g., a user. It would be appreciated that the I/O interface 110 in some cases can obtain further inputs, such as a visual code specification, and/or tolerances and other characteristics and/or parameters which are associated with a decoder or scanner of the visual code. For example, certain characteristics of the hardware and/or the software that is involved in the imaging, scanning and/or decoding of the visual code can also be obtained as input to the process of generating a visual code.
  • the storage module 112 comprises a non-transitory computer readable storage medium that stores data and enables retrieval of various data for the processing unit 102 to process.
  • the storage module 112 can store, for example, one or more of the following: the acquired images, function patterns, decoded values, lightness values, and lighting parameters etc.
  • system 100 can be implemented as a standalone system dedicated for performing such generating process, or alternatively, the functionality of system 100 can be integrated as a sub-unit or component of a general purpose computer or electronic device.
  • the functionality of system 100 can be realized by running a computer readable program or application on a general purpose computer hardware including but not limited to servers, personal computers, laptop computers, smart phones (e.g. iPhone, etc), PDAs, tablet computers (e.g. Apple iPad), or any other suitable device.
  • system 100 described here with reference to Fig. 1 can be a distributed device or system, which includes several functional components which reside on different devices and are controlled by a control layer as a virtual entity to perform the operations described herein.
  • the I/O interface 110 and/or the storage module 112 can reside on a local computer, while the processing unit 102 or part of the components thereof can reside on a remote server for performing the processing and calculation.
  • the function pattern calculator 104 and/or the decoded value calculator 106 can reside on a different computer from the lightness value calculator 108.
  • the term "processing unit” should be expansively construed to include a single processor or a plurality of processors which may be distributed locally or remotely.
  • the processing unit and/or storage module can in some cases be cloud-based.
  • Fig. 2 schematically illustrating a generalized flowchart of generating a visual code at least in accordance with a lighting condition in accordance with certain embodiments of the presently disclosed subject matter.
  • Function patterns of a visual code can be obtained (210) (e.g., by the I/O interface 110 illustrated in Fig. 1). In some cases the function patterns can be received from a remote computer through the Internet or alternatively from a user's input. It would be appreciated that the source that provides the function patterns can be local or remote and the function patterns can be obtained in different ways and using different technologies.
  • the function patterns can be pre-generated and provided by the source computer, in accordance with a visual code specification.
  • the function patterns can be calculated (e.g., by the function pattern calculator 104 illustrated in Fig. 1) in accordance with a visual code specification. Specifically, the values for the function patterns can be calculated at least according to a function pattern specification in the visual code specification.
  • the function patterns are used to enable certain operations that are related to the imaging, scanning and/or decoding of the visual code.
  • the function patterns can include various patterns or combinations of patterns that can enable a visual code reader/scanner to identify that a certain image contains a readable visual code, locate the readable visual code in the image, and configure a scanning and/or encoding process according to parameters indicated by the function patterns.
  • Fig. 4A showing an exemplified two-dimensional code with function patterns in accordance with certain embodiments of the presently disclosed subject matter.
  • three finder patterns 402, 404 and 406 can be located at the three corners of the visual code.
  • the finder patterns are capable of gaining a predefined frequency component ratio irrespective of the orientation of the scanning line, when a scanning line passes through the center of each positioning pattern.
  • finder pattern 402 there is a large square 408 with light color values, a middle concentric square 410 with dark color values, another smaller concentric middle square 412 with light color values, and a small concentric square 414 with dark color values.
  • This pattern gains a frequency dark-and-light component ratio of 1:1:1:3:1:1:1.
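  • The 1:1:1:3:1:1:1 ratio can be reproduced with a short sketch (not part of the patent) that builds the nested squares of a finder pattern, together with its light surround, and measures the run lengths along the central scan line (1 marks a dark image element, 0 a light one):

```python
def finder_with_surround(size=9):
    grid = [[0] * size for _ in range(size)]   # light surround (square 408)
    for r in range(1, 8):
        for c in range(1, 8):
            grid[r][c] = 1                     # 7x7 dark square (410)
    for r in range(2, 7):
        for c in range(2, 7):
            grid[r][c] = 0                     # 5x5 light square (412)
    for r in range(3, 6):
        for c in range(3, 6):
            grid[r][c] = 1                     # 3x3 dark centre (414)
    return grid

def run_lengths(line):
    runs, current, count = [], line[0], 0
    for value in line:
        if value == current:
            count += 1
        else:
            runs.append(count)
            current, count = value, 1
    runs.append(count)
    return runs

center_row = finder_with_surround()[4]
print(run_lengths(center_row))  # -> [1, 1, 1, 3, 1, 1, 1]
```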
  • Fig. 4B illustrating an exemplified two-dimensional code with modified function patterns in accordance with certain embodiments of the presently disclosed subject matter.
  • the structure of the function patterns can be changed or modified during or after the calculation of the function patterns as described above in order for the generated visual code to be adapted to a given lighting condition.
  • a possible way to enhance the readability of a visual code under a lighting condition that may cause such burning effect is to decrease the size of the light areas of the function patterns, and increase the size of the dark areas of the function patterns.
  • when the finder pattern 402 shown in Fig. 4A is modified in this way, the burning effect, when it occurs, makes the frequency component ratio in the acquired image of the visual code closer to the original ratio of 1light-1dark-1white-3dark-1white-1dark-1light, such that the acquired visual code is machine-readable by a reader/decoder of a scanning device.
  • the above described modification of the structure of function patterns is performed in accordance with at least one lighting parameter that indicates a given lighting condition.
  • the at least one lighting parameter can include a medium parameter indicating the medium that the visual code is displayed on, and the modification will only be triggered if the medium parameter is set to a certain type or certain value, such as, for instance, a cinema medium.
  • function patterns such as the frequency component ratio
  • the outer component e.g., the light square 408, of the function patterns may not be considered in the scanning process.
  • the exact size or ratio of such component is not important for the scanning process.
  • such component can be used to border the pattern that is searched for.
  • the scanning process might be looking for 1:1:3:1:1 (1dark-1white-3dark-1white-1dark) instead of 1:1:1:3:1:1:1 (1light-1dark-1white-3dark-1white-1dark-1light).
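  • A sketch of such a ratio check (the tolerance value is an assumption, not taken from the patent): given the run lengths of a candidate dark-light-dark-light-dark sequence, test whether they are close to 1:1:3:1:1:

```python
def matches_1_1_3_1_1(runs, tolerance=0.4):
    """Check whether five run lengths approximate the 1:1:3:1:1 ratio.
    The total width corresponds to 7 units; each run may deviate from its
    expected width by at most `tolerance` times that width."""
    if len(runs) != 5:
        return False
    unit = sum(runs) / 7.0
    expected = (1, 1, 3, 1, 1)
    return all(abs(run - e * unit) <= e * unit * tolerance
               for run, e in zip(runs, expected))

print(matches_1_1_3_1_1([4, 4, 12, 4, 4]))   # True: undistorted pattern
print(matches_1_1_3_1_1([1, 6, 10, 6, 1]))   # False: light runs inflated by burning
```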
  • decoded values of the cells of a visual code can be obtained (220) (e.g., by the I/O interface 110 illustrated in Fig. 1).
  • the decoded values can be received from a remote computer through the Internet or alternatively from a user's input.
  • the decoded values can be pre-calculated and provided by the source computer, at least in accordance with a visual code specification.
  • the decoded values of the cells can be calculated (e.g., by the decoded value calculator 106 illustrated in Fig. 1) at least in accordance with a visual code specification.
  • the values for the cells can be calculated at least according to a general cell specification in the visual code specification.
  • the decoded values for the cells are calculated based on the visual code specification and further based on the input message that is to be encoded in the readable matrix code, such that the message can be read from the visual code, e.g., by decoding the visual code.
  • the message can be a predetermined message or alternatively the message can be an input received from a user or can be received from any other source.
  • the cells of the visual code can comprise free cells that encode the message, and derived cells that serve for e.g., error correction purposes and other functionalities.
  • a decoded value for a cell according to a cell specification can be a binary value, and for example, the cell color is suggested to be either dark or light.
  • the decoded value can be from a set of N (N ≥ 2) values. When N equals two, these two values could be one and zero, and could be visually represented by light and dark colors.
  • N could be equal to four, which means that four values are allowed and possible as the decoded value. For example, these four values could be zero, one, two and three, and could be visually represented by Black, Cyan, Magenta, and Yellow colors.
  • these four values could be '50', '80', '140' and '210', and could be visually represented by ranges of grayscale values: 0 to 50 for value '50', 51 to 110 for value '80', 111 to 170 for value '140' and 171 to 255 for value '210'.
  • a range of values can be used for representing one decoded value of a cell of a visual code, beyond the suggested color values, including using a range of color and/or luminosity values for the image elements representing the cell.
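  • A small sketch of the four-valued example above, with the decoded values '50', '80', '140' and '210' rendered at their nominal grayscale levels and read back by the band a measured level falls into (the helper names are illustrative, not from the patent):

```python
BANDS = [(0, 50, '50'), (51, 110, '80'), (111, 170, '140'), (171, 255, '210')]

def render_cell(decoded_value):
    """Render a cell at the grayscale level that names its decoded value."""
    return int(decoded_value)

def read_cell(measured_gray):
    """Map a measured grayscale level back to the decoded value of its band."""
    for low, high, value in BANDS:
        if low <= measured_gray <= high:
            return value
    raise ValueError("grayscale level outside 0-255")

print(read_cell(render_cell('140')))  # -> '140'
print(read_cell(95))                  # -> '80' (95 falls in the 51-110 band)
```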
  • lightness values for a plurality of cells of the visual code can be calculated (230) (e.g., by the lightness value calculator 108 as illustrated in Fig. 1) based on the decoded values of such cells and at least one lighting parameter.
  • the at least one lighting parameter can be indicative of a lighting condition under which the visual code is expected to be displayed.
  • the visual code generated as such is adapted to be machine-readable upon being scanned by a scanning device under the given lighting condition.
  • the calculation of the lightness values can further comprise: i) obtaining the at least one lighting parameter indicative of a lighting condition under which the visual code is expected to be displayed, and ii) determining (and assigning) the lightness values based on the decoded values of the plurality of cells and the at least one lighting parameter.
  • a lighting condition can include one or more components each related to a medium, an environment and a background image related to the display of the visual code and possibly also a scanning device related to the scanning of the visual code.
  • the at least one lighting parameter can be obtained as indications of such lighting condition.
  • the at least one lighting parameter can include a plurality of separate parameters, each indicating a respective component of a lighting condition, or alternatively it can include one comprehensive parameter that takes into consideration all the possible components of a lighting condition.
  • the at least one parameter can include one or more of the following: a medium parameter, an environment parameter, a background image parameter and a scanning device parameter, etc.
  • the at least one lighting parameter can include a comprehensive parameter which can be, e.g., a weighted parameter calculated based on all the separate parameters.
  • a visual code that is expected to be presented on a medium of a cinema screen.
  • the environment for the display of the visual code is a cinema hall which can be dark.
  • the visual code can be superimposed on a background image or one or more frames of a video which can be dark as well.
  • an image of the visual code acquired by a scanning device, e.g., a phone with a camera, may be clipped.
  • the at least one lighting parameter can include three separate parameters including a medium parameter which can be set as cinema medium, or harsh cinema medium, an environment parameter which can be set as dark room, and a background image parameter which can be set as dark image.
  • these lighting parameters can also be configured with numerical values that are equivalently indicative.
  • a comprehensive parameter can be derived from the given lighting condition and the above separate parameters, and can be assigned with a numerical value.
  • such a numerical value of the lightness parameter can further indicate the lightness values that can be assigned to some of the cells.
  • the lightness value can then be determined based on the decoded values of at least some of the cells and the at least one lighting parameter as configured above.
  • a plurality of cells that have decoded values indicating the cells to be light (hereinafter termed as “light cells”, or “cells with light decoded values”) can be assigned with a medium lightness value, such as, e.g., 128, in order to mitigate the clipping effect.
  • Cells that have dark decoded values (hereinafter termed “dark cells”, or “cells with dark decoded values”) can retain their dark lightness values.
  • FIG. 5A shows a visual code 502 superimposed on a dark background image, the visual code 502 including light cells and dark cells.
  • the same visual code 502 on the same dark background image is displayed on a TV screen in a dark meeting room.
  • a scanning device e.g., a scanner software on a smart phone
  • the acquired image, including the acquired visual code 602, is clipped.
  • the resulting visual code 504 is illustrated in Fig. 5B, where it is shown that the original light cells now appear in a medium level of gray.
  • the visual code 604 acquired by the same scanning device, as shown in Fig. 6B, appears to be clear and very close to the visual code 502, and is thus rendered machine-readable.
  • the lightness values are calculated or assigned to image elements associated with each light cell.
  • each cell can be associated with a group of pixels.
  • the lightness value that is calculated for a given cell can be applied to each pixel that is associated with the cell.
  • the lightness value can also be assigned to part of the cell. For example, at least half of the image elements associated with a given cell can be assigned with the calculated lightness value.
  • if the comprehensive lightness parameter is determined or configured as a numerical value, e.g., 128, the lightness values that can be assigned to a light cell, or to a group of image pixels associated with the light cell, can be determined in accordance with the value of the lightness parameter, e.g., within a range of [128-δ, 128+δ].
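  • A minimal sketch, assuming a comprehensive lightness parameter of 128, an illustrative δ of 3 and the cell convention used in the sketches above (1 marking a dark cell, 0 a light cell), of assigning lightness values to the cells:

```python
import random

def assign_lightness(decoded_values, lightness_parameter=128, delta=3,
                     dark_level=0, pixels_per_cell=4):
    """Light cells receive a medium lightness drawn from the range
    [parameter - delta, parameter + delta]; dark cells keep a dark level.
    Each cell is expanded into a small group of image elements."""
    result = []
    for row in decoded_values:
        result_row = []
        for value in row:
            if value == 1:                               # dark cell
                level = dark_level
            else:                                        # light cell
                level = random.randint(lightness_parameter - delta,
                                       lightness_parameter + delta)
            result_row.append([level] * pixels_per_cell)
        result.append(result_row)
    return result

code_pixels = assign_lightness([[1, 0, 1], [0, 1, 0]])
```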
  • the modification of the structure of function patterns, as described above with respect to block 210 can be triggered based on the at least one lighting parameter, e.g., the medium parameter.
  • the medium parameter is assigned with a cinema medium, or harsh cinema medium
  • the structure of function patterns of the visual code can be changed or modified as described with respect to block 210.
  • the size of the light areas can be decreased and the size of the dark areas can be enlarged.
  • the generating method can further comprise decreasing the size of cells with light decoded values and increasing the size of cells with dark decoded values, in a similar manner as described above with reference to changing the structure of the function patterns.
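  • A sketch of such size biasing (the pixel sizes, margin and lightness levels are assumptions): light cells are drawn as a smaller light square inside a dark margin, which effectively enlarges the surrounding dark areas:

```python
def render_with_size_bias(decoded_values, cell_px=8, light_shrink=2,
                          light_level=128, dark_level=0):
    """Rasterize cells so that light areas are decreased and dark areas are
    increased: the whole canvas starts dark, and each light cell only paints a
    centered, shrunken light square."""
    height, width = len(decoded_values), len(decoded_values[0])
    image = [[dark_level] * (width * cell_px) for _ in range(height * cell_px)]
    for r, row in enumerate(decoded_values):
        for c, value in enumerate(row):
            if value == 0:                               # light cell, shrunken
                for y in range(r * cell_px + light_shrink,
                               (r + 1) * cell_px - light_shrink):
                    for x in range(c * cell_px + light_shrink,
                                   (c + 1) * cell_px - light_shrink):
                        image[y][x] = light_level
    return image
```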
  • the visual code generated to be adapted to a specific lighting condition as described above, especially to a specific medium, may not be machine-readable when presented on other, non-designated mediums.
  • a clipping effect is mostly related to Screen-Enabled media
  • the present disclosure is not limited to such media, and accordingly can also be applied to printed media, in order to enhance the readability of printed visual codes, especially printed colored visual codes.
  • the presently disclosed subject matter can be used to generate visual codes that are expected to be printed on a printed medium, such as, e.g., a newspaper.
  • a printed medium such as, e.g., a newspaper.
  • the paper quality of certain newspaper could be low and have a grayish paper color.
  • the lighting parameter e.g., the comprehensive parameter
  • the lighting parameter could be assigned with a high value, such as, e.g., 245.
  • Each light cell can then be assigned, for at least half of the image elements associated with such cell, color values with lightness value equal to 245.
  • the lightness parameter of 245, as described above, can indicate that lightness values in a range of [245-δ, 245+δ] can be assigned to a group of image pixels associated with a light cell.
  • the printing process could be an offset printing process which might present minor shifting errors between color plates.
  • the modification of the structure of the function patterns, as described above with respect to block 210, can be triggered based on the at least one lighting parameter, e.g., the medium parameter. If the medium parameter is assigned with an offset printing medium, then the structure of the function patterns can be changed, for example, by enlarging the size of the light areas and decreasing the size of the dark areas.
  • a visual code can be a two-dimensional code which refers to any optical machine-readable representation of data in the form of a two-dimensional pattern of symbols as defined above.
  • the visual code as referred to herein can be generated as a special two-dimensional code having an input image or graphic embedded therein.
  • Figs. 7A-7D show exemplified illustrations of different kinds of two-dimensional codes, each embedding an input image, in accordance with certain embodiments of the presently disclosed subject matter.
  • Fig. 7A and Fig. 7B are two-dimensional codes that have input images superimposed thereon.
  • the calculation of the decoded values of the cells as performed by the decoded value calculator 106 can further include superimposing the input image on the code, e.g., by changing the transparency of the dots/cells in the two-dimensional code without changing the distribution of the dots or adjusting the decoded values thereof, such that the two-dimensional code, after being superimposed with the input image, is still machine-readable.
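  • As an illustration only (the blending factor is an assumed value, not taken from the patent), superimposing by adjusting transparency can be sketched as a per-element alpha blend that keeps dark and light cells clearly distinguishable:

```python
def superimpose(code_lightness, background_lightness, cell_alpha=0.7):
    """Blend each image element of the code with the underlying background
    element. A high alpha keeps the code's dark/light cell values dominant, so
    the dot distribution and decoded values are not changed."""
    blended = []
    for code_row, bg_row in zip(code_lightness, background_lightness):
        blended.append([int(cell_alpha * c + (1 - cell_alpha) * b)
                        for c, b in zip(code_row, bg_row)])
    return blended

# A dark cell (0) over a bright background element (200) stays clearly dark,
# and a light cell (255) over a dark element (40) stays clearly light:
print(superimpose([[0, 255]], [[200, 40]]))  # -> [[60, 190]]
```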
  • Fig. 7C shows a different kind of two-dimensional code in which the input image is not only simply superimposed thereon as described with respect to Fig. 7A and Fig. 7B.
  • in the two-dimensional code illustrated in Fig. 7C, the decoded values of cells that correspond to the encoded data/information are in fact calculated or determined by the decoded value calculator 106 such that the appearance of the two-dimensional code complies with a visual similarity criterion when compared with the input image.
  • An exemplified illustration of such a two-dimensional code is described in US patent No. 8,978,989, issued on March 17, 2015, which is incorporated herein in its entirety by reference.
  • Fig. 7D shows another kind of two-dimensional code having an input image embedded therein.
  • the calculation of the decoded values of the cells as performed by the decoded value calculator 106 can further include positioning the cells having decoded values corresponding to the encoded data in one or more encoding regions relative to the function patterns and a portion of the input image, rendering the two-dimensional code visually more appealing.
  • the system 100 can further comprise an additional functional component to calculate an image descriptor for the input image and associate the input image with the image descriptor which is used to verify the authenticity of the two-dimensional code in the reading process thus rendering the code to be functionally safer and stronger.
  • An exemplified illustration of such two-dimensional code is described in US patent application No. 62/097,748, filed on date of December 30, 2014, which is incorporated herein in its entirety by reference.
  • FIG.3 there is illustrated a generalized flowchart of generating a plurality of visual codes in accordance with respective lighting conditions, each visual code of the plurality of visual codes to be superimposed on one or more frames of a video in accordance with certain embodiments of the presently disclosed subject matter.
  • visual codes can be superimposed over a video or an animation which includes more than one frame.
  • different visual codes can be generated (in a similar manner as described above with reference to Fig. 2) in accordance with respective lighting conditions.
  • Each generated visual code can be superimposed on one or more frames of a video or animation that is expected to be displayed under the respective lighting conditions.
  • the resulting video or animation can include a plurality of such visual codes, each designated to a different lighting condition (e.g., a different medium).
  • At least one lighting parameter can be obtained (310) (e.g., by the lightness value calculator 108 illustrated in Fig. 1).
  • the at least one lighting parameter can be indicative of a specific lighting condition.
  • a lighting condition under which a video or animation is expected to be displayed can largely vary.
  • the lighting condition can include one or more components or scenarios each related to, e.g., a medium and an environment for the display of the video.
  • a video can be published in a video sharing service such as Youtube.com or Youku.com.
  • the video can then be presented on a small mobile screen, on a PC screen or a large smart TV screen.
  • the video can then be presented in a dark room or a lightened room.
  • the video may be presented as part of the video sharing website within a frame while a portion of the website is also presented, alternatively it can be presented in wide screen where only the video is presented.
  • the lighting condition can be different from others.
  • at least one lighting parameter can be obtained indicating such specific condition, in a similar way as described with reference to Fig. 2.
  • a visual code can be generated (320) for such specific lighting condition, including: obtaining (330) the function patterns (e.g., by the function pattern calculator 104 or the I/O interface 110 illustrated in Fig. 1) and decoded values of the cells (e.g., by the decoded value calculator 106 or the I/O interface 110 illustrated in Fig. 1), and calculating (340) (e.g., by the lightness value calculator 108 illustrated in Fig. 1) lightness values for a plurality of the cells based on the decoded values thereof and the at least one lighting parameter.
  • the generation process of a visual code is performed in a similar manner as described with reference to Fig. 2.
  • the process in Fig. 3 can further include superimposing each of the generated plurality of visual codes on one or more frames of the video. Therefore the resulting video can present many visual codes adapted to various designated lighting conditions, such that for each of a plurality of lighting conditions (the plurality of lighting conditions being included in the designated lighting conditions) in which different users are watching the video and scanning the codes, there will be a visual code on one or more frames that can enable a successful scan for the user.
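  • A sketch of this per-condition loop (the condition names, parameter values and frame ranges are illustrative assumptions, and generate_visual_code stands in for the single-code flow of Fig. 2):

```python
LIGHTING_CONDITIONS = {
    "mobile_screen_lit_room": 230,   # assumed lightness parameter per condition
    "pc_screen":              200,
    "dark_room_large_tv":     128,
}

def generate_visual_code(message, lightness_parameter):
    # Placeholder for the single-code flow of Fig. 2: obtain the function
    # patterns and decoded cell values, then calculate lightness values.
    return {"message": message, "lightness_parameter": lightness_parameter}

def codes_for_video(message, frames_per_condition=25):
    """Generate one visual code per lighting condition and assign each code to
    its own range of frames, so that at least one code in the resulting video
    is readable under each designated condition."""
    schedule = []
    for i, (condition, parameter) in enumerate(LIGHTING_CONDITIONS.items()):
        code = generate_visual_code(message, parameter)
        first = i * frames_per_condition
        schedule.append((condition, code,
                         range(first, first + frames_per_condition)))
    return schedule
```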
  • a video as used herein refers to one or more frames of images. It can in some cases include animation.
  • system can be implemented, at least partly, as a suitably programmed computer.
  • the presently disclosed subject matter contemplates a computer program being readable by a computer for executing the disclosed method.
  • the presently disclosed subject matter further contemplates a machine-readable memory tangibly embodying a program of instructions executable by the machine for executing the disclosed method.


Abstract

There is provided a computerized method and system of generating a visual code, the visual code including function patterns and cells, the method comprising: i) obtaining the function patterns and decoded values of the cells; and ii) calculating lightness values for a plurality of the cells based on the decoded values thereof and at least one lighting parameter, the at least one lighting parameter being indicative of a lighting condition under which the visual code is expected to be displayed, giving rise to the visual code adapted to be machine-readable upon being scanned by a scanning device under the lighting condition. There is further provided a computerized method and system of generating a plurality of visual codes in accordance with respective lighting conditions, each visual code of the plurality of visual codes to be superimposed on one or more frames of a video that is expected to be displayed under the respective lighting conditions.

Description

SYSTEM AND METHOD OF GENERATING A VISUAL CODE
TECHNICAL FIELD
The presently disclosed subject matter relates, in general, to the field of generating a visual code.
BACKGROUND
Visual codes, such as one-dimensional barcodes and two-dimensional codes have been developed as machine-readable image representations of information. Many two-dimensional codes represent data in a way of dots distribution or patterns in a certain grid, such as matrix code.
One common matrix code is the QR (Quick Response) Code. A QR Code comprises an array of black cells (square dark dots) and white cells (square light dots). The black cells are arranged in a square pattern on a white background. In some other cases, a negative option where the background is black and the cells are white, is valid as well. In one embodiment of the QR Code, three distinctive squares, known as finder patterns, are located at the corners of the matrix code. Image size, orientation, and angle of viewing can be normalized. Other functional patterns, such as the alignment and timing patterns, enhance this process.
Two-dimensional codes are used in many applications and presented on many different visual media. Examples of such visual media can include, but are not limited to, magazines, newspapers, TV, cinema, the Internet, etc. These media examples can be divided into two groups: the Printed Mediums (such as printed books, newspapers, magazines and billboards) and the Screen-Enabled Mediums (such as TV, Cinema, Digital Billboards, and the Internet through PC and mobile device screens).
GENERAL DESCRIPTION
In accordance with certain aspects of the presently disclosed subject matter, there is provided a computerized method of generating a visual code at least in accordance with a lighting condition under which the visual code is expected to be displayed, such that the generated visual code is adapted to be machine-readable upon being scanned by a scanning device under the lighting condition.
In accordance with other aspects of the presently disclosed subject matter, there is provided a computerized system of generating a visual code at least in accordance with a lighting condition under which the visual code is expected to be displayed, such that the generated visual code is adapted to be machine-readable upon being scanned by a scanning device under the lighting condition.
In accordance with other aspects of the presently disclosed subject matter, there is provided a non-transitory computer readable storage medium tangibly embodying a program of instructions executable by a machine to generate a visual code at least in accordance with a lighting condition under which the visual code is expected to be displayed, such that the generated visual code is adapted to be machine-readable upon being scanned by a scanning device under the lighting condition.
In accordance with certain aspects of the presently disclosed subject matter, there is provided a computerized method of generating a visual code, the visual code including function patterns and cells, the method comprising: i) obtaining the function patterns and decoded values of the cells; and ii) calculating lightness values for a plurality of the cells based on the decoded values thereof and at least one lighting parameter, the at least one lighting parameter being indicative of a lighting condition under which the visual code is expected to be displayed, giving rise to the visual code adapted to be machine-readable upon being scanned by a scanning device under the lighting condition.
In accordance with other aspects of the presently disclosed subject matter, there is provided a computerized system of generating a visual code, the visual code including function patterns and cells, the system comprising a processor configured to: i) obtain the function patterns and decoded values of the cells; and ii) calculate lightness values for a plurality of the cells based on the decoded values thereof and at least one lighting parameter, the at least one lighting parameter being indicative of a lighting condition under which the visual code is expected to be displayed, giving rise to the visual code adapted to be machine-readable upon being scanned by a scanning device under the lighting condition.
In accordance with other aspects of the presently disclosed subject matter, there is provided a non-transitory computer readable storage medium tangibly embodying a program of instructions executable by a machine to generate a visual code, the visual code including function patterns and cells, comprising the steps of the following: i) obtaining the function patterns and decoded values of the cells; and ii) calculating lightness values for a plurality of the cells based on the decoded values thereof and at least one lighting parameter, the at least one lighting parameter being indicative of a lighting condition under which the visual code is expected to be displayed, giving rise to the visual code adapted to be machine-readable upon being scanned by a scanning device under the lighting condition.
In accordance with certain aspects of the presently disclosed subject matter, there is provided a computerized method of generating a plurality of visual codes in accordance with respective lighting conditions, each visual code of the plurality of visual codes to be superimposed on one or more frames of a video that is expected to be displayed under the respective lighting conditions, the visual code including function patterns and cells, the method comprising: for each lighting condition of the respective lighting conditions, i) obtaining at least one lighting parameter indicative of the lighting condition; ii) generating one of the visual codes, including: a) obtaining the function patterns and decoded values of the cells; b) calculating lightness values for a plurality of the cells based on the decoded values thereof and the at least one lighting parameter, giving rise to the plurality of visual codes such that at least one of the plurality of visual codes is machine-readable upon being scanned by a scanning device under each the lighting condition.
In accordance with certain aspects of the presently disclosed subject matter, there is provided a computerized system of generating a plurality of visual codes in accordance with respective lighting conditions, each visual code of the plurality of visual codes to be superimposed on one or more frames of a video that is expected to be displayed under the respective lighting conditions, the visual code including function patterns and cells, the system comprising a processor configured to: for each lighting condition of the respective lighting conditions, i) obtain at least one lighting parameter indicative of the lighting condition; ii) generate one of the visual codes, including: a) obtain the function patterns and decoded values of the cells; b) calculate lightness values for a plurality of the cells based on the decoded values thereof and the at least one lighting parameter; giving rise to the plurality of visual codes such that at least one of the plurality of visual codes is machine-readable upon being scanned by a scanning device under each of the lighting conditions.
In accordance with certain aspects of the presently disclosed subject matter, there is provided a non-transitory computer readable storage medium tangibly embodying a program of instructions executable by a machine to generate a plurality of visual codes in accordance with respective lighting conditions, each visual code of the plurality of visual codes to be superimposed on one or more frames of a video that is expected to be displayed under the respective lighting conditions, the visual code including function patterns and cells, the program comprising the following steps: for each lighting condition of the respective lighting conditions, i) obtaining at least one lighting parameter indicative of the lighting condition; ii) generating one of the visual codes, including: a) obtaining the function patterns and decoded values of the cells; b) calculating lightness values for a plurality of the cells based on the decoded values thereof and the at least one lighting parameter; giving rise to the plurality of visual codes such that at least one of the plurality of visual codes is machine-readable upon being scanned by a scanning device under each of the lighting conditions.
In accordance with further aspects of the presently disclosed subject matter, and optionally, in combination with any of the appropriate above aspects, the lighting condition can include one or more lighting properties related to a medium that the visual code is expected to be displayed on. The lighting condition can include a lighting condition of an environment in which the visual code is expected to be displayed. The lighting condition can include one or more lighting properties of a background image on which the visual code is expected to be superimposed. The lighting condition can be capable of causing a color transform effect upon the visual code being scanned, and the generated visual code is machine-readable under the color transform effect. In accordance with further aspects of the presently disclosed subject matter, and optionally, in combination with any of the appropriate above aspects, the obtaining can comprise receiving the function patterns and decoded values of the cells. The obtaining can comprise calculating the function patterns and decoded values of the cells. The calculating the function patterns can be based on the at least one lighting parameter. The calculating the function patterns can comprise changing structure of the function patterns based on the at least one lighting parameter. The at least one lighting parameter can be selected from the following: a lightness parameter and a darkness parameter.
In accordance with further aspects of the presently disclosed subject matter, and optionally, in combination with any of the appropriate above aspects, the calculating lightness values can further comprise obtaining the at least one lighting parameter. The calculating lightness values can comprise assigning medium lightness values to cells with decoded values indicating the cells being light. The lightness values can be calculated for image elements associated with each of the plurality of cells. The visual code can be a two-dimensional code. The decoded values of cells can be calculated based on an input message to be encoded in the visual code in accordance with a visual code specification. The lightness values can be represented in an RGB color model. The method can further comprise superimposing each of the plurality of visual codes on the one or more frames of the video. The video can include animation. The changing structure of the function patterns can comprise decreasing size of light areas of the function patterns and increasing size of dark areas of the function patterns. The generating method can further comprise decreasing size of cells with light decoded values and increasing size of cells with dark decoded values.

BRIEF DESCRIPTION OF THE DRAWINGS
In order to understand the presently disclosed subject matter and to see how it may be carried out in practice, the subject matter will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which: Fig. 1 schematically illustrates a functional block diagram of a system for generating a visual code at least in accordance with a lighting condition in accordance with certain embodiments of the presently disclosed subject matter; Fig. 2 illustrates a generalized flowchart of generating a visual code at least in accordance with a lighting condition in accordance with certain embodiments of the presently disclosed subject matter;
Fig. 3 illustrates a generalized flowchart of generating a plurality of visual codes in accordance with respective lighting conditions, each visual code of the plurality of visual codes to be superimposed on one or more frames of a video in accordance with certain embodiments of the presently disclosed subject matter;
Fig. 4A illustrates an exemplified two-dimensional code with function patterns in accordance with certain embodiments of the presently disclosed subject matter;
Fig. 4B illustrates an exemplified two-dimensional code with modified function patterns in accordance with certain embodiments of the presently disclosed subject matter;
Figs. 5A and 5B respectively illustrate a visual code before and after assigning lightness values to some cells thereof in order to adapt to a given lighting condition in accordance with certain embodiments of the presently disclosed subject matter;
Fig. 6A illustrates a clipping effect that occurs when presenting a visual code on a dark background image of a TV screen in a dark room in accordance with certain embodiments of the presently disclosed subject matter;
Fig. 6B illustrates that, after the light cells are assigned the calculated medium lightness values, the resulting visual code is rendered machine-readable, in accordance with certain embodiments of the presently disclosed subject matter; and
Figs. 7A-7D show exemplified illustrations of different kinds of two-dimensional codes each embedding an input image in accordance with certain embodiments of the presently disclosed subject matter.

DETAILED DESCRIPTION
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the disclosed subject matter. However, it will be understood by those skilled in the art that the present disclosed subject matter can be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the present disclosed subject matter. In the drawings and descriptions set forth, identical reference numerals indicate those components that are common to different embodiments or configurations.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "generating", "obtaining", "determining", "scanning", "receiving", "assigning", "causing", "superimposing", or the like, include action and/or processes of a computer that manipulate and/or transform data into other data, said data represented as physical quantities, e.g. such as electronic quantities, and/or said data representing the physical objects. The terms "computer", "processor", and "processing unit" should be expansively construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, a personal computer, a server, a computing system, a communication device, a processor (e.g. a digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.), any other electronic computing device, and/or any combination thereof. The terms "computer", "processor", and "processing unit" can include a single computer/processor/processing unit or a plurality of distributed or remote such units.
The operations in accordance with the teachings herein can be performed by a computer specially constructed for the desired purposes or by a general purpose computer specially configured for the desired purpose by a computer program stored in a non-transitory computer readable storage medium. The present disclosure may also encompass the computer program for performing the method of the invention.
The term "non-transitory" is used herein to exclude transitory, propagating signals, but to otherwise include any volatile or non-volatile computer memory technology suitable to the presently disclosed subject matter.
As used herein, the phrases "for example", "such as", "for instance" and variants thereof describe non-limiting embodiments of the presently disclosed subject matter. Reference in the specification to "one case", "some cases", "other cases" or variants thereof means that a particular feature, structure or characteristic described in connection with the embodiment(s) is included in at least one embodiment of the presently disclosed subject matter. Thus the appearance of the phrase "one case", "some cases", "other cases" or variants thereof does not necessarily refer to the same embodiment(s). It is appreciated that, unless specifically stated otherwise, certain features of the presently disclosed subject matter, which are described in the context of separate embodiments, can also be provided in combination in a single embodiment. Conversely, various features of the presently disclosed subject matter, which are described in the context of a single embodiment, can also be provided separately or in any suitable subcombination.
In embodiments of the presently disclosed subject matter one or more stages illustrated in the figures may be executed in a different order and/or one or more groups of stages may be executed simultaneously and vice versa.
Bearing this in mind, attention is now drawn to Fig. 1, illustrating a schematic functional block diagram of a system for generating a visual code at least in accordance with a lighting condition in accordance with certain embodiments of the presently disclosed subject matter.
The term "visual code" used herein should be expansively construed to cover any kind of machine-readable optical label that use standardized encoding modes to encode data and store information. By way of example, a visual code can be a one- dimensional barcode, or alternatively it can be a two-dimensional code. The term "two- dimensional code" used herein should be expansively construed to cover any optical machine-readable representation of data in the form of a two-dimensional pattern of symbols. One example of a known two-dimensional code is matrix code which represents data in a way of dot distribution in a matrix grid, such as, for example, Quick Response (QR) code and EZcode, etc. Λ visual code, such as a two-dimensional code, typically includes function patterns and cells, represented by image elements.
According to certain embodiments, a scanning process of a visual code by a scanning device, such as, for example, a smart phone, includes acquiring an image of the visual code (e.g., by an image acquisition module of the scanning device, such as the camera of the phone), detecting the visual code and decoding the detected visual code to obtain information encoded therein (including, e.g., numerical data, strings, pointers and/or any other digital data). An image of a given visual code can be acquired with great differences under different conditions, such as different lighting conditions, although the original image of the visual code is the same. In some cases certain lighting conditions can cause a color transform effect upon the visual code being scanned. The term "color transform effect" should be expansively construed to cover any color effect that can change the overall color (including, e.g., hue and lightness) of image elements. By way of example, it may include a clipping effect (e.g., a burning effect) and any other color effects known in the art. Clipping can often occur as a result of an incorrect exposure when photographing or scanning an image. For instance, it may cause the lightest areas of an image, such as the sky, or light sources, to clip and typically be completely white (or in some other cases to cause a few color channels to have their highest color values). Without limiting the disclosure in any way, in some cases such an effect is especially frequent when the visual code is displayed or presented on Screen-enabled media (such as TV, Cinema, Digital Billboards, and the Internet through PC and mobile device screens) under certain lighting conditions as aforementioned. By way of example, when presenting a visual code on a cinema screen, in a dark room with low lighting, where the background image that incorporates the visual code is dark, the area of the visual code in an acquired image will, with high probability, endure a burning effect, making the acquired visual code not machine-readable by, e.g., a visual code reader or scanner on the smart phone.
Reference is now made to Fig. 6A, illustrating a clipping effect that occurs when presenting a visual code on a dark background image of a TV screen or a cinema screen in a dark room in accordance with certain embodiments of the presently disclosed subject matter. As shown, the acquired image of the visual code appears to be clipped and the light areas, e.g., the light cells and the function patterns of the visual code, are burnt to be completely white. The white areas even appear to expand to the neighboring cells, making the visual code blurred and obscure. Such a clipped visual code is most likely not machine-readable by a visual code reader or scanner.
According to certain embodiments, a visual code can be deemed as "machine-readable" if the visual code can be scanned successfully by a scanning device, specifically, if the acquired visual code can be detected and decoded successfully.
Turning back to Fig. 1, there is provided a system 100 for generating a visual code at least in accordance with a lighting condition. The system 100 can comprise a processing unit 102 (such as, e.g. a digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.) that is configured to receive instructions and to manage, control and execute operations as specified by the instructions. According to certain embodiments, the processing unit 102 can include a lightness value calculator 108. In some embodiments, the processing unit 102 can further include a function pattern calculator 104 and a decoded value calculator 106. According to certain embodiments, the system 100 can further comprise an I/O interface 110, and a storage module 112 operatively coupled to the processing unit 102.
According to certain embodiments, the I/O interface 110 can be configured to obtain the function patterns, e.g., to receive them from a remote computer through the Internet. In some cases the function patterns can be received via the I/O interface 110 from a user's input. It would be appreciated that the source of the function patterns can be local or remote and can be obtained in different ways and using different technologies. The function patterns can be pre-generated by the source computer, in accordance with a visual code specification. According to other embodiments, the function pattern calculator 104 is configured to obtain the function patterns, e.g., to calculate the function patterns in accordance with a visual code specification. Optionally, during or after the calculation, the function pattern calculator 104 can also change the structure of the function patterns in order to adapt to a given lighting condition, as will be described below in further detail with respect to Fig. 2.
The term "function pattern" as used herein should be expansively construed to cover a group of cells or a set of shapes mat are defined in a corresponding visual code specification or a machine-readable image specification and which serve a predefined ancillary function which can be used in the imaging, scanning and/or decoding process of the data that is encoded in the visual code. For example, function patterns can be used for indicating the location of the visual code or to specify certain characteristics of the visual code. For example the QR Code specification includes provisions for the following function patterns: finder, timing patterns, and alignment patterns.
Similarly, according to certain embodiments, the I/O interface 110 can be configured to obtain decoded values of the cells, e.g., to receive them from a remote computer through the Internet, or from a user's input. It would be appreciated that the source of the decoded values of the cells can be local or remote and can be obtained in different ways and using different technologies. The decoded values can be pre-calculated by the source computer, in accordance with a visual code specification. According to some other embodiments, the decoded value calculator 106 can be configured to obtain the decoded values of the cells, e.g., to calculate the decoded values in accordance with a visual code specification.
The term "cell", also termed as "dot" or "dot module", as used herein relates to a group of image elements, which include at least one image element in the visual code. There is provided at least one value or function to a cell when it is read or decoded as part of a machine-reading process. This value is termed herein as the decoded value of the cell. By way of example and without limiting the disclosure, a decoded value for a cell according to the cell specification can be a binary value, i.e., the cell is required to be either dark or light. It should be appreciated that the image element can be, for example, a pixel or a vector element. According to some embodiments, the decoded values of the cells can be calculated by the decoded value calculator 106 based on certain input information (e.g., an input message) to be encoded in the visual code in accordance with a visual code specification.
The lightness value calculator 108 can be configured to calculate lightness values for a plurality of the cells based on the decoded values thereof as well as at least one lighting parameter. The at least one lighting parameter can be indicative of a lighting condition under which the visual code is expected to be displayed. The term "lightness value" is a representation of variation in the perception of a color or color space's brightness or luminance. In other words, it refers to a value that represents relative lightness and darkness of a color. Some color models have a separate channel to represent lightness or luminance, for example, the Y channel in the YUV color model, the L channel in the Lab color model, etc. Some color models, such as RGB, although they do not have a separate channel for it, can provide a representation of lightness or luminance in different ways. In other words, the lightness value that is calculated by the lightness value calculator 108 can also be presented in color models such as the RGB or CMYK color model. In grayscale images, since there is only a single channel, the lightness value is the value of the image element itself. By way of example, the lightness value as used herein can have integer values ranging from 0 to 255, where 0 is associated with pure black, and 255 is associated with pure white. The range between 0 and 255 represents gray values. It should be appreciated that other kinds of ranges and representations of the lightness value could be used in addition or in lieu of the above. In one example, another kind of common representation can be a rational float value between 0 and 1. According to certain embodiments, a lighting condition can include one or more components each related to a medium, an environment, and a background image related to the display of the visual code. For example, the lighting condition can include one or more lighting properties related to a medium that the visual code is expected to be displayed or presented on. The medium can be, for example, a cinema screen, a smart phone screen, a TV screen, etc. It is to be noted that although the clipping effect as mentioned before is more related to Screen-enabled media, the presently disclosed subject matter is not limited to being utilized with respect to such media, and can also be applied to printed media, such as newspapers, magazines, billboards, etc. Alternatively or additionally, the lighting condition can include a lighting condition of an environment in which the visual code is expected to be displayed or presented, such as a lightened room or a dark room. Alternatively or additionally, the lighting condition can include one or more lighting properties of a background image on which the visual code is expected to be superimposed. For example, if the background image is dark, the probability that a burning effect may happen is high, as shown in Fig. 6A. It is to be noted that the above possible components of a lighting condition, such as the medium, environment and background image, are illustrated for exemplified purposes only and should not limit the disclosure. Accordingly, the lighting condition can include additional or alternative components or elements besides the above. For example, the lighting condition can also include the lighting setting or configuration of a scanning device, e.g., the brightness setting of a display screen of a smart phone.
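By way of a non-limiting illustration only, the following sketch shows one possible way of deriving a lightness value from an RGB triple, using the common BT.601 luma weights, and converting it to the alternative 0-to-1 float representation mentioned above; the function names and the choice of weights are illustrative assumptions and are not mandated by any visual code specification.

```python
# A minimal sketch (not part of the disclosed specification) of deriving a
# lightness value from an RGB triple. The BT.601 luma weights below are one
# common convention; the disclosure leaves the exact representation open.

def lightness_from_rgb(r, g, b):
    """Return a lightness/luma value in the 0-255 range for 8-bit RGB input."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def to_unit_range(lightness):
    """Alternative representation: a rational float value between 0 and 1."""
    return lightness / 255.0

print(round(lightness_from_rgb(255, 255, 255)))  # 255 -> pure white
print(round(lightness_from_rgb(0, 0, 0)))        # 0   -> pure black
print(round(to_unit_range(lightness_from_rgb(128, 128, 128)), 2))  # ~0.5
```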
According to certain embodiments, the at least one lighting parameter indicative of the lighting condition can in some cases include a plurality of separate parameters each indicating a respective component of a lighting condition, or alternatively it can include one comprehensive parameter that takes into consideration all the possible components of a lighting condition. For example, the at least one parameter can include a medium parameter indicating the lighting properties related to a medium that the visual code is expected to be displayed or presented on. Additionally or alternatively, the at least one parameter can include an environment parameter indicating the lighting condition of an environment in which the visual code is expected to be displayed or presented. Additionally or alternatively, the at least one parameter can include a background image parameter and/or a scanning device parameter, etc. By way of another example, the at least one parameter can include a comprehensive parameter that is a weighted parameter calculated based on all the separate parameters.
In some embodiments, the at least one lighting parameter, either each separate parameter or the comprehensive parameter, can be selected from the following: a lightness parameter and/or a darkness parameter which indicates the extent of lightness of the lighting condition. The lightness parameter and the darkness parameter are defined from different perspectives, e.g., the lightness parameter can express estimated color transformation effects for light areas of the image, and the darkness parameter can express estimated color transformation effects for dark areas of the image. The lighting parameter can be predetermined based on a given lighting condition, or alternatively it can be received via the I/O interface 110.
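By way of a non-limiting illustration only, the following sketch shows one possible way of deriving the comprehensive parameter described above as a weighted combination of the separate lighting parameters; the 0-255 scale, the weights and the parameter names are illustrative assumptions.

```python
# A minimal sketch of combining separate lighting parameters (medium,
# environment, background image, scanning device) into one comprehensive
# weighted parameter. The scale (0 = dark, 255 = bright) and the weights
# are illustrative assumptions only.

def comprehensive_lighting_parameter(medium, environment, background,
                                     scanning_device=128,
                                     weights=(0.4, 0.3, 0.2, 0.1)):
    """Weighted combination of separate lighting parameters on a 0-255 scale."""
    parts = (medium, environment, background, scanning_device)
    return sum(w * p for w, p in zip(weights, parts))

# Example: cinema screen in a dark room with a dark background image.
print(round(comprehensive_lighting_parameter(medium=40, environment=20,
                                             background=30)))  # ~41
```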
According to certain embodiments, lightness values can be calculated based on the decoded values and the at least one lighting parameter. Upon the lightness values being calculated based on the decoded values and the at least one lighting parameter, and assigned to at least part of the cells, the visual code is generated which is adapted to the given lighting condition. Specifically, the generated visual code is machine-readable upon being scanned by a scanning device under the given lighting condition. Further details will be described below with respect to Figs. 2 and 3. As aforementioned, the I/O interface 110 can be configured to obtain, e.g., the function patterns and/or the decoded values of the cells of the visual code from a source computer. Additionally or alternatively, the I/O interface 110 can also be configured to obtain the at least one lighting parameter or indications of a lighting condition, such as the above described possible components from, e.g., a user. It would be appreciated that the I/O interface 110 in some cases can obtain further inputs, such as a visual code specification, and/or tolerances and other characteristics and/or parameters which are associated with a decoder or scanner of the visual code. For example, certain characteristics of the hardware and/or the software that is involved in the imaging, scanning and/or decoding of the visual code can also be obtained as input to the process of generating a visual code.
The storage module 112 comprises a non-transitory computer readable storage medium that stores data and enables retrieval of various data for processing unit 102 to process. The storage module 112 can store, for example, one or more of the following: the acquired images, function patterns, decoded values, lightness values, lighting parameters, etc.
It is to be noted that system 100 can be implemented as a standalone system dedicated for performing such generating process, or alternatively, the functionality of system 100 can be integrated as a sub-unit or component of a general purpose computer or electronic device. For example, the functionality of system 100 can be realized by running a computer readable program or application on general purpose computer hardware including but not limited to servers, personal computers, laptop computers, smart phones (e.g. iPhone, etc.), PDAs, tablet computers (e.g. Apple iPad), or any other suitable device.
It should be appreciated that system 100 described here with reference to Fig. 1 can be a distributed device or system, which includes several functional components which reside on different devices and are controlled by a control layer as a virtual entity to perform the operations described herein. By way of example and without limiting the disclosure in any way, the I/O interface 110 and/or the storage module 112 can reside on a local computer, while the processing unit 102 or part of the components thereof can reside on a remote server for performing the processing and calculation. By way of another example, the function pattern calculator 104 and/or the decoded value calculator 106 can reside on a different computer from the lightness value calculator 108. The term "processing unit" should be expansively construed to include a single processor or a plurality of processors which may be distributed locally or remotely. In addition, the processing unit and/or storage module can in some cases be cloud-based.
Those versed in the art will readily appreciate that the teachings of the presently disclosed subject matter are not bound by the system illustrated in Fig. 1 and the above exemplified implementations. Equivalent and/or modified functionality can be consolidated or divided in another manner and can be implemented in any appropriate combination of software, firmware and hardware.
While not necessarily so, the process of operation of system 100 can correspond to some or all of the stages of the methods described with respect to Figs. 2-3. Likewise, the methods described with respect to Figs. 2-3 and their possible implementations can be implemented by system 100. It is therefore noted that embodiments discussed in relation to the methods described with respect to Figs. 2-3 can also be implemented, mutatis mutandis, as various embodiments of the system 100, and vice versa. Having described the functional block diagram of a system for generating a visual code at least in accordance with a lighting condition, attention is now directed to Fig. 2, schematically illustrating a generalized flowchart of generating a visual code at least in accordance with a lighting condition in accordance with certain embodiments of the presently disclosed subject matter.
Function patterns of a visual code can be obtained (210) (e.g., by the I/O interface 110 illustrated in Fig. 1). In some cases the function patterns can be received from a remote computer through the Internet or alternatively from a user's input. It would be appreciated that the source that provides the function patterns can be local or remote and the function patterns can be obtained in different ways and using different technologies. The function patterns can be pre-generated and provided by the source computer, in accordance with a visual code specification. In some other cases, the function patterns can be calculated (e.g., by the function pattern calculator 104 illustrated in Fig. 1) in accordance with a visual code specification. Specifically, the values for the function patterns can be calculated at least according to a function pattern specification in the visual code specification. The function patterns are used to enable certain operations that are related to the imaging, scanning and/or decoding of the visual code. For example, the function patterns can include various patterns or combinations of patterns that can enable a visual code reader/scanner to identify that a certain image contains a readable visual code, locate the readable visual code in the image, and configure a scanning and/or encoding process according to parameters indicated by the function patterns.
Reference is made to Fig. 4A, showing an exemplified two-dimensional code with function patterns in accordance with certain embodiments of the presently disclosed subject matter. As shown, three finder patterns 402, 404 and 406 can be located at the three corners of the visual code. By way of example, the finder patterns are capable of gaining a predefined frequency component ratio irrespective of orientation of the scanning line when a scanning line passes through the center of each finder pattern. As illustrated in finder pattern 402, there is a large square 408 with light color values, a middle concentric square 410 with dark color values, another smaller concentric middle square 412 with light color values, and a small concentric square 414 with dark color values. This pattern gains a frequency dark-and-light component ratio of 1:1:1:3:1:1:1.
Attention is directed to Fig. 4B, illustrating an exemplified two-dimensional code with modified function patterns in accordance with certain embodiments of the presently disclosed subject matter. According to certain embodiments, the structure of the function patterns can be changed or modified during or after the calculation of the function patterns as described above in order for the generated visual code to be adapted to a given lighting condition. In one embodiment, since a burning effect will most likely cause the light areas of an image to be clipped, a possible way to enhance the readability of a visual code under a lighting condition that may cause such a burning effect is to decrease the size of the light areas of the function patterns, and increase the size of the dark areas of the function patterns. For example, the finder pattern 402 as shown in Fig. 4A, which has a frequency component ratio of 1:1:1:3:1:1:1 (1 light - 1 dark - 1 light - 3 dark - 1 light - 1 dark - 1 light) in accordance with the function pattern specification, can instead be assigned values of 0.8 light - 1.2 dark - 0.8 light - 3.2 dark - 0.8 light - 1.2 dark - 0.8 light by the function pattern calculator 104, as shown in 402' of Fig. 4B. Finder patterns 404 and 406 in Fig. 4A are processed in a similar way to the corresponding modified patterns 404' and 406' as shown in Fig. 4B, such that the possibility of the readability of the visual code being affected by a clipping/burning effect will be lower. In some cases, the burning effect, when it occurs, can bring the frequency component ratio in the acquired image of the visual code closer to the original ratio of 1 light - 1 dark - 1 light - 3 dark - 1 light - 1 dark - 1 light, such that the acquired visual code is machine-readable by a reader/decoder of a scanning device. According to certain embodiments, the above described modification of the structure of function patterns is performed in accordance with at least one lighting parameter that indicates a given lighting condition. By way of example, the at least one lighting parameter can include a medium parameter indicating the medium that the visual code is displayed on, and the modification will only be triggered if the medium parameter is set to a certain type or certain value, such as, for instance, a cinema medium.
It is to be noted that the above described structure of function patterns, such as the frequency component ratio, is illustrated for exemplified purposes only and should not limit the present disclosure in any way. Without limiting the disclosure in any way, in some cases, the outer component, e.g., the light square 408, of the function patterns may not be considered in the scanning process. In other words, the exact size or ratio of such component is not important for the scanning process. For example, such component can be used to border the pattern that is searched for. In the above mentioned example, for instance, the scanning process might be looking for 1:1:3:1:1 (1 dark - 1 light - 3 dark - 1 light - 1 dark) instead of 1:1:1:3:1:1:1 (1 light - 1 dark - 1 light - 3 dark - 1 light - 1 dark - 1 light).
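By way of a non-limiting illustration only, the following sketch reproduces the finder-pattern modification discussed above (shrinking light bands and enlarging dark bands of the 1:1:1:3:1:1:1 ratio); the 0.2 adjustment matches the 0.8/1.2/3.2 example, while the trigger condition and function name are illustrative assumptions.

```python
# A minimal sketch of decreasing the width of light bands and increasing the
# width of dark bands of a finder pattern, as in the
# 1:1:1:3:1:1:1 -> 0.8:1.2:0.8:3.2:0.8:1.2:0.8 example above.

def adjust_finder_ratio(ratio, shades, medium_parameter, delta=0.2):
    """Return modified band widths for a finder pattern.

    ratio  - band widths, e.g. [1, 1, 1, 3, 1, 1, 1]
    shades - parallel 'light'/'dark' label per band
    """
    if medium_parameter not in ("cinema", "harsh_cinema"):
        return list(ratio)  # no modification for other (assumed) media types
    return [round(r - delta, 2) if s == "light" else round(r + delta, 2)
            for r, s in zip(ratio, shades)]

shades = ["light", "dark", "light", "dark", "light", "dark", "light"]
print(adjust_finder_ratio([1, 1, 1, 3, 1, 1, 1], shades, "harsh_cinema"))
# [0.8, 1.2, 0.8, 3.2, 0.8, 1.2, 0.8]
```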
Continuing with the description of Fig. 2 and following block 210, decoded values of the cells of a visual code can be obtained (220) (e.g., by the I/O interface 110 illustrated in Fig. 1). In some cases the decoded values can be received from a remote computer through the Internet or alternatively from a user's input. It would be appreciated that the source that provides the decoded values of the cells can be local or remote and can be obtained in different ways and using different technologies. The decoded values can be pre-calculated and provided by the source computer, at least in accordance with a visual code specification. In some other cases, the decoded values of the cells can be calculated (e.g., by the decoded value calculator 106 illustrated in Fig. 1) at least in accordance with a visual code specification. Specifically, the decoded values for the cells can be calculated at least according to a general cell specification in the visual code specification.
According to examples of the presently disclosed subject matter, the decoded values for the cells are calculated based on the visual code specification and further based on the input message that is to be encoded in the readable matrix code, such that the message can be read from the visual code, e.g., by decoding the visual code. The message can be a predetermined message or alternatively the message can be an input received from a user or can be received from any other source. In some embodiments, the cells of the visual code can comprise free cells that encode the message, and derived cells that serve, e.g., error correction purposes and other functionalities.
As aforementioned, in one embodiment, a decoded value for a cell according to a cell specification can be a binary value, and, for example, the cell color is suggested to be either dark or light. In another example, the decoded value can be from a set of N (N≥2) values. When N equals two, these two values could be one and zero, and could be visually represented by light and dark colors. By way of another example, N could be equal to four, which means four values are allowed and possible as the decoded value. For example, these four values could be zero, one, two and three, and could be visually represented by Black, Cyan, Magenta, and Yellow colors. In another example, these four values could be '50', '80', '140' and '210', and could be visually represented by ranges of grayscale values: 0 to 50 for value '50', 51 to 110 for value '80', 111 to 170 for value '140' and 171 to 255 for value '210'. As mentioned, it is possible for the range of values which can be used for representing one decoded value of a cell of a visual code to extend beyond the range of the suggested color values, including using a range of color and/or luminosity values for the image elements representing the cell.
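By way of a non-limiting illustration only, the following sketch shows how a scanning device could map a measured grayscale sample back to one of the four decoded values of the example above; the bucket boundaries follow the example ranges, and the function itself is an illustrative assumption rather than part of any visual code specification.

```python
# A minimal sketch of the N=4 example: each decoded value ('50', '80', '140',
# '210') is represented by a range of grayscale values, so a reader can map a
# measured gray level back to the decoded value.

GRAY_RANGES = [((0, 50), '50'), ((51, 110), '80'),
               ((111, 170), '140'), ((171, 255), '210')]

def decode_gray_level(sample):
    """Map a measured grayscale sample (0-255) to one of four decoded values."""
    for (low, high), decoded in GRAY_RANGES:
        if low <= sample <= high:
            return decoded
    raise ValueError("grayscale sample out of the 0-255 range")

print(decode_gray_level(90))   # '80'
print(decode_gray_level(200))  # '210'
```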
Following block 220, lightness values for a plurality of cells of the visual code can be calculated (230) (e.g., by the lightness value calculator 108 as illustrated in Fig. 1) based on the decoded values of such cells and at least one lighting parameter. The at least one lighting parameter can be indicative of a lighting condition under which the visual code is expected to be displayed. The visual code generated as such is adapted to be machine-readable upon being scanned by a scanning device under the given lighting condition.
According to certain embodiments, the calculation of the lightness values can further comprise: i) obtaining the at least one lighting parameter indicative of a lighting condition under which the visual code is expected to be displayed, and ii) determining (and assigning) the lightness values based on the decoded values of the plurality of cells and the at least one lighting parameter.
As described above with respect to Fig. 1, a lighting condition can include one or more components each related to a medium, an environment and a background image related to the display of the visual code, and possibly also a scanning device related to the scanning of the visual code. The at least one lighting parameter can be obtained as indications of such lighting condition. In some cases the at least one lighting parameter can include a plurality of separate parameters each indicating a respective component of a lighting condition, or alternatively it can include one comprehensive parameter that takes into consideration all the possible components of a lighting condition. For example, the at least one parameter can include one or more of the following: a medium parameter, an environment parameter, a background image parameter and a scanning device parameter, etc. In another example, the at least one lighting parameter can include a comprehensive parameter which can be, e.g., a weighted parameter calculated based on all the separate parameters. For illustration purposes only, there is provided an exemplified embodiment with a visual code that is expected to be presented on a medium of a cinema screen. The environment for the display of the visual code is a cinema hall which can be dark. The visual code can be superimposed on a background image or one or more frames of a video which can be dark as well. In such a lighting condition, an image of the visual code acquired by a scanning device, e.g., a phone with a camera, may be clipped, i.e., areas in the acquired image which are associated with light areas in the real physical image presented on screen might present a clipping effect. In such a lighting condition, the at least one lighting parameter can include three separate parameters including a medium parameter which can be set as cinema medium, or harsh cinema medium, an environment parameter which can be set as dark room, and a background image parameter which can be set as dark image. It is to be noted that these lighting parameters can also be configured with numerical values that are equivalently indicative. By way of example, a comprehensive parameter can be derived from the given lighting condition and the above separate parameters, and can be assigned with a numerical value. In some cases such a numerical value of the lightness parameter can further indicate the lightness values that can be assigned to some of the cells.
The lightness value can then be determined based on the decoded values of at least some of the cells and the at least one lighting parameter as configured above. By way of example, a plurality of cells that have decoded values indicating the cells to be light (hereinafter termed "light cells", or "cells with light decoded values") can be assigned a medium lightness value, such as, e.g., 128, in order to mitigate the clipping effect. Cells that have dark decoded values (hereinafter termed "dark cells", or "cells with dark decoded values") can retain their dark lightness values.
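By way of a non-limiting illustration only, the following sketch shows the cell-level assignment described above: light cells receive a medium lightness value (128 in the example) derived from the lighting parameter, while dark cells retain a dark value; the dictionary-based cell representation is an illustrative assumption.

```python
# A minimal sketch of the calculation of block 230 for the binary (dark/light)
# case. The medium value 128 follows the example above; the data layout is an
# assumption only.

def calculate_cell_lightness(decoded_values, light_cell_lightness=128,
                             dark_value=0):
    """decoded_values maps cell coordinates to 'light' or 'dark'.

    light_cell_lightness is taken here from the comprehensive lighting
    parameter (128 in the example above)."""
    return {cell: dark_value if decoded == "dark" else light_cell_lightness
            for cell, decoded in decoded_values.items()}

cells = {(0, 0): "dark", (0, 1): "light", (1, 0): "light", (1, 1): "dark"}
print(calculate_cell_lightness(cells))
# {(0, 0): 0, (0, 1): 128, (1, 0): 128, (1, 1): 0}
```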
Reference is now made to Figs. 5A and 5B, respectively illustrating a visual code before and after assigning lightness values to some cells thereof in order to adapt to a given lighting condition in accordance with certain embodiments of the presently disclosed subject matter. Fig. 5A shows a visual code 502 superimposed on a dark background image, the visual code 502 including light cells and dark cells. As shown in Fig. 6A, the same visual code 502 on the same dark background image is displayed on a TV screen in a dark meeting room. Upon being scanned by a scanning device, e.g., scanner software on a smart phone, the acquired image including the acquired visual code 602 is clipped, i.e., the light areas, including the light cells and function patterns, present a burning effect, thus rendering the acquired visual code 602 hardly machine-readable. After the light cells are assigned the calculated medium lightness values as described above, the resulting visual code 504 is illustrated in Fig. 5B, where it is shown that the original light cells now appear to be in a medium level of gray. When the resulting visual code 504 is presented with the same dark background image on the same TV screen in the same dark meeting room as visual code 502 was, the visual code 604 acquired by the same scanning device, as shown in Fig. 6B, appears to be clear and very close to the visual code 502, and is thus rendered machine-readable.
In some cases, the lightness values are calculated or assigned to image elements associated with each light cell. For instance, each cell can be associated with a group of pixels. The lightness value that is calculated for a given cell can be applied to each pixel that is associated with the cell. Alternatively, the lightness value can also be assigned to part of the cell. For example, at least half of the image elements associated with a given cell can be assigned the calculated lightness value. In some embodiments of the present disclosure, if the comprehensive lightness parameter is determined or configured as a numerical value, e.g., 128, the lightness values that can be assigned to a light cell, or to a group of image pixels associated with the light cell, can be determined in accordance with the value of the lightness parameter, e.g., in a range of [128-δ, 128+δ].
Continuing with the above exemplified embodiment with a visual code that is expected to be presented on a medium of a cinema screen, in some cases, such a lighting condition, especially with the medium being a cinema screen, may cause a clipping effect to be so harsh that even after assigning the calculated lightness values to light cells, the resulting visual code may still not be machine-readable. In such cases, the modification of the structure of function patterns, as described above with respect to block 210, can be triggered based on the at least one lighting parameter, e.g., the medium parameter. By way of example, if the medium parameter is assigned with a cinema medium, or harsh cinema medium, the structure of function patterns of the visual code can be changed or modified as described with respect to block 210. For example, the size of the light areas can be decreased and the size of the dark areas can be enlarged. According to certain embodiments, the generating method can further comprise decreasing the size of cells with light decoded values and increasing the size of cells with dark decoded values, in a similar manner as described above with reference to changing the structure of the function patterns.
It is to be noted that in some cases, the visual code generated to be adapted to a specific lighting condition as described above, especially to a specific medium, may not be machine-readable when presented on other, non-designated media.
As aforementioned, although a clipping effect is mostly related to Screen-enabled media, it is to be noted that the present disclosure is not limited to such media, and accordingly can also be applied to printed media, in order to enhance readability of the printed visual codes, especially printed colored visual codes.
According to certain embodiments, the presently disclosed subject matter can be used to generate visual codes that are expected to be printed on a printed medium, such as, e.g., a newspaper. By way of example, the paper quality of certain newspapers could be low and have a grayish paper color. In such cases, the lighting parameter, e.g., the comprehensive parameter, could be assigned with a high value, such as, e.g., 245. Each light cell can then be assigned, for at least half of the image elements associated with such cell, color values with a lightness value equal to 245. It should be appreciated that in some embodiments, the lightness parameter 245, as described above, can indicate that lightness values in a range of [245-δ, 245+δ] can be assigned to a group of image pixels associated with a light cell. Continuing with the example, in such a printing scenario, the printing process could be an offset printing process which might present minor shifting errors between color plates. In such cases the modification of the structure of function patterns, as described above with respect to block 210, can be triggered based on the at least one lighting parameter, e.g., the medium parameter. If the medium parameter is assigned with an offset printing medium, then the structure of the function patterns can be changed, for example, by enlarging the size of the light areas and decreasing the size of the dark areas.
As aforementioned, in some cases a visual code can be a two-dimensional code which refers to any optical machine-readable representation of data in the form of a two-dimensional pattern of symbols as defined above. According to certain embodiments, the visual code as referred to herein can be generated as a special two-dimensional code having an input image or graphic embedded therein. Turning now to Figs. 7A-7D, there are shown exemplified illustrations of different kinds of two-dimensional codes each embedding an input image in accordance with certain embodiments of the presently disclosed subject matter. Fig. 7A and Fig. 7B are two-dimensional codes that have input images superimposed thereon. In such cases, the calculation of the decoded values of the cells as performed by the decoded value calculator 106 can further include superimposing the input image on the code, e.g., by changing the transparency of the dots/cells in the two-dimensional code without changing the distribution of the dots or adjusting the decoded values thereof, such that the two-dimensional code, after being superimposed with the input image, is still machine-readable. Fig. 7C shows a different kind of two-dimensional code in which the input image is not simply superimposed thereon as described with respect to Fig. 7A and Fig. 7B. In the two-dimensional code illustrated in Fig. 7C, the decoded values of cells that correspond to the encoded data/information are in fact calculated or determined by the decoded value calculator 106 such that the appearance of the two-dimensional code complies with a visual similarity criterion when compared with the input image. An exemplified illustration of such a two-dimensional code is described in US patent No. 8,978,989, issued on March 17, 2015, which is incorporated herein in its entirety by reference. Furthermore, Fig. 7D shows another kind of two-dimensional code having an input image embedded therein. The calculation of the decoded values of the cells as performed by the decoded value calculator 106 can further include positioning the cells having decoded values corresponding to the encoded data in one or more encoding regions relative to the function patterns and a portion of the input image, rendering the two-dimensional code visually more appealing. In some embodiments, the system 100 can further comprise an additional functional component to calculate an image descriptor for the input image and associate the input image with the image descriptor, which is used to verify the authenticity of the two-dimensional code in the reading process, thus rendering the code functionally safer and stronger. An exemplified illustration of such a two-dimensional code is described in US patent application No. 62/097,748, filed on December 30, 2014, which is incorporated herein in its entirety by reference.
It is to be noted that the above described two-dimensional codes that can be used herein as various kinds of visual codes are illustrated for exemplified purposes only and should not be construed to limit the present disclosure in any way. Different designs or implementations of other suitable two-dimensional codes or visual codes can be used in addition or in lieu of the above.
Turning now to Fig. 3, there is illustrated a generalized flowchart of generating a plurality of visual codes in accordance with respective lighting conditions, each visual code of the plurality of visual codes to be superimposed on one or more frames of a video in accordance with certain embodiments of the presently disclosed subject matter.
In accordance with certain aspects of the presently disclosed subject matter, visual codes can be superimposed over a video or an animation which includes more than one frame. According to certain embodiments, for different frames of a video, different visual codes can be generated (in a similar manner as described above with reference to Fig. 2) in accordance with respective lighting conditions. Each generated visual code can be superimposed on one or more frames of a video or animation that is expected to be displayed under the respective lighting conditions. Thus, the resulting video or animation can include a plurality of such visual codes, each designated to a different lighting condition (e.g., a different medium).
Specifically, for each lighting condition of the respective lighting conditions, at least one lighting parameter can be obtained (310) (e.g., by the lightness value calculator 108 illustrated in Fig. 1). The at least one lighting parameter can be indicative of a specific lighting condition.
According to certain embodiments, a lighting condition under which a video or animation is expected to be displayed can vary largely. In some embodiments, the lighting condition can include one or more components or scenarios each related to, e.g., a medium and an environment for the display of the video. For example, a video can be published in a video sharing service such as Youtube.com or Youku.com. The video can then be presented on a small mobile screen, on a PC screen or a large smart TV screen. The video can then be presented in a dark room or a lightened room. In some cases, the video may be presented as part of the video sharing website within a frame while a portion of the website is also presented; alternatively, it can be presented in wide screen where only the video is presented. In each of the above scenarios the lighting condition can be different from the others. Thus, for each of the designated lighting conditions under which the video is expected to be displayed, at least one lighting parameter can be obtained indicating such a specific condition, in a similar way as described with reference to Fig. 2.
A visual code can be generated (320) for such a specific lighting condition, including: obtaining (330) the function patterns (e.g., by the function pattern calculator 104 or the I/O interface 110 illustrated in Fig. 1) and decoded values of the cells (e.g., by the decoded value calculator 106 or the I/O interface 110 illustrated in Fig. 1), and calculating (340) (e.g., by the lightness value calculator 108 illustrated in Fig. 1) lightness values for a plurality of the cells based on the decoded values thereof and the at least one lighting parameter. The generation process of a visual code is performed in a similar manner as described with reference to Fig. 2.
Upon a visual code being generated for each of the respective lighting conditions, there are provided a plurality of visual codes such that at least one of the plurality of visual codes is machine-readable upon being scanned by a scanning device under each of the lighting conditions. According to certain embodiments, the process in Fig. 3 can further include superimposing each of the generated plurality of visual codes on one or more frames of the video. Therefore the resulting video can present many visual codes adapted to various designated lighting conditions, such that for each of a plurality of lighting conditions (the plurality of lighting conditions being included in the designated lighting conditions) in which different users are watching the video and scanning the codes, there will be a visual code on one or more frames that can enable a successful scan for the user.
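By way of a non-limiting illustration only, the following sketch outlines the flow of Fig. 3: one visual code is generated per designated lighting condition and superimposed on one or more frames of the video; generate_visual_code and superimpose stand in for the Fig. 2 procedure and the rendering step, and the even split of frames between codes is an illustrative assumption (the assignment can equally be predetermined, as noted below).

```python
# A minimal sketch of the Fig. 3 flow. The caller supplies the Fig. 2
# generation procedure and the superimposing step; only the per-condition
# loop and an assumed even frame split are shown here.

def codes_for_video(frames, lighting_conditions, generate_visual_code,
                    superimpose):
    """Return the frames with one adapted visual code superimposed per condition."""
    result = list(frames)
    if not lighting_conditions:
        return result
    per_code = max(1, len(result) // len(lighting_conditions))
    for i, condition in enumerate(lighting_conditions):
        code = generate_visual_code(condition)       # Fig. 2 procedure
        for j in range(i * per_code, min((i + 1) * per_code, len(result))):
            result[j] = superimpose(result[j], code)
    return result

# Example with placeholder callables:
frames = [f"frame{k}" for k in range(6)]
print(codes_for_video(frames, ["cinema_dark", "tv_lit"],
                      generate_visual_code=lambda c: f"code[{c}]",
                      superimpose=lambda f, c: f"{f}+{c}"))
```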
It is to be noted that a video as used herein refers to one or more frames of images. It can in some cases include animation.
It is to be noted that not all the frames need to be superimposed with visual codes. The number of visual codes to be superimposed in the video, as well as the one or more frames on which each visual code is superimposed, can be predetermined.
Continuing with the example of a cinema lighting condition, it could be that during the same video there will sometimes be soft lighting, sometimes no light at all, and sometimes other lighting conditions. In such cases, generating a plurality of visual codes to be superimposed on different frames in accordance with different lighting conditions can result in a video which presents visual codes that fit different lighting conditions. It is to be noted that the orders and sequences of executing the processes as described with reference to Figs. 2-3 are illustrated for exemplified purposes only and should not be construed to limit the present disclosure in any way. One or more stages illustrated in Figs. 2-3 may be executed in a different order and/or one or more groups of stages may be executed simultaneously and vice versa.
It is further to be noted that the interpretation of the terms in the present disclosure should not be limited to the definitions above and should be given its broadest reasonable interpretation. It is to be understood that the presently disclosed subject matter is not limited in its application to the details set forth in the description contained herein or illustrated in the drawings. The presently disclosed subject matter is capable of other embodiments and of being practiced and carried out in various ways. Hence, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting. As such, those skilled in the art will appreciate that the conception upon which this disclosure is based can readily be utilized as a basis for designing other structures, methods, and systems for carrying out the several purposes of the presently disclosed subject matter.
It will also be understood that the system according to the presently disclosed subject matter can be implemented, at least partly, as a suitably programmed computer. Likewise, the presently disclosed subject matter contemplates a computer program being readable by a computer for executing the disclosed method. The presently disclosed subject matter further contemplates a machine-readable memory tangibly embodying a program of instructions executable by the machine for executing the disclosed method.

Claims

1. A computerized method of generating a visual code at least in accordance with a lighting condition under which the visual code is expected to be displayed, such that the generated visual code is adapted to be machine-readable upon being scanned by a scanning device under said lighting condition.
2. A computerized method of generating a visual code, the visual code including function patterns and cells, the method comprising: i) obtaining the function patterns and decoded values of the cells; and ii) calculating lightness values for a plurality of said cells based on the decoded values thereof and at least one lighting parameter, said at least one lighting parameter being indicative of a lighting condition under which the visual code is expected to be displayed, giving rise to the visual code adapted to be machine-readable upon being scanned by a scanning device under said lighting condition.
3. A computerized method of generating a plurality of visual codes in accordance with respective lighting conditions, each visual code of said plurality of visual codes to be superimposed on one or more frames of a video that is expected to be displayed under said respective lighting conditions, said visual code including function patterns and cells, the method comprising: for each lighting condition of said respective lighting conditions, i) obtaining at least one lighting parameter indicative of said lighting condition; ii) generating one of said visual codes, including: a) obtaining the function patterns and decoded values of the cells; b) calculating lightness values for a plurality of said cells based on the decoded values thereof and said at least one lighting parameter; giving rise to said plurality of visual codes such that at least one of said plurality of visual codes is machine-readable upon being scanned by a scanning device under each said lighting condition.
4. The computerized method of any of claims 1 - 3, wherein the lighting condition includes one or more lighting properties related to a medium that the visual code is expected to be displayed on.
5. The computerized method of any of claims 1 - 4, wherein the lighting condition includes a lighting condition of an environment in which the visual code is expected to be displayed.
6. The computerized method of any of claims 1 - 5, wherein the lighting condition includes one or more lighting properties of a background image on which the visual code is expected to be superimposed.
7. The computerized method of any of claims 1 - 6, wherein the lighting condition is capable of causing color transform effect upon the visual code being scanned, and the generated visual code is machine-readable under said color transform effect.
8. The computerized method of any of claims 2 - 7, wherein the obtaining comprises receiving the function patterns and decoded values of the cells.
9. The computerized method of any of claims 2 - 7, wherein the obtaining comprises calculating the function patterns and decoded values of the cells.
10. The computerized method of claim 9, wherein the calculating the function patterns is based on the at least one lighting parameter.
11. The computerized method of claim 10, wherein the calculating the function patterns comprises changing structure of the function patterns based on the at least one lighting parameter.
12. The computerized method of any of claims 2-11, wherein the at least one lighting parameter is selected from the following: a lightness parameter and a darkness parameter.
13. The computerized method of claim 2, wherein the calculating lightness values further comprises obtaining the at least one lighting parameter.
14. The computerized method of any of claims 2-13, wherein said calculating lightness values comprises assigning medium lightness values to cells with light decoded values.
15. The computerized method of any of claims 2-14, wherein the lightness values are calculated for image elements associated with each of said plurality of cells.
16. The computerized method of any of claims 1-15, wherein the visual code is a two-dimensional code.
17. The computerized method of any of claims 2-16, wherein the decoded values of cells are calculated based on an input message to be encoded in the visual code in accordance with a visual code specification.
18. The computerized method of any of claims 2-17, wherein the lightness values are represented in RGB color model.
19. The computerized method of claim 3, wherein the method further comprises superimposing each of said plurality of visual codes on said one or more frames of the video.
20. The computerized method of claim 3, wherein the video includes animation.
21. The computerized method of claim 11, wherein the changing structure of the function patterns comprises decreasing size of light areas of the function patterns and increasing size of dark areas of the function patterns.
22. The computerized method of any of claims 2-21, wherein the decoded values include light decoded values and dark decoded values, the method further comprising decreasing size of cells with light decoded values and increasing size of cells with dark decoded values.
23. A computerized system of generating a visual code at least in accordance with a lighting condition under which the visual code is expected to be displayed, such that the generated visual code is adapted to be machine-readable upon being scanned by a scanning device under said lighting condition.
24. A computerized system of generating a visual code, the visual code including function patterns and cells, the system comprising a processor configured to: i) obtain the function patterns and decoded values of the cells; and ii) calculate lightness values for a plurality of said cells based on the decoded values thereof and at least one lighting parameter, said at least one lighting parameter being indicative of a lighting condition under which the visual code is expected to be displayed, giving rise to the visual code adapted to be machine-readable upon being scanned by a scanning device under said lighting condition.
25. A computerized system of generating a plurality of visual codes in accordance with respective lighting conditions, each visual code of said plurality of visual codes to be superimposed on one or more frames of a video that is expected to be displayed under said respective lighting conditions, said visual code including function patterns and cells, the system comprising a processor configured to: for each lighting condition of said respective lighting conditions, i) obtain at least one lighting parameter indicative of said lighting condition; ii) generate one of said visual codes, including: a) obtain the function patterns and decoded values of the cells; b) calculate lightness values for a plurality of said cells based on the decoded values thereof and said at least one lighting parameter; giving rise to said plurality of visual codes such that at least one of said plurality of visual codes is machine-readable upon being scanned by a scanning device under each said lighting condition.
26. A non-transitory computer readable storage medium tangibly embodying a program of instructions executable by a machine to generate a visual code at least in accordance with a lighting condition under which the visual code is expected to be displayed, such that the generated visual code is adapted to be machine-readable upon being scanned by a scanning device under said lighting condition.
27. A non-transitory computer readable storage medium tangibly embodying a program of instructions executable by a machine to generate a visual code, the visual code including function patterns and cells, comprising the steps of the following: i) obtaining the function patterns and decoded values of the cells; and ii) calculating lightness values for a plurality of said cells based on the decoded values thereof and at least one lighting parameter, said at least one lighting parameter being indicative of a lighting condition under which the visual code is expected to be displayed, giving rise to the visual code adapted to be machine-readable upon being scanned by a scanning device under said lighting condition.
28. A non-transitory computer readable storage medium tangibly embodying a program of instructions executable by a machine to generate a plurality of visual codes in accordance with respective lighting conditions, each visual code of said plurality of visual codes to be superimposed on one or more frames of a video that is expected to be displayed under said respective lighting conditions, said visual code including function patterns and cells, comprising the steps of the following: for each lighting condition of said respective lighting conditions, i) obtaining at least one lighting parameter indicative of said lighting condition; ii) generating one of said visual codes, including: a) obtaining the function patterns and decoded values of the cells; b) calculating lightness values for a plurality of said cells based on the decoded values thereof and said at least one lighting parameter; giving rise to said plurality of visual codes such that at least one of said plurality of visual codes is machine-readable upon being scanned by a scanning device under each said lighting condition.
PCT/IL2016/050342 2015-05-07 2016-03-31 System and method of generating a visual code WO2016178206A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562158065P 2015-05-07 2015-05-07
US62/158,065 2015-05-07

Publications (1)

Publication Number Publication Date
WO2016178206A1 true WO2016178206A1 (en) 2016-11-10

Family

ID=57217619

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2016/050342 WO2016178206A1 (en) 2015-05-07 2016-03-31 System and method of generating a visual code

Country Status (1)

Country Link
WO (1) WO2016178206A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020028015A1 (en) * 2000-05-09 2002-03-07 Tack-Don Han Machine readable code image and method of encoding and decoding the same
WO2014002086A2 (en) * 2012-06-26 2014-01-03 Eyeconit Ltd. Image mask providing a machine-readable data matrix code
US20140117073A1 (en) * 2012-10-25 2014-05-01 Microsoft Corporation Connecting to meetings with barcodes or other watermarks on meeting content

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108734250A (en) * 2018-05-29 2018-11-02 西安理工大学 Vision two-dimensional code generation method based on Sobel operators
CN108734250B (en) * 2018-05-29 2021-06-15 西安理工大学 Visual two-dimensional code generation method based on Sobel operator
JP2021056613A (en) * 2019-09-27 2021-04-08 マイクロインテレクス株式会社 Two-dimensional code, print object having the same printed thereon, two-dimensional code generator, two-dimensional code reader, two-dimensional code generation method, two-dimensional code reading method, two-dimensional code generation program, two-dimensional code reading program, computer readable recording medium and recorded device

Similar Documents

Publication Publication Date Title
US10817971B2 (en) System and method for embedding of a two dimensional code with an image
CN103718195B (en) readable matrix code
US11064149B1 (en) Blended integration of quick response codes into images and video
CA2851598C (en) Apparatus and method for automatically recognizing a qr code
US10863202B2 (en) Encoding data in a source image with watermark image codes
US10885411B2 (en) Machine-readable image encoding data
US20150339838A1 (en) Image mask providing a machine-readable data matrix code
US9497355B2 (en) Image processing apparatus and recording medium for correcting a captured image
US8830533B2 (en) System and method for creating machine-readable codes in combination with other images such as logos
CN108399405A (en) Business license recognition methods and device
Yang et al. ARTcode: preserve art and code in any image
US9633294B2 (en) Optical code
KR102311367B1 (en) Image processing apparatus, image processing method, and storage medium
Chen et al. RA code: A robust and aesthetic code for resolution-constrained applications
WO2016178206A1 (en) System and method of generating a visual code
EP2731057A1 (en) Image with visible information coded therein
US20110221775A1 (en) Method for transforming displaying images
US10142643B2 (en) Marker generating method, marker decoding method, and marker reading device
CN106682717B (en) Method and system for generating halftone two-dimensional code
US20210203994A1 (en) Encoding data in a source image with watermark image codes
WO2018061232A1 (en) Information processing device, display method, reading method, and computer-readable non-transitory storage medium
JP6296319B1 (en) Information processing apparatus, display method, reading method, and computer-readable non-transitory storage medium
Koddenbrock et al. An innovative 3D color barcode: Intuitive and realistic visualization of digital data
CN110809770A (en) Small vector image generation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16789402

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16789402

Country of ref document: EP

Kind code of ref document: A1