WO2014170763A2 - Information exchange using color space encoded image - Google Patents

Information exchange using color space encoded image

Info

Publication number
WO2014170763A2
Authority
WO
WIPO (PCT)
Prior art keywords
encoded image
image
encoded
stream
integers
Prior art date
Application number
PCT/IB2014/001482
Other languages
French (fr)
Other versions
WO2014170763A3 (en)
Inventor
Alisa MESH-ILIESCU
Vladimir Kolmanovitch
Original Assignee
Mesh-Iliescu Alisa
Vladimir Kolmanovitch
Priority date
Filing date
Publication date
Priority claimed from US13/842,817 (US8973844B2)
Priority claimed from US13/843,132 (US9027843B2)
Priority claimed from US13/841,338 (US9117151B2)
Application filed by Mesh-Iliescu Alisa and Vladimir Kolmanovitch
Publication of WO2014170763A2
Publication of WO2014170763A3


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T9/00Image coding

Definitions

  • the present disclosure relates to a data storage and exchange device for color space encoded images, and methods of encoding and decoding color space encoded images. More particularly, the present disclosure relates to encoding and decoding of machine-based data using high capacity multi-colored composite two-dimensional pictures having different symbols organized in specific order and sets in a color space.
  • Example data capture technologies include bar codes, QR codes, Radio Frequency Identification (RFID) and Near Field Communication (NFC). Most of these technologies are used as either a visible or an invisible tag to connect these databases and user devices.
  • End user devices such as smartphones and tablets are frequently equipped with photo and/or video cameras. These cameras can be used for capturing information presented in different visual forms.
  • Bar codes have been used for mobile applications to deliver a multitude of different mobile services over mobile phones and other mobile communication or computing devices. Such applications range from providing Uniform Resource Locator (URL) information to link a mobile device to the Internet, through to using a bar code as a form of e-ticket for airlines or event admissions.
  • Traditional approaches to higher capacity bar codes include: (i) using a colored symbol set and (ii) increasing the size of the bar code.
  • the traditional data capacity of 1D and 2D bar codes is severely limited. This severely limited data capacity constrains possible applications of 1D and 2D bar codes, and their primary task is simply linking camera phones to designated Web sites. Additional tasks may then be performed based on the Web site. This operation again relies on Wi-Fi or 3G connectivity.
  • although bar codes can be used to provide 2D encoded data, they have not been used to provide real storage devices and media.
  • Embodiments of the present invention include systems, methods, and non-transitory computer program products for information exchange using color space encoded images.
  • a color space encoded image can be displayed, for example on media such as posters, billboards, or paper, or on a display of a first device such as smartphone displays, palmtop displays, camera displays, tablet displays, or e-reader displays.
  • a second device can acquire the displayed encoded image, for example by photographing the image. The second device can decode the color space encoded image or transfer the color space encoded image to a device that decodes the image.
  • Certain embodiments include methods for exchanging information.
  • the method includes receiving information for exchanging and creating an image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image.
  • the method also includes displaying the encoded image for one or more devices to decode.
  • Certain embodiments include systems for exchanging information.
  • the system includes at least one processor configured to receive information for exchanging and create an image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image.
  • the at least one processor is further configured to display the encoded image for one or more devices to decode.
  • Certain embodiments include non-transitory computer program products for exchanging information.
  • the non-transitory computer program product is tangibly embodied in a computer-readable medium.
  • the non-transitory computer program product includes instructions operable to cause a data processing apparatus to receive information for exchanging and create an image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image.
  • the instructions are also operable to cause a data processing apparatus to display the encoded image for one or more devices to decode.
  • Embodiments of the present invention include systems, methods, and non-transitory computer program products for information exchange using color space encoded images.
  • a color space encoded image can be displayed, for example on media such as posters, billboards, or paper, or on a display of a first device such as smartphone displays, palmtop displays, camera displays, tablet displays, or e-reader displays.
  • a second device can acquire the displayed encoded image, for example by photographing the image. The second device can decode the color space encoded image or transfer the color space encoded image to a device that decodes the image.
  • Certain embodiments include a computer-implemented method for exchanging information between a first device and a second device.
  • the method includes receiving, at the first device, information for exchanging with the second device and creating a first image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image.
  • the method also includes displaying the first encoded image for a second device to decode and in response to the displaying the first encoded image, receiving a second encoded image from the second device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
  • Certain embodiments include a first device for exchanging information with a second device.
  • the first device includes memory, storage, and at least one processor configured to receive, at the first device, information for exchanging with the second device.
  • the at least one processor is further configured to create a first image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image.
  • the at least one processor is also configured to display the first encoded image for a second device to decode and in response to the displaying the first encoded image, receive a second encoded image from the second device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
  • the embodiments described herein can include additional aspects of the present invention.
  • the method can further include locating the second encoded image by locating one or more additional cells adjoining the second encoded image and decoding the response based on the colored cells and the one or more additional cells adjoining the second encoded image.
  • the at least one processor is further configured to locate the second encoded image by locating one or more additional cells adjoining the second encoded image and decode the response based on the colored cells and the one or more additional cells adjoining the second encoded image.
  • Certain embodiments include a computer-implemented method for exchanging information between a first device and a second device.
  • the method includes receiving, at the second device, a first encoded image from the first device, the first encoded image having colored cells corresponding to a stream of integers encoding data and decoding the first encoded image by locating the first encoded image by locating one or more additional cells adjoining the first encoded image and decoding a request from the first device based on the colored cells and the one or more additional cells adjoining the first encoded image.
  • the method also includes determining a response to the first encoded image and creating a second encoded image encoding the response by creating colored cells within a color space to provide an encoded representation of the response in the second encoded image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the second encoded image, the one or more additional cells for assisting in decoding the second encoded image.
  • the method further includes displaying the second encoded image for the first device to decode.
  • Certain embodiments include a first device for exchanging information with a second device.
  • the first device includes memory, storage, and at least one processor configured to receive, at the first device, a first encoded image from the second device.
  • the first encoded image has colored cells corresponding to a stream of integers encoding data.
  • the at least one processor is also configured to decode the first encoded image by locating the first encoded image by locating one or more additional cells adjoining the first encoded image and decoding a request from the second device based on the colored cells and the one or more additional cells adjoining the first encoded image.
  • the at least one processor is further configured to determine a response to the first encoded image and create a second encoded image encoding the response by creating colored cells within a color space to provide an encoded representation of the response in the second encoded image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the second encoded image, the one or more additional cells for assisting in decoding the second encoded image.
  • the at least one processor is also configured to display the second encoded image for the second device to decode.
  • the embodiments described herein can include additional aspects of the present invention.
  • the method can also include receiving, at the first device, the second encoded image to decode, the second encoded image having colored cells corresponding to a stream of integers encoding the response, locating the second encoded image by locating one or more additional cells adjoining the second encoded image, and decoding data based on the colored cells and the one or more additional cells adjoining the second encoded image.
  • the first device can be further configured to receive, at the first device, the second encoded image to decode, the second encoded image having colored cells corresponding to a stream of integers encoding the response, locate the second encoded image by locating one or more additional cells adjoining the second encoded image, and decode data based on the colored cells and the one or more additional cells adjoining the second encoded image.
  • Embodiments of the present invention include systems, methods, and non-transitory computer program products for information exchange using color space encoded images.
  • a color space encoded image can be displayed, for example on media such as posters, billboards, or paper, or on a display of a first device such as smartphone displays, palmtop displays, camera displays, tablet displays, or e-reader displays.
  • a second device can acquire the displayed encoded image, for example by photographing the image. The second device can decode the color space encoded image or transfer the color space encoded image to a device that decodes the image.
  • devices, methods, and non-transitory computer program products are provided for exchanging information between a camera and a device.
  • Certain embodiments include methods of exchanging information between a camera and a device.
  • a method includes receiving, at the camera, information for exchange with the device and creating a first image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received content to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the encoded image, the one or more additional cells for assisting in decoding the encoded image.
  • the method also includes displaying the first encoded image for the device to decode and in response to the displaying the first encoded image, receiving a second encoded image from the device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
  • Certain embodiments include methods of exchanging information between a camera and a device.
  • a method includes receiving, at the device, a first encoded image from the camera, the first encoded image having colored cells corresponding to a stream of integers encoding data.
  • the method also includes decoding the first encoded image by locating the first encoded image by locating one or more additional cells adjoining the first encoded image and decoding the data based on the colored cells and the one or more additional cells adjoining the first encoded image.
  • the method further includes determining a response to the first encoded image and creating a second image encoding the response by creating colored cells within a color space to provide an encoded representation of the response in an image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the image.
  • the method also includes displaying the second encoded image for the camera to decode.
  • Certain embodiments include camera systems for exchanging information with a device.
  • the camera system includes memory, storage, and at least one processor configured to receive, at the camera system, information for exchange with the device.
  • the at least one processor is also configured to create a first image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received content to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the encoded image, the one or more additional cells for assisting in decoding the encoded image.
  • the at least one processor is further configured to display the first encoded image for the device to decode and in response to the displaying the first encoded image, receive a second encoded image from the device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
  • a device includes memory, storage, and at least one processor configured to receive, at the device, a first encoded image from the camera, the first encoded image having colored cells corresponding to a stream of integers encoding data.
  • the at least one processor is also configured to decode the first encoded image by locating the first encoded image by locating one or more additional cells adjoining the first encoded image and decoding the data based on the colored cells and the one or more additional cells adjoining the first encoded image.
  • the at least one processor is further configured to determine a response to the first encoded image and create a second image encoding the response by creating colored cells within a color space to provide an encoded representation of the response in an image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the image.
  • the at least one processor is also configured to display the second encoded image for the camera to decode.
  • the device can comprise at least one of a computer, a phone, a smartphone, a palmtop device, a handheld device, a tablet, an e-reader, and a scanner.
  • FIG. 1 illustrates a system for storing, encoding, and decoding a 2D color space image in accordance with certain embodiments of the present disclosure.
  • FIGS. 2A-2B illustrate the 2D color space encoded image in accordance with certain embodiments of the present disclosure.
  • FIG. 3 illustrates a method that the system performs for encoding an image in accordance with certain embodiments of the present disclosure.
  • FIG. 4 illustrates a method that the system performs for decoding an image in accordance with certain embodiments of the present disclosure.
  • FIG. 5 illustrates a block diagram of color calibration cells for restoration of proper color based on distinguishable color samples in accordance with certain embodiments of the present disclosure.
  • FIG. 6 illustrates a block diagram of a frame arrangement for geometrical distortion correction of an encoded image in accordance with certain embodiments of the present disclosure.
  • FIG. 7 illustrates a block diagram for frame fragmentation for distortion correction in accordance with certain embodiments of the present disclosure.
  • FIG. 8 illustrates a block diagram of image based information exchange from one device to another device in accordance with certain aspects of the present disclosure.
  • FIG. 9 illustrates a block diagram of an example of information exchange of a color space encoded image in accordance with certain embodiments of the present disclosure.
  • FIG. 10 illustrates information exchange using a photo camera to display a color space encoded image in accordance with certain embodiments of the present invention.
  • the present systems and methods allow for use of 2D color-space codes to encode images with greater data capacity.
  • the 2D color-space encoded images described herein can provide a wide variety of applications, regardless of network connectivity.
  • the present systems and methods allow a sufficient amount of data or even an entire database to be encoded and placed in a storage device using the 2D images encoded and organized by color- space codes as described herein.
  • the present disclosure has utility in, for example, data back-up storage.
  • Figure 1 illustrates a system 100 for storing, encoding, and decoding a 2D color space image in accordance with certain embodiments of the present disclosure.
  • System 100 includes a 2D color space image 102.
  • System 100 can encode any type of desired machine- based data.
  • Non-limiting examples of machine-based data include text 104a, books 104b (containing, for example, text and images), databases 104c, photographs 104d or other artwork, images, video, film, or movies, music 104e, or any other type of binary data 104f.
  • capturing of desired data can be done using a conventional camera or scanner, or a smartphone or tablet camera.
  • a user can encode selected information into a 2D color-space coded image and print the image on any desired media (e.g., paper, textile, plastic, etc.).
  • the 2D color-space coded image can also be depicted on any display, including posters, TV screens or computer monitors, as well as displayed on mobile devices such as smartphones or tablets.
  • Digital photo cameras can be used as display devices as well.
  • the image code can be decoded and the encoded data can be read.
  • the present systems and methods allow information to be retrieved, decoded, and saved in the memory of a device or presented on a display for further use.
  • the decoded data can be read on a laptop computer 106a or a desktop computer 106b, a smart phone 106c, a photo camera 106d, a palmtop or handheld device 106e, or a tablet 106f. All this can be done without manual work. No network connectivity or additional cost is required.
  • the present systems and methods relate to a data storage device for storing 2D color space encoded images, methods of encoding and decoding the 2D color space encoded images, and application of the 2D color space encoded images to data exchange.
  • the present disclosure relates to encoding, transmission, and decoding of machine-based data using an appropriately formatted high capacity color image.
  • the present systems and methods can be utilized, for example, for compact backup of printed materials, and/or data exchange between a color-image-displaying device and a plurality of image-reading devices.
  • Figures 2A-2B illustrate the 2D color space encoded image 102 in accordance with certain embodiments of the present disclosure.
  • encoded image 102 includes a number of rectangular cells 202 of different colors.
  • encoded image 102 can include a black frame 204, a white frame 206, a "zebra frame" 208 having alternating colors, and color calibration cells 210.
  • Figure 2B illustrates a close-up view of the black frame 204, the white frame 206, the zebra frame 208, and the color calibration cells 210 in accordance with certain embodiments of the present disclosure.
  • Black frame 204 and white frame 206 can be used for rough localization of the data image including rectangular cells 202, determination of geometric distortion of the data image, and estimation of cell size and scale of rectangular cells 202.
  • Zebra frame 208 can be used for further calibration of cell size for rectangular cells 202.
  • Color calibration cells 210 can be used for determining a number of colors N_c used for encoding the data, and for indexing the colors to map the colors to numbers for encoding and decoding.
  • rectangular cells 202 can use more than eight different colors.
  • the number of colors can be expressed as N_c, for example N_c > 8.
  • the number of colors is selected to be a power of two.
  • Figure 3 illustrates a method 300 that the system performs for encoding an image in accordance with certain embodiments of the present disclosure.
  • color-difference metrics can help achieve stable color resolution for image-source media and image-reading devices.
  • Most imaging devices operating in the visible spectrum are adapted to human eye color resolution.
  • a color space describes color capabilities of a particular device by relating numbers to actual colors.
  • Embodiments of the present disclosure utilize metrics based on low-cost non-Euclidean distance in red green blue (RGB) color space.
  • the present system uses color space metrics which are close to Commission Internationale de l'éclairage (CIE) 1976 (L*, u*, v*) color space metrics.
  • the present systems and methods are able to use any color space, including the CIE 1931 XYZ color space, CIELAB color space, and CIEUVW color space.
  • the color space used in certain embodiments is advantageously less sensitive to a color temperature of relevant image displaying media or device, and advantageously less sensitive to external illumination.
  • Equation (1): ΔC(C1, C2) = √((2 + r/256)·ΔR² + 4·ΔG² + (2 + (255 − r)/256)·ΔB²), where r describes an average red component value of C1 and C2.
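The low-cost non-Euclidean RGB metric described above matches the widely used "redmean" approximation to perceptual color distance. A minimal Python sketch, assuming the standard redmean constants (the patent's exact coefficients may differ):

```python
import math

def redmean_distance(c1, c2):
    """Approximate perceptual distance between two RGB colors.

    Uses "redmean" weighting: the red and blue weights shift with the
    average red component r_bar, approximating CIELUV-like behavior
    without a full color-space conversion.
    """
    r1, g1, b1 = c1
    r2, g2, b2 = c2
    r_bar = (r1 + r2) / 2.0
    dr, dg, db = r1 - r2, g1 - g2, b1 - b2
    return math.sqrt(
        (2 + r_bar / 256) * dr * dr
        + 4 * dg * dg
        + (2 + (255 - r_bar) / 256) * db * db
    )

# Identical colors are at distance zero; distinct colors are not.
assert redmean_distance((10, 20, 30), (10, 20, 30)) == 0.0
assert redmean_distance((255, 0, 0), (0, 255, 0)) > 0
```

A metric of this shape is cheap enough to evaluate per pixel during decoding, which matters when every cell of a large image must be matched against the calibration palette.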
  • the present system first receives machine data for encoding into an image.
  • the machine data can be contained in a file.
  • the present system compresses the file (step 302).
  • the machine data can be compressed into a binary compressed file such as zip, rar, arc or any other archive format file.
  • the present system determines an appropriate color encoding, illustrated by box 310. For example, the present system maps the resulting code sequence to a stream of integers from 0 to N c -1, inclusive. Using the stream of integers, the present system creates a bitmap with frame formatting. Generation of frame formatting is described in further detail later. The present system fills the interior of the frames with color bitmap cells, encoded as described following.
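The mapping from a code sequence to a stream of integers from 0 to N_c − 1 can be sketched as follows, assuming N_c is a power of two so each cell carries a whole number of bits; the function name is illustrative:

```python
def bytes_to_cell_indices(data: bytes, n_colors: int) -> list[int]:
    """Map a byte stream to integers in [0, n_colors - 1], one per cell.

    Assumes n_colors is a power of two, so each cell carries
    log2(n_colors) bits.
    """
    bits_per_cell = n_colors.bit_length() - 1
    assert 1 << bits_per_cell == n_colors, "n_colors must be a power of two"
    # Concatenate all bits, then slice into fixed-size groups.
    bitstring = "".join(f"{byte:08b}" for byte in data)
    # Pad with zero bits so the length divides evenly.
    pad = (-len(bitstring)) % bits_per_cell
    bitstring += "0" * pad
    return [
        int(bitstring[i:i + bits_per_cell], 2)
        for i in range(0, len(bitstring), bits_per_cell)
    ]

# With 8 colors (3 bits per cell), one 0xFF byte becomes three cells.
assert bytes_to_cell_indices(b"\xff", 8) == [7, 7, 6]
```

Each resulting index then selects one color from the calibration palette when the bitmap cells are painted.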
  • the encoding can use error correction.
  • a non-limiting example of error correction is the two-stage error correction described herein.
  • the encoding can use scrambling.
  • the encoding can use error correction in combination with scrambling.
  • the present system determines the color of each cell based on the color calibration cells. The resulting encoded image can then be sent to any display, including a color printer, display, or other color image output device.
  • the present system then performs a first level of error correction (step 304).
  • an advantage of error correction is that the present system is able to decode an image properly, even if a portion of the image is damaged.
  • the present system can determine a header containing a cyclic redundancy check (CRC) error-detecting code, file name, format information, and sequence number (e.g., if the machine data and resulting encoded image represent only part of a larger data stream such as a video stream).
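A header of this kind can be sketched as follows; the field widths and layout here are assumptions for illustration, not the patent's actual format:

```python
import struct
import zlib

def build_header(filename: str, seq: int, payload: bytes) -> bytes:
    """Pack an illustrative header: CRC-32 of the payload, a sequence
    number, and a length-prefixed file name (big-endian fields)."""
    name = filename.encode("utf-8")
    crc = zlib.crc32(payload) & 0xFFFFFFFF
    return struct.pack(">IIH", crc, seq, len(name)) + name

def parse_header(blob: bytes):
    """Unpack the header fields produced by build_header."""
    crc, seq, name_len = struct.unpack(">IIH", blob[:10])
    name = blob[10:10 + name_len].decode("utf-8")
    return crc, seq, name

payload = b"encoded image data"
hdr = build_header("backup.zip", 3, payload)
crc, seq, name = parse_header(hdr)
assert (seq, name) == (3, "backup.zip")
assert crc == zlib.crc32(payload) & 0xFFFFFFFF
```

On the decoding side, recomputing the CRC over the recovered payload and comparing it against the header value detects residual corruption that slipped past the error-correcting code.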
  • the present system can use any error detecting code such as repetition codes, addition of parity bits, checksums, hash functions, or more powerful error correcting codes.
  • the present system can use a Reed-Solomon encoder of high order.
  • a Reed-Solomon error correction code is a cyclic error-correcting code capable of detecting and correcting multiple errors in a bit stream.
  • the present system can use other error-correcting codes such as low-density parity-check (LDPC) codes or turbo codes.
  • Reed-Solomon error correction codes are capable of detecting errors, erasures, and combinations thereof.
  • a word size for this high order level of Reed-Solomon encoding can be equal to k·log2(N_c) rounded down, where k > 1.
  • the two-stage error coding can perform high order Reed-Solomon encoding in a first stage, and low order Reed-Solomon encoding in a second stage.
  • the second stage can also include smaller words.
  • the word size can be equal to log2(N_c) bits, rounded down.
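The word sizes for the two stages follow directly from the formulas above; a small sketch:

```python
import math

def word_size_bits(n_colors: int, k: float = 1.0) -> int:
    """Word size in bits for Reed-Solomon encoding over an N_c-color
    alphabet: floor(k * log2(N_c)). Per the two-stage scheme described
    above, k > 1 for the first (high order) stage and k = 1 for the
    second stage."""
    return math.floor(k * math.log2(n_colors))

# With 16 colors: second-stage words are 4 bits; a first stage with
# k = 2 would use 8-bit words (standard GF(2^8) Reed-Solomon).
assert word_size_bits(16) == 4
assert word_size_bits(16, k=2) == 8
```

The value of k here is a free parameter; the patent only requires k > 1 for the first stage.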
  • the present system then scrambles the obtained code (step 306).
  • the resulting code can be interleaved using a predefined pseudorandom sequence.
  • the present system scrambles the obtained code to spread out potential errors.
  • with scrambling, the two-level error correction described herein can perform better. That is, when registering or capturing an image for decoding, there can be color distortions due to non-uniform lighting, or damage to the media on which the image is printed. For example, the media can have spots, be wrinkled, and the like.
  • the scrambling provides that "bad" color cells with uncertain data can be distributed uniformly (i.e., independently) into different encoding blocks, thereby decreasing the number of errors per block. Because error correction can have upper limits of correctable errors, the present spreading improves the effectiveness of the error correction.
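Interleaving with a predefined pseudorandom sequence can be sketched as follows; the fixed seed stands in for whatever sequence the encoder and decoder share:

```python
import random

def interleave(symbols, seed=12345):
    """Scramble symbols with a predefined pseudorandom permutation so
    that clustered damage (spots, wrinkles) spreads across many
    error-correction blocks. The seed is an illustrative shared value,
    not one taken from the patent."""
    order = list(range(len(symbols)))
    random.Random(seed).shuffle(order)
    return [symbols[i] for i in order]

def deinterleave(symbols, seed=12345):
    """Invert interleave() by rebuilding the same permutation."""
    order = list(range(len(symbols)))
    random.Random(seed).shuffle(order)
    out = [None] * len(symbols)
    for dst, src in enumerate(order):
        out[src] = symbols[dst]
    return out

data = list(range(20))
assert deinterleave(interleave(data)) == data
```

Because the permutation is fixed, a burst of damaged cells in the captured image maps back to isolated symbols scattered over many Reed-Solomon blocks, keeping each block under its correction limit.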
  • the present system then performs a second level of error correction (step 308).
  • the resulting scrambled code is encoded using a second Reed-Solomon encoder.
  • the word size can be selected to be smaller than the word size used in the first level of error correction.
  • this two-stage error correction allows the present system to mark unrecoverable blocks of data during a first stage as erasures.
  • the error correction described herein finds an index location of an error word inside a block, and also corrects the error word to its correct or true value.
  • the two-stage error correction allows the present system to correct twice as many erasures as errors, as described in more detail later.
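The erasure advantage follows from standard Reed-Solomon decoding theory: a code with n − k parity symbols repairs e errors and s erasures whenever 2e + s ≤ n − k, because an erasure's location is already known. A small sketch:

```python
def correctable(n_parity: int, errors: int, erasures: int) -> bool:
    """A Reed-Solomon code with n_parity check symbols can repair a
    block when 2 * errors + erasures <= n_parity: an erasure's
    position is known, so it costs half as much as an error whose
    position must also be found."""
    return 2 * errors + erasures <= n_parity

# With 8 parity symbols: up to 4 unknown-location errors,
# or up to 8 known-location erasures.
assert correctable(8, errors=4, erasures=0)
assert correctable(8, errors=0, erasures=8)
assert not correctable(8, errors=5, erasures=0)
```

Marking first-stage failures as erasures before the second decoding stage therefore doubles the effective repair capacity for those symbols.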
  • the present system is able to use any kind of high definition (HD) color registration device to register an encoded image.
  • registration devices can include a digital camera, a smartphone camera, a tablet camera, a video camera, a webcam, a scanner (to scan a hard copy image), etc.
  • Figure 4 illustrates a method 400 that the system performs for decoding an image in accordance with certain embodiments of the present disclosure.
  • the present system can copy a registered image to any processing device containing a decoding module.
  • An example processing device can include a computer, containing a decoding module with computer-readable instructions for decoding, where the computer-readable instructions are stored on a non-transitory computer storage medium, or containing hardware having hard-coded or embedded decoding instructions stored thereon.
  • decoding method 400 includes two high-level processes: (1) image recognition, and (2) two-stage Reed-Solomon decoding.
  • the present system uses Viterbi decoding as a Reed-Solomon decoder.
  • the present system locates the main frame (step 402). For example, the present system can locate the black frame and the white frame vertices. The present system can further determine the middles of the corresponding edges of the black frame and the white frame. If the main frame is not found (step 404: No), the present system returns an error (step 406).
  • the error can be that the image cannot be decoded, or the error can be more specific such as that the main frame cannot be found.
  • the present system corrects geometric distortion based on the main frame and the zebra frame (step 412).
  • Geometric distortion can occur when a user uses a camera lens, for example on a smartphone or tablet camera, or on a digital camera, to decode an image. Accounting for distortion is described in further detail later, in connection with Figure 6.
  • the present system locates the color calibration cells (step 408).
  • the color calibration cells are located inside the black frame and the white frame, on the left of the rectangular cells encoding the machine data.
  • the color frame could be located outside the black frame, between the black and white frames, or in other locations within the white frame.
  • the present system decodes the colors in the rectangular cells (step 410). In general, for every pixel found in the encoded image, the present system finds the closest sample pixel based on the color calibration cells and substitutes the sample pixel for the detected pixel. In some embodiments, the present system obtains a color-value correspondence. The present system determines cell size calibration based on the zebra frame. The present system divides the data image into color cells. When decoding the image, the present system captures or collects all colors belonging to a given pixel rectangle. If most of the pixel colors in the pixel rectangle could be recognized as one color from the color calibration cells, that color from the color calibration cells is used. Otherwise, the present system marks the pixel as an erasure.
  • the present system treats the whole pixel block as an erasure.
  • the present system reads the rectangular cells sequentially and determines an index of the nearest color from the color calibration cells to the median cell color. In particular, the present system determines the nearest color according to nearest Euclidean distance in the appropriate color space, according to Equation (1), described earlier.
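The nearest-color step described above can be sketched as follows. This is an illustrative sketch, not the patented implementation: the sample palette, the plain squared Euclidean distance (standing in for Equation (1)'s pseudo-Euclidean metric), and the erasure threshold are all assumptions for the example.

```python
# Decode a cell's median color to the index of the nearest calibration
# color, or mark the cell as an erasure when no color is close enough.

ERASURE = -1

def color_distance(c1, c2):
    """Squared Euclidean distance between two RGB triples (stand-in for
    the patent's pseudo-Euclidean Equation (1))."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2))

def decode_cell(cell_color, calibration_colors, max_distance=3000):
    """Return the index of the nearest calibration color, or ERASURE
    when no calibration color is close enough to trust."""
    distances = [color_distance(cell_color, c) for c in calibration_colors]
    best = min(range(len(distances)), key=distances.__getitem__)
    return best if distances[best] <= max_distance else ERASURE

# Hypothetical 4-color palette for the demo; the disclosure uses 16.
palette = [(0, 0, 0), (255, 255, 255), (255, 0, 0), (0, 255, 0)]
print(decode_cell((250, 10, 5), palette))     # nearest to red -> 2
print(decode_cell((128, 128, 128), palette))  # ambiguous -> -1 (erasure)
```

Cells marked as erasures are then handed to the Reed-Solomon stage, which, as noted above, can correct twice as many erasures as errors.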
  • the present system decodes the bytes (step 414).
  • the present system applies a decoder corresponding to the error-correcting encoder described earlier.
  • the present system applies a low-level Reed-Solomon decoder to the code stream obtained from the decoded colors. Accordingly, due to the nature of the error-correcting code, the present system is able to restore erasures and errors in the code stream. If the present system detects unrepairable blocks, the unrepairable blocks are marked as "erasures."
  • the present system then descrambles the bytes (step 416). In some embodiments, the present system de-interleaves the resulting byte stream.
  • the present system then decodes the original information based on the decoded byte stream (step 418).
  • the present system uses a decoder corresponding to the error-correcting encoder described earlier, to restore pixel-block-level erasures and errors.
  • the present system applies a high-level Reed-Solomon decoder. After applying the high-level Reed-Solomon decoder, if some pixel blocks remain un-repairable, the present system attempts to change the indices to those indices corresponding to the next nearest color in the color calibration cells, and then repeats Reed-Solomon decoding of the modified block. If error free data is obtained, the present system maps the resulting sequence of integers to a bit stream.
  • the present system then parses the resulting bit stream and computes a cyclic redundancy check (CRC) of the data.
  • the present system compares the resulting CRC with the CRC contained in the header of the image.
  • the present system then saves or appends the file.
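The CRC verification step above can be sketched as follows, assuming a CRC-32 checksum stored in the image header (the disclosure does not specify the CRC width; `zlib.crc32` is used here only for illustration).

```python
import zlib

def verify_payload(decoded_bytes, header_crc):
    """Compare the CRC of the decoded data with the CRC read from the
    header of the encoded image."""
    return zlib.crc32(decoded_bytes) & 0xFFFFFFFF == header_crc

payload = b"machine data"
header_crc = zlib.crc32(payload) & 0xFFFFFFFF   # written during encoding

print(verify_payload(payload, header_crc))         # True: data intact
print(verify_payload(payload + b"!", header_crc))  # False: corrupted data
```

Only when the computed CRC matches the header CRC is the file saved or appended.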
  • the two-level error correction used in the present system and described earlier in connection with Figures 3 and 4 allows the present system to correct two types of errors: errors and erasures.
  • the Reed-Solomon error-correction encoding used is able to detect errors (which correspond to wrong, incorrect, or unexpected values at unknown positions) and erasures (which correspond to missing values whose positions are known).
  • correcting erasures can be performed twice as effectively as correcting errors.
  • the present system uses a two-level error correction schema.
  • Nc = 16 (i.e., 2^4). Accordingly, there are sixteen distinctive colors, which allows an encoded image to encode 4 bits of information per rectangular cell. Every byte of information (8 bits) is encoded by two color indices.
  • the group of half-bytes is then appended with Reed-Solomon source control data determined based on the group of half-bytes. Because the Reed-Solomon source control data can be about the same size as the input data, the addition of the Reed-Solomon source control data results in the size of the group reaching fifteen bytes. For example, the size of the input data and the control data block size can be fifteen bytes.
  • Each half-byte in the group is mapped to an appropriate color, for example by using the corresponding half-bytes as indices into the color array. Accordingly, the encoding process applies two levels of Reed-Solomon encoding, first at the byte array level and second at the half-byte level.
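The byte-to-half-byte packing described above can be sketched as follows. The Reed-Solomon parity generation applied to the half-byte groups is omitted here; this shows only how each byte maps to two 4-bit color indices (and back) with Nc = 16.

```python
# Each byte splits into a high half-byte and a low half-byte; each
# half-byte (0..15) serves as an index into the 16-color array.

def bytes_to_color_indices(data):
    indices = []
    for byte in data:
        indices.append(byte >> 4)    # high half-byte
        indices.append(byte & 0x0F)  # low half-byte
    return indices

def color_indices_to_bytes(indices):
    return bytes((hi << 4) | lo
                 for hi, lo in zip(indices[0::2], indices[1::2]))

encoded = bytes_to_color_indices(b"\xA7\x3F")
print(encoded)                          # [10, 7, 3, 15]
print(color_indices_to_bytes(encoded))  # round-trips to the input bytes
```

In the full scheme, Reed-Solomon control data is appended at the byte level and again at the half-byte level before the indices are painted as colored cells.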
  • the present system collects all colors belonging to a detected pixel rectangle. If most of the colors in the pixel rectangle could be recognized as one of the colors in the color calibration cells, the present system uses the corresponding color in the color calibration cells. Otherwise, the pixel rectangle is marked as an erasure. If a pixel-level block cannot be decoded, the whole pixel block is treated as an erasure. As described earlier, the erasures can be further restored by an appropriate error-correction algorithm.
  • the two-level error correction and leveraging of erasure determinations significantly improves the quality of the decoding described earlier in connection with Figure 4. In particular, the two-level error correction described herein allows the present system to recover information which otherwise would be lost.
  • Figure 5 illustrates a block diagram of color calibration cells for restoration of proper color based on distinguishable color samples in accordance with certain embodiments of the present disclosure.
  • Figure 5 includes color calibration cells 210.
  • Color calibration cells 210 include all colors used to encode the rectangular cells.
  • a white frame or border surrounding color calibration cells 210 serves as the calibration sample for the color white.
  • the present system may also include white as a color block in the color calibration cells arranged on the left hand side. The present system discerns each color based on its placement within color calibration cells 210. The present system then stores distorted pixels as color samples. Different variants of a distorted pixel are saved to point or refer to the same correct pixel. That is, distorted pixels are determined dynamically, based on the position of their colors among the color calibration cells. The distorted pixels are assigned estimated colors according to the color calibration cells.
  • the present system translates a distorted pixel into one of the original colors in color calibration cells 210 using the nearest original color to the distorted pixel.
  • let color C = (R, G, B) represent the red-green-blue (RGB) component representation of a color.
  • according to Equation (1), described earlier, a pseudo-Euclidean distance ΔC between two colors C1 and C2 can be computed.
  • a color mapping can be represented by the set of pairs: (distorted color)— > (correct color).
  • the distorted color can be Ci, and the correct color can be C2.
  • the number of such color pairs should be greater than or equal to Nc, the total number of colors. This mapping can be determined based on the color calibration cells and on the color arrangement therein.
  • the present system iterates over all pairs of distorted colors and correct colors. In particular, the present system selects the pair in which the distorted color portion of the pair is closest to the observed color C. In some embodiments, the determination of which distorted color portion is closest is made according to the pseudo-Euclidean metric in the RGB color space. That is, the present system selects the pair in which ΔC is smallest according to Equation (1). The selected pair provides the correct color.
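The (distorted color) → (correct color) lookup can be sketched as follows. The sample pairs are assumptions standing in for values read from the color calibration cells, and squared Euclidean distance stands in for Equation (1)'s pseudo-Euclidean metric.

```python
# Restore an observed (possibly distorted) pixel to the correct color by
# finding the calibration pair whose distorted sample is nearest to it.

def nearest_pair(observed, color_pairs):
    """color_pairs: list of (distorted_sample, correct_color) tuples.
    Returns the correct color whose distorted sample is closest to the
    observed pixel."""
    def dist(pair):
        distorted = pair[0]
        return sum((a - b) ** 2 for a, b in zip(observed, distorted))
    return min(color_pairs, key=dist)[1]

# Hypothetical pairs as the camera might distort black, white, and red.
pairs = [((30, 20, 25), (0, 0, 0)),
         ((220, 210, 200), (255, 255, 255)),
         ((200, 40, 60), (255, 0, 0))]
print(nearest_pair((210, 50, 70), pairs))  # closest to distorted red
```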
  • the present system allows the correct colors to be restored, because the present system saves the correspondence between the distorted values and the original values.
  • the present system is able to leverage information about the way in which the colors were distorted relative to the original values as captured in the color calibration cells.
  • Frame Arrangement for Automatic Image Geometrical Distortion Correction
  • Figure 6 illustrates a block diagram of a frame arrangement for geometrical distortion correction of an encoded image in accordance with certain embodiments of the present disclosure.
  • the present system includes arrangements to recognize placement of rectangular cells and other elements in the encoded image.
  • these arrangements can include black frame 204, white frame 206, zebra frame 208, and color calibration sidebar 210 (shown in Figure 2).
  • the black frame and white frame surround the encoded rectangular cells.
  • the color calibration sidebar contains sample colors (described earlier, in connection with Figure 5).
  • the zebra frame surrounds the encoded rectangular cells to allow the present system to determine appropriate resolution and pixel division of the encoded image.
  • the black frame and white frame allow the present system to determine and correct nonlinear distortion.
  • nonlinear distortion can be introduced, for example, by camera lenses when registering an encoded image for decoding.
  • the present system uses the white frame as follows.
  • the white frame can be restricted by curve x_left(y) 602, curve x_right(y) 604, curve y_top(x) 606, and curve y_bottom(x) 608.
  • the present system can determine parameters a and b (0 ≤ a ≤ 1 and 0 ≤ b ≤ 1) according to the following equations:
  • the equation described above maps the area restricted by curves 602-608 to a unit rectangle defined by 0 ≤ a ≤ 1, 0 ≤ b ≤ 1.
  • the unit rectangle can represent what the image would look like when "straightened out,” by mapping the area restricted by curves 602-608 to the unit rectangle. That is, the distorted boundary of the image determined according to the area restricted by curves 602-608 can be mapped to the "straightened out" unit rectangle.
  • the zebra frame can be used to determine parameters a and b.
  • the present distortion correction therefore allows the original area having curved or non-linear sub-regions to be transformed according to the unit rectangle using parameters a and b.
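The equations for a and b are not reproduced above; one plausible normalization consistent with the description maps a point (x, y) inside the four boundary curves to the unit rectangle by measuring its fractional position between opposite edges. The boundary functions below are assumed toy shapes for the demo, not the patented formulas.

```python
# Map a point inside the region bounded by x_left(y), x_right(y),
# y_top(x), y_bottom(x) to parameters (a, b) in the unit rectangle.

def to_unit_rectangle(x, y, x_left, x_right, y_top, y_bottom):
    """x_left, x_right are functions of y; y_top, y_bottom are functions
    of x. Returns (a, b) with 0 <= a <= 1 and 0 <= b <= 1 for points
    inside the boundary."""
    a = (x - x_left(y)) / (x_right(y) - x_left(y))
    b = (y - y_top(x)) / (y_bottom(x) - y_top(x))
    return a, b

# Toy boundary: a slightly sheared rectangle.
a, b = to_unit_rectangle(
    55, 30,
    x_left=lambda y: 10 + 0.1 * y,    # left edge leans right as y grows
    x_right=lambda y: 110 + 0.1 * y,
    y_top=lambda x: 0,
    y_bottom=lambda x: 60,
)
print(round(a, 3), round(b, 3))  # 0.42 0.5
```

Sampling the encoded image at positions derived from (a, b) "straightens out" the curved region into the unit rectangle described above.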
  • the zebra frame allows the present system to determine a resolution of the embedded picture, and to divide the image into "bold” or unknown pixels. As described earlier, the color value of some pixels may be unknown. "Bold" pixels allow the present system to determine the proper color value of a pixel according to a voting process. In some embodiments, the voting process can choose the color value of the most frequent pixel close to the "bold" or unknown pixel to be the color value of the "bold" pixel. Otherwise, if the result of the voting is unclear— for example, if there is no pixel with a frequency much larger than the other pixels— the present system can mark the pixel as an erasure (i.e., missing). Therefore, using the two-level error correction described earlier, the pixel can be restored.
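The voting process for a "bold" (unknown) pixel can be sketched as follows. The dominance ratio used to decide whether the winner is "much more frequent" than the runner-up is an assumption for illustration.

```python
# Choose the most frequent color among neighboring decoded pixels, or
# mark the pixel as an erasure when no color clearly dominates.
from collections import Counter

ERASURE = None

def vote_pixel(neighbor_colors, dominance=2.0):
    counts = Counter(neighbor_colors).most_common()
    if len(counts) == 1:
        return counts[0][0]
    (best, n1), (_, n2) = counts[0], counts[1]
    # Accept the winner only if it is clearly more frequent than the
    # runner-up; otherwise the pixel is marked missing (an erasure).
    return best if n1 >= dominance * n2 else ERASURE

print(vote_pixel(["red", "red", "red", "red", "blue"]))     # red
print(vote_pixel(["red", "red", "blue", "blue", "green"]))  # None (erasure)
```

Pixels marked as erasures here are then recoverable by the two-level error correction described earlier.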
  • Image Fragmentation
  • Figure 7 illustrates a block diagram for frame fragmentation for distortion correction in accordance with certain embodiments of the present disclosure.
  • the rectangular cells can be divided into sub-images 704 using separating lines 702.
  • Frame fragmentation or image fragmentation refers to a complementary technique to distortion correction (shown in Figure 6).
  • dividing the image into a set of sub-images 704 allows the present system to work with every sub-image 704, without requiring the distortion compensation technique described earlier because sub-images 704 are sufficiently small that there is not as much distortion as with the entire encoded image.
  • sub-images 704 can be separated with separating lines 702.
  • separating lines 702 can be black lines, which can also become part of the zebra border.
  • the present system can use the same encoding and decoding techniques described earlier (shown in Figures 3 and 4) for encoding and decoding of sub-images 704.
  • the present system can leverage the two-level error correction, scrambler, etc.
  • the present system can skip the distortion correction described earlier (shown in Figure 6).
  • the present system can leverage the relatively smaller size of sub-images 704 to skip non- linear distortion compensation of each sub-image 704. Accordingly, the present system can decode sub-images 704 with better accuracy.
  • Figure 8 illustrates a block diagram of image based information exchange from one device to another device in accordance with certain aspects of the present disclosure.
  • Figure 8 includes smartphones 802a-b, and encoded images 804a-b.
  • Some smartphones currently have large displays and many are evolving to include even larger displays. These displays can be used for information exchange, for example using color-space encoded images 804a-b.
  • the color-space encoded images 804a-b can be encoded according to the encoding process described earlier, in Figure 3. Accordingly, smartphones 802a-b are able to exchange information using the encoded images.
  • an embedded camera on smartphone 802b can register or capture an encoded image 804a being displayed on smartphone 802a.
  • Embedded cameras used on devices such as smartphones 802a-b are improving. As described herein, the resolution per frame as captured by embedded cameras can be less important, because the pixel number limitation can be avoided. The pixel number limitation is described in further detail below.
  • smartphones 802a-b can have hardware modules or software applications loaded thereon, for example for encoding or decoding according to the processes described in Figures 3 and 4.
  • a first user can create machine data or otherwise store machine data on a device such as a smartphone. The first user can then decide to exchange related information between devices, for example with a second user.
  • the machine data created by the first user or stored on the phone can be encoded according to the process described in Figure 3, and then displayed on the smartphone display for exchanging information with smartphone 802b.
  • smartphone 802b can acquire the encoded image through its embedded camera.
  • this acquisition does not require smartphone 802b to download or acquire encoded image 804a through electromagnetic radio transmissions such as a wireless or cellular network.
  • Smartphone 802b can perform the decoding as described in Figure 4 to determine the machine data from the first user.
  • the decoding can be performed in hardware or in software installed on smartphone 802b.
  • the machine data encoded by the first user can be displayed on the screen or otherwise stored in smartphone 802b.
  • security can be added to the encoded image. For example, when the present system compresses or archives the machine data, the present system can add a password or otherwise encrypt the relevant machine data.
  • smartphone 802a can partition or divide the information to be transmitted.
  • smartphone 802a can transmit a first frame including only partial information.
  • the first frame can include the color calibration frame for smartphone 802b to calibrate colors in preparation for decoding encoded image 804a, as described earlier in connection with Figure 5.
  • the first frame can include the main frames (i.e., the black frame and/or the white frame) to correct geometrical distortions, etc., as described earlier in connection with Figure 6.
  • smartphone 802a can construct and generate special frames for partial transmission.
  • the special frames can include the color calibration frame, the main frames, or the zebra frame, to obtain a stable information and data exchange.
  • smartphones 802a-b can be placed substantially against each other, for example by aligning the embedded camera of smartphone 802b with the display of smartphone 802a displaying encoded image 804a. Accordingly, smartphones 802a-b can sense their proximity to automatically establish the data transmission channel described earlier. In further embodiments, smartphones 802a-b can swap roles, changing which smartphone represents the source for sending machine data and which smartphone represents the sink for receiving machine data.
  • Figure 9 illustrates a block diagram of an example of information exchange of a color space encoded image in accordance with certain embodiments of the present disclosure.
  • Figure 9 includes a number of devices including smartphones 906a, 912a, camera 906b, 912b, palmtop or handheld device 906c, 912c, and tablet or e-reader 906d, 912d.
  • the present system can include information exchange between a number of devices which can register, capture, or "see" an encoded image 102 on a display 902.
  • the devices which can register encoded image 102 include devices with embedded cameras 904.
  • the devices can include smartphone 912a, camera 912b, palmtop or handheld device 912c, and tablet or e-reader 912d.
  • these devices can be similar to devices 106c-f (shown in Figure 1). These registering devices can register or capture the corresponding data source.
  • the information exchange described herein can include data distribution including multiple devices. Accordingly, the information exchange described herein can include point-to-multipoint data distribution, in which a number of devices all capture a single encoded image 102.
  • the present system can include sequential frame transmission.
  • the encoded images can include multiple video frames, such as video frames 910a-c. Accordingly, the present system allows exchange of video files such as machine data including movies or films 908.
  • the sequential frame transmission described herein can be used for machine data other than video files.
  • multiple encoded images can be used to encode large files and/or large amounts of information.
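Using multiple encoded images for large files implies splitting the data into sequential frames that a receiver can reassemble. A minimal sketch follows; the frame-header layout (frame index plus total count) is an assumption, not the patented format.

```python
# Split a payload into fixed-size frames with a 4-byte header, then
# reassemble the original payload from frames received in any order.
import struct

def split_into_frames(data, frame_payload_size):
    total = -(-len(data) // frame_payload_size)  # ceiling division
    frames = []
    for i in range(total):
        chunk = data[i * frame_payload_size:(i + 1) * frame_payload_size]
        frames.append(struct.pack(">HH", i, total) + chunk)
    return frames

def reassemble(frames):
    ordered = sorted(frames, key=lambda f: struct.unpack(">HH", f[:4])[0])
    return b"".join(f[4:] for f in ordered)

frames = split_into_frames(b"a large machine-data payload", 10)
print(len(frames))         # 3
print(reassemble(frames))  # original payload restored
```

Each frame's bytes would then be encoded as its own color space image (or video frame), as in the sequential transmission described above.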
  • Smartphones such as smartphones 906a, 912a are capable of recording full high definition (HD) video.
  • HD video can include video at 1920x1080 pixels, and at least 25 frames per second (fps).
  • smartphones 912a can often include high-performance cameras such as for capturing photographs or video.
  • smartphone displays continue to improve, allowing for even more resolution for encoding machine data, and for displaying and transmitting encoded images.
  • Smartphones often include a second camera, for example on the front of the smartphone along with the display, for use in videoconferencing.
  • the present system can allow two devices to be placed with their displays next to or adjacent with each other. In such an orientation, embodiments of the present disclosure can use the front camera.
  • having devices paired as described herein can allow the devices to exchange information interactively.
  • a first device such as tablet 906d can display a first frame.
  • the first frame can include color calibration information.
  • the second device can encode an acknowledgement in an encoded image of its own (not shown), and send the acknowledgement for the first device to decode.
  • the second device can send a request for the next frame or additional partial information.
  • the present system can send a number of encoded images, for example in a video-like transmission.
  • This video-like transmission can be used for two- way interactive or one-way directional information or data exchange.
  • Figure 10 illustrates information exchange using a photo camera to display a color space encoded image in accordance with certain embodiments of the present invention.
  • Figure 10 illustrates a photo camera 1002 and a smartphone 802b.
  • Photo camera 1002 can display an encoded image 804a.
  • Current digital cameras such as photo camera 1002 have good optics in their camera lenses.
  • Current digital cameras often have good displays as well, which can be used for displaying encoded image 804a.
  • Camera manufacturers are also adding wireless connectivity such as Wi-Fi or cellular connectivity, or even GPS receivers to make cameras "connected.” Cameras can also include digital signal processors used for image data processing.
  • the present system includes digital signal processors modified in photo camera such as photo camera 1002.
  • the modifications can include hardware modules or embedded software such as firmware for encoding or decoding color space images in accordance with the processes described in Figures 3 and 4.
  • photo cameras can be modified for information or data exchange between cameras.
  • a camera is a good "image" acquisition, registration, or capture device, due to high quality optics and ability to work under difficult light conditions.
  • Information acquired by a camera can often be stored on a memory card, for example flash memory.
  • the present system can allow a user to insert a memory card into a compatible device, such as a PC or notebook computer. Modules on the PC or notebook computer can decode a color space encoded image without requiring radio-related approaches such as using a Wi-Fi or cellular network.
  • an embedded camera on a PC or tablet can be used as an image acquisition, registration, or capture device.
  • the present system can use the display on the PC or tablet as a data source device as well, for example to display encoded image 804. Accordingly, the decoding modules described earlier can be utilized on corresponding decoding devices.
  • Figures 3 and 4 include processes that, in various embodiments, are carried out by a processor under the control of computer-readable and computer-executable instructions.
  • Embodiments of the present disclosure may thus be stored as non-transitory computer-readable media or computer-executable instructions including, but not limited to, firmware updates, software update packages, or hardware (e.g., ROM).
  • terms describing the operations herein, such as "receiving," "decoding," or "displaying," refer to actions and processes of a computer system or similar electronic computing device or processor.
  • the computer system or similar electronic computing device manipulates and transforms data represented as physical (electronic) quantities within the system memories, registers or other such information storage, transmission or display devices.
  • the present systems and methods can be implemented in a variety of architectures and configurations.
  • the present systems and methods can be implemented as part of a distributed computing environment, a cloud computing environment, a client server environment, etc.
  • the embodiments described herein may be discussed in the general context of computer-executable instructions residing on some form of non-transitory computer-readable storage medium, such as program modules, executed by one or more computers, computing devices, or other devices.
  • non-limiting examples of computer-readable storage media may include storage media and communication media.
  • program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
  • the functionality of the program modules may be combined or distributed as desired in various embodiments.

Abstract

Embodiments of the present invention include systems, methods, and non-transitory computer program products for information exchange using color space encoded images. A color space encoded image can be displayed, for example on media such as posters, billboards, or paper, or on a display of a first device such as smartphone displays, palmtop displays, camera displays, tablet displays, or e-reader displays. A second device can acquire the displayed encoded image, for example by photographing the image. The second device can decode the color space encoded image or transfer the color space encoded image to a device that decodes the image.

Description

INFORMATION EXCHANGE USING COLOR SPACE ENCODED IMAGE
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of the following applications, the entire contents of which are incorporated by reference herein:
U.S. Patent Application Ser. No. 13/841,338, entitled "Information Exchange Using Color Space Encoded Image" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/843,132, entitled "Information Exchange Display Using Color Space Encoded Image" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/842,817, entitled "Information Exchange Using Photo Camera as Display for Color Space Encoded Image" and filed March 15, 2013;
[0002] This application is related to the following applications:
U.S. Patent Application Ser. No. 13/836,447, entitled "Data Storage and Exchange Device for Color Space Encoded Images" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/842,932, entitled "Broadcasting Independent of Network Availability Using Color Space Encoded Image" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/837,155, entitled "Image Encoding and Decoding Using Color Space" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/837,895, entitled "Color Restoration for Color Space Encoded Image" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/842,856, entitled "Two-Level Error Correcting Codes for Color Space Encoded Image" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/843,111, entitled "Frame of Color Space Encoded Image for Distortion Correction" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/844,184, entitled "Large Documents Using Color Space Encoded Image" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/844,207, entitled "Combination Book With E- Book Using Color Space Encoded Image" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/837,769, entitled "Image Fragmentation for Distortion Correction of Color Space Encoded Image" and filed March 15, 2013.
U.S. Patent Application Ser. No. 13/844,000, entitled "Book Using Color Space Encoded Image" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/844,127, entitled "Data Backup Using Color Space Encoded Image" and filed March 15, 2013;
U.S. Patent Application Ser. No. 13/844,168, entitled "Self-Publication Using Color Space Encoded Image" and filed March 15, 2013; and
U.S. Patent Application Ser. No. 13/840,504, entitled "Information Broadcast Using Color Space Encoded Image" and filed March 15, 2013.
FIELD OF THE DISCLOSURE
[0003] The present disclosure relates to a data storage and exchange device for color space encoded images, and methods of encoding and decoding color space encoded images. More particularly, the present disclosure relates to encoding and decoding of machine-based data using high capacity multi-colored composite two-dimensional pictures having different symbols organized in specific order and sets in a color space.
BACKGROUND
[0004] With smartphones and tablets having become widely deployed as well as 3G and Wi-Fi based access to Internet, data capture technologies work as an interface between databases and user devices. Example data capture technologies include bar codes, QR codes, Radio Frequency Identification (RFID) and Near Field Communication (NFC). Most of these technologies are used as either a visible or an invisible tag to connect these databases and user devices. The introduction of cloud storage adds extended use of data exchange between data storage and end user devices.
[0005] Many households include many different devices simultaneously connected to Wi-Fi access points. This ubiquitous Wi-Fi usage increases electromagnetic fields around the home environment. Despite the usefulness of Wi-Fi, long term consequences for the human body and general health are not clear.
[0006] End user devices such as smartphones and tablets are frequently equipped with photo and/or video cameras. These cameras can be used for capturing information presented in different visual forms. The other data capture technologies described earlier, including bar codes and especially two dimensional (2D) bar codes such as QR codes, have attracted the attention of many researchers and companies due to the potential for inexpensive operation.
[0007] Bar codes have been used for mobile applications to deliver a multitude of different mobile services over mobile phones and other mobile communication or computing devices. Such applications range from providing Uniform Resource Locator (URL) information to link a mobile device to the Internet, through to using a bar code as a form of e-ticket for airlines or event admissions. Hence, there is an ever growing demand for higher capacity bar codes, suited for robust application on mobile devices. Traditional approaches to higher capacity bar codes include: (i) using a colored symbol set and (ii) increasing the size of the bar code. There are known limitations for either approach, especially related to mobile devices with compromised, low resolution cameras. The traditional data capacity of 1D and 2D bar codes is severely limited. This severely limited data capacity constrains possible applications of 1D and 2D bar codes, and their primary task is simply linking camera phones to designated Web sites. Additional tasks may then be performed based on the Web site. This operation again is based on use of Wi-Fi or 3G connectivity.
[0008] The maximum data capacity of 2D bar codes has been improving over the recent years, resulting in the introduction of newer applications.
[0009] Presently, the use of color-based bar code and other symbolic encoding in color space technologies using mobile devices such as camera mobile phones has been limited by the physical size of the actual bar code symbol in relation to the information encoded within, and also by the image capturing mechanism on mobile devices to discriminate or resolve effectively a greater multitude of colors, especially under varied lighting conditions by an inexperienced, lay user. Another limiting factor has been color fidelity of hard copy output devices, such as color printers, in reproducing the true colors of such color based bar code or symbol sets in a color space.
[0010] While bar codes can be used to provide 2D encoded data, they have not been used to provide real storage devices and media.
SUMMARY
[0011] Embodiments of the present invention include systems, methods, and non-transitory computer program products for information exchange using color space encoded images. A color space encoded image can be displayed, for example on media such as posters, billboards, or paper, or on a display of a first device such as smartphone displays, palmtop displays, camera displays, tablet displays, or e-reader displays. A second device can acquire the displayed encoded image, for example by photographing the image. The second device can decode the color space encoded image or transfer the color space encoded image to a device that decodes the image.
[0012] In accordance with the disclosed subject matter, methods, systems, and non-transitory computer program products are provided for exchanging information.
[0013] Certain embodiments include methods for exchanging information. The method includes receiving information for exchanging and creating an image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image. The method also includes displaying the encoded image for one or more devices to decode.
[0014] Certain embodiments include systems for exchanging information. The system includes at least one processor configured to receive information for exchanging and create an image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers
corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image. The at least one processor is further configured to display the encoded image for one or more devices to decode.
[0015] Certain embodiments include non-transitory computer program products for exchanging information. The non-transitory computer program product is tangibly embodied in a computer-readable medium. The non-transitory computer program product includes instructions operable to cause a data processing apparatus to receive information for exchanging and create an image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image. The instructions are also operable to cause a data processing apparatus to display the encoded image for one or more devices to decode.
[0016] Embodiments of the present invention include systems, methods, and non-transitory computer program products for information exchange using color space encoded images. A color space encoded image can be displayed, for example on media such as posters, billboards, or paper, or on a display of a first device such as smartphone displays, palmtop displays, camera displays, tablet displays, or e-reader displays. A second device can acquire the displayed encoded image, for example by photographing the image. The second device can decode the color space encoded image or transfer the color space encoded image to a device that decodes the image.
[0017] In accordance with the disclosed subject matter, methods, devices, and non-transitory computer program products are provided for exchanging information between a first device and a second device.
[0018] Certain embodiments include a computer-implemented method for exchanging information between a first device and a second device. The method includes receiving, at the first device, information for exchanging with the second device and creating a first image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers
corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the encoded image, the one or more additional cells for assisting in decoding the encoded image. The method also includes displaying the first encoded image for a second device to decode and in response to the displaying the first encoded image, receiving a second encoded image from the second device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
[0019] Certain embodiments include a first device for exchanging information with a second device. The first device includes memory, storage, and at least one processor configured to receive, at the first device, information for exchanging with the second device. The at least one processor is further configured to create a first image encoding the information by creating colored cells within a color space to provide an encoded
representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the encoded image, the one or more additional cells for assisting in decoding the encoded image. The at least one processor is also configured to display the first encoded image for a second device to decode and in response to the displaying the first encoded image, receive a second encoded image from the second device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
[0020] The embodiments described herein can include additional aspects of the present invention. For example, the method can further include locating the second encoded image by locating one or more additional cells adjoining the second encoded image and decoding the response based on the colored cells and the one or more additional cells adjoining the second encoded image. The at least one processor is further configured to locate the second encoded image by locating one or more additional cells adjoining the second encoded image and decode the response based on the colored cells and the one or more additional cells adjoining the second encoded image.
[0021] Certain embodiments include a computer-implemented method for exchanging information between a first device and a second device. The method includes receiving, at the second device, a first encoded image from the first device, the first encoded image having colored cells corresponding to a stream of integers encoding data and decoding the first encoded image by locating the first encoded image by locating one or more additional cells adjoining the first encoded image and decoding a request from the first device based on the colored cells and the one or more additional cells adjoining the first encoded image. The method also includes determining a response to the first encoded image and creating a second encoded image encoding the response by creating colored cells within a color space to provide an encoded representation of the response in the second encoded image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the second encoded image, the one or more additional cells for assisting in decoding the second encoded image. The method further includes displaying the second encoded image for the first device to decode.
[0022] Certain embodiments include a first device for exchanging information with a second device. The first device includes memory, storage, and at least one processor configured to receive, at the first device, a first encoded image from the second device. The first encoded image has colored cells corresponding to a stream of integers encoding data. The at least one processor is also configured to decode the first encoded image by locating the first encoded image by locating one or more additional cells adjoining the first encoded image and decoding a request from the second device based on the colored cells and the one or more additional cells adjoining the first encoded image. The at least one processor is further configured to determine a response to the first encoded image and create a second encoded image encoding the response by creating colored cells within a color space to provide an encoded representation of the response in the second encoded image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the second encoded image, the one or more additional cells for assisting in decoding the second encoded image. The at least one processor is also configured to display the second encoded image for the second device to decode.
[0023] The embodiments described herein can include additional aspects of the present invention. For example, the method can also include receiving, at the first device, the second encoded image to decode, the second encoded image having colored cells corresponding to a stream of integers encoding the response, locating the second encoded image by locating one or more additional cells adjoining the second encoded image, and decoding data based on the colored cells and the one or more additional cells adjoining the second encoded image. The first device can be further configured to receive, at the first device, the second encoded image to decode, the second encoded image having colored cells corresponding to a stream of integers encoding the response, locate the second encoded image by locating one or more additional cells adjoining the second encoded image, and decode data based on the colored cells and the one or more additional cells adjoining the second encoded image.
[0024] Embodiments of the present invention include systems, methods, and non-transitory computer program products for information exchange using color space encoded images. A color space encoded image can be displayed, for example on media such as posters, billboards, or paper, or on a display of a first device such as smartphone displays, palmtop displays, camera displays, tablet displays, or e-reader displays. A second device can acquire the displayed encoded image, for example by photographing the image. The second device can decode the color space encoded image or transfer the color space encoded image to a device that decodes the image.
[0025] In accordance with the disclosed subject matter, devices, methods, and non-transitory computer program products are provided for exchanging information between a camera and a device.
[0026] Certain embodiments include methods of exchanging information between a camera and a device. A method includes receiving, at the camera, information for exchange with the device and creating a first image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received content to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the encoded image, the one or more additional cells for assisting in decoding the encoded image. The method also includes displaying the first encoded image for the device to decode and in response to the displaying the first encoded image, receiving a second encoded image from the device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
[0027] Certain embodiments include methods of exchanging information between a camera and a device. A method includes receiving, at the device, a first encoded image from the camera, the first encoded image having colored cells corresponding to a stream of integers encoding data. The method also includes decoding the first encoded image by locating the first encoded image by locating one or more additional cells adjoining the first encoded image and decoding the data based on the colored cells and the one or more additional cells adjoining the first encoded image. The method further includes determining a response to the first encoded image and creating a second image encoding the response by creating colored cells within a color space to provide an encoded representation of the response in an image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the image. The method also includes displaying the second encoded image for the camera to decode.
[0028] Certain embodiments include camera systems for exchanging information with a device. The camera system includes memory, storage, and at least one processor configured to receive, at the camera system, information for exchange with the device. The at least one processor is also configured to create a first image encoding the information by creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received content to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the encoded image, the one or more additional cells for assisting in decoding the encoded image. The at least one processor is further configured to display the first encoded image for the device to decode and in response to the displaying the first encoded image, receive a second encoded image from the device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
[0029] Certain embodiments include devices for exchanging information with a camera. A device includes memory, storage, and at least one processor configured to receive, at the device, a first encoded image from the camera, the first encoded image having colored cells corresponding to a stream of integers encoding data. The at least one processor is also configured to decode the first encoded image by locating the first encoded image by locating one or more additional cells adjoining the first encoded image and decoding the data based on the colored cells and the one or more additional cells adjoining the first encoded image. The at least one processor is further configured to determine a response to the first encoded image and create a second image encoding the response by creating colored cells within a color space to provide an encoded representation of the response in an image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image and creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the image. The at least one processor is also configured to display the second encoded image for the camera to decode.
[0030] The embodiments described herein can include additional aspects of the present invention. For example, the device can comprise at least one of a computer, a phone, a smartphone, a palmtop device, a handheld device, a tablet, an e-reader, and a scanner.
BRIEF DESCRIPTION OF THE DRAWINGS
[0031] Various objects, features, and advantages of the present disclosure can be more fully appreciated with reference to the following detailed description when considered in connection with the following drawings, in which like reference numerals identify like elements. The following drawings are for the purpose of illustration only and are not intended to be limiting of the invention, the scope of which is set forth in the claims that follow.
[0032] FIG. 1 illustrates a system for storing, encoding, and decoding a 2D color space image in accordance with certain embodiments of the present disclosure.
[0033] FIGS. 2A-2B illustrate the 2D color space encoded image in accordance with certain embodiments of the present disclosure.
[0034] FIG. 3 illustrates a method that the system performs for encoding an image in accordance with certain embodiments of the present disclosure.
[0035] FIG. 4 illustrates a method that the system performs for decoding an image in accordance with certain embodiments of the present disclosure.

[0036] FIG. 5 illustrates a block diagram of color calibration cells for restoration of proper color based on distinguishable color samples in accordance with certain embodiments of the present disclosure.
[0037] FIG. 6 illustrates a block diagram of a frame arrangement for geometrical distortion correction of an encoded image in accordance with certain embodiments of the present disclosure.
[0038] FIG. 7 illustrates a block diagram for frame fragmentation for distortion correction in accordance with certain embodiments of the present disclosure.
[0039] FIG. 8 illustrates a block diagram of image based information exchange from one device to another device in accordance with certain aspects of the present disclosure.
[0040] FIG. 9 illustrates a block diagram of an example of information exchange of a color space encoded image in accordance with certain embodiments of the present disclosure.
[0041] FIG. 10 illustrates information exchange using a photo camera to display a color space encoded image in accordance with certain embodiments of the present invention.
DETAILED DESCRIPTION
[0042] The present systems and methods allow for use of 2D color-space codes to encode images with greater data capacity. The 2D color-space encoded images described herein can provide a wide variety of applications, regardless of network connectivity. The present systems and methods allow a sufficient amount of data or even an entire database to be encoded and placed in a storage device using the 2D images encoded and organized by color-space codes as described herein.
[0043] The present disclosure has utility in, for example, data back-up storage
applications or data exchange using mobile devices to provide color two-dimensional pictures for processing.
[0044] Figure 1 illustrates a system 100 for storing, encoding, and decoding a 2D color space image in accordance with certain embodiments of the present disclosure. System 100 includes a 2D color space image 102. System 100 can encode any type of desired machine-based data. Non-limiting examples of machine-based data include text 104a, books 104b (containing, for example, text and images), databases 104c, photographs 104d or other artwork, images, video, film, or movies, music 104e, or any other type of binary data 104f.
[0045] In some embodiments, capturing of desired data can be done using a conventional camera or scanner, or a smartphone or tablet camera.

[0046] For example, a user can encode selected information into a 2D color-space coded image and print the image on any desired media (e.g., paper, textile, plastic, etc.). The 2D color-space coded image can also be depicted on any display, including posters, TV screens or computer monitors, as well as displayed on mobile devices such as smartphones or tablets. Digital photo cameras can be used as display devices as well.
[0047] Using a capturing device (e.g., a conventional camera or scanner or a smartphone or tablet camera) with an external PC or internal processor having image reading or decoding software, the image code can be decoded and the encoded data can be read. The present systems and methods allow information to be retrieved, decoded, and saved in the memory of a device or presented on a display for further use. For example, the decoded data can be read on a laptop computer 106a or a desktop computer 106b, a smart phone 106c, a photo camera 106d, a palmtop or handheld device 106e, or a tablet 106f. All this can be done without manual work. No network connectivity or additional cost is required.
High volume Data storage and exchange device and encoding/decoding methods
[0048] The present systems and methods relate to a data storage device for storing 2D color space encoded images, methods of encoding and decoding the 2D color space encoded images, and application of the 2D color space encoded images to data exchange.
[0049] More particularly, the present disclosure relates to encoding, transmission, and decoding of machine-based data using an appropriately formatted high capacity color image. The present systems and methods can be utilized, for example, for compact backup of printed materials, and/or data exchange between a color-image-displaying device and a plurality of image-reading devices.
[0050] Figures 2A-2B illustrate the 2D color space encoded image 102 in accordance with certain embodiments of the present disclosure. As illustrated in Figure 2A, encoded image 102 includes a number of rectangular cells 202 of different colors. In some embodiments, encoded image 102 can include a black frame 204, a white frame 206, a "zebra frame" 208 having alternating colors, and color calibration cells 210.
[0051] Figure 2B illustrates a close-up view of the black frame 204, the white frame 206, the zebra frame 208, and the color calibration cells 210 in accordance with certain
embodiments of the present disclosure. Black frame 204 and white frame 206 can be used for rough localization of the data image including rectangular cells 202, determination of geometric distortion of the data image, and estimation of cell size and scale of rectangular cells 202. Zebra frame 208 can be used for further calibration of cell size for rectangular cells 202. Color calibration cells 210 can be used for determining a number of colors Nc used for encoding the data, and for indexing the colors to map the colors to numbers for encoding and decoding.
[0052] In some embodiments, rectangular cells 202 can use more than eight different colors. The number of colors can be expressed as Nc, for example Nc > 8. Preferably, the number of colors is selected to be a power of two.
[0053] Figure 3 illustrates a method 300 that the system performs for encoding an image in accordance with certain embodiments of the present disclosure.
[0054] The choice of appropriate color-difference metrics can help achieve stable color resolution for image-source media and image-reading devices. Most imaging devices operating in the visible spectrum are adapted to human eye color resolution. A color space describes color capabilities of a particular device by relating numbers to actual colors.
Embodiments of the present disclosure utilize metrics based on low-cost non-Euclidean distance in red green blue (RGB) color space.
[0055] In some embodiments, the present system uses color space metrics which are close to Commission Internationale de l'Éclairage (CIE) 1976 (L*,u*,v*) color space metrics. The CIELUV color space distributes colors roughly proportionally to perceived color
differences. In other embodiments, the present systems and methods are able to use any color space, including the CIE 1931 XYZ color space, CIELAB color space, and CIEUVW color space. Although embodiments of the present system are based on the CIELUV color space, according to experimentation, the color space used in certain embodiments is advantageously less sensitive to a color temperature of the relevant image displaying media or device, and advantageously less sensitive to external illumination. Assuming R, G, and B are one-byte (i.e., 0..255 or 0x0..0xFF) normalized intensities of the red, green, and blue color components respectively, a color difference ΔC between colors C1 = (R1, G1, B1) and C2 = (R2, G2, B2) can be given by the following equations:

r̄ = (R1 + R2) / 2

ΔR = R2 − R1

ΔG = G2 − G1

ΔB = B2 − B1

ΔC = √( (2 + r̄/256)·ΔR² + 4·ΔG² + (2 + (255 − r̄)/256)·ΔB² )        Equation (1)

where r̄ denotes the average red component value of C1 and C2.
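For illustration, the low-cost weighted RGB color-difference metric of Equation (1) (the widely used "redmean" approximation) can be sketched in Python as follows; the function name is illustrative and not part of the disclosure:

```python
import math

def color_distance(c1, c2):
    """Perceptually weighted distance between two RGB colors
    (components 0..255). The red-mean weighting tracks human
    color perception more closely than plain Euclidean RGB
    distance, at low computational cost."""
    r1, g1, b1 = c1
    r2, g2, b2 = c2
    r_mean = (r1 + r2) / 2.0
    dr, dg, db = r2 - r1, g2 - g1, b2 - b1
    return math.sqrt((2 + r_mean / 256) * dr * dr
                     + 4 * dg * dg
                     + (2 + (255 - r_mean) / 256) * db * db)
```

Identical colors yield a distance of zero, and the metric is symmetric in its arguments.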
[0056] The present system first receives machine data for encoding into an image. For example, the machine data can be contained in a file. The present system compresses the file (step 302). For example, the machine data can be compressed into a binary compressed file such as zip, rar, arc or any other archive format file.
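The compression step (302) can be sketched with Python's standard zlib module; zlib stands in here for whichever archive format (zip, rar, arc, etc.) an implementation actually uses:

```python
import zlib

def compress_payload(data: bytes) -> bytes:
    # Compress the machine data before color encoding; any
    # archive format would serve, zlib is an illustrative stand-in.
    return zlib.compress(data, level=9)

def decompress_payload(blob: bytes) -> bytes:
    # Inverse step, performed after the image has been decoded.
    return zlib.decompress(blob)
```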
[0057] The present system then determines an appropriate color encoding, illustrated by box 310. For example, the present system maps the resulting code sequence to a stream of integers from 0 to Nc-1, inclusive. Using the stream of integers, the present system creates a bitmap with frame formatting. Generation of frame formatting is described in further detail later. The present system fills the interior of the frames with color bitmap cells, encoded as described below. In some embodiments, the encoding can use error correction. A non-limiting example of error correction is the two-stage error correction described herein. In some embodiments, the encoding can use scrambling. In some embodiments, the encoding can use error correction in combination with scrambling. The present system determines the color of each cell based on the color calibration cells. The resulting encoded image can then be sent to any display, including a color printer, display, or other color image output device.
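The mapping of the code sequence to a stream of integers from 0 to Nc-1 can be sketched as below, under the assumption that Nc is a power of two whose bit width divides 8 (e.g., Nc = 16 gives four bits per cell); the names are illustrative:

```python
def bytes_to_cell_stream(data: bytes, nc: int = 16):
    """Split each byte into fixed-width bit groups, yielding one
    integer in 0..nc-1 per color cell (nc = 16 -> 4 bits/cell)."""
    bits_per_cell = nc.bit_length() - 1   # log2(nc) for powers of two
    assert 1 << bits_per_cell == nc, "nc must be a power of two"
    assert 8 % bits_per_cell == 0, "cell width must divide 8 bits"
    stream = []
    for byte in data:
        for shift in range(8 - bits_per_cell, -1, -bits_per_cell):
            stream.append((byte >> shift) & (nc - 1))
    return stream
```

For example, the byte 0xAB with Nc = 16 becomes the two cell values 10 and 11.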
[0058] In some embodiments, the present system then performs a first level of error correction (step 304). One benefit of error correction is that the present system is able to decode an image properly, even if a portion of the image is damaged. For example, the present system can determine a header containing a cyclic redundancy check (CRC) error-detecting code, file name, format information, and sequence number (e.g., if the machine data and resulting encoded image represent only part of a larger data stream such as a video stream). In some embodiments, the present system can use any error detecting code such as repetition codes, addition of parity bits, checksums, hash functions, or more powerful error correcting codes.
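One possible layout for such a header is sketched below; the field widths, ordering, and names are illustrative assumptions rather than the format actually claimed:

```python
import struct
import zlib

def build_header(payload: bytes, filename: str, seq: int) -> bytes:
    """Prepend a CRC32 of the payload, a sequence number (for
    images carrying part of a larger stream), and the file name.
    Illustrative layout: 4-byte CRC, 2-byte sequence number,
    1-byte name length, then the UTF-8 encoded name."""
    name = filename.encode("utf-8")
    return struct.pack(">IHB", zlib.crc32(payload), seq, len(name)) + name

def parse_header(blob: bytes):
    """Inverse of build_header; returns (crc, seq, name, payload)."""
    crc, seq, name_len = struct.unpack(">IHB", blob[:7])
    name = blob[7:7 + name_len].decode("utf-8")
    return crc, seq, name, blob[7 + name_len:]
```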
[0059] In some embodiments, the present system can use a Reed-Solomon encoder of high order. A Reed-Solomon error correction code is a cyclic error-correcting code capable of detecting and correcting multiple errors in a bit stream. For example, the present system can use 8-bit words and blocks of size 255 bytes, if Nc = 16. In further embodiments, the present system can use other error-correcting codes such as low-density parity-check (LDPC) codes or turbo codes. Reed-Solomon error correction codes are capable of detecting errors, erasures, and combinations thereof. In some embodiments, a word size for this high order level of Reed-Solomon encoding can be equal to k*log2(Nc) rounded down, where k > 1. As described herein, in some embodiments the two-stage error coding can perform high order Reed-Solomon encoding in a first stage, and low order Reed-Solomon encoding in a second stage. The second stage can also include smaller words. For example, the word size can be equal to log2(Nc) bits, rounded down.
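The word-size rule stated above (k*log2(Nc) rounded down, with k > 1) can be computed as follows; with Nc = 16 and k = 2 it reproduces the 8-bit words mentioned in the text:

```python
import math

def rs_word_size(nc: int, k: float = 2.0) -> int:
    """Word size in bits for the high-order Reed-Solomon stage:
    floor(k * log2(nc)) with k > 1. The low-order second stage
    would instead use floor(log2(nc)) bits."""
    assert k > 1, "the first stage requires k > 1"
    return math.floor(k * math.log2(nc))
```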
[0060] In some embodiments, the present system then scrambles the obtained code (step 306). For example, the resulting code can be interleaved using a predefined pseudorandom sequence. The present system scrambles the obtained code to spread out potential errors. Accordingly, the two-level error correction described herein can perform better. That is, when registering or capturing an image for decoding, there can be color distortions due to non-uniform lighting, or damage on the media on which the image is printed. For example, the media can have spots, be wrinkled, and the like. The scrambling provides that "bad" color cells with uncertain data can be distributed uniformly (i.e., independently) into different encoding blocks, thereby decreasing the number of errors per block. Because error correction can have upper limits of correctable errors, the present spreading improves the effectiveness of the error correction.
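Interleaving with a predefined pseudorandom sequence can be sketched as a seeded permutation; the seed value and function names here are illustrative:

```python
import random

def scramble(stream, seed=0x5EED):
    """Interleave a symbol stream with a predefined pseudorandom
    permutation so that burst damage (spots, wrinkles, uneven
    lighting) is spread uniformly across error-correction blocks."""
    order = list(range(len(stream)))
    random.Random(seed).shuffle(order)
    return [stream[i] for i in order]

def descramble(scrambled, seed=0x5EED):
    """Invert scramble() by regenerating the same permutation."""
    order = list(range(len(scrambled)))
    random.Random(seed).shuffle(order)
    out = [None] * len(scrambled)
    for pos, i in enumerate(order):
        out[i] = scrambled[pos]
    return out
```

Both sides must agree on the seed, mirroring the "predefined" sequence in the text.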
[0061] In some embodiments, the present system then performs a second level of error correction (step 308). For example, the resulting scrambled code is encoded using a Reed-Solomon encoder with a smaller word size. For the second level of error correction, the word size can be selected to be smaller than the word size used in the first level of error correction. As described earlier, this two-stage error correction allows the present system to mark unrecoverable blocks of data during a first stage as erasures. The error correction described herein finds an index location of an error word inside a block, and also corrects the error word to its correct or true value. The two-stage error correction allows the present system to correct twice as many erasures as errors, as described in more detail later.
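The "twice as many erasures as errors" property follows from the standard Reed-Solomon correction bound: a code with n total symbols and k data symbols can repair any combination of e errors and s erasures satisfying 2e + s <= n - k. A small check of that bound:

```python
def rs_can_correct(n: int, k: int, errors: int, erasures: int) -> bool:
    """A Reed-Solomon code with n total and k data symbols has
    n - k parity symbols and repairs any mix of errors and
    erasures with 2*errors + erasures <= n - k; hence twice as
    many erasures as errors are correctable."""
    return 2 * errors + erasures <= n - k

# e.g. RS(255, 223): 32 parity symbols -> up to 16 errors,
# or up to 32 erasures.
```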
Registration
[0062] The present system is able to use any kind of high definition (HD) color registration device to register an encoded image. Non-limiting example registration devices can include a digital camera, a smartphone camera, a tablet camera, a video camera, a webcam, a scanner (to scan a hard copy image), etc.
Decoding
[0063] Figure 4 illustrates a method 400 that the system performs for decoding an image in accordance with certain embodiments of the present disclosure.

[0064] Once an encoded image is registered, the present system can copy a registered image to any processing device containing a decoding module. An example processing device can include a computer, containing a decoding module with computer-readable instructions for decoding, where the computer-readable instructions are stored on a non-transitory computer storage medium, or containing hardware having hard-coded or embedded decoding instructions stored thereon.
[0065] Generally, decoding method 400 includes two high-level processes: (1) image recognition, and (2) two-stage Reed-Solomon decoding. In some embodiments, the present system uses Viterbi decoding as a Reed-Solomon decoder.
[0066] The present system locates the main frame (step 402). For example, the present system can locate the black frame and the white frame vertices. The present system can further determine the middles of the corresponding edges of the black frame and the white frame. If the main frame is not found (step 404: No), the present system returns an error (step 406). For example, the error can be that the image cannot be decoded, or the error can be more specific such as that the main frame cannot be found.
[0067] If the main frame is found (step 404: Yes), the present system corrects geometric distortion based on the main frame and the zebra frame (step 412). Geometric distortion can occur when a user uses a camera lens, for example on a smartphone or tablet camera, or on a digital camera, to decode an image. Accounting for distortion is described in further detail later, in connection with Figure 6.
[0068] The present system locates the color calibration cells (step 408). As described earlier in connection with Figures 2A-2B, in some embodiments, the color calibration cells are located inside the black frame and the white frame, on the left of the rectangular cells encoding the machine data. In other embodiments, the color frame could be located outside the black frame, between the black and white frames, or in other locations within the white frame.
[0069] Based on the color frame, the present system decodes the colors in the rectangular cells (step 410). In general, for every pixel found in the encoded image, the present system finds the closest sample pixel based on the color calibration cells and substitutes the sample pixel for the detected pixel. In some embodiments, the present system obtains a color-value correspondence. The present system determines cell size calibration based on the zebra frame. The present system divides the data image into color cells. When decoding the image, the present system captures or collects all colors belonging to a given pixel rectangle. If most of the pixel colors in the pixel rectangle could be recognized as one color from the color calibration cells, that color from the color calibration cells is used. Otherwise, the present system marks the pixel as an erasure. If a pixel-level block cannot be decoded, the present system treats the whole pixel block as an erasure. The present system reads the rectangular cells sequentially and determines an index of the nearest color from the color calibration cells to the median cell color. In particular, the present system determines the nearest color according to nearest Euclidean distance in the appropriate color space, according to Equation (1), described earlier.
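The nearest-color lookup of step 410 can be sketched as below; for brevity, plain squared Euclidean RGB distance is used in place of the weighted metric of Equation (1), and the names are illustrative:

```python
from statistics import median

def median_cell_color(pixels):
    """Channel-wise median over the pixels of one cell
    rectangle, which damps speckle and sensor noise."""
    return tuple(median(p[c] for p in pixels) for c in range(3))

def nearest_color_index(cell_color, calibration_colors):
    """Index of the calibration color closest to the observed
    cell color; the weighted metric of Equation (1) can be
    substituted for the squared Euclidean distance used here."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(calibration_colors)),
               key=lambda i: sq_dist(cell_color, calibration_colors[i]))
```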
[0070] The present system decodes the bytes (step 414). In some embodiments, the present system applies a decoder corresponding to the error-correcting encoder described earlier. For example, the present system applies a low-level Reed-Solomon decoder to the code stream obtained from the decoded colors. Accordingly, due to the nature of the error-correcting code, the present system is able to restore erasures and errors in the code stream. If the present system detects unrepairable blocks, the unrepairable blocks are marked as "erasures."
[0071] The present system then descrambles the bytes (step 416). In some embodiments, the present system de-interleaves the resulting byte stream.
[0072] The present system then decodes the original information based on the decoded byte stream (step 418). In general, the present system uses a corresponding decoder to the error-correcting encoder described earlier, to restore pixel-block- level erasures and errors. In some embodiments, the present system applies a high-level Reed-Solomon decoder. After applying the high-level Reed-Solomon decoder, if some pixel blocks remain un-repairable, the present system attempts to change the indices to those indices corresponding to the next nearest color in the color calibration cells, and then repeats Reed-Solomon decoding of the modified block. If error free data is obtained, the present system maps the resulting sequence of integers to a bit stream. The present system then parses the resulting bit stream and computes a cyclic redundancy check (CRC) of the data. The present system compares the resulting CRC with the CRC contained in the header of the image. The present system then saves or appends the file.
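The final integrity check of step 418 can be sketched as below. The patent does not specify which CRC variant is used, so CRC-32 (via Python's standard `zlib`) is assumed here for illustration:

```python
import zlib

def verify_decoded_data(payload: bytes, header_crc: int) -> bool:
    """Compare the CRC computed over the decoded data with the CRC
    value carried in the header of the encoded image."""
    return (zlib.crc32(payload) & 0xFFFFFFFF) == header_crc
```

If the computed CRC matches the header CRC, the decoded file can be saved or appended; otherwise the decode attempt is rejected.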
Two-level COLOR error correction schema with erasures and statistical color restoration
[0073] The two-level error correction used in the present system and described earlier in connection with Figures 3 and 4 allows the present system to correct two types of errors: errors and erasures. In some embodiments, the Reed-Solomon error-correction encoding used is able to detect errors (which correspond to wrong, incorrect, or unexpected
information) and erasures (which correspond to absent or missing information).
[0074] In some embodiments, correcting erasures can be performed twice as effectively as correcting errors. To make use of this feature, the present system uses a two-level error correction schema.
[0075] For example, assume Nc=16 (i.e., 2^4). Accordingly, there are sixteen distinctive colors, which allow an encoded image to encode 4 bits of information per rectangular cell. Every byte of information (8 bits) is encoded by two color indices. The error correction scheme has two levels. First, the byte array is divided into chunks, and every chunk is appended with Reed-Solomon source control data. Second, every byte is divided into two half-bytes (4 bits each). The half-bytes are then gathered into groups or chunks, which are appended with Reed-Solomon source control data. For example, the present system can process fewer than Nc-1 half-bytes per group. That is, if Nc=16, the present system gathers fewer than fifteen half-bytes. The group of half-bytes is then appended with Reed-Solomon source control data determined based on the group of half-bytes. Because the Reed-Solomon source control data can be about the same size as the input data, the addition of the Reed-Solomon source control data results in the size of the group reaching fifteen half-bytes. For example, the combined size of the input data and the control data block can be fifteen half-bytes. Each half-byte in the group is mapped to an appropriate color, for example by using the corresponding half-bytes as indices into the color array. Accordingly, the encoding process applies two levels of Reed-Solomon encoding, first at the byte array level and second at the half-byte level.
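The byte-to-half-byte mapping used by the second level can be sketched as follows. The Reed-Solomon source control computation itself is omitted; only the nibble split, which pairs each 4-bit value with one of the Nc=16 color indices, is shown:

```python
def bytes_to_half_bytes(data: bytes) -> list:
    """Split each byte into two 4-bit half-bytes (high nibble first);
    with Nc=16 colors, each half-byte directly indexes one color."""
    nibbles = []
    for byte in data:
        nibbles.append(byte >> 4)
        nibbles.append(byte & 0x0F)
    return nibbles

def half_bytes_to_bytes(nibbles: list) -> bytes:
    """Inverse mapping, used during decoding to reassemble bytes
    from pairs of recovered color indices."""
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))
```

In the full scheme, groups of such half-bytes would be extended with Reed-Solomon control symbols before being mapped to colors, and the two functions above bracket that step on the encode and decode sides respectively.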
[0076] As described earlier, when decoding the image, the present system collects all colors belonging to a detected pixel rectangle. If most of the colors in the pixel rectangle could be recognized as one of the colors in the color calibration cells, the present system uses the corresponding color in the color calibration cells. Otherwise, the pixel rectangle is marked as an erasure. If a pixel-level block cannot be decoded, the whole pixel block is treated as an erasure. As described earlier, the erasures can be further restored by an appropriate error-correction algorithm. The two-level error correction and leveraging of erasure determinations significantly improves the quality of the decoding described earlier in connection with Figure 4. In particular, the two-level error correction described herein allows the present system to recover information which otherwise would be lost.
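The per-cell recognition with erasure fallback described above can be sketched as follows. The majority threshold is an assumption for illustration; the patent says only that "most" of the pixel colors must agree:

```python
from collections import Counter

def recognize_cell(pixel_colors, majority=0.5):
    """Classify a pixel rectangle: if one calibration color accounts for
    most of the pixels, return that color; otherwise mark the cell as an
    erasure (None) so the error-correction layer can restore it.
    Each entry of pixel_colors is the calibration color already matched
    to one pixel of the rectangle."""
    if not pixel_colors:
        return None  # no pixels at all: erasure
    color, count = Counter(pixel_colors).most_common(1)[0]
    return color if count > majority * len(pixel_colors) else None
```

Returning `None` rather than guessing is what makes the two-level scheme effective: the Reed-Solomon layer can correct twice as many erasures as errors.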
Restoration of proper color in the case of distinguishable color samples
[0077] Figure 5 illustrates a block diagram of color calibration cells for restoration of proper color based on distinguishable color samples in accordance with certain embodiments of the present disclosure. Figure 5 includes color calibration cells 210.
[0078] Color calibration cells 210 include all colors used to encode the rectangular cells. In some embodiments, a white frame or border surrounds color calibration cells 210 for the color white. In other embodiments, the present system may also include white as a color block in the color calibration cells arranged on the left hand side. The present system discerns each color based on its placement within color calibration cells 210. The present system then stores distorted pixels as color samples. Different variants of a distorted pixel are saved to point or refer to the same correct pixel. That is, distorted pixels are determined dynamically, based on the position of their colors among the color calibration cells. The distorted pixels are assigned estimated colors according to the color calibration cells.
[0079] As described earlier, during decoding (shown in Figure 4), the present system translates a distorted pixel into one of the original colors in color calibration cells 210 using the nearest original color to the distorted pixel. For example, let color C = (R, G, B) represent a red green blue (RGB) component representation of the color. A pseudo-Euclidean distance between two colors C1 and C2 can be given by ΔC, as described in Equation (1):
ΔC = sqrt((R1 - R2)^2 + (G1 - G2)^2 + (B1 - B2)^2)    (1)

where C1 = (R1, G1, B1) and C2 = (R2, G2, B2).
[0080] This distance generates a native pseudo-Euclidean metric in the color space. A color mapping can be represented by the set of pairs: (distorted color) → (correct color). The distorted color can be C1, and the correct color can be C2. The number of such color pairs should be greater than or equal to Nc, the total number of colors. This mapping can be determined based on the color calibration cells and on the color arrangement therein.
[0081] To restore a correct or proper color from a distorted observed color c, the present system iterates over all pairs of distorted colors and correct colors. In particular, the present system selects the pair in which the distorted color portion of the pair is closest to the observed color c. In some embodiments, the determination of which distorted color portion is closest is made according to the pseudo-Euclidean metric in the RGB color space. That is, the present system selects the pair for which ΔC is smallest according to Equation (1). The correct color portion of the selected pair provides the restored color.
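The pair-selection step can be sketched as follows. The (distorted, correct) pairs below are hypothetical; in practice they are built from the calibration cells and the known color arrangement:

```python
import math

def restore_color(observed, color_pairs):
    """Given (distorted, correct) color pairs derived from the color
    calibration cells, return the correct color whose distorted sample
    is nearest to the observed pixel color, i.e. the pair with the
    smallest ΔC per Equation (1)."""
    def dist(c1, c2):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))
    distorted, correct = min(color_pairs, key=lambda p: dist(observed, p[0]))
    return correct
```

Because the lookup is keyed on the distorted samples rather than the ideal palette, it implicitly compensates for whatever systematic distortion the camera and display introduced.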
Accordingly, the present system allows the correct colors to be restored, because the present system saves the correspondence between the distorted values and the original values.
Therefore, the present system is able to leverage information about the way in which the colors were distorted relative to the original values as captured in the color calibration cells.
Frame arrangement for automatic image geometrical distortion correction
[0082] Figure 6 illustrates a block diagram of a frame arrangement for geometrical distortion correction of an encoded image in accordance with certain embodiments of the present disclosure.
[0083] In some embodiments, the present system includes arrangements to recognize placement of rectangular cells and other elements in the encoded image. For example, these arrangements can include black frame 204, white frame 206, zebra frame 208, and color calibration sidebar 210 (shown in Figure 2). The black frame and white frame surround the encoded rectangular cells. The color calibration sidebar contains sample colors (described earlier, in connection with Figure 5). The zebra frame surrounds the encoded rectangular cells to allow the present system to determine appropriate resolution and pixel division of the encoded image.
[0084] The black frame and white frame allow the present system to determine and correct nonlinear distortion. As described earlier, nonlinear distortion can be introduced, for example, by camera lenses when registering an encoded image for decoding. In some embodiments, the present system uses the white frame as follows. The white frame can be restricted by curve x_left(y) 602, curve x_right(y) 604, curve y_top(x) 606, and curve y_bottom(x) 608. For every point (x*, y*) 610 which belongs to the area restricted by curves 602-608, the present system can determine parameters a and b (0 < a < 1 and 0 < b < 1) according to the following equations:
(1 - a) x_left(y*) + a x_right(y*) = x*
(1 - b) y_top(x*) + b y_bottom(x*) = y*
The equations described above map the area restricted by curves 602-608 to a unit rectangle defined by 0 < a < 1, 0 < b < 1. The unit rectangle represents what the image would look like when "straightened out": the distorted boundary of the image, determined according to the area restricted by curves 602-608, is mapped to the "straightened out" unit rectangle. The zebra frame can be used to determine parameters a and b. The present distortion correction therefore allows the original area having curved or non-linear sub-regions to be transformed according to the unit rectangle using parameters a and b.
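Solving the two interpolation equations for a and b gives a direct mapping of each point to the unit rectangle. A minimal sketch, assuming the detected boundary curves are available as callables (the function and parameter names are illustrative, not from the patent):

```python
def to_unit_rectangle(x, y, x_left, x_right, y_top, y_bottom):
    """Map a point (x, y) inside the distorted white-frame boundary to
    coordinates (a, b) in the unit rectangle, by solving
    (1-a)*x_left(y) + a*x_right(y) = x  and
    (1-b)*y_top(x) + b*y_bottom(x) = y  for a and b."""
    a = (x - x_left(y)) / (x_right(y) - x_left(y))
    b = (y - y_top(x)) / (y_bottom(x) - y_top(x))
    return a, b
```

For an undistorted rectangular boundary the mapping reduces to simple normalization; for curved boundaries, the callables encode the detected curvature and the same two lines "straighten out" the image.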
[0085] The zebra frame allows the present system to determine a resolution of the embedded picture, and to divide the image into pixels, some of which may be "bold" or unknown pixels. As described earlier, the color value of some pixels may be unknown. "Bold" pixels allow the present system to determine the proper color value of a pixel according to a voting process. In some embodiments, the voting process chooses the color value of the most frequent pixel close to the "bold" or unknown pixel as the color value of the "bold" pixel. Otherwise, if the result of the voting is unclear (for example, if no pixel color occurs with a frequency much larger than the others), the present system can mark the pixel as an erasure (i.e., missing). Therefore, using the two-level error correction described earlier, the pixel can be restored.
Image Fragmentation
[0086] Figure 7 illustrates a block diagram for frame fragmentation for distortion correction in accordance with certain embodiments of the present disclosure. The rectangular cells can be divided into sub-images 704 using separating lines 702.
[0087] Frame fragmentation or image fragmentation refers to a complementary technique to distortion correction (shown in Figure 6). In some embodiments, dividing the image into a set of sub-images 704 allows the present system to work with every sub-image 704, without requiring the distortion compensation technique described earlier, because sub-images 704 are sufficiently small that there is not as much distortion as with the entire encoded image. In some embodiments, sub-images 704 can be separated with separating lines 702. In some embodiments, separating lines 702 can be black lines, which can also become part of the zebra border.
[0088] The present system can use the same encoding and decoding techniques described earlier (shown in Figures 3 and 4) for encoding and decoding of sub-images 704. For example, the present system can leverage the two-level error correction, scrambler, etc.
described earlier.
[0089] In some embodiments, the present system can skip the distortion correction described earlier (shown in Figure 6). The present system can leverage the relatively smaller size of sub-images 704 to skip nonlinear distortion compensation of each sub-image 704. Accordingly, the present system can decode sub-images 704 with better accuracy.
"Image" based Data exchange between devices without using Wi-Fi or Cellular
[0090] Figure 8 illustrates a block diagram of image based information exchange from one device to another device in accordance with certain aspects of the present disclosure. Figure 8 includes smartphones 802a-b, and encoded images 804a-b. Some smartphones currently have large displays and many are evolving to include even larger displays. These displays can be used for information exchange, for example using color-space encoded images 804a-b. The color-space encoded images 804a-b can be encoded according to the encoding process described earlier, in Figure 3. Accordingly, smartphones 802a-b are able to exchange information using the encoded images. For example, an embedded camera on smartphone 802b can register or capture an encoded image 804a being displayed on smartphone 802a.
[0091] Embedded cameras used on devices such as smartphones 802a-b are improving. As described herein, the resolution per frame captured by embedded cameras becomes less important, because the pixel number limitation can be avoided. The pixel number limitation is described in further detail below.
[0092] In some embodiments, smartphones 802a-b can have hardware modules or software applications loaded thereon, for example for encoding or decoding according to the processes described in Figures 3 and 4. A first user can create machine data or otherwise store machine data on a device such as a smartphone. The first user can then decide to exchange related information between devices, for example with a second user. The machine data created by the first user or stored on the phone can be encoded according to the process described in Figure 3, and then displayed on the smartphone display for exchanging information with smartphone 802b.
[0093] As described earlier, smartphone 802b can acquire the encoded image through its embedded camera. Advantageously, this acquisition does not require smartphone 802b to download or acquire encoded image 804a through electromagnetic radio transmissions such as a wireless or cellular network. Smartphone 802b can perform the decoding as described in Figure 4 to determine the machine data from the first user. As described earlier, the decoding can be performed in hardware or in software installed on smartphone 802b. The machine data encoded by the first user can be displayed on the screen or otherwise stored in smartphone 802b. [0094] In some embodiments, security can be added to the encoded image. For example, when the present system compresses or archives the machine data, the present system can add a password or otherwise encrypt the relevant machine data.
[0095] In some embodiments, if there are low light conditions and/or low camera resolution for smartphone 802b to register or otherwise capture encoded image 804a, smartphone 802a can partition or divide the information to be transmitted. For example, smartphone 802a can transmit a first frame including only partial information. In some instances, the first frame can include the color calibration frame for smartphone 802b to calibrate colors in preparation for decoding encoded image 804a, as described earlier in connection with Figure 5. In some embodiments, the first frame can include the main frames (i.e., the black frame and/or the white frame) to correct geometrical distortions, etc., as described earlier in connection with Figure 6. Generally, smartphone 802a can construct and generate special frames for partial transmission. The special frames can include the color calibration frame, the main frames, or the zebra frame, to obtain stable information and data exchange.
[0096] In some embodiments, smartphones 802a-b can be placed substantially against each other, for example by aligning the embedded camera of smartphone 802b with the display of smartphone 802a displaying encoded image 804a. Accordingly, smartphones 802a-b can sense the proximity to establish automatically the data transmission channel described earlier. In further embodiments, smartphones 802a-b can change the position of which smartphone represents the source for sending machine data, and which smartphone represents the sink for receiving machine data.
Direct data exchange
[0097] Figure 9 illustrates a block diagram of an example of information exchange of a color space encoded image in accordance with certain embodiments of the present disclosure. Figure 9 includes a number of devices including smartphones 906a, 912a, camera 906b, 912b, palmtop or handheld device 906c, 912c, and tablet or e-reader 906d, 912d. In some embodiments, the present system can include information exchange between a number of devices which can register, capture, or "see" an encoded image 102 on a display 902. The devices which can register encoded image 102 include devices with embedded cameras 904. For example, the devices can include smartphone 912a, camera 912b, palmtop or handheld device 912c, and tablet or e-reader 912d. In some embodiments, these devices can be similar to devices 106c-f (shown in Figure 1). These registering devices can register or capture the corresponding data source. In some embodiments, the information exchange described herein can include data distribution including multiple devices. Accordingly, the information exchange described herein can include point-to-multipoint data distribution, in which a number of devices all capture a single encoded image 102.
[0098] In some embodiments, the present system can include sequential frame
transmission of multiple encoded images. The encoded images can include multiple video frames, such as video frames 910a-c. Accordingly, the present system allows exchange of video files such as machine data including movies or films 908.
[0099] In some embodiments, the sequential frame transmission described herein can be used for machine data other than video files. For example, multiple encoded images can be used to encode large files and/or large amounts of information. Smartphones such as smartphones 906a, 912a are capable of recording full high definition (HD) video. HD video can include video at 1920x1080 pixels, and at least 25 frames per second (fps). Accordingly, smartphones 912a can often include high-performance cameras such as for capturing photographs or video. Furthermore, smartphone displays continue to improve, allowing for even more resolution for encoding machine data, and for displaying and transmitting encoded images.
[0100] Smartphones often include a second camera, for example on the front of the smartphone along with the display, for use in videoconferencing. In some embodiments, the present system can allow two devices to be placed with their displays next to or adjacent with each other. In such an orientation, embodiments of the present disclosure can use the front camera. Advantageously, having devices paired as described herein can allow the devices to exchange information interactively. For example, a first device, such as tablet 906d can display a first frame. As described earlier, the first frame can include color calibration information. When the second device has completed processing the color calibration information, the second device can encode an acknowledgement in an encoded image of its own (not shown), and send the acknowledgement for the first device to decode. In some embodiments, the second device can send a request for the next frame or additional partial information.
[0101] In further embodiments, the present system can send a number of encoded images, for example in a video-like transmission. This video-like transmission can be used for two-way interactive or one-way directional information or data exchange. [0102] Figure 10 illustrates information exchange using a photo camera to display a color space encoded image in accordance with certain embodiments of the present disclosure.
Figure 10 illustrates a photo camera 1002 and a smartphone 802b. Photo camera 1002 can display an encoded image 804a. Current digital cameras such as photo camera 1002 have good optics in their camera lenses. Current digital cameras often have good displays as well, which can be used for displaying encoded image 804a.
[0103] Camera manufacturers are also adding wireless connectivity such as Wi-Fi or cellular connectivity, or even GPS receivers to make cameras "connected." Cameras can also include digital signal processors used for image data processing.
[0104] In some embodiments, the present system includes modified digital signal processors in photo cameras such as photo camera 1002. The modifications can include hardware modules or embedded software, such as firmware, for encoding or decoding color space images in accordance with the processes described in Figures 3 and 4.
[0105] Accordingly, photo cameras can be modified for information or data exchange between cameras. A camera is a good "image" acquisition, registration, or capture device, due to high quality optics and ability to work under difficult light conditions. Information acquired by a camera can often be stored on a memory card, for example flash memory. As described earlier, the present system can allow a user to insert a memory card into a compatible device, such as a PC or notebook computer. Modules on the PC or notebook computer can decode a color space encoded image without requiring radio-related approaches such as using a Wi-Fi or cellular network.
[0106] In some embodiments, an embedded camera on a PC or tablet can be used as an image acquisition, registration, or capture device. In further embodiments, the present system can use the display on the PC or tablet as a data source device as well, for example to display encoded image 804. Accordingly, the decoding modules described earlier can be utilized on corresponding decoding devices.
[0107] Although specific steps are disclosed in Figures 3 and 4, such steps are exemplary. That is, the present system is well-suited to performing various other steps or variations of the steps recited in Figures 3 and 4. The steps in Figures 3 and 4 may be performed in an order different than presented, and not all of the steps need be performed. Figures 3 and 4 include processes that, in various embodiments, are carried out by a processor under the control of computer-readable and computer-executable instructions. Embodiments of the present disclosure may thus be stored as non-transitory computer-readable media or computer-executable instructions including, but not limited to, firmware updates, software update packages, or hardware (e.g., ROM).
[0108] Reference has been made in detail to various embodiments in accordance with the present disclosure, examples of which are illustrated in the accompanying drawings. While the invention has been described in conjunction with various embodiments, these various embodiments are not intended to limit the invention. On the contrary, the invention is intended to cover alternatives, modifications, and equivalents, which may be included within the scope of the invention as construed according to the appended claims.
Furthermore, in the detailed description of various embodiments, numerous specific details have been set forth in order to provide a thorough understanding of the invention. However, the invention may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuits have not been described in detail, so as not to unnecessarily obscure aspects of the invention.
[0109] Some portions of the detailed descriptions have been presented in terms of procedures, logic blocks, processing, and other symbolic representations of operations on data bits within a non-transitory computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. In the present disclosure, a procedure, logic block, process, or the like, is conceived to be a self-consistent sequence of operations or steps or instructions leading to a desired result. The operations or steps are those utilizing physical manipulations and transformations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system or computing device. It has been convenient at times, principally for reasons of common usage, to refer to these signals as records, transactions, bits, values, elements, symbols, characters, samples, pixels, or the like.
[0110] Of course, all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the foregoing discussions, it is appreciated that throughout the present disclosure, discussions utilizing terms such as "deactivating,"
"disabling," "re-activating," "enabling," "sending," "receiving," "determining," "flushing," "responding," "generating," "making," "blocking," "accessing," "taking a snapshot," "associating," "allowing," "updating," or the like, refer to actions and processes of a computer system or similar electronic computing device or processor. The computer system or similar electronic computing device manipulates and transforms data represented as physical (electronic) quantities within the system memories, registers or other such information storage, transmission or display devices.
[0111] The present systems and methods can be implemented in a variety of architectures and configurations. For example, the present systems and methods can be implemented as part of a distributed computing environment, a cloud computing environment, a client server environment, etc. The embodiments described herein may be discussed in the general context of computer-executable instructions residing on some form of non-transitory computer-readable storage medium, such as program modules, executed by one or more computers, computing devices, or other devices. As described earlier, non-limiting examples of computer-readable storage media may include storage media and communication media. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or distributed as desired in various embodiments.
[0112] The foregoing descriptions of specific embodiments of the present systems and methods have been presented for purposes of illustration and description. The specific embodiments are not intended to be exhaustive or to limit the invention to the precise forms disclosed, and many modifications and variations are possible in light of the above description. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.
What is claimed is:

Claims

1. A computer-implemented method for exchanging information, the method comprising: receiving information for exchanging;
creating an image encoding the information by:
creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image; and
displaying the encoded image for one or more devices to decode.
2. A system for exchanging information, the system comprising:
memory;
storage;
at least one processor configured to use the memory and the storage to:
receive information for exchanging;
create an image encoding the information by:
creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image; and
display the encoded image for one or more devices to decode.
3. A non-transitory computer program product for exchanging information, the non- transitory computer program product tangibly embodied in a computer-readable medium, the non-transitory computer program product including instructions operable to cause a data processing apparatus to:
receive information for exchanging;
create an image encoding the information by:
creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the encoded image; and
display the encoded image for one or more devices to decode.
4. A computer-implemented method for exchanging information between a first device and a second device, the method comprising:
receiving, at the first device, information for exchanging with the second device; creating a first image encoding the information by:
creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the encoded image, the one or more additional cells for assisting in decoding the encoded image;
displaying the first encoded image for a second device to decode; and
in response to the displaying the first encoded image, receiving a second encoded image from the second device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
5. The method of claim 4, further comprising:
locating the second encoded image by locating one or more additional cells adjoining the second encoded image; and
decoding the response based on the colored cells and the one or more additional cells adjoining the second encoded image.
6. A computer-implemented method of exchanging information between a first device and a second device, the method comprising:
receiving, at the second device, a first encoded image from the first device, the first encoded image having colored cells corresponding to a stream of integers encoding data; decoding the first encoded image by locating the first encoded image by locating one or more additional cells adjoining the first encoded image; and
decoding a request from the first device based on the colored cells and the one or more additional cells adjoining the first encoded image;
determining a response to the first encoded image;
creating a second encoded image encoding the response by:
creating colored cells within a color space to provide an encoded representation of the response in the second encoded image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the second encoded image, the one or more additional cells for assisting in decoding the second encoded image; and
displaying the second encoded image for the first device to decode.
7. The method of claim 6, further comprising:
receiving, at the first device, the second encoded image to decode, the second encoded image having colored cells corresponding to a stream of integers encoding the response;
locating the second encoded image by locating one or more additional cells adjoining the second encoded image; and
decoding the response based on the colored cells and the one or more additional cells adjoining the second encoded image.
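Claims 4 through 7 together describe a full request/response round trip: the first device displays an encoded image, the second device decodes it and answers with a second encoded image, which the first device decodes in turn. A minimal sketch of that exchange follows; `encode()`/`decode()` stand in for the color-cell steps, and the lookup table is a hypothetical response source — in the claims the channel is a display photographed by the other device's camera, not a shared variable.

```python
# Minimal sketch of the claimed exchange. One integer per character
# stands in for the colored-cell encoding; all names are illustrative.

def encode(message: str) -> list:
    """Stand-in for encoding text into a stream of integers (colored cells)."""
    return [ord(c) for c in message]

def decode(stream: list) -> str:
    """Stand-in for locating the cells and mapping colors back to text."""
    return "".join(chr(n) for n in stream)

PRICES = {"apple": "USD 2"}  # hypothetical data held by the second device

# First device displays a first encoded image carrying the request.
first_image = encode("apple")
# Second device decodes it, determines a response, displays a second image.
second_image = encode(PRICES[decode(first_image)])
# First device decodes the second encoded image, completing the exchange.
reply = decode(second_image)
```

The same pattern underlies the camera/device variants in claims 12 through 19; only the roles of the two endpoints change.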
8. A first device for exchanging information with a second device, the first device comprising:
one or more displays;
memory;
storage; and
at least one processor configured to use the memory and the storage to:
receive, at the device, information for exchanging with the second device;
create a first image encoding the information by:
creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the encoded image, the one or more additional cells for assisting in decoding the encoded image;
display, on the one or more displays, the first encoded image for the second device to decode; and
in response to displaying the first encoded image, receive a second encoded image from the second device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
9. The device of claim 8, wherein the at least one processor is further configured to:
locate the second encoded image by locating one or more additional cells adjoining the second encoded image; and
decode the response based on the colored cells and the one or more additional cells adjoining the second encoded image.
10. A first device for exchanging information with a second device, the first device comprising:
one or more displays;
memory;
storage; and
at least one processor configured to use the memory and the storage to:
receive, at the first device, a first encoded image from the second device, the first encoded image having colored cells corresponding to a stream of integers encoding data;
decode the first encoded image by:
locating the first encoded image by locating one or more additional cells adjoining the first encoded image; and
decoding a request from the second device based on the colored cells and the one or more additional cells adjoining the first encoded image;
determine a response to the first encoded image;
create a second encoded image encoding the response by:
creating colored cells within a color space to provide an encoded representation of the response in the second encoded image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the second encoded image, the one or more additional cells for assisting in decoding the second encoded image; and
display, on the one or more displays, the second encoded image for the second device to decode.
11. The device of claim 10, wherein the first device is further configured to:
receive, at the first device, the second encoded image to decode, the second encoded image having colored cells corresponding to a stream of integers encoding the response;
locate the second encoded image by locating one or more additional cells adjoining the second encoded image; and
decode data based on the colored cells and the one or more additional cells adjoining the second encoded image.
12. A computer-implemented method of exchanging information between a camera and a device, the method comprising:
receiving, at the camera, information for exchange with the device;
creating a first image encoding the information by:
creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the encoded image, the one or more additional cells for assisting in decoding the encoded image;
displaying the first encoded image for the device to decode; and
in response to displaying the first encoded image, receiving a second encoded image from the device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
13. The method of claim 12, wherein the device comprises at least one of a computer, a phone, a smartphone, a palmtop device, a handheld device, a tablet, an e-reader, and a scanner.
14. A computer-implemented method of exchanging information between a camera and a device, the method comprising:
receiving, at the device, a first encoded image from the camera, the first encoded image having colored cells corresponding to a stream of integers encoding data;
decoding the first encoded image by:
locating the first encoded image by locating one or more additional cells adjoining the first encoded image; and
decoding the data based on the colored cells and the one or more additional cells adjoining the first encoded image;
determining a response to the first encoded image;
creating a second image encoding the response by:
creating colored cells within a color space to provide an encoded representation of the response in an image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the image; and
displaying the second encoded image for the camera to decode.
15. The method of claim 14, wherein the device comprises at least one of a computer, a phone, a smartphone, a palmtop device, a handheld device, a tablet, an e-reader, and a scanner.
16. A camera for exchanging information with a device, the camera comprising:
memory;
storage;
a camera module; and
at least one processor configured to use the memory and the storage to:
receive, at the camera module, information for exchange with the device;
create a first image encoding the information by:
creating colored cells within a color space to provide an encoded representation of the received information in an image by processing the received information to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the encoded image, the one or more additional cells for assisting in decoding the encoded image;
display the first encoded image for the device to decode; and
in response to displaying the first encoded image, receive a second encoded image from the device, the second encoded image having colored cells corresponding to a stream of integers encoding a response.
17. The camera of claim 16, wherein the device comprises at least one of a computer, a phone, a smartphone, a palmtop device, a handheld device, a tablet, an e-reader, and a scanner.
18. A device for exchanging information with a camera, the device comprising:
a camera module;
memory;
storage; and
at least one processor configured to use the memory and the storage to:
receive, at the device, a first encoded image from the camera module, the first encoded image having colored cells corresponding to a stream of integers encoding data;
decode the first encoded image by:
locating the first encoded image by locating one or more additional cells adjoining the first encoded image; and
decoding the data based on the colored cells and the one or more additional cells adjoining the first encoded image;
determine a response to the first encoded image;
create a second image encoding the response by:
creating colored cells within a color space to provide an encoded representation of the response in an image by processing the response to obtain a stream of integers, each integer in the stream of integers corresponding to a color in the color space for a cell in the image; and
creating one or more additional cells adjoining the image, the one or more additional cells for assisting in decoding the image; and
display the second encoded image for the camera to decode.
19. The device of claim 18, wherein the device comprises at least one of a computer, a phone, a smartphone, a palmtop device, a handheld device, a tablet, an e-reader, and a scanner.
PCT/IB2014/001482 2013-03-15 2014-03-14 Information exchange using color space encoded image WO2014170763A2 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US13/842,817 US8973844B2 (en) 2013-03-15 2013-03-15 Information exchange using photo camera as display for color space encoded image
US13/842,817 2013-03-15
US13/843,132 US9027843B2 (en) 2013-03-15 2013-03-15 Information exchange display using color space encoded image
US13/843,132 2013-03-15
US13/841,338 2013-03-15
US13/841,338 US9117151B2 (en) 2013-03-15 2013-03-15 Information exchange using color space encoded image

Publications (2)

Publication Number Publication Date
WO2014170763A2 true WO2014170763A2 (en) 2014-10-23
WO2014170763A3 WO2014170763A3 (en) 2015-04-09

Family

ID=51731919

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/001482 WO2014170763A2 (en) 2013-03-15 2014-03-14 Information exchange using color space encoded image

Country Status (1)

Country Link
WO (1) WO2014170763A2 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100914515B1 (en) * 2006-06-23 2009-09-02 주식회사 칼라짚미디어 Color classification method for color based image code recognition
CN100511271C (en) * 2006-11-16 2009-07-08 深圳市天朗时代科技有限公司 Two-dimensional decoding method
US20100327066A1 (en) * 2009-06-25 2010-12-30 Samsung Electronics Co. Ltd. Network based reliable decoding of bar codes

Also Published As

Publication number Publication date
WO2014170763A3 (en) 2015-04-09

Similar Documents

Publication Publication Date Title
US9532060B2 (en) Two-level error correcting codes for color space encoded image
US9161062B2 (en) Image encoding and decoding using color space
WO2014140893A2 (en) Documents and data backup using color space encoded image
US9558438B2 (en) Information broadcast using color space encoded image
US9161061B2 (en) Data storage and exchange device for color space encoded images
US9152830B2 (en) Color restoration for color space encoded image
US9514400B2 (en) Information exchange using color space encoded image
US9129346B2 (en) Image fragmentation for distortion correction of color space encoded image
WO2018095149A1 (en) Method and system for generating two-dimensional code having embedded visual image, and reading system
US9027843B2 (en) Information exchange display using color space encoded image
US8973844B2 (en) Information exchange using photo camera as display for color space encoded image
US9014473B2 (en) Frame of color space encoded image for distortion correction
US9147143B2 (en) Book using color space encoded image
Melgar et al. High density two-dimensional color code
US9396169B2 (en) Combination book with e-book using color space encoded image with color correction using a pseudo-euclidean metric in the color space
US9386185B2 (en) Encoding large documents using color space encoded image with color correction using a pseudo-euclidean metric in the color space
Hermans et al. Focus: Robust visual codes for everyone
US9027842B2 (en) Broadcasting independent of network availability using color space encoded image
US9152613B2 (en) Self-publication using color space encoded image
US9189721B2 (en) Data backup using color space encoded image
Lyons et al. Two-dimensional barcodes for mobile phones
CN102841928B (en) File security sending, receiving method and device between net
Melgar et al. Channel capacity analysis of 2D barcodes: QR Code and CQR Code-5
WO2014170763A2 (en) Information exchange using color space encoded image
WO2014140895A2 (en) Data storage and exchange device for color space encoded images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14785324

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 14785324

Country of ref document: EP

Kind code of ref document: A2
