KR20170128239A - Customizable functional patterns for optical barcodes - Google Patents

Customizable functional patterns for optical barcodes

Info

Publication number
KR20170128239A
KR20170128239A
Authority
KR
South Korea
Prior art keywords
image
shape feature
data
custom
feature
Prior art date
Application number
KR1020177023059A
Other languages
Korean (ko)
Other versions
KR102018143B1 (en)
Inventor
Landon Anderton
Garrett Gee
Ryan Hornberger
Kirk Ouimet
Cameron Sheffield
Benjamin Turley
Original Assignee
Snap Inc.
Priority date
Filing date
Publication date
Priority to US 62/105,141, filed January 19, 2015
Priority to US 14/612,409 (US9111164B1), filed February 3, 2015
Priority to US 14/826,301 (US9659244B2), filed August 14, 2015
Application filed by Snap Inc.
Priority to PCT/US2016/012669 (WO2016118338A1)
Publication of KR20170128239A
Application granted
Publication of KR102018143B1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06K: RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00: Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10: Sensing by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14: Sensing using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404: Methods for optical code recognition
    • G06K 7/1408: Methods specifically adapted for the type of code
    • G06K 7/1413: 1D bar codes
    • G06K 7/1417: 2D bar codes
    • G06K 7/1439: Methods including a method step for retrieval of the optical code
    • G06K 7/1443: Locating the code in an image
    • G06K 7/1456: Determining the orientation of the optical code with respect to the reader and correcting therefor
    • G06K 19/00: Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K 19/06: Record carriers characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K 19/06009: Record carriers with optically detectable marking
    • G06K 19/06037: Optically detectable marking with multi-dimensional coding
    • G06K 19/06046: Constructional details
    • G06K 19/06103: The marking being embedded in a human-recognizable image, e.g. a company logo with an embedded two-dimensional code
    • G06K 19/06131: The marking comprising a target pattern, e.g. for indicating the center of the bar code or for helping a bar code reader to properly orient the scanner or to retrieve the bar code inside an image
    • G06K 9/00: Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/20: Image acquisition
    • G06K 9/32: Aligning or centering of the image pick-up or image-field
    • G06K 9/3216: Aligning or centering by locating a pattern

Abstract

Systems and methods for custom functional patterns for optical barcodes are provided. In an exemplary embodiment, image data of an image is received from a user device. A candidate shape feature of the image is extracted from the image data. A determination is made that the candidate shape feature satisfies a shape feature rule. In response to the candidate shape feature satisfying the shape feature rule, a custom graphic in the image is identified by comparing the candidate shape feature to a reference shape feature of the custom graphic. In response to identifying the custom graphic, data encoded within a portion of the image is decoded.

Description

Customizable functional patterns for optical barcodes

This international application claims the benefit of priority of U.S. Application Serial No. 14/826,301, entitled "CUSTOM FUNCTIONAL PATTERNS FOR OPTICAL BARCODES," filed August 14, 2015; of U.S. Application Serial No. 14/612,409, entitled "CUSTOM FUNCTIONAL PATTERNS FOR OPTICAL BARCODES," filed February 3, 2015; and of U.S. Provisional Application Serial No. 62/105,141, entitled "CUSTOM FUNCTIONAL PATTERNS FOR OPTICAL BARCODES," filed January 19, 2015, the entire contents of each of which are incorporated herein by reference.

Embodiments of the present disclosure generally relate to mobile computing technologies, and more particularly, but not exclusively, to customizable functional patterns for optical barcodes.

Quick Response (QR) codes and other optical barcodes are a convenient way to share small amounts of information with users of mobile devices, wearable devices, and other smart devices. Typically, an optical barcode uses a finder pattern for identification of the optical barcode. Conventional finder patterns commonly use multiple generic markings that are prominently placed within the optical barcode. Such prominent generic markings can be unsightly and often serve no purpose other than to function as a finder pattern.

The various figures of the accompanying drawings are merely representative of the exemplary embodiments of the present disclosure and are not to be construed as limiting the scope thereof.
FIG. 1 is a block diagram illustrating a networked system, according to some exemplary embodiments.
FIG. 2 is a block diagram illustrating an exemplary embodiment of a custom pattern system, according to some exemplary embodiments.
FIGS. 3A and 3B are diagrams illustrating examples of optical barcodes employing a custom functional pattern, according to some exemplary embodiments.
FIG. 4 is a diagram illustrating an example of identifying and decoding an optical barcode employing a custom functional pattern, according to some exemplary embodiments.
FIG. 5 is a flow diagram illustrating an exemplary method for identifying and decoding an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 6 is a flow diagram illustrating further exemplary operations for identifying an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 7 is a diagram illustrating an example of identifying an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 8 is a flow diagram illustrating further exemplary operations for identifying an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 9 is a diagram illustrating an example of identifying an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 10 is a flow diagram illustrating further exemplary operations for decoding an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 11 is a diagram illustrating an example of decoding an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIGS. 12A, 12B, and 12C are diagrams illustrating various image transformations used to facilitate decoding an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 13 is a flow diagram illustrating further exemplary operations for decoding an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 14 is a diagram illustrating an example of decoding an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 15 is a user interface diagram illustrating an exemplary user interface for identifying optical barcodes, according to some exemplary embodiments.
FIG. 16 is a user interface diagram illustrating an exemplary user interface for performing an action associated with an optical barcode, according to some exemplary embodiments.
FIG. 17 is a flow diagram illustrating further exemplary operations for generating an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 18 is a user interface diagram illustrating an exemplary user interface for generating an optical barcode using a custom functional pattern, according to some exemplary embodiments.
FIG. 19 is a user interface diagram illustrating an exemplary mobile device and mobile operating system interface, according to some exemplary embodiments.
FIG. 20 is a block diagram illustrating an example of a software architecture that may be installed on a machine, according to some exemplary embodiments.
FIG. 21 is a block diagram illustrating a schematic representation of a machine, in the form of a computer system, within which a set of instructions may be executed to cause the machine to perform any of the methods discussed herein, according to an exemplary embodiment.

The following description includes systems, methods, techniques, instruction sequences and computing machine program products that implement the exemplary embodiments of the present disclosure. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the subject matter of the present invention. However, it will be apparent to one of ordinary skill in the art that embodiments of the subject matter of the present invention may be practiced without these specific details.

In general, well-known instruction instances, protocols, structures, and techniques are not necessarily shown in detail.

QR codes and other optical barcodes (e.g., Universal Product Code (UPC) barcodes, Aztec codes, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code) are a convenient way to share small amounts of information with users of mobile devices, wearable devices, and other smart devices. For instance, a QR code is a two-dimensional optical barcode that encodes information readable by a device (e.g., a smartphone) equipped with a camera sensor. Typically, a QR code includes one or more functional patterns, such as a finder pattern used to identify and recognize the QR code, or an alignment pattern used to facilitate decoding. Conventional finder patterns comprise multiple markings that are generic in design, such as the square marks placed at all corners except the bottom-right corner (as is the case with QR codes). These finder patterns lack aesthetic elements such as curves, non-uniformities, and other artistic elements, and often conform to certain standards to promote open use of optical barcodes.

In various exemplary embodiments, optical barcodes that use custom or non-standard functional patterns provide users with aesthetically pleasing, branded barcodes that allow for an enhanced experience associated with optical barcodes. For example, an entity logo (e.g., a company, organization, or personal logo) can be used as a finder pattern, and in some cases as an alignment pattern, to generate a machine-readable, aesthetically pleasing optical barcode. In a specific example, a "snapcode" is an optical barcode that uses the SNAPCHAT® logo as a functional pattern.

In one exemplary embodiment, the custom pattern system receives image data representing an image from a user device. For example, the custom pattern system receives the image data from an optical sensor (e.g., a camera sensor) of the user's smartphone. In various embodiments, the image data is received in response to a user-initiated image capture, a periodic monitoring of image data detected by the optical sensor of the user device, an access of stored image data, or a combination thereof. A portion of the image data can include data representing an optical barcode that employs a custom graphic for a particular functional pattern (e.g., a finder pattern). In some scenarios, the image data includes extraneous or irrelevant data along with the data pertaining to the optical barcode (e.g., an image of the optical barcode includes a background that is not pertinent to decoding the optical barcode). In a specific example, the optical sensor of the user device captures an image of a promotional poster that includes a particular optical barcode. The image of the promotional poster can include portions of the poster or the background surrounding the particular optical barcode, which are not pertinent to the barcode itself.

After the custom pattern system receives the image data, the custom pattern system searches the image data of the image for the custom graphic to determine whether the image includes the optical barcode. That is, the custom graphic is used as a finder pattern for recognition, identification, or detection of the optical barcode within the image. In one exemplary embodiment, the custom pattern system searches for the custom graphic by extracting a candidate shape feature, or multiple candidate shape features, from the image data. For example, the custom pattern system performs an edge detection technique, or another image processing technique, to identify candidate shape features such as contours of the image. The custom pattern system then determines whether a candidate shape feature satisfies a shape feature rule or criterion. For instance, if a particular candidate shape feature is a contour, the custom pattern system can determine whether the contour is an enclosed contour, i.e., a closed curve that surrounds a portion of the image. Consistent with some embodiments, the shape feature rules filter out candidate shape features that are irrelevant or extraneous, or that are otherwise unlikely to be the custom graphic.
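As a concrete illustration of this step, the following is a minimal sketch of candidate shape feature extraction in Python, assuming the OpenCV library is available; the function name and parameter values are illustrative choices, not taken from the disclosure.

```python
import cv2

def extract_candidate_shape_features(image_bgr):
    """Return candidate contours that may correspond to the custom graphic."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Edge detection surfaces the outlines of shapes in the image.
    edges = cv2.Canny(gray, threshold1=50, threshold2=150)
    # Each detected contour is one candidate shape feature.
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    return contours
```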

In response to the candidate shape feature satisfying the shape feature rule, the custom pattern system identifies the custom graphic by comparing the candidate shape feature to a reference shape feature of the custom graphic. For example, the custom pattern system can compare an area or size of the candidate shape feature to a reference area or size of the reference shape feature. In this example, the custom pattern system identifies the custom graphic based on a match or near match (e.g., a percentage match above a threshold) between the candidate shape feature and the reference shape feature. In this way, the custom pattern system uses the custom graphic as a finder pattern to identify the presence of the optical barcode within a portion of the image.
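One way to perform such a comparison, sketched below under the assumption that the candidate and reference shapes are available as OpenCV contours, is a Hu-moment shape match rather than the area comparison described above; the threshold value is an assumption.

```python
import cv2

def matches_reference(candidate, reference, threshold=0.15):
    """Lower matchShapes scores indicate closer shape agreement."""
    score = cv2.matchShapes(candidate, reference, cv2.CONTOURS_MATCH_I1, 0.0)
    return score < threshold
```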

In further exemplary embodiments, the custom graphic serves as an alignment pattern that enables the custom pattern system to decode the data encoded in the optical barcode. In one exemplary embodiment, the custom pattern system extracts spatial attributes of the custom graphic in the image from the image data. For example, the custom pattern system extracts a position, scale, or orientation of the custom graphic from the image data. The custom pattern system decodes the data encoded in the image from the image data using the spatial attributes of the custom graphic in the image. For instance, the custom pattern system can perform an image transform using the spatial attributes (e.g., a de-skew, a rotation, a scale, or another type of image transform) to improve the detectability or readability of the data encoded in a portion of the image. In this way, the custom pattern system uses the custom graphic as an alignment pattern to facilitate decoding the optical barcode.
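A minimal sketch of such an alignment transform, assuming OpenCV and that the custom graphic's center, rotation angle, and scale have already been extracted; the names are illustrative.

```python
import cv2

def normalize_for_decoding(image, center_xy, angle_deg, scale):
    """Rotate and scale the image so the custom graphic sits in a
    canonical orientation before reading the data marks."""
    matrix = cv2.getRotationMatrix2D(center_xy, angle_deg, scale)
    height, width = image.shape[:2]
    return cv2.warpAffine(image, matrix, (width, height))
```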

Accordingly, the custom pattern system uses the custom graphic as a functional pattern of the optical barcode, without employing conventional functional patterns. Using the custom graphic as a functional pattern allows for an aesthetically pleasing design and can provide exclusive functionality to a particular software application, since the functional pattern does not necessarily have to conform to an open standard and thus may be readable only by the particular software application.

FIG. 1 is a network diagram illustrating a network system 100 having a client-server architecture configured to exchange data over a network, according to one embodiment. For example, the network system 100 may be a messaging system in which clients communicate and exchange data within the network system 100. The data may pertain to various functions (e.g., sending and receiving text and media communications, determining location information, etc.) and aspects associated with the network system 100 and its users. Although illustrated herein as a client-server architecture, other embodiments may include other network architectures, such as peer-to-peer or distributed network environments.

As shown in FIG. 1, the network system 100 includes a social messaging system 130. The social messaging system 130 is generally based on a three-tiered architecture comprising an interface layer 124, an application logic layer 126, and a data layer 128. As will be appreciated by those skilled in the relevant computer and Internet-related arts, each module or engine shown in FIG. 1 represents a set of executable software instructions and the corresponding hardware (e.g., memory and processor) for executing the instructions. To avoid obscuring the subject matter of the present invention with unnecessary detail, various functional modules and engines that are not germane to conveying an understanding of the subject matter have been omitted from FIG. 1.

Of course, additional functional modules and engines may be used with a social messaging system, such as that illustrated in FIG. 1, to enable additional functionality that is not specifically described herein. Furthermore, the various functional modules and engines depicted in FIG. 1 may reside on a single server computer or may be distributed across several server computers in various arrangements. Moreover, although the social messaging system 130 is depicted in FIG. 1 as a three-tiered architecture, the subject matter of the present invention is by no means limited to such an architecture.

As shown in FIG. 1, the interface layer 124 consists of interface modules (e.g., a web server) 140, which receive requests from various client computing devices and servers, such as a client device 110 executing a client application 112 and a third party server 120 executing a third party application 122. In response to a received request, the interface module 140 communicates an appropriate response to the requesting device via a network 104. For example, the interface module 140 can receive requests such as Hypertext Transfer Protocol (HTTP) requests or other web-based Application Programming Interface (API) requests.

The client device 110 can execute a conventional web browser application or applications (also referred to as "apps") developed for a specific platform, including any of a wide variety of mobile computing devices and mobile-specific operating systems (e.g., IOS™, ANDROID™, WINDOWS® PHONE). In one example, the client device 110 is executing the client application 112. The client application 112 can provide functionality to present information to a user 106 and communicate via the network 104 to exchange information with the social messaging system 130. Each client device 110 can comprise a computing device that includes at least a display and communication capabilities with the network 104 to access the social messaging system 130. The client device 110 comprises, but is not limited to, a remote device, workstation, computer, general purpose computer, Internet appliance, hand-held device, wireless device, portable device, wearable computer, cellular or mobile phone, personal digital assistant (PDA), smartphone, tablet, ultrabook, netbook, laptop, desktop, multi-processor system, microprocessor-based or programmable consumer electronics, game console, set-top box, network PC, minicomputer, and the like. The user 106 can be a person, a machine, or another means of interacting with the client device 110. In some embodiments, the user 106 interacts with the social messaging system 130 via the client device 110.

As shown in FIG. 1, the data layer 128 has one or more database servers 132 that enable access to an information storage repository or database 134. The database 134 is a storage device that stores data such as member profile data, social graph data (e.g., relationships among members of the social messaging system 130), and other user data.

An individual can register with the social messaging system 130 to become a member of the social messaging system 130. Once registered, a member can form social network relationships (e.g., friends, followers, or contacts) on the social messaging system 130 and interact with a broad range of applications provided by the social messaging system 130.

The application logic layer 126 includes various application logic modules 150, which, in conjunction with the interface module 140, generate various user interfaces with data retrieved from various data sources or data services in the data layer 128. Individual application logic modules 150 may be used to implement the functionality associated with various applications, services, and features of the social messaging system 130. For instance, a social messaging application can be implemented with one or more of the application logic modules 150. The social messaging application provides a messaging mechanism for users of the client devices 110 to send and receive messages that include text and media content such as pictures and video. The client device 110 may access and view the messages from the social messaging application for a specified period of time (e.g., limited or unlimited). In one example, a particular message is accessible to a message recipient for a predefined duration (e.g., specified by the message sender) that begins when the particular message is first accessed. After the predefined duration elapses, the message is deleted and is no longer accessible to the message recipient. Of course, other applications and services may be separately embodied in their own application logic modules 150.

As shown in FIG. 1, the social messaging system 130 or the client application 112 includes a custom pattern system 160 that provides functionality to identify and decode optical barcodes employing custom functional patterns. In various embodiments, the custom pattern system 160 can be implemented as a standalone system and is not necessarily included in the social messaging system 130. In some embodiments, the client device 110 includes a portion of the custom pattern system 160 (e.g., a portion of the custom pattern system 160 may be included independently or in the client application 112). In embodiments where the client device 110 includes a portion of the custom pattern system 160, the client device 110 can work alone or in conjunction with the portion of the custom pattern system 160 included in a particular application server or in the social messaging system 130.

FIG. 2 is a block diagram 200 of the custom pattern system 160.

The custom pattern system 160 includes a communication module 210, a presentation module 220, a finder module 230, an alignment module 240, a decoder module 250, an action module 260, and an encoder module 270. All, or a portion, of the modules 210-270 communicate with each other, for example, via a network coupling, shared memory, and the like. Each module of the modules 210-270 can be implemented as a single module, combined into other modules, or further subdivided into multiple modules. Other modules not pertinent to the exemplary embodiments can also be included, but are not shown.

The communication module 210 provides various communication functionality. For example, the communication module 210 receives, accesses, or otherwise obtains image data of an image from a user device. In a specific example, the communication module 210 receives substantially real-time image data from a camera sensor of a smartphone (e.g., a single frame of image data or a continuous stream of frames captured by the camera sensor of the smartphone). The communication module 210 exchanges network communications with the database servers 132, the client devices 110, and the third party servers 120. The information retrieved by the communication module 210 includes data associated with the user (e.g., member profile data from an online account or social network service data) or other data to facilitate the functionality described herein.

The presentation module 220 provides various presentation and user interface functionality operable to interactively present information to, and receive information from, the user. For instance, the presentation module 220 is usable to present user interfaces generated in response to decoding the optical barcode. In other instances, the presentation module 220 generates user interfaces that include optical barcode(s). In various embodiments, the presentation module 220 presents or causes presentation of information (e.g., visually displaying information on a screen, acoustic output, haptic feedback). Interactively presenting information is intended to include the exchange of information between a particular device and the user. The user may provide input to interact with the user interface in many possible manners, such as alphanumeric, point based (e.g., cursor), tactile, or other input (e.g., touch screen, tactile sensor, light sensor, infrared sensor, biometric sensor, microphone, gyroscope, accelerometer, or other sensors). The presentation module 220 provides many other user interfaces to facilitate the functionality described herein. The term "presenting" as used herein is intended to include communicating information or instructions to a particular device that is operable to perform presentation based on the communicated information or instructions.

The finder module 230 provides image processing functionality to identify, recognize, or detect the custom graphic being employed as a finder pattern in the optical barcode. For example, the finder module 230 extracts and analyzes candidate shape features or candidate contour characteristics from image data of the image received from the user device (e.g., the client device 110). The finder module 230 determines satisfaction of various rules or criteria associated with the extracted candidate shape features. The finder module 230 compares the extracted candidate shape features with a reference shape feature of the custom graphic, or another reference image of the custom graphic, to identify the custom graphic included in the image. The finder module 230 can employ a wide variety of schemes and techniques to extract candidate shape features from the image data of the image and subsequently identify the custom graphic based on an analysis of the candidate shape features. Examples of those techniques are discussed below with respect to FIGS. 5-14.

The alignment module 240 provides image processing functionality to determine an alignment of the optical barcode using the custom graphic. The custom pattern system 160 can use the alignment to facilitate decoding of the data encoded in the optical barcode. In this way, the custom graphic functions as an alignment pattern for the optical barcode. For example, the alignment module 240 extracts spatial attributes of the custom graphic in the image from the image data. In various embodiments, the spatial attributes include at least one of a position, orientation, scale, or another spatial aspect of the optical barcode. The alignment module 240 determines the alignment of the optical barcode based on the spatial attributes (e.g., a particular orientation of the optical barcode). In one example, the alignment module 240 can determine an alignment including position and orientation based on the spatial attributes and generate a transformed image according to the alignment. The custom pattern system 160 can then use the transformed image to decode the data encoded in a portion of the transformed image.

The decoder module 250 provides functionality to decode the data encoded in the image using the spatial attributes or the determined alignment of the custom graphic in the image. For instance, the decoder module 250 can decode the data encoded in the image from an image transformed according to the spatial attributes of the custom graphic extracted from the image data. In one embodiment, the decoder module 250 detects markings (e.g., high-contrast dots, squares, or other marks in the image) representing the data encoded in a portion of the image from the image data. In a specific example, the decoder module 250 employs a Reed-Solomon error correction scheme to decode the data encoded in the image. The Reed-Solomon error correction scheme allows a successful or valid decoding even when a certain percentage of the data (e.g., corrupted bits or incorrectly decoded bits) cannot be decoded from the optical barcode. In some embodiments, a user or administrator of the custom pattern system 160 configures a tolerance value for an amount of corrupted or incorrectly decoded data that is acceptable when decoding the optical barcode. In some embodiments, the decoder module 250 also provides image processing functionality to improve decoding of the optical barcode. For instance, the decoder module 250, as well as the alignment module 240, can perform image transforms of the image (e.g., image sharpening, noise reduction, other digital filtering, or other image processing techniques to improve decoding accuracy).
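The disclosure names Reed-Solomon error correction without fixing an implementation; the sketch below uses the third-party Python package `reedsolo` (API as of recent versions) as one plausible stand-in, with an assumed payload and symbol count.

```python
from reedsolo import RSCodec, ReedSolomonError

rsc = RSCodec(10)  # 10 ECC symbols can correct up to 5 corrupted bytes

encoded = rsc.encode(b"snapcode payload")  # data as it would be rendered
corrupted = bytearray(encoded)
corrupted[0] ^= 0xFF                       # simulate one misread mark

try:
    decoded, _, _ = rsc.decode(bytes(corrupted))
    print(decoded)  # recovers b"snapcode payload" despite the corruption
except ReedSolomonError:
    print("too many corrupted marks to recover")
```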

The action module 260 provides functionality to perform a variety of actions based on decoding the data encoded in the image. For example, the data encoded in a portion of the image can indicate a particular action or include information to be used in conjunction with a particular action. In a specific example, the data encoded in a portion of the image can comprise a username, or another user identification, of a member of a social networking service and, based on decoding the username, the action module 260 can perform an action on the social networking service corresponding to the username (e.g., sending a message to the member associated with the username). In some embodiments, the action module 260 performs an action specific to the particular app that is scanning the image (e.g., a function available to users of the app but otherwise unavailable). In some instances, the action module 260 performs the action without communicating with an external server (e.g., an action performed locally on the user device that scanned the snapcode).

The encoder module 270 provides functionality to generate and encode data into the optical barcode that employs the custom graphic as one or more functional patterns (e.g., generating a snapcode). As discussed above in connection with the decoder module 250, in a specific example, the encoder module 270 can employ a data encoding technique such as Reed-Solomon error correction. In one exemplary embodiment, the encoder module 270 renders a machine-readable arrangement of marks that represents the data to be encoded. The encoder module 270 can then generate the machine-readable optical barcode using the rendered arrangement of marks and the custom graphic to be used as a functional pattern.
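A minimal sketch of the mark-rendering step, assuming NumPy; the grid geometry, cell size, and bit layout are illustrative choices rather than the snapcode's actual format.

```python
import numpy as np

def render_marks(bits, grid=18, cell=16):
    """Draw one dark dot per 1-bit on a white canvas, in row-major order."""
    canvas = np.full((grid * cell, grid * cell), 255, dtype=np.uint8)
    for index, bit in enumerate(bits):
        if bit:
            row, col = divmod(index, grid)
            y, x = row * cell + cell // 2, col * cell + cell // 2
            canvas[y - 4:y + 4, x - 4:x + 4] = 0  # square dot for simplicity
    return canvas
```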

FIGS. 3A and 3B are diagrams illustrating examples of optical barcodes (e.g., snapcodes) that use a custom graphic for a finder pattern or an alignment pattern. Diagram 300 illustrates an exemplary optical barcode that includes a custom graphic 310 (e.g., a company logo) and markings 320 that represent the data encoded into the optical barcode. In this example, the custom graphic 310 is a company logo, such as the SNAPCHAT® "ghost" logo. It will be appreciated that the SNAPCHAT® "ghost" logo is merely an example custom graphic, and other graphics, icons, or symbols can be employed as a finder pattern or alignment pattern using the techniques described herein. Other example custom graphics used as a functional pattern can include multiple paths, multiple polygons, multiple aesthetic elements, or other design features.

As shown in diagram 300, the markings 320 are dots arranged in a pattern with a particular spacing or positioning readable by a machine. Although diagram 300 shows the markings 320 as dots, other shapes and marks can be employed (e.g., squares or asymmetric shapes of various geometries). The markings 320 can be arranged in a uniform pattern or a non-uniform pattern. In some instances, the marks can be of differing sizes or a uniform size. Additionally, the markings 320 can be in a predetermined arrangement or an arrangement that is dynamically determinable when decoding the data from the markings. In some embodiments, the custom graphic 310 and the markings 320 can be surrounded by a bounding shape, such as an outer box 325. Although the outer box 325 of diagram 300 is shown as a square with rounded corners, the outer box 325 can be in the form of a variety of other shapes with various geometries. Diagram 330 in FIG. 3B shows another exemplary optical barcode that employs a custom graphic for a finder pattern or an alignment pattern. Diagram 330 shows the optical barcode with the markings excluded from the interior of the custom graphic. In these and other embodiments, the space inside the custom graphic can be reserved for other uses. For example, a picture, graphic, animation, annotation, or image selected by the user can be inserted.

Referring now to FIG. 4, diagram 400 illustrates an example of identifying and decoding an optical barcode that employs a custom graphic for a finder pattern or an alignment pattern. FIG. 4 is an overview of a particular exemplary embodiment of identifying and decoding the optical barcode using the custom graphic. Additional details and alternative implementations are discussed in connection with the figures that follow. In diagram 400, a scene 402 illustrates a poster 404, which includes an optical barcode 406, and a user 410.

It will be appreciated that the optical barcode 406 can be displayed in a variety of manners, such as on a user device display or a computer display, woven or otherwise affixed to an article of clothing or another item, or included in a variety of printed items. Callout 412 portrays an enlarged view of a portion of the scene 402. The callout 412 includes a user device 414 of the user 410, which includes an optical sensor (e.g., a camera sensor of a smartphone) operable to detect an optical signal 408 of the optical barcode 406.

In one exemplary embodiment, the user device 414 captures an image of the poster 404 that includes the optical barcode 406. The custom pattern system 160 receives the image data representing the image from the user device 414. In this exemplary embodiment, the custom pattern system 160 is included in the user device 414 (e.g., an application executing on the smartphone of the user 410), although in other exemplary embodiments the custom pattern system 160 can reside on a server (e.g., a server of the social messaging system 130) that is communicatively coupled with the user device 414. Callout 416 portrays exemplary image processing performed by the finder module 230 to identify the custom graphic in the image and use the custom graphic as an alignment pattern for decoding the data included in the optical barcode 406. In the callout 416, the finder module 230 extracts candidate shape features from the image data of the image. Subsequently, the finder module 230 determines whether the candidate shape features satisfy certain rules and criteria, in order to filter out candidate shape features that are irrelevant or unlikely to be the custom graphic. The finder module 230 can then compare the candidate shape features that satisfy the shape feature criteria or rules with a reference shape feature of the custom graphic. In one example, the finder module 230 identifies the custom graphic based on a match between a candidate shape feature and the reference shape feature (e.g., a match score exceeding a threshold).

Following the identification of the custom graphic by the finder module 230, the custom pattern system 160 can use the custom graphic as an alignment pattern for decoding. For example, the alignment module 240 extracts spatial attributes of the custom graphic from the image and compares the extracted spatial attributes with reference spatial attributes to determine an alignment of the custom graphic. The alignment module 240 or the decoder module 250 can then generate a transformed image of the image according to the alignment (e.g., a rotation or de-skew), as shown in callout 418. After generating the transformed image, the decoder module 250 decodes the data encoded in a portion of the transformed image, as shown in callout 420. In the callout 420, the dots of the optical barcode 406 are transformed into data, shown as ones for dots and zeros for non-dots, although this is merely illustrative, and other schemes can be employed. In this way, the custom pattern system 160 uses the custom graphic included in the optical barcode 406 as one or more functional patterns, such as a finder pattern or an alignment pattern.
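To make the dot-to-bit step of callout 420 concrete, here is an illustrative sketch that samples a binarized, aligned image on an assumed regular grid (1 for a dot, 0 for no dot); the grid dimensions are not taken from the disclosure.

```python
import numpy as np

def read_marks(binary_image, grid=18):
    """Sample each grid cell's center; dark centers become 1-bits."""
    h, w = binary_image.shape
    bits = []
    for row in range(grid):
        for col in range(grid):
            y = int((row + 0.5) * h / grid)
            x = int((col + 0.5) * w / grid)
            bits.append(1 if binary_image[y, x] < 128 else 0)
    return bits
```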

FIG. 5 is a flow diagram illustrating an exemplary method 500 for identifying and decoding an optical barcode that employs a custom functional pattern (e.g., the optical barcode 406 of FIG. 4). The operations of the method 500 can be performed by components of the custom pattern system 160, and are so described below for purposes of illustration.

At operation 510, the communication module 210 receives image data of an image from a user device. For example, the communication module 210 receives the image data from an optical sensor (e.g., a camera sensor) of a smartphone of the user. In various embodiments, the image data is received in response to a user-initiated image capture, a periodic monitoring of image data detected by the optical sensor of the user device, or a combination thereof. In some embodiments, the image data represents an image or video being captured by the user device in substantially real time (e.g., a live image feed from the camera sensor of the smartphone). In other embodiments, the image data represents an image or video captured by the user device, or another device, at a past time and stored on the user device (e.g., a still image or video stored on the user device or downloaded from a social networking service). In embodiments where the image data comprises video image data, the custom pattern system 160 can analyze individual frames of the video, or a combination of multiple frames of the video, to detect and decode the optical barcode. A portion of the image data can include data representing an optical barcode that employs a custom graphic, custom symbol, or specific graphic for a particular functional pattern (e.g., a finder pattern or an alignment pattern).

In some scenarios, the image data includes extraneous or irrelevant data along with the data pertaining to the optical barcode (e.g., an image of the optical barcode includes a background that is not pertinent to decoding the optical barcode). In a specific example, the optical sensor of the user device captures an image of a movie poster that includes a particular optical barcode. The image of the movie poster can include portions of the movie poster or the background surrounding the particular optical barcode, which are not pertinent to the barcode itself.

At operation 520, the finder module 230 extracts a candidate shape feature, or candidate characteristic, of the image from the image data. The candidate shape feature can be indicative of an identification of the custom graphic (e.g., it can include a feature or characteristic indicative of the custom graphic). For example, the finder module 230 performs an edge detection technique, or another image processing technique, to identify shape features such as contours of the image or localized concentrations of color or shading. In some embodiments, the finder module 230 extracts multiple candidate shape features from the image data. In some embodiments, a candidate shape feature includes various shape feature data, such as a position of the candidate shape feature relative to a boundary of the image, a brightness of the candidate shape feature relative to the image, and an average color of the candidate shape feature.

In further exemplary embodiments, the finder module 230 generates a low-resolution copy of the image. The finder module 230 can perform various image processing on the low-resolution copy of the image, such as a blur (e.g., a Gaussian blur function or another blur function) and a thresholding, to generate a modified low-resolution image. The thresholding image process can include adjusting lighter colors of the low-resolution copy of the image (e.g., as determined by a threshold or threshold range) to a white color and adjusting darker colors of the low-resolution copy of the image (e.g., as determined by a threshold or threshold range) to a black color. The finder module 230 can then extract candidate shape features from the modified low-resolution image, both to improve the detection of the custom graphic in the image and to improve the computational efficiency of identifying the custom graphic in the image.
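A minimal sketch of this preprocessing, assuming OpenCV; the target resolution, kernel size, and use of Otsu's method to pick the threshold are illustrative assumptions.

```python
import cv2

def preprocess_low_res(gray_image, size=(128, 128)):
    """Downscale, blur, and binarize the image before contour extraction."""
    small = cv2.resize(gray_image, size, interpolation=cv2.INTER_AREA)
    blurred = cv2.GaussianBlur(small, (5, 5), 0)
    # Lighter pixels become white (255); darker pixels become black (0).
    _, binary = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return binary
```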

In yet a further exemplary embodiment, the finder module 230 creates a high-resolution copy of a portion of the image. For example, the finder module 230 may generate a high-resolution copy of a particular portion of the image corresponding to the extracted candidate shape feature. The finder module 230, the alignment module 240 or the decoder module 250 may use high resolution copies for subsequent analysis as described below to improve detection, alignment, and decoding results.

At operation 530, the finder module 230 determines that the candidate shape feature satisfies one or more shape feature criteria or rules. For example, if a particular shape feature is a contour, the finder module 230 can determine whether the contour is a closed curve that surrounds a portion of the image. Consistent with some embodiments, the shape feature rules filter out irrelevant or extraneous candidate shape features. Particular shape feature rules can serve various purposes. For example, a particular shape feature rule can be intended to filter out candidate shape features that have a low probability of being the custom graphic; such a shape feature rule can be specific to the custom graphic. In other examples, a shape feature rule can be intended to filter out candidate shape features that are unlikely to be associated with the optical barcode; such shape feature rules are not necessarily specific to the custom graphic.

At operation 540, in response to the candidate shape feature satisfying the shape feature rule, the finder module 230 identifies the custom graphic or custom symbol in the image by comparing the candidate shape feature to a reference shape feature of the custom graphic or custom symbol. For example, the finder module 230 can compare an area or size of the candidate shape feature to a reference area or size of the reference shape feature. In this example, the finder module 230 identifies the custom graphic based on a match or near match (e.g., a percentage match above a threshold) between the candidate shape feature and the reference shape feature. In this way, the finder module 230 uses the custom graphic, or at least a portion of the custom graphic, as a finder pattern to identify the presence of the optical barcode in a portion of the image.

In some embodiments, the finder module 230 extracts multiple candidate shape features from the image data. In these embodiments, the finder module 230 scores the candidate shape features and ranks them according to their respective scores. For instance, the finder module 230 determines a shape feature score for each candidate shape feature based on a count, or a weighted count, of the shape feature rules that the candidate shape feature satisfies. The finder module 230 can iterate through the ranked candidate shape features, starting with the highest-scoring candidate shape feature, and perform further analysis (e.g., comparing the candidate shape feature to a reference shape feature) to determine whether the candidate shape feature is the custom graphic.
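An illustrative sketch of this scoring-and-ranking step; representing the shape feature rules as (predicate, weight) pairs is an assumption made for the example.

```python
def rank_candidates(candidates, rules):
    """Order candidates by a weighted count of satisfied shape feature rules.

    rules: list of (predicate, weight) pairs applied to each candidate.
    """
    scored = [
        (sum(weight for predicate, weight in rules if predicate(c)), c)
        for c in candidates
    ]
    # Analyze the highest-scoring candidates first.
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [candidate for _, candidate in scored]
```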

In some embodiments, the reference shape feature is predetermined; in other embodiments, the reference shape feature is dynamically determined. For instance, the finder module 230 can dynamically determine the reference shape feature by analyzing a reference image of the custom graphic. For example, the finder module 230 can perform analysis techniques on the reference image similar to the analysis techniques used on the image data, such as calculating a reference area value for a particular feature or characteristic of the reference image. In these embodiments, dynamically determining the reference shape feature allows for dynamic use of a particular custom graphic as a functional pattern in the optical barcode. For example, the custom pattern system 160 can be provided with (e.g., receive at the communication module 210) data representing the reference image, or data representing the reference shape feature, at the time the method 500 is performed. In this way, the custom functional pattern does not necessarily have to be fixed prior to performing the method 500.

In further exemplary embodiments, the finder module 230 searches the image data of the image for multiple custom graphics (e.g., where multiple versions of a custom graphic, or different custom graphics, are used as functional patterns). In a specific example, the custom graphic can comprise a first company logo, and the company may change its logo to a second company logo. The custom pattern system 160 can use both the first company logo and the second company logo as finder patterns, and the custom pattern system 160 can search for each respective logo when performing the method 500.

In further exemplary embodiments, the finder module 230 identifies the custom graphic in the image in combination with other candidate shape features extracted from the image data. For example, the finder module 230 can search for both a custom graphic (e.g., a logo) and an outer box (e.g., the outer box 325) that surrounds the custom graphic. In these embodiments, the finder module 230 identifies a combination of the custom graphic and one or more additional candidate shape features extracted from the image data.

At operation 550, in response to identifying the custom graphic, the alignment module 240 extracts spatial attributes, geometric attributes, or spatial properties of the custom graphic or custom symbol in the image from the image data. For example, the alignment module 240 extracts a position, scale, or orientation of the custom graphic from the image data. In various exemplary embodiments, the spatial attributes are indicative of an orientation of the custom graphic in the image. The alignment module 240 or the decoder module 250 can use the spatial attributes to facilitate decoding of the optical barcode.

In further embodiments, the alignment module 240 extracts spatial attributes, geometric attributes, or spatial properties of another candidate shape feature extracted from the image data of the image. For instance, the alignment module 240 extracts spatial attributes of the outer box (e.g., the outer box 325 of FIG. 3A) that surrounds the custom graphic and the markings that encode the data. It will be noted that, throughout the discussion that follows, the alignment module 240 and the decoder module 250 can use the spatial attributes of the outer box, in the same or a similar manner as the spatial attributes of the custom graphic, to determine the alignment of the optical barcode used to facilitate decoding. For example, the alignment module 240 or the decoder module 250 can generate the transformed image of the image, used to decode the data encoded in the image, using the spatial attributes of the outer box.

At operation 560, the decoder module 250 decodes the data encoded in a portion of the image from the image data using the spatial attributes of the custom graphic in the image. For example, the decoder module 250 performs an image transform using the spatial attributes (e.g., a de-skew, a rotation, a scale, or another type of image transform) to improve the detectability or readability of the data encoded in a portion of the image. In one embodiment, the decoder module 250 decodes the data encoded in a portion of the image by detecting markings (e.g., dots, squares, or other marks) representing the data included in the image. In this way, the decoder module 250 uses the custom graphic, or at least a portion of the custom graphic, as an alignment pattern to facilitate decoding of the optical barcode. In various embodiments, the decoder module 250 employs a Reed-Solomon error correction scheme to decode the data encoded in the image. The Reed-Solomon error correction scheme allows a successful decoding of the data encoded in the image even when a certain percentage of the data encoded in the image is corrupt, missing, or incorrectly decoded. In further embodiments, the decoder module 250 uses a small checksum to verify that the value decoded from the image data is a value that includes real data, rather than random data (e.g., random bits).
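A sketch of such a small checksum; the choice of CRC-8 and its polynomial is an illustrative assumption, not the scheme used by the disclosure.

```python
def crc8(data: bytes, poly: int = 0x07) -> int:
    """Compute an 8-bit cyclic redundancy check over the payload."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def payload_is_plausible(payload: bytes, checksum: int) -> bool:
    """Reject decodes whose checksum fails (likely random bits, not data)."""
    return crc8(payload) == checksum
```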

In further exemplary embodiments, the decoder module 250 rejects decoded data that matches results specified as invalid by an administrator of the custom pattern system 160. For example, the decoder module 250 can reject decoded data comprising all zeros, all ones, or another specified result, even if the decoded data passes other data integrity checks (e.g., error correction and checksum). This can arise, for example, when the custom pattern system 160 scans the custom graphic without any accompanying markings representing data (e.g., if the custom graphic is a logo, simply scanning the logo can yield decoded data of all zeros, which the decoder module 250 rejects). In a specific example, scanning an icon associated with the social messaging application 1908, shown below in FIG. 19, would yield data of all zeros, and the decoder module 250 would reject the scan.

FIG. 6 is a flow diagram illustrating further exemplary operations for identifying the optical barcode (e.g., the optical barcode 406) using the custom functional pattern. As described above, at operation 530, the finder module 230 determines that the candidate shape feature satisfies the shape feature rule. In some embodiments, operation 530 includes the operations of FIG. 6.

At operation 610, the finder module 230 determines from the image data whether the candidate shape feature comprises a closed curve. That is, the shape feature rule comprises a path rule, and the finder module 230 determines whether the candidate shape feature satisfies the path rule. The finder module 230 can employ a variety of techniques to determine whether the candidate shape feature satisfies the path rule.

At operation 630, the finder module 230 determines whether the candidate shape feature is a closed curve by determining whether the candidate shape feature has a path that starts at a particular point, surrounds a portion of the image, and returns to that same point. In the exemplary embodiment, if the candidate shape feature does not satisfy the path rule (indicated as "No" in FIG. 6), no further analysis of that candidate shape feature is performed, and the finder module 230 analyzes another candidate shape feature or performs no further operations. Alternatively, at operation 640, if the finder module 230 determines that the candidate shape feature satisfies the path rule (indicated as "Yes" in FIG. 6), subsequent operations of the method 500 are performed.
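A minimal sketch of the path rule check; representing the candidate outline as an ordered array of points, and the closure tolerance, are assumptions made for the example.

```python
import numpy as np

def satisfies_path_rule(points, tolerance=2.0):
    """points: (N, 2) array of contour coordinates in traversal order.

    The path rule holds when the path's end lies (nearly) on its start,
    i.e., the outline is a closed curve enclosing part of the image.
    """
    points = np.asarray(points, dtype=float)
    return np.linalg.norm(points[0] - points[-1]) <= tolerance
```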

To illustrate the concepts of FIG. 6, FIG. 7 shows a diagram 700 depicting an example of identifying the optical barcode using the custom functional pattern. In the diagram 700, an image 710 is an exemplary image received or accessed from the user device. An image 720 is an exemplary image portraying exemplary candidate shape features 730. For example, the finder module 230 performs an edge detection image processing on the image 710 to derive the image 720. From the image 720, the finder module 230 identifies the candidate shape features 730.

Callout 740 illustrates a particular candidate shape feature among the candidate shape features 730, showing its contour 750 (shown in dotted lines), a path 760, and a point 770. At callout 740, the finder module 230 determines that the path rule is satisfied if the path 760 starting at the point 770 can return to the point 770 along the contour 750. In the drawing 700, the particular candidate shape feature shown in callout 740 satisfies the path rule because the path 760 can return to the point 770 along the contour 750.
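As a rough illustration of the path rule, the sketch below treats candidate shape features as contours extracted from an edge image; a contour that encloses a non-degenerate region behaves like a path that starts at a point and returns to that same point around a portion of the image. This is a minimal sketch assuming OpenCV in Python, with an illustrative area floor; it is not the patent's actual finder logic.

import cv2

def candidate_shape_features(image_gray):
    """Derive candidate shape features (contours) from an edge image, cf. the image 720."""
    edges = cv2.Canny(image_gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    return contours

def satisfies_path_rule(contour, min_enclosed_area=1.0):
    """Path rule: the candidate traces a closed curve surrounding part of the image."""
    # An open edge fragment encloses (almost) no area; a genuinely closed curve does.
    return cv2.contourArea(contour) >= min_enclosed_area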

FIG. 8 is a flow diagram illustrating further exemplary operations for identifying an optical barcode using a custom functional pattern. At operation 530, the finder module 230 determines that the candidate shape feature satisfies the shape feature rule. In some embodiments, operation 530 includes the operations of FIG. 8.

At operation 810, the finder module 230 calculates an area value or size approximation of the candidate shape feature. For example, the finder module 230 approximates the candidate shape feature using a proxy shape such as a polygon (e.g., a square, rectangle, or other quadrilateral) or a non-polygonal shape (e.g., an ellipse). The finder module 230 fits or aligns the proxy shape to the outer edge or outer perimeter of the candidate shape feature so that the proxy shape represents the area of the candidate shape feature. The finder module 230 then calculates the area of the proxy shape to determine the area value or size approximation of the candidate shape feature. In some embodiments, the finder module 230 uses this technique (e.g., polygon area approximation) to avoid a computationally expensive area calculation in situations where the candidate shape feature may have a complex shape (area calculations for features with intricate or irregular shapes are usually more computationally expensive). In other embodiments, other techniques, such as pixel-based counting, can be employed to determine the area value.

At operation 820, the finder module 230 determines an area score or size score for the candidate shape feature. The finder module 230 determines the area score by comparing the area value of the candidate shape feature with a reference area value. In some embodiments, the reference area value comprises the area value of a corresponding proxy shape fitted to a reference image of the custom graphic (e.g., the area value of a proxy shape aligned with the ghost logo in a front view). In another embodiment, the reference area value comprises the area value of the custom graphic itself (e.g., the area value of the ghost logo). The finder module 230 calculates the area score, for example, by determining a matching percentage between the candidate shape feature's area value and the reference area value. The finder module 230 may employ a variety of other schemes and techniques to calculate the area score.

At operation 830, the finder module 230 determines if the area score exceeds a threshold. The threshold may be predefined or dynamically determined (e.g., statistically determined based on the rolling historical average of the scans).

At operation 840, based on the area score exceeding the threshold (denoted "YES" in FIG. 8), the finder module 230 determines that the candidate shape feature satisfies the area rule and proceeds to subsequent operations. In another exemplary embodiment, the finder module 230 compares the area score against an area range that satisfies the area rule (e.g., greater than one particular value and less than another particular value). If the area score does not exceed the threshold (denoted "NO" in FIG. 8), in accordance with the exemplary embodiment, the finder module 230 analyzes another candidate shape feature or performs no additional operations. In some exemplary embodiments, the finder module 230 uses the determination of whether the candidate shape feature satisfies the shape feature rule to identify which candidate shape features to analyze further in the process of identifying the custom graphic in the image (e.g., to remove or skip candidate shape features that are unlikely to be the custom graphic).
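A minimal sketch of operations 810 through 840 follows, assuming a rotated bounding rectangle as the proxy shape and an illustrative fixed threshold; the actual system may instead use another proxy shape (e.g., an ellipse) or a dynamically determined threshold, as noted above.

import cv2

def area_score(contour, reference_area_value):
    """Operations 810-820: approximate the candidate's area with a proxy shape and score it."""
    (_center, (w, h), _angle) = cv2.minAreaRect(contour)  # proxy quadrilateral
    area_value = w * h
    if area_value <= 0 or reference_area_value <= 0:
        return 0.0
    # A matching percentage between the candidate area value and the reference area value.
    return min(area_value, reference_area_value) / max(area_value, reference_area_value)

def satisfies_area_rule(contour, reference_area_value, threshold=0.6):
    """Operations 830-840: the area rule holds when the score exceeds the threshold."""
    return area_score(contour, reference_area_value) > threshold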

To further illustrate the concepts of FIG. 8, FIG. 9 is a drawing 900 illustrating an example of identifying an optical barcode using a custom functional pattern. In the drawing 900, an image 902 is an exemplary image received from a user device. Callout 904 shows the spatial orientation of the image 902. In this example, the image 902 is viewed from a front-right perspective. The image 902 includes an optical barcode 906. In this example, the optical barcode 906 employs a custom graphic as a functional pattern.

Callout 908 shows an enlarged portion of the image 902 that includes a candidate shape feature analyzed by the finder module 230 to identify the custom graphic. In callout 908, a polygon 910 (e.g., a quadrilateral) is shown fitted to the boundary of the candidate shape feature. An area value 912 is the area of the polygon 910.

Callout 914 shows the reference image of the custom graphic. Callout 916 shows the spatial orientation of the reference image. In this example, the reference image is shown in a front view. A polygon 918 is shown fitted to the boundary of the reference image. A reference area value 920 is the area of the polygon 918. Although FIG. 9 shows the polygons 910 and 918 as quadrilaterals, the finder module 230 can employ other shapes, such as squares or shapes that follow or track the contour of the candidate shape feature (e.g., a polygonal or smooth shape fitted to the contour points of the candidate shape feature).

The finder module 230 compares the area value 912 with the reference area value 920 to determine whether the candidate shape feature satisfies the area rule. Another candidate shape feature of the image 902, such as one of the notes in the image 902, would not have an area value similar to the reference area value and therefore would not satisfy the area rule. In this manner, the finder module 230 can quickly remove or skip candidate shape features that are unlikely to be identified as the custom graphic.

FIG. 10 is a flow diagram illustrating further exemplary operations for decoding an optical barcode using a custom functional pattern. At operation 540, the finder module 230 identifies the custom graphic within the image by comparing the candidate shape feature with a reference shape feature of the custom graphic. Following operation 540, the operations of FIG. 10 are performed in some exemplary embodiments.

At operation 1010, the alignment module 240 extracts an intrinsic feature of the custom graphic from the image data, where the intrinsic feature indicates the alignment of the custom graphic (e.g., a particular asymmetry of the custom graphic). For example, the intrinsic feature can include an intrinsic point of the custom graphic, a characteristic curve, a particular asymmetry, a particular non-uniformity, or another characteristic of the custom graphic.

At operation 1020, the alignment module 240 determines the orientation of the custom graphic in the image by comparing the intrinsic feature with a reference intrinsic feature of the custom graphic. For example, the alignment module 240 maps the extracted intrinsic feature of the custom graphic to the reference intrinsic feature to determine the spatial difference between the features. In this manner, the alignment module 240 can determine the alignment of the custom graphic relative to the reference image of the custom graphic based on the determined spatial difference.

At operation 1030, the alignment module 240 generates a transformed image by transforming the image according to the orientation of the custom graphic. For example, the alignment module 240 can rotate, de-skew, scale, or otherwise spatially transform the image to allow more accurate decoding of the data in the image.
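For illustration only, a minimal sketch of operation 1030 follows, assuming the orientation was recovered as a rotation angle in degrees together with a uniform scale factor (both assumptions of this example; the system may instead de-skew or apply a full perspective warp).

import cv2

def normalize_orientation(image, angle_deg, scale=1.0):
    """Rotate/scale the image so the custom graphic appears as in a front view."""
    h, w = image.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, scale)
    return cv2.warpAffine(image, m, (w, h))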

At operation 1040, the decoder module 250 decodes the data encoded within the image using the orientation and position of the custom graphic within the image. For example, the decoder module 250 decodes the encoded data from the transformed image. In certain scenarios, the image is transformed to a front view to improve the visibility and uniformity of the marks in the image that represent the encoded data.

To assist in understanding the teachings of FIG. 10, FIG. 11 is a drawing 1100 illustrating an example of decoding an optical barcode using a custom functional pattern. In the drawing 1100, similar to FIG. 9 described above, an image 1102 is an exemplary image received from a user device. In this example, the image 1102 is viewed from a front-right perspective. The image 1102 includes an optical barcode 1106. In this example, the optical barcode 1106 employs a custom graphic as a functional pattern.

Callout 1108 shows an enlarged portion of the image 1102 that includes the candidate shape feature analyzed by the alignment module 240. Callout 1110 shows an enlarged portion of callout 1108 showing an intrinsic feature of the candidate shape feature.

Callout 1112 shows the reference image of the custom graphic. Callout 1114 shows the spatial orientation of the reference image. In this example, the reference image is shown in a front view. Callout 1116 shows an enlarged portion of callout 1112 showing the reference intrinsic feature of the reference image.

The alignment module 240 compares the intrinsic feature with the reference intrinsic feature to determine an alignment, including orientation, scale, or position. For example, if the image containing the custom graphic were captured from a front view, the intrinsic feature of the custom graphic within the image would match the reference intrinsic feature. The alignment module 240 can determine a perspective change based on a mismatch between the intrinsic feature and the reference intrinsic feature. The alignment module 240 uses this mismatch to infer or determine the perspective of the image, or other spatial attributes of the image, which the decoder module 250 can use to decode the data from the image more accurately.
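A minimal sketch of inferring the perspective from an intrinsic-feature mismatch follows, assuming corresponding intrinsic points have already been located in the scanned image and in the reference image; the point correspondences and the use of a homography are assumptions of this example.

import cv2
import numpy as np

def perspective_from_intrinsic_points(image_points, reference_points):
    """Estimate the mapping from the scanned graphic to its front-view reference."""
    src = np.asarray(image_points, dtype=np.float32)
    dst = np.asarray(reference_points, dtype=np.float32)
    homography, _mask = cv2.findHomography(src, dst, cv2.RANSAC)
    return homography  # usable with cv2.warpPerspective to obtain a front view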

FIGS. 12A, 12B, and 12C are diagrams illustrating various image transformations used to facilitate decoding an optical barcode using a custom functional pattern. In an exemplary embodiment, the alignment module 240 or the decoder module 250 performs an image transformation such as the rotation illustrated by the transition between exemplary optical barcodes 1200 and 1202. In another embodiment, the alignment module 240 or the decoder module 250 performs a de-skew, a scale transformation, or another type of image transformation. In another exemplary embodiment, the alignment module 240 or the decoder module 250 performs other image transformations, such as the color inversion illustrated by the transition between exemplary optical barcodes 1204 and 1206. The alignment module 240 or the decoder module 250 may also perform other image transformations not shown, such as image sharpening, noise reduction, or other image processing.

FIG. 12C shows an example technique for determining the alignment of the custom graphic. An exemplary optical barcode 1208 is rotated slightly away from zero degrees. An ellipse 1210 can be fitted to the custom graphic to determine an alignment, such as a rotation value, of the optical barcode 1208. The major axis 1212 of the ellipse provides an indication of the rotation value 1214 relative to zero degrees (of course, the minor axis or another axis can similarly be used to determine the rotation value). The alignment module 240 or the decoder module 250 can perform an image transformation to correct for the rotation value 1214, as shown by an exemplary optical barcode 1216 rotated from its original orientation 1218. In this manner, the alignment module 240 or the decoder module 250 can use the custom graphic to determine an alignment for the optical barcode included in the image so as to decode the encoded data in the image.
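A minimal sketch of the ellipse-fit technique of FIG. 12C follows, assuming the contour points of the custom graphic are available; cv2.fitEllipse requires at least five points and returns the ellipse center, axis lengths, and rotation angle in degrees.

import cv2

def rotation_value_from_graphic(contour):
    """Fit an ellipse to the custom graphic and read the rotation value off its axes."""
    (_center, _axes, angle_deg) = cv2.fitEllipse(contour)
    return angle_deg  # e.g., the input to an image rotation that restores zero degrees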

FIG. 13 is a flow diagram illustrating further exemplary operations for decoding an optical barcode using a custom functional pattern. At operation 1040, the decoder module 250 decodes the data encoded within the portion of the image from the image data. Following operation 1040, the operations of FIG. 13 are performed in some exemplary embodiments.

At operation 1310, the decoder module 250 determines a decoding failure of the encoded data within the portion of the image using the transformed image. For example, if the data decoded from the image is broken, incomplete, or garbled, the decoder module 250 determines a decoding failure. In another example, a portion of the data encoded in the image may be reserved for data validation; that is, a known or determinable value is encoded into the data, and the data is considered validly decoded only if that value is recovered from the image. The decoder module 250 may employ a variety of other schemes and techniques to determine a decoding failure of the encoded data within the portion of the image.

At operation 1320, the alignment module 240 generates another transformed image by transforming the image according to a different orientation of the custom graphic. For example, the alignment module 240 generates an image rotated by 180 degrees, and the decoder module 250 attempts to decode the data a second time. The alignment module 240 may apply a common transformation that can resolve decoding failures, such as a 90-degree rotation, or another transformation that has often resolved decoding failures in past scans. In some embodiments, the alignment module 240 performs another analysis of the image data to determine another alignment to use when generating the other transformed image. The alignment module 240 may also generate other types of transformed images by applying different types of filters (e.g., orientation, color reduction, brightness manipulation, and so forth) to the image.

At operation 1330, the decoder module 250 decodes the encoded data within the portion of the image using the other transformed image. The alignment module 240 and the decoder module 250 may iterate through any number of alignments (e.g., a set number of attempts or unlimited attempts), ending when the data is successfully decoded from the image. In this manner, the custom pattern system 160 can use the markings themselves for self-alignment.
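A minimal sketch of the retry loop of FIG. 13 follows; try_decode (returning the decoded bytes or None) and transform (applying a candidate orientation) are hypothetical helpers standing in for the decoder module 250 and the alignment module 240, and the list of orientations is illustrative.

COMMON_ORIENTATIONS = [0, 180, 90, 270]  # transforms that often resolve decoding failures

def decode_with_retries(image, try_decode, transform, max_attempts=4):
    """Iterate over candidate alignments, ending when the data is successfully decoded."""
    for angle_deg in COMMON_ORIENTATIONS[:max_attempts]:
        data = try_decode(transform(image, angle_deg))
        if data is not None:
            return data
    return None  # decoding failure after all attempts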

To further illustrate the discussion of FIG. 13, FIG. 14 is a drawing 1400 illustrating an example of decoding an optical barcode using a custom functional pattern. An exemplary optical barcode 1410 shows marking positions as empty circles; each empty circle of the optical barcode 1410 is a position for a marking. An exemplary optical barcode 1420 shows a misalignment between the markings and the marking positions. An exemplary optical barcode 1430 shows the markings aligned with the marking positions.

Referring now to FIGS. 15, 16, and 18, although the user interfaces described herein depict specific exemplary user interfaces and user interface elements, these are only non-limiting examples; many other alternative user interfaces and user interface elements can be generated by the presentation module 220 and presented to the user. Alternative presentations of the displays described herein may include additional information, graphics, options, and so forth; other presentations may include less information or provide summarized information for easy use by the user.

FIG. 15 is a user interface diagram 1500 illustrating an exemplary user interface 1510 for identifying an optical barcode. In the user interface diagram 1500, the user interface 1510 shows a substantially real-time image captured from a camera sensor of a user device (e.g., the client device 110 or the user device 414). The user interface 1510 may include graphics and user interface elements superimposed or overlaid on the underlying substantially real-time image. For example, the user interface element 1520 is a bracket that indicates the identification of an optical barcode. The user interface 1510 may indicate to the user a successful or failed scan of a particular optical barcode.

FIG. 16 is a user interface diagram 1600 illustrating an exemplary user interface 1610 for performing an action associated with an optical barcode. In an exemplary embodiment, the user interface 1610 is displayed after the user interface 1510 of FIG. 15 (e.g., after a successful scan, various action options associated with the scan are displayed). The user interface 1610 may include various action options associated with detecting a particular optical barcode, such as a user interface element 1620. In some embodiments, certain actions are performed automatically by the custom pattern system 160 in response to detecting and decoding a particular optical barcode.

In a further exemplary embodiment, the action is exclusive to software that provides scanning functionality for optical barcodes using a custom functional pattern (e.g., snap codes). In some embodiments, the software that scans the optical barcode can perform certain exclusive actions without communicating with a server. This is possible because of the exclusively branded nature of the custom functional pattern, which cannot necessarily be publicly decoded by other third-party software applications. Snap codes can specify such actions because the software that scans the branded optical barcode (e.g., mobile computing software such as an app) is likely to be associated with the branded optical barcode.

FIG. 17 is a flowchart illustrating exemplary operations for generating an optical barcode using a custom functional pattern. The operations of the method 1700 can be performed by components of the custom pattern system 160 and are described below for illustrative purposes.

At operation 1710, the communication module 210 receives a request to generate a machine-readable image, such as an optical barcode, using a custom functional pattern. In some embodiments, the request includes user-specified data to be encoded into the image.

At operation 1720, the encoder module 270 renders a machine-readable arrangement of marks that encodes the user-specified data. For example, the marks can comprise dots, squares, or other markings arranged in a predetermined pattern. In an exemplary embodiment, the presence of a mark at a particular location in the arrangement represents data.

At operation 1730, the encoder module 270 generates the machine-readable image by positioning the machine-readable arrangement of marks within the machine-readable image relative to the position of the custom graphic included in the machine-readable image. For example, the custom graphic can be located at the center of the optical barcode or elsewhere (see, e.g., the exemplary optical barcodes of FIGS. 3A and 3B).
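A minimal sketch of operations 1720 and 1730 follows, assuming a hypothetical fixed grid of marking positions arranged around a central custom graphic; the bit-to-position layout here is illustrative only and is not the actual snap code format.

def render_marks(payload: bytes, grid_positions):
    """Map payload bits onto marking positions: a mark is present where a bit is set."""
    bits = []
    for byte in payload:
        bits.extend((byte >> i) & 1 for i in range(8))
    if len(bits) > len(grid_positions):
        raise ValueError("payload too large for this pattern")
    # Return the (x, y) positions at which a dot should be drawn.
    return [pos for bit, pos in zip(bits, grid_positions) if bit]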

At operation 1740, the communication module 210 stores or transmits the machine-readable image. For example, the communication module 210 can store the machine-readable image on a user device, a server, or another storage repository, locally or remotely. In other examples, the communication module 210 transmits the machine-readable image to a user device, a server, or one or more other devices.

FIG. 18 is a user interface diagram 1800 illustrating an exemplary user interface 1810 for generating an optical barcode 1820 using a custom graphic. A user interface element 1830 provides the user with options to generate, share, or store the machine-readable image. In some embodiments, the user interface 1810 is configured to receive user-specified data (e.g., a social networking service member identifier, a website address, or other information) to encode into the machine-readable image.

FIG. 19 illustrates an exemplary mobile device 1900 executing a mobile operating system (e.g., IOS™, ANDROID™, WINDOWS® Phone, or another mobile operating system), according to some embodiments. In one embodiment, the mobile device 1900 includes a touch screen operable to receive tactile data from a user 1902. For example, the user 1902 can physically touch 1904 the mobile device 1900, and in response to the touch 1904, the mobile device 1900 can determine tactile data such as the touch location, touch force, or gesture motion. In various exemplary embodiments, the mobile device 1900 displays a home screen 1906 (e.g., Springboard on IOS™) operable to launch applications or otherwise manage various aspects of the mobile device 1900. In some exemplary embodiments, the home screen 1906 provides status information such as battery life, connectivity, or other hardware status. The user 1902 can activate a user interface element by touching the area occupied by that element. In this manner, the user 1902 interacts with the applications of the mobile device 1900. For example, touching the area occupied by a particular icon included in the home screen 1906 causes launching of the application corresponding to that icon.

Many varieties of applications (also referred to as "apps") can be executed on the mobile device 1900, such as native applications (e.g., applications programmed in Objective-C, Swift, or another suitable language running on IOS™, or applications programmed in Java running on ANDROID™), mobile web applications (e.g., applications written in Hypertext Markup Language 5 (HTML5)), or hybrid applications (e.g., a native shell application that launches an HTML5 session). For example, the mobile device 1900 may include a messaging app, an audio recording app, a camera app, a book reader app, a media app, a fitness app, a file management app, a location app, a browser app, or other apps (e.g., gaming apps, social networking apps, biometric monitoring apps). In another example, the mobile device 1900 includes a social messaging application 1908, such as SNAPCHAT, that, consistent with some embodiments, allows users to exchange ephemeral messages that include media content. In this example, the social messaging application 1908 may incorporate aspects of the embodiments described herein.

Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. A module can constitute a software module (e.g., code embodied on a machine-readable medium) or a hardware module. A "hardware module" is a tangible unit capable of performing certain operations and can be configured or arranged in a certain physical manner. In various exemplary embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) can be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.

In some embodiments, a hardware module can be implemented mechanically, electronically, or in any suitable combination thereof. For example, a hardware module can include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module can be a special-purpose processor, such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC). A hardware module can also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module can include software executed by a general-purpose processor or another programmable processor. Once configured by such software, the hardware module becomes a specific machine (or a specific component of a machine) uniquely tailored to perform the configured functions and is no longer a general-purpose processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software), can be driven by cost and time considerations.

Accordingly, the phrase "hardware module" should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, "hardware-implemented module" refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instant in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor can be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software accordingly configures a particular processor or processors, for example, to constitute a particular hardware module at one instant of time and to constitute a different hardware module at a different instant of time.

A hardware module can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules can be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications can be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules can be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module can perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module can then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules can also initiate communications with input or output devices and can operate on a resource (e.g., a collection of information).

The various operations of the exemplary methods described herein can be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors constitute processor-implemented modules that operate to perform one or more of the operations or functions described herein. As used herein, "processor-implemented module" refers to a hardware module implemented using one or more processors.

Similarly, the methods described herein can be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a method can be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors can also operate to support performance of the relevant operations in a "cloud computing" environment or as "software as a service" (SaaS). For example, at least some of the operations can be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application programming interface (API)).

The performance of certain operations may be distributed among the processors not only within a single machine but also across multiple machines. In some exemplary embodiments, the processor or processor-implemented modules may be located in a single geographic location (e.g., in a home environment, office environment, or server farm). In another exemplary embodiment, the processor or processor-implemented modules are distributed across multiple geographic locations.

The modules, methods, applications, and so forth described in conjunction with the figures above are implemented, in some embodiments, in the context of a machine and an associated software architecture. The sections below describe representative software architecture(s) and machine (e.g., hardware) architecture(s) suitable for use with the disclosed embodiments.

Software architectures are used in conjunction with hardware architectures to create devices and machines tailored to particular purposes. For example, a particular hardware architecture coupled with a particular software architecture will create a mobile device, such as a mobile phone, tablet device, or the like. A slightly different hardware and software architecture can yield a smart device for use in the "Internet of Things," while yet another combination produces a server computer for use within a cloud computing architecture. Not all combinations of such software and hardware architectures are presented here, as those of ordinary skill in the art can readily understand how to implement the inventive subject matter in contexts other than the disclosure contained herein.

FIG. 20 is a block diagram 2000 illustrating a representative software architecture 2002, which can be used in conjunction with the various hardware architectures described herein. FIG. 20 is merely a non-limiting example of a software architecture, and it will be appreciated that many other architectures can be implemented to facilitate the functionality described herein. The software architecture 2002 can execute on hardware such as the machine 2100 of FIG. 21, which includes processors 2110, memory/storage 2130, and I/O components 2150. A representative hardware layer 2004 is illustrated and can represent, for example, the machine 2100 of FIG. 21. The representative hardware layer 2004 comprises one or more processing units 2006 having associated executable instructions 2008. The executable instructions 2008 represent the executable instructions of the software architecture 2002, including implementations of the methods, modules, and so forth of the figures and description above. The hardware layer 2004 also includes memory and storage modules 2010, which also have the executable instructions 2008. The hardware layer 2004 can also comprise other hardware 2012, which represents any other hardware of the hardware layer 2004, such as the other hardware illustrated as part of the machine 2100.

In the exemplary architecture of FIG. 20, the software architecture 2002 can be conceptualized as a stack of layers, with each layer providing particular functionality. For example, the software architecture 2002 can include layers such as an operating system 2014, libraries 2016, frameworks/middleware 2018, applications 2020, and a presentation layer 2022. Operationally, the applications 2020 or other components within the layers can invoke application programming interface (API) calls 2024 through the software stack and receive a response, returned values, and so forth, illustrated as messages 2026, in response to the API calls 2024. The layers illustrated are representative in nature, and not all software architectures have all layers. For example, some mobile or special-purpose operating systems may not provide a frameworks/middleware layer 2018, while others may provide such a layer. Other software architectures can include additional or different layers.

The operating system 2014 can manage hardware resources and provide common services. The operating system 2014 can include, for example, a kernel 2028, services 2030, and drivers 2032. The kernel 2028 can act as an abstraction layer between the hardware and the other software layers. For example, the kernel 2028 can be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and so on. The services 2030 can provide other common services for the other software layers. The drivers 2032 can be responsible for controlling or interfacing with the underlying hardware. For instance, the drivers 2032 can include display drivers, camera drivers, BLUETOOTH® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), WI-FI® drivers, audio drivers, power management drivers, and so forth. In an exemplary embodiment, the operating system 2014 includes an imaging service 2033 that can provide image processing services, such as hardware-accelerated image processing, or image capture services, such as low-level access to optical sensor or light sensor data.

The libraries 2016 can provide a common infrastructure that can be utilized by the applications 2020 or other components or layers. The libraries 2016 typically provide functionality that allows other software modules to perform tasks more easily than by interfacing directly with the underlying operating system 2014 functionality (e.g., the kernel 2028, services 2030, or drivers 2032). The libraries 2016 can include system libraries 2034 (e.g., the C standard library) that can provide functions such as memory allocation functions, string manipulation functions, mathematical functions, and the like. In addition, the libraries 2016 can include API libraries 2036 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, or PNG), graphics libraries (e.g., an OpenGL framework that can be used to render 2D and 3D graphic content on a display), database libraries (e.g., SQLite, which can provide various relational database functions), web libraries (e.g., WebKit, which can provide web browsing functionality), and the like. The libraries 2016 can also include a wide variety of other libraries 2038 to provide many other APIs to the applications 2020 and other software components/modules. In an exemplary embodiment, the libraries 2016 include an imaging library 2039 that provides image processing or image capture functionality that can be utilized by the custom pattern system 160.

The frameworks/middleware 2018 can provide a higher-level common infrastructure that can be utilized by the applications 2020 or other software components/modules. For example, the frameworks/middleware 2018 can provide various graphical user interface (GUI) functions, high-level resource management, high-level location services, and so forth. The frameworks/middleware 2018 can provide a broad spectrum of other APIs that can be utilized by the applications 2020 or other software components/modules, some of which may be specific to a particular operating system or platform. In an exemplary embodiment, the frameworks/middleware 2018 include an image processing framework 2022 and an image capture framework 2023. The image processing framework 2022 can provide high-level support for image processing functions that can be used in aspects of the custom pattern system 160. Similarly, the image capture framework 2023 can provide high-level support for capturing images and interfacing with optical sensors.

The applications 2020 include built-in applications 2040 or third-party applications 2042. Examples of representative built-in applications 2040 can include, but are not limited to, a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, or a game application. The third-party applications 2042 can include any of the built-in applications as well as a broad assortment of other applications. In a specific example, a third-party application 2042 (e.g., an application developed using the ANDROID™ or IOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) can be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or another mobile operating system. In this example, the third-party application 2042 can invoke the API calls 2024 provided by the mobile operating system, such as the operating system 2014, to facilitate the functionality described herein. In an exemplary embodiment, the applications 2020 include a messaging application 2043 that includes the custom pattern system 160 as part of the application. In another embodiment, the applications 2020 include a standalone application 2045 that includes the custom pattern system 160.

The applications 2020 can utilize built-in operating system functions (e.g., the kernel 2028, services 2030, or drivers 2032), libraries (e.g., the system libraries 2034, API libraries 2036, or other libraries 2038), and frameworks/middleware 2018 to create user interfaces to interact with users of the system. Alternatively, or additionally, in some systems, interactions with a user can occur through a presentation layer, such as the presentation layer 2044. In these systems, the application/module "logic" can be separated from the aspects of the application/module that interact with the user.

Some software architectures utilize virtual machines. In the example of FIG. 20, this is illustrated by a virtual machine 2048. A virtual machine creates a software environment where applications/modules can execute as if they were executing on a hardware machine (e.g., the machine of FIG. 21). A virtual machine is hosted by a host operating system (the operating system 2014 in FIG. 20) and typically, although not always, has a virtual machine monitor 2046, which manages the operation of the virtual machine as well as the interface with the host operating system (i.e., the operating system 2014). A software architecture executes within the virtual machine, such as an operating system 2050, libraries 2052, frameworks/middleware 2054, applications 2056, or a presentation layer 2058. These layers of the software architecture executing within the virtual machine 2048 can be the same as the corresponding layers described above or can be different.

FIG. 21 is a block diagram illustrating components of a machine 2100, according to some exemplary embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically, FIG. 21 shows a diagrammatic representation of the machine 2100 in the exemplary form of a computer system, within which instructions 2116 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 2100 to perform any one or more of the methodologies discussed herein can be executed. For example, the instructions can cause the machine to execute the flow diagrams of FIGS. 5, 6, 8, 10, 13, and 17. Additionally or alternatively, the instructions can implement the communication module 210, the presentation module 220, the finder module 230, the alignment module 240, the decoder module 250, the operation module 260, the encoder module 270, and so forth. The instructions transform a general, non-programmed machine into a particular machine programmed to carry out the described and illustrated functions in the manner described. In alternative embodiments, the machine 2100 can operate as a standalone device or can be coupled (e.g., networked) to other machines. In a networked deployment, the machine 2100 can operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 2100 can comprise, but is not limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), another smart device, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 2116, sequentially or otherwise, that specify actions to be taken by the machine 2100. Further, while only a single machine 2100 is illustrated, the term "machine" shall also be taken to include a collection of machines 2100 that individually or jointly execute the instructions 2116 to perform any one or more of the methodologies discussed herein.

The machine 2100 can include processors 2110, memory/storage 2130, and I/O components 2150, which can be configured to communicate with each other, such as via a bus 2102. In an exemplary embodiment, the processors 2110 (e.g., a central processing unit (CPU), a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, a graphics processing unit (GPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), another processor, or any suitable combination thereof) can include, for example, a processor 2112 and a processor 2114 capable of executing the instructions 2116. The term "processor" is intended to include multi-core processors, which can comprise two or more independent processors (sometimes referred to as "cores") that can execute instructions contemporaneously. Although FIG. 21 shows multiple processors 2110, the machine 2100 can include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.

The memory/storage 2130 can include a memory 2132, such as a main memory or other memory storage, and a storage unit 2136, both accessible to the processors 2110, such as via the bus 2102. The storage unit 2136 and the memory 2132 store the instructions 2116, embodying any one or more of the methodologies or functions described herein. The instructions 2116 can also reside, completely or partially, within the memory 2132, within the storage unit 2136, within at least one of the processors 2110 (e.g., within a processor's cache memory), or within any suitable combination thereof, during execution thereof by the machine 2100. Accordingly, the memory 2132, the storage unit 2136, and the memory of the processors 2110 are examples of machine-readable media.

As used herein, "machine-readable medium" means a device able to store instructions and data temporarily or permanently, and can include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., erasable programmable read-only memory (EEPROM)), or any suitable combination thereof. The term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store the instructions. The term "machine-readable medium" shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., the instructions 2116) for execution by a machine (e.g., the machine 2100), such that the instructions, when executed by one or more processors of the machine (e.g., the processors 2110), cause the machine to perform any one or more of the methodologies described herein. Accordingly, "machine-readable medium" refers to a single storage apparatus or device, as well as to "cloud-based" storage systems or storage networks that include multiple storage apparatus or devices.

The I/O components 2150 can include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 2150 included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 2150 can include many other components not shown in FIG. 21. The I/O components 2150 are grouped according to functionality merely to simplify the following discussion, and the grouping is in no way limiting. In various exemplary embodiments, the I/O components 2150 can include output components 2152 and input components 2154. The output components 2152 can include visual components (e.g., a display such as a plasma display panel (PDP), a light-emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor, resistance mechanisms), other signal generators, and so forth. The input components 2154 can include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instruments), tactile input components (e.g., a physical button, a touch screen that provides the location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.

In further exemplary embodiments, the I/O components 2150 can include biometric components 2156, motion components 2158, environmental components 2160, or position components 2162, among a wide array of other components. For example, the biometric components 2156 can include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram-based identification), and the like. The motion components 2158 can include acceleration sensor components (e.g., an accelerometer), gravitation sensor components, rotation sensor components (e.g., a gyroscope), and so forth. The environmental components 2160 can include, for example, illumination sensor components (e.g., a photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., a barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensor components (e.g., machine olfaction detection sensors, or gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that can provide indications, measurements, or signals corresponding to a surrounding physical environment. The position components 2162 can include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude can be derived), orientation sensor components (e.g., magnetometers), and the like.

Communication can be implemented using a wide variety of technologies. The I/O components 2150 can include communication components 2164 operable to couple the machine 2100 to a network 2180 or devices 2170 via a coupling 2182 and a coupling 2172, respectively. For example, the communication components 2164 include a network interface component or another suitable device to interface with the network 2180. In further examples, the communication components 2164 include wired communication components, wireless communication components, cellular communication components, near field communication (NFC) components, BLUETOOTH® components (e.g., BLUETOOTH® Low Energy), WI-FI® components, and other communication components to provide communication via other modalities. The devices 2170 can be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).

Moreover, the communication components 2164 can include components operable to detect identifiers. For example, the communication components 2164 can include radio frequency identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional barcodes such as Universal Product Code (UPC) barcodes, or multi-dimensional barcodes such as Quick Response (QR) codes, Aztec codes, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, or Uniform Commercial Code Reduced Space Symbology (UCC RSS) barcodes), acoustic detection components (e.g., microphones to identify tagged audio signals), or any suitable combination thereof. In addition, a variety of information can be derived via the communication components 2164, such as location via Internet Protocol (IP) geolocation, location via WI-FI® signal triangulation, or location via detecting a BLUETOOTH® or NFC beacon signal that can indicate a particular location.

In various exemplary embodiments, one or more portions of the network 2180 can be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the public switched telephone network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a WI-FI® network, another type of network, or a combination of two or more such networks. For example, the network 2180 or a portion of the network 2180 can include a wireless or cellular network, and the coupling 2182 can be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling. In this example, the coupling 2182 can implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1xRTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) technologies including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), the Long Term Evolution (LTE) standard, other standards defined by various standard-setting organizations, other long-range protocols, or other data transfer technology.

The instructions 2116 can be transmitted or received over the network 2180 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 2164) and utilizing any one of a number of well-known transfer protocols (e.g., the Hypertext Transfer Protocol (HTTP)). Similarly, the instructions 2116 can be transmitted or received using a transmission medium via the coupling 2172 (e.g., a peer-to-peer coupling) to the devices 2170. The term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 2116 for execution by the machine 2100, and includes digital or analog communications signals or other intangible media to facilitate communication of such software. A transmission medium is an embodiment of a machine-readable medium.

The following numbered examples are embodiments.

1. A system comprising:

A communication module for receiving reference image data of a reference image for a customized graphic;

A finder module for determining from the reference image data the reference shape feature of the customized graphic, the reference shape feature representing an identity of the customized graphic;

A memory configured to store the shape feature rule and the reference shape feature of the custom graphic;

A hardware processor coupled to the memory, wherein:

The communication module is further configured to receive image data of an image from a client device;

The finder module is further configured to:

Extract a candidate shape feature of the image from the image data;

Determine that the candidate shape feature satisfies the shape feature rule;

Identify the custom graphic in the image based on a comparison of the candidate shape feature and the reference shape feature of the custom graphic in response to the candidate shape feature satisfying the shape feature rule; And

A decoder module configured to decode the data encoded within a portion of the image from the image data by detecting markings indicative of the data included in the image, in response to the finder module identifying the custom graphic.

2. The system of example 1, wherein the finder module is further configured to:

Calculating an area value of the candidate shape feature from the image data, the area value being computed in conjunction with another candidate shape feature to scale the area value;

Determine an area score for the candidate shape feature by comparing the area value with a reference area value of the custom graphics;

Determine, based on the area score exceeding a threshold, that the candidate shape feature satisfies an area rule, the shape feature rule including the area rule.

3. A computer-implemented method comprising:

Receiving reference image data of a reference image for a custom symbol;

Determining a reference shape feature of the reference image from the reference image data, the reference shape feature representing an identity of the customized symbol;

Receiving image data of an image from a user device;

Extracting a candidate shape feature of the image from the image data;

Determining that the candidate shape feature satisfies a shape feature criterion;

Identifying the custom symbol in the image by comparing the candidate shape feature with the reference shape feature of the custom symbol in response to the candidate shape feature satisfying the shape feature criterion;

Extracting a geometric attribute of the customizable symbol in the image from the image data in response to identifying the customizable symbol; And

Decoding the data encoded within a portion of the image from the image data using the geometric attribute of the custom symbol in the image.

4. The method of example 3, further comprising:

Calculating a size approximation of the candidate shape feature from the image data, the size approximation being computed in conjunction with another candidate shape feature to scale the size approximation;

Determining a size score for the candidate shape feature by comparing the size approximation with a reference size of the custom symbol; And

Determining, based on the size score exceeding a threshold value, that the candidate shape feature satisfies a size criterion, the shape feature criterion comprising the size criterion.

5. The method of example 3 or example 4, further comprising:

Determining from the image data that the candidate shape feature includes a closed curve, the closed curve having a path starting at a particular point and returning to the particular point; And

Determining that the candidate shape feature satisfies a path criterion based on the candidate shape feature comprising the closed curve, the shape feature criterion comprising the path criterion.

6. The method of any one of examples 3 to 5, wherein the geometric attribute comprises at least one of a position, scale, or orientation of the custom symbol in the image.

7. The method of any one of examples 3 to 6, further comprising:

Extracting from the image data an intrinsic feature of the custom symbol, the intrinsic feature representing an identification of the custom symbol;

Determining an orientation of the custom symbol in the image by comparing the intrinsic feature with a reference intrinsic feature of the custom symbol; And

Decoding the encoded data in the image using the orientation and position of the custom symbol in the image.

8. The method of example 7, further comprising:

Identifying an intrinsic point of the custom symbol in the image, the intrinsic feature comprising the intrinsic point; And

Determining the orientation of the custom symbol in the image by comparing the position of the intrinsic point relative to the custom symbol in the image with the position of a reference point of the custom symbol.

9. The method of example 7 or example 8, further comprising:

Generating a transformed image by transforming the image according to the orientation of the customized symbol; And

Decoding the encoded data within the portion of the image using the transformed image.

10. The method of example 9, further comprising:

Determining a decoding failure of the encoded data in the portion of the image using the transformed image;

Generating another transformed image by transforming the image according to a different orientation of the customized symbol; And

Decoding the encoded data within the portion of the image using the other transformed image.

11. The method of any one of examples 3 to 10, wherein the data encoded in the portion of the image is encoded using a plurality of marks positioned relative to the custom symbol in the image, each mark of the plurality of marks indicating a piece of the data.

12. The method of any one of examples 3 to 11, wherein the custom symbol comprises at least one of a logo, an icon, or a trademark.

13. The method of any of Examples 3 to 12,

Further comprising receiving the image data in real time from an image sensor of the user device.

14. The method of any of Examples 3 to 13, wherein the shape of the custom symbol comprises at least one asymmetry.

15. The method of any one of Examples 3 to 14,

Further comprising, in response to decoding the encoded data in the image, performing an operation on the user device using the decoded data from the image.

16. The method of Example 15, wherein the operation comprises an operation specified by the decoded data from the image.

17. The method of Example 15 or 16, wherein the operation is performed without communication with a server.

18. The method of any of Examples 15 to 17, wherein the operation is dedicated to a mobile computing application that decoded the data encoded within the portion of the image.
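
A sketch of Examples 15 to 17: the decoded payload names an action that the device performs locally, with no server round trip (the action table and payload shape are illustrative assumptions):

    ACTIONS = {
        "open_profile": lambda arg: print("opening profile", arg),
        "add_friend": lambda arg: print("adding friend", arg),
    }

    def perform_decoded_action(decoded):
        # e.g. decoded == {"action": "add_friend", "arg": "user123"}
        handler = ACTIONS.get(decoded.get("action"))
        if handler is not None:
            handler(decoded.get("arg"))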

19. A machine-readable medium comprising instructions that, when executed by at least one processor of a machine, cause the machine to perform operations comprising:

Receiving reference image data of a reference image for a particular design;

Determining a reference shape feature of the reference image from the reference image data, the reference shape feature representing an identity of the particular design;

Receiving image data of an image from a client system;

Extracting a candidate shape feature of the image from the image data;

Determining that the candidate shape feature satisfies a shape feature rule;

Identifying the particular design in the image by comparing the candidate shape feature with the reference shape feature of the particular design in response to the candidate shape feature satisfying the shape feature rule;

Extracting a spatial attribute of the particular design within the image from the image data in response to identifying the particular design; And

Decoding the data encoded within a portion of the image from the image data using the spatial attribute of the particular design within the image.


20. The machine-readable medium of Example 19, wherein the operations further comprise:

Calculating a size approximation of the candidate shape feature from the image data, wherein the size approximation is computed in conjunction with another candidate shape feature to scale the size approximation;

Determining a size score for the candidate shape feature by comparing the size approximation to a reference size of the particular design; And

Determining that the candidate shape feature satisfies a size rule based on the size score exceeding a threshold value, wherein the shape feature rule includes the size rule.


21. A machine-readable medium comprising instructions that, when executed by at least one processor of a computer, cause the computer to perform the method of any one of Examples 3 to 18.

Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in exemplary configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.

Although an overview of the subject matter of the present invention has been described with reference to specific exemplary embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term "invention" merely for convenience and without intending to voluntarily limit the scope of the present application to any single disclosure or inventive concept if more than one is in fact disclosed.

The embodiments illustrated herein have been described in sufficient detail to enable those of ordinary skill in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived from the present disclosure, such that structural and logical substitutions and changes may be made without departing from the scope of the present disclosure. Accordingly, this description is not to be taken in a limiting sense, and the scope of the various embodiments is defined only by the appended claims and their full scope of equivalents.

As used in this specification, the term "or" may be construed in either an inclusive or exclusive sense. In addition, multiple instances may be provided for a resource, operation, or structure described herein as a single instance. Moreover, the boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific exemplary configurations. Other allocations of functionality are envisioned and may fall within the scope of various embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the exemplary configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within the scope of the embodiments of the present disclosure as expressed in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims (21)

  1. A system comprising:
    A communication module for receiving reference image data of a reference image for a custom graphic;
    A finder module for determining a reference shape feature of the custom graphic from the reference image data, the reference shape feature representing an identity of the custom graphic;
    A memory configured to store the reference shape feature and a shape feature rule of the custom graphic; And
    A hardware processor coupled to the memory, wherein:
    The communication module being further configured to receive image data of an image from a client device;
    The finder module being further configured to:
    Extract a candidate shape feature of the image from the image data;
    Determine that the candidate shape feature satisfies the shape feature rule;
    Identify a custom graphic in the image based on a comparison of the candidate shape feature with the reference shape feature of the custom graphic in response to the candidate shape feature satisfying the shape feature rule; And
    A decoder module configured to decode data encoded within a portion of the image from the image data by detecting markings indicative of data contained in the image, in response to the finder module identifying the custom graphic.
  2. The system of claim 1, wherein the finder module is further configured to:
    Calculate an area value of the candidate shape feature from the image data, the area value calculated in conjunction with another candidate shape feature to scale the area value;
    Determine an area score for the candidate shape feature by comparing the area value with a reference area value of the custom graphic; And
    Determine, based on the area score exceeding a threshold, that the candidate shape feature satisfies an area rule, the shape feature rule including the area rule.
  3. A computer-implemented method, comprising:
    Receiving reference image data of a reference image for a custom symbol;
    Determining a reference shape feature of the reference image from the reference image data, the reference shape feature representing an identity of the custom symbol;
    Receiving image data of an image from a user device;
    Extracting a candidate shape feature of the image from the image data;
    Determining that the candidate shape feature satisfies a shape feature criterion;
    Identifying the custom symbol in the image by comparing the candidate shape feature with the reference shape feature of the custom symbol in response to the candidate shape feature satisfying the shape feature criterion;
    Extracting a geometric attribute of the custom symbol in the image from the image data in response to identifying the custom symbol; And
    Decoding the data encoded within a portion of the image from the image data using the geometric attribute of the custom symbol in the image.
  4. The method of claim 3, further comprising:
    Calculating a size approximation of the candidate shape feature from the image data, wherein the size approximation is computed in conjunction with another candidate shape feature to scale the size approximation;
    Determining a size score for the candidate shape feature by comparing the size approximation with a reference size of the custom symbol; And
    Determining that the candidate shape feature satisfies a size criterion based on the size score exceeding a threshold, the shape feature criterion including the size criterion.
  5. The method of claim 3, further comprising:
    Determining from the image data that the candidate shape feature comprises a closed curve, the closed curve having a path starting at a particular point and returning to the particular point, surrounding the portion of the image; And
    Determining that the candidate shape feature satisfies a path criterion based on the candidate shape feature comprising the closed curve, the shape feature criterion comprising the path criterion.
  6. The method of claim 3, wherein the geometric attribute comprises at least one of a position, scale, or orientation of the custom symbol in the image.
  7. The method of claim 3, further comprising:
    Extracting a unique feature of the custom symbol from the image data, the unique feature indicating identification of the custom symbol;
    Determining an orientation of the custom symbol in the image by comparing the intrinsic feature with a reference intrinsic feature of the custom symbol; And
    Decoding the encoded data in the image using the orientation and position of the custom symbol in the image.
  8. The method of claim 7, further comprising:
    Identifying an intrinsic point of the custom symbol in the image, the intrinsic feature comprising the intrinsic point; And
    Determining an orientation of the custom symbol in the image by comparing a position of the intrinsic point with respect to the custom symbol in the image to a position of a reference point of the custom symbol.
  9. The method of claim 7, further comprising:
    Generating a transformed image by transforming the image according to the orientation of the custom symbol; And
    Decoding the encoded data within the portion of the image using the transformed image.
  10. The method of claim 9, further comprising:
    Determining a decoding failure of the encoded data in the portion of the image using the transformed image;
    Generating another transformed image by transforming the image according to a different orientation of the custom symbol; And
    Decoding the encoded data within the portion of the image using the other transformed image.
  11. The method of claim 3, wherein the data encoded within the portion of the image is encoded using a plurality of marks positioned relative to the custom symbol in the image, wherein each mark of the plurality of marks represents a piece of data.
  12. The method of claim 3, wherein the custom symbol comprises at least one of a logo, an icon, or a trademark.
  13. The method of claim 3,
    Further comprising receiving the image data in real time from an image sensor of the user device.
  14. The method of claim 3, wherein the shape of the custom symbol comprises at least one asymmetry.
  15. The method of claim 3,
    Further comprising performing, in response to decoding the encoded data in the image, an operation on the user device using the decoded data from the image.
  16. The method of claim 15, wherein the operation includes an operation specified by the decoded data from the image.
  17. The method of claim 15, wherein the operation is performed without communication with a server.
  18. The method of claim 15, wherein the operation is dedicated to a mobile computing application that decoded the data encoded within the portion of the image.
  19. A machine-readable medium comprising instructions that, when executed by at least one processor of a machine, cause the machine to perform operations comprising:
    Receiving reference image data of a reference image for a particular design;
    Determining a reference shape feature of the reference image from the reference image data, the reference shape feature representing an identity of the particular design;
    Receiving image data of an image from a client system;
    Extracting a candidate shape feature of the image from the image data;
    Determining that the candidate shape feature satisfies a shape feature rule;
    Identifying the particular design in the image by comparing the candidate shape feature with the reference shape feature of the particular design in response to the candidate shape feature satisfying the shape feature rule;
    Extracting a spatial attribute of the particular design within the image from the image data in response to identifying the particular design; And
    Decoding the data encoded within a portion of the image from the image data using the spatial attribute of the particular design within the image.
  20. The machine-readable medium of claim 19, wherein the operations further comprise:
    Calculating a size approximation of the candidate shape feature from the image data, wherein the size approximation is computed in conjunction with another candidate shape feature to scale the size approximation;
    Determining a size score for the candidate shape feature by comparing the size approximation to a reference size of the particular design; And
    Determining that the candidate shape feature satisfies a size rule based on the size score exceeding a threshold value, the shape feature rule including the size rule.
  21. A machine-readable medium comprising instructions that, when executed by at least one processor of a computer, cause the computer to perform the method of any one of claims 3 to 18.
KR1020177023059A 2015-01-19 2016-01-08 Custom Function Patterns for Optical Barcodes KR102018143B1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US201562105141P true 2015-01-19 2015-01-19
US62/105,141 2015-01-19
US14/612,409 US9111164B1 (en) 2015-01-19 2015-02-03 Custom functional patterns for optical barcodes
US14/612,409 2015-02-03
US14/826,301 US9659244B2 (en) 2015-01-19 2015-08-14 Custom functional patterns for optical barcodes
US14/826,301 2015-08-14
PCT/US2016/012669 WO2016118338A1 (en) 2015-01-19 2016-01-08 Custom functional patterns for optical barcodes

Publications (2)

Publication Number Publication Date
KR20170128239A true KR20170128239A (en) 2017-11-22
KR102018143B1 KR102018143B1 (en) 2019-11-04

Family

ID=53786051

Family Applications (2)

Application Number Title Priority Date Filing Date
KR1020177023059A KR102018143B1 (en) 2015-01-19 2016-01-08 Custom Function Patterns for Optical Barcodes
KR1020197025362A KR20190104247A (en) 2015-01-19 2016-01-08 Custom functional patterns for optical barcodes

Family Applications After (1)

Application Number Title Priority Date Filing Date
KR1020197025362A KR20190104247A (en) 2015-01-19 2016-01-08 Custom functional patterns for optical barcodes

Country Status (5)

Country Link
US (3) US9111164B1 (en)
EP (1) EP3248145A4 (en)
KR (2) KR102018143B1 (en)
CN (1) CN107430697A (en)
WO (1) WO2016118338A1 (en)

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10439972B1 (en) 2013-05-30 2019-10-08 Snap Inc. Apparatus and method for maintaining a message thread with opt-in permanence for entries
US9594555B2 (en) 2014-04-07 2017-03-14 Quikkly Ltd. Computer readable storage media for invoking direct actions and processes and systems utilizing same
US9113301B1 (en) 2014-06-13 2015-08-18 Snapchat, Inc. Geo-location based event gallery
US10284508B1 (en) 2014-10-02 2019-05-07 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US9537811B2 (en) 2014-10-02 2017-01-03 Snap Inc. Ephemeral gallery of ephemeral messages
JP6520616B2 (en) * 2014-10-07 2019-05-29 株式会社デンソーウェーブ Information code generation method, program for generating information code, and information code generation apparatus
US9754355B2 (en) 2015-01-09 2017-09-05 Snap Inc. Object recognition based photo filters
US9111164B1 (en) 2015-01-19 2015-08-18 Snapchat, Inc. Custom functional patterns for optical barcodes
US9906479B1 (en) 2015-06-16 2018-02-27 Snap Inc. Storage management for ephemeral messages
US9721551B2 (en) 2015-09-29 2017-08-01 Amper Music, Inc. Machines, systems, processes for automated music composition and generation employing linguistic and/or graphical icon based musical experience descriptions
US9652896B1 (en) 2015-10-30 2017-05-16 Snap Inc. Image based tracking in augmented reality systems
US10338753B2 (en) 2015-11-03 2019-07-02 Microsoft Technology Licensing, Llc Flexible multi-layer sensing surface
US9911073B1 (en) * 2016-03-18 2018-03-06 Snap Inc. Facial patterns for optical barcodes
US9813642B1 (en) 2016-05-06 2017-11-07 Snap Inc. Dynamic activity-based image generation
US9681265B1 (en) 2016-06-28 2017-06-13 Snap Inc. System to track engagement of media items
US10182047B1 (en) 2016-06-30 2019-01-15 Snap Inc. Pictograph password security system
WO2018085426A1 (en) 2016-11-01 2018-05-11 Snap Inc. Systems and methods for fast video capture and sensor adjustment
US10242477B1 (en) * 2017-01-16 2019-03-26 Snap Inc. Coded vision system
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US10374993B2 (en) 2017-02-20 2019-08-06 Snap Inc. Media item attachment system
US10146971B1 (en) 2017-03-14 2018-12-04 Snap Inc. Optical barcodes without orientation
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US9980100B1 (en) 2017-08-31 2018-05-22 Snap Inc. Device location based on machine learning classifications
US10474900B2 (en) 2017-09-15 2019-11-12 Snap Inc. Real-time tracking-compensated image effects
US10217488B1 (en) 2017-12-15 2019-02-26 Snap Inc. Spherical video editing
US10482565B1 (en) 2018-02-12 2019-11-19 Snap Inc. Multistage neural network processing using a graphics processor

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6327388B1 (en) * 1998-08-14 2001-12-04 Matsushita Electric Industrial Co., Ltd. Identification of logos from document images
US7410099B2 (en) * 2003-06-05 2008-08-12 Ntt Docomo, Inc. Apparatus and method for reading and decoding information contained in a barcode
US7412089B2 (en) * 2005-05-23 2008-08-12 Nextcode Corporation Efficient finder patterns and methods for application to 2D machine vision problems
US8868902B1 (en) * 2013-07-01 2014-10-21 Cryptite LLC Characteristically shaped colorgram tokens in mobile transactions

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5978773A (en) 1995-06-20 1999-11-02 Neomedia Technologies, Inc. System and method for using an ordinary article of commerce to access a remote computer
EP0814611B1 (en) 1996-06-17 2002-08-28 Siemens Aktiengesellschaft Communication system and method for recording and managing digital images
US7173651B1 (en) 1998-06-02 2007-02-06 Knowles Andrew T Apparatus and system for prompt digital photo delivery and archival
US6505123B1 (en) 2000-07-24 2003-01-07 Weatherbank, Inc. Interactive weather advisory system
US7411493B2 (en) 2003-03-01 2008-08-12 User-Centric Ip, L.P. User-centric event reporting
US7535890B2 (en) 2003-12-18 2009-05-19 Ayalogic, Inc. System and method for instant VoIP messaging
KR100653886B1 (en) * 2004-11-05 2006-12-05 주식회사 칼라짚미디어 Mixed-code and mixed-code encondig method and apparatus
US8332475B2 (en) 2005-08-22 2012-12-11 Triplay Communications Ltd. Messaging system and method
WO2007089730A2 (en) 2006-01-27 2007-08-09 Spyder Lynk, Llc Encoding and decoding data in an image
US20080048044A1 (en) 2006-08-25 2008-02-28 Microsoft Corporation Barcode Encoding and Decoding
US8194914B1 (en) 2006-10-19 2012-06-05 Spyder Lynk, Llc Encoding and decoding data into an image using identifiable marks and encoded elements
USRE47534E1 (en) 2007-04-23 2019-07-23 Ramot At Tel Aviv University Ltd. System, method and a computer readable medium for providing an output image
US9491184B2 (en) 2008-04-04 2016-11-08 Samsung Electronics Co., Ltd. Method and apparatus for managing tokens for digital rights management
US20100098702A1 (en) 2008-09-16 2010-04-22 Longgui Wang Method of treating androgen independent prostate cancer
JP5331128B2 (en) 2008-12-26 2013-10-30 パナソニック株式会社 Imaging device
WO2011101784A1 (en) 2010-02-16 2011-08-25 Tigertext Inc. A messaging system apparatuses circuits and methods of operation thereof
JP4874436B2 (en) 2010-03-26 2012-02-15 A・Tコミュニケーションズ株式会社 Two-dimensional code with logo, two-dimensional code generation device with logo, two-dimensional code generation method with logo, and program
GB201122284D0 (en) 2011-12-23 2012-02-01 Zappar Ltd Content identification and distribution
US9177130B2 (en) 2012-03-15 2015-11-03 Google Inc. Facial feature detection
US8515139B1 (en) 2012-03-15 2013-08-20 Google Inc. Facial feature detection
US8441548B1 (en) 2012-06-15 2013-05-14 Google Inc. Facial image quality assessment
US8411909B1 (en) 2012-06-26 2013-04-02 Google Inc. Facial recognition
US8457367B1 (en) 2012-06-26 2013-06-04 Google Inc. Facial recognition
US8396265B1 (en) 2012-06-26 2013-03-12 Google Inc. Facial recognition
US8886953B1 (en) 2012-09-14 2014-11-11 Google Inc. Image processing
US20140263674A1 (en) * 2013-03-15 2014-09-18 Conformis, Inc. Systems, Methods, and Apparatus for Integrating Scannable Codes in Medical Devices
CA2863124A1 (en) 2014-01-03 2015-07-03 Investel Capital Corporation User content sharing system and method with automated external content integration
US9111164B1 (en) 2015-01-19 2015-08-18 Snapchat, Inc. Custom functional patterns for optical barcodes
JP2017010543A (en) 2015-06-24 2017-01-12 三星電子株式会社Samsung Electronics Co.,Ltd. Face recognition method and apparatus
US9911073B1 (en) 2016-03-18 2018-03-06 Snap Inc. Facial patterns for optical barcodes

Also Published As

Publication number Publication date
US9111164B1 (en) 2015-08-18
KR20190104247A (en) 2019-09-06
EP3248145A1 (en) 2017-11-29
KR102018143B1 (en) 2019-11-04
US10068117B1 (en) 2018-09-04
US9659244B2 (en) 2017-05-23
WO2016118338A1 (en) 2016-07-28
EP3248145A4 (en) 2017-12-13
CN107430697A (en) 2017-12-01
US20160210545A1 (en) 2016-07-21

Similar Documents

Publication Publication Date Title
US9195898B2 (en) Systems and methods for image recognition using mobile devices
US9239834B2 (en) Systems, methods and apparatus for dynamic content management and delivery
KR20130118897A (en) Smartphone-based methods and systems
CN103189864B (en) For determining the method for shared good friend of individual, equipment and computer program
EP2988209A1 (en) Mobile computing device with data cognition software
US9165406B1 (en) Providing overlays based on text in a live camera view
US9699203B1 (en) Systems and methods for IP-based intrusion detection
AU2014223732B2 (en) Systems and methods for authenticating a user based on a biometric model associated with the user
KR20170129222A (en) Geo-fence Provisioning
US20170287006A1 (en) Mutable geo-fencing system
US9836890B2 (en) Image based tracking in augmented reality systems
US20150058123A1 (en) Contextually aware interactive advertisements
US8727225B2 (en) System and method for calibration and mapping of real-time location data
WO2017100476A1 (en) Image search system
US20160085773A1 (en) Geolocation-based pictographs
US20170295250A1 (en) Messaging achievement pictograph display system
KR102003813B1 (en) Automated 3D Model Generation
US10115015B2 (en) Method for recognizing a specific object inside an image and electronic device thereof
US9430835B2 (en) Information processing method and system
US10492025B2 (en) Super geo-fences and virtual fences to improve efficiency of geo-fences
KR20170077183A (en) Hierarchical deep convolutional neural network
US9111164B1 (en) Custom functional patterns for optical barcodes
US20190311422A1 (en) Methods and arrangements including data migration among computing platforms, e.g. through use of audio encoding
US9256795B1 (en) Text entity recognition
US9652734B2 (en) Portable encoded information reading terminal configured to acquire images

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
A107 Divisional application of patent