CN112699703A - Pattern recognition method and apparatus, and computer-readable recording medium - Google Patents


Info

Publication number
CN112699703A
CN112699703A (application CN202011132620.2A)
Authority
CN
China
Prior art keywords
pattern
visual code
preview
visual
type
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011132620.2A
Other languages
Chinese (zh)
Inventor
朴憓隣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Line Plus Corp
Original Assignee
Line Plus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Line Plus Corp filed Critical Line Plus Corp
Publication of CN112699703A publication Critical patent/CN112699703A/en
Pending legal-status Critical Current

Classifications

    • G06K7/146 — Methods for optical code recognition, the method including quality enhancement steps
    • G06K7/1417 — Methods for optical code recognition specifically adapted for 2D bar codes
    • G06F18/22 — Pattern recognition; analysing; matching criteria, e.g. proximity measures
    • G06K7/0008 — General problems related to the reading of electronic memory record carriers, independent of the reading method, e.g. power transfer


Abstract

The invention relates to a pattern recognition method and apparatus, and a computer-readable recording medium. The pattern recognition method includes: a step of receiving an image including a first pattern and a second pattern; a step of determining the pattern type of the first pattern; a step of determining the pattern type of the second pattern; a step of comparing the priority of the pattern type of the first pattern with the priority of the pattern type of the second pattern; and a step of automatically executing processing of the first pattern when it is determined that the priority of the pattern type of the first pattern is higher than the priority of the pattern type of the second pattern.

Description

Pattern recognition method and apparatus, and computer-readable recording medium
Technical Field
The present invention relates to a method and apparatus for recognizing a pattern of a visual code, and more particularly, to a method and apparatus for recognizing a plurality of visually encoded patterns in order to automatically execute processing for, or display previews of, those patterns.
Background
Terminals equipped with cameras, such as smartphones, tablet computers, notebook (laptop) computers, and desktop computers, are increasingly widespread, and the cameras attached to such terminals are used in a growing number of situations. A representative example is the variety of services that use a barcode captured by the camera attached to a terminal.
On the other hand, when a plurality of barcodes are included in an image captured by the camera attached to a terminal, a barcode may fail to be recognized, or a barcode the user does not need may be recognized instead. In this case, the user must photograph the desired barcode again in order to receive the required service. This increases the time needed for the user to receive the desired service and makes it difficult to provide a smooth user experience.
Disclosure of Invention
The present invention provides a pattern recognition method and apparatus for solving the above problems, and a computer program stored in a recording medium.
The present invention provides a method, an apparatus, and a computer program stored in a recording medium for receiving an image including a plurality of visually encoded patterns from a user terminal, determining a pattern type of the plurality of patterns detected from the received image, and comparing the priority of the pattern type, thereby automatically executing processing of the pattern having a high priority.
The present invention provides a method, an apparatus, and a computer program stored in a recording medium for calculating reliability scores of a plurality of visually encoded patterns whose pattern types have the same priority, thereby automatically executing processing of the pattern having the highest reliability score.
The present invention provides a method, an apparatus, and a computer program stored in a recording medium for receiving an image including a plurality of visually encoded patterns from a user terminal, generating previews of the patterns detected from the received image, and displaying the previews in a preview area on a display of the user terminal.
The present invention may be implemented in a variety of ways including as a method, apparatus, and computer program stored in a computer readable storage medium.
The pattern recognition method executed in a user terminal according to an embodiment of the present invention includes: a step of receiving an image including a first pattern of a visual code and a second pattern of a visual code; a step of determining the pattern type of the first pattern of the visual code; a step of determining the pattern type of the second pattern of the visual code; a step of comparing the priority of the pattern type of the first pattern of the visual code with the priority of the pattern type of the second pattern of the visual code; and a step of automatically executing processing of the first pattern of the visual code when it is determined that the priority of the pattern type of the first pattern of the visual code is higher than the priority of the pattern type of the second pattern of the visual code.
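The claimed sequence of steps can be sketched as follows; the pattern-type names and the numeric priority values are illustrative assumptions, not part of the claim:

```python
# Sketch of the claimed flow: compare the priorities of two detected
# pattern types and decide which pattern's processing to execute
# automatically. Type names and priority values (1 = highest) are
# assumptions for illustration only.
PRIORITY = {"first_type": 1, "second_type": 2, "third_type": 3, "fourth_type": 4}

def pattern_to_process(first_type: str, second_type: str) -> str:
    """Return 'first', 'second', or 'tie' for the pattern to process."""
    if PRIORITY[first_type] < PRIORITY[second_type]:
        return "first"    # first pattern's type has the higher priority
    if PRIORITY[first_type] > PRIORITY[second_type]:
        return "second"
    return "tie"          # equal priority: fall back to reliability scores
```

The "tie" case corresponds to the embodiments below in which patterns of equal priority are resolved by a reliability score.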
According to an embodiment, the step of determining the pattern type of the visually encoded first pattern includes: a step of determining whether the first pattern of the visual code is a pattern related to a specific service; and a step of determining the pattern type of the first pattern of the visual code as a first pattern type when it is determined that the first pattern of the visual code is a pattern related to the specific service.
According to an embodiment, the step of determining the pattern type of the visually encoded first pattern includes: a step of determining, when it is determined that the first pattern of the visual code is not a pattern related to the specific service, whether the first pattern of the visual code is a pattern related to a service of a specific company; and a step of determining the pattern type of the first pattern of the visual code as a second pattern type when it is determined that the first pattern of the visual code is a pattern related to the service of the specific company.
According to an embodiment, in the case where it is determined that the first pattern of the visual code is not related to the service of the specific company, the step of determining the pattern type of the first pattern of the visual code further includes: a step of determining the pattern type of the first pattern of the visual code as a third pattern type when the first pattern of the visual code is a two-dimensional pattern; and a step of determining the pattern type of the first pattern of the visual code as a fourth pattern type when the first pattern of the visual code is a one-dimensional pattern.
According to an embodiment, the first pattern type, the second pattern type, the third pattern type, and the fourth pattern type have priorities that decrease in that order.
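The decision cascade described in the embodiments above might be sketched like this; the pattern representation (a dict of detector results) and its field names are hypothetical stand-ins for the real detectors:

```python
# Hypothetical classification cascade following the described order:
# specific service -> company service -> 2D codes -> remaining 1D codes.
def classify(pattern: dict) -> str:
    """Map detector results for one pattern to a pattern-type name."""
    if pattern.get("specific_service"):     # e.g. a payment-service code
        return "first_type"
    if pattern.get("company_service"):      # another service of the company
        return "second_type"
    if pattern.get("dimensions") == 2:      # other two-dimensional codes
        return "third_type"
    return "fourth_type"                    # remaining one-dimensional codes
```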
According to an embodiment, in the case where it is determined that the pattern type of the first pattern of the visual code has the same priority as the pattern type of the second pattern of the visual code, the present invention further includes: a step of calculating a reliability score of the visually encoded first pattern based on at least one of a position and a size of the visually encoded first pattern within the received image; a step of calculating a reliability score of the visually encoded second pattern based on at least one of a position and a size of the visually encoded second pattern within the received image; and displaying a preview of the first pattern of the visual code on a display of the user terminal when it is determined that the reliability score of the first pattern of the visual code is higher than the reliability score of the second pattern of the visual code.
According to an embodiment, the step of calculating the reliability score of the visually encoded first pattern based on at least one of a position and a size of the visually encoded first pattern within the received image comprises: a step of calculating a distance between a center position of the received image and a center position of the visually encoded first pattern.
According to an embodiment, the step of calculating the reliability score of the visually encoded first pattern based on at least one of a position and a size of the visually encoded first pattern within the received image comprises: a step of calculating the proportion of the area occupied in the received image by the region corresponding to the visually encoded first pattern.
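Taken together, the two measures above (centre distance and area proportion) suggest a score of roughly the following shape; the normalisation and the equal weighting of the two terms are assumptions for illustration:

```python
import math

def reliability_score(image_w: int, image_h: int, box: tuple) -> float:
    """Score a detected pattern from its position and size.

    box = (x, y, w, h) in pixels. A pattern closer to the image centre
    and covering more of the image scores higher. The normalisation and
    equal weighting of the two terms are illustrative assumptions.
    """
    cx, cy = image_w / 2.0, image_h / 2.0
    bx, by = box[0] + box[2] / 2.0, box[1] + box[3] / 2.0
    # distance between image centre and pattern centre, normalised to [0, 1]
    dist = math.hypot(bx - cx, by - cy) / math.hypot(cx, cy)
    # proportion of the image area occupied by the pattern region
    area = (box[2] * box[3]) / float(image_w * image_h)
    return (1.0 - dist) + area
```

With this shape, a large pattern at the image centre outscores a small pattern in a corner, matching the tie-breaking behaviour the embodiments describe.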
According to one embodiment, the present invention further comprises: a step of calculating a reliability score of the visually encoded first pattern based on at least one of a position and a size of the visually encoded first pattern within the received image, in a case where it is determined that the pattern type of the visually encoded first pattern has the same priority as the pattern type of the visually encoded second pattern; a step of calculating a reliability score of the visually encoded second pattern based on at least one of a position and a size of the visually encoded second pattern within the received image; and a step of automatically executing the processing of the first pattern of the visual code when it is determined that the reliability score of the first pattern of the visual code is higher than the reliability score of the second pattern of the visual code.
According to an embodiment, the present invention further includes a step of displaying a mark on a first area corresponding to the first pattern of the visual code to notify the user in advance that processing of the first pattern of the visual code will be executed automatically.
According to one embodiment, the present invention further comprises: a step of detecting a first area containing a visually encoded first pattern; a step of detecting a second region containing a visually encoded second pattern; and labeling the first region and the second region with different colors.
According to one embodiment, the present invention further comprises: a step of receiving a plurality of images in sequence in an image capturing mode; and displaying a reminder for confirming whether to enter a pattern scanning mode on a display of the user terminal when receiving an image including the first pattern of the visual code and the second pattern of the visual code.
An embodiment of the present invention provides a computer-readable recording medium storing a computer program for executing the above-described pattern recognition method in a computer.
The pattern recognition apparatus of an embodiment of the present invention includes: an image sensor for receiving an image comprising a first pattern of visual codes and a second pattern of visual codes; a pattern type judging section for judging a pattern type of a first pattern of the visual code and a pattern type of a second pattern of the visual code; and a pattern processing section for comparing the priority of the pattern type of the first pattern of the visual coding with the priority of the pattern type of the second pattern of the visual coding, and automatically executing the processing of the first pattern of the visual coding when it is determined that the priority of the pattern type of the first pattern of the visual coding is higher than the priority of the pattern type of the second pattern of the visual coding.
An image recognition method executed in a user terminal according to an embodiment of the present invention includes: a step of receiving an image comprising a first pattern of visual coding and a second pattern of visual coding; a step of generating a preview of the visually encoded first pattern; a step of generating a preview for the visually encoded second pattern; a step of displaying a preview for the first pattern and a preview for the second pattern in a preview area on a display of the user terminal; a step of receiving a selection from a user relating to a preview of the first pattern; and a step of performing processing for the visually encoded first pattern in response to the selection.
According to an embodiment, the step of displaying a preview for the first pattern and a preview for the second pattern in a preview area on a display of the user terminal includes: a step of calculating a reliability score of the visually encoded first pattern based on at least one of a position and a size of the visually encoded first pattern within the received image; a step of calculating a reliability score of the visually encoded second pattern based on at least one of a position and a size of the visually encoded second pattern within the received image; and a step of displaying the preview for the first pattern of the visual code at the upper end of the preview area on the display of the user terminal when the reliability score of the first pattern of the visual code is higher than the reliability score of the second pattern of the visual code.
According to one embodiment, the present invention further comprises: a step of detecting a first area containing a visually encoded first pattern; a step of detecting a second region containing a visually encoded second pattern; and labeling the first region and the second region with different colors.
According to an embodiment, the step of generating a preview for the visually encoded first pattern further includes: a step of transmitting Uniform Resource Locator (URL) information encoded by the first pattern of the visual code to an external apparatus when the first pattern of the visual code corresponds to a URL code; a step of receiving at least one of a thumbnail, a title, and a descriptive sentence related to the URL information from the external apparatus; and a step of displaying at least one of the received thumbnail, title, and descriptive sentence in a preview area on the display.
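One plausible way for the external apparatus to produce the thumbnail, title, and descriptive sentence is to read the target page's Open Graph meta tags; the regex-based parsing below is a deliberately simplified sketch, not a claimed implementation:

```python
import re

def extract_preview(html: str) -> dict:
    """Pull thumbnail/title/description from a page's Open Graph tags.

    Regex parsing of HTML is a simplification for illustration; a real
    implementation would use a proper HTML parser.
    """
    def og(prop: str):
        m = re.search(
            r'<meta[^>]*property="og:{}"[^>]*content="([^"]*)"'.format(prop),
            html)
        return m.group(1) if m else None

    return {"thumbnail": og("image"),
            "title": og("title"),
            "description": og("description")}
```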
An embodiment of the present invention provides a computer-readable recording medium storing a computer program for executing the above-described pattern recognition method in a computer.
The pattern recognition apparatus of an embodiment of the present invention includes: an input interface for receiving input from a user; an image sensor for receiving an image comprising a first pattern of visual codes and a second pattern of visual codes; a pattern preview generating section for generating a preview of a first pattern of the visual code and a preview of a second pattern of the visual code; a display for outputting a preview of the generated first and second patterns of visual coding in a preview area; and a pattern processing section that performs processing for the first pattern of the visual coding in response to receiving a selection related to the preview for the first pattern through the input interface.
According to the embodiments of the present invention, for a plurality of visually encoded patterns, processing of a pattern can be executed automatically on the basis of the priority of the pattern type and/or the reliability score of the pattern, thereby improving the accuracy of pattern recognition and making it easy to provide the service the user needs, for a better user experience.
According to an embodiment of the present invention, a preview for a plurality of patterns is displayed in a preview area on a display of a user terminal, whereby a user can select a pattern related to a desired service.
The effects of the present invention are not limited to the above-mentioned effects, and other effects not mentioned can be clearly understood from the following description by those skilled in the art to which the present invention pertains.
Drawings
Embodiments of the present invention will be described with reference to the following illustrative figures, wherein like reference numerals refer to like elements, but are not limited thereto.
Fig. 1 is a diagram illustrating an example of a pattern recognition method performed in a user terminal according to an embodiment of the present invention.
Fig. 2 is a diagram showing a configuration in which a plurality of user terminals are communicably connected to a server in order to perform pattern recognition according to an embodiment of the present invention.
Fig. 3 is a block diagram showing a structure of a user terminal according to an embodiment of the present invention.
Fig. 4 is a diagram illustrating a lookup table defining priorities of respective pattern types according to an embodiment of the present invention.
Fig. 5 is a diagram illustrating an example of a process of automatically performing a pattern without user input according to an embodiment of the present invention.
Fig. 6 is a flowchart illustrating a pattern recognition method of visual coding according to an embodiment of the present invention.
Fig. 7 is a flowchart illustrating a method of performing processing for one of patterns having the same priority in one embodiment of the present invention.
Fig. 8 is a flowchart illustrating a method of performing processing for one of patterns having the same priority in one embodiment of the present invention.
Fig. 9 is a flowchart illustrating a method of determining a pattern type of a visually encoded pattern according to an embodiment of the present invention.
Fig. 10 is a diagram showing an example of displaying a reminder for confirming whether to enter a pattern scanning mode in a case where a user terminal receives a plurality of visually encoded patterns according to an embodiment of the present invention.
Fig. 11 is a diagram illustrating an example of displaying a preview for a pattern selected by a user on a display according to an embodiment of the present invention.
Fig. 12 is a diagram illustrating an example of displaying a preview for each of a plurality of patterns contained within an image on a display according to an embodiment of the present invention.
Fig. 13 is a diagram illustrating an example of displaying a preview for a pattern selected by a user on a display according to an embodiment of the present invention.
Fig. 14 is a flowchart illustrating a pattern recognition method of visual coding according to an embodiment of the present invention.
Fig. 15 is a flowchart illustrating a pattern recognition method of visual coding according to an embodiment of the present invention.
Description of reference numerals
110: User
120: user terminal
130: object
132. 134, 136: visually encoded patterns
210. 220, 230: user terminal
240: network
250: server
300: user terminal
310: image sensor with a plurality of pixels
320: display device
330: input interface
340: communication module
350: processor with a memory having a plurality of memory cells
352: pattern region detection unit
354: pattern type determination section
356: pattern preview generation unit
358: pattern processing part
360: storage unit
Detailed Description
Hereinafter, specific details for carrying out embodiments of the present invention will be described with reference to the accompanying drawings. In the following description, detailed descriptions of known functions and configurations are omitted where they would obscure the gist of the present invention.
In the drawings, the same or corresponding structural elements are given the same reference numerals. In the following description of the embodiments, repeated descriptions of the same or corresponding structural elements may be omitted. However, even if the description of a structural element is omitted, this does not mean that the structural element is excluded from any embodiment.
The advantages and features of the disclosed embodiments, and the methods of achieving them, will become apparent with reference to the embodiments described below in conjunction with the accompanying drawings. However, the present invention is not limited to the embodiments disclosed below and may be embodied in many different forms; these embodiments are provided only so that the disclosure will be complete and will fully convey the scope of the invention to those skilled in the art.
The terms used in this specification are briefly described below, and the disclosed embodiments are then described in detail.
The terms used in this specification have been selected, as far as possible, from general terms in wide current use in consideration of their functions in the present invention, but they may vary according to the intentions of those skilled in the art to which the present invention pertains, precedents, the advent of new technology, and the like. In specific cases, some terms have been arbitrarily selected by the applicant, and in such cases their meanings are described in detail in the corresponding description of the invention. Therefore, the terms used in the present invention should be defined based on their meanings and the overall content of the present invention, not simply by their names.
In this specification, singular expressions include plural expressions unless the context clearly specifies the singular. Likewise, plural expressions include singular expressions unless the context clearly specifies the plural.
In the present invention, when a part is described as "including" certain structural elements, this means that still other structural elements may also be included rather than being excluded, unless otherwise stated.
In addition, the term "module" or "section" used in the specification means a software or hardware component, and a "module" or "section" performs certain functions. However, a "module" or "section" is not limited to software or hardware. A "module" or "section" may be configured to reside on an addressable storage medium or configured to execute on one or more processors. Thus, as an example, a "module" or "section" may include at least one of software structural elements, object-oriented software structural elements, class structural elements, task structural elements, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functions provided within the structural elements and "modules" or "sections" may be combined into a smaller number of structural elements and "modules" or "sections", or further separated into additional structural elements and "modules" or "sections".
According to an embodiment of the invention, a "module" or "section" may be implemented as a processor and a memory. "Processor" should be interpreted broadly to include a general-purpose processor, a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a controller, a microcontroller, a state machine, and the like. In some environments, a "processor" may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), a field-programmable gate array (FPGA), or the like. For example, a "processor" may be a combination of processing devices, such as a combination of a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors combined with a digital signal processor core, or any other such configuration. Also, "memory" should be interpreted broadly to include any electronic component capable of storing electronic information. "Memory" may refer to various types of processor-readable media, such as random access memory (RAM), read-only memory (ROM), non-volatile random access memory (NVRAM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory, magnetic or optical data storage devices, registers, and the like. A memory is said to be in electronic communication with a processor if the processor can read information from and/or write information to the memory; a memory integrated into a processor is in electronic communication with the processor.
In the present invention, an "image" may include an image captured by an image sensor attached to the user terminal and an image captured by an image sensor that can communicate with the user terminal. Also, an image may include an image extracted from captured video.
In the present invention, a "visually encoded pattern" refers to a code that presents data or information in a machine-readable visual form. A visually encoded pattern may be scanned or read using a pattern reader or an application on a device with a built-in camera, such as a smartphone. Visually encoded patterns include one-dimensional patterns and two-dimensional patterns. A one-dimensional pattern is formed of parallel lines of varying widths and spacings and may include the Universal Product Code (UPC) and the like. A two-dimensional pattern may be formed of geometric elements such as quadrangles, dots, and hexagons, and may include the Data Matrix code, MaxiCode, QR code, and the like. In this description, a "visually encoded pattern" may also be referred to simply as a "pattern".
Fig. 1 is a diagram illustrating an example of a pattern recognition method performed in a user terminal 120 according to an embodiment of the present invention. As shown in fig. 1, the user 110 holds the user terminal 120 such that the image sensor of the user terminal 120 faces the object 130 in order to scan one of the first pattern 132, the second pattern 134, and the third pattern 136 included in the object 130. For example, the object 130 may be a document printed with a plurality of visually encoded patterns. As shown, the first pattern 132 and the second pattern 134 may be two-dimensional patterns, and the third pattern 136 may be a one-dimensional pattern.
In the case where the user terminal 120 receives an image including the first pattern 132, the second pattern 134, and the third pattern 136, the user terminal 120 may detect an area including the visually encoded patterns 132, 134, 136 in the image and determine a pattern type of each pattern. The user terminal 120 may compare the priority of the pattern type of each pattern and may automatically perform the process of the pattern having the highest priority. For example, in the case where the user terminal 120 determines that the priority of the first pattern 132 is higher than the priorities of the second and third patterns 134 and 136, the user terminal 120 may automatically perform the process of the first pattern 132 without user input.
To this end, the user terminal 120 may store preset priorities for the respective pattern types. In one embodiment, the user terminal 120 may classify a pattern related to a financial service of a specific company (e.g., the LINE PAY service) as a first pattern type, a pattern related to a non-financial service of the specific company (e.g., the LINE chat tool service) as a second pattern type, a two-dimensional pattern related to services of other companies as a third pattern type, and the remaining one-dimensional patterns as a fourth pattern type. In this case, the user terminal 120 may be preset so that priority decreases in the order of the first pattern type, the second pattern type, the third pattern type, and the fourth pattern type.
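The preset priorities of this embodiment can be represented as a simple lookup table (cf. the lookup table of Fig. 4); the dict form and the numeric values are illustrative:

```python
# Illustrative lookup table for the embodiment's preset priorities
# (1 = highest). Service examples follow the text; values are assumed.
PATTERN_TYPE_PRIORITY = {
    "first_type":  1,  # financial service of the company (e.g. LINE PAY)
    "second_type": 2,  # non-financial service (e.g. LINE chat tool)
    "third_type":  3,  # two-dimensional codes of other companies' services
    "fourth_type": 4,  # remaining one-dimensional codes
}

def highest_priority_pattern(detected: list) -> str:
    """Given (pattern_id, pattern_type) pairs, return the id to process."""
    return min(detected, key=lambda p: PATTERN_TYPE_PRIORITY[p[1]])[0]
```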
In the case where a plurality of patterns of the same pattern type are present within one image and no pattern has a higher priority than those patterns, the user terminal 120 may provide previews of the corresponding patterns so that the user can select the one pattern to be processed (or scanned). Alternatively, the user terminal may calculate a reliability score for each of the corresponding patterns and automatically execute processing of the pattern with the highest reliability score. In this case, the reliability score of each pattern can be calculated based on at least one of the position and the size of the pattern within the image.
In another embodiment, the user terminal 120 may not automatically perform the processing for one of the plurality of patterns within the image, but rather cause the user to select one of the plurality of patterns 132, 134, 136 included in the image. In this case, the user terminal 120 may display the plurality of patterns 132, 134, 136 detected in the image as selectable options on the display. Additionally or alternatively, the user terminal 120 may generate previews of the plurality of patterns 132, 134, 136 detected in the image for display on the display. In this case, the user 110 may confirm the previews of the plurality of patterns displayed on the display and then select one pattern, whereupon processing for the selected pattern is performed.
Fig. 2 is a diagram illustrating a configuration in which a plurality of user terminals 210, 220, 230 are communicably connected to a server 250 in order to perform pattern recognition according to an embodiment of the present invention. Fig. 2 shows a mobile phone terminal 210, a tablet terminal 220, and a notebook terminal 230 as examples of the user terminal, but the user terminal is not limited thereto, and may be any computing device that can perform wired and/or wireless communication, is provided with an image sensor (e.g., a camera) or can communicate with the image sensor, and is provided with an application program that can scan and process a visual code pattern. For example, the user terminal may include a smart phone (smart phone), a mobile phone, a navigator, a computer, a notebook computer, a digital broadcasting terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a tablet computer, a game console (game console), a wearable device (wearable device), an internet of things (IoT) device, a Virtual Reality (VR) device, an Augmented Reality (AR) device, and the like.
Further, fig. 2 shows three user terminals 210, 220, 230 communicating with the server 250 through the network 240, but the present invention is not limited thereto, and a different number of user terminals may communicate with the server 250 through the network 240. The communication method of the network 240 is not particularly limited, and may include a mobile communication network, the wired internet, the wireless internet, a broadcasting network, a satellite network, short-range wireless communication between user terminals, and the like. For example, the network 240 may include any one or more of a Personal Area Network (PAN), a Local Area Network (LAN), a Campus Area Network (CAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a Broadband Network (BBN), the internet, and the like. Also, the network 240 may have one or more of network topologies including a bus network, a star network, a ring network, a mesh network, a star-bus network, a tree or hierarchical (hierarchical) network, and the like, but is not limited thereto.
According to an embodiment, server 250 may include one or more server devices and/or databases capable of storing, providing, and executing computer-executable programs and data or one or more decentralized computing devices and/or decentralized databases based on cloud computing services. In one embodiment, the server 250 may be a web server storing at least one of a thumbnail image, a title, and a descriptive sentence associated with a Uniform Resource Locator (URL).
In an embodiment, in the case where the pattern corresponds to an encoding of a uniform resource locator, the user terminal 210, 220, 230 may transmit the corresponding pattern to the server 250 in order to generate a preview for the corresponding pattern, or may transmit the uniform resource locator information encoded by the pattern to the server 250. In this case, the server 250 may provide at least one of a thumbnail, a title, and a description sentence related to the corresponding uniform resource locator to the user terminal 210, 220, 230. The user terminal 210, 220, 230 may display at least one of the received thumbnail, title, and description sentence on the display as a preview for the corresponding pattern.
Fig. 3 is a block diagram illustrating a structure of a user terminal 300 according to an embodiment of the present invention. The user terminal 300 may be any device having image capturing and image processing capabilities. As shown, the user terminal 300 may include an image sensor 310, a display 320, an input interface 330, a communication module 340, a processor 350, and a storage 360.
Image sensor 310 may receive one or more images. For example, the image sensor 310 may be a camera module. The image received through the image sensor 310 can be temporarily or permanently stored in the storage part 360 of the user terminal 300 in the form of an electronic file. In one embodiment, the images received by the image sensor 310 may be provided to the processor 350 for the purpose of determining whether the corresponding images contain a visually encoded pattern. In another embodiment, the image sensor 310 may store an image in the storage 360 or provide an image to the processor 350 in response to receiving an input instruction from a user through the input interface 330.
Display 320 may output information and/or data generated by the user terminal 300. According to an embodiment, an image received by the image sensor 310 may be displayed on the display 320. In addition to the display 320, output devices such as a speaker, a haptic feedback device, and the like may also be provided.
Input interface 330 may be configured to enable a user to input information and/or data to user terminal 300. For example, the input interface 330 may include a device such as a touch screen, buttons, a numeric keypad, a touch pad, a keyboard, a microphone, a mouse, and the like. Fig. 3 shows the display 320 as a separate component from the input interface 330, but the display 320 may be an input/output device in which the display 320 and the input interface 330 are combined.
As shown in fig. 3, the user terminal 300 may communicate information and/or data with other devices via the network 370 using the communication module 340. For example, the request generated by the processor 350 of the user terminal 300 based on the program code stored in the storage unit 360 may be transmitted to another device (e.g., a server) through the network 370 under the control of the communication module 340. On the contrary, control signals, instructions, contents, files, etc. provided by other devices may be provided to the user terminal 300 through the communication module 340. For example, control signals, instructions, content, files, etc. of other devices received through the communication module 340 may be transferred to the processor 350 or the storage 360.
The processor 350 may perform basic arithmetic, logic, and input-output operations, thereby executing instructions of a computer program. The instructions may be provided through the storage 360 or the communication module 340. For example, processor 350 may execute instructions received in accordance with program code stored in memory 360. In one embodiment, the processor 350 may include a pattern area detecting section 352, a pattern type determining section 354, a pattern preview generating section 356, and a pattern processing section 358.
The storage section 360 may include any computer-readable recording medium. According to an embodiment, the storage portion 360 may include a storage device such as a random access memory, a read only memory, a disk Drive, a Solid State Drive (SSD), a flash memory (flash memory), and the like. At least one program code such as an operating system, an application program, and the like may be stored in the storage section 360.
In an embodiment, the image sensor 310 may receive an image including a plurality of patterns. In this case, the image received by the image sensor 310 may be output on the display 320 and may be provided to the processor 350. The processor 350 may provide the received image to the pattern region detecting unit 352, and the pattern region detecting unit 352 may detect a region including the visually encoded pattern within the image to determine whether the visually encoded pattern is included in the image.
If the pattern area detection unit 352 determines that the received image does not include the visually encoded pattern, the processor 350 may cause the image sensor 310 to receive a subsequent image. In the case where the pattern region detection unit 352 determines that one visually encoded pattern is included in the received image, the pattern region detection unit 352 may supply the corresponding pattern to the pattern processing unit 358 so that the pattern processing unit 358 may perform the processing of the corresponding pattern. In contrast, in the case where the pattern region detection section 352 determines that a plurality of patterns are included in the received image, the pattern region detection section 352 may supply the pattern type determination section 354 with images of the detected plurality of patterns or regions in which the patterns are detected.
The pattern type determination section 354 may determine the pattern type of the received plurality of patterns. For example, a pattern related to a financial service (e.g., a LINE payment service) of a specific company may be judged as a first pattern type, a pattern related to a non-financial service (e.g., a LINE chat tool service) of the specific company may be judged as a second pattern type, a two-dimensional pattern related to a service of other companies may be judged as a third pattern type, and a one-dimensional pattern may be judged as a fourth pattern type. In one embodiment, the pattern type judging part 354 may transmit the corresponding pattern or information encoded by the corresponding pattern to another module (e.g., a LINE payment application) in order to judge whether each pattern is a pattern related to a financial service (e.g., a LINE payment service) of a specific company. In another embodiment, the pattern type judging part 354 may include a module (e.g., a LINE payment application) for judging whether each pattern is a pattern related to a financial service (e.g., a LINE payment service) of a specific company.
The pattern processing section 358 may receive the pattern type of each of the plurality of patterns from the pattern type judging section 354, and may compare the priorities of the patterns according to the priorities of the pattern types stored in advance in the storage section 360. For example, the storage section 360 may store predefined priorities that gradually decrease in the order of the first pattern type, the second pattern type, the third pattern type, and the fourth pattern type. When it is determined that only one pattern has the highest priority, the pattern processing section 358 may automatically perform the process of the corresponding pattern without user input.
In contrast, when determining that two or more patterns have the highest priority, the pattern processing section 358 may calculate the reliability score of each pattern. In an embodiment, the reliability score of a visually encoded pattern can be calculated based on at least one of the position and the size of the corresponding pattern. For example, the reliability score of a visually encoded pattern can be calculated based on the distance between the center position of the image and the center position of the corresponding pattern, or as the proportion of the area that the corresponding pattern occupies in the image. Thereafter, the pattern processing section 358 may automatically execute the processing of the pattern having the highest reliability score without user input.
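As a rough sketch of the two cues just described (distance to the image center and occupied area), a reliability score might be computed as below. The equal weighting and the exact normalization are assumptions made for illustration; the disclosure states only which cues may be used, not a formula.

```python
import math

def reliability_score(image_w, image_h, box, w_center=0.5, w_area=0.5):
    """Score a detected pattern region; higher means more likely intended.

    `box` is (x, y, width, height) of the pattern within the image.
    The weights and the combination rule are illustrative assumptions.
    """
    x, y, bw, bh = box
    # Cue 1: closer to the image center -> higher score.
    dx = (x + bw / 2) - image_w / 2
    dy = (y + bh / 2) - image_h / 2
    max_dist = math.hypot(image_w / 2, image_h / 2)
    center_term = 1.0 - math.hypot(dx, dy) / max_dist
    # Cue 2: larger share of the image area -> higher score.
    area_term = (bw * bh) / (image_w * image_h)
    return w_center * center_term + w_area * area_term
```

For instance, in a 100x100 image, a 20x20 pattern centered in the frame scores higher than a 10x10 pattern tucked into a corner, matching the intuition that the user frames the pattern of interest near the center.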
In another embodiment, when determining that the number of patterns having the highest priority is two or more, the pattern processing section 358 may supply the plurality of patterns having the highest priority to the pattern preview generating section 356. In this case, the pattern preview generating part 356 may generate a preview for each pattern to be displayed on the display 320 so that the user selects one pattern. In this case, the pattern preview generating part 356 may calculate the reliability score of each pattern to display a preview for a pattern having a higher reliability score at the upper end of the preview area on the display 320. In a case where the user selects one of the previews displayed on the display 320 through the input interface 330, the pattern processing section 358 may perform the process of the pattern related to the preview selected by the user. Alternatively, the pattern preview generating section 356 may display only a preview for the pattern with the highest reliability score in the preview area on the display 320.
Also, in another embodiment, in the case where the pattern area detecting section 352 detects a plurality of patterns within the received image, the detected plurality of patterns may be supplied to the pattern preview generating section 356 instead of the pattern type judging section 354. In this case, the pattern preview generating part 356 may generate a preview for each pattern to be displayed on the display 320. For example, a preview of the various patterns may be displayed in a preview area on the display 320. In this case, the pattern preview generating section 356 may determine the display order of the preview of the pattern based on the priority of the pattern and/or the reliability score of the pattern. In a case where the user selects one of the plurality of previews displayed on the display 320, the pattern processing section 358 may perform processing of a pattern related to the preview selected by the user.
Illustratively, in the case where the pattern of the visual code corresponds to the code of the uniform resource locator, the pattern preview generating part 356 may transmit the corresponding pattern or the uniform resource locator information coded by the corresponding pattern to an external device (e.g., a web server) through the communication module 340. In this case, the user terminal 300 may receive at least one of a thumbnail, a title, and a description sentence related to the corresponding uniform resource locator from an external device. The pattern preview generation section 356 can generate a preview for the corresponding pattern based on at least one of the thumbnail, the title, and the explanatory sentence received from the external apparatus.
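For the uniform-resource-locator case, the preview text could be derived from page metadata returned by the external device. The stand-alone sketch below extracts only the HTML title with Python's standard-library parser; the actual server may also return a thumbnail and a description sentence, and the network call itself (e.g., fetching the page for the decoded URL) is omitted here.

```python
from html.parser import HTMLParser

class _TitleParser(HTMLParser):
    """Minimal <title> extractor -- a stand-in for the thumbnail/title/
    description lookup the external web server is described as performing."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def preview_title(html_text):
    """Return the page title to show as the pattern's preview text."""
    parser = _TitleParser()
    parser.feed(html_text)
    return parser.title.strip()
```

In the terminal, `html_text` would come from the server's response for the URL encoded by the pattern; the returned title would then be rendered in the preview area alongside any thumbnail.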
As another example, when the pattern of the visual code corresponds to an encoding of text information, the pattern preview generating section 356 may generate a preview of the visual code pattern based on the corresponding text information. In the case where the pattern of the visual code corresponds to an encoding of information (e.g., uniform resource locator information) related to a financial service or a non-financial service of a specific company, the pattern preview generation section 356 may generate, as a preview, a user interface capable of executing the financial service or non-financial service of the specific company encoded by the corresponding pattern.
The user terminal 300 may include more structural elements than those shown in fig. 3. According to an embodiment, the user terminal 300 may further include other components such as a transceiver (transceiver), a Global Positioning System (GPS) module, various sensors, a database, and the like. For example, in the case where the user terminal 300 is a smartphone, it may include structural elements that the smartphone generally includes, for example, the user terminal 300 may further include various structural elements such as an acceleration sensor, a gyro sensor, various physical buttons, a button using a touch panel, an input/output port, a vibrator for vibration, and the like. In this case, the user terminal 300 may provide feedback (e.g., tactile feedback) to the user in the case where a pattern is detected in the received image.
Fig. 4 is a diagram illustrating a lookup table 400 defining priorities of respective pattern types according to an embodiment of the present invention. In one embodiment, the pattern of the visual code may be divided into a pattern related to a financial service (e.g., a LINE Payment service) of a specific company, a pattern related to a non-financial service (e.g., a LINE chat tool service) of a specific company, a two-dimensional pattern related to a service of another company, and a one-dimensional pattern. According to the lookup table 400, a pattern related to financial services of a specific company, a pattern related to non-financial services of a specific company, a two-dimensional pattern related to services of other companies, a one-dimensional pattern may have a gradually decreasing priority in order.
For example, the patterns related to the financial service of the specific company may include a pattern for settling a fee (for example, a two-dimensional code for a store disposed in the store, or a two-dimensional code used by the user to pay a fee in the store), a pattern for splitting a bill (split bill) (for example, a two-dimensional code for the user to split a settlement fee), a pattern for remitting a fee, and the like. The patterns related to non-financial services of the specific company may include a pattern for adding friends (e.g., a two-dimensional code for adding a specific user as a friend), a chat room invitation pattern, an open chat invitation pattern, a pattern for performing login (e.g., a two-dimensional code login), and the like. The two-dimensional patterns related to the services of other companies may include a two-dimensional pattern encoding a uniform resource locator, a two-dimensional pattern encoding text information, and the like.
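The lookup table 400 might be represented as a simple mapping from pattern purpose to category and priority. The keys below are illustrative labels for the examples listed above, not identifiers from the disclosure, and a lower number again stands for a higher priority.

```python
# Illustrative rendering of lookup table 400: lower number = higher priority.
LOOKUP_TABLE_400 = {
    # financial services of the specific company (first pattern type)
    "fee_settlement":   ("financial",       1),
    "split_bill":       ("financial",       1),
    "remittance":       ("financial",       1),
    # non-financial services of the specific company (second pattern type)
    "add_friend":       ("non_financial",   2),
    "chat_invitation":  ("non_financial",   2),
    "open_chat_invite": ("non_financial",   2),
    "qr_login":         ("non_financial",   2),
    # two-dimensional patterns of other companies' services (third pattern type)
    "url_code":         ("other_2d",        3),
    "text_code":        ("other_2d",        3),
    # remaining one-dimensional patterns (fourth pattern type)
    "barcode":          ("one_dimensional", 4),
}

def priority_of(purpose):
    """Look up the priority value for a given pattern purpose."""
    return LOOKUP_TABLE_400[purpose][1]
```

A terminal holding such a table can rank any set of detected patterns by purpose alone, without decoding their payloads beyond what is needed to classify them.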
Fig. 5 is a diagram illustrating an example of a process of automatically performing a pattern without user input according to an embodiment of the present invention. Fig. 5 shows a first operation step 510 and a second operation step 520 of the user terminal. In a first operation step 510, a first pattern 512 is automatically selected without user input among a plurality of patterns 512, 514 contained in an image received by an image sensor of a user terminal. The first pattern 512 may be a two-dimensional pattern related to financial services of a particular company and the second pattern 514 may be a two-dimensional pattern related to non-financial services of a particular company. In this case, the first pattern 512 may be determined to have a higher priority than the second pattern 514.
In a first operation step 510, the user terminal may mark on the display an area containing the first pattern 512 and an area containing the second pattern 514 with different colors. Fig. 5 shows that the detected first and second patterns 512 and 514 are marked with different colors, but not limited thereto, the first and second patterns 512 and 514 may be displayed in various manners such as shadow display, so that the user can easily confirm the respective pattern regions on the display. Also, the user terminal may display a mark 516 on the first pattern 512 in order to inform the user of the automatic execution of the process of the first pattern 512 having a higher priority.
In a second operation step 520, a screen is illustrated in which the processing of the first pattern 512 is automatically executed without user input. The first pattern 512 is a pattern related to a financial service of a specific company, and thus a financial service providing screen of the specific company related to the corresponding pattern may be displayed on a display of the user terminal. In this case, the user may select the settlement button 522 displayed on the display to perform settlement of the fee.
FIG. 6 is a flow chart illustrating a method 600 of visually encoded pattern recognition in accordance with an embodiment of the present invention. The method 600 may begin at step 610, where at step 610, a user terminal receives an image comprising a first pattern and a second pattern. For example, an image containing a plurality of patterns may be received during operation of the user terminal in a pattern scanning mode for recognizing visually encoded patterns. Thereafter, the user terminal may detect an area including the visually encoded patterns (the first pattern and the second pattern) in the image, and may determine a pattern type of each pattern (step 620). In an embodiment, the user terminal may determine a pattern related to a financial service (e.g., a LINE payment service) of a specific company as a first pattern type, determine a pattern related to a non-financial service (e.g., a LINE chat tool service) of the specific company as a second pattern type, determine a two-dimensional pattern related to a service of other companies as a third pattern type, and determine a one-dimensional pattern as a fourth pattern type.
In step 630, the user terminal may compare the pattern type of the first pattern with the pattern type of the second pattern in priority. For example, the user terminal may define in advance that the first pattern type, the second pattern type, the third pattern type, and the fourth pattern type have a gradually decreasing priority in order. Based on the determination result in step 630, the user terminal may determine whether the pattern types of the first pattern and the second pattern have the same priority (step 640).
In step 640, in the case where it is determined that the priority of the pattern type of the first pattern is not the same as the priority of the pattern type of the second pattern (i.e., no in step 640), the user terminal may automatically perform the process of the pattern determined to have the higher priority without user input (step 650). With the above configuration, even if the user does not press the camera shooting button or select one of the plurality of patterns, the processing of the pattern having the highest priority is automatically performed, and therefore, a better user experience can be provided. In contrast, in the case where it is determined in step 640 that the priority of the pattern type of the first pattern is the same as the priority of the pattern type of the second pattern (i.e., yes in step 640), the method 600 may proceed to the method 700 of fig. 7 or the method 800 of fig. 8.
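Steps 630 through 650 reduce to a short comparison. The return labels and the `tie_break` callback below, standing in for the follow-on methods of fig. 7 and fig. 8, are illustrative assumptions.

```python
def choose_pattern(first_type, second_type, priority, tie_break):
    """Compare the two patterns' type priorities (lower number = higher
    priority). With unequal priorities the winner is selected automatically
    (step 650); with equal priorities control passes to a tie-breaking
    strategy such as preview selection (fig. 7) or score comparison (fig. 8).
    """
    if priority[first_type] < priority[second_type]:
        return "first_pattern"
    if priority[second_type] < priority[first_type]:
        return "second_pattern"
    return tie_break()  # the "yes" branch of step 640

# Example with the four-level ordering described earlier (illustrative):
prio = {"type1": 1, "type2": 2, "type3": 3, "type4": 4}
assert choose_pattern("type2", "type4", prio, lambda: None) == "first_pattern"
```

Note that the tie-break path is never reached when the priorities differ, which matches the flowchart: user involvement (or score computation) is needed only for equal-priority patterns.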
FIG. 7 is a flow diagram illustrating a method 700 of performing processing for one of the patterns having the same priority in accordance with one embodiment of the present invention. In step 640 of fig. 6, in the case where it is determined that the priority of the pattern type of the first pattern is the same as the priority of the pattern type of the second pattern (i.e., yes in step 640), the user terminal may calculate the reliability score of the first pattern (step 710). For example, the reliability score of the first pattern can be calculated based on the result of calculating the distance between the center position of the first pattern and the center position of the received image. In this case, the reliability score of the first pattern may be inversely proportional to a distance between a center position of the first pattern and a center position of the received image. That is, the closer the center position of the visually encoded pattern is to the center position of the received image, the higher the reliability score of the calculated pattern. This is because there is a high possibility that the user holds the user terminal so that the pattern of interest is close to the center position of the image.
Alternatively, the reliability score of the first pattern can be calculated based on a proportion of an area occupied in the received image by a region corresponding to the first pattern. In this case, the higher the proportion of the area occupied in the received image by the region corresponding to the pattern, the higher the reliability score of the calculated pattern. This is because there is a high possibility that the user holds the user terminal in such a manner that the pattern of interest occupies a high area ratio in the image.
In step 720, the user terminal may calculate a reliability score for the second pattern. The reliability score of the second pattern may be calculated in the same manner as the reliability score of the first pattern is calculated. That is, the reliability score of the second pattern can be calculated based on the result of calculating the distance between the center position of the second pattern and the center position of the received image or the proportion of the area occupied in the received image by the region corresponding to the second pattern. Fig. 7 shows that the step 710 of calculating the reliability score of the first pattern is performed before the step 720 of calculating the reliability score of the second pattern, but is not limited thereto, and the reliability score of any pattern may be calculated preferentially, or the reliability scores may be calculated at the same time.
The user terminal may then compare the calculated reliability score of the first pattern with the reliability score of the second pattern to display a preview of the pattern with a higher reliability score on the display (step 730). For example, in a case where the first pattern and the second pattern have the same priority, and the first pattern is closer to the center of the image than the second pattern, the user terminal may determine that the reliability score of the first pattern is higher than that of the second pattern, and display a preview for the first pattern on the display.
In an embodiment, in the case of a pattern related to a financial service of a specific company or a non-financial service of a specific company, the user terminal may display a user interface capable of performing a corresponding service (a financial service of a specific company or a non-financial service of a specific company) as a preview on the display. In the case where the pattern is a uniform resource locator type pattern related to a service of other companies, the user terminal may receive at least one of a thumbnail, a title, and a description sentence for the uniform resource locator encoded by the corresponding pattern from the external server to display as a preview on the display. In the case of a pattern of encoded text messages, the user terminal can generate and display a preview on the display based on the corresponding text information.
In step 740, the user terminal may receive a user selection of a pattern displayed as a preview. For example, the user's selection may be an input for performing processing of a pattern related to a preview displayed on the display. In this case, the user terminal may perform processing of the pattern selected by the user (step 750). Fig. 7 shows a preview of the pattern with the higher reliability score being displayed on the display, but without being limited thereto, a preview of all patterns with the same priority may be displayed on the display. In this case, a preview for the pattern with the higher reliability score may be displayed on the upper end of the preview area on the display.
FIG. 8 is a flow diagram illustrating a method 800 of performing processing for one of the patterns having the same priority in one embodiment of the invention. In step 640 of fig. 6, in the case where it is determined that the priority of the pattern type of the first pattern is the same as the priority of the pattern type of the second pattern (i.e., yes in step 640), the user terminal may calculate the reliability score of the first pattern (step 810). For example, the reliability score of the first pattern can be calculated based on the result of calculating the distance between the center position of the first pattern and the center position of the received image. In this case, the reliability score of the first pattern may be inversely proportional to a distance between a center position of the first pattern and a center position of the received image. That is, the closer the center position of the visually encoded pattern is to the center position of the received image, the higher the reliability score of the calculated pattern. This is because there is a high possibility that the user holds the user terminal so that the pattern of interest is close to the center position of the image.
Alternatively, the reliability score of the first pattern can be calculated based on a proportion of the area occupied in the received image by the region corresponding to the first pattern. In this case, the higher the proportion of the area occupied by the region corresponding to the pattern in the received image, the higher the reliability score of the calculated pattern. This is because there is a high possibility that the user holds the user terminal in such a manner that the pattern of interest occupies a high area ratio in the image.
In step 820, the user terminal may calculate a reliability score for the second pattern. The reliability score of the second pattern may be calculated in the same manner as the reliability score of the first pattern is calculated. That is, the reliability score of the second pattern can be calculated based on the result of calculating the distance between the center position of the second pattern and the center position of the received image or the proportion of the area occupied in the received image by the region corresponding to the second pattern. Fig. 8 shows that the step 810 of calculating the reliability score of the first pattern is performed prior to the step 820 of calculating the reliability score of the second pattern, but is not limited thereto, and the reliability score of any pattern may be calculated preferentially, or the reliability scores may be calculated at the same time.
Next, the user terminal may compare the calculated reliability score of the first pattern with the reliability score of the second pattern to automatically perform the process of the pattern having a higher reliability score without user input (step 830). For example, in the case where the first pattern and the second pattern have the same priority and the first pattern is closer to the center of the image than the second pattern, the user terminal determines that the reliability score of the first pattern is higher than the reliability score of the second pattern, so that the processing of the first pattern can be automatically performed without user input. With the above configuration, even if the user does not press the camera shooting button or select one of the plurality of patterns, the processing of the pattern with the highest priority and the highest reliability score can be automatically performed, and therefore, a better user experience can be provided.
FIG. 9 is a flow chart illustrating a method 900 of determining a pattern type of a visually encoded pattern in accordance with an embodiment of the present invention. For example, when a plurality of patterns are detected in an image while the user terminal is operating in the pattern scanning mode, the determination of the pattern type for each pattern may be started (step 910). The user terminal may preferentially determine whether a pattern is a pattern related to a financial service (e.g., LINE payment service) of a specific company for one pattern (step 920).
In the case where the corresponding pattern is determined to be a pattern related to financial services of a specific company, the user terminal may determine that the corresponding pattern is of a first pattern type (step 930). In contrast, in the case where the corresponding pattern is determined not to be a pattern related to a financial service of the specific company, the user terminal proceeds to step 940 to determine whether the corresponding pattern is a pattern related to a non-financial service (e.g., LINE chat tool service) of the specific company. In the case where the corresponding pattern is determined to be a pattern related to a non-financial service of the specific company, the user terminal may determine that the corresponding pattern is of the second pattern type (step 950). In contrast, in the case where the corresponding pattern is determined not to be a pattern related to the non-financial service of the specific company, the user terminal proceeds to step 960 to determine whether the corresponding pattern is a two-dimensional pattern.
If it is determined that the corresponding pattern is a two-dimensional pattern, the user terminal may determine that the corresponding pattern is of the third pattern type (step 970). In contrast, in the case where the corresponding pattern is determined not to be a two-dimensional pattern (e.g., it is a one-dimensional pattern), the user terminal may determine that the corresponding pattern is of the fourth pattern type (step 980).
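The decision cascade of Fig. 9 (steps 920 to 980) can be summarized in a few lines. The boolean checks below are hypothetical stand-ins for whatever decoder logic a real implementation would use to recognize each code family; smaller type numbers carry higher priority, as in claim 5.

```python
def classify_pattern(pattern):
    """Map a detected visual-code pattern to one of the four
    pattern types of Fig. 9 (illustrative predicates only)."""
    if pattern.get("is_financial_service"):   # step 920: e.g. a LINE payment code
        return 1                              # step 930: first pattern type
    if pattern.get("is_company_service"):     # step 940: e.g. a LINE chat tool code
        return 2                              # step 950: second pattern type
    if pattern.get("dimensions") == 2:        # step 960: generic two-dimensional code
        return 3                              # step 970: third pattern type
    return 4                                  # step 980: one-dimensional barcode
```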
Fig. 10 is a diagram showing an example in which a user terminal that has received a plurality of visually encoded patterns displays a reminder for confirming whether to enter the pattern scanning mode, according to an embodiment of the present invention. In the image capturing mode, while the user terminal sequentially receives a plurality of images through the image sensor, the user may select the image capturing button 1016 by a touch input or the like to capture a photograph. In the pattern scanning mode, if a pattern is detected within an image while the user terminal sequentially receives a plurality of images through the image sensor, the detected pattern may be processed automatically without user input, or a preview of the detected pattern may be displayed on the display. Additionally, in the case where the user terminal detects a pattern within an image received in the image capturing mode, it may ask the user to confirm whether to enter the pattern scanning mode.
In a first operation 1010, the user terminal receives an image including a plurality of patterns 1012, 1014 through the image sensor in the image capturing mode. In this case, the user terminal may detect the plurality of patterns 1012, 1014 within the received image. In a second operation 1020, having detected the plurality of patterns 1012, 1014 within the image received in the image capturing mode, the user terminal displays a reminder 1022 for confirming whether to enter the pattern scanning mode on a display of the user terminal. In the case where the user selects the reminder 1022 by a touch input or the like, the user terminal may enter the pattern scanning mode to perform processing for the plurality of patterns 1012, 1014.
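The mode-dependent behavior described for Fig. 10 can be sketched as a small transition function. All identifiers here are illustrative; `user_confirms` stands in for the user tapping the reminder 1022.

```python
def handle_frame(pattern_detected, mode, user_confirms):
    """Return the terminal's mode after processing one frame
    (a sketch of the Fig. 10 behavior, not the actual firmware)."""
    if mode == "capture" and pattern_detected:
        # A pattern was found in capture mode: show the reminder and
        # switch modes only if the user confirms.
        return "scan" if user_confirms else "capture"
    # In scan mode, detected patterns are processed or previewed
    # without any extra confirmation; otherwise nothing changes.
    return mode
```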
FIG. 11 is a diagram illustrating an example of displaying a preview of a pattern selected by a user on a display in accordance with an embodiment of the present invention. Fig. 11 shows a first operation 1110, a second operation 1120, and a third operation 1130 of the user terminal. In the first operation 1110, while the user terminal operates in the pattern scanning mode, in the case where an image including a first pattern 1112 and a second pattern 1114 of a visual code is received through the image sensor, the first pattern 1112 and the second pattern 1114 are detected, and the detected plurality of patterns 1112, 1114 may be displayed on the display as selectable options. In this case, the area containing the first pattern 1112 and the area containing the second pattern 1114 may be marked in different colors on the display.
In the second operation 1120, the user terminal receives a selection of the first pattern 1112 from the user, and a preview 1124 of the first pattern 1112 is displayed in a preview area on the display. In the case where the user selects the first pattern 1112 through a touch input or the like, the user terminal may display, on the first pattern 1112, a mark 1122 indicating that the first pattern 1112 has been selected, and display the preview 1124 in the preview area of the display. As shown, the first pattern 1112 may be a pattern for adding "James" as a friend in a non-financial service (i.e., the LINE chat tool service) of a specific company.
The preview 1124 for the first pattern may include a LINE chat tool logo, information about the user to be added as a friend (e.g., "James"), an add friend button 1126, and a share button 1128. In the case where the user selects the add friend button 1126 through a touch input or the like, the user terminal may execute the friend add function in the LINE chat tool. In the case where the user selects the share button 1128 through a touch input or the like, the user terminal may enable the user to share the first pattern 1112, or the information encoded by the first pattern 1112, with other users. Although in the second operation 1120 the preview 1124 for the first pattern includes a LINE chat tool logo, information ("James") about the user to be added as a friend, an add friend button 1126, and a share button 1128, it is not limited thereto and may include other information.
In a third operation 1130, the user terminal receives a selection of the second pattern 1114 from the user and displays a preview 1134 for the second pattern in a preview area on the display. In the case where the user selects the second pattern 1114 through a touch input or the like, the user terminal may display, on the second pattern 1114, a mark 1132 indicating that the second pattern 1114 has been selected, and may display the preview 1134 in the preview area of the display. As shown, the second pattern 1114 may be a pattern encoding uniform resource locator information related to services of companies other than the specific company.
In order to generate a preview of the second pattern 1114, which encodes uniform resource locator information, the user terminal transmits the corresponding pattern, or the uniform resource locator information encoded by the corresponding pattern, to an external device (e.g., a web server). In this case, the user terminal may receive at least one of a thumbnail, a title, and a description sentence related to the corresponding uniform resource locator from the external device. The preview 1134 for the second pattern may include the thumbnail, title, and description sentence received from the external device, as well as a share button 1136.
In the event that the user selects the preview 1134 for the second pattern via touch input or the like, the user terminal processes the uniform resource locator information encoded by the second pattern 1114 in a web browser application or the like. In the case where the user selects the share button 1136 through a touch input or the like, the user terminal may enable the user to share the second pattern 1114 or uniform resource locator information encoded by the second pattern 1114 with other users. In the third operation 1130, the preview 1134 for the second pattern includes a thumbnail, a title, a description sentence, and a share button 1136, but is not limited thereto and may include other information.
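As a rough sketch of how the external device might assemble the title, description, and thumbnail for a uniform resource locator preview, the following parser pulls those fields out of a fetched HTML page. The choice of the `<title>` tag and Open Graph `og:` meta tags is an assumption of this sketch; the description above only says that a thumbnail, title, and description sentence are returned.

```python
from html.parser import HTMLParser


class PreviewMetaParser(HTMLParser):
    """Collect a page title, description, and thumbnail URL for a
    link preview (hypothetical server-side helper)."""

    def __init__(self):
        super().__init__()
        self.meta = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("property") in (
            "og:title", "og:description", "og:image",
        ):
            # Keep the first value seen for each Open Graph field.
            key = attrs["property"].split(":", 1)[1]
            self.meta.setdefault(key, attrs.get("content", ""))

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.meta.setdefault("title", data.strip())


# A fetched page (hard-coded here in place of an HTTP response).
page = (
    '<html><head><title>Example Page</title>'
    '<meta property="og:image" content="thumb.png"/>'
    '<meta property="og:description" content="A sample page."/>'
    '</head><body></body></html>'
)
parser = PreviewMetaParser()
parser.feed(page)
# parser.meta now holds the title, description, and thumbnail fields.
```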
Fig. 11 shows the first pattern 1112 and the second pattern 1114 displayed on the display, but the present invention is not limited thereto; in the case where three or more patterns are included in the received image, the three or more patterns may be displayed on the display as selectable options. Further, fig. 11 shows a preview of the corresponding pattern being displayed when the user selects one of the selectable options (patterns) displayed on the display, but the present invention is not limited to this; alternatively, a preview of the pattern with the highest priority may be displayed automatically without user input, based on the priorities of the patterns within the image. In this case, when the priorities of the patterns in the image are the same, the reliability score of each pattern may be calculated, and a preview of the pattern with the higher reliability score may be displayed automatically without user input.
FIG. 12 is a diagram illustrating an example of displaying previews for a plurality of patterns contained within an image on a display according to an embodiment of the present invention. Fig. 12 shows a first operation 1210 and a second operation 1220 of the user terminal. In the first operation 1210, when an image including first and second visually encoded patterns 1212 and 1214 is received through the image sensor while the user terminal operates in the pattern scanning mode, the first pattern 1212 and the second pattern 1214 are detected, and the area including the detected first pattern 1212 and the area including the second pattern 1214 are marked with different colors on the display.
In a second operation 1220, in response to detecting the first pattern 1212 and the second pattern 1214 within the image received at the user terminal, a preview 1222 for the first pattern and a preview 1228 for the second pattern are automatically displayed in a preview area on the display. As shown in fig. 12, the first pattern 1212 is a two-dimensional pattern for adding "James" as a friend in a non-financial service (i.e., LINE chat tool service) of a specific company, and the second pattern 1214 may be a text-coded two-dimensional pattern. The preview 1222 for the first pattern may include a LINE chat tool logo, information related to the user added as a friend ("James"), an add friend button 1224, and a share button 1226. In case that the user selects the add friend button 1224 within the preview 1222 for the first pattern by a touch input or the like, the user terminal may perform an add friend function at the LINE chat tool. In the case where the user selects the share button 1226 within the preview 1222 for the first pattern through a touch input or the like, the user terminal may enable the user to share the first pattern 1212 or information encoded by the first pattern 1212 with other users.
The preview 1228 for the second pattern may include text information encoded by the pattern, a text copy button 1230, and a share button 1232. In case that the user selects the text copy button 1230 by a touch input or the like, text information encoded by a pattern may be copied in a memory area of the user terminal. In the case where the user selects the share button 1232 through a touch input or the like, the user terminal may enable the user to share the second pattern 1214 or the text information encoded by the second pattern 1214 with other users.
Fig. 12 shows the preview 1222 for the first pattern and the preview 1228 for the second pattern displayed on the display, but the present invention is not limited thereto; in the case where three or more patterns are included in the received image, a preview for each of the three or more patterns may be displayed on the display. Further, although fig. 12 shows the previews displayed in the order in which the plurality of patterns appear in the image, the present invention is not limited to this; instead, a preview of a pattern with a high priority may be displayed at the upper end of the preview area based on the priorities of the plurality of patterns in the image. In this case, when the priorities of the plurality of patterns in the image are the same, the reliability score of each pattern may be calculated, and a preview of the pattern with a higher reliability score may be displayed at the upper end of the preview area.
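The ordering rule just described (priority first, reliability score as a tie-breaker) can be sketched as follows. The field names (`priority`, `cx`, `cy`, `area`) and the scoring formula, which mixes closeness to the image center with the pattern's area ratio, are illustrative assumptions; smaller priority values rank higher in this sketch.

```python
def order_previews(patterns, image_w, image_h):
    """Order detected patterns so the top of the preview area shows
    the highest-priority pattern, breaking priority ties with a
    reliability score (sketch only; names and formula are assumed)."""
    def reliability(p):
        dx = p["cx"] - image_w / 2
        dy = p["cy"] - image_h / 2
        distance = (dx * dx + dy * dy) ** 0.5
        area_ratio = p["area"] / (image_w * image_h)
        # Larger patterns near the image center score higher.
        return area_ratio - distance / max(image_w, image_h)

    return sorted(patterns, key=lambda p: (p["priority"], -reliability(p)))
```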
FIG. 13 is a diagram illustrating an example of displaying a preview of a pattern selected by a user on a display according to an embodiment of the present invention. Fig. 13 shows a first operation 1310, a second operation 1320, and a third operation 1330 of the user terminal. In the first operation 1310, while the user terminal operates in the pattern scanning mode, in the case where an image including a first pattern 1312 and a second pattern 1314 of a visual code is received through the image sensor, the first pattern 1312 and the second pattern 1314 are detected. In this case, the area containing the first pattern 1312 and the area containing the second pattern 1314 may be marked with different colors on the display.
In a second operation 1320, the user terminal displays the detected plurality of patterns 1312, 1314 as selectable options 1324, 1326 on an option area 1322 on the display. For example, selectable options 1324, 1326 may be icons formed by resizing the detected plurality of patterns 1312, 1314. All of the detected plurality of patterns may be displayed in the option area 1322.
In the case where the user selects the first option 1324 by a touch input or the like, a preview 1328 for the first pattern 1312 associated with the first option 1324 may be displayed in a preview area of the display. As shown, the first pattern 1312 may be a two-dimensional pattern for adding "James" as a friend in a non-financial service (i.e., the LINE chat tool service) of a specific company. In this case, to indicate that the first option 1324 has been selected, the unselected options (i.e., the second option 1326) may be shaded. In the case where the user selects the add friend button within the preview 1328 for the first pattern 1312 associated with the first option 1324 by a touch input or the like, the user terminal may execute the friend add function in the LINE chat tool.
In a third operation 1330, the user selects the second option 1326 by a touch input or the like, and a preview 1332 for the second pattern 1314 associated with the second option 1326 is displayed in a preview area of the display. As shown, the second pattern 1314 may be a two-dimensional pattern encoding uniform resource locator information related to services of other companies. In this case, the user terminal may receive at least one of a thumbnail, a title, and a description sentence related to the corresponding uniform resource locator from an external device, and may display the received thumbnail, title, and/or description sentence as the preview 1332 of the second pattern 1314 in the preview area on the display. In the case where the user selects the preview 1332 for the second pattern 1314, the user terminal may process the uniform resource locator information encoded by the second pattern 1314 in a web browser application or the like.
FIG. 14 is a flow chart illustrating a method 1400 of recognizing visually encoded patterns in accordance with an embodiment of the present invention. The method 1400 may begin at step 1410, in which the user terminal receives an image including a first pattern and a second pattern. For example, an image containing a plurality of patterns may be received while the user terminal operates in a pattern scanning mode for recognizing patterns of visual codes.
Thereafter, the user terminal may detect areas in the image that include the visually encoded patterns (the first and second patterns), and display the first and second patterns as selectable options on the display (step 1420). In one embodiment, the area containing the first pattern and the area containing the second pattern may be marked with different colors on the display to serve as selectable options. In another embodiment, icons formed by resizing the detected plurality of patterns may be displayed as selectable options in an option area on the display.
In step 1430, the user terminal may receive a selection of a first pattern from the user. In this case, a preview of the selected first pattern may be displayed in a preview area on the display (step 1440). If a selection related to a preview for the first pattern is received from the user in step 1450, the user terminal may perform processing for the first pattern (step 1460).
Fig. 14 shows the first pattern and the second pattern displayed as selectable options in an option area on the display, but the present invention is not limited to this; in the case where three or more patterns are included in the received image, the three or more patterns may be displayed as selectable options in the option area on the display.
FIG. 15 is a flow chart illustrating a method 1500 of recognizing visually encoded patterns in accordance with an embodiment of the present invention. The method 1500 may begin at step 1510, in which the user terminal receives an image including a first pattern and a second pattern. For example, an image containing a plurality of patterns may be received while the user terminal operates in a pattern scanning mode for recognizing patterns of visual codes.
Thereafter, the user terminal may detect the first pattern and the second pattern from the received image and display a preview of the first pattern and a preview of the second pattern in a preview area on the display (step 1520). In an embodiment, the arrangement of the previews for the first and second patterns may be determined by comparing the priority of the pattern type of the first pattern with the priority of the pattern type of the second pattern. Alternatively or additionally, the arrangement of the previews for the first and second patterns may be determined by comparing the reliability score of the first pattern with the reliability score of the second pattern.
The user terminal may receive a selection from the user related to the preview of the first pattern displayed on the display (step 1530). In this case, the user terminal may perform the process for the selected first pattern (step 1540). Fig. 15 shows previews of the first pattern and the second pattern displayed in the preview area on the display, but the present invention is not limited thereto, and when 3 or more patterns are included in the received image, previews of 3 or more patterns may be displayed in the preview area on the display.
The pattern recognition method for visual codes described above may be implemented as computer-readable code (e.g., a computer program) on a computer-readable recording medium. The computer-readable recording medium includes all kinds of recording devices that store data readable by a computer system. Examples of the computer-readable recording medium include read-only memory, random-access memory, optical disks, magnetic disks, floppy disks, optical data storage devices, and the like. The computer-readable recording medium may also be distributed over computer systems connected through a network so that the computer-readable code is stored and executed in a distributed manner. Functional programs, code, and code segments for implementing the above-described embodiments can be easily inferred by programmers skilled in the art to which the present invention pertains.
The methods, operations, or processes of the present invention may be carried out by various means. For example, such methods may be implemented in hardware, firmware, software, or a combination thereof. Those of skill in the art will understand that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the present disclosure may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design requirements imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In a hardware implementation, the processing unit for performing the method may be implemented in one or more application specific integrated chips, digital signal processors, Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, controllers, microprocessors, microcontrollers, electronic devices, other electronic units designed to perform the functions described in the present disclosure, computers, or a combination thereof.
Thus, the various illustrative logical blocks, modules, and circuits described in connection with the invention may be implemented or performed with a general purpose processor, a digital signal processor, an application specific integrated chip, a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described in this invention. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. Also, a processor may be implemented as a combination of computing devices, e.g., a combination of a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other configuration.
In a firmware and/or software implementation, the methods may be implemented as instructions (e.g., a computer program) stored on a computer-readable medium such as a random access memory (RAM), a read-only memory (ROM), a non-volatile random access memory (NVRAM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a flash memory, a compact disc (CD), a magnetic or optical data storage device, or the like. The instructions may be executable by one or more processors to cause the processor(s) to perform certain aspects of the functions described in the present invention.
In the case of software implementation, the above-described method may be stored as one or more instructions or codes on a computer-readable medium, or may be transmitted through the computer-readable medium. Computer-readable media include any medium that can readily convey a computer program from one location to another and include computer storage media and communication media. A storage media may be any available media that can be accessed by a computer. By way of non-limiting example, such computer-readable media can comprise random access memory, read only memory, electrically erasable programmable read only memory, optical disk drives or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to move or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium.
For example, if the software is transmitted from a web site, server, or other remote source using a coaxial cable, fiber optic cable, wire, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, wire, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used in the present invention, include compact disc, laser disc, optical disc, digital video disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of computer-readable media.
The software modules may reside in ram, flash memory, rom, eprom, eeprom, registers, hard disk, a removable disk, a cd-rom, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor in a manner that enables the processor to read information from, and record information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside on an application specific integrated chip. The application specific integrated chip may be present in the user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
The embodiments described above present the disclosed subject matter in the form of one or more stand-alone computer systems, but the invention is not limited thereto and may be implemented in conjunction with any computing environment, such as a network or distributed computing environment. Further, aspects of the subject matter of the present invention may be implemented in a plurality of processing chips or devices, and storage may similarly be effected across a plurality of devices. Such devices may include personal computers, network servers, and portable devices.
In the present specification, the present invention has been described with reference to some embodiments, but those skilled in the art to which the present invention pertains will understand that various modifications and alterations can be made without departing from the scope of the present invention. Such modifications and alterations are intended to fall within the scope of the appended claims.

Claims (20)

1. A pattern recognition method, performed in a user terminal, comprising:
a step of receiving an image comprising a first pattern of visual coding and a second pattern of visual coding;
judging the pattern type of the first pattern of the visual code;
judging the pattern type of the second pattern of the visual code;
comparing the priority of the pattern type of the first pattern of the visual code with the priority of the pattern type of the second pattern of the visual code; and
and automatically executing the processing of the first pattern of the visual code when the pattern type of the first pattern of the visual code is judged to have higher priority than the pattern type of the second pattern of the visual code.
2. The pattern recognition method of claim 1, wherein the step of determining the pattern type of the first pattern of the visual code comprises:
judging whether the first pattern of the visual code is a pattern related to a specific service; and
and determining a pattern type of the first pattern of the visual code as a first pattern type when the first pattern of the visual code is determined as a pattern related to a specific service.
3. The pattern recognition method of claim 2, wherein the step of determining the pattern type of the first pattern of the visual code further comprises:
determining whether the first pattern of the visual code is a pattern related to a service of a specific company when it is determined that the first pattern of the visual code is a pattern not related to a specific service; and
and a step of judging the pattern type of the first pattern of the visual code as a second pattern type when judging that the first pattern of the visual code is a pattern related to the service of a specific company.
4. The pattern recognition method according to claim 3, wherein the step of determining the pattern type of the first pattern of the visual code when it is determined that the first pattern of the visual code is not related to a service of a specific company further comprises:
determining a pattern type of the first pattern of the visual code as a third pattern type when the first pattern of the visual code is a two-dimensional pattern; and
and determining the pattern type of the first pattern of the visual code as a fourth pattern type when the first pattern of the visual code is a one-dimensional pattern.
5. The pattern recognition method according to claim 4, wherein the first pattern type, the second pattern type, the third pattern type, and the fourth pattern type have a gradually decreasing priority in this order.
6. The pattern recognition method according to claim 1, wherein when it is determined that the pattern type of the first pattern of the visual code has the same priority as the pattern type of the second pattern of the visual code, the pattern recognition method further comprises:
calculating a reliability score of the visually encoded first pattern based on at least one of a position and a size of the visually encoded first pattern within the received image;
calculating a reliability score of the visually encoded second pattern based on at least one of a position and a size of the visually encoded second pattern within the received image; and
and displaying a preview of the first pattern of the visual code on a display of the user terminal when it is determined that the reliability score of the first pattern of the visual code is higher than the reliability score of the second pattern of the visual code.
7. The pattern recognition method of claim 6, wherein the step of calculating the reliability score of the visually encoded first pattern based on at least one of a position and a size of the visually encoded first pattern within the received image comprises: and calculating a distance between a center position of the received image and a center position of the first pattern of the visual code.
8. The pattern recognition method of claim 6, wherein the step of calculating the reliability score of the visually encoded first pattern based on at least one of a position and a size of the visually encoded first pattern within the received image comprises: and calculating the area ratio occupied by the area corresponding to the first pattern of the visual code in the received image.
9. The pattern recognition method according to claim 1, further comprising, when it is determined that the pattern type of the first pattern of the visual code has the same priority as the pattern type of the second pattern of the visual code:
calculating a reliability score of the visually encoded first pattern based on at least one of a position and a size of the visually encoded first pattern within the received image;
calculating a reliability score of the visually encoded second pattern based on at least one of a position and a size of the visually encoded second pattern within the received image; and
and automatically executing the processing of the first pattern of the visual code when the reliability score of the first pattern of the visual code is determined to be higher than the reliability score of the second pattern of the visual code.
10. The pattern recognition method according to claim 1, further comprising: and displaying a mark on a first region corresponding to the first pattern of the visual code to inform a user of the automatic execution of the first pattern of the visual code.
11. The pattern recognition method according to claim 1, further comprising:
detecting a first region including the first pattern of the visual code;
detecting a second region including a second pattern of the visual code; and
and labeling the first region and the second region with different colors.
12. The pattern recognition method according to claim 1, further comprising:
a step of receiving a plurality of images in sequence in an image capturing mode; and
and displaying a reminder for confirming whether to enter a pattern scanning mode on a display of the user terminal when the image including the first pattern of the visual code and the second pattern of the visual code is received.
13. A computer-readable recording medium characterized by storing a computer program for executing the pattern recognition method according to any one of claims 1 to 12 in a computer.
14. A pattern recognition apparatus, comprising:
an image sensor for receiving an image comprising a first pattern of visual codes and a second pattern of visual codes;
a pattern type determination unit for determining a pattern type of the first pattern of the visual code and a pattern type of the second pattern of the visual code; and
a pattern processing section for comparing the priority of the pattern type of the first pattern of the visual code with the priority of the pattern type of the second pattern of the visual code,
the pattern processing unit automatically executes the processing of the first pattern of the visual code when it is determined that the priority of the pattern type of the first pattern of the visual code is higher than the priority of the pattern type of the second pattern of the visual code.
15. A pattern recognition method, performed in a user terminal, comprising:
receiving an image including a first pattern of a visual code and a second pattern of the visual code;
generating a preview of the first pattern of the visual code;
generating a preview of the second pattern of the visual code;
displaying a preview of the first pattern and a preview of the second pattern in a preview area on a display of the user terminal;
receiving, from a user, a selection related to the preview of the first pattern; and
executing processing of the first pattern of the visual code in response to the selection.
16. The pattern recognition method according to claim 15, wherein the step of displaying the preview of the first pattern and the preview of the second pattern in a preview area on a display of the user terminal comprises:
calculating a reliability score of the first pattern of the visual code based on at least one of a position and a size of the first pattern within the received image;
calculating a reliability score of the second pattern of the visual code based on at least one of a position and a size of the second pattern within the received image; and
displaying the preview of the first pattern of the visual code at an upper end of the preview area on the display of the user terminal when the reliability score of the first pattern of the visual code is higher than the reliability score of the second pattern of the visual code.
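The reliability-score ordering of claim 16 can be sketched as follows. The equal weighting of size and centeredness is an assumption; the claim only states that position and/or size within the received image are used, with the higher-scoring pattern's preview shown at the top.

```python
def reliability_score(cx, cy, w, h, img_w, img_h):
    """Score a detected pattern: larger and more centered patterns score higher.
    The 50/50 weighting is an illustrative assumption."""
    size_ratio = (w * h) / (img_w * img_h)
    # Normalized distance of the pattern center from the image center.
    dx = abs(cx - img_w / 2) / (img_w / 2)
    dy = abs(cy - img_h / 2) / (img_h / 2)
    centeredness = 1.0 - (dx + dy) / 2
    return 0.5 * size_ratio + 0.5 * centeredness

def order_previews(patterns, img_w, img_h):
    """Sort previews so the highest-scoring pattern appears at the top."""
    return sorted(patterns,
                  key=lambda p: reliability_score(*p[1], img_w, img_h),
                  reverse=True)

# (label, (center_x, center_y, width, height)) in an 800x600 frame
patterns = [("second", (700, 100, 40, 40)),   # small, near a corner
            ("first", (400, 300, 200, 200))]  # large, dead center
ordered = order_previews(patterns, 800, 600)
```

Here the large, centered pattern outscores the small off-center one, so its preview would occupy the upper end of the preview area.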
17. The pattern recognition method according to claim 15, further comprising:
detecting a first region including the first pattern of the visual code;
detecting a second region including a second pattern of the visual code; and
marking the first region and the second region in different colors.
18. The pattern recognition method according to claim 15, wherein the step of generating the preview of the first pattern of the visual code further comprises:
transmitting uniform resource locator information encoded in the first pattern of the visual code to an external device when the first pattern of the visual code corresponds to a uniform resource locator code;
receiving, from the external device, at least one of a thumbnail, a title, and a description related to the uniform resource locator information; and
displaying at least one of the received thumbnail, title, and description in the preview area on the display.
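The URL-preview exchange of claim 18 can be sketched as below. The transport and the response fields are assumptions; in practice the external device might be an Open Graph scraper or a link-preview service, injected here as a function so the sketch stays self-contained.

```python
def fetch_link_preview(url, request_fn):
    """Send the decoded URL to an external device and return preview metadata
    (thumbnail, title, description), any of which may be missing."""
    response = request_fn(url)  # injected transport, e.g. an HTTP client call
    return {
        "thumbnail": response.get("thumbnail"),
        "title": response.get("title"),
        "description": response.get("description"),
    }

# Stand-in for the external device's response; a real implementation would
# perform a network request instead.
def fake_external_device(url):
    return {"title": "Example Domain",
            "description": "Illustrative page",
            "thumbnail": "https://example.com/thumb.png"}

preview = fetch_link_preview("https://example.com", fake_external_device)
```

The returned dictionary maps directly onto the preview area: thumbnail image, title line, and description text.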
19. A computer-readable recording medium storing a computer program that, when executed by a computer, performs the pattern recognition method according to any one of claims 15 to 18.
20. A pattern recognition apparatus, comprising:
an input interface for receiving input from a user;
an image sensor for receiving an image comprising a first pattern of visual codes and a second pattern of visual codes;
a pattern preview generating unit configured to generate a preview of a first pattern of the visual code and a preview of a second pattern of the visual code;
a display for displaying the generated preview of the first pattern and the generated preview of the second pattern of the visual code in a preview area; and
a pattern processing unit configured to execute processing of the first pattern of the visual code in response to receiving, through the input interface, a selection related to the preview of the first pattern.
CN202011132620.2A 2019-10-22 2020-10-21 Pattern recognition method and apparatus, and computer-readable recording medium Pending CN112699703A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190131705A KR102273198B1 (en) 2019-10-22 2019-10-22 Method and device for recognizing visually coded patterns
KR10-2019-0131705 2019-10-22

Publications (1)

Publication Number Publication Date
CN112699703A true CN112699703A (en) 2021-04-23

Family

ID=75505859

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011132620.2A Pending CN112699703A (en) 2019-10-22 2020-10-21 Pattern recognition method and apparatus, and computer-readable recording medium

Country Status (3)

Country Link
JP (1) JP2021068447A (en)
KR (2) KR102273198B1 (en)
CN (1) CN112699703A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7425913B1 (en) 2023-04-14 2024-01-31 東京瓦斯株式会社 Service provision system

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
JP2007172471A (en) * 2005-12-26 2007-07-05 Canon Inc Scanner driver and control method for scanner driver
JP5341951B2 (en) * 2011-05-30 2013-11-13 東芝テック株式会社 Code reader and program
JP2018018372A (en) * 2016-07-29 2018-02-01 オリンパス株式会社 Bar-code reading device, bar-code reading method, and program
KR101784287B1 (en) * 2016-12-20 2017-10-11 에스케이플래닛 주식회사 Integrative image searching system and service method of the same

Also Published As

Publication number Publication date
KR20210084386A (en) 2021-07-07
KR102273198B1 (en) 2021-07-05
KR102542046B1 (en) 2023-06-12
JP2021068447A (en) 2021-04-30
KR20210047749A (en) 2021-04-30

Similar Documents

Publication Publication Date Title
CN106327185B (en) Starting method of payment application and mobile terminal
CN101997561B (en) Data transfer method and system
JP5974976B2 (en) Information processing apparatus and information processing program
JP2019075124A (en) Method and system for providing camera effect
KR20140075681A (en) Establishing content navigation direction based on directional user gestures
KR20150059466A (en) Method and apparatus for recognizing object of image in electronic device
CN103365413A (en) Information processing device, computer-readable storage medium and projecting system
CN105809162B (en) Method and device for acquiring WIFI hotspot and picture associated information
US20210065269A1 (en) Listing support method and system
CN112699703A (en) Pattern recognition method and apparatus, and computer-readable recording medium
US10114518B2 (en) Information processing system, information processing device, and screen display method
JP6423933B2 (en) Information processing apparatus, form management system, form management server, information processing method, and program thereof
US9826108B2 (en) Mobile device camera display projection
CN112183149B (en) Graphic code processing method and device
KR20200121064A (en) Method, system, and non-transitory computer readable record medium for p managing event messages
KR20200106186A (en) How to recommend profile pictures and system and non-transitory computer-readable recording media
US11810231B2 (en) Electronic device and method for editing content of external device
KR20220109170A (en) Method and system for providing mini-map in chatroom
CN112287713A (en) Two-dimensional code identification method and device
US11960949B2 (en) Information processing apparatus, information processing system, and information processing method
CN114615377B (en) Application program control method, device and equipment
US20240143961A1 (en) Information processing apparatus, method, and storage medium
US20230169039A1 (en) Information processing apparatus, information processing method, and information processing system
JP7248279B2 (en) Computer system, program and method
WO2022079881A1 (en) Information processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination