KR101862649B1 - Code generation and recognition system based on scan distance - Google Patents

Code generation and recognition system based on scan distance

Info

Publication number
KR101862649B1
Authority
KR
South Korea
Prior art keywords
image
code
background
terminal device
cell
Prior art date
Application number
KR1020170179819A
Other languages
Korean (ko)
Inventor
Kim Seong-mo (김성모)
Original Assignee
KB Invest Co., Ltd. (주식회사 케이비인베스트)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by KB Invest Co., Ltd. (주식회사 케이비인베스트)
Priority to KR1020170179819A
Application granted
Publication of KR101862649B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 19/00 Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K 19/06 Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K 19/06009 Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 Methods for optical code recognition

Abstract

The present invention relates to a code generation and recognition system based on scan distance, and more particularly to a system that includes a code generation device for transmitting different information to a terminal device based on which portion of an image the terminal device scans and how large that portion is.
One problem to be solved by the proposed invention is to transmit different information to the terminal device based on which part of the same image is scanned by the terminal device and the size of that part, thereby breaking the existing perception that only one piece of information can be obtained from one image, and providing various information based on a single image.
The code generating apparatus includes: an image receiving unit that receives an image scanned by the terminal apparatus; and an image recognition unit that transmits different information to the terminal device according to which area of the image was scanned and the ratio of that area to the entire area of the image.

Description

[0001] The present invention relates to a code generation and recognition system based on scan distance.

The present invention relates to a code generation and recognition system based on scan distance, in which a terminal device scanning an image receives different information from a code generation device according to the distance at which the image is scanned, and to a code generation device that transmits different information to the terminal device based on which part of the image is scanned and the size of that part.

An existing barcode is basically a one-dimensional structure that can hold only numeric information of up to about 20 digits in a horizontal array. A QR code, by contrast, is a two-dimensional structure that uses both width and height and can record a maximum of 7,089 numeric characters, 4,296 alphanumeric characters, or 1,817 Chinese characters.

Because of this limit on the information it can hold, a barcode can record only information such as a specific product name or manufacturer. A QR code, however, can contain a long Internet address (URL), photo and video information, map information, and more.

However, such a QR code provides information only when it is recognized as a whole; a part of a QR code alone does not provide independent information. Also, since existing codes focus only on information transmission, it was difficult to regard them as aesthetically pleasing.

As a result, it is necessary to provide information to the terminal device even if only a part of the entire code is scanned, and to implement the entire code as a picture so that it also provides aesthetics.

One problem to be solved by the proposed invention is to transmit different information to the terminal device based on which part of the same image is scanned by the terminal device and the size of that part, thereby breaking the existing perception that only one piece of information can be obtained from one image, and providing various information based on a single image.

Another problem to be solved by the proposed invention is to generate a code including figures each moved by a predetermined distance or less in an arbitrary direction from a lattice point of a lattice pattern, divide the code into one or more cells, and match specific information to each cell so that information can be recorded in a portion of the code.

Another problem to be solved by the proposed invention is to enable the code to be perceived as an aesthetically pleasing picture as a whole.

Another problem to be solved by the proposed invention is to place no restriction on the types of figures the code includes.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.

The code generating apparatus includes: an image receiving unit that receives an image scanned by the terminal apparatus; and an image recognition unit that transmits different information to the terminal device according to which area of the image was scanned and the ratio of that area to the entire area of the image.

The image recognition unit compares the background included in the received image with the background stored in the database to specify the background of the received image.

The image recognition unit extracts the contour of the background included in the image and compares the extracted contour with the contours of backgrounds of images stored in the database. If the extracted contour matches a preset ratio of the contour length of a background image in the database, the unit specifies the background of the received image as that database background.

The image recognition unit extracts a code from an image in which a background is specified.

The image recognizing unit identifies a cell included in the scanned image when the number of figures belonging to the same group, among the figures in the extracted code, is equal to or greater than a predetermined number.

The image recognition unit transmits information matched to the specified cell to the terminal device.

When the figures of each of several different groups appear in numbers equal to or greater than the predetermined number, the image recognizing unit specifies, among the corresponding cells, the cell matched to the group containing the larger figures.

The proposed invention transmits different information to the terminal device based on which portion of the same image is scanned by the terminal device and the size of that portion, breaking the perception that only one piece of information can be acquired from one image, and providing various information based on a single image.

The proposed invention generates a code including figures each shifted by a predetermined distance or less in an arbitrary direction from a lattice point of a grid pattern, divides the code into one or more cells, and matches specific information to each cell so that information can be recorded in a portion of the code.

The proposed invention allows the code to be perceived, as a whole, as an aesthetically pleasing picture.

The proposed invention can be made so that there is no restriction on the type of graphic included in the code.

The effects of the present invention are not limited to the above-mentioned effects, and various effects can be included within the range that is obvious to a person skilled in the art from the following description.

FIG. 1 illustrates a code generation and recognition system based on scan distance, including a code generation apparatus and a terminal apparatus according to an embodiment.
FIG. 2 shows the overall configuration of a code generation apparatus according to an embodiment.
FIG. 3 shows a terminal device photographing the same image at different distances.
FIG. 4 shows a terminal device to which a magnifying glass is coupled.
FIG. 5 shows a code according to an embodiment.
FIG. 6 shows grid patterns, grid points, and figures.
FIG. 7 shows a horizontal line, a vertical line, and a derivation distance.
FIG. 8 shows an arbitrary figure and a circle included in the figure.
FIG. 9 shows a code according to one embodiment.
FIG. 10 shows a code according to another embodiment.
FIG. 11 shows a code and one or more cells.
FIG. 12 shows a code and one or more cells according to another embodiment.
FIG. 13 shows a terminal device outputting a code recognition area.
FIG. 14 shows the background of a code and the figures of the code.
FIG. 15 shows a process in which the code generating device recognizes the background of an image scanned by a terminal device.
FIG. 16 shows a process in which the code generation device recognizes a code from an image scanned by a terminal device.
FIG. 17 shows various methods by which the code generating apparatus converts an image into black and white.
FIG. 18 shows, for a business card including a code generated by the code generating device according to an example, the information matched to each area of the business card scanned by the terminal device.

The foregoing and further aspects are embodied through the embodiments described with reference to the accompanying drawings. It is to be understood that the components of each embodiment are capable of various combinations within an embodiment as long as no other mention or mutual contradiction exists. Furthermore, the proposed invention may be embodied in many different forms and is not limited to the embodiments described herein.

In order to clearly illustrate the claimed invention, parts not related to the description are omitted, and like reference numerals are used for like parts throughout the specification. When a section is referred to as "including" an element, this does not exclude other elements unless specifically stated to the contrary.

In addition, throughout the specification, when a part is referred to as being "connected" to another part, this includes not only the case where it is "directly connected" but also the case where it is "electrically connected". Further, in this specification, a signal means a quantity of electricity such as a voltage or a current.

As used herein, the term "block" refers to a unit of hardware or software configured to be changed or pluggable, i.e., a unit or block that performs a specific function in hardware or software.

FIG. 1 illustrates a code generation and recognition system based on scan distance, including a code generation apparatus 100 and a terminal apparatus 200 according to an embodiment.

The code generation device 100 may be any type of device capable of communication, such as a computer (e.g., a desktop, a laptop, or a tablet), a media computing platform (e.g., a cable or satellite set-top box or a digital video recorder), a handheld computing device (e.g., a PDA or an email client), any form of mobile phone, or any other type of computing or communication platform, but the invention is not so limited.

The code generation apparatus 100 generates a code including figures 12 each moved by a predetermined distance or less in an arbitrary direction from a lattice point 11 of the lattice pattern 10, divides the code into one or more cells 19, and matches specific information to each cell 19. The sequence in which the code generating apparatus 100 generates a code is described later.

The terminal device 200 may be a computer (e.g., a desktop, a laptop, or a tablet), a media computing platform, a handheld computing device (e.g., a PDA or an email client), any form of mobile phone, or any other type of computing or communication platform, but the invention is not so limited.

Referring to FIG. 1, the code generation apparatus 100 generates and outputs an image including a code and a background. The code generation apparatus 100 can output the image on a display unit or reproduce it on the surface of an object such as paper or cloth. The terminal device 200 may be operatively connected to the code generation device 100 by executing application software. The terminal device 200 recognizes all or part of the image with its camera and receives information corresponding to the recognized part from the code generation device 100.

Referring to FIG. 1, in step (1) the code generating apparatus 100 generates a code based on a method described later. In step (2), the terminal device 200 installs and executes the application software. In step (3), the terminal device 200 scans all or part of the image including the code. In step (4), the terminal device 200 receives information corresponding to the scanned area from the code generation device 100.

FIG. 2 shows the overall configuration of a code generation apparatus 100 according to an embodiment.

The code generation device wirelessly communicates with a terminal device that scans an image, and comprises: an image reception unit 107 that receives the image scanned by the terminal device; and an image recognition unit 108 that transmits different information to the terminal device according to which area of the image was scanned and the ratio of that area to the entire area of the image.

The image receiving unit 107 receives the scanned image from the terminal device. The scanning of the image by the terminal device may mean that the image is captured by the camera provided in the terminal device.

The image recognizing unit 108 transmits different information to the terminal device according to which area of the image was scanned and the ratio of that area to the entire image area.

The scanned area of the image means the area the terminal device scanned out of the entire image. The scanned region may be the entire image or a specific portion of it. Even if the scanned portions are the same size, their positions within the entire image may differ.

If the scanned area is the entire image, the percentage occupying the entire area of the image is 100 percent. If the area scanned by the terminal device is a specific part of the image, the percentage occupied by the entire image area is less than 100 percent.
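As an illustration, the ratio the image recognition unit relies on can be computed as follows. This is a minimal sketch, not from the patent; the rectangular scanned-region model and the function name are assumptions for illustration.

```python
# Sketch: the percentage of the full image covered by the scanned region,
# the quantity used to decide which information to return.

def scan_ratio(image_size, scanned_box):
    """Return the percentage of the full image covered by the scanned box.

    image_size:  (width, height) of the full image
    scanned_box: (x, y, width, height) of the region the terminal captured
    """
    img_w, img_h = image_size
    _, _, box_w, box_h = scanned_box
    return 100.0 * (box_w * box_h) / (img_w * img_h)

# A full-image scan yields 100 percent; a quarter crop yields 25 percent.
print(scan_ratio((400, 400), (0, 0, 400, 400)))   # 100.0
print(scan_ratio((400, 400), (0, 0, 200, 200)))   # 25.0
```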

As such, the image recognition unit 108 can transmit different information to the terminal device based on the scanned area and its ratio to the entire image. A detailed description thereof is given later with reference to the drawings.

The image recognition unit 108 compares the background included in the received image with the background stored in the database to specify the background of the received image.

The image of the code recognition area 201 scanned by the terminal device 200 is transmitted to the code generation device 100. The image recognition unit 108 compares the background included in the received image with the backgrounds stored in the database to specify the background of the received image. Specifying the background of the image means that the background of the received image is treated as the background of one of the plural images stored in the database.

The image recognition unit 108 extracts the contour of the background included in the image and compares it with the contours of background images stored in the database. If the extracted contour matches a predetermined ratio of the contour length of a background image in the database, the unit specifies the background of the received image as that database background.

The image recognition unit 108 can extract a contour line from changes in color: a point where the amount of color change is equal to or greater than a reference value can be defined as a point on the contour line. The image recognizing unit 108 may treat a database image background as the same as the background of the received image if the extracted contour matches 80% or more of the contour length of the background of the database image.
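The contour rule above can be sketched in code. This is a hedged illustration, not the patent's implementation: a pixel is treated as a contour point when the local color change meets the reference value, and a stored background is accepted when at least 80% of its contour points are matched. The names `extract_contour` and `is_same_background` are illustrative.

```python
# Sketch of contour extraction by color change and the 80% matching rule.

def extract_contour(gray, threshold=30):
    """Return the set of (x, y) points whose right or lower neighbor
    differs by at least `threshold` (the reference value)."""
    h, w = len(gray), len(gray[0])
    points = set()
    for y in range(h):
        for x in range(w):
            for dx, dy in ((1, 0), (0, 1)):
                nx, ny = x + dx, y + dy
                if nx < w and ny < h and abs(gray[y][x] - gray[ny][nx]) >= threshold:
                    points.add((x, y))
    return points

def is_same_background(scanned, stored, required=0.80):
    """Accept when the scanned contour covers >= 80% of the stored contour."""
    if not stored:
        return False
    return len(stored & scanned) / len(stored) >= required

# A tiny 4x4 image with a bright 2x2 patch produces a contour around the patch;
# an identical contour matches 100% and is accepted.
img = [[0, 0, 0, 0],
       [0, 255, 255, 0],
       [0, 255, 255, 0],
       [0, 0, 0, 0]]
c = extract_contour(img)
print(is_same_background(c, c))
```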

Even if the code contained in two images is the same, different information can be transmitted to the terminal device if the backgrounds of the images are specified differently. Since the background itself can be an element for determining information, the image recognition unit 108 specifies the background of the received image.

The image recognition unit 108 extracts a code from an image whose background is specified.

The image recognition unit 108 extracts the code included in the image after specifying the background of the image. An image contains a background and code. The code includes only the figure 12 described below.

The image recognition unit 108 converts the background-specified image to black and white. To extract only the code from the monochrome-converted image, the image recognition unit 108 adjusts at least one of saturation, brightness, contrast, color, and sharpness. Saturation can be adjusted in the range of 0 to 401%. Brightness can be adjusted in the range of 0 to 100%. Contrast can be adjusted in the range of 0 to 100%. For RGB color, each of red, green, and blue can be adjusted in the range of 0 to 255. For CMYK color, each of cyan, magenta, yellow, and black can be adjusted in the range of 0 to 100. Sharpness can be adjusted in the range of 0 to 100%.

The image recognition unit 108 adjusts at least one of saturation, brightness, contrast, color, and sharpness, and stops the adjustment and extracts the figure 12 once the color value of the figure 12 differs from that of the background by 2 or more. Extracting the figure 12 amounts to extracting the code.

For RGB color, the adjustment stops if at least one of the red, green, and blue values of the figure 12 differs by 2 or more from the corresponding value of the background located at the boundary of the figure 12. The respective values of red, green, and blue are RGB values.

For CMYK color, the adjustment stops if at least one of the cyan, magenta, yellow, and black values of the figure 12 differs by 2 or more from the corresponding value of the background located at the boundary of the figure 12. The respective values of cyan, magenta, yellow, and black are CMYK values.
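The stop condition above can be illustrated with a small loop. This is an assumed sketch, not the patent's algorithm: contrast is raised step by step and adjustment stops as soon as some RGB channel of the figure differs from the background at its boundary by 2 or more. The stepwise contrast model and all names are illustrative.

```python
# Sketch of the extraction stop condition: adjust until the figure's color
# differs from the background by at least 2 in some channel, then stop.

def boost_contrast(value, factor, pivot=128):
    """Push a channel value away from the pivot; clamp to 0..255."""
    return max(0, min(255, round(pivot + (value - pivot) * factor)))

def adjust_until_separable(figure_rgb, background_rgb, step=0.1, max_factor=4.0):
    """Raise contrast until at least one channel differs by >= 2; return the
    factor at which adjustment stopped, or None if never separable."""
    factor = 1.0
    while factor <= max_factor:
        fig = [boost_contrast(v, factor) for v in figure_rgb]
        bg = [boost_contrast(v, factor) for v in background_rgb]
        if any(abs(f - b) >= 2 for f, b in zip(fig, bg)):
            return factor
        factor = round(factor + step, 10)
    return None

# Figure and background start only 1 apart in the red channel; a modest
# contrast boost separates them enough for the figure to be extracted.
print(adjust_until_separable((129, 128, 128), (128, 128, 128)))
```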

The image recognition unit 108 separates the background and the code from the image, and extracts the code.

FIG. 3 shows a terminal device that photographs the same object 300 at different distances.

The terminal device according to FIG. 3 captures the same object 300 at different distances to acquire different information. The same background and code are on the surface of the object 300, but the terminal device can receive different information depending on the distance from which the terminal device 200 scans the object 300.

FIG. 4 shows a terminal device to which a magnifying glass 202 is coupled.

By coupling the magnifying glass 202, the terminal device can collect information based on a code that is hard to recognize with the naked eye.

FIG. 5 shows a code according to an embodiment. The code here refers to the code contained in the image. If the image contains no background, the image itself may be the code. If the image includes a background, the code according to FIG. 5 may be the code extracted by separating the background and the code after specifying the background of the image.

The image recognition unit 108 identifies a cell included in the scanned image when the number of figures belonging to the same group, among the figures in the extracted code, is equal to or greater than a predetermined number.

Figures whose areas differ by no more than a predetermined range may be included in the same group. In the example of FIG. 5, the first-group figure 401 is the largest figure, the second-group figure 402 is the middle-sized figure, and the third-group figure 403 is the smallest. According to FIG. 5, there are three groups.

The predetermined number is, for example, four.
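The grouping rule can be sketched as follows. This is an illustrative assumption, not the patent's method: figures whose areas fall within a tolerance of one another share a group, and a group only triggers cell identification when it holds at least the predetermined number of figures (four, as in the example above). The tolerance value and function names are hypothetical.

```python
# Sketch: group figures by area, then keep groups large enough to
# identify a cell.

def group_by_area(areas, tolerance=5.0):
    """Greedily place each figure into the first group whose representative
    area is within `tolerance`; otherwise open a new group."""
    groups = []  # list of lists of areas
    for a in sorted(areas):
        for g in groups:
            if abs(g[0] - a) <= tolerance:
                g.append(a)
                break
        else:
            groups.append([a])
    return groups

def qualifying_groups(areas, predetermined=4, tolerance=5.0):
    """Return the groups whose figure count reaches the predetermined number."""
    return [g for g in group_by_area(areas, tolerance) if len(g) >= predetermined]

# Five large figures (~100) and three small ones (~20): only the large
# group reaches the predetermined count of four.
areas = [100, 101, 99, 102, 100, 20, 21, 19]
print(qualifying_groups(areas))
```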

If the number of figures of the same group included in the scanned area is equal to or greater than the predetermined number, the image recognition unit 108 specifies a cell included in the scanned image. Cells have different sizes from group to group: the cell 601 based on the first-group figure 401, the cell 602 based on the second-group figure 402, and the cell 603 based on the third-group figure 403 all have different sizes.

In the embodiment of FIG. 5, the cell 601 based on the first-group figure 401 is the cell formed by the first-group line segments 501. The cell 602 based on the second-group figure 402 is the smallest cell among the cells formed by the second-group line segments 502. The cell 603 based on the third-group figure 403 is the smallest cell among the cells formed by the third-group line segments 503.

The image recognition unit 108 transmits the information matched to the specified cell to the terminal device. The image recognition unit 108 may determine which area of the entire image was scanned based on the arrangement of the figures included in the scanned area. That is, it detects the portion of the database image matching the scanned area by comparing the figures in the scanned area with the figures of the image stored in the database. Finally, the image recognizing unit 108 specifies the cell that includes that portion.

The image recognition unit 108 transmits information matched to the specified cell to the terminal device. Information matched to the cell can be defined in advance by the code generation device. The information matched to each cell can be changed later.

When the figures of each of several different groups appear in the scanned area in numbers equal to or greater than the predetermined number, the image recognizing unit 108 specifies, among the corresponding cells, the cell matched to the group containing the larger figures.

Each group includes figures of a different size, and the size of the matched cells differs for each group. Referring to FIG. 5, there are three groups in total: one group includes the first-group figure 401, another includes the second-group figure 402, and the third includes the third-group figure 403.

The cell 601 matched to the group of the first-group figure 401 is the cell formed by the first-group line segments 501, and the cell 602 matched to the group of the second-group figure 402 is the smallest cell among the cells formed by the second-group line segments 502. The cell 603 matched to the group of the third-group figure 403 is the smallest cell among the cells formed by the third-group line segments 503. The larger the figures in a group, the larger the cell matched to that group.

Referring to FIG. 5, when a terminal device scans a cell 601 formed by the first-group line segments 501, the scanned area includes the first-group figures 401, the second-group figures 402, and the third-group figures 403, each in numbers equal to or greater than the predetermined number. That is, the cell 601 formed by the first-group line segments 501 includes both the cell 602 formed by the second-group line segments 502 and the cell 603 formed by the third-group line segments 503.

In this case, the image recognizer 108 selects the cell 601 formed by the first-group line segments 501, the cell matched to the group containing the largest figures, and transmits the information matched to that cell to the terminal device.

Accordingly, when the terminal device captures an image of the printed object 300 from close range, only relatively small figures appear in the scanned area in numbers equal to or greater than the predetermined number.

When the terminal captures an image of the object 300 from farther away, the figures that appear in the scanned area in numbers equal to or greater than the predetermined number also include relatively large figures. That is, among the groups whose figures reach the predetermined number in the scanned area, the cell matched to the group containing the larger figures is determined as the cell on which information transmission is based.
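The distance-dependent cell choice can be sketched as follows. This is an illustration under assumed names, not the patent's implementation: among all groups whose figure count in the scanned area meets the predetermined number, the group with the largest figures wins, so a far (wide) scan resolves to a large cell and a close (narrow) scan to a small one.

```python
# Sketch: pick the cell matched to the largest-figure group among the
# groups that reach the predetermined figure count in the scanned area.

def select_cell(scanned_groups, predetermined=4):
    """scanned_groups maps a group name to (figure_size, figure_count).
    Return the name of the qualifying group with the largest figure size
    (standing in for its matched cell), or None if no group qualifies."""
    qualifying = [(size, name)
                  for name, (size, count) in scanned_groups.items()
                  if count >= predetermined]
    if not qualifying:
        return None
    return max(qualifying)[1]

# Close scan: only the smallest figures appear in sufficient number.
print(select_cell({"third_group": (5, 6)}))
# Far scan: all three groups qualify; the largest-figure group wins.
print(select_cell({"first_group": (50, 4),
                   "second_group": (20, 9),
                   "third_group": (5, 30)}))
```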

Based on the above principle, the image recognition unit 108 can transmit different information to the terminal device based on the scanned area and its ratio to the entire image.

As a result, the image recognition unit 108 transmits specific information to the terminal device based on the specified background and the specified cell from the scanned image.

FIG. 6 shows a grid pattern 10, grid points 11, and figures 12.

The code generation apparatus 100 includes: a figure size input unit 101 for inputting the size of a figure 12 centered at a lattice point 11 of the lattice pattern 10; a derivation distance input unit 102 for inputting the derivation distance 15, the distance by which the center of the figure 12 can move from the lattice point 11; a grid distance input unit 103 for inputting the grid distance 16, the distance between any one lattice point 11 and the lattice point 11 closest to it; and a code generation unit 104 that generates a grid pattern 10 having the grid distance 16, arranges on it figures 12 whose centers lie within the derivation distance 15 of each grid point 11, and generates and outputs a code.

The figure size input unit 101 receives the size of the figure 12 centered at a lattice point 11 of the lattice pattern 10. The grid pattern 10 is a pattern in which horizontal and vertical lines cross at right angles at equal intervals, as shown in FIG. 6. The lattice points 11 are located at the intersections of the horizontal lines 13 and the vertical lines 14. The figure 12 is an arbitrary figure and its shape is not limited. The center of the figure 12 is located at the lattice point 11, but it may be moved away from the lattice point 11 based on the derivation distance 15 described later. A detailed description of the size of the figure 12 is given later. The center of the figure 12 may mean its center of gravity; the invention is not limited thereto, and it may instead be the center of the largest-radius circle 18 that can be contained in the figure 12.

FIG. 7 shows the horizontal line 13, the vertical line 14, and the derivation distance 15.

The derivation distance input unit 102 receives the derivation distance 15, the distance by which the center of the figure 12 can move from the lattice point 11. Referring to FIG. 7, the center of the figure 12 may be located on or inside the circle 18 indicated by the dash-dotted line, which is determined by the set derivation distance 15. As long as the center of the figure 12 lies on or inside that circle 18, there is no restriction on its position.

According to the derivation distance 15, the center of the figure 12 is moved in an arbitrary direction by the derivation distance 15 or less from the lattice point 11. There is no limitation on the direction of movement as long as the moved distance does not exceed the derivation distance 15.

In another embodiment, the center of the figure 12 is moved by the derivation distance 15 or less from the lattice point 11, with the movement restricted to being along the horizontal line 13 or the vertical line 14.

The grid distance input unit 103 receives the grid distance 16, the distance between any one lattice point 11 and the lattice point 11 closest to it. FIG. 6 shows the grid distance 16, which can equivalently be defined as the distance between any one horizontal line 13 and the horizontal line 13 nearest to it, or between any one vertical line 14 and the vertical line 14 nearest to it.

The code generation unit 104 generates a grid pattern 10 having the grid distance 16, arranges on it figures 12 whose centers lie within the derivation distance 15 of each grid point 11, and generates and outputs a code. The grid distance 16 and the grid pattern 10 have been described above. FIG. 6 shows a code comprising figures 12 arranged on a grid pattern 10 based on the input grid distance 16 and derivation distance 15. The code generation unit 104 outputs the code on a display unit or on an object such as paper.
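The generation step above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the patent's implementation: a grid of lattice points is built with the input grid distance, and each figure center is moved by at most the derivation distance in a random direction. The fixed seed and parameter names are assumptions for reproducibility.

```python
# Sketch: lattice points at multiples of the grid distance, each figure
# center jittered by at most the derivation distance in a random direction.
import math
import random

def generate_code(h_points, v_points, grid_distance, derivation_distance, seed=0):
    """Return a list of figure centers (x, y), one per lattice point."""
    rng = random.Random(seed)
    centers = []
    for i in range(h_points):
        for j in range(v_points):
            angle = rng.uniform(0, 2 * math.pi)       # arbitrary direction
            r = rng.uniform(0, derivation_distance)   # at most the derivation distance
            centers.append((i * grid_distance + r * math.cos(angle),
                            j * grid_distance + r * math.sin(angle)))
    return centers

centers = generate_code(h_points=4, v_points=3, grid_distance=70, derivation_distance=10)
# Every center stays within the derivation distance of its lattice point.
assert all(math.hypot(x - round(x / 70) * 70, y - round(y / 70) * 70) <= 10
           for x, y in centers)
print(len(centers))  # 12 figures for a 4 x 3 grid
```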

The horizontal lines 13, vertical lines 14, lattice points 11, and grid distance 16 in FIG. 6 are shown for the sake of explanation. The code may include only the figures 12.

The code generating apparatus 100 includes a horizontal-line lattice point count input unit 105 for inputting the number of lattice points 11 located on any one horizontal line 13 of the lattice pattern 10. When this number is input, every horizontal line 13 of the lattice pattern 10 includes that number of lattice points 11. The lattice points 11 may be equally spaced at the grid distance 16.

The code generation apparatus 100 includes a vertical-line lattice point count input unit 106 for inputting the number of lattice points 11 located on any one vertical line 14 of the grid pattern 10. When this number is input, every vertical line 14 of the grid pattern 10 includes that number of lattice points 11. The lattice points 11 may be equally spaced at the grid distance 16.

The figure 12 may be a circle 18, in which case the size of the figure 12 is the radius of the circle 18. When the figure 12 is a circle 18 and the radius of the circle 18 is input, the circle 18 with that radius is placed at most the derivation distance 15 away from the lattice point 11.

The figure 12 may be a quadrangle, in which case the size of the figure 12 includes the lengths of its sides. The lengths of two non-parallel sides can be input: if the two non-parallel sides have the same length, the quadrangle is a square; if they differ, it is a rectangle. The quadrangle is not limited to a square or a rectangle and may be any four-sided figure; that is, up to four side lengths can be input.

FIG. 8 shows an arbitrary figure 12 and a circle 18 included in the figure 12.

The lattice distance 16 is at least seven times the radius of the circle 18; more specifically, it is seven to eight times the radius. The figure 12 is not limited to a circle 18 and may be any shape. Here, the circle 18 may mean the figure 12 itself, or it may mean the circle having the largest radius among the circles 18 included in the figure 12.

Referring to FIG. 8, for the circle 18 having the largest radius among the circles 18 that the figure 12 can contain, the lattice distance 16 may be at least seven times the radius of that circle 18.

Referring to FIG. 8, the center of the figure 12 may be the center of the circle 18 having the largest radius included in the figure 12.

The derivation distance 15 is at most 1.15 times the radius of the circle 18. Here again, the circle 18 may mean the figure 12 itself, or the circle having the largest radius among the circles 18 included in the figure 12. Because the derivation distance 15 is at most 1.15 times that radius, the center of the figure 12 remains within 1.15 radii of its lattice point 11.

If the derivation distance 15 is too long, one figure 12 may overlap another, which lowers the accuracy of code recognition. The derivation distance 15 may differ for each figure located at a lattice point, or all figures may be moved by the same derivation distance 15.
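The placement rules above (lattice distance at least seven times the figure radius, each center displaced from its lattice point by at most 1.15 times the radius in an arbitrary direction) can be sketched as follows. The function name, the uniform random sampling, and the row-major ordering are illustrative assumptions, not part of the patent:

```python
import math
import random

def generate_code_layout(h_points, v_points, radius, seed=None):
    """Sketch of the layout rules described in the text:
    - lattice distance is at least 7x the figure radius (lower bound used here),
    - each figure center moves from its lattice point by at most 1.15x the
      radius (the derivation distance), in an arbitrary direction."""
    rng = random.Random(seed)
    lattice_distance = 7 * radius        # lower bound from the text
    derivation_max = 1.15 * radius       # upper bound from the text
    figures = []
    for row in range(v_points):
        for col in range(h_points):
            gx, gy = col * lattice_distance, row * lattice_distance
            angle = rng.uniform(0, 2 * math.pi)
            dist = rng.uniform(0, derivation_max)
            # Each figure is recorded as (center_x, center_y, radius).
            figures.append((gx + dist * math.cos(angle),
                            gy + dist * math.sin(angle),
                            radius))
    return figures
```

With these bounds, no two figures can touch: centers of adjacent figures are at least 7r - 2(1.15r) = 4.7r apart, well above the 2r needed for overlap.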

FIG. 9 shows a code according to one embodiment, and FIG. 10 shows a code according to another embodiment. The code of FIG. 9 is generated in the shape of a bear character based on the input derivation distance 15, the lattice distance 16, the number of lattice points 11 on the horizontal line 13, and the number of lattice points 11 on the vertical line 14. The code of FIG. 10 is likewise generated based on those same inputs.

FIG. 11 shows a code and one or more cells 19. Referring to FIG. 11, there are 49 cells 19, although the number of cells 19 is not limited and may be one or more. The code generation device 100 may match information to each cell 19; different cells 19 may be matched to different information or to the same information. When the code recognition area 201 of the terminal device 200, described later, scans a cell 19, the terminal device 200 receives the information matched to that cell 19 from the code generation device 100.

FIG. 12 shows a code and one or more cells 19 according to another embodiment. A cell 19 can be formed by specifying nine positions in the code and grouping one or more of them. In group distinction 1 of FIG. 12, positions 1, 2, 4, and 5 form one cell 19; positions 3, 6, and 9 form another; and positions 7 and 8 form yet another. The specification of cells 19 is not limited to group distinction 1; cells may be specified in any manner, as in group distinctions 2 to 6.
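The grouping of specified positions into cells can be illustrated with a small sketch. The cell names and the dictionary representation are assumptions for illustration, following group distinction 1 of FIG. 12:

```python
# Hypothetical cell names for group distinction 1 of FIG. 12:
# positions 1, 2, 4, 5 form one cell; 3, 6, 9 another; 7, 8 a third.
GROUP_DISTINCTION_1 = {
    "cell_a": {1, 2, 4, 5},
    "cell_b": {3, 6, 9},
    "cell_c": {7, 8},
}

def cell_for_position(position, grouping):
    """Return the cell containing a scanned position, or None if the
    position is not part of any cell in this grouping."""
    for cell, positions in grouping.items():
        if position in positions:
            return cell
    return None
```

Other group distinctions would simply be other dictionaries of the same shape, so the lookup logic stays unchanged.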

FIG. 13 shows a terminal device 200 that outputs the code recognition area 201. The terminal device 200 can execute the above-described application program and output the code recognition area 201. When the code recognition area 201 scans a specific cell 19 of the code, the code generation device 100 transmits the information corresponding to that cell 19 to the terminal device 200. That is, the code recognized in the code recognition area 201 is transmitted to the code generation device 100 in real time, and the code generation device 100 transmits the information matched to the recognized code back to the terminal device 200.

The code recognition area 201 can scan a plurality of the cells shown in FIG. 12 simultaneously. Assume that the code generation apparatus 100 has matched one piece of information to each cell in the example of FIG. 12. The code generation apparatus 100 then transmits to the terminal device 200 the information matched to the cell that occupies the largest portion of the code recognition area 201 among the scanned cells.

When the code recognition area 201 of the terminal device 200 scans a specific cell 19 of the code for less than 1 second, the code generation device 100 transmits to the terminal device a light emission command to output a green light on the display of the terminal device 200.

When the code recognition area 201 of the terminal device 200 scans a specific cell 19 of the code for 1 second or more but less than 2 seconds, the code generation device 100 transmits to the terminal device a light emission command to output a yellow light on the display of the terminal device 200.

When the code recognition area 201 of the terminal device 200 scans a specific cell 19 of the code for 2 seconds or more but less than 3 seconds, the code generation device 100 transmits to the terminal device a light emission command to output an orange light on the display of the terminal device 200.

When the code recognition area 201 of the terminal device 200 scans a specific cell 19 of the code for 3 seconds or more but less than 4 seconds, the code generation device 100 transmits to the terminal device a light emission command to output a red light on the display of the terminal device 200.

The code generation apparatus 100 transmits the information corresponding to the specific cell 19 to the terminal device 200 separately from the light emission command. By transmitting light emission commands in colors of increasing visibility the longer the terminal device 200 scans a specific cell 19, the code generation apparatus 100 reminds the user that the terminal device 200 has already received the information.
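The escalating light emission scheme can be sketched as a simple mapping from scan duration to color. The band boundaries below assume the first (green) band covers scans shorter than one second, and the behavior beyond four seconds is an assumption:

```python
def light_color_for_scan(duration_s):
    """Map scan duration (seconds) to the display light color described
    in the text.  Bands are half-open: [0,1) green, [1,2) yellow,
    [2,3) orange, [3,4) red."""
    if duration_s < 1:
        return "green"
    if duration_s < 2:
        return "yellow"
    if duration_s < 3:
        return "orange"
    # Assumption: hold the highest-visibility color from 3 s onward.
    return "red"
```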

If the number of figures included in the code scanned by the code recognition area 201 of the terminal device 200 is less than a predetermined number, the code generation device 100 transmits a zoom-out command so that the camera included in the terminal device 200 zooms out. When the number of figures is equal to or greater than the predetermined number, the code generation device 100 can determine which part of the entire code the terminal device 200 is scanning. If the number of figures is less than the predetermined number, the distance between the terminal device 200 and the code must be increased; even if the user does not move the terminal device 200 further from the code, the code generation device 100 transmits the zoom-out command so that the camera zooms out.

More specifically, the user may notice on his own that the number of figures is less than the predetermined number and move the terminal device 200 away from the code. The code generation apparatus 100 therefore transmits the zoom-out command 10 seconds after detecting that the number of figures in the code scanned by the code recognition area 201 of the terminal device 200 is less than the predetermined number. The 10 seconds is the minimum time for the user of the terminal device 200 to determine that the number of figures is less than the predetermined number.

Likewise, if the saturation of the figures included in the code scanned by the code recognition area 201 of the terminal device 200 is less than a reference value, the code generation device 100 transmits a lamp-on command so that the lamp included in the terminal device 200 is turned on. Saturation below the reference means that the area around the code is dark. By transmitting the lamp-on command to turn on the lamp included in the terminal device 200, the code generation apparatus 100 guides the terminal device 200 to increase the saturation of the code being scanned.

More specifically, the user may notice on his own that the figures are not clearly visible and turn on the lamp to adjust the saturation around the code. The code generation apparatus 100 therefore transmits the lamp-on command 10 seconds after detecting that the saturation of the figures in the code scanned by the code recognition area 201 of the terminal device 200 is less than the reference value. The 10 seconds is the minimum time for the user of the terminal device 200 to determine that the saturation of the figures is less than the reference value.
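The two guidance rules, each applied only after the condition has persisted for 10 seconds, can be sketched as follows. The threshold values and the function name are illustrative assumptions:

```python
def guidance_command(figure_count, saturation, seconds_in_state,
                     min_figures=20, min_saturation=50, delay_s=10):
    """Sketch of the guidance rules in the text: after a condition has
    persisted for 10 seconds, send a zoom-out command (too few figures
    scanned) or a lamp-on command (saturation below the reference).
    min_figures and min_saturation are hypothetical thresholds."""
    if seconds_in_state < delay_s:
        return None                  # give the user time to react first
    if figure_count < min_figures:
        return "zoom_out"
    if saturation < min_saturation:
        return "lamp_on"
    return None
```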

FIG. 14 shows the background of the code and the figures 12 of the code. A series of processes in which the code generation apparatus 100 generates a code by disposing the figures 12 on the grid pattern 10 has been described above. The grid pattern 10 is an invisible construct introduced to explain the arrangement of the figures 12 and is not exposed in the actual code. As described above, the figure 12 itself may take various forms, and the background on which the figures 12 are placed is likewise not limited. In FIG. 14 the background is red, but the color is not limited thereto, and a background image may be formed of a plurality of colors.

Apart from the background, the color of the figure 12 may be white, black, or a color similar to the background. The color of the figure 12 is not limited thereto: one figure 12 may include a plurality of colors, and different figures 12 may include different colors.

Even if the color of the figure 12 is white as shown in the figure, its transparency may vary, in the range of 0 to 98%. Besides transparency, the brightness and contrast of the specified color can also be set variously: brightness ranges from 0 to 100%, and contrast ranges from 0 to 100%. Contrast, an attribute used in known editing software, is the difference between light and dark: the higher the contrast, the brighter the light areas and the darker the dark areas.

FIG. 15 illustrates a process in which the code generation apparatus 100 recognizes the background of an image scanned by the terminal device 200. Scanning an image with the terminal device 200 may mean capturing the image through a camera provided in the terminal device 200. The scanned image in the code recognition area 201 output by the terminal device 200 is transmitted to the code generation device 100. The code generation apparatus 100 compares the background included in the received image with the backgrounds stored in its database to specify the background of the received image. In the example of FIG. 15, the code generation apparatus 100 specifies the background included in the image transmitted by the terminal device 200 as the 'rabbit holding a load' background.

The code generation apparatus 100 extracts the contour of the background included in the received image. In FIG. 15 the background is a rabbit, so the code generation device extracts the outline of the rabbit. The code generation apparatus can extract the contour based on color change: a point where the amount of color change is equal to or greater than a reference value is treated as a point on the contour.

The code generation apparatus 100 compares the extracted contour with the contours of the background images stored in the database. If the extracted contour matches 80% or more of the contour length of a database image's background, the code generation apparatus 100 regards that database background as the same as the background of the received image. That is, the code generation apparatus 100 specifies the background of the received image as the background of the matching database image.
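The 80% contour matching step can be illustrated with a crude point-overlap measure. Representing contours as point lists and counting shared points is an assumption standing in for the patent's length-based comparison:

```python
def backgrounds_match(extracted_contour, db_contour, threshold=0.80):
    """Crude stand-in for the 80% contour match described in the text:
    the backgrounds are treated as the same when at least `threshold` of
    the stored contour's points also appear in the extracted contour.
    Contours are lists of (x, y) points; a real system would compare
    contour length along the curve instead."""
    if not db_contour:
        return False
    extracted = set(extracted_contour)
    matched = sum(1 for p in db_contour if p in extracted)
    return matched / len(db_contour) >= threshold
```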

The code generating apparatus 100 recognizes the code included in the image according to a method described later after specifying the background of the received image.

The function performed by the code generating apparatus 100 according to FIG. 15 may be a function performed by the code recognizing unit 108 included in the code generating apparatus 100.

FIG. 16 shows a process in which the code generation apparatus 100 recognizes a code from an image recognized by the terminal device 200. After specifying the background of the image, the code generation device 100 extracts the code included in the image. The image contains a background and a code, and the code consists only of the figures 12 described above.

The code generation apparatus 100 converts the background-specified image to black and white; a specific conversion method is described later with reference to FIG. 17. To extract only the code from the converted image, the code generation device 100 adjusts at least one of saturation, brightness, contrast, color, and sharpness. Saturation can be adjusted in the range of 0 to 401%, brightness in the range of 0 to 100%, and contrast in the range of 0 to 100%. For RGB color, each of red, green, and blue can be adjusted in the range of 0 to 255; for CMYK color, each of cyan, magenta, yellow, and black can be adjusted in the range of 0 to 100. Sharpness can be adjusted in the range of 0 to 100%.

The code generation apparatus 100 adjusts at least one of saturation, brightness, contrast, color, and sharpness until the color value of the figure 12 differs from that of the background by two or more; at that point the adjustment stops and the figure 12 is extracted. Extracting the figures 12 is extracting the code.

If the color is RGB, the adjustment stops when the sum of the red, green, and blue values of the figure 12 differs from the sum of the red, green, and blue values of the background located at the boundary of the figure 12.

If the color is CMYK, the adjustment stops when the sum of the cyan, magenta, yellow, and black values of the figure 12 differs from the sum of the cyan, magenta, yellow, and black values of the background located at the boundary of the figure 12.

Once the adjustment has stopped, the code generation apparatus 100 separates the background and the code in the image and extracts the code. It then transmits the information matched to the cell 19 containing the scanned portion to the terminal device 200.
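The stop condition for the adjustment, a difference of two or more between the channel sums of the figure and of the adjacent background, can be sketched as follows. The brightness-nudging loop is an illustrative assumption about how the adjustment might proceed:

```python
def channel_sum(color):
    """Sum of the channel values of a color tuple (RGB or CMYK alike)."""
    return sum(color)

def separable(figure_color, background_color, min_diff=2):
    """Stop condition from the text: the figure separates from the
    background once their channel sums differ by at least two."""
    return abs(channel_sum(figure_color) - channel_sum(background_color)) >= min_diff

def adjust_until_separable(figure_rgb, background_rgb, step=1, max_iter=255):
    """Illustrative adjustment loop (an assumption, not the patent's
    actual procedure): nudge the figure's brightness until the stop
    condition is met, then return the adjusted color."""
    fig = list(figure_rgb)
    for _ in range(max_iter):
        if separable(fig, background_rgb):
            return tuple(fig)
        fig = [min(255, c + step) for c in fig]
    return tuple(fig)   # give up after max_iter steps
```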

The function performed by the code generating apparatus 100 according to FIG. 16 may be a function performed by the code recognizing unit 108 included in the code generating apparatus 100.

FIG. 17 shows various methods by which the code generation apparatus 100 converts an image to black and white. The code generation apparatus 100 converts a specified image to black and white based on any one of red-based, green-based, and blue-based black-and-white conversion.

Red-based black-and-white conversion converts red-series colors in the original image brightly, that is, toward the white series. Green-based conversion does the same for green-series colors, and blue-based conversion does the same for blue-series colors.
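A channel-based black-and-white conversion of this kind can be sketched with a simple per-pixel rule. The threshold value is an illustrative assumption:

```python
def to_black_and_white(pixels, basis="red", threshold=128):
    """Channel-based black-and-white conversion as described in the text:
    the chosen channel drives the conversion, so colors strong in that
    channel come out bright (white, 255) and others dark (black, 0).
    `pixels` is a flat list of (r, g, b) tuples; the threshold is a
    hypothetical cut-off."""
    channel = {"red": 0, "green": 1, "blue": 2}[basis]
    return [255 if px[channel] >= threshold else 0 for px in pixels]
```

A real implementation might scale the chosen channel continuously rather than thresholding it, but the direction of the mapping is the same.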

FIG. 18 shows, according to an example, information matched to each area of a business card that includes a code generated by the code generation apparatus 100 and is scanned by the terminal device 200. Referring to FIG. 18, when the terminal device 200 scans the logo, it receives the company's website address from the code generation device 100 and displays that website. When the terminal device 200 scans the portion where the name and rank are printed, it receives an address-book storage command from the code generation device 100 and stores the name and rank in its address book. When the terminal device 200 scans the portion where the telephone number is printed, it receives a calling instruction from the code generation device 100 and calls that number.
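The per-area matching of FIG. 18 amounts to a lookup from scanned cell to command. The cell keys, action names, and URL below are hypothetical placeholders:

```python
# Hypothetical commands matched to business-card cells, as in FIG. 18.
CELL_COMMANDS = {
    "logo": {"action": "open_url", "target": "https://example.com"},
    "name_rank": {"action": "save_contact"},
    "phone": {"action": "dial"},
}

def command_for_cell(cell):
    """Return the command matched to a scanned cell, or None if the
    scanned area carries no matched information."""
    return CELL_COMMANDS.get(cell)
```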

Those skilled in the art will appreciate that the present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The above-described embodiments are therefore illustrative and not restrictive of the scope of the invention. The flowcharts shown in the figures are merely sequential steps illustrated to achieve the most desirable results in practicing the invention; other steps may be added or some steps may be deleted.

The technical features and implementations described herein may be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures described herein and their structural equivalents. Implementations of the technical features described herein may also be realized as computer program products, that is, modules of computer program instructions encoded on a program storage medium for execution by, or for controlling the operation of, a processing system.

The computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter affecting a machine-readable propagated signal, or a combination of one or more of these.

In this specification, the terms "apparatus" and "system" encompass all apparatuses, devices, and machines for processing information, including, for example, a processor, a computer, or multiple processors or computers. In addition to hardware, the processing system may include all of the code that forms an execution environment for a computer program, such as processor firmware, a protocol stack, an information base management system, and an operating system.

A computer program (also known as a program, software, software application, script, or code) may be written in any form of programming language, including compiled, interpreted, declarative, or procedural languages, and may be deployed as a stand-alone program, module, component, subroutine, or other unit suitable for use in a computing environment.

A computer program does not necessarily correspond to a file in a file system; it may be stored in a portion of a file holding other programs or information (for example, one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (for example, files storing one or more modules, sub-programs, or portions of code).

A computer program may be deployed to run on one computer, or on multiple computers located at one site or distributed across multiple sites and interconnected by a wired or wireless communication network.

Computer-readable media suitable for storing computer program instructions and information include all forms of non-volatile memory, media, and memory devices, including, for example, semiconductor memory devices such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks or removable disks; and CD and DVD discs. The processor and memory may be supplemented by, or incorporated in, special-purpose logic circuitry.

Implementations of the technical features described herein may include a back-end component such as an information server, a middleware component such as an application server, a front-end component such as a client computer having a graphical user interface, or any combination of one or more such back-end, middleware, or front-end components. The components of the system may be interconnected by any form or medium of digital information communication, for example a communication network.

Hereinafter, more specific embodiments capable of implementing the configurations, including the system described in this specification, will be described in detail.

The methods herein may be implemented, in part or in whole, by executing computer software, program code, or instructions on one or more processors included in a server, in a server associated with a client device, or in a web-based storage system. The processor may be part of a computing platform such as a server, a client, a network infrastructure, a mobile computing platform, or a fixed computing platform, and may specifically be any type of computer or processing device capable of executing program instructions or code. The processor may further include memory for storing the methods, instructions, code, and programs, which may also be stored in storage devices such as a CD-ROM, DVD, memory, hard disk, flash drive, RAM, ROM, or cache.

The systems and methods described herein may also be implemented, in part or in whole, through a server, client, gateway, hub, router, or other apparatus executing computer software on network hardware. The software may be executed on various types of servers, such as file servers, print servers, domain servers, Internet servers, intranet servers, host servers, and distributed servers, and these servers may exchange instructions and code with storage media, communication devices, ports, clients, and other servers via wired or wireless networks.

The methods, instructions, and code according to the present invention may likewise be executed by the server, and other devices needed to implement the methods may be implemented as part of a hierarchical structure associated with the server.

In addition, the server can provide an interface to other devices, including, without limitation, clients, other servers, printers, information base servers, print servers, file servers, communication servers, and distributed servers, and this connection can facilitate remote execution of programs over the network.

The central processor of the server may include instructions and code to be executed on other devices, and may further include at least one storage device in which the methods, instructions, and code can be stored.

The methods herein may also be implemented, in part or in whole, through a network infrastructure. The network infrastructure may include devices such as computing devices, servers, routers, hubs, firewalls, clients, personal computers, communication devices, and routing devices, together with separate modules capable of performing each function, and may further include storage media such as flash memory, buffers, stacks, RAM, and ROM. Methods, instructions, and code may be executed and stored by any of the devices, modules, and storage media included in the network infrastructure, and other devices required to implement the methods may also be implemented as part of the network infrastructure.

The systems and methods described herein may be implemented in hardware, or in a combination of hardware and software suitable for a particular application. The hardware includes general-purpose computing devices, such as personal computers and mobile communication terminals, as well as enterprise-specific computing devices, which may be implemented with memory, a microprocessor, a microcontroller, a digital signal processor, an application-specific integrated circuit, a programmable gate array, programmable logic, or the like, or a combination thereof.

Computer software, instructions, and code as described above may be stored in or accessed from readable media, such as: computer components holding digital information used for computation for a period of time, such as RAM; semipermanent storage such as ROM; mass storage such as hard disks, tapes, and drums; optical storage such as CDs and DVDs; flash memory, floppy disks, magnetic tape, and paper tape; and dynamic memory, static memory, variable storage, and network-attached storage such as the cloud. The instructions and code may be written in information-oriented languages such as SQL and dBase; system languages such as C, Objective-C, C++, and assembly; architectural languages such as Java and .NET; and application languages such as PHP, Ruby, Perl, and Python, but are not so limited and may include any language well known to those skilled in the art.

"Computer-readable media" as used herein includes all media that participate in providing instructions to a processor for program execution, including, but not limited to, non-volatile media such as optical disks and magnetic disks, volatile media such as dynamic memory, and transmission media such as coaxial cables, copper wires, and optical fibers.

The configurations implementing the technical features of the present invention that are included in the block diagrams and flowcharts shown in the accompanying drawings refer to logical boundaries between those configurations.

However, depending on the software or hardware embodiment, the depicted configurations and their functions may be implemented as stand-alone software modules, monolithic software structures, code, services, or combinations thereof, and may be stored on media executable by a computer having a processor capable of executing stored program code. All such embodiments should be regarded as within the scope of the present invention, since their functions can be so implemented.

Accordingly, the appended drawings and their descriptions illustrate the technical features of the present invention, but a specific arrangement of software for implementing those features should not be inferred unless explicitly mentioned. That is, the various embodiments described above may exist, and some embodiments may be modified while retaining the same technical features as the present invention; these should also be considered within the scope of the present invention.

Although the flowcharts depict operations in a particular order, this is shown for the sake of obtaining the most desirable results, and it should not be understood that the operations must necessarily be performed in the specific order shown or in sequential order. In certain cases, multitasking and parallel processing may be advantageous. In addition, the separation of the various system components in the above-described embodiments should not be understood as being required in all embodiments; the described program components and systems may generally be integrated into a single software product or packaged into multiple software products.

As such, the specification is not intended to limit the invention to the precise form disclosed. While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, many alternatives, modifications, and variations will be apparent to those skilled in the art without departing from the spirit and scope of the present invention as defined by the appended claims.

The scope of the present invention is defined by the appended claims rather than the foregoing description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents are deemed to be included in the scope of the present invention.

10: grid pattern
11: grid point
12: Figure
13: Horizontal line
14: Vertical line
15: Derivation distance
16: Grid distance
18: Circle
19: cell
100: Code generating device
101: Shape size input unit
102: Derivation distance input unit
103: Grid distance input unit
104: Code generation unit
105: Horizontal-line grid-point number input unit
106: Vertical grid point number input unit
107: image receiver
108: image recognition unit
200: terminal device
201: code recognition area
202: Magnifying glass
300: object
401: First arch type
402: Second arch type
403: Third arch type
501: First group line segment
502: Second group line segment
503: Third group line segment
601: Cell based on the first isoform
602: cell based on the second isoform
603: cell based on the third isoform

Claims (7)

A code generation device that wirelessly communicates with a terminal device scanning a first image on which a code is printed, comprising:
A figure size input unit receiving a size of a figure centered at a lattice point included in the lattice pattern;
A derivation distance input unit receiving a derivation distance, which is the distance by which the center of the figure can move from a lattice point;
A lattice distance input unit which receives a lattice distance which is a distance between any one lattice point and a lattice point closest to any one lattice point; And
A code generation unit that generates a code by arranging, on a grid pattern having the lattice distance, figures of arbitrary shape whose centers are moved from each lattice point by the derivation distance or less in an arbitrary direction, and that outputs the code of the arranged figures to the display unit of an electronic device or onto the surface of an object;
An image receiving unit for receiving a second image obtained by scanning a first image from the terminal device; And
And an image recognition unit that identifies different cells according to the position and the ratio of the area occupied by the second image within the entire area of the first image, and transmits the information matched to the cell to the terminal device,
The code composed of the figure is divided into one or more cells, information is matched to each cell,
Characterized in that the cell is formed by specifying one or more positions in the code and grouping one or more positions of the specified positions,
The graphic form is characterized in that at least one of size, color, transparency, contrast, and contrast is variable,
The image recognizing unit,
The background included in the received second image is compared with backgrounds stored in a database, and the background of the received second image is specified by treating it as the same as the background of any one of the plurality of images stored in the database,
Code generation device.

delete

The method according to claim 1,
The image recognizing unit,
Wherein, when the extracted contour is compared with the contours of the backgrounds of the images stored in the database and matches a predetermined ratio or more of the contour length of the background of an image in the database, the background of the second image is regarded as the same as the background of that one of the plurality of images stored in the database,
Code generation device.
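The contour-ratio test of this claim can be sketched as follows. The 0.8 threshold, the dictionary layout, and the use of raw contour lengths are assumptions for illustration; a real system would extract contours from pixels (e.g., with an edge detector) before comparing lengths.

```python
def match_background(extracted_contour_len, db_backgrounds, min_ratio=0.8):
    """Treat the second image's background as identical to a stored
    background when the extracted contour covers at least min_ratio
    of that background's contour length. Returns the matching
    background's id, or None if no stored background qualifies."""
    for bg_id, contour_len in db_backgrounds.items():
        if contour_len > 0 and extracted_contour_len / contour_len >= min_ratio:
            return bg_id  # background specified as "the same"
    return None  # no stored background matched
```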


The code generation device according to claim 3,
wherein the image recognition unit
extracts the code composed of the arranged figures from the second image having the background,
Code generation device.
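Separating the code's figures from the specified background can be sketched as a simple foreground test: any pixel that differs from the known background value is taken to belong to a figure. A real recognizer would run blob or contour detection on these pixels; this toy version, with its single background value, is purely illustrative.

```python
def extract_figures(pixels, background_value):
    """Return (x, y) coordinates of pixels that differ from the
    specified background value, i.e., candidate figure pixels."""
    return [(x, y)
            for y, row in enumerate(pixels)
            for x, v in enumerate(row)
            if v != background_value]
```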


The code generation device according to claim 4,
wherein the image recognition unit,
if the number of figures belonging to a same group among the figures included in the extracted code is equal to or greater than a predetermined number, specifies a cell included in the received second image and acquires the information matched to that cell,
wherein a group consists of figures whose differences in area fall within a predetermined range,
Code generation device.
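Grouping figures "whose differences in area fall within a predetermined range" can be sketched with a greedy pass: each figure joins the first group whose representative area is within the tolerance, otherwise it starts a new group. The greedy strategy and the use of the first group member as the representative are assumptions, not taken from the patent.

```python
def group_by_area(areas, tolerance):
    """Partition figure areas into groups whose members differ from
    the group's first (smallest) area by at most `tolerance`."""
    groups = []
    for a in sorted(areas):
        for g in groups:
            if abs(g[0] - a) <= tolerance:
                g.append(a)
                break
        else:
            groups.append([a])  # no existing group fits; start one
    return groups
```

The claimed recognition step then only acts on a group once its member count reaches the predetermined number, which filters out isolated noise detections.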


The code generation device according to claim 5,
wherein the image recognition unit
transmits the information matched to the specified cell to the terminal device,
Code generation device.
The code generation device according to claim 5,
wherein the image recognition unit,
when the number of figures in each of different groups among the figures included in the extracted code is equal to or greater than the predetermined number,
specifies the cell matched to the group containing the figure having the largest size,
Code generation device.
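The tie-breaking rule of this claim, i.e., when several groups each pass the count threshold, prefer the group containing the single largest figure, can be sketched as below. The group/cell data layout and all names are illustrative assumptions.

```python
def select_cell(groups, cell_of_group, min_count):
    """Among groups with at least min_count figures, pick the group
    containing the largest figure and return its matched cell, or
    None when no group passes the threshold."""
    eligible = [g for g in groups if len(g["areas"]) >= min_count]
    if not eligible:
        return None
    winner = max(eligible, key=lambda g: max(g["areas"]))
    return cell_of_group[winner["id"]]
```

Preferring the largest figure plausibly corresponds to the group scanned from the shortest distance, which is consistent with the system's "scan distance" premise.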



KR1020170179819A 2017-12-26 2017-12-26 Code generation and recognition system based on scan distance KR101862649B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020170179819A KR101862649B1 (en) 2017-12-26 2017-12-26 Code generation and recognition system based on scan distance


Publications (1)

Publication Number Publication Date
KR101862649B1 (en) 2018-06-29

Family

ID=62781118

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020170179819A KR101862649B1 (en) 2017-12-26 2017-12-26 Code generation and recognition system based on scan distance

Country Status (1)

Country Link
KR (1) KR101862649B1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070044149A (en) * 2005-10-24 2007-04-27 주식회사 케이티프리텔 Method and device for decoding the code having code information according to image
JP2010182348A (en) * 2009-02-03 2010-08-19 Toppan Printing Co Ltd Information recording medium
KR20100106475A (en) * 2007-12-12 2010-10-01 켄지 요시다 Information input device, information processing device, information input system, information processing system, two-dimensional format information server, information input method, control program, and recording medium


Similar Documents

Publication Publication Date Title
US10186023B2 (en) Unified multi-image fusion approach
US11205105B1 (en) Machine-readable label generator
EP3155593B1 (en) Method and device for color processing of digital images
US11620794B2 (en) Determining visually reflective properties of physical surfaces in a mixed reality environment
JP2019505872A (en) Method and apparatus for generating a two-dimensional code image having a dynamic effect
US10121271B2 (en) Image processing apparatus and image processing method
CN107960150A (en) Image processing apparatus and method
CN110008943A (en) A kind of image processing method and device, a kind of calculating equipment and storage medium
JP2016142988A (en) Display device and display control program
US20210056275A1 (en) Qr code generation method and apparatus for terminal device
KR101862649B1 (en) Code generation and recognition system based on scan distance
EP3165018A2 (en) System and method for quantifying reflection e.g. when analyzing laminated documents
KR101862650B1 (en) Code generation and recognition system for security and activation
US9076263B2 (en) Image processing apparatus, picture style conversion method and storage medium
JP2017060090A (en) Image data generation device, printer and image data generation program
JPWO2016063392A1 (en) Projection apparatus and image processing program
CN112927321B (en) Intelligent image design method, device, equipment and storage medium based on neural network
RU2571510C2 (en) Method and apparatus using image magnification to suppress visible defects on image
KR102062982B1 (en) Method for video synthesis
JP2011044059A (en) Moving image generation method and apparatus
KR20210112345A (en) Multi-area image scanning
JP2015087726A (en) Image projector
US11893433B1 (en) Leveraging a uniform resource locator (URL) to produce a corresponding machine-readable label
JP6544004B2 (en) Color sample creating apparatus and color sample creating method, and image processing system using color sample
WO2023044731A1 (en) Image processing method and apparatus, and electronic device and storage medium

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant