CN108830361B - Invisible code identification method, terminal and server - Google Patents

Invisible code identification method, terminal and server

Info

Publication number
CN108830361B
Authority
CN
China
Prior art keywords
code
target object
area
invisible
mark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201710310668.XA
Other languages
Chinese (zh)
Other versions
CN108830361A (en)
Inventor
陈政安
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Jiuzhou Media Technology Co ltd
Original Assignee
Shenzhen Jiuzhou Media Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Jiuzhou Media Technology Co ltd filed Critical Shenzhen Jiuzhou Media Technology Co ltd
Priority to CN201710310668.XA priority Critical patent/CN108830361B/en
Publication of CN108830361A publication Critical patent/CN108830361A/en
Application granted granted Critical
Publication of CN108830361B publication Critical patent/CN108830361B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K19/00: Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K19/06: characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K19/06009: with optically detectable marking
    • G06K19/06046: Constructional details
    • G06K19/06178: the marking having a feature size smaller than can be seen by the unaided human eye
    • G06K19/067: Record carriers with conductive marks, printed circuits or semiconductor circuit elements, e.g. credit or identity cards also with resonating or responding marks without active components
    • G06K19/07: with integrated circuit chips
    • G06K19/0723: the record carrier comprising an arrangement for non-contact communication, e.g. wireless communication circuits on transponder cards, non-contact smart cards or RFIDs
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/018: Certifying business or products
    • G06Q30/0185: Product, service or business identity fraud

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Finance (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The embodiments of the invention provide an invisible code identification method, a terminal, and a server. The method comprises the following steps: controlling a laser emitting device to emit laser toward a target area of a target object so as to form a light spot on the target area; controlling a camera device to photograph the target object containing the light spot and acquire a target object image containing the light spot; identifying the position of the light spot in the target object image and the mark code in the target object image; uploading the mark code in the target object image to a server, so that the server returns the corresponding area code distribution file according to the mark code, the area code distribution file comprising the invisible code distribution strategy of the target object; and identifying the invisible code value in the target area according to the position of the light spot in the target object image and the area code distribution file. The embodiments of the invention can identify the invisible code value in a target area on a target object in a non-contact manner, which facilitates the expansion of application scenarios for invisible identification codes.

Description

Invisible code identification method, terminal and server
Technical Field
The invention belongs to the technical field of invisible code identification, and in particular relates to an invisible code identification method, a terminal, and a server.
Background
In recent years, invisible identification codes have been widely applied in the field of product anti-counterfeiting to protect consumers' rights and interests: when purchasing a product, a consumer can use the invisible code on the product package to distinguish counterfeit and shoddy goods. At present, invisible codes are generally recognized with a touch-and-talk pen equipped with an optical invisible-code reader. The pen tip must be in contact with, or held just above, the paper surface on which the invisible code is printed; once the pen tip leaves the paper surface, recognition of the invisible code fails. This restriction hinders the expansion of application scenarios for invisible codes.
Disclosure of Invention
The embodiments of the invention aim to provide an invisible code identification method, a terminal, and a server that can identify the invisible code of an invisible area on a target object in a non-contact manner, facilitating the expansion of application scenarios for invisible codes.
An embodiment of the invention is realized as an invisible code identification method comprising the following steps:
controlling a laser emitting device to emit laser to a target area of a target object, and forming a light spot on the target area;
controlling a camera device to shoot a target object containing light spots, and acquiring a target object image containing the light spots;
identifying the position of the light spot in the target object image and the mark code in the target object image;
uploading the mark codes in the target object images to a server, and enabling the server to return corresponding area code distribution files according to the mark codes, wherein the area code distribution files comprise invisible code distribution strategies of the target objects;
and identifying the invisible code value in the target area according to the position of the light spot in the target object image and the area code distribution file.
In another aspect, an embodiment of the present invention further provides an invisible code identification method, comprising:
receiving a mark code of a target object sent by a terminal;
querying an area code distribution file corresponding to the mark code from a database according to the mark code, where the area code distribution file includes the invisible code distribution strategy of the target object;
and sending the area code distribution file to the terminal, so that the terminal identifies the invisible code value of the target area on the target object according to the area code distribution file.
Another aspect of the embodiments of the present invention further provides a terminal, where the terminal includes:
the first control unit is used for controlling the laser emitting device to emit laser to a target area of a target object, and forming light spots on the target area;
the second control unit is used for controlling the camera device to shoot the target object containing the light spots and acquiring the target object image containing the light spots;
the light spot identification unit is used for identifying the position of the light spot in the target object image;
a marker code recognition unit for recognizing a marker code in the target object image;
the network communication unit is used for uploading the mark code in the target object image to a server, so that the server returns the corresponding area code distribution file according to the mark code, and the area code distribution file comprises the invisible code distribution strategy of the target object;
and the invisible code identification unit is used for identifying the invisible code value in the target area according to the position of the light spot in the target object image and the area code distribution file.
Another aspect of the embodiments of the present invention further provides a server, where the server includes:
the receiving unit is used for receiving the mark code of the target object sent by the terminal;
the region code distribution file query unit is used for querying the region code distribution file corresponding to the mark code from a database according to the mark code, and the region code distribution file comprises the invisible code distribution strategy of the target object;
and the sending unit is used for sending the area code distribution file to the terminal so that the terminal can identify the invisible code value of the target area on the target object according to the area code distribution file.
According to the embodiments of the invention, a mark code is set on the target object, and the mark code and the area code distribution file of the target object are stored on the server. The laser emitting device projects a light spot onto the target area of the target object, and the position of the light spot on the target object is identified; at the same time, the camera device captures the mark code on the target object, and the pre-stored area code distribution file of the target object is obtained from the server according to that mark code. Finally, the invisible code value in the target area is identified from the position of the light spot on the target object and the area code distribution file, so that the invisible code value in the target area on the target object can be identified in a non-contact manner, which facilitates the expansion of application scenarios for invisible codes.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic flowchart of an invisible code identification method according to the first embodiment of the present invention;
FIG. 2 is a schematic representation of an encoded representation of a marker code in a particular application;
FIG. 3 is a schematic illustration of the invisible area division strategy of a target object in a specific application;
fig. 4 is a schematic flowchart of an invisible code identification method according to the second embodiment of the present invention;
fig. 5 is a schematic block diagram of a terminal according to a third embodiment of the present invention;
fig. 6 is a schematic block diagram of a server according to a fourth embodiment of the present invention;
fig. 7 is a schematic block diagram of a server according to a fifth embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In particular implementations, the terminals described in embodiments of the invention include, but are not limited to, other portable devices such as mobile phones, laptop computers, or tablet computers having touch sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments, the device is not a portable communication device, but is a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or touchpad).
In the discussion that follows, a terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
Fig. 1 is a schematic flowchart of an invisible code identification method according to the first embodiment of the present invention; this method is executed by the terminal of the embodiment of the present invention. Referring to fig. 1, the method comprises the following steps:
step S101, controlling a laser emitting device to emit laser to a target area of a target object, and forming a light spot on the target area.
In this embodiment, step S101 specifically includes:
and controlling a laser emitting device to emit laser to a target area of a target object, and forming a circular light spot with the area within a preset range on the target area.
The laser emitting device may be disposed on the terminal. The laser it emits toward the target area of the target object includes, but is not limited to, red laser; the shape of the spot formed on the target area includes, but is not limited to, a circle; and the spot area should lie within a preset range.
The maximum of the preset range of the spot area can be set by the user according to the size of the target object or the invisible area division strategy: the spot area should be smaller than the area of any single invisible area on the target object. The minimum of the preset range depends on the image recognition accuracy of the terminal: the spot area should be greater than or equal to the minimum image area the terminal can recognize.
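The preset-range check described above can be sketched as follows. The patent gives no code, so the function and parameter names are illustrative assumptions, not the actual implementation:

```python
def spot_area_valid(spot_area, min_detectable_area, smallest_region_area):
    """Return True if a laser-spot area lies in the usable preset range.

    Illustrative sketch: the spot must be at least as large as the
    smallest image area the terminal can resolve, and smaller than the
    smallest invisible area on the target object.
    """
    return min_detectable_area <= spot_area < smallest_region_area
```

A terminal could run this check before attempting spot recognition and re-focus the laser if it fails.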
And step S102, controlling the camera device to shoot the target object containing the light spots, and acquiring the target object image containing the light spots.
In this embodiment, the camera device includes, but is not limited to, a high-speed, high-resolution imaging component, which may be disposed on the terminal. While the terminal forms a light spot on the target area of a target object, the camera device is controlled to photograph the whole target object containing the light spot and thereby obtain a target object image containing the light spot. Multiple images may be captured, which facilitates subsequent accurate recognition of the spot position on the target object by comparing the several captured images.
Step S103, identifying the position of the light spot in the target object image and the mark code in the target object image.
In this embodiment, the identifying the position of the light spot in the target object image specifically includes:
identifying light spots in the target object image;
and calculating the position coordinate of the center of the light spot relative to the lower left corner of the target object according to the identified light spot.
And the position coordinate of the center of the light spot relative to the lower left corner of the target object is the position of the light spot in the target object image.
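The position computation described above can be sketched as a centroid calculation; this is a simplified illustration (not the patent's implementation), assuming the spot pixels have already been segmented and that image rows are indexed from the top:

```python
def spot_center_lower_left(spot_pixels, image_height):
    """Compute the spot centre in coordinates whose origin is the
    lower-left corner of the target object image.

    spot_pixels:  iterable of (row, col) pixels classified as the spot
    image_height: image height in pixels, used to flip the row axis
    """
    rows, cols = zip(*spot_pixels)
    center_row = sum(rows) / len(rows)   # centroid in image coordinates
    center_col = sum(cols) / len(cols)
    # x grows rightward, y grows upward from the bottom edge
    return (center_col, image_height - 1 - center_row)
```

In practice the segmentation step would come from thresholding on the laser's colour and brightness, which this sketch leaves out.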
In this embodiment, the mark codes are preset as visible identification codes on the target object, different mark codes are arranged on different target objects, and different mark codes correspond to different mark code values. The identifying of the mark code in the target object image specifically includes:
identifying the code of the mark code in the target object image;
and calculating the mark code value of the target object according to the code of the mark code.
Referring to fig. 2, in one specific application the mark code occupies the four corner positions of the target object image, and identifying the mark code in the target object image comprises: starting from the three bits at the lower left corner of the target object image, the code is extracted by rotating and reading in the clockwise direction, as shown in fig. 2. (The dot-pattern figure referenced here is not reproduced.)
where "○" represents 0 and "●" represents 1.
The extracted code is then parsed and the mark code value corresponding to the mark code is calculated; in this example the three bytes of the mark code value are "0x40", "0xA0", and "0xCC". It should be noted that the encoding shown in fig. 2 is only a preferred example and is not intended to limit the invention; other representations may be used in other embodiments, as long as the "0"/"1" states and the area orientation can be expressed.
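The byte extraction in this example can be sketched as follows, reading the dots clockwise into a 0/1 sequence and packing them most-significant-bit first. The packing order is an assumption: the patent fixes only the 0/1 dot convention and the clockwise reading direction:

```python
def decode_mark_code(bits):
    """Pack a clockwise-read dot sequence ("○" -> 0, "●" -> 1) into
    mark code byte values, 8 bits per byte, most significant bit first.
    Illustrative sketch only; the real bit layout may differ.
    """
    if len(bits) % 8 != 0:
        raise ValueError("expected a whole number of bytes")
    return [
        int("".join(str(b) for b in bits[i:i + 8]), 2)
        for i in range(0, len(bits), 8)
    ]
```

With an MSB-first reading of the example's three bytes, the sequence 01000000 10100000 11001100 decodes to 0x40, 0xA0, 0xCC.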
And step S104, uploading the mark codes in the target object images to a server, and enabling the server to return corresponding area code distribution files according to the mark codes.
In this embodiment, uploading the mark code in the target object image to the server specifically means uploading the identified mark code value to the server. The server's database stores in advance the association between mark code values and the area code distribution files of target objects; after receiving the mark code value of the target object sent by the terminal, the server queries the corresponding area code distribution file from the database and returns it to the terminal.
And S105, recognizing the invisible code value in the target area according to the position of the light spot in the target object image and the area code distribution file.
In this embodiment, the area code distribution file of the target object includes the invisible code distribution strategy of the target object, which in turn includes the invisible area division strategy of the target object and the invisible code value corresponding to each invisible area.
In this embodiment, the identifying the invisible code value in the target area according to the position of the light spot in the target object image and the area code distribution file includes:
identifying which invisible area of the target object the light spot is located in according to the position of the light spot in the target object image and an invisible area division strategy of the target object;
and inquiring an invisible code value corresponding to the invisible area according to the invisible area of the identified light spot on the target object.
In this embodiment, the invisible area division strategy of the target object specifies into how many invisible areas the target object is divided and the area range of each. Fig. 3 shows the invisible area division strategy of a target object in a specific application: the target object is divided into 6 invisible areas, each corresponding to one invisible code value.
In this embodiment, the terminal can determine which invisible area of the target object the light spot lies in according to the center position coordinate of the spot and the invisible area division strategy of the target object. Specifically, as shown in fig. 3, if the center position coordinate of the spot is identified as lying in invisible area 2 of the target object, the spot is located in invisible area 2; the invisible code value corresponding to invisible area 2 is then queried from the area code distribution file, and that value is the invisible code value of the target area on the target object.
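The spot-to-area lookup above can be sketched as follows; the rectangle-based table is a hypothetical stand-in for the area code distribution file, whose actual format the patent does not specify:

```python
def lookup_invisible_code(spot_xy, region_table):
    """Return the invisible code value of the invisible area containing
    the spot centre, or None if the spot falls outside every area.

    region_table: list of ((x_min, y_min, x_max, y_max), code_value)
    entries, an illustrative representation of the division strategy.
    """
    x, y = spot_xy
    for (x_min, y_min, x_max, y_max), code_value in region_table:
        if x_min <= x < x_max and y_min <= y < y_max:
            return code_value
    return None
```

Non-rectangular areas would need a richer geometry test, but the same table-driven lookup applies.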
In this embodiment, after the terminal identifies the invisible code value of the target area, it may use that value to perform various interactive control operations; for example, identifying the invisible code value may trigger playback of content pre-stored on the terminal that matches the value.
As can be seen from the above, the invisible code identification method provided by this embodiment sets a mark code on the target object and stores the mark code and the area code distribution file of the target object on the server. The laser emitting device projects a light spot onto the target area of the target object, and the position of the light spot on the target object is identified; at the same time, the camera device captures the mark code on the target object, and the pre-stored area code distribution file of the target object is obtained from the server according to that mark code. Finally, the invisible code value in the target area is identified from the position of the light spot on the target object and the area code distribution file. The invisible code value in the target area on the target object can therefore be identified in a non-contact manner, which facilitates the expansion of application scenarios for invisible identification codes.
Fig. 4 is a schematic flowchart of an invisible code identification method according to the second embodiment of the present invention; this method is executed by the server of the embodiment of the present invention. Referring to fig. 4, the method may include the following steps:
step S401, receiving the mark code of the target object sent by the terminal.
In this embodiment, receiving the mark code of the target object sent by the terminal specifically means receiving the mark code value, identified by the terminal, that corresponds to the mark code on the target object.
Step S402, querying an area code distribution file corresponding to the mark code from a database according to the mark code, where the area code distribution file includes the invisible code distribution strategy of the target object.
In this embodiment, the database of the server stores in advance the area code distribution file of each target object and the association between the mark code value of a target object and its area code distribution file; each target object corresponds to a unique mark code value and a unique area code distribution file.
In this embodiment, after receiving the mark code value of the target object sent by the terminal, the server can find the area code distribution file corresponding to that mark code value in the database and return it to the terminal over the network.
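The server-side query can be sketched as a plain key-value lookup; the dict here stands in for the server's real database, and all names are illustrative:

```python
def query_distribution_file(mark_code_value, database):
    """Look up the area code distribution file for a mark code value.

    `database` is a dict used as a stand-in for the server's database;
    an unknown mark code value raises KeyError.
    """
    record = database.get(mark_code_value)
    if record is None:
        raise KeyError(f"unknown mark code value: {mark_code_value!r}")
    return record
```

A production server would back this with persistent storage and network serialization, which the sketch omits.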
And S403, sending the area code distribution file to the terminal, so that the terminal can identify the invisible code value of the target area on the target object according to the area code distribution file.
In this embodiment, the area code distribution file of the target object includes the invisible code distribution strategy of the target object, which in turn includes the invisible area division strategy of the target object and the invisible code value corresponding to each invisible area.
In this embodiment, after the terminal acquires the area code distribution file of the target object returned by the server, the invisible code value of the target area on the target object can be obtained from that file and from the position of the light spot on the target object detected by the terminal.
As can be seen from the above, the invisible code identification method provided by this embodiment can also achieve the purpose of identifying the invisible code value in the target area on the target object in a non-contact manner, and is beneficial to the expansion of the application scene of the invisible identification code.
Fig. 5 is a schematic block diagram of a terminal according to a third embodiment of the present invention. For convenience of explanation, only the portions related to the present embodiment are shown.
Referring to fig. 5, the present embodiment provides a terminal 100, including:
a first control unit 13 for controlling the laser emitting device 11 to emit laser to a target area of a target object, forming a spot on the target area;
the second control unit 14 is configured to control the camera 12 to shoot a target object containing light spots, and acquire a target object image containing light spots;
a light spot identification unit 15, configured to identify a position of the light spot in the target object image;
a marker code recognition unit 16, configured to recognize the marker code in the target object image;
the network communication unit 17 is configured to upload the mark code in the target object image to a server, so that the server returns the corresponding area code distribution file according to the mark code, where the area code distribution file includes the invisible code distribution strategy of the target object;
and the invisible code identification unit 18 is used for identifying the invisible code value in the target area according to the position of the light spot in the target object image and the area code distribution file.
Optionally, the invisible code distribution strategy of the target object includes the invisible area division strategy of the target object and the invisible code value corresponding to each invisible area; the invisible code identification unit 18 includes:
an invisible area identification unit 181, configured to identify which invisible area on the target object the light spot is located in according to the position of the light spot in the target object image and the invisible area division strategy of the target object;
and the invisible code inquiring unit 182 is configured to inquire an invisible code value corresponding to the invisible area according to the invisible area of the identified light spot on the target object.
Optionally, the first control unit 13 is specifically configured to:
and controlling a laser emitting device to emit laser to a target area of a target object, and forming a circular light spot with the area within a preset range on the target area.
It should be noted that, since each unit in the terminal provided in the embodiment of the present invention is based on the same concept as the method embodiment shown in fig. 1 of the present invention, the technical effect brought by the unit is the same as the method embodiment shown in fig. 1 of the present invention, and specific contents may refer to the description in the method embodiment shown in fig. 1 of the present invention, and are not described again here.
It can thus be seen that the terminal provided by this embodiment can also identify the invisible code value in the target area of the target object in a non-contact manner, which facilitates expanding the application scenarios of invisible identification codes.
Fig. 6 is a schematic block diagram of a server according to a fourth embodiment of the present invention. For convenience of explanation, only the portions related to the present embodiment are shown.
Referring to fig. 6, the present embodiment provides a server 2 including:
a receiving unit 21, configured to receive a mark code of a target object sent by a terminal;
the area code distribution file query unit 22, configured to query, according to the mark code, the area code distribution file corresponding to the mark code from a database, where the area code distribution file includes an invisible code distribution strategy of the target object;
and the sending unit 23, configured to send the area code distribution file to the terminal, so that the terminal identifies the invisible code value of the target area on the target object according to the area code distribution file.
Optionally, the invisible code distribution strategy of the target object includes an invisible area division strategy of the target object and an invisible code value corresponding to each invisible area.
It should be noted that, since the units in the server provided in this embodiment of the present invention are based on the same concept as the method embodiment shown in fig. 4, they bring the same technical effects as that method embodiment; for details, refer to the description of the method embodiment shown in fig. 4, which is not repeated here.
It can thus be seen that the server provided by this embodiment also enables the terminal to identify the invisible code value in the target area of the target object in a non-contact manner, which facilitates expanding the application scenarios of invisible identification codes.
Fig. 7 is a schematic block diagram of a terminal according to a fifth embodiment of the present invention. The terminal in this embodiment, as shown in fig. 7, may include: one or more processors 701, one or more input devices 702, one or more output devices 703, and a memory 704. The processor 701, the input device 702, the output device 703, and the memory 704 are connected by a bus 705. The memory 704 is used for storing instructions, and the processor 701 is used for executing the instructions stored in the memory 704. The processor 701 is configured to:
controlling a laser emitting device to emit laser to a target area of a target object, and forming a light spot on the target area;
controlling a camera device to shoot a target object containing light spots, and acquiring a target object image containing the light spots;
identifying the position of the light spot in the target object image and the mark code in the target object image;
uploading the mark code in the target object image to a server, so that the server returns a corresponding area code distribution file according to the mark code, where the area code distribution file includes an invisible code distribution strategy of the target object;
and identifying the invisible code value in the target area according to the position of the light spot in the target object image and the area code distribution file.
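For the corner-based mark code layout described in the claims (three-bit codes at the four corners, read clockwise starting from the lower left corner), the mark code value extraction could be sketched as follows. The dictionary layout, the bit width, and the corner names are illustrative assumptions.

```python
def mark_code_value(corner_bits):
    """Concatenate the 3-bit corner codes clockwise, starting from
    the lower left corner, into a single mark code value.

    `corner_bits` maps corner name -> 3-character bit string; the
    concrete encoding is an assumption for illustration.
    """
    clockwise = ["lower_left", "upper_left", "upper_right", "lower_right"]
    bits = "".join(corner_bits[c] for c in clockwise)
    return int(bits, 2)

# Hypothetical corner readings recovered from the captured image.
corners = {"lower_left": "101", "upper_left": "001",
           "upper_right": "110", "lower_right": "010"}
print(mark_code_value(corners))  # 0b101001110010 -> 2674
```

Because each corner code also encodes the area direction of its corner, the same four readings let the terminal deduce the image's orientation before walking the corners clockwise.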
Optionally, the invisible code distribution strategy of the target object includes an invisible area division strategy of the target object and an invisible code value corresponding to each invisible area; the processor 701 is further configured to:
the identifying the invisible code values in the target area according to the position of the light spot in the target object image and the area code distribution file comprises:
identifying which invisible area of the target object the light spot is located in according to the position of the light spot in the target object image and an invisible area division strategy of the target object;
and inquiring an invisible code value corresponding to the invisible area according to the invisible area of the identified light spot on the target object.
Optionally, the processor 701 is further configured to:
controlling the laser emitting device to emit laser toward the target area of the target object, forming on the target area a circular light spot whose area falls within a preset range.
It should be understood that, in this embodiment of the present invention, the processor 701 may be a central processing unit (CPU), and the processor may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 702 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, etc., and the output device 703 may include a display (LCD, etc.), a speaker, etc.
The memory 704 may include both read-only memory and random-access memory, and provides instructions and data to the processor 701. A portion of the memory 704 may also include non-volatile random access memory. For example, the memory 704 may also store device type information.
In a specific implementation, the processor 701, the input device 702, and the output device 703 described in this embodiment of the present invention may execute the implementation manners described in the first and second method embodiments provided by the embodiments of the present invention, and may also execute the implementation manners of the terminal described in the embodiments of the present invention, which are not repeated here.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the foregoing description has stated the composition and steps of each example in general terms of function. Whether such functions are implemented as hardware or software depends on the particular application and design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementations should not be considered beyond the scope of the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the terminal and the unit described above may refer to corresponding processes in the foregoing method embodiments, and are not described herein again.
The steps in the method of the embodiment of the present invention can be reordered, combined, or deleted according to actual needs.
The units in the terminal of the embodiment of the present invention can be combined, divided, or deleted according to actual needs.
In the several embodiments provided in this application, it should be understood that the disclosed terminal and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division of the units is only a logical functional division, and other division manners may be used in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may also be an electrical, mechanical, or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions that can readily be conceived by those skilled in the art within the technical scope disclosed by the present invention shall fall within its protection scope. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. An invisible code identification method, comprising the following steps:
controlling a laser emitting device to emit laser to a target area of a target object, and forming a light spot on the target area;
controlling a camera device to shoot a target object containing light spots, and acquiring a target object image containing the light spots;
identifying the position of the light spot in the target object image and a mark code in the target object image, wherein the mark code is located at the four corner positions in the target object image, the code of the mark code at each corner position is used for representing the area direction of that corner position, and the mark code has a corresponding mark code value; the mark code value corresponding to the mark code is obtained by reading and parsing the codes in a clockwise direction, starting from the three bits at the lower left corner of the target object image;
uploading the mark code in the target object image to a server, so that the server returns a corresponding area code distribution file according to the mark code, wherein the area code distribution file includes an invisible code distribution strategy of the target object;
and identifying the invisible code value in the target area according to the position of the light spot in the target object image and the area code distribution file.
2. The invisible code identification method according to claim 1, wherein the invisible code distribution strategy of the target object comprises an invisible area division strategy of the target object and an invisible code value corresponding to each invisible area;
the identifying the invisible code values in the target area according to the position of the light spot in the target object image and the area code distribution file comprises:
identifying which invisible area of the target object the light spot is located in according to the position of the light spot in the target object image and an invisible area division strategy of the target object;
and inquiring an invisible code value corresponding to the invisible area according to the invisible area of the identified light spot on the target object.
3. The invisible code identification method according to claim 1, wherein the controlling the laser emitting device to emit laser light to a target area of a target object, and forming a spot on the target area comprises:
controlling the laser emitting device to emit laser toward the target area of the target object, forming on the target area a circular light spot whose area falls within a preset range.
4. An invisible code identification method, comprising the following steps:
receiving a mark code of a target object sent by a terminal, wherein the mark code is located at the four corner positions in an image of the target object, the code of the mark code at each corner position is used for representing the area direction of that corner position, and the mark code has a corresponding mark code value, obtained by reading and parsing the codes in a clockwise direction starting from the three bits at the lower left corner of the target object image; the image of the target object is obtained by the terminal shooting the target object containing a light spot, and the mark code is also used by the terminal to identify the position of the light spot in the target object image;
querying, according to the mark code, an area code distribution file corresponding to the mark code from a database, wherein the area code distribution file includes an invisible code distribution strategy of the target object, and the invisible code distribution strategy of the target object includes an invisible area division strategy of the target object and an invisible code value corresponding to each invisible area;
and sending the region code distribution file to the terminal, so that the terminal can identify the invisible code value of the target region on the target object according to the region code distribution file and the position of the light spot in the target object image.
5. A terminal, comprising:
the first control unit is used for controlling the laser emitting device to emit laser to a target area of a target object, and forming light spots on the target area;
the second control unit is used for controlling the camera device to shoot the target object containing the light spots and acquiring the target object image containing the light spots;
the light spot identification unit is used for identifying the position of the light spot in the target object image;
the mark code identification unit is used for identifying a mark code in the target object image, wherein the mark code is located at the four corner positions in the target object image, the code of the mark code at each corner position is used for representing the area direction of that corner position, and the mark code has a corresponding mark code value; the mark code value corresponding to the mark code is obtained by reading and parsing the codes in a clockwise direction, starting from the three bits at the lower left corner of the target object image;
the network communication unit is used for uploading the mark code in the target object image to a server, so that the server returns a corresponding area code distribution file according to the mark code, wherein the area code distribution file includes an invisible code distribution strategy of the target object;
and the invisible code identification unit is used for identifying the invisible code value in the target area according to the position of the light spot in the target object image and the area code distribution file.
6. The terminal of claim 5, wherein the invisible code distribution strategy of the target object comprises an invisible area division strategy of the target object and an invisible code value corresponding to each invisible area; the invisible code recognition unit includes:
the invisible area identification unit is used for identifying which invisible area of the target object the light spot is positioned in according to the position of the light spot in the target object image and the invisible area division strategy of the target object;
and the invisible code inquiring unit is used for inquiring the invisible code value corresponding to the invisible area according to the invisible area of the identified light spot on the target object.
7. The terminal of claim 6, wherein the first control unit is specifically configured to:
controlling the laser emitting device to emit laser toward the target area of the target object, forming on the target area a circular light spot whose area falls within a preset range.
8. A server, comprising:
the receiving unit is used for receiving a mark code of a target object sent by a terminal, wherein the mark code is located at the four corner positions in an image of the target object, the code of the mark code at each corner position is used for representing the area direction of that corner position, and the mark code has a corresponding mark code value, obtained by reading and parsing the codes in a clockwise direction starting from the three bits at the lower left corner of the target object image; the image of the target object is obtained by the terminal shooting the target object containing a light spot, and the mark code is also used by the terminal to identify the position of the light spot in the target object image;
the area code distribution file query unit is used for querying, according to the mark code, an area code distribution file corresponding to the mark code from a database, wherein the area code distribution file includes an invisible code distribution strategy of the target object, and the invisible code distribution strategy of the target object includes an invisible area division strategy of the target object and an invisible code value corresponding to each invisible area;
and the sending unit is used for sending the area code distribution file to the terminal so that the terminal can identify the invisible code value of the target area on the target object according to the area code distribution file and the position of the light spot in the target object image.
CN201710310668.XA 2017-05-03 2017-05-03 Invisible code identification method, terminal and server Expired - Fee Related CN108830361B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710310668.XA CN108830361B (en) 2017-05-03 2017-05-03 Invisible code identification method, terminal and server


Publications (2)

Publication Number Publication Date
CN108830361A CN108830361A (en) 2018-11-16
CN108830361B true CN108830361B (en) 2021-12-14

Family

ID=64153979

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710310668.XA Expired - Fee Related CN108830361B (en) 2017-05-03 2017-05-03 Invisible code identification method, terminal and server

Country Status (1)

Country Link
CN (1) CN108830361B (en)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7316566B2 (en) * 2001-03-15 2008-01-08 International Business Machines Corporation Method for accessing interactive multimedia information or services from Braille documents
GB2381687B (en) * 2001-10-31 2005-08-24 Hewlett Packard Co Assisted reading method and apparatus
TW581970B (en) * 2002-01-11 2004-04-01 Sonix Technology Co Ltd Electronic apparatus for utilizing a graphical indicator
US7237723B2 (en) * 2004-02-12 2007-07-03 Grant Isaac W Coordinate designation interface
US20060233462A1 (en) * 2005-04-13 2006-10-19 Cheng-Hua Huang Method for optically identifying coordinate information and system using the method
KR100974900B1 (en) * 2008-11-04 2010-08-09 한국전자통신연구원 Marker recognition apparatus using dynamic threshold and method thereof
CN201540655U (en) * 2009-05-13 2010-08-04 崔伟 Phonic book
CN102136201B (en) * 2010-01-21 2013-10-30 深圳市华普教育科技有限公司 Image pickup type point-reading machine
CN103021304B (en) * 2012-08-01 2014-12-24 顿丽波 Laser overview system and operation method thereof
CN105403235A (en) * 2014-09-15 2016-03-16 吴旻升 Two-dimensional positioning system and method
US9607200B2 (en) * 2014-10-09 2017-03-28 Cognex Corporation Decoding barcodes


Similar Documents

Publication Publication Date Title
CN108961157B (en) Picture processing method, picture processing device and terminal equipment
CN108596955B (en) Image detection method, image detection device and mobile terminal
US20170337449A1 (en) Program, system, and method for determining similarity of objects
CN110119733B (en) Page identification method and device, terminal equipment and computer readable storage medium
CN108898082B (en) Picture processing method, picture processing device and terminal equipment
CN108961267B (en) Picture processing method, picture processing device and terminal equipment
CN110288710B (en) Three-dimensional map processing method and device and terminal equipment
CN110377215B (en) Model display method and device and terminal equipment
US20140240253A1 (en) Method of detecting protection case and electronic device thereof
CN103679788A (en) 3D image generating method and device in mobile terminal
CN109491502B (en) Haptic rendering method, terminal device and computer-readable storage medium
CN103294766A (en) Associating strokes with documents based on the document image
US10909232B2 (en) Terminal verification method, terminal device, and computer readable storage medium
CN110021062B (en) Product characteristic acquisition method, terminal and storage medium
CN111142650A (en) Screen brightness adjusting method, screen brightness adjusting device and terminal
TWI502411B (en) Touch detecting method and touch control device using the same
CN108491152B (en) Touch screen terminal control method, terminal and medium based on virtual cursor
CN110658976B (en) Touch track display method and electronic equipment
CN111597009A (en) Application program display method and device and terminal equipment
CN109444905B (en) Dynamic object detection method and device based on laser and terminal equipment
CN108830361B (en) Invisible code identification method, terminal and server
CN108932704B (en) Picture processing method, picture processing device and terminal equipment
CN111695405B (en) Dog face feature point detection method, device and system and storage medium
WO2018032426A1 (en) Method for detecting input device, and detection device
CN107609119B (en) File processing method, mobile terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20211214