CN116757686A - Identification method, device and equipment for aggregate collection two-dimensional code

Info

Publication number
CN116757686A
Authority
CN
China
Prior art keywords
dimensional code
code
target
image
card
Prior art date
Legal status
Pending
Application number
CN202310771222.2A
Other languages
Chinese (zh)
Inventor
王英博
孙涛
管超
Current Assignee
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd filed Critical Alipay Hangzhou Information Technology Co Ltd
Priority to CN202310771222.2A
Publication of CN116757686A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/04Payment circuits
    • G06Q20/06Private payment circuits, e.g. involving electronic currency used among participants of a common payment scheme
    • G06Q20/065Private payment circuits, e.g. involving electronic currency used among participants of a common payment scheme using e-cash
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1408Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/14172D bar codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1439Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1452Methods for optical code recognition including a method step for retrieval of the optical code detecting bar code edges
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/30Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/327Short range or proximity payments by means of M-devices
    • G06Q20/3276Short range or proximity payments by means of M-devices using a pictured code, e.g. barcode or QR-code, being read by the M-device

Abstract

The embodiments of this specification disclose a method, an apparatus and a device for identifying an aggregated checkout two-dimensional code. The scheme may include: acquiring an image to be identified, collected by a user terminal, that contains a code card image of a target collection two-dimensional code; recognizing the image to be identified to obtain the code value of the target collection two-dimensional code, and judging from the code value whether the target collection two-dimensional code is an aggregated collection two-dimensional code; and, if it is, determining the code card image area of the target collection two-dimensional code in the image to be identified through edge detection, and then judging, based on the code card image area, whether the code card of the target collection two-dimensional code conforms to a preset display state, the preset display state being used to prompt the user to pay with a preset payment method.

Description

Identification method, device and equipment for aggregate collection two-dimensional code
Technical Field
The application relates to the technical field of electronic payment, in particular to a method, a device and equipment for identifying two-dimension codes of aggregate collection.
Background
Nowadays, merchant checkout codes are used in more and more scenarios and with ever higher frequency, and most consumers pay by mobile phone when purchasing goods. A merchant may use a wide variety of checkout codes, such as directly connected checkout codes provided by different account-side payment mechanisms and indirectly connected (aggregated) checkout codes provided by different receipt outsourcing service mechanisms. For example, with an aggregated code, a consumer can complete a purchase using various payment tools such as Alipay or UnionPay, avoiding the trouble of making change in cash or laying out multiple two-dimensional codes.
When a user pays by scanning a two-dimensional code with an intelligent terminal such as a mobile phone, the collection two-dimensional code needs to be identified.
Disclosure of Invention
The embodiment of the specification provides a method, a device and equipment for identifying an aggregate checkout two-dimensional code, which are used for providing an identification scheme with high accuracy for the aggregate checkout two-dimensional code.
In order to solve the above technical problems, the embodiments of the present specification are implemented as follows:
the identification method for the aggregate collection two-dimensional code provided by the embodiment of the specification comprises the following steps:
acquiring an image to be identified acquired by a user terminal; the image to be identified comprises a code card image of the target collection two-dimensional code;
identifying the image to be identified to obtain a code value of the target collection two-dimensional code;
judging whether the target collection two-dimensional code is an aggregation collection two-dimensional code according to the code value;
if the target collection two-dimensional code is an aggregate collection two-dimensional code, determining a code card image area of the target collection two-dimensional code in the image to be identified through edge detection;
judging whether the code card of the target collection two-dimensional code accords with a preset display state or not based on the code card image area; the preset display state is used for prompting a user to pay by using a preset payment mode.
The embodiment of the specification provides an identification device for an aggregate collection two-dimensional code, which comprises:
the image acquisition module is used for acquiring an image to be identified acquired by the user terminal; the image to be identified comprises a code card image of the target collection two-dimensional code;
the code value identification module is used for identifying the image to be identified to obtain the code value of the target collection two-dimensional code;
the aggregation code judging module is used for judging whether the target collection two-dimensional code is an aggregation collection two-dimensional code according to the code value;
the sign image area determining module is used for determining a sign image area of the target collection two-dimensional code in the image to be identified through edge detection if the target collection two-dimensional code is an aggregation collection two-dimensional code;
the display state identification module is used for judging whether the code card of the target collection two-dimensional code accords with a preset display state or not based on the code card image area; the preset display state is used for prompting a user to pay by using a preset payment mode.
The embodiment of the specification provides an identification device for an aggregate checkout two-dimensional code, which comprises:
at least one processor; and,
a memory communicatively coupled to the at least one processor; wherein,
The memory stores instructions executable by the at least one processor to enable the at least one processor to:
acquiring an image to be identified; the image to be identified comprises a code card image of the target collection two-dimensional code;
identifying the image to be identified to obtain a code value of the target collection two-dimensional code;
judging whether the target collection two-dimensional code is an aggregation collection two-dimensional code according to the code value;
if the target collection two-dimensional code is an aggregate collection two-dimensional code, determining a code card image area of the target collection two-dimensional code in the image to be identified through edge detection;
judging whether the code card of the target collection two-dimensional code accords with a preset display state or not based on the code card image area; the preset display state is used for prompting a user to pay by using a preset payment mode.
One embodiment of the present disclosure can achieve at least the following advantages: the two-dimension code of the aggregate collection is identified from the image to be identified, the code card image area of the code card to which the two-dimension code of the aggregate collection belongs is determined through edge detection, and then whether the code card of the two-dimension code of the aggregate collection accords with a preset display state is judged based on the image analysis result of the code card image area, so that the identification accuracy of the code card state detection result of the aggregate two-dimension code is improved, the calculated amount is reduced, and the identification efficiency is improved.
Drawings
In order to more clearly illustrate the embodiments of the present description or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments described in the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic application scenario diagram of an identification method for an aggregated checkout two-dimensional code according to an embodiment of the present disclosure;
fig. 2 is a flow chart of a method for identifying two-dimensional codes of aggregated collection according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of determining a positive direction according to a positioning pile of a two-dimensional code according to an embodiment of the present disclosure;
fig. 4 is a schematic flow chart of a method for identifying an aggregated checkout two-dimensional code in a practical application scenario provided in an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an identification device for aggregated two-dimensional codes corresponding to fig. 2 according to an embodiment of the present disclosure
Fig. 6 is a schematic structural diagram of an identification device for an aggregated checkout two-dimensional code corresponding to fig. 2 according to an embodiment of the present disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of one or more embodiments of the present specification more clear, the technical solutions of one or more embodiments of the present specification will be clearly and completely described below in connection with specific embodiments of the present specification and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present specification. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without undue burden, are intended to be within the scope of one or more embodiments herein.
It should be understood that although the terms first, second, third, etc. may be used in this document to describe various information, these information should not be limited to these terms. These terms are only used to distinguish one type of information from another.
The following describes in detail the technical solutions provided by the embodiments of the present specification with reference to the accompanying drawings.
Currently, in an online payment process, a user may use a user terminal application provided by an account-side payment mechanism (e.g., a commercial bank, or a non-financial institution holding a payment license) to scan a checkout code and pay from the user's account at that account-side payment mechanism. In practice, the checkout code may be provided by an account-side payment mechanism or by an acquiring-side payment mechanism such as UnionPay Merchant Services, in which case it is referred to as a directly connected checkout code; the checkout code may also be provided by a receipt outsourcing service mechanism registered with the Payment & Clearing Association of China, in which case it is referred to as an indirectly connected checkout code, also known as an aggregated checkout code.
In practical application, the account-side payment mechanism can reach an agreement with the receipt outsourcing service mechanism on a preset display state for the aggregated checkout code. For the account-side payment mechanism, when a user scans the aggregated checkout code to pay, the code can be identified to determine whether it conforms to the pre-agreed preset display state.
In the prior art, the whole picture acquired by the user terminal can be analyzed to identify whether the aggregated checkout code conforms to the preset display state. However, in reality, because a merchant may use a variety of checkout codes and may place several of them next to each other, the picture collected by the user terminal often contains multiple codes, which easily leads to misrecognition of the display state of the aggregated checkout code.
In order to solve the drawbacks of the prior art, the present solution gives the following examples.
Fig. 1 is a schematic diagram of an application scenario of an identification method for an aggregated checkout two-dimensional code in an embodiment of the present disclosure.
As shown in fig. 1, when a user needs to pay by scanning a code, the user terminal 102 installed with a payment application corresponding to an account-side payment mechanism may be used to collect an image 101 to be identified. In practical applications, the image 101 to be identified collected by the user terminal 102 may include one or more sign images of the two-dimensional code of collection. For example, fig. 1 shows a card image including 3 collection two-dimensional codes in the collected image to be recognized 101, and the number of card images of collection two-dimensional codes shown here is only exemplary, and 1, 2, 4 or more card images of collection two-dimensional codes may be included. In addition, in the image to be identified 101, the card image of the check-out code included may be a complete card image or an incomplete card image. In addition, alternatively, in the image to be recognized 101, the positive direction of the card image of the check-out code included may be the same as or different from the positive direction of the image to be recognized 101, respectively. Furthermore, in the image to be identified, the respective two-dimensional codes for collection are usually different from each other; the form (e.g., color, text included, etc.) of the code board to which each of the different two-dimensional code pieces belongs may also be different.
The acquired image 101 to be identified may then be identified in the user terminal 102. Specifically, the code value of each two-dimension code of collection can be identified from the image 101 to be identified, and the two-dimension code of collection is determined from each two-dimension code of collection according to the code value; determining a card image area of a card to which the aggregate collection two-dimensional code belongs through edge detection; and judging whether the code card accords with a preset display state appointed in advance or not based on the code card image area of the aggregate collection two-dimensional code.
In practical application, since the calculation amount required by the identification scheme of the aggregate checkout two-dimensional code in the embodiment of the present disclosure is not large, the method can be executed on the user terminal side 102, so that the identification result can be obtained quickly under the condition of lower delay, and then the identification result is sent to the cloud server 103 of the payment mechanism on the account side. Optionally, after the user terminal obtains the image 101 to be identified, under the condition of user authorization, the image 101 to be identified may also be sent to the cloud server 103 corresponding to the account-side payment mechanism to execute the process of identifying the image 101 to be identified.
Next, a method for identifying an aggregate checkout two-dimensional code provided in an embodiment of the present disclosure will be specifically described with reference to the accompanying drawings.
Fig. 2 is a flow chart of an identification method for an aggregated checkout two-dimensional code according to an embodiment of the present disclosure.
From the program perspective, the execution subject of the flow may be a program installed on an application server or an application terminal. It is understood that the method may be performed by any apparatus, device, platform, cluster of devices having computing, processing capabilities.
In practical applications, the execution subject of the method of fig. 2 may be a user terminal that scans the two-dimensional code of collection, and in particular, may be a payment application program installed on the user terminal.
As shown in fig. 2, the process may include the steps of:
step 202: acquiring an image to be identified acquired by a user terminal; the image to be identified comprises a code card image of the target collection two-dimensional code.
From a hardware point of view, the user terminal may be a terminal device loaded with a third party payment application. For example, the terminal device may be a smart phone, a wearable smart device, or the like. The third party application may be an application that scans a checkout two-dimensional code. In practical application, after the user scans the two-dimension code of collection, payment can be performed according to funds in the user account corresponding to the third party application program.
The image to be identified can be an image acquired by an image acquisition device of the user terminal, wherein the image acquisition device can be a mobile phone camera. From the perspective of the application scenario, the image to be identified may be an image acquired by the user during the process of scanning the checkout two-dimensional code using a third party payment application in the user terminal.
The image to be identified can contain one or more sign images of the two-dimensional code of collection. When the image to be identified includes (a card image of) a plurality of two-dimensional codes, the two-dimensional codes may be two-dimensional codes different from each other, and display states of the cards to which the two-dimensional codes belong may be different from each other, for example, may have different colors, display different characters, and the like. Alternatively, when the image to be recognized contains a plurality of card images of the two-dimensional code for collection, the positive directions of the respective card images may be the same or different from each other.
The target two-dimensional code for collection may be any one of one or more two-dimensional codes for collection in the image to be identified. In the embodiment of the present specification, the target two-dimensional code for collection is used to represent the two-dimensional code for collection processed when the flow of fig. 2 is currently executed.
In the embodiment of the present disclosure, in order to identify the aggregate two-dimensional code, an image to be identified acquired by the user terminal needs to be acquired first, and each two-dimensional code in the image to be identified needs to be identified respectively, so as to determine the aggregate two-dimensional code from the image to be identified, as in step 204 and step 206.
Step 204: and identifying the image to be identified to obtain the code value of the target collection two-dimensional code.
After the user terminal collects the image to be identified, the third party payment application program can identify the code value of each two-dimension code (including the target two-dimension code) of collection from the image to be identified. Specifically, the two-dimensional code image may be analyzed by using a preset two-dimensional code encoding/decoding rule based on the two-dimensional code image, so as to obtain the code value of the two-dimensional code. Different two-dimensional codes can correspond to different code values.
In practical applications, the code value of the two-dimensional code may be uniform resource locator (Uniform Resource Locator, URL) information. From the perspective of the application scenario, the code value may be URL information that is located to a preset payment page.
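As a purely illustrative, non-limiting sketch of this step (not part of the claimed embodiments), the code values of the collection two-dimensional codes in a captured image can be decoded with an off-the-shelf detector. The use of OpenCV's QRCodeDetector, the function name and the return structure below are assumptions made for illustration only; the detector also yields the corner points that can serve as the code position information discussed later.

```python
# Illustrative sketch only: decode the code values (e.g., URLs) of all
# two-dimensional codes in the image to be identified with OpenCV's QR detector.
import cv2

def decode_checkout_codes(image_bgr):
    detector = cv2.QRCodeDetector()
    ok, code_values, corner_points, _ = detector.detectAndDecodeMulti(image_bgr)
    if not ok:
        return []
    # Pair each decoded code value with the 4 corner points of that code in
    # image coordinates (usable later as the "code position information").
    return [(value, pts) for value, pts in zip(code_values, corner_points) if value]
```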
Step 206: and judging whether the target collection two-dimensional code is an aggregation collection two-dimensional code according to the code value.
In practical application, the code values of the two-dimension codes of different types have different code value characteristics. The types of the two-dimensional code of collection may include a direct collection two-dimensional code and an indirect collection two-dimensional code (i.e., an aggregate collection two-dimensional code). The direct-connection cash-collecting two-dimensional code can comprise direct-connection cash-collecting two-dimensional codes corresponding to different account-side payment mechanisms; the indirect receipt two-dimensional code can comprise the indirect receipt two-dimensional code provided by different receipt outsourcing service institutions.
In the embodiment of the present specification, whether the target collection two-dimensional code is the aggregated collection two-dimensional code may be identified according to the code value of the target collection two-dimensional code. In practical application, the order receiving and outsourcing service mechanism corresponding to the target receipt two-dimensional code can be identified.
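As a hedged illustration of this judgment (the actual code-value characteristics agreed between the institutions are not disclosed in this text), a minimal sketch could classify a code by the URL prefix of its code value; the prefixes below are invented placeholders, not real domains.

```python
# Hypothetical sketch: judge whether a collection two-dimensional code is an
# aggregated code from its code value. The URL prefixes are invented placeholders.
AGGREGATED_PREFIXES = ("https://pay.example-isv-a.com/", "https://qr.example-isv-b.cn/")

def is_aggregated_code(code_value: str) -> bool:
    return code_value.startswith(AGGREGATED_PREFIXES)
```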
Step 208: if the target collection two-dimensional code is an aggregate collection two-dimensional code, determining a code card image area of the target collection two-dimensional code in the image to be identified through edge detection.
In the embodiment of the present disclosure, after identifying the aggregate two-dimensional code from the image to be identified according to step 204 and step 206, the identification of the aggregate two-dimensional code may be further performed subsequently, and specifically, whether the display state of the code board to which the aggregate two-dimensional code belongs accords with the preset display state may be identified.
In practical application, in order to improve the identification accuracy of the display state of the code cards to which the aggregated two-dimensional code belongs, the whole image to be identified can be identified not directly, but the code card image area corresponding to the aggregated two-dimensional code in the image to be identified. In step 208, an edge detection method may be used to identify a tile image area corresponding to each tile from the image to be identified.
The edge detection may optionally be performed with an edge detection operator such as Sobel, Prewitt, Roberts, Canny, or Marr-Hildreth.
Step 210: judging whether the code card of the target collection two-dimensional code accords with a preset display state or not based on the code card image area; the preset display state is used for prompting a user to pay by using a preset payment mode.
Specifically, step 210 may include: and determining the display state of the code card of the target collection two-dimensional code based on the code card image area, and further judging whether the display state accords with a preset display state.
The display state of the sign may include at least one of a color display state and a text display state.
The color display state may specifically include displaying or not displaying a target color (e.g., blue), a pixel duty ratio of the target color in the sign-on area, displaying or not displaying other non-black-and-white colors (e.g., colors other than blue, black, white) other than the target color in the sign-on area, and the like. In practical applications, different color display states may prompt the user to use different preset payment methods. For example, when the pixel ratio of the first target color in the sign reaches a certain preset proportion threshold, the function of prompting the user to use the first preset payment mode can be achieved. For another example, when the pixel ratio of the second target color in the sign reaches a certain preset proportion threshold, the function of prompting the user to use the second preset payment mode can be achieved.
The character display state may specifically include displaying or not displaying the target character, the size of the target character block, the font/color/size of the target character, the relative position of the target character block in the sign board and the two-dimensional code of collection, and the like. In practical application, different text display states can prompt a user to use different preset payment modes. For example, when the first target text is displayed in the sign board, the sign board can play a role in prompting the user to use the first preset payment mode. For another example, when the second target text is displayed in the sign board, the function of prompting the user to use the second preset payment mode can be achieved.
In practical applications, when the display state of the sign-up card includes multiple aspects, for example, when the display state of the sign-up card includes both a color display state and a text display state, it may be determined, according to a preset policy, whether the sign-up card of the target two-dimensional code accords with the preset display state.
In an alternative embodiment, the identification result of the display state may be a binary result. For example, the preset policy may be to determine that the code card of the target two-dimensional code conforms to the preset display state if at least one of the color display state and the text display state of the target two-dimensional code meets the preset condition. For another example, the code card of the target two-dimensional code may be determined to conform to the preset display state only if both the color display state and the text display state meet the preset conditions.
In further alternative embodiments, the recognition result of the display status may be a scoring result. For example, if the score corresponding to the color display state of the target two-dimensional code is a first score and the score corresponding to the text display state of the target two-dimensional code is a second score, it may be determined whether the code card of the target two-dimensional code accords with the preset display state based on the first score and the second score. Alternatively, the first score and the second score may have the same or different preset weights.
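As a minimal sketch of the scoring variant described above (the weights and decision threshold are arbitrary illustrative values, not values from this embodiment):

```python
# Minimal sketch of the scoring variant: combine the color-state score and the
# text-state score with preset weights and compare against a threshold.
def card_matches_preset_state(color_score: float, text_score: float,
                              color_weight: float = 0.5, text_weight: float = 0.5,
                              threshold: float = 0.6) -> bool:
    combined = color_weight * color_score + text_weight * text_score
    return combined >= threshold
```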
Alternatively, if the result of the determination in step 206 is that the target two-dimensional code is not an aggregate two-dimensional code (for example, a direct-connected two-dimensional code), the step of displaying the status of the identification code card in step 210 may not be performed on the card image area to which the target two-dimensional code belongs. Because the code card state recognition and judgment are carried out only on the code card image area corresponding to the aggregation two-dimensional code in the image to be recognized, the accuracy of the state recognition is improved, the calculated amount of the state recognition is reduced, and the calculation efficiency of the state recognition is improved.
It should be understood that, in the method described in one or more embodiments of the present disclosure, the order of some steps may be adjusted according to actual needs, or some steps may be omitted.
In the method in fig. 2, the two-dimension code of the aggregate collection is identified from the image to be identified acquired by the user terminal, and then the code card image area of the code card to which the two-dimension code of the aggregate collection belongs is determined through edge detection, and then whether the code card of the two-dimension code of the aggregate collection accords with the preset display state is judged based on the image analysis result of the code card image area, so that the identification accuracy of the code card state detection result of the aggregate two-dimension code is improved, the calculated amount is reduced, and the identification efficiency is improved.
Based on the method of fig. 2, the present description examples also provide some specific implementations of the method, as described below.
Based on the scheme of fig. 2, after step 210, the target payment two-dimensional code may be further marked according to the determination result of step 210. Specifically, if the judgment result in the step 210 is that the code card of the target two-dimensional code accords with the preset display state, the target two-dimensional code may be marked as a positive sample; if the result of the determination in step 210 is that the sign of the target two-dimensional code does not conform to the preset display state, the target two-dimensional code may be marked as a negative sample.
In practical application, the proportion of positive samples of the aggregated two-dimension codes can be counted according to the identification result that a large number of actually collected aggregated two-dimension codes are positive samples or negative samples. Therefore, the account-side payment mechanism can adjust the marketing strategy according to the positive sample proportion of the aggregate receipt two-dimensional code.
Optionally, in the process of counting the positive sample proportion of the aggregated receipts two-dimensional code, statistics may be performed respectively with the receipts and outsourcing service mechanism as a unit. Specifically, according to the code value of the target collection two-dimensional code, a receipt outsourcing service mechanism for providing the target collection two-dimensional code can be determined; therefore, after the sample type (including positive samples or negative samples) of the code card of the target collection two-dimensional code is determined according to the code card display state of the target collection two-dimensional code, the positive sample proportion corresponding to the receipt outsourcing service mechanism can be determined. Thus, the account side payment mechanism can adjust the specific marketing strategy aiming at the order taking outsourcing service mechanism according to the positive sample proportion corresponding to the order taking outsourcing service mechanism.
In an optional embodiment of the present disclosure, when the display state of the sign includes a color display state, in step 210, determining, based on the sign image area, whether the sign of the target two-dimensional code corresponds to a preset display state may specifically include: determining the proportion of the pixel points of the target color in the pixel points in the sign image area; and if the proportion is greater than or equal to a preset proportion threshold value, determining that the code card of the target collection two-dimensional code accords with a preset display state.
The target color can be pre-agreed by the account-side payment mechanism and the receipt outsourcing service mechanism. In practical applications, the target color may be a color used to prompt the user to pay through the account-side payment mechanism. Optionally, the target color may be a color whose similarity to the logo color of the account-side payment mechanism is greater than a preset similarity threshold. For example, the logo color of the account-side payment mechanism may be a single color with a specific RGB value, and the target color may then be the set of colors that are visually close to that RGB value. For instance, if the logo color is blue or green, the target color may correspondingly be a range of blue tones or green tones.
The preset proportion threshold value can be pre-agreed by an account side payment mechanism and a receipt outsourcing service mechanism. In practical applications, the preset proportional threshold may be empirically set or determined from historical transaction data.
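As an illustrative sketch of this color judgment (assuming the target color is specified as an HSV range of blue tones; the range and the proportion threshold below are assumptions, not the agreed values):

```python
# Illustrative sketch: proportion of target-color pixels within the code card
# image area. The HSV range for "blue tones" and the threshold are assumptions.
import cv2
import numpy as np

def card_color_matches(card_region_bgr,
                       lower_hsv=(100, 80, 80), upper_hsv=(130, 255, 255),
                       ratio_threshold: float = 0.25) -> bool:
    hsv = cv2.cvtColor(card_region_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    target_ratio = float(np.count_nonzero(mask)) / mask.size
    return target_ratio >= ratio_threshold
```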
In practical application, it is considered that the environment of offline code-scanning payment is complex, and the color of the collected picture is strongly related to the collection environment: the same sample photographed in different environments yields different picture colors. The image to be identified collected by the user terminal may therefore itself have a color cast. For example, under the red lighting of a vegetable market, the collected image to be recognized may be tinted red as a whole. The color of the image to be identified therefore needs to be corrected.
In an optional embodiment of the present disclosure, before the determining the proportion of the pixel point of the target color in the pixel points in the tile image area, the method may further include: and carrying out color correction on the image to be identified by adopting a preset color correction method to obtain a color corrected image.
In particular, the color correction method may include Gamma (Gamma) correction. Gamma correction is a method of editing a gamma curve of an image to perform nonlinear tone editing on the image, detecting dark color portions and light color portions in an image signal, and increasing the ratio of the dark color portions to the light color portions, thereby improving the image contrast effect.
As an example, gamma (Gamma) correction may be performed on each pixel point in the image to be recognized, where input ∈ {R, G, B}, i.e., input denotes the RGB value of a pixel point in the image to be identified, and output denotes the corrected RGB value of that pixel point.
In the above embodiment, before the color display state of the code card is identified, the color of the entire image to be identified may be corrected, so that the accuracy of the identification result of the color display state of the code card may be improved, and the accuracy of the judgment result of whether the code card of the target two-dimensional code accords with the preset display state may be further improved.
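As a hedged sketch of this correction step (the embodiment's exact formulas are not reproduced above; the sketch applies the standard gamma mapping to normalized channel values, with the gamma value being an assumption):

```python
# Sketch of per-pixel gamma correction of the image to be identified:
# normalize each channel value (input in {R, G, B}), apply the standard gamma
# curve, and rescale. The gamma value 1.5 is an illustrative assumption.
import numpy as np

def gamma_correct(image_rgb, gamma: float = 1.5):
    normalized = image_rgb.astype(np.float32) / 255.0
    corrected = np.power(normalized, 1.0 / gamma) * 255.0
    return np.clip(corrected, 0, 255).astype(np.uint8)
```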
In an optional embodiment of the present disclosure, when the display state of the sign includes a text display state, in step 210, the determining, based on the sign image area, whether the sign of the target two-dimensional code corresponds to a preset display state may specifically include: judging whether a target text exists in the sign image area or not; if the target text exists in the sign image area, determining that the sign of the target collection two-dimensional code accords with a preset display state.
The target text can be pre-agreed by the account-side payment mechanism and the receipt outsourcing service mechanism. In practical applications, the target text may be text used to prompt the user to pay through the account-side payment mechanism.
Alternatively, the target text may be text containing at least a target keyword. The target keyword may be a keyword corresponding to the account-side payment mechanism, and may be pre-agreed by the account-side payment mechanism and the receipt outsourcing service mechanism. For example, if the target keyword is "Alipay", the target text may be "Alipay", "Alipay payment recommended", or the like.
In practical application, in order to improve the accuracy of identifying the character display state of the code cards of the aggregate collection two-dimensional code, the character display state may be identified based on the code card image area corresponding to the aggregate collection two-dimensional code, so as to avoid interference to the identification result caused by characters on other code cards.
In some optional embodiments, after determining the card image area corresponding to the aggregate checkout two-dimensional code through edge detection, the recognition scheme of the text display state may be executed on the card image area.
Specifically, the determining, based on the card image area, whether the card of the target collection two-dimensional code meets a preset display state may include: performing optical character recognition on the code card image area to obtain a text recognition result; judging whether the text recognition result is consistent with the target text; if the text recognition result is consistent with the target text, determining that the code card of the target collection two-dimensional code accords with a preset display state.
Optical character recognition (OCR) refers to the process in which an electronic device (e.g., a scanner or a digital camera) examines characters printed on paper, determines their shapes by detecting dark and light patterns, and then translates the shapes into computer text by a character recognition method; that is, the printed characters are converted optically into an image file, and recognition software then converts the text in the image into a text format for further editing and processing by word-processing software. In practice, OCR can be implemented with existing models; for example, the lightweight PaddleOCR model can be employed for text extraction and for determining the positions of text blocks.
The text recognition result can comprise a recognition result text and also comprise a text block position of the recognition result text. The text block position may refer to a position of the recognition result text in the image to be recognized.
Wherein, the determining whether the text recognition result is consistent with the target text or not may be optionally determining whether the text recognition result includes a target keyword or not; alternatively, it may be determined whether the semantic similarity between the text recognition result and the target text is greater than a preset similarity threshold, or the like, which is not limited to the example given herein.
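As a minimal, non-limiting sketch of the keyword-containment variant of this judgment (each OCR item is assumed to be a (text, bounding box) pair already extracted from the code card image area, and the target keyword "Alipay" is an illustrative assumption):

```python
# Sketch: judge the text display state from OCR results extracted from the
# code card image area. Each item is assumed to be (text, bounding_box).
def card_text_matches(ocr_results, target_keyword: str = "Alipay") -> bool:
    # Keyword-containment variant of "consistent with the target text".
    return any(target_keyword in text for text, _box in ocr_results)
```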
Based on the above embodiment, the edge detection is performed first to determine the relatively smaller card image area, and then the OCR is performed for the card image area, so that the OCR is performed in a smaller range, and the consumption of computing resources is low.
In some alternative embodiments, on the one hand, OCR may be performed on the image to be identified to determine whether text meeting the preset condition is contained therein; on the other hand, a code plate image area corresponding to the aggregate collection two-dimensional code can be determined through edge detection; and finally, judging whether the text recognition result is positioned in the determined sign image area.
Specifically, the determining, based on the card image area, whether the card of the target collection two-dimensional code meets a preset display state may include: performing optical character recognition on the image to be recognized to obtain a text recognition result; judging whether the text recognition result is consistent with the target text and is positioned in the sign image area; and if the text recognition result is consistent with the target text and is positioned in the code card image area, determining that the code card of the target collection two-dimensional code accords with a preset display state.
Based on the above embodiment, the operations of determining the code card image area and performing the optical character recognition can be performed in parallel without waiting for the optical character recognition to be performed after determining the code card image area, and the calculation efficiency is high.
On the basis of the scheme of fig. 2, when the sign image area of the target two-dimensional code is determined through edge detection, in order to further improve the accuracy of the edge detection result, a rough detection range may be defined in advance before edge detection.
In an embodiment of the present disclosure, before determining the code card image area of the target collection two-dimensional code by edge detection, the method may further include: identifying the code position information of the target collection two-dimensional code contained in the image to be identified; and determining the positive direction of the target collection two-dimensional code. Correspondingly, determining the code card image area of the target collection two-dimensional code through edge detection may specifically include: determining an estimated code-card range based on the code position information and the positive direction; and, within the estimated code-card range, performing edge detection and determining the edge of the code card image area of the code card to which the target collection two-dimensional code belongs, thereby obtaining the code card image area.
The code position information of the target collection two-dimensional code may refer to a position of the two-dimensional code image in the image to be identified. Because the image to be identified is composed of pixel points, the code position information of the target collection two-dimensional code can be specifically the pixel point position of the target collection two-dimensional code. In practical application, the pixel point positions of the pixel points at four corners of the target two-dimensional code can be used for representing the code position information of the target two-dimensional code. For example, the code position information may be a two-dimensional array containing four elements. The four elements respectively correspond to four corners of the target collection two-dimensional code.
The positive direction of the target two-dimensional code can be a direction determined based on a specific positioning point (also called a two-dimensional code positioning pile) of the target two-dimensional code. Therefore, the determining the positive direction of the target two-dimensional code may specifically include: identifying a code image area corresponding to the target collection two-dimensional code, and determining the position of a specific positioning point of the target collection two-dimensional code; and determining the positive direction of the target collection two-dimensional code based on the position of the specific positioning point.
Typically, a two-dimensional code contains three positioning piles. In practical application, when the three positioning piles of the two-dimensional code are located at the upper left, lower left and upper right respectively, the positive direction of the two-dimensional code may be the direction from the lower-left positioning pile to the upper-left positioning pile. Fig. 3 shows a schematic diagram of the three positioning piles and the positive direction of a two-dimensional code, specifically a lower-left positioning pile 301, an upper-left positioning pile 302, an upper-right positioning pile 303, and the positive direction 304 of the two-dimensional code.
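As an illustrative sketch of this step (assuming the centers of the lower-left and upper-left positioning piles have already been located, e.g., from the decoded corner points):

```python
# Sketch: the positive direction as the unit vector pointing from the
# lower-left positioning pile to the upper-left positioning pile (cf. Fig. 3).
import numpy as np

def positive_direction(lower_left_center, upper_left_center):
    v = np.asarray(upper_left_center, dtype=float) - np.asarray(lower_left_center, dtype=float)
    return v / np.linalg.norm(v)
```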
In practical application, a code card is generally roughly rectangular, and the positive direction of the two-dimensional code is generally consistent with the long-side direction of the roughly rectangular code card, so the image range of the code card can be estimated based on the positive direction of the target collection two-dimensional code.
Specifically, the determining the estimated code-card range based on the code position information and the positive direction may include: determining the center point coordinates of the target collection two-dimensional code according to the code position information; determining the length direction and the width direction of the target collection two-dimensional code based on the positive direction; and, starting from the center point coordinates, extending by a preset length value along the length direction and by a preset width value along the width direction to obtain the estimated code-card range. The preset length value and the preset width value may be set according to statistical data or experience.
When extending in the length direction, the upward and downward extensions may be the same or different; when extending in the width direction, the leftward and rightward extensions may be the same or different.
As an example, assume the center point coordinates of the target collection two-dimensional code determined from the code position information are (a, b), and assume the size of its code card, determined from statistical data or experience, is width w and height h. The estimated code-card range may then be the region whose horizontal extent lies between X1 and X2 and whose vertical extent lies between Y1 and Y2, where X1 = a - w/2; X2 = a + w/2; Y1 = b - h/2; Y2 = b + h/2.
Based on the above embodiment, after the estimated sign-up range of the aggregate checkout two-dimensional code is determined in a relatively simple manner, edge detection is performed based on the estimated sign-up range, and therefore, the efficiency and accuracy of edge detection can be improved.
Alternatively, in order to avoid the situation where the card edge of the aggregated collection two-dimensional code cannot be detected within the determined first estimated code-card range, a second estimated code-card range may be further set, so that detection is performed based on the second estimated code-card range when the card edge cannot be identified within the first estimated code-card range.
Specifically, the determining, by edge detection, the code card image area of the target collection two-dimensional code may include: determining a first estimated code-card range and a second estimated code-card range based on the positive direction and the code position information, the second estimated code-card range being larger than and containing the first estimated code-card range; performing a first edge detection within the first estimated code-card range; and, if the edge of the code card image area of the code card to which the target collection two-dimensional code belongs cannot be identified, performing a second edge detection within the second estimated code-card range to continue identifying the edge of the code card image area, thereby obtaining the code card image area.
When extending in the length direction, the upward and downward extensions may be the same or different; for example, the upward extension may be smaller than the downward extension. When extending in the width direction, the leftward and rightward extensions may likewise be the same or different.
Continuing the above example, assume the center point coordinates of the target collection two-dimensional code determined from the code position information are (a, b), and the code-card size determined from statistical data or experience is width w and height h; an estimated code-card range is then the region whose horizontal extent lies between X1 and X2 and whose vertical extent lies between Y1 and Y2. As an example, for the first estimated code-card range: X1 = a - w/2; X2 = a + w/2; Y1 = b - h/2; Y2 = b + h/2. For the second estimated code-card range: X1 = a - 0.9w; X2 = a + 0.9w; Y1 = b - 1.1h; Y2 = b + 1.25h.
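As an illustrative sketch of the two-level estimated code-card range and its fallback (detect_card_edges stands in for the edge-detection step and is assumed to return the card edge or None; the expansion factors follow the example above, and the sketch assumes the positive direction of the code is aligned with the image axes):

```python
# Sketch: try edge detection in the first estimated code-card range, and fall
# back to the larger second range if no card edge is identified.
def locate_card_region(image, a, b, w, h, detect_card_edges):
    first  = (a - 0.5 * w, b - 0.5 * h, a + 0.5 * w, b + 0.5 * h)
    second = (a - 0.9 * w, b - 1.1 * h, a + 0.9 * w, b + 1.25 * h)
    for x1, y1, x2, y2 in (first, second):
        x1, y1 = max(int(x1), 0), max(int(y1), 0)
        region = image[y1:int(y2), x1:int(x2)]
        if detect_card_edges(region) is not None:  # card edge found in this range
            return region
    return None
```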
In an alternative embodiment of the present specification, the edge detection method may specifically include a Canny edge detection method.
More specifically, the determining, by edge detection, the card image area of the target two-dimensional code for collection may specifically include: performing Gaussian filtering on the image to be identified to remove noise, so as to obtain a denoised image; calculating a gradient image and an angle image aiming at the denoised image; performing non-maximum suppression based on the gradient image and the angle image, and determining edge pixel points; and performing edge connection on the edge pixel points by using a double-threshold method to obtain the edge of the sign image area.
In the foregoing edge detection method, Gaussian filtering is adopted; specifically, a weighted average is computed over the gray values of the pixel point to be filtered and its neighborhood points, with weights generated by the Gaussian formula, so that high-frequency noise superimposed on the ideal image can be effectively filtered out.
Specifically, the performing gaussian filtering on the image to be identified to remove noise may specifically include: generating a Gaussian filter; and performing Gaussian filter smoothing on the image to be identified by using a Gaussian filter. Wherein, the process of generating the gaussian filter may comprise: determining a Gaussian standard deviation; determining the length of a Gaussian filter according to the Gaussian standard deviation; a one-dimensional gaussian filter is generated based on the gaussian filter length. In practical application, the gaussian filter adopted in the scheme can be one of the gaussian filters commonly used in the prior art.
In the foregoing edge detection method, the method for calculating the gradient image and the angle image may be implemented by a method in the prior art.
In the foregoing edge detection method, considering that the computed gradient image may have numerous problems such as coarse edge width, weak edge interference, etc., the local maximum value of the pixel point may be found by using non-maximum value suppression, and the gray value corresponding to the non-maximum value is set to 0, so as to reject most of the non-edge pixel points.
In the above-described edge detection method, a double-threshold method is employed when performing edge connection, in order to avoid the influence of false edges. Specifically, two thresholds are selected: points below the low threshold are regarded as false edges and set to 0, points above the high threshold are regarded as strong edges and set to 1, and the points in between need to be examined further. In practical application, the edges in the image can first be linked into contours according to the high threshold; when the end point of a contour is reached, points satisfying the low threshold can be sought among the 8 neighborhood points of the breakpoint, and new edges are then collected from these low-threshold points until the edges of the whole image are closed.
In practical applications, the edge detection method can be implemented in languages such as MATLAB, C++ or Python.
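For instance, a hedged Python sketch of this edge-detection pipeline could rely on OpenCV's built-in Gaussian filtering and Canny detector (which internally performs gradient computation, non-maximum suppression and double-threshold edge linking); the kernel size, sigma and thresholds below are illustrative assumptions.

```python
# Sketch of the Canny-style edge detection described above.
import cv2

def canny_edges(region_bgr, low_threshold=50, high_threshold=150):
    gray = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY)
    denoised = cv2.GaussianBlur(gray, (5, 5), 1.4)  # Gaussian filtering (denoising)
    # Gradient computation, non-maximum suppression and double-threshold
    # edge linking are performed inside cv2.Canny.
    return cv2.Canny(denoised, low_threshold, high_threshold)
```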
According to the above description, in a practical application scenario provided in the embodiment of the present disclosure, a flow chart of a method for identifying an aggregate checkout two-dimensional code is shown in fig. 4.
In fig. 4, step 401: the user terminal can acquire an image to be identified, which contains at least one code card image of the two-dimension code of collection through the camera;
step 402: for the image to be identified, the position and the code value of the two-dimensional code of collection can be identified;
Step 403: based on the image of the two-dimensional code, the positive direction of the two-dimensional code can be determined;
step 404: based on the positive direction of the two-dimension code, the code card area of the two-dimension code can be determined by combining edge detection;
step 405: for the image to be identified, optionally, color correction may be performed;
step 406: target color recognition can be performed on the coded card area in the color corrected image to be recognized, so that whether the coded card accords with a preset color display state can be judged;
step 407: for the image to be identified, on the other hand, OCR text extraction can be performed, and position matching can be performed with the code card area determined in step 404, so that whether the code card accords with the preset text display state can be judged;
step 408: based on the above-mentioned judgment result, a recognition result is obtained whether the sign board accords with a preset display state (including at least one of a preset color display state and a preset text display state), and further, a subsequent service processing logic is executed according to the recognition result.
It will be appreciated that fig. 4 is merely illustrative of one possible implementation of the various embodiments of the present disclosure, and may be implemented in a manner not exactly the same as previously described. For example, the steps may also be performed in a different order than that shown in fig. 4. For example, part of the steps in fig. 4 may be omitted. For example, steps not shown in fig. 4 may be newly added.
For example, the OCR text extraction may be performed on the whole image to be identified, or may be performed only on the code card area determined in step 404.
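For illustration, the overall flow of fig. 4 can be sketched in Python as follows; decode_qr, is_aggregate_checkout_code, estimate_code_card_region, correct_colors, has_target_color and has_target_text are hypothetical helper functions standing in for the steps described above, not functions defined by this disclosure.

```python
# High-level sketch of the flow in fig. 4; every helper called here is a
# hypothetical placeholder for the corresponding step described above.
def identify_aggregate_code_card(image):
    # Steps 401-402: locate the collection two-dimensional code and read its code value.
    code_value, code_position, orientation = decode_qr(image)
    if not is_aggregate_checkout_code(code_value):
        return None  # not an aggregate collection code, nothing more to check

    # Steps 403-404: use the code's positive direction plus edge detection
    # to delimit the code card area around the code.
    card_region = estimate_code_card_region(image, code_position, orientation)

    # Step 405 (optional): color correction, so background light does not
    # distort the target-color judgement.
    corrected = correct_colors(image)

    # Steps 406-407: check the color and text display states inside the area.
    color_ok = has_target_color(corrected, card_region)
    text_ok = has_target_text(image, card_region)

    # Step 408: the preset display state may include at least one of the two.
    return color_ok or text_ok
```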
In the prior art, when aggregate collection two-dimensional codes are identified, if color identification and OCR identification are carried out on the whole picture acquired by the user terminal, misrecognition is very likely to occur when multiple codes are present. For example, if the code card of the aggregate code has no target color or target text while another code card in the picture does have the target color or target text, misrecognition may result.
According to one or more schemes of the embodiments of the present disclosure, the boundary of the code card (e.g., a standing card, a hanging card, etc.) is constructed through color correction and code value recognition, specific texts within the boundary are recognized, and color detection is performed within the boundary on the color-corrected image, so that the accuracy of the code card recognition result for the aggregation code is improved as a whole.
Based on one or more schemes of the embodiments of the present specification, the problem of misrecognition easily caused by multiple codes in code card recognition scenarios is solved; the problem that color recognition is easily affected by background light in such scenarios, leading to misrecognition, is solved; and the accuracy of identifying the display state of code cards in complex scenes can be improved.
Based on the same idea, the embodiments of the present specification also provide an apparatus corresponding to the above method.
Fig. 5 is a schematic structural diagram of an identification device for an aggregated checkout two-dimensional code corresponding to fig. 2 according to an embodiment of the present disclosure. The device can be applied to a user terminal in actual application.
As shown in fig. 5, the apparatus may include:
the image acquisition module 502 is configured to acquire an image to be identified acquired by the user terminal; the image to be identified comprises a code card image of the target collection two-dimensional code;
the code value identifying module 504 is configured to identify the image to be identified, so as to obtain a code value of the target collection two-dimensional code;
the aggregate code judging module 506 is configured to judge, according to the code value, whether the target collection two-dimensional code is an aggregate collection two-dimensional code;
the code card image area determining module 508 is configured to determine, through edge detection, a code card image area of the target collection two-dimensional code in the image to be identified if the target collection two-dimensional code is an aggregate collection two-dimensional code;
the display state identification module 510 is configured to judge, based on the code card image area, whether the code card of the target collection two-dimensional code accords with a preset display state; the preset display state is used for prompting a user to pay by using a preset payment mode.
Based on the apparatus of fig. 5, the embodiments of the present specification also provide some specific implementations of the method, as described below.
Optionally, the display state identification module 510 may specifically include a color display state identification sub-module.
Specifically, the color display state identification sub-module may be configured to: determine the proportion of the pixel points of the target color among the pixel points within the code card image area; and if the proportion is greater than or equal to a preset proportion threshold value, determine that the code card of the target collection two-dimensional code accords with a preset display state.
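A minimal sketch of this proportion check is given below, assuming OpenCV and NumPy; the HSV range (an illustrative blue) and the proportion threshold are assumptions made for the example, since the disclosure does not fix concrete values.

```python
# Count the share of target-color pixels inside the code card image area;
# the HSV bounds and the threshold below are illustrative assumptions.
import cv2
import numpy as np

def matches_target_color(card_region_bgr, ratio_threshold=0.3):
    hsv = cv2.cvtColor(card_region_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([100, 80, 80])    # illustrative lower bound for blue
    upper = np.array([130, 255, 255])  # illustrative upper bound for blue
    mask = cv2.inRange(hsv, lower, upper)
    ratio = np.count_nonzero(mask) / mask.size
    return ratio >= ratio_threshold
```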
Optionally, the apparatus may further include a color correction module configured to: before the color display state identification sub-module determines the proportion of the pixel points of the target color among the pixel points within the code card image area, perform color correction on the image to be identified by using a preset color correction method to obtain a color-corrected image.
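The disclosure does not fix a particular color correction algorithm; as one possible stand-in for the preset color correction method, a gray-world white balance could be used, as in the sketch below (NumPy assumed).

```python
# Gray-world white balance: scale each channel so its mean matches the
# overall gray mean; offered only as an illustrative color correction.
import numpy as np

def gray_world_correction(image_bgr):
    img = image_bgr.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # per-channel mean (B, G, R)
    gains = channel_means.mean() / channel_means      # per-channel gain
    return np.clip(img * gains, 0, 255).astype(np.uint8)
```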
Optionally, the display state identification module 510 may specifically include a text display state identification sub-module.
Specifically, the text display state identification sub-module may be configured to: judge whether a target text exists in the code card image area; and if the target text exists in the code card image area, determine that the code card of the target collection two-dimensional code accords with a preset display state.
More specifically, the text display state identification sub-module may be configured to: perform optical character recognition on the code card image area to obtain a text recognition result; judge whether the text recognition result is consistent with the target text; and if the text recognition result is consistent with the target text, determine that the code card of the target collection two-dimensional code accords with a preset display state.
More specifically, the text display state identification sub-module may alternatively be configured to: perform optical character recognition on the image to be identified to obtain a text recognition result; judge whether the text recognition result is consistent with the target text and is positioned in the code card image area; and if the text recognition result is consistent with the target text and is positioned in the code card image area, determine that the code card of the target collection two-dimensional code accords with a preset display state.
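The variant that runs OCR over the whole image and then filters by position could look like the sketch below, assuming the pytesseract wrapper for Tesseract; the (x, y, w, h) form of the code card image area, the language setting, and the matching rule are assumptions made for illustration.

```python
# OCR the whole image, then keep only words whose center falls inside the
# code card image area and that contain the target text.
import pytesseract
from pytesseract import Output

def has_target_text_in_region(image_bgr, card_region, target_text):
    x, y, w, h = card_region
    data = pytesseract.image_to_data(image_bgr, lang="chi_sim",
                                     output_type=Output.DICT)
    for text, left, top, width, height in zip(
            data["text"], data["left"], data["top"],
            data["width"], data["height"]):
        if target_text not in text:
            continue
        cx, cy = left + width / 2, top + height / 2   # word center
        if x <= cx <= x + w and y <= cy <= y + h:
            return True
    return False
```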
Optionally, the code value identifying module 504 may be further configured to: identify code position information of the target collection two-dimensional code contained in the image to be identified; and determine the positive direction of the target collection two-dimensional code. The code card image area determining module 508 may be further configured to: determine an estimated code card range based on the code position information and the positive direction; and perform edge detection within the estimated code card range, and determine the edge of the code card image area of the code card to which the target collection two-dimensional code belongs, so as to obtain the code card image area.
The determining the positive direction of the target collection two-dimensional code may specifically include: identifying a code image area corresponding to the target collection two-dimensional code, and determining the position of a specific positioning point of the target collection two-dimensional code; and determining the positive direction of the target collection two-dimensional code based on the position of the specific positioning point.
The determining the estimated code card range based on the code position information and the positive direction may specifically include: determining the center point coordinates of the target collection two-dimensional code according to the code position information; determining the length direction and the width direction of the target collection two-dimensional code based on the positive direction; and starting from the center point coordinates, expanding by a preset length value in the length direction and by a preset width value in the width direction, so as to obtain the estimated code card range.
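The expansion from the center point could be sketched as below, with NumPy; the convention that the positive direction is given as a unit vector and the concrete preset length/width values are assumptions made for the example.

```python
# Expand from the code's center along the positive (length) direction and
# its perpendicular (width) direction to get the estimated code card range.
import numpy as np

def estimate_card_range(center, positive_direction, preset_length, preset_width):
    up = np.asarray(positive_direction, dtype=float)
    up /= np.linalg.norm(up)                  # length direction (unit vector)
    right = np.array([up[1], -up[0]])         # width direction (perpendicular)
    c = np.asarray(center, dtype=float)
    corners = [
        c + preset_length * up + preset_width * right,
        c + preset_length * up - preset_width * right,
        c - preset_length * up - preset_width * right,
        c - preset_length * up + preset_width * right,
    ]
    return np.array(corners)                  # four corners of the range
```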
Optionally, the code card image area determining module 508 may be further configured to: determine a first estimated code card range and a second estimated code card range based on the positive direction and the code position information, the second estimated code card range being larger than and comprising the first estimated code card range; perform first edge detection within the first estimated code card range; and if the edge of the code card image area of the code card to which the target collection two-dimensional code belongs cannot be identified, perform second edge detection within the second estimated code card range to continue identifying the edge of the code card image area, so as to obtain the code card image area.
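The coarse-to-fine fallback could be organised as in the sketch below, reusing the estimate_card_range sketch above; find_card_edges is a hypothetical helper that runs edge detection inside a given range, and the two sets of preset values are illustrative only.

```python
# Try a smaller estimated range first; fall back to a larger range that
# contains it if no code card edge is found in the first pass.
def locate_card_region(image, code_position, positive_direction):
    first_range = estimate_card_range(code_position, positive_direction, 300, 200)
    second_range = estimate_card_range(code_position, positive_direction, 600, 400)

    edges = find_card_edges(image, first_range)        # first edge detection
    if edges is None:                                   # nothing found: widen
        edges = find_card_edges(image, second_range)    # second edge detection
    return edges
```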
Optionally, the code card image area determining module 508 may be specifically configured to: perform Gaussian filtering on the image to be identified to remove noise, so as to obtain a denoised image; calculate a gradient image and an angle image for the denoised image; perform non-maximum suppression based on the gradient image and the angle image to determine edge pixel points; and perform edge connection on the edge pixel points by using a double-threshold method to obtain the edge of the code card image area.
It will be appreciated that each of the modules described above refers to a computer program or program segment for performing one or more particular functions. Furthermore, the distinction of the above-described modules does not represent that the actual program code must also be separate.
Based on the same idea, the embodiments of the present specification also provide a device corresponding to the above method.
Fig. 6 is a schematic structural diagram of an identification device for an aggregated checkout two-dimensional code corresponding to fig. 2 according to an embodiment of the present disclosure. The device may be a user terminal when actually applied.
As shown in fig. 6, the apparatus 600 may include:
at least one processor 610; the method comprises the steps of,
a memory 630 communicatively coupled to the at least one processor; wherein,
The memory 630 stores instructions 620 executable by the at least one processor 610 to enable the at least one processor 610 to:
acquiring an image to be identified acquired by a user terminal; the image to be identified comprises a code card image of the target collection two-dimensional code;
identifying the image to be identified to obtain a code value of the target collection two-dimensional code;
judging whether the target collection two-dimensional code is an aggregation collection two-dimensional code according to the code value;
if the target collection two-dimensional code is an aggregate collection two-dimensional code, determining a code card image area of the target collection two-dimensional code in the image to be identified through edge detection;
judging whether the code card of the target collection two-dimensional code accords with a preset display state or not based on the code card image area; the preset display state is used for prompting a user to pay by using a preset payment mode.
Based on the same idea, the embodiments of the present specification also provide a computer readable medium corresponding to the above method. The computer readable medium stores computer readable instructions that are executable by a processor to implement the following method:
Acquiring an image to be identified acquired by a user terminal; the image to be identified comprises a code card image of the target collection two-dimensional code;
identifying the image to be identified to obtain a code value of the target collection two-dimensional code;
judging whether the target collection two-dimensional code is an aggregation collection two-dimensional code according to the code value;
if the target collection two-dimensional code is an aggregate collection two-dimensional code, determining a code card image area of the target collection two-dimensional code in the image to be identified through edge detection;
judging whether the code card of the target collection two-dimensional code accords with a preset display state or not based on the code card image area; the preset display state is used for prompting a user to pay by using a preset payment mode.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In this specification, the embodiments are described in a progressive manner; identical and similar parts among the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. In particular, since the apparatus and device embodiments are substantially similar to the method embodiments, their description is relatively simple, and for relevant parts reference may be made to the description of the method embodiments. The apparatus, the device, and the method provided in the embodiments of the present disclosure correspond to each other; therefore, the apparatus and the device also have beneficial technical effects similar to those of the corresponding method. Since the beneficial technical effects of the method have been described in detail above, those of the corresponding apparatus and device are not repeated here.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement in hardware (for example, an improvement to a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement to a method flow). However, with the development of technology, many improvements of method flows today can be regarded as direct improvements of hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be implemented with a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD) (e.g., a field programmable gate array (Field Programmable Gate Array, FPGA)) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a single PLD, without needing a chip manufacturer to design and fabricate a dedicated integrated circuit chip. Moreover, instead of manually making integrated circuit chips, such programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development, and the original code before compiling must also be written in a particular programming language, called a hardware description language (Hardware Description Language, HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), among which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently most commonly used. It should also be clear to those skilled in the art that a hardware circuit implementing a logic method flow can be readily obtained merely by slightly logically programming the method flow using the above hardware description languages and programming it into an integrated circuit.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer readable medium storing computer readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller. Examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art will also appreciate that, in addition to implementing the controller purely in computer readable program code, it is entirely possible to logically program the method steps so that the controller achieves the same functionality in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for performing various functions may also be regarded as structures within the hardware component. Or even the means for performing various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each element may be implemented in the same piece or pieces of software and/or hardware when implementing the present application.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory, random access memory (RAM), and/or non-volatile memory in computer readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer readable medium.
Computer readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, computer readable media do not include transitory computer readable media (transitory media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.

Claims (13)

1. An identification method for an aggregate checkout two-dimensional code, comprising:
acquiring an image to be identified acquired by a user terminal; the image to be identified comprises a code card image of the target collection two-dimensional code;
identifying the image to be identified to obtain a code value of the target collection two-dimensional code;
judging whether the target collection two-dimensional code is an aggregation collection two-dimensional code according to the code value;
if the target collection two-dimensional code is an aggregate collection two-dimensional code, determining a code card image area of the target collection two-dimensional code in the image to be identified through edge detection;
judging whether the code card of the target collection two-dimensional code accords with a preset display state or not based on the code card image area; the preset display state is used for prompting a user to pay by using a preset payment mode.
2. The method of claim 1, wherein the judging, based on the code card image area, whether the code card of the target collection two-dimensional code accords with a preset display state specifically comprises:
determining the proportion of the pixel points of the target color among the pixel points within the code card image area;
and if the proportion is greater than or equal to a preset proportion threshold value, determining that the code card of the target collection two-dimensional code accords with a preset display state.
3. The method of claim 2, further comprising, before determining the proportion of the pixel points of the target color among the pixel points within the code card image area:
and carrying out color correction on the image to be identified by adopting a preset color correction method to obtain a color corrected image.
4. The method of claim 1, wherein the judging, based on the code card image area, whether the code card of the target collection two-dimensional code accords with a preset display state specifically comprises:
judging whether a target text exists in the code card image area;
if the target text exists in the code card image area, determining that the code card of the target collection two-dimensional code accords with a preset display state.
5. The method of claim 4, wherein the judging, based on the code card image area, whether the code card of the target collection two-dimensional code accords with a preset display state specifically comprises:
performing optical character recognition on the code card image area to obtain a text recognition result;
judging whether the text recognition result is consistent with the target text;
if the text recognition result is consistent with the target text, determining that the code card of the target collection two-dimensional code accords with a preset display state.
6. The method of claim 4, wherein the judging, based on the code card image area, whether the code card of the target collection two-dimensional code accords with a preset display state specifically comprises:
performing optical character recognition on the image to be recognized to obtain a text recognition result;
judging whether the text recognition result is consistent with the target text and is positioned in the code card image area;
and if the text recognition result is consistent with the target text and is positioned in the code card image area, determining that the code card of the target collection two-dimensional code accords with a preset display state.
7. The method of claim 1, further comprising, before determining the code card image area of the target collection two-dimensional code through edge detection:
identifying code position information of a target collection two-dimensional code contained in the image to be identified;
determining the positive direction of the target collection two-dimensional code;
the determining the code card image area of the target collection two-dimensional code through edge detection specifically comprises:
determining an estimated code card range based on the code position information and the positive direction;
and performing edge detection within the estimated code card range, and determining the edge of the code card image area of the code card to which the target collection two-dimensional code belongs, so as to obtain the code card image area.
8. The method of claim 7, wherein the determining the positive direction of the target collection two-dimensional code specifically comprises:
identifying a code image area corresponding to the target collection two-dimensional code, and determining the position of a specific positioning point of the target collection two-dimensional code;
and determining the positive direction of the target collection two-dimensional code based on the position of the specific positioning point.
9. The method of claim 7, wherein the determining the estimated code card range based on the code position information and the positive direction specifically comprises:
according to the code position information, determining the center point coordinates of the target collection two-dimensional code;
determining the length direction and the width direction of the target collection two-dimensional code based on the positive direction;
and starting from the center point coordinates, expanding by a preset length value in the length direction and by a preset width value in the width direction, so as to obtain the estimated code card range.
10. The method of claim 1, further comprising, before determining the code card image area of the target collection two-dimensional code through edge detection:
Identifying code position information of a target collection two-dimensional code contained in the image to be identified;
determining the positive direction of the target collection two-dimensional code;
the determining the code card image area of the target collection two-dimensional code through edge detection specifically comprises:
determining a first estimated code card range and a second estimated code card range based on the positive direction and the code position information, wherein the second estimated code card range is larger than and comprises the first estimated code card range;
and performing first edge detection within the first estimated code card range; if the edge of the code card image area of the code card to which the target collection two-dimensional code belongs cannot be identified, performing second edge detection within the second estimated code card range to continue identifying the edge of the code card image area of the code card to which the target collection two-dimensional code belongs, so as to obtain the code card image area.
11. The method of claim 1, wherein the determining the code card image area of the target collection two-dimensional code through edge detection specifically comprises:
performing Gaussian filtering on the image to be identified to remove noise, so as to obtain a denoised image;
calculating a gradient image and an angle image for the denoised image;
performing non-maximum suppression based on the gradient image and the angle image to determine edge pixel points;
and performing edge connection on the edge pixel points by using a double-threshold method to obtain the edge of the code card image area.
12. An identification device for an aggregated checkout two-dimensional code, comprising:
the image acquisition module is used for acquiring an image to be identified acquired by the user terminal; the image to be identified comprises a code card image of the target collection two-dimensional code;
the code value identification module is used for identifying the image to be identified to obtain the code value of the target collection two-dimensional code;
the aggregation code judging module is used for judging whether the target collection two-dimensional code is an aggregation collection two-dimensional code according to the code value;
the code card image area determining module is used for determining, through edge detection, a code card image area of the target collection two-dimensional code in the image to be identified if the target collection two-dimensional code is an aggregation collection two-dimensional code;
the display state identification module is used for judging whether the code card of the target collection two-dimensional code accords with a preset display state or not based on the code card image area; the preset display state is used for prompting a user to pay by using a preset payment mode.
13. An identification device for an aggregated checkout two-dimensional code, comprising:
at least one processor; the method comprises the steps of,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to:
acquiring an image to be identified; the image to be identified comprises a code card image of the target collection two-dimensional code;
identifying the image to be identified to obtain a code value of the target collection two-dimensional code;
judging whether the target collection two-dimensional code is an aggregation collection two-dimensional code according to the code value;
if the target collection two-dimensional code is an aggregate collection two-dimensional code, determining a code card image area of the target collection two-dimensional code in the image to be identified through edge detection;
judging whether the code card of the target collection two-dimensional code accords with a preset display state or not based on the code card image area; the preset display state is used for prompting a user to pay by using a preset payment mode.
CN202310771222.2A 2023-06-27 2023-06-27 Identification method, device and equipment for aggregate collection two-dimensional code Pending CN116757686A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310771222.2A CN116757686A (en) 2023-06-27 2023-06-27 Identification method, device and equipment for aggregate collection two-dimensional code

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310771222.2A CN116757686A (en) 2023-06-27 2023-06-27 Identification method, device and equipment for aggregate collection two-dimensional code

Publications (1)

Publication Number Publication Date
CN116757686A true CN116757686A (en) 2023-09-15

Family

ID=87947675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310771222.2A Pending CN116757686A (en) 2023-06-27 2023-06-27 Identification method, device and equipment for aggregate collection two-dimensional code

Country Status (1)

Country Link
CN (1) CN116757686A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination