CN115130491A - Automatic code scanning method and terminal - Google Patents


Info

Publication number
CN115130491A
Authority
CN
China
Prior art keywords
terminal
dimensional code
code
preset
target
Prior art date
Legal status
Granted
Application number
CN202211038629.6A
Other languages
Chinese (zh)
Other versions
CN115130491B (en)
Inventor
颜应华
周俊伟
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202211038629.6A
Publication of CN115130491A
Application granted
Publication of CN115130491B
Active legal status
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1408Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/14172D bar codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K19/00Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K19/06Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K19/06009Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking
    • G06K19/06037Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking multi-dimensional coding

Abstract

The embodiment of the application provides an automatic code scanning method and a terminal. In the method, after the terminal detects that the user intends to scan a code, it can automatically start the TOF camera to acquire a first grayscale image and a first depth image, and determine, based on the first grayscale image, whether a complete two-dimensional code is included. If the two-dimensional code is complete, the terminal may determine, based on the first depth image, whether the distance from the complete two-dimensional code to the terminal is less than a preset threshold. When the distance is smaller than the preset threshold, the terminal may display a first interface, where the first interface includes a first indicator preset by the user. The first indicator may be used to trigger opening of a code scanning function in the application corresponding to the first indicator. Subsequently, the terminal may scan the first grayscale image by using the code scanning function and display an interface after the code scanning is completed, and the user may then perform a corresponding operation (for example, payment) through the terminal based on that interface.

Description

Automatic code scanning method and terminal
Technical Field
The application relates to the field of terminals, in particular to an automatic code scanning method and a terminal.
Background
Payment by code scanning has become commonplace. When paying, a user either uses the terminal to scan a payment code, or the terminal displays a collection code or a payment code so that another code scanning device can scan it to complete the payment. In general, if a certain two-dimensional code (denoted as two-dimensional code A) needs to be scanned, the terminal first needs to open the corresponding payment application, then open the code scanning function provided by the payment application, and then scan two-dimensional code A through the code scanning function. If payment is made by showing a collection code or a payment code, the terminal first needs to open the corresponding payment application and then display the collection code or the payment code, so that another code scanning device can scan it to complete the payment.
Therefore, the code scanning process is relatively complex, and code scanning can be achieved only after the user holds the terminal and performs multiple operations. Code scanning here includes both the terminal scanning a two-dimensional code, and the terminal displaying a two-dimensional code so that another code scanning device can scan it.
How to improve code scanning efficiency and simplify the code scanning process is therefore a direction worth researching.
Disclosure of Invention
The application provides an automatic code scanning method and a terminal, after the terminal detects that a user intends to scan a code (actively scan the code), an indicator or a two-dimensional code preset by the user can be displayed, and the user can quickly scan the code by selecting the indicator or the two-dimensional code.
In a first aspect, the present application provides an automatic code scanning method, including: when the terminal detects that the terminal generates a preset gesture, the terminal starts a target camera to obtain a first depth image and a first grayscale image; the first grayscale image includes a photographed object, and the first depth image is used for indicating the distance from a three-dimensional point on the photographed object to the terminal; in the case that the terminal determines that a two-dimensional code is included in the first grayscale image, the terminal takes the two-dimensional code included in the first grayscale image as a target two-dimensional code; when the terminal determines that the target two-dimensional code is a payment-type two-dimensional code and determines, based on the first depth image, that the distance between the target two-dimensional code and the terminal is smaller than a first threshold, or when the terminal determines that the target two-dimensional code is not a payment-type two-dimensional code and determines, based on the first depth image, that the distance between the target two-dimensional code and the terminal is smaller than a second threshold, the terminal displays a first interface; the first interface includes one or more first indicators, where one first indicator corresponds to a code scanning function in one application, and the first threshold is smaller than the second threshold; in response to an operation of selecting a target first indicator in the first interface, the terminal enables the code scanning function corresponding to the target first indicator; and the terminal scans the target two-dimensional code based on the first grayscale image.
In the above embodiment, after the terminal detects an operation indicating that the user intends to scan a code (active code scanning), a first indicator preset by the user may be displayed, and the user may start the code scanning function in different applications by selecting different indicators, so as to achieve quick automatic code scanning.
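Purely as an illustration of the flow described in the first aspect, and not as the claimed implementation, the sketch below walks through the same decision steps in Python; the callback parameters, the camera handling, and the threshold values are assumptions introduced here for readability.

```python
import cv2

# Assumed threshold values; the application only requires that the first
# threshold (payment-type codes) is smaller than the second threshold.
FIRST_THRESHOLD_M = 0.4
SECOND_THRESHOLD_M = 1.0

def handle_active_scan(gray, estimate_distance, is_payment_code,
                       show_first_interface, run_scan_function):
    """Sketch of the first-aspect flow once the preset gesture has started the
    target camera and the first grayscale image has been captured. All callback
    parameters are assumptions standing in for terminal-side functionality."""
    detector = cv2.QRCodeDetector()
    data, corners, _ = detector.detectAndDecode(gray)
    if corners is None:
        return False  # no two-dimensional code: case 2 (passive scanning) is handled elsewhere

    # Distance from the target code to the terminal, derived from the first depth image.
    distance = estimate_distance(corners)

    threshold = FIRST_THRESHOLD_M if is_payment_code(data) else SECOND_THRESHOLD_M
    if distance >= threshold:
        return False  # too far away; the prompt handling is sketched further below

    indicator = show_first_interface()       # first interface with one or more preset first indicators
    if indicator is not None:
        run_scan_function(indicator, gray)   # enable that app's code scanning function and decode the image
    return True
```

In this sketch a payment-type code only triggers the first interface at close range, while other codes are accepted up to the larger second threshold, matching the condition above that the first threshold is smaller than the second threshold.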
In combination with the first aspect, in some embodiments, the method further comprises: the terminal displays a second interface under the condition that the terminal determines that the first gray-scale image does not include the two-dimensional code and determines that the first gray-scale image includes the code scanning device; the second interface comprises one or more second indicators or one or more first preset two-dimensional codes with a first size; wherein, a second indicator corresponds to a second preset two-dimensional code in an application; under the condition that one or more second indicators are included in the second interface, in response to the operation of selecting a target second indicator in the second interface, the terminal displays a second preset two-dimensional code corresponding to the target second indicator so that the code scanning equipment scans the code based on the second preset two-dimensional code; or, in the case that the second interface includes one or more first preset two-dimensional codes, in response to an operation of selecting a target first preset two-dimensional code in the second interface, the terminal displays the target first preset two-dimensional code in a second size, so that the code scanning device scans codes based on the target first preset two-dimensional code; wherein the second size is larger than the first size.
In the above embodiment, after the terminal detects an operation indicating that the user intends to scan a code (active code scanning), a first indicator preset by the user may be displayed, and the user may start the code scanning function in different applications by selecting different indicators to scan a code quickly. In addition, after the terminal detects an operation indicating that the user intends to scan a code (passive code scanning), a second indicator preset by the user or a first preset two-dimensional code preset by the user may be displayed. The user may select different second indicators to make the terminal display the corresponding second preset two-dimensional code, or select different first preset two-dimensional codes to make the terminal display the selected first preset two-dimensional code in the second size. That is, the terminal can quickly display a preset two-dimensional code, so that another device capable of scanning codes can scan the preset two-dimensional code to complete corresponding operations, such as payment and collection operations.
With reference to the first aspect, in some embodiments, after the terminal takes the two-dimensional code included in the first grayscale image as the target two-dimensional code, before the terminal determines that the target two-dimensional code is the payment-type two-dimensional code, the method further includes: the terminal determines whether the target two-dimensional code is a complete two-dimensional code; under the condition that the target two-dimensional code is determined to be a complete two-dimensional code, the terminal determines the type of the target two-dimensional code; under the condition that the target two-dimensional code is determined not to be the complete two-dimensional code, the terminal generates first prompt information to prompt a user to align the two-dimensional code; the first prompt message is one of first vibration or first prompt sound; after the terminal generates the first prompt message, the terminal acquires the first depth image and the first gray image again based on the target camera.
In the above embodiment, the terminal can scan the code successfully only when the two-dimensional code included in the first grayscale image is a complete two-dimensional code. Therefore, detecting whether the two-dimensional code is complete before scanning helps determine whether the code can be scanned, and when the two-dimensional code is not complete, the terminal generates the first prompt information to prompt the user to aim at the two-dimensional code so that a complete two-dimensional code can be acquired. This may improve the probability of successful code scanning.
In combination with the first aspect, in some embodiments, the method further comprises: when the terminal determines that the target two-dimensional code is a payment-type two-dimensional code and determines that the distance between the target two-dimensional code and the terminal is greater than or equal to a first threshold value based on the first depth image, the terminal generates second prompt information to prompt a user to approach the two-dimensional code; the second prompt message is one of second vibration or second prompt sound, and the second prompt message is different from the first prompt message; after the terminal generates the second prompt message, the terminal acquires the first depth image and the first gray image again based on the target camera.
In the above embodiment, when the distance between the target two-dimensional code and the terminal is greater than or equal to the first threshold, the target two-dimensional code occupies only a small part of the first grayscale image and scanning based on the first grayscale image may fail, so second prompt information different from the first prompt information may be generated, so that the user moves closer to the two-dimensional code and the terminal can re-acquire the first grayscale image at a shorter distance. The first prompt information and the second prompt information are different so that different prompts convey different information to the user, and the user can decide how to move the terminal according to the type of prompt. For example, when the terminal generates the first prompt information, the user, after perceiving it, can move the terminal left or right to aim at the two-dimensional code; when the terminal generates the second prompt information, the user, after perceiving it, can move the terminal forward to get closer to the two-dimensional code.
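The completeness check and the two prompts can be made concrete with a short sketch. It uses OpenCV's QR detector to locate the code's corners and treats the code as complete when all corners lie inside the frame; the margin, the threshold value, and the prompt callback are assumptions for illustration and are not taken from the application.

```python
import cv2

EDGE_MARGIN_PX = 4        # assumed margin for deciding that the code is fully inside the frame
FIRST_THRESHOLD_M = 0.4   # assumed first threshold for payment-type codes

def check_code_and_prompt(gray, distance_m, issue_prompt):
    """Return True if scanning can proceed; otherwise emit the appropriate prompt
    so the terminal re-acquires the first grayscale and depth images afterwards."""
    found, corners = cv2.QRCodeDetector().detect(gray)
    if not found or corners is None:
        return False

    h, w = gray.shape[:2]
    pts = corners.reshape(-1, 2)
    inside_x = (pts[:, 0] > EDGE_MARGIN_PX).all() and (pts[:, 0] < w - EDGE_MARGIN_PX).all()
    inside_y = (pts[:, 1] > EDGE_MARGIN_PX).all() and (pts[:, 1] < h - EDGE_MARGIN_PX).all()
    if not (inside_x and inside_y):
        issue_prompt("first")    # first prompt (e.g. first vibration): aim at the code
        return False

    if distance_m >= FIRST_THRESHOLD_M:
        issue_prompt("second")   # second prompt (e.g. second vibration): move closer
        return False
    return True
```

Because the two prompts are distinct, the user can tell from the prompt alone whether to re-aim the terminal or move it closer, which is the point made in the paragraph above.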
In combination with the first aspect, in some embodiments, the preset gesture includes one of a first motion gesture or a second motion gesture; the first motion gesture is vertical movement with an upward flip; the second motion gesture is vertical movement, or vertical movement without an upward flip.
With reference to the first aspect, in some embodiments, the terminal determines that the target camera is a rear camera if the terminal generates the first motion gesture.
In the above embodiment, the first motion gesture is vertical movement with an upward flip, and the terminal generates the first motion gesture when the user intends to actively scan a code. For example, the user can raise the hand holding the terminal and then flip the terminal upward to actively scan a code, so the terminal can start the rear TOF camera to acquire the first grayscale image and the first depth image when the terminal generates the first motion gesture.
With reference to the first aspect, in some embodiments, in a case that the terminal determines that the terminal generates the second motion gesture, the target camera is a front camera.
In the above embodiment, the terminal generates the second motion gesture when the user intends to passively scan a code. For example, the user can raise the hand holding the terminal and flip the terminal downward to passively scan a code, so the terminal can start the front TOF camera to acquire the first grayscale image and the first depth image when the terminal generates the second motion gesture.
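How the two motion gestures could be separated and mapped to a camera is sketched below. The velocity threshold, the way the vertical speed and the flip are obtained, and the camera identifiers are all assumptions; they only illustrate the rule "first motion gesture, rear camera; second motion gesture, front camera".

```python
VERTICAL_SPEED_MIN = 0.3  # assumed m/s threshold for counting the motion as "vertical movement"

def classify_motion_gesture(vertical_speed, flipped_up):
    """vertical_speed: signed speed along the vertical axis, assumed to be derived from
    the accelerometer; flipped_up: whether the screen was rotated to (nearly) vertical
    and ended up facing the user, assumed to come from the gravity/rotation sensors."""
    if abs(vertical_speed) < VERTICAL_SPEED_MIN:
        return None                       # no preset gesture detected
    return "first" if flipped_up else "second"

def select_target_camera(gesture):
    # First motion gesture -> rear TOF camera (active code scanning);
    # second motion gesture -> front TOF camera (passive code scanning).
    return {"first": "rear_tof_camera", "second": "front_tof_camera"}.get(gesture)
```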
In combination with the first aspect, in some embodiments, the method further comprises: the first interface comprises a first control; detecting a first operation on the first control; and responding to the first operation, and closing the first interface by the terminal.
In the above embodiment, if the user does not intend to actively scan the code, the first interface may be closed, thereby improving user experience.
In combination with the first aspect, in some embodiments, the method further comprises: the second interface comprises a second control; detecting a second operation on the second control; and responding to the second operation, and closing the second interface by the terminal.
In the above embodiment, if the user does not intend to passively scan the code, the second interface may be closed, so as to improve the user experience.
In combination with the first aspect, in some embodiments, the method further comprises: in a case where the terminal determines that the two-dimensional code is not included in the first grayscale image and determines that the scannable code device is not included in the first grayscale image, the terminal deletes the first depth image and the first grayscale image.
With reference to the first aspect, in some embodiments, the first indicator, the second indicator and the first preset two-dimensional code are preset.
With reference to the first aspect, in some embodiments, the first preset two-dimensional code has no validity period, the second preset two-dimensional code is time-limited, and the second preset two-dimensional code becomes invalid after a third threshold is exceeded.
In a second aspect, the present application provides a terminal, comprising: one or more processors and memory; the memory is coupled to the one or more processors and is configured to store computer program code comprising computer instructions that are invoked by the one or more processors to cause the terminal to perform the method as described in the first aspect or any one of the embodiments of the first aspect.
In the above embodiment, after the terminal detects an operation indicating that the user intends to scan a code (active code scanning), a first indicator preset by the user may be displayed, and the user may start the code scanning function in different applications by selecting different indicators to scan a code quickly. In addition, after the terminal detects an operation indicating that the user intends to scan a code (passive code scanning), a second indicator preset by the user or a first preset two-dimensional code preset by the user may be displayed. The user may select different second indicators to make the terminal display the corresponding second preset two-dimensional code, or select different first preset two-dimensional codes to make the terminal display the selected first preset two-dimensional code in the second size. That is, the terminal can quickly display a preset two-dimensional code, so that another device capable of scanning codes can scan the preset two-dimensional code to complete corresponding operations, such as payment and collection operations.
In a third aspect, an embodiment of the present application provides a chip system, which is applied to a terminal, and the chip system includes one or more processors, and the processor is configured to invoke a computer instruction to cause the terminal to perform a method as described in the first aspect or any one implementation manner of the first aspect.
In the above embodiment, after the terminal detects an operation indicating that the user intends to scan a code (active code scanning), a first indicator preset by the user may be displayed, and the user may start the code scanning function in different applications by selecting different indicators to scan a code quickly. In addition, after the terminal detects an operation indicating that the user intends to scan a code (passive code scanning), a second indicator preset by the user or a first preset two-dimensional code preset by the user may be displayed. The user may select different second indicators to make the terminal display the corresponding second preset two-dimensional code, or select different first preset two-dimensional codes to make the terminal display the selected first preset two-dimensional code in the second size. That is, the terminal can quickly display a preset two-dimensional code, so that another device capable of scanning codes can scan the preset two-dimensional code to complete corresponding operations, such as payment and collection operations.
In a fourth aspect, embodiments of the present application provide a computer program product containing instructions, which when run on a terminal, cause the terminal to perform the method as described in the first aspect or any one of the embodiments of the first aspect.
In the above embodiment, after the terminal detects an operation indicating that the user intends to scan a code (active code scanning), a first indicator preset by the user may be displayed, and the user may start the code scanning function in different applications by selecting different indicators to realize quick code scanning. In addition, after the terminal detects an operation indicating that the user intends to scan a code (passive code scanning), a second indicator preset by the user or a first preset two-dimensional code preset by the user may be displayed. The user may select different second indicators to make the terminal display the corresponding second preset two-dimensional code, or select different first preset two-dimensional codes to make the terminal display the selected first preset two-dimensional code in the second size. That is, the terminal can quickly display a preset two-dimensional code, so that another device capable of scanning codes can scan the preset two-dimensional code to complete corresponding operations, such as payment and collection operations.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, which includes instructions that, when executed on a terminal, cause the terminal to perform the method described in the first aspect or any one of the implementation manners of the first aspect.
In the above embodiment, after the terminal detects an operation indicating that the user intends to scan a code (active code scanning), a first indicator preset by the user may be displayed, and the user may start the code scanning function in different applications by selecting different indicators to realize quick code scanning. In addition, after the terminal detects an operation indicating that the user intends to scan a code (passive code scanning), a second indicator preset by the user or a first preset two-dimensional code preset by the user may be displayed. The user may select different second indicators to make the terminal display the corresponding second preset two-dimensional code, or select different first preset two-dimensional codes to make the terminal display the selected first preset two-dimensional code in the second size. That is, the terminal can quickly display a preset two-dimensional code, so that another device capable of scanning codes can scan the preset two-dimensional code to complete corresponding operations, such as payment and collection operations.
Drawings
FIG. 1 is a schematic flow chart illustrating a scheme in which a terminal scans codes;
FIGS. 2A and 2B are a set of schematic diagrams of the automatic code scanning method in case 1;
FIGS. 3A-3J are schematic diagrams illustrating a user presetting a first indicator;
FIG. 4 is a schematic diagram illustrating that the terminal completes fast active code scanning based on the preset first indicator;
FIGS. 5A-5L are schematic diagrams illustrating a user presetting a second indicator and a first preset two-dimensional code;
FIGS. 6A and 6B are schematic diagrams illustrating that the terminal completes fast passive code scanning based on the preset second indicator and the first preset two-dimensional code;
FIG. 7 is a flow chart illustrating an exemplary process involved in an automatic code scanning method in an embodiment of the present application;
FIGS. 8A to 8D are schematic diagrams illustrating the terminal setting a first vibration and a second vibration;
FIG. 9 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application. As used in the specification of the present application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the listed items.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature, and in the description of embodiments of the application, unless stated otherwise, "plurality" means two or more.
In one scheme, the process in which the terminal scans a two-dimensional code to perform a relevant operation (for example, payment) may be described as follows: the terminal first opens the payment-type application, then opens the code scanning function provided by the payment-type application, and then scans two-dimensional code A through the code scanning function.
Fig. 1 shows a schematic flow chart of a scheme in which a terminal performs code scanning.
The process of scanning the code by the terminal may refer to the following description of step 1 to step 5.
Step 1: after the code scanning operation is detected, in response to the operation, the terminal opens the camera to collect an image.
The code scanning operation may be an operation in which a user clicks a "code scanning" control in a payment-type application. The code scanning control is used for triggering the terminal to open the camera to collect images, identifying the collected images and determining the two-dimensional codes included in the images. The code scanning control is also regarded as a code scanning function entrance of the terminal.
After detecting the operation on the "code scanning" control, the terminal can open a camera (usually a rear camera) to acquire an image in response to the operation.
Step 2: the terminal determines whether the content in the image includes two-dimensional code features.
The two-dimensional code feature may be used to preliminarily identify whether the content in the image includes a two-dimensional code. The two-dimensional code features may include icon information and/or digital information in a two-dimensional code picture.
In the case where the terminal determines that the content in the image includes the two-dimensional code feature, the terminal may perform step 3, and further determine whether the image includes the two-dimensional code in step 3.
In the case where the terminal determines that the content in the image does not include the two-dimensional code feature, the terminal may determine that the two-dimensional code is not included in the image, and perform the following step 5 to notify the user that the code scanning has failed.
Step 3: the terminal determines whether the image includes a two-dimensional code.
The terminal performs image recognition on the image to further determine whether a two-dimensional code is included therein.
In case it is determined that the two-dimensional code is included therein, the terminal may perform step 4, and the terminal may scan the two-dimensional code in the image in step 4.
In case it is determined that the two-dimensional code is not included in the image, the terminal may perform step 5 to notify the user that the code scanning has failed.
Step 4: the terminal calls a corresponding interface to scan the two-dimensional code in the image, and displays the interface after the code scanning succeeds.
The terminal can call the corresponding interface to scan the two-dimensional code in the image, and the interface after the code scanning is successful is displayed to execute other operations. For example, when the two-dimensional code is a payment-type two-dimensional code, the terminal may display a payment interface, and may complete payment operations through the interface.
Step 5: the terminal determines that the code scanning fails and displays an interface after the code scanning failure.
In the case where the two-dimensional code is not included in the image, the terminal may display an interface after the code scanning failure to notify the user of the code scanning failure.
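Steps 1 to 5 amount to a plain detect-then-decode flow. The sketch below reproduces that baseline flow with OpenCV purely for comparison with the method proposed later; it is not the method of this application, and the two notification callbacks are assumptions.

```python
import cv2

def baseline_scan(notify_failure, handle_payload, camera_index=0):
    """Baseline flow of steps 1-5: open the camera, look for a code, decode it or report failure."""
    cap = cv2.VideoCapture(camera_index)      # step 1: open the (typically rear) camera
    ok, frame = cap.read()
    cap.release()
    if not ok:
        notify_failure()                      # assumed callback: show the scan-failure interface
        return

    detector = cv2.QRCodeDetector()
    found, _ = detector.detect(frame)         # steps 2-3: check whether a two-dimensional code is present
    if not found:
        notify_failure()                      # step 5: no code in the image
        return

    data, _, _ = detector.detectAndDecode(frame)   # step 4: scan the code in the image
    if data:
        handle_payload(data)                  # e.g. display the payment interface for a payment code
    else:
        notify_failure()
```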
Based on the foregoing description, the above code scanning process is relatively cumbersome, and the user needs to hold the terminal and perform multiple operations before opening the camera to scan the code. The foregoing takes the terminal scanning a two-dimensional code as an example; in fact, the process in which the terminal displays a two-dimensional code so that another code scanning device can scan it is similarly tedious, and the description is omitted here.
In the embodiment of the application, an automatic code scanning method is provided. In the method, in some possible cases, after the terminal detects an operation indicating that the user intends to scan a code, that is, after the terminal determines that the preset gesture is generated, a time of flight (TOF) camera may be automatically turned on to acquire a first grayscale image (IR image) and a first depth image, and then whether a two-dimensional code is included is determined based on the first grayscale image. In the case that it is determined that a two-dimensional code is included in the first grayscale image, the terminal further determines whether the two-dimensional code is complete, and if so, the terminal may determine, based on the first depth image, whether the distance from the complete two-dimensional code to the terminal is smaller than a preset threshold. When the distance is smaller than the preset threshold, the terminal may display a first interface, where the first interface includes a first indicator preset by the user. The first indicator may be used to trigger opening of the code scanning function in the application corresponding to the first indicator. Subsequently, the terminal may scan the first grayscale image by using the code scanning function and display an interface after the code scanning is completed, and the user may then perform a corresponding operation (for example, payment) through the terminal based on that interface. The form of the indicator includes, but is not limited to, a link, an icon, or text.
The TOF camera includes one or more of a rear TOF camera and a front TOF camera; for details on acquiring images with the rear TOF camera or the front TOF camera, reference may be made to the following description of step S101 to step S106, which is not repeated herein.
It should be understood that, in the case where it is determined that the two-dimensional code is included in the first grayscale image (which may be referred to as case 1 below), the operation of intending to scan a code is an operation in which the user intends to scan the two-dimensional code with the terminal. The user's intended code scanning operation in case 1 may be regarded as active code scanning.
In the case where it is determined that the two-dimensional code is not included in the first grayscale image, the terminal further determines whether a code scanning device is included in the first grayscale image. If so, the terminal may display a second interface that includes one or more of a second indicator preset by the user or a first preset two-dimensional code (displayed in a first size), where the second indicator corresponds to a second preset two-dimensional code and may be used to trigger display of the second preset two-dimensional code. Subsequently, the terminal may display the preset two-dimensional code (displayed in a second size, where the second size is larger than the first size) so that another device capable of scanning codes can, after scanning the preset two-dimensional code, complete corresponding operations, such as collection and payment. The preset two-dimensional code here includes the first preset two-dimensional code and the second preset two-dimensional code.
It should be understood that, in the case where it is determined that the two-dimensional code is not included in the first grayscale image and the terminal determines that a code scanning device is included in the first grayscale image (hereinafter referred to as case 2), the operation of intending to scan a code is an operation in which the user intends to show a two-dimensional code through the terminal so that another code scanning device can scan it. The user's intended code scanning operation in case 2 may be regarded as passive code scanning.
The operation of intending to scan a code includes, but is not limited to, one or more of the following operations:
Operation 1: the terminal moves vertically and flips upward. Vertical movement means that the terminal has a velocity in the vertical direction; when the terminal moves vertically, its motion posture may be moving upward or moving downward. Flipping upward refers to rotating the screen of the terminal from horizontal (or nearly horizontal) to vertical (or nearly vertical), with the screen facing the user after the rotation (the object holding the terminal is the user).
Operation 2: the terminal moves vertically and is not flipped up. The non-upward turning includes turning the terminal to the left, turning the terminal to the right, or turning the terminal to the bottom, where the turning to the bottom refers to rotating the screen of the terminal from horizontal (or nearly horizontal) to vertical (or nearly vertical), and after the rotation, the screen faces away from the user (the object holding the terminal is the user), that is, the rear camera faces the user.
Operation 3: the terminal moves vertically without flipping.
Here, operation 1 is an operation of intending to scan a code (active scan code) in case 1. Operations 2 and 3 are intended code scanning (passive code scanning) operations in case 2.
The first grayscale image includes a photographed object (for example, a two-dimensional code) within the field-of-view range of the TOF camera. Each pixel in the first grayscale image corresponds to a sampled color, the sampled color ranges from black (darkest) to white (brightest), and the value range of each pixel is [0, 2^M], where the closer the value of a pixel is to 0, the closer the sampled color of the pixel is to black, and the closer the value of a pixel is to 2^M, the closer the sampled color of the pixel is to white. Generally, the value of M may be 8 or 16, and may be adjusted according to the computing capability of the terminal, which is not limited in the embodiment of the present application.
The first depth image includes the distance from a three-dimensional point on the photographed object (for example, a two-dimensional code) to the terminal. The distance of the photographed object from the terminal may be determined based on the first depth image.
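As a small example of how the first depth image could be used, the sketch below takes the median depth inside the detected code's bounding box as the code-to-terminal distance. The assumption that the depth map stores metres per pixel, the use of the median, and the corner layout follow common practice rather than anything stated in the application.

```python
import numpy as np

def code_to_terminal_distance(depth_m, corners):
    """depth_m: HxW array of per-pixel distances in metres (assumed unit);
    corners: Nx2 array with the detected code's corner coordinates (x, y)."""
    pts = np.asarray(corners, dtype=float).reshape(-1, 2)
    x0, y0 = np.floor(pts.min(axis=0)).astype(int)
    x1, y1 = np.ceil(pts.max(axis=0)).astype(int)
    region = depth_m[max(y0, 0):y1 + 1, max(x0, 0):x1 + 1]
    region = region[region > 0]                # drop invalid (zero) depth samples
    return float(np.median(region)) if region.size else None

# Grayscale pixel values lie in [0, 2**M] with M = 8 or 16 as described above:
# values near 0 render as black and values near 2**M as white.
```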
Fig. 2A and 2B show a set of diagrams involved in the automatic code scanning method in case 1.
As shown in (1) in fig. 2A, the user interface 10 is a desktop of the terminal. The user intends to scan a code, lifts the terminal upward and flips it upward so that an image including the two-dimensional code can be collected by the rear TOF camera of the terminal. At this time, the terminal may detect the operation of vertical movement and upward flipping (i.e., the operation indicating that the user intends to scan a code); in response to the operation, the terminal turns on the TOF camera to acquire the first grayscale image and the first depth image, and after the terminal determines, based on the first grayscale image and the first depth image, that a two-dimensional code exists within the preset distance, the terminal may display the user interface 11 as shown in (2) in fig. 2A.
As shown in (2) of fig. 2A, the user interface 11 is a first interface related to the foregoing. An indicator 101 (the indicator 101 is one of the first indicators mentioned above) may be included in the user interface 11, and the indicator 101 is in the form of a link. The link corresponds to a code scanning function in a payment application and is used for triggering and starting the code scanning function in the payment application. An operation (e.g., a click operation) for the indicator 101 is detected, and in response to the operation, the terminal may open a code scanning function in the payment class application corresponding to the link.
Subsequently, the terminal may scan the first grayscale image by using the code scanning function and display the interface after the code scanning is completed, and the user interface involved in the code scanning process may refer to the user interface 12 shown in fig. 2B described below.
It should be appreciated that, in some possible cases, when the indicator is a link, the link may include information of its corresponding payment class application (e.g., payment class application 1 shown in indicator 101) and the function information that the link may trigger to open (e.g., "scan code" shown in indicator 101), which may facilitate the user in distinguishing the roles of the different links.
In this way, after the terminal detects an operation indicating that the user intends to scan a code (active code scanning), the first indicator preset by the user can be displayed, and the user can start the code scanning function in different applications by selecting different indicators to scan a code quickly. In addition, after the terminal detects an operation indicating that the user intends to scan a code (passive code scanning), a second indicator preset by the user or a first preset two-dimensional code preset by the user can be displayed. The user can select different second indicators to make the terminal display the corresponding second preset two-dimensional code, or select different first preset two-dimensional codes to make the terminal display the selected first preset two-dimensional code in the second size. That is, the terminal can quickly display a preset two-dimensional code, so that another device capable of scanning codes can scan the preset two-dimensional code to complete corresponding operations, such as payment and collection operations.
Fig. 3A-3J and fig. 4 are schematic diagrams illustrating a user presetting a first indicator and completing a fast active code scan.
Fig. 3A to 3J are schematic diagrams illustrating a first indicator preset by a user. Fig. 4 is a schematic diagram illustrating that the terminal completes a fast active code scanning based on the preset first indicator.
As shown in fig. 3A, the user interface 30 may be a setting interface provided by a settings application and includes a "quick code scanning" setting item 301; the "quick code scanning" function may provide the terminal with the function of setting the first indicator, the second indicator, and the first preset two-dimensional code. In some possible cases, the "quick code scanning" setting item 301 may also exist in other forms; for example, the setting item may be independent of the settings application, for example, as a separate application, which is not limited in this embodiment of the application.
In response to an operation (for example, a click operation) on the "quick scan code" setting item 301, the terminal may display a setting interface corresponding to the "quick scan code" setting item 301, which may be specifically referred to the following description of the user interface 31 in fig. 3B.
As shown in fig. 3B, the user interface 31 corresponds to an exemplary setting interface for the "quick code scan" setting item 301. The user interface 31 may include the "quick code scan" setting item 301 and its corresponding control switch 311. When the control switch 311 is turned on (ON), the terminal may use the automatic code scanning method according to the present application; when the control switch 311 is turned off (OFF), the terminal may not use the automatic code scanning method according to the present application.
The user interface 31 may further include a "usable device" setting item 312, and the "usable device" setting item 312 may be used to set the device used by the terminal to acquire the first grayscale image and the first depth image. For example, the "usable device" setting item 312 may include a "rear TOF camera" setting item 312a and a "front TOF camera" setting item 312b. The "rear TOF camera" setting item 312a may be used to control whether the rear TOF camera is used to acquire the first grayscale image and the first depth image; the "front TOF camera" setting item 312b may be used to control whether the front TOF camera is used to acquire the first grayscale image and the first depth image.
The user interface 31 may further include a "preset two-dimensional code" setting item 313, a "code scanning" setting item 314, and a code scanning prompt setting item 315. The code scanning prompt setting item 315 may be used to set a prompt mode, where the prompt mode is a mode for prompting the user to move the mobile terminal to implement quick code scanning; the "preset two-dimensional code" setting item 313 may be used for a user to set a first preset two-dimensional code and a second preset two-dimensional code through the terminal, where the second preset two-dimensional code is a preset two-dimensional code corresponding to the second indicator, and the setting of the second preset two-dimensional code is to set a second indicator corresponding to the second preset two-dimensional code. For details of the "preset two-dimensional code" setting item 313, reference may be made to the following description of fig. 5A to 5L, which is not repeated herein. For the related contents of the scan code prompt setting item 315, reference may be made to the following description of fig. 8A to 8D, which is not repeated herein.
It should be understood that, in some possible cases, the first preset two-dimensional code may be a two-dimensional code without a validity period, and the second preset two-dimensional code may be a time-limited two-dimensional code. The second preset two-dimensional code being time-limited means that the second preset two-dimensional code becomes invalid after a preset duration (e.g., 10s, 20s) is exceeded, and the preset duration may also be referred to as a third threshold.
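To make the time-limited behaviour concrete, here is a minimal sketch of the expiry check; the 20-second duration stands in for the third threshold and is only an example consistent with the durations mentioned above.

```python
import time

THIRD_THRESHOLD_S = 20  # example duration; the description mentions values such as 10 s or 20 s

class SecondPresetCode:
    """A time-limited second preset two-dimensional code: invalid once the third threshold is exceeded."""

    def __init__(self, payload):
        self.payload = payload
        self.created_at = time.monotonic()

    def is_valid(self):
        return time.monotonic() - self.created_at <= THIRD_THRESHOLD_S

# A first preset two-dimensional code needs no such check: it has no validity period.
```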
The "scan code" setting item 314 may be used for the user to set a first indicator through the terminal, the first indicator corresponding to a scan code function in the payment-class application, which may be used to trigger the opening of the scan code function (in the payment-class application) corresponding to the first indicator. Subsequently, the terminal can display the first indicator in a first interface, and realize quick code scanning based on the first interface.
The "scan code" setting item 314 may be used for setting the first indicator by the user through the terminal to mean: the terminal can set the form of the different first indicators and the code scanning functions (in payment applications) corresponding to the different first indicators through the "code scanning" setting item 314. In response to the operation for the "scan code" setting item 314, the terminal may display a setting interface corresponding to the "scan code" setting item 314. In some possible cases, the setup interface may refer to the user interface 32 shown in FIG. 3C described below.
As shown in fig. 3C, the user interface 32 includes different templates, where the templates correspond to the first interface, and the forms or arrangement rules of the first indicators in the different templates are different (the forms or arrangement rules of the first indicators in the first interface are different), for example, here, the forms of the first indicators in the template 1 may be text prompts with links; the form of the first indicator in the template 2 may be an icon, and may be another form, for example, a link, which is not limited in the embodiment of the present application. In addition, in some possible cases, the user can also perform custom template setting through the terminal. The template 1 preview frame 321 may be used to trigger the terminal to generate a first indicator according to the template 1, and arrange the first indicator according to the template 1 to generate a first interface. In response to the operation for the template 1 preview box 321, the terminal may display the user interface 33 shown in fig. 3D described below.
The user interface 33 shown in fig. 3D is an exemplary user interface for presetting the first indicator. In response to operation of the add indicator control 331, the terminal may trigger setting of the first indicator, at which point the terminal may display a user interface 34 as shown in fig. 3E. In some possible cases, prompt text may be further included in the indicator addition control 331 to prompt the user to add the first indicator, for example, the prompt text may be: "add a preset" scan "indicator," which is a first indicator.
As shown in FIG. 3E, user interface 34 is an exemplary setup interface for the first indicator. A name setting control 341 and an execution setting control 342 may be included in the user interface 34. The execution setting control 342 may be configured to set a code scanning function corresponding to the first indicator, that is, a code scanning function that may be triggered at the first indicator; the name setting control 341 may be used to edit the name of the first indicator. The name may be input by a user through a terminal, or may be generated according to a code scanning function corresponding to a first indicator set by the user, specifically, if the code scanning function is a code scanning function in the payment application a, the name may include the name of the payment application a, or may also include characters such as the code scanning function in the payment application a. And subsequently displaying the name in the first indicator, wherein the name can be used for prompting the user of an application corresponding to a code scanning function which can be specifically triggered by the first indicator.
The execution setting control 342 may include an application setting control 342a and a function setting control 342b, the application setting control 342a may be used to set the first indicator-triggered open application (payment class application), and the function setting control 342b may be used to set the first indicator-triggered open function (code scanning function). After the user completes setting the first indicator, the terminal may obtain the code scanning function (in the payment application) corresponding to the first indicator, and encapsulate the code scanning function (in the payment application) corresponding to the first indicator into a link for the user to trigger execution of the code scanning function corresponding to the first indicator through the terminal. The terminal can also display the name of the first indicator related to the first indicator and the link to obtain the first indicator, so that the user can determine the code scanning function (in a payment application) corresponding to the first indicator through the name.
In response to the operation of the application setting control 342a, the terminal may display the user interface 35 as shown in fig. 3F, the user interface 35 including therein the list of Applications (APP), and the terminal may place the payment class application in front of the displayed list of Applications (APP) 352 and place other types of applications behind the payment class application. In some possible cases, the terminal may also display only the applications with the code scanning function in the Application (APP) list, and the other applications do not display. In response to the operation of the add control 351 corresponding to the payment class application 1, the terminal may treat the payment class application 1 as an application corresponding to the first indicator. While displaying another exemplary setup interface for the first indicator as shown in fig. 3G below.
As shown in fig. 3G, the user interface 36 is another exemplary setting interface for the first indicator, at which time, the terminal may update the text in the application setting control 342a to: the payment application 1 prompts the user that the application corresponding to the first indicator is the payment application 1.
In response to operation with respect to the function setting control 342b, the terminal may display the user interface 37 as shown in fig. 3H, and a function list 372 may be displayed in the user interface 37, and functions in the payment-class application, including a code-scanning function, (display) a payment code, and the like, are shown in the function list 372. In some possible cases, only the code scanning function corresponds to the adding control, so that the user can set the code scanning function to the code scanning function corresponding to the first indicator (in the payment application 1) through the adding control. In other possible cases, the terminal may display only the scan code function in the function list, and the other functions are not displayed.
In response to the operation of the add control 371 corresponding to the code scanning function in the payment class application 1, the terminal may use the code scanning function in the payment class application 1 as the code scanning function corresponding to the first indicator. While displaying another exemplary setup interface for the first indicator as shown in fig. 3I below.
As shown in FIG. 3I, user interface 38 is an exemplary user interface after the first indicator setting is complete. The text in the application settings control 342a in the execution settings control 342 of the user interface 38 is updated to: the payment application 1 prompts the user that the application corresponding to the first indicator is the payment application 1; the text in the function setting control 342b is updated to: "scan code" to prompt the user that the function corresponding to the first indicator is a scan code function (in payment class application 1). In response to an operation on the application control 381, the terminal may complete setting of the first indicator and display the first indicator according to the template. At this time, the terminal may display a user interface 39 as shown in fig. 3J described below.
As shown in fig. 3J, the user interface 39 is an exemplary user interface after setting of a first indicator, the indicator 391 (a first indicator) may include a name (name of the first indicator, text form) and a link, where the link may be used for a user to trigger execution of a code scanning function (in the payment class application 1) corresponding to the indicator 391 through a terminal; the name may be used to prompt the user that indicator 391 corresponds to a code scanning function (in payment class application 1). In some possible cases, subsequently, the terminal may further add another first indicator through the indicator adding control 331, and arrange the other first indicator and the indicator 391 according to the template (template 1). The terminal may store the indicator 391 in response to an operation directed to save control 392. Subsequently, the terminal may display the indicator 391 in the first interface. The first interface is an interface displayed after the terminal determines that there is an operation of scanning a code intended by a user (active scanning a code), and the first interface includes one or more first indicators preset by the user (for example, the indicator 391 mentioned above). Subsequently, the terminal may complete fast active code scanning based on the preset first indicator, and the process may refer to the following description of fig. 4.
As shown in fig. 4, in a case that it is determined that the user intends to actively scan the code, the terminal starts the TOF camera to acquire a first grayscale image and a first depth image, and after the terminal determines that the two-dimensional code exists within the preset distance based on the first grayscale image and the first depth image, the terminal may display a user interface 40 as shown in (1) of fig. 4, where the user interface 40 is also a first interface as the user interface 11 shown in (2) of fig. 2A. The user interface 40 may include the aforementioned preset indicator 391, where the indicator 391 is a preset first indicator. In response to an operation (e.g., a click operation) for the indicator 391, the terminal may turn on a code scanning function (in the payment class application 1) corresponding to the first indicator. Then, the terminal may scan the first grayscale image using the scan function, and the exemplary user interface involved in this case may refer to the user interface 41 shown in (2) in fig. 4. After the code scanning of the first gray image is completed, the terminal may display a user interface 42 as shown in (3) of fig. 4, where the user interface 42 is a payment interface, and a payment operation may be completed through the user interface 42.
Fig. 5A-5L, fig. 6A and fig. 6B are schematic diagrams illustrating that a user presets a second indicator and a first preset two-dimensional code and completes quick passive code scanning.
Fig. 5A to 5L are schematic diagrams illustrating a second indicator preset by a user and a first preset two-dimensional code. Fig. 6A and 6B are schematic diagrams illustrating that the terminal completes fast passive code scanning based on the preset second indicator and the first preset two-dimensional code.
As shown in fig. 5A, the user interface 30 may be a setting interface provided by a setting application, and includes a "quick code scanning" setting item 301. The "quick code scanning" function may provide the terminal with the function of presetting the first indicator, the second indicator, and the two-dimensional codes. In some possible cases, the "quick code scanning" setting item 301 may also exist in other forms; for example, the setting item may be independent of the setting application, for example, provided as a separate application, which is not limited in this embodiment of the present application.
In response to an operation (e.g., a click operation) on the "quick code scan" setting item 301, the terminal may display a setting interface corresponding to the "quick code scan" setting item 301, which may be specifically referred to the following description of the user interface 31 in fig. 5B. For the description of the user interface 31, reference may be made to the foregoing description of the user interface 31 in fig. 3B, and details are not repeated here.
As shown in fig. 5B, the user interface 31 may include a "preset two-dimensional code" setting item 313, where the "preset two-dimensional code" setting item 313 may be used for the user to set, through the terminal, a first preset two-dimensional code and a second preset two-dimensional code. The second preset two-dimensional code is the preset two-dimensional code corresponding to a second indicator, and setting the second preset two-dimensional code amounts to setting the second indicator corresponding to it. Subsequently, the terminal can display the first preset two-dimensional code and the second indicator in the second interface, and based on the second interface, the preset two-dimensional codes (including the first preset two-dimensional code and the second preset two-dimensional code) can be quickly displayed.
In response to the operation for the "preset two-dimensional code" setting item 313, the terminal may display a setting interface corresponding to the "preset two-dimensional code" setting item 313. In some possible cases, the setup interface may refer to the user interface 50 shown in FIG. 5C described below.
As shown in fig. 5C, the user interface 50 includes different templates, where the different templates correspond to the second interface, and the arrangement rules of the first preset two-dimensional code and the second indicator differ between templates, or the forms of the second indicator differ (that is, the arrangement rules of the first preset two-dimensional code and the second indicator in the second interface differ, or the forms of the second indicator differ). For example, here, the first preset two-dimensional code precedes the second indicator in the template 11, and the first preset two-dimensional code follows the second indicator in the template 21. The form of the second indicator in the template 11 may be a text prompt with a link; the second indicator may also take other forms, for example, the form of an icon, which is not limited in this embodiment. In addition, in some possible cases, the user can also set a custom template through the terminal. The template 11 preview frame 501 may be used to trigger the terminal to generate the second indicator and the first preset two-dimensional code according to the template 11, and to arrange the first preset two-dimensional code and the second indicator according to the template 11 to generate the second interface. In response to the operation on the template 11 preview frame 501, the terminal may display the user interface 51 shown in fig. 5D described below.
The user interface 51 shown in fig. 5D is an exemplary user interface for setting the second indicator and setting the first preset two-dimensional code. A two-dimensional code adding control 511 and an indicator adding control 512 may be included in the user interface 51. The two-dimensional code adding control 511 may be used to trigger the setting of the first preset two-dimensional code; the indicator adding control 512 can be used to trigger the setting of the second indicator. The process of setting the second indicator may refer to the following description of fig. 5F to 5K and is not repeated here; the setting of the first preset two-dimensional code is described below based on fig. 5D and fig. 5E. The first preset two-dimensional code and the second preset two-dimensional code added by the user through the terminal may be two-dimensional codes of various types, for example, a payment type two-dimensional code, a group two-dimensional code involved in establishing a group, a two-dimensional code involved in detection, and the like, which is not limited in the embodiment of the present application.
In response to the operation on the two-dimensional code adding control 511, the terminal may trigger the setting of the first preset two-dimensional code. One possible implementation is that the terminal displays an album, and the user selects, through the terminal, an image including a two-dimensional code from the album as the first preset two-dimensional code. At this time, the terminal may display a user interface 52 shown in fig. 5E described below.
As shown in fig. 5E, the user interface 52 is an exemplary user interface of the album. The album may include a plurality of images. In response to an operation (e.g., a click operation) on the image preview frame 521, the terminal may regard the image in the image preview frame 521 as the first preset two-dimensional code; subsequently, the terminal may display the user interface 53 shown in fig. 5F below. In some possible cases, the user interface 52 may include only images carrying two-dimensional codes.
As shown in fig. 5F, the user interface 53 is an exemplary user interface after adding the image in the image preview box 521 as the first preset two-dimensional code image (displayed in the two-dimensional code preview box 531).
In response to an operation on the indicator adding control 512, the terminal may trigger the setting of the second indicator, at which point the terminal may display the user interface 54 as shown in fig. 5G. In some possible cases, prompt text may also be included in the indicator adding control 512 to prompt the user to add the second indicator; for example, the prompt text may be: "add a preset two-dimensional code indicator", where the preset two-dimensional code indicator is a second indicator.
As shown in FIG. 5G, the user interface 54 is an exemplary setting interface for the second indicator. The user interface 54 may include a name setting control 541 and an execution setting control 542. The execution setting control 542 may be used to set the second preset two-dimensional code corresponding to the second indicator, that is, the second indicator may trigger the display of the second preset two-dimensional code; the name setting control 541 can be used to edit the name of the second indicator. The name may be input by the user through the terminal, or may be generated according to the second preset two-dimensional code corresponding to the second indicator set by the user. Specifically, if the second preset two-dimensional code is a two-dimensional code in the payment application B, the name may include the name of the payment application B, or may also include characters such as "two-dimensional code in payment application B". The name is subsequently displayed in the second indicator and may be used to prompt the user with information about the second preset two-dimensional code that the second indicator triggers to display, such as the application from which the second preset two-dimensional code comes.
The execution setting control 542 may include an application setting control 542a and a function setting control 542b. The application setting control 542a may be used to set the application that the second indicator triggers to open, and the function setting control 542b may be used to set the function (displaying the second preset two-dimensional code) in that application. After the user completes the setting of the second indicator, the terminal may obtain the function (displaying the second preset two-dimensional code) corresponding to the second indicator and encapsulate it into a link, so that the user can trigger execution of the function corresponding to the second indicator through the terminal. The terminal may also combine the aforementioned name of the second indicator with the link to display the second indicator, so that the user can determine, from the name, the function corresponding to the second indicator (displaying the second preset two-dimensional code).
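As an illustrative aside (not part of the patent text), an indicator as described above is essentially a small record pairing a display name with a link that encodes the target application and function. The sketch below models this in Python; the link format and all field names are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Indicator:
    """A preset indicator: a display name plus a link to an app function.

    The link format here is hypothetical; it only illustrates that the
    indicator stores which application to open and which function
    (e.g. show a preset payment code) to trigger.
    """
    name: str       # text shown to the user, e.g. "Payment app 2 - pay code"
    app_id: str     # application the indicator opens
    function: str   # "scan_code" for a first indicator, "show_code" for a second indicator

    @property
    def link(self) -> str:
        # Encode the target as a simple URI-like string (assumed format).
        return f"quickscan://{self.app_id}/{self.function}"


# Example: a second indicator that displays the payment code of "payment_app_2".
indicator_591 = Indicator(name="Payment app 2 - pay code",
                          app_id="payment_app_2",
                          function="show_code")
print(indicator_591.link)  # quickscan://payment_app_2/show_code
```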
In response to the operation on the application setting control 542a, the terminal may display the user interface 55 as shown in fig. 5H. The user interface 55 may include a list 552 of applications (APPs), and the terminal may place payment class applications at the front of the displayed application (APP) list 552 and place other types of applications behind them. In response to the operation on the adding control 551 corresponding to the payment class application 2, the terminal may treat the payment class application 2 as the application corresponding to the second indicator, and meanwhile display another exemplary setting interface for the second indicator as shown in fig. 5I below.
As shown in FIG. 5I, the user interface 56 is another exemplary setting interface for the second indicator. At this time, the terminal may update the text in the application setting control 542a to "payment class application 2", to prompt the user that the application corresponding to the second indicator is the payment class application 2.
In response to the operation on the function setting control 542b, the terminal may display the user interface 57 as shown in fig. 5J. A function list 572 may be displayed in the user interface 57, and functions in the payment class application are shown in the function list 572, including (displaying) a payment code, a code scanning function, and the like. In some possible cases, only the (display) payment code function has a corresponding adding control, so that through the adding control the user can only set the (display) payment code function as the function corresponding to the second indicator (displaying the second preset two-dimensional code). In other possible cases, the terminal may display only the payment code in the function list, and other functions are not displayed.
In response to the operation on the adding control 571 corresponding to the (display) payment code function in the payment class application 2, the terminal may use the (display) payment code function in the payment class application 2 as the function corresponding to the second indicator, and meanwhile display another exemplary setting interface for the second indicator as shown in fig. 5K below.
As shown in FIG. 5K, the user interface 58 is an exemplary user interface after the setting of the second indicator is complete. The text in the application setting control 542a within the execution setting control 542 of the user interface 58 is updated to "payment class application 2", to prompt the user that the application corresponding to the second indicator is the payment class application 2; the text in the function setting control 542b is updated to "pay code", to prompt the user that the function corresponding to the second indicator is displaying a pay code (the pay code is the second preset two-dimensional code and is in the payment class application 2). In response to the operation on the application control 581, the terminal may complete the setting of the second indicator and display the second indicator according to the template. At this time, the terminal may display a user interface 59 as shown in fig. 5L described below.
As shown in fig. 5L, the user interface 59 is an exemplary user interface after the second indicator is set. The indicator 591 (a second indicator) may include a name (the name of the second indicator, in text form) and a link, where the link may be used for the user to trigger, through the terminal, execution of the function corresponding to the indicator 591 (displaying a second preset two-dimensional code, here the payment code in the payment class application 2); the name may be used to prompt the user that the indicator 591 corresponds to displaying the second preset two-dimensional code (the payment code in the payment class application 2). In some possible cases, the terminal may subsequently add another second indicator through the indicator adding control 512 and arrange the other second indicator and the indicator 591 according to the template (template 11). The terminal may store the indicator 591 in response to an operation on the save control 592. Subsequently, the terminal may display the indicator 591 in the second interface. The second interface is an interface displayed after the terminal determines that there is an operation indicating that the user intends to have a code scanned (passive code scanning); the second interface includes one or more first preset two-dimensional codes preset by the user, and may also include one or more second indicators (for example, the aforementioned indicator 591), where the second indicators correspond to second preset two-dimensional codes. Subsequently, the terminal may complete fast passive code scanning based on the second interface, and the process may refer to the following description of fig. 6A and fig. 6B.
As shown in fig. 6A, in the case where it is determined that the user intends to scan the code passively, the terminal turns on the TOF camera to capture a first grayscale image, and in the case where it is determined, based on the first grayscale image, that a code-scannable device is included therein, the terminal may display a user interface 60 as shown in (1) in fig. 6A, the user interface 60 being a kind of second interface. The user interface 60 may include the aforementioned preset indicator 591 and the first preset two-dimensional code (displayed in the two-dimensional code preview box 531), where the indicator 591 is a preset second indicator. The second indicator corresponds to a second preset two-dimensional code and can be used for triggering the terminal to display the corresponding second preset two-dimensional code.
In the two-dimensional code preview frame 531, the terminal displays the first preset two-dimensional code in the first size, and in response to an operation (for example, a click operation) on the two-dimensional code preview frame 531, the terminal may display the first preset two-dimensional code in the second size, at this time, the terminal may display a user interface 61 as shown in (2) in fig. 6A, and the user interface 61 may include the first preset two-dimensional code in the second size. Other devices capable of scanning the code may scan the first predetermined two-dimensional code displayed in the user interface 61 to complete the related operation, for example, in a case that the first predetermined two-dimensional code is a payment code, scanning the first predetermined two-dimensional code may complete the payment operation. In other cases, the first preset two-dimensional code may also be another type of two-dimensional code, which is not limited in this embodiment of the application.
As shown in fig. 6B, in the case where it is determined that the user intends to scan the code passively, the terminal turns on the TOF camera to capture a first grayscale image, and in the case where it is determined, based on the first grayscale image, that a code-scannable device is included therein, the terminal may display a user interface 60 as shown in (1) in fig. 6B, the user interface 60 being a kind of second interface. The user interface 60 may include the aforementioned preset indicator 591 and the first preset two-dimensional code (displayed in the two-dimensional code preview box 531), where the indicator 591 is a preset second indicator. The second indicator corresponds to a second preset two-dimensional code and can be used for triggering the terminal to display the corresponding second preset two-dimensional code.
In response to an operation (e.g., a click operation) on the indicator 591, the terminal may start the function corresponding to the indicator 591, that is, display the second preset two-dimensional code corresponding to the indicator 591 (e.g., the aforementioned payment code in the payment class application 2). At this time, the terminal may display the user interface 62 shown in (2) in fig. 6B, where the user interface 62 may include the second preset two-dimensional code displayed in the second size. Other code-scannable devices may scan the second preset two-dimensional code displayed in the user interface 62 to complete the related operation; for example, in the case that the second preset two-dimensional code is a payment code, scanning the second preset two-dimensional code may complete a payment operation. In other cases, the second preset two-dimensional code may also be another type of two-dimensional code, which is not limited in this embodiment of the present application.
Fig. 7 shows an exemplary flowchart involved in the automatic code scanning method in the embodiment of the present application.
For a detailed description of the automatic code scanning method in the embodiment of the present application, reference may be made to the following description of steps S101 to S122.
S101, the terminal sets a first preset two-dimensional code to be displayed, a first indicator to be displayed and a second indicator to be displayed through a quick code scanning function.
The quick code scanning function can provide the functions of setting a first indicator, a second indicator and a first preset two-dimensional code for the terminal. One possible implementation of the quick code scanning function can be referred to in the foregoing description of fig. 3A, and details thereof are not repeated here.
The first indicator may be used to trigger the opening of a code scanning function in an application corresponding to the first indicator. The code scanning functions corresponding to different first indicators are set by the terminal based on the quick code scanning function.
The second indicator may be used to trigger the opening of the two-dimensional code display function corresponding to the second indicator, that is, to trigger the display of a second preset two-dimensional code in the application corresponding to the second indicator. And the second preset two-dimensional codes corresponding to different second indicators are set by the terminal based on the quick code scanning function.
The first preset two-dimension code is set by the terminal based on a quick code scanning function.
The first indicator is used to enable the terminal to quickly open the code scanning function. In the case that the terminal determines that the user has an active code scanning intention, the terminal can display a first interface including one or more first indicators, so that the user can select one of the first indicators to realize the code scanning function. The description of this process may refer to fig. 2A and 2B, or to fig. 4, as described above.
The second indicator and the first preset two-dimensional code are used to enable the terminal to quickly display a two-dimensional code (including the first preset two-dimensional code and the second preset two-dimensional code corresponding to the second indicator). In the case that the terminal determines that the user has a passive code scanning intention, the terminal may display a second interface, where the second interface includes one or more first preset two-dimensional codes and may further include one or more second indicators, so that the user can select one of the second indicators to display the corresponding second preset two-dimensional code, or select one of the first preset two-dimensional codes to display; other code-scannable devices can then scan the two-dimensional code displayed by the terminal to complete the related function. For a description of this process, reference may be made to the foregoing description of fig. 6A and 6B.
The following description of steps S102 to S119 may be referred to in relation to the process of determining that the user has an active code scanning intention and displaying the first interface to implement fast code scanning; the following description of steps S102 to S107 and steps S120 to S122 may be referred to for the process of the terminal determining that the user has the passive code scanning intention and displaying the second interface to realize the fast code scanning.
In some possible cases, the process of setting the first indicator by the terminal through the quick code scanning function may refer to the foregoing description of fig. 3A to 3J, and is not described herein again. The process of setting the first preset two-dimensional code by the terminal through the shortcut code scanning function may refer to the foregoing description of fig. 5A to 5F, and details are not repeated here. The process of setting the second indicator by the terminal through the shortcut code scanning function may refer to the foregoing description of fig. 5F to 5L, and is not described herein again.
The following steps S102-S106 describe one way for the terminal to determine whether the user has an intention to scan a code, and to acquire the first grayscale image and the first depth image. In some possible cases, when the terminal detects the first posture (vertical movement and flip-up), the terminal may turn on the rear TOF camera to acquire a first grayscale image and a first depth image. When the terminal detects the second posture (vertical movement and non-upward turning), the terminal can start the front-facing TOF camera to acquire a first grayscale image and a first depth image. Specifically, reference may be made to the following description of steps S102 to S106.
S102, the terminal detects that the terminal has a first posture, wherein the first posture is that the terminal vertically moves and turns upwards, and the vertical movement and the turning upwards are continuous.
The first posture is vertical movement and upward turning. The vertical movement means that the terminal has a speed in the vertical direction; when the terminal moves vertically, it may move upwards or downwards, that is, the vertical movement includes the upward movement or the downward movement of the terminal. The flip-up refers to rotating the screen of the terminal from a horizontal (or nearly horizontal) direction to a vertical (or nearly vertical) direction such that, after the rotation, the screen faces the user (the object holding the terminal is the user).
In the first posture, there is no fixed order between the vertical movement and the upward turning, as long as they are continuous. For example, the terminal may first move vertically and then flip upward; or first flip upward and then move vertically; it may also move vertically while flipping upward.
It should be understood that the scene in which the terminal detects the first gesture may be in a case where the user raises his hand to lift the terminal and turns the terminal such that the rear camera aligns with the two-dimensional code. In the event that the first gesture is detected, the terminal may determine that the user has an intent to scan a code, and this is typically the active scan referred to in scenario 1 above. At this time, usually, the back surface (the surface including the rear-facing camera) of the terminal faces the two-dimensional code, and when the first posture is detected, the terminal may execute the following step S104, and turn on the rear-facing TOF camera to acquire the first grayscale image and the first depth image.
S103, the terminal detects that the terminal has a second posture, where the second posture is that the terminal moves vertically, or the second posture is that the terminal moves vertically and does not turn upwards, and the vertical movement and the non-upward turning are continuous.
The second posture is a vertical movement without a flip, or the second posture may be a vertical movement combined with a non-upward flip. The vertical movement means that the terminal has a speed in the vertical direction; when the terminal moves vertically, it may move upwards or downwards, that is, the vertical movement includes the upward movement or the downward movement of the terminal. The non-upward turning includes turning the terminal to the left, turning the terminal to the right, or turning the terminal to the bottom, where turning to the bottom refers to rotating the screen of the terminal from horizontal (or nearly horizontal) to vertical (or nearly vertical) such that, after the rotation, the screen faces away from the user (the object holding the terminal is the user), that is, the rear camera faces the user.
In the case where the second posture includes vertical movement and non-upward turning, there is no fixed order between the vertical movement and the non-upward turning, as long as they are continuous. For example, the terminal may first move vertically and then turn non-upwards; or first turn non-upwards and then move vertically; it may also move vertically while turning non-upwards.
It should be understood that the scenario in which the terminal detects the second gesture may be a case where the user raises his or her hand to lift the terminal and turns the terminal so that the screen (the side containing the front camera) faces the code-scannable device. Then, in the event that the second gesture is detected, the terminal may determine that the user has an intent to scan a code, and this is typically the passive scan referred to in case 2 above. At this time, usually, the screen (the side including the front camera) of the terminal faces the code-scannable device, and when the second posture is detected, the terminal may execute the following step S105 to turn on the front TOF camera to acquire the first grayscale image and the first depth image.
It should be understood that, in some possible cases, the terminal may detect whether the terminal has the first posture or the second posture through the gyro sensor. In other possible cases, the terminal may detect whether the first posture or the second posture exists in the terminal through a gyro sensor in combination with an acceleration sensor.
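Purely as an illustration of the posture classification in steps S102 and S103, the sketch below maps two motion features, a vertical speed and a flip-up flag, to the two postures. It assumes these features have already been derived elsewhere from gyroscope and/or accelerometer data; the threshold value is an assumption and is not taken from the patent.

```python
def classify_posture(vertical_speed: float, flipped_up: bool,
                     speed_threshold: float = 0.3) -> str:
    """Map motion features to the postures described above (illustrative only).

    vertical_speed: estimated speed of the terminal along the vertical axis,
                    e.g. integrated from accelerometer samples (m/s, assumed).
    flipped_up:     True if the screen was rotated from (near) horizontal to
                    (near) vertical and now faces the user, e.g. derived from
                    gyroscope / orientation data (assumed).
    """
    moving_vertically = abs(vertical_speed) >= speed_threshold
    if moving_vertically and flipped_up:
        return "first_posture"   # user likely wants to actively scan a code
    if moving_vertically and not flipped_up:
        return "second_posture"  # user likely wants the code to be scanned (passive)
    return "no_scan_intent"


print(classify_posture(0.5, True))   # first_posture  -> open rear TOF camera (S104)
print(classify_posture(0.5, False))  # second_posture -> open front TOF camera (S105)
```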
And S104, the terminal starts the rear TOF camera and takes the rear TOF camera as a target camera.
In response to the detected first gesture, the terminal may start a rear TOF camera, regard the rear TOF camera as a target camera, and acquire a first grayscale image and a first depth image based on the rear TOF camera in subsequent step S106.
The rear TOF camera is a TOF camera arranged on the back of the terminal. The TOF camera can be used to detect the distance from a photographed object to the TOF camera (or to the terminal) and to generate corresponding images, where the images may include a grayscale image (the first grayscale image) corresponding to the photographed object and a depth image (the first depth image) corresponding to the photographed object.
It should be understood that, since the TOF camera is disposed in the terminal, the distance between the TOF camera and the terminal is negligible, and in some possible cases, the distance between the object a and the TOF camera may also be regarded as the distance between the object a and the terminal, and the object a may include the object to be photographed, or a three-dimensional point in the object to be photographed, or the like.
It should be understood that the first grayscale image includes the photographed object (e.g., a two-dimensional code) within the field angle range of the TOF camera. The first grayscale image can be represented as A pixel points (or B rows and C columns of pixel points). Each pixel point in the first grayscale image corresponds to a sampling color, which is the color of the three-dimensional point in actual space corresponding to that pixel point; the sampling color is a color between black (darkest) and white (brightest), and the value range of each pixel point is [0, 2^M], where the closer the value of a pixel point is to 0, the closer its sampling color is to black, and the closer the value is to 2^M, the closer its sampling color is to white. Generally, the value of M may be 8 or 16, and may be adjusted according to the computing capability of the terminal, which is not limited in the embodiment of the present application.
The first depth image includes the distances from three-dimensional points in the photographed object (e.g., a two-dimensional code) to the terminal, and can likewise be represented as A pixel points (or B rows and C columns of pixel points). The pixel value of each pixel point in the first depth image represents depth information, and the depth information corresponding to each pixel point is the distance from the three-dimensional point in actual space corresponding to that pixel point to the terminal. The terminal may determine the distance of the photographed object from the terminal based on the first depth image. It should be understood that, when the TOF camera cannot acquire the distance information from a three-dimensional point to the TOF camera, the depth information of the pixel point corresponding to that three-dimensional point is missing, and the pixel value of that pixel point may be assigned 0.
It should be understood that the pixel points in the first grayscale image and the first depth image are in one-to-one correspondence, that is, the ith pixel point in the first grayscale image and the first depth image corresponds to the same three-dimensional point in the captured object. And because the first gray image is a black-and-white image, in the embodiment of the present application, only the shape information of the two-dimensional code in the first gray image is needed, and the color information of the two-dimensional code is not needed, so that even in a shooting environment with insufficient light, the operation of scanning the code based on the two-dimensional code in the first gray image in the embodiment of the present application is not affected. It is also understood that the first grayscale image includes a subject (a three-dimensional point of the subject), and the first depth image includes a distance from the three-dimensional point in the subject to the terminal.
And S105, the terminal starts the front TOF camera to serve as a target camera.
In response to the detected second gesture, the terminal may start the front TOF camera, regard the front TOF camera as a target camera, and acquire a first grayscale image and a first depth image based on the front TOF camera in subsequent step S106.
The front TOF camera is a TOF camera arranged on the side of the terminal screen.
For the description about the TOF camera, the first grayscale image and the first depth image, reference may be made to the foregoing description of step S104, and details are not repeated here.
And S106, the terminal acquires a first gray image and a first depth image through the target camera.
The target camera (which may be the rear TOF camera or the front TOF camera) may emit light of a predetermined signal form to the photographed object, and then obtain the distance from a three-dimensional point in the photographed object to the target camera (or the terminal) based on the phase difference or the time difference between the emitted light signal and the light signal reflected by the object, that is, obtain a depth image (e.g., the first depth image).
In some possible cases, within one frame period, the target camera may emit light four times to the photographed object (e.g., a two-dimensional code) through the projector, resulting in four frames of original images (raw images) with different phases (also referred to as IR frames or IR phases), which may be, for example, 0°, 90°, 180°, and 270°, respectively. Then, the projector is turned off to acquire one frame of original image, which is acquired based on the ambient light reflected by the photographed object. Subsequently, the terminal may obtain a grayscale image (IR image), for example, the first grayscale image, based on this frame of original image and the four frames of original images with different phases. The terminal may also determine a phase difference based on the four frames of original images with different phases, and then determine, based on the phase difference, the depth information corresponding to the three-dimensional points in the photographed object to obtain a depth image (e.g., the first depth image), where the depth information of a three-dimensional point is the distance from that three-dimensional point to the target camera (or the terminal).
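The sketch below illustrates the generic four-phase continuous-wave ToF reconstruction alluded to above. The modulation frequency, the sign conventions, and the exact formulas vary between sensors and are assumptions here rather than the patent's own pipeline; all function and variable names are hypothetical.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s


def tof_from_four_phases(i0, i90, i180, i270, ambient, f_mod=20e6):
    """Rough sketch of grayscale (IR) and depth recovery from four phase frames.

    i0..i270: raw frames captured at phase offsets 0°, 90°, 180°, 270° (arrays).
    ambient:  frame captured with the projector off (ambient light only).
    f_mod:    assumed modulation frequency of the emitted light.
    """
    # Grayscale / IR image: average active brightness minus the ambient frame.
    gray = np.clip((i0 + i90 + i180 + i270) / 4.0 - ambient, 0, None)

    # Phase difference between emitted and reflected signal.
    phase = np.arctan2(i90 - i270, i0 - i180)   # range (-pi, pi]
    phase = np.mod(phase, 2 * np.pi)            # wrap into [0, 2*pi)

    # Distance from phase: d = c * phase / (4 * pi * f_mod).
    depth = C * phase / (4 * np.pi * f_mod)
    return gray, depth


# Tiny synthetic example (2x2 "images") just to show the call shape.
rng = np.random.default_rng(0)
frames = [rng.uniform(50, 200, (2, 2)) for _ in range(4)]
gray, depth = tof_from_four_phases(*frames, ambient=np.full((2, 2), 10.0))
print(gray.shape, depth.shape)
```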
S107, the terminal determines whether the first gray image comprises the two-dimensional code or not, and takes the two-dimensional code contained in the first gray image as a target two-dimensional code.
The terminal performs image recognition based on the first grayscale image to determine whether a two-dimensional code is included therein.
In some possible cases, the terminal may extract feature points from the first grayscale image. For example, pixel points at black-and-white transitions in the first grayscale image may be marked as feature points: if the color value of the current pixel point differs from that of the previous pixel point, the current pixel point is marked as a feature point, so that all feature points are obtained. Then, the terminal may determine the ratio of all feature points to all pixel points of the first grayscale image, and determine whether the ratio is greater than or equal to a ratio threshold. In the case that the ratio is determined to be greater than or equal to the ratio threshold, the terminal may determine that the first grayscale image includes the two-dimensional code; in the case that the ratio is determined to be smaller than the ratio threshold, the terminal may determine that the two-dimensional code is not included in the first grayscale image. The value range of the ratio threshold may be adjusted according to actual conditions, for example, 5% to 50%, such as 20%; the value of the ratio threshold should not constitute a limitation on the embodiment of the present application.
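A minimal sketch of this feature-point ratio test follows; the binarization level, the horizontal scan direction, and the 20% threshold are illustrative assumptions rather than values given by the patent.

```python
import numpy as np


def likely_contains_qr(gray: np.ndarray, ratio_threshold: float = 0.20,
                       binarize_at: int = 128) -> bool:
    """Sketch of the feature-point ratio test described above.

    A pixel is counted as a feature point when its binarized value differs
    from that of the previous pixel in the same row (a black/white transition).
    """
    binary = (gray >= binarize_at).astype(int)
    # Horizontal transitions: compare each pixel with its left neighbour.
    transitions = np.abs(np.diff(binary, axis=1))
    feature_points = int(transitions.sum())
    ratio = feature_points / gray.size
    return ratio >= ratio_threshold


# A checkerboard-like patch has many transitions, a flat patch has none.
checker = np.indices((32, 32)).sum(axis=0) % 2 * 255
flat = np.zeros((32, 32), dtype=np.uint8)
print(likely_contains_qr(checker))  # True  -> treat as containing a code
print(likely_contains_qr(flat))     # False -> no code detected
```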
When the terminal determines that the two-dimensional code is not included in the first grayscale image, the terminal may perform the following step S120 to further determine whether a scannable code device is included in the first grayscale image.
When the terminal determines that the first grayscale image includes the two-dimensional code, the terminal may perform the following step S108 to further determine whether the two-dimensional code is complete.
And S108, the terminal determines whether the target two-dimensional code is a complete two-dimensional code.
This step S108 is optional. In some possible cases, after the terminal has performed step S107, the terminal may perform step S110 described below.
The terminal may determine whether the target two-dimensional code is a complete two-dimensional code based on the first depth image.
In some possible cases, after recognizing that a two-dimensional code (the target two-dimensional code) is included in the first depth image, the terminal may perform edge detection on the target two-dimensional code to determine whether the shape of the target two-dimensional code is complete. For example, when detecting that the target two-dimensional code has four sides (denoted as wide side 1, wide side 2, long side 1, and long side 2), the terminal may determine that the target two-dimensional code is a complete two-dimensional code when the wide side 1 and the wide side 2 are equal in length, or the difference between them is smaller than a first length threshold, and the long side 1 and the long side 2 are equal in length, or the difference between them is smaller than the first length threshold. When the wide side 1 and the wide side 2 are not equal in length and the difference between them is greater than or equal to the first length threshold, or when the long side 1 and the long side 2 are not equal in length and the difference between them is greater than or equal to the first length threshold, it can be determined that the target two-dimensional code is not a complete two-dimensional code. For another example, when the terminal detects that the target two-dimensional code has a shape with regular edges (e.g., a circle), the terminal may determine that the target two-dimensional code is a complete two-dimensional code; when the terminal detects that the target two-dimensional code has an irregular shape (for example, the edge includes both an arc and a straight line), it can be determined that the target two-dimensional code is not a complete two-dimensional code.
It should be understood that, in the present application, whether the target two-dimensional code is a complete two-dimensional code may also be determined in other manners. For example, edge detection may be performed on the target two-dimensional code to obtain the frame of the two-dimensional code, and when it is determined that the frame conforms to a frame type of a two-dimensional code, it may be determined that the target two-dimensional code is a complete two-dimensional code. For example, in the case where the frame is determined to be square or circular, the terminal may determine that the target two-dimensional code is a complete two-dimensional code; in the case where the frame is determined to be neither square nor circular, the terminal may determine that the target two-dimensional code is not a complete two-dimensional code.
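As an illustration of the side-length comparison above, the sketch below assumes the four corner points of the detected frame have already been obtained by a prior edge-detection step (not shown) and only checks that each pair of opposite sides is equal or differs by less than the first length threshold; the threshold value and corner ordering are assumptions.

```python
from math import dist


def is_complete_quad(corners, length_threshold: float = 5.0) -> bool:
    """Sketch of the completeness check described above.

    corners: four (x, y) points of the detected frame, assumed ordered as
             top-left, top-right, bottom-right, bottom-left.
    """
    tl, tr, br, bl = corners
    wide_1, wide_2 = dist(tl, tr), dist(bl, br)   # the two "wide" sides
    long_1, long_2 = dist(tl, bl), dist(tr, br)   # the two "long" sides
    return (abs(wide_1 - wide_2) < length_threshold and
            abs(long_1 - long_2) < length_threshold)


print(is_complete_quad([(0, 0), (100, 0), (100, 100), (0, 100)]))  # True
print(is_complete_quad([(0, 0), (100, 0), (100, 100), (0, 40)]))   # False (one side cut off)
```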
In a case that the terminal determines that the two-dimensional code (target two-dimensional code) in the first grayscale image is a complete two-dimensional code, the terminal may execute the following step S110 to further determine whether the target two-dimensional code is a payment-type two-dimensional code or another two-dimensional code, and the terminal determines that the target two-dimensional code is another two-dimensional code, that is, the terminal determines that the target two-dimensional code is not a payment-type two-dimensional code.
In a case where the terminal determines that the two-dimensional code (target two-dimensional code) in the first grayscale image is not a complete two-dimensional code, the terminal may execute the following step S109 to acquire the first depth image and the first grayscale image again for determination.
S109, the terminal generates a first vibration to prompt a user to align the two-dimensional code.
This step S109 is optional. In the case where the aforementioned step S108 is not performed, the terminal may not perform step S109.
The terminal generates the first vibration to prompt the user to align with the two-dimensional code, so that the user can align the target camera of the terminal with the two-dimensional code by moving the terminal. The first vibration may be preset by the user, for example, in the quick code scanning setting item.
After the terminal generates the first vibration, the terminal may perform step S106 again to obtain the first grayscale image and the first depth image again through the target camera, and continue to perform subsequent operations based on the first grayscale image and the first depth image so that the terminal may implement the automatic code scanning method in the embodiment of the present application.
The type of the first vibration may be preset by the user; for example, the user may set the number of vibrations to 3. In some possible cases, the terminal may preset several types of vibration for the user to select from. For example, the terminal may preset the vibration types as a single vibration, 2 vibrations, 3 vibrations, and the like, from which the user may select one type as the type of the first vibration. As for the process of setting the type of the first vibration, reference may be made to the following description of fig. 8A to 8D, which is not repeated here.
It should be understood that prompt manners other than the first vibration may also be used, such as a first prompt tone. The first prompt tone is used to prompt the user to align with the two-dimensional code; for the setting of the first prompt tone, reference may be made to the description of the first vibration, which is not repeated here.
In some possible cases, besides the first vibration and the first prompt tone, other types of prompt information may be used to prompt the user to align with the two-dimensional code, which is not limited in this embodiment of the present application. When the first vibration or the first prompt tone is adopted, the user is prompted through vibration or sound generated by the terminal, and the prompt content does not need to be displayed on an interface, so the user's use of the display screen of the terminal is not affected.
And S110, the terminal determines the type of the target two-dimensional code, wherein the type can comprise a payment type two-dimensional code and other two-dimensional codes.
The terminal may determine whether the two-dimensional code (target two-dimensional code) in the first grayscale image is a payment-type two-dimensional code or another two-dimensional code based on the first grayscale image.
The method for determining whether the target two-dimensional code is the payment-type two-dimensional code or the other two-dimensional codes by the terminal includes, but is not limited to, the following methods:
mode 1: in some possible cases, the terminal may pre-store K images including the pay-type two-dimensional code, and extract image features from the K images to obtain K image features. Then, the terminal obtains the image characteristics of the two-dimensional code (target two-dimensional code) in the first gray level image to obtain the image characteristics of the target two-dimensional code. Subsequently, the terminal compares the image characteristics of the target two-dimensional code with the K image characteristics, and when it is determined that one image characteristic in the K images is matched with the image characteristics of the target two-dimensional code, the terminal can determine that the target two-dimensional code is a payment type two-dimensional code. When it is determined that the image features of the K images do not match the image features of the target two-dimensional code, the terminal may determine that the target two-dimensional code is another two-dimensional code. The image features may be used to describe one or more of color features, texture features, shape features, and spatial relationship features of the image.
Mode 2: the terminal may perform text recognition on the first grayscale image to identify whether text related to payment is included therein, such as: "pay", etc. When determining that the characters related to payment are included, the terminal may determine that the target two-dimensional code is a payment-type two-dimensional code. When determining that the characters related to payment are not included, the terminal may determine that the target two-dimensional code is another two-dimensional code.
Upon determining that the target two-dimensional code is another two-dimensional code (not a payment-type two-dimensional code), the terminal may perform the following step S112.
Upon determining that the target two-dimensional code is a payment-type two-dimensional code, the terminal may perform the following step S114.
And S111, the terminal determines that the target two-dimensional code is other two-dimensional codes.
S112, the terminal determines whether the distance between the target two-dimensional code and the terminal is less than 100cm or not based on the first depth image.
In a case that the terminal determines that the two-dimensional code (target two-dimensional code) in the first depth image is another two-dimensional code, the terminal may determine whether a distance between the target two-dimensional code and the terminal is smaller than a first preset threshold based on the first depth image, where the first preset threshold may be 100cm, or may be another value, for example, 80cm-110cm, for example, 90cm, and the like.
In some possible cases, the terminal may use an average value of distances from all the pixel points in the first depth image to the terminal as the distance between the target two-dimensional code and the terminal.
In some possible cases, the terminal may use the average value of the distances from all pixel points in the middle region of the first depth image to the terminal as the distance between the target two-dimensional code and the terminal. The center of the middle region coincides with the center of the first depth image, and the length and width of the middle region may be R times those of the first depth image, where R is a number greater than 0 and less than 1.
The terminal may also determine the distance between the target two-dimensional code and the terminal in other manners, for example, the terminal may use the distance between the center pixel point of the first depth image and the terminal as the distance between the target two-dimensional code and the terminal. The embodiments of the present application do not limit this.
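As an illustration of the "middle region" variant above, the sketch below averages the valid depth values inside a centered sub-region whose side lengths are R times those of the first depth image; missing measurements (stored as 0, as described earlier) are ignored, and R = 0.5 and the centimeter unit are arbitrary choices for the example.

```python
import numpy as np


def code_distance_cm(depth: np.ndarray, r: float = 0.5) -> float:
    """Sketch of the middle-region distance estimate described above.

    depth: first depth image, each pixel holding the distance (here in cm) from
           the corresponding three-dimensional point to the terminal.
    r:     relative size of the centered region, 0 < r < 1 (assumed value).
    """
    h, w = depth.shape
    rh, rw = max(1, int(h * r)), max(1, int(w * r))
    top, left = (h - rh) // 2, (w - rw) // 2
    region = depth[top:top + rh, left:left + rw]
    valid = region[region > 0]          # drop pixels with missing depth
    return float(valid.mean()) if valid.size else float("inf")


depth = np.full((240, 320), 85.0)       # pretend the code is about 85 cm away
depth[:10, :] = 0                        # some missing measurements
print(code_distance_cm(depth))           # ~85.0
print(code_distance_cm(depth) < 100)     # True -> within the 100 cm threshold
```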
When the terminal determines that the distance between the target two-dimensional code and the terminal is less than a first preset threshold (e.g., 100 cm), the terminal may further determine that the user has an intention of scanning the code actively, and at this time, the terminal may perform the following step S116 to display a first interface to implement quick code scanning (active code scanning).
When the terminal determines that the distance between the target two-dimensional code and the terminal is greater than a first preset threshold (for example, 100 cm), the terminal may further determine that the user does not have an intention of scanning the code, and the terminal may execute the following step S119 to delete the first depth image and the first grayscale image acquired this time, and release the memory of the terminal.
When the terminal determines that the distance of the target two-dimensional code from the terminal is equal to a first preset threshold (e.g., 100 cm), the terminal may perform one of the following steps S119 or S116.
S113, the terminal determines that the target two-dimensional code is a payment-type two-dimensional code.
S114, the terminal determines whether the distance between the target two-dimensional code and the terminal is less than 60cm or not based on the first depth image.
In a case where the terminal determines that the two-dimensional code (target two-dimensional code) in the first depth image is the payment-type two-dimensional code, the terminal may determine whether a distance between the target two-dimensional code and the terminal is smaller than a second preset threshold based on the first depth image, where the second preset threshold may be 60cm, or may be another value, for example, 40cm to 60cm, for example, 50cm, and the like.
The terminal determines the description of the distance between the target two-dimensional code and the terminal based on the first depth image, which refers to the foregoing description of the related content in step S112 and is not described herein again.
When the terminal determines that the distance between the target two-dimensional code and the terminal is less than a second preset threshold (for example, 60 cm), the terminal may further determine that the user has an active code scanning intention, and at this time, the terminal may perform the following step S116 to display the first interface to implement quick code scanning.
When the terminal determines that the distance between the target two-dimensional code and the terminal is greater than a second preset threshold (for example, 60 cm), the terminal may perform step S115, and in step S115, the terminal may generate a second vibration to prompt the user to approach the two-dimensional code, so that the terminal may re-acquire the first grayscale image and the first depth image, and obtain the first grayscale image acquired when the distance between the terminal and the two-dimensional code is less than the second preset threshold (for example, 60 cm) to implement quick code scanning.
When the terminal determines that the distance between the target two-dimensional code and the terminal is equal to a second preset threshold (e.g., 60 cm), the terminal may perform one of the following steps S115 or S116.
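Combining steps S110 to S116: payment-type codes are compared against the second preset threshold (for example, 60 cm), other codes against the first preset threshold (for example, 100 cm), and a distance exactly equal to a threshold may fall into either branch. A sketch of that decision follows, with the equal-distance case arbitrarily routed to the "far" branch.

```python
def next_action(code_type: str, distance_cm: float,
                other_threshold_cm: float = 100.0,
                payment_threshold_cm: float = 60.0) -> str:
    """Sketch of the decision described in steps S110-S116 (illustrative only).

    code_type:   "payment" or "other", as determined in step S110.
    distance_cm: distance between the target code and the terminal (S112/S114).
    """
    if code_type == "payment":
        if distance_cm < payment_threshold_cm:
            return "show_first_interface"   # S116: quick active code scanning
        return "second_vibration"           # S115: prompt the user to move closer
    # other two-dimensional codes
    if distance_cm < other_threshold_cm:
        return "show_first_interface"       # S116
    return "discard_images"                 # S119: no scan intent, free the memory


print(next_action("payment", 45))   # show_first_interface
print(next_action("payment", 80))   # second_vibration
print(next_action("other", 120))    # discard_images
```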
And S115, the terminal generates a second vibration to prompt the user to approach the two-dimensional code.
This step S115 is optional. In some possible cases, when the terminal determines that the distance between the target two-dimensional code and the terminal is greater than or equal to a second preset threshold (for example, 60 cm), the terminal may perform the foregoing step S102 or step S103; in other possible cases, when the terminal determines that the distance between the target two-dimensional code and the terminal is greater than or equal to a second preset threshold (e.g., 60 cm), the terminal may perform the foregoing step S106.
The terminal generates the second vibration to prompt the user to approach the two-dimensional code, so that the terminal can acquire the first gray image and the first depth image again to obtain the first gray image acquired when the distance between the terminal and the two-dimensional code is smaller than a second preset threshold (for example, 60 cm) to realize quick code scanning. The second vibration may be preset by the user, for example, in a quick code scanning setting item.
After the terminal generates the second vibration, the terminal may perform step S106 again to obtain the first grayscale image and the first depth image again through the target camera, and continue to perform subsequent operations based on the first grayscale image and the first depth image so that the terminal may implement the automatic code scanning method in the embodiment of the present application.
The type of the second vibration is different from the type of the first vibration mentioned above and may be preset by the user; for example, the user may set the number of vibrations to 1. In some possible cases, the terminal may preset several types of vibration for the user to select from. For example, the terminal may preset the vibration types as a single vibration, 2 vibrations, 3 vibrations, and the like. The user may select a type different from that of the first vibration as the type of the second vibration. For the process of setting the types of the first vibration and the second vibration, reference may be made to the following description of fig. 8A to 8D.
It should be understood that prompt manners other than the second vibration may also be used, such as a second prompt tone. The second prompt tone is used to prompt the user to move closer to the two-dimensional code; for the setting of the second prompt tone, reference may be made to the description of the second vibration, which is not repeated here. The second prompt tone is different from the first prompt tone mentioned above.
In some possible cases, besides the second vibration and the second prompt tone, other types of prompt information may be used to prompt the user to move closer to the two-dimensional code, which is not limited in this embodiment of the present application. When the second vibration or the second prompt tone is adopted, the user is prompted through vibration or sound generated by the terminal without displaying the prompt content on an interface, so the user's use of the display screen of the terminal is not affected.
It should be understood that the reason why the first vibration and the second vibration are different is that different vibration types (the type of the first vibration is different from the type of the second vibration) can convey different information to the user, so that the user can determine, according to the type of vibration, exactly how to move the terminal. For example, when the terminal generates the first vibration, the user, after sensing it, can move the terminal left and right to align with the two-dimensional code; when the terminal generates the second vibration, the user, after sensing it, can move the terminal forward to get closer to the two-dimensional code. The reason why the first prompt tone is different from the second prompt tone may also refer to the description here.
Fig. 8A to 8D are diagrams illustrating the terminal setting the first vibration and the second vibration.
As shown in fig. 8A, the user interface 31 corresponds to an exemplary setting interface for the "quick code scan" setting item 301. A scan prompt setting item 315 may be included in the user interface 31. The scan prompt setting item 315 can be used to set the prompt manner, where the prompt manner is the manner of prompting the user to move the terminal so as to achieve quick code scanning; that is, through the scan prompt setting item 315, the terminal can set the first vibration and the second vibration.
In response to an operation (e.g., a click operation) for the scan prompt setting item 315, the terminal may display setting interfaces for the first vibration and the second vibration, such as the user interface 80 shown in fig. 8B described below.
As shown in fig. 8B, the user interface 80 is a setting interface for the first vibration and the second vibration. The user interface 80 may include an "alignment cue" setting item 801 and an "approach cue" setting item 802. The "alignment prompt" setting item 801 is a setting item corresponding to the first shake, and the terminal can set the type of the first shake through the "alignment prompt" setting item 801, where the "alignment prompt" setting item 801 includes one or more setting values, for example, 3, and the 3 setting values include shake 1 time (single shake), shake 2 times, and shake 3 times. In the "alignment prompt" setting item 801, the setting value "shake 1 time" corresponds to the selection control 801a, the setting value "shake 2 times" corresponds to the selection control 801b, and the setting value "shake 3 times" corresponds to the selection control 801 c. The "approach prompt" setting item 802 is a setting item corresponding to the second vibration, and through the "approach prompt" setting item 802, the terminal can set the type of the second vibration, and the "approach prompt" setting item 802 includes one or more setting values, for example, 3, and the 3 setting values include vibration 1 time (single vibration), vibration 2 times, and vibration 3 times. In the "alignment prompt" setting item 801, the setting value "shake 1 time" corresponds to the selection control 802a, the setting value "shake 2 times" corresponds to the selection control 802b, and the setting value "shake 3 times" corresponds to the selection control 802 c.
In response to an operation (e.g., a click operation) on the selection control 801a corresponding to "shake 1 time" in the "alignment prompt" setting item 801, the terminal may perform graying on the selection control 801a to a first degree to prompt the user that the selection is successful, where the type of the first shake is shake once. In addition, the terminal may also perform ash placement to a second degree on the selection control 802a corresponding to "shake 1 time" in the "approach prompt" setting item 802, so as to prompt the user that the selection control 802a cannot perform selection. The user interface involved after graying may be the user interface 81 involved in fig. 8C, among others.
Referring to the user interface 81 shown in fig. 8C, in response to an operation (e.g., a click operation) on the selection control 802b corresponding to "shake 2 times" in the "approach prompt" setting item 802, the terminal may gray out the selection control 802b to a first degree. At this time, the terminal may display the user interface 82 shown in fig. 8D described below.
As shown in the user interface 82 of fig. 8D, the terminal grays out the selection control 802b to a first degree to prompt the user that the selection is successful, that is, that the type of the second vibration is two shakes. Moreover, the terminal may gray out the selection control 801b corresponding to "shake 2 times" in the "alignment prompt" setting item 801 to a second degree, to prompt the user that the selection control 801b can no longer be selected.
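For illustration only, the mutual exclusion shown in fig. 8B to 8D (the first vibration and the second vibration may not use the same shake count, so selecting a count for one prompt grays out that count for the other) can be modeled as the Kotlin sketch below; the class and method names are assumptions made for the sketch and do not describe the actual settings implementation.

// Holds the shake counts chosen for the alignment prompt (first vibration) and the approach prompt (second vibration).
data class VibrationSettings(val alignShakes: Int? = null, val approachShakes: Int? = null) {
    fun selectAlign(count: Int): VibrationSettings =
        if (count == approachShakes) this else copy(alignShakes = count)      // value already taken: ignore
    fun selectApproach(count: Int): VibrationSettings =
        if (count == alignShakes) this else copy(approachShakes = count)      // value already taken: ignore
    fun disabledForAlign(): Set<Int> = setOfNotNull(approachShakes)           // counts grayed out in item 801
    fun disabledForApproach(): Set<Int> = setOfNotNull(alignShakes)           // counts grayed out in item 802
}

fun main() {
    var s = VibrationSettings().selectAlign(1)   // fig. 8B: "shake 1 time" chosen for the alignment prompt
    s = s.selectApproach(1)                      // ignored: "shake 1 time" is grayed out in the approach prompt
    s = s.selectApproach(2)                      // fig. 8C/8D: "shake 2 times" chosen for the approach prompt
    println(s)                                   // VibrationSettings(alignShakes=1, approachShakes=2)
    println(s.disabledForAlign())                // [2]
}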
S116, the terminal displays a first interface, and the first interface comprises a first indicator.
One or more first indicators may be included in the first interface. The form of the first indicator includes but is not limited to a link, an icon or a text.
An example of the first interface may be the user interface 40 shown in fig. 4, and may also be the user interface 11 shown in (2) in fig. 2A.
The first indicator may be used to trigger opening of the code scanning function in the application corresponding to the first indicator. The terminal may then scan the first grayscale image by using the code scanning function and display an interface after the code scanning is completed, based on which the user may perform a corresponding operation (e.g., payment) through the terminal. For other descriptions of the first indicator, reference may be made to the foregoing description, and details are not repeated herein.
The first indicator is preset by the user, and the setting process of the first indicator may refer to the foregoing description of fig. 3A to 3J, and is not described herein again.
S117, the terminal detects an operation of selecting a target first indicator in the first interface, and in response to the operation, opens the code scanning function in the application corresponding to the target first indicator, scans the target two-dimensional code in the first grayscale image, and displays the interface after code scanning, where the interface after code scanning is the interface after the target two-dimensional code is scanned.
The process involved in step S117 may refer to the foregoing description of fig. 4 or the foregoing fig. 2A and 2B.
Here, the step S117 will be described by taking fig. 4 as an example.
The operation of selecting a target first indicator in the first interface, as detected by the terminal, may be regarded as the user's operation on the indicator 391 in the user interface 40. The indicator 391 may be regarded as the target first indicator.
In response to the operation of selecting a target first indicator in the first interface, the terminal may open a code scanning function in an application corresponding to the target first indicator, and scan a target two-dimensional code in the first grayscale image. At this time, the user interface displayed by the terminal may refer to the user interface 41 shown in (2) in fig. 4.
Then, the terminal may display an interface after the code scanning, where the interface after the code scanning is an interface after the target two-dimensional code is scanned. For example, reference may be made to the user interface 42 shown in FIG. 4 previously described.
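As a non-limiting sketch of the decode step in S117, the snippet below feeds the first grayscale image to the open-source ZXing core library; the embodiment does not name any particular decoder, so the use of ZXing (a dependency on com.google.zxing:core) and the helper name decodeTargetCode are assumptions for illustration.

import com.google.zxing.BinaryBitmap
import com.google.zxing.MultiFormatReader
import com.google.zxing.NotFoundException
import com.google.zxing.RGBLuminanceSource
import com.google.zxing.common.HybridBinarizer

// gray: 8-bit luminance values of the first grayscale image, row-major, of size width * height.
fun decodeTargetCode(gray: ByteArray, width: Int, height: Int): String? {
    // Expand each gray value into an opaque RGB pixel so that RGBLuminanceSource can consume it.
    val pixels = IntArray(width * height) { i ->
        val g = gray[i].toInt() and 0xFF
        (0xFF shl 24) or (g shl 16) or (g shl 8) or g
    }
    val bitmap = BinaryBitmap(HybridBinarizer(RGBLuminanceSource(width, height, pixels)))
    return try {
        MultiFormatReader().decode(bitmap).text   // content of the target two-dimensional code
    } catch (e: NotFoundException) {
        null                                      // no code decoded; the caller may reacquire the images
    }
}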
S118, the terminal detects an operation of closing the first interface, and closes the first interface in response to the operation.
In some possible cases, a close control may be included in the first interface, for example, the close control may refer to the close control 401 in the user interface 40 shown in (1) in fig. 4.
In response to an operation (e.g., a click operation) with respect to the close control, the terminal may close the first interface.
In other possible cases, the terminal may close the first interface by other means, such as sliding the screen to close the first interface. The method for closing the first interface by the terminal is not limited in the embodiment of the application.
After step S118 is executed, the terminal may execute the following step S119, in which the first depth image and the first grayscale image acquired this time are deleted and the memory of the terminal is released.
S119, the terminal deletes the first depth image and the first grayscale image.
The purpose of deleting the first depth image and the first gray image by the terminal is to release the memory of the terminal and save the storage space.
Subsequently, the terminal may continue to perform step S102 or step S103 to determine again whether the user has an operation of attempting to scan a code.
S120, the terminal determines whether the first gray-scale image comprises the code scanning device.
In some possible cases, the terminal may pre-store S images each containing a code scanning device, and extract the image features of these S images to obtain S image features. The terminal then extracts the image features of the first grayscale image to obtain a target image feature. Subsequently, the terminal compares the target image feature with the S image features: when one of the S image features matches the target image feature, the terminal may determine that the first grayscale image includes a code scanning device; when none of the S image features matches the target image feature, the terminal may determine that the first grayscale image does not include a code scanning device. An image feature may describe one or more of the color features, texture features, shape features, and spatial relationship features of an image.
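Purely to illustrate the matching idea in step S120, the Kotlin sketch below compares a target feature vector with the S stored feature vectors using cosine similarity; the similarity measure, the 0.9 threshold, and the function names are assumptions, since the embodiment does not prescribe a particular matching method.

import kotlin.math.sqrt

fun cosineSimilarity(a: DoubleArray, b: DoubleArray): Double {
    require(a.size == b.size) { "feature vectors must have the same length" }
    var dot = 0.0; var normA = 0.0; var normB = 0.0
    for (i in a.indices) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return if (normA == 0.0 || normB == 0.0) 0.0 else dot / (sqrt(normA) * sqrt(normB))
}

// Returns true when the target image feature matches any of the S stored image features.
fun includesCodeScanningDevice(
    target: DoubleArray,
    storedFeatures: List<DoubleArray>,
    matchThreshold: Double = 0.9          // assumed similarity threshold for "matching"
): Boolean = storedFeatures.any { cosineSimilarity(target, it) >= matchThreshold }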
When the terminal determines that the first grayscale image does not include a code scanning device, the terminal may continue to perform step S106 to acquire the first grayscale image and the first depth image again through the target camera, and continue the subsequent operations based on them so that the terminal can implement the automatic code scanning method in the embodiment of the present application. Meanwhile, the terminal may record the number of times H that steps S106, S107 and S120 have been executed consecutively. If H reaches a threshold number of times and the output of step S120 still indicates that no code scanning device is included in the first grayscale image, the terminal may perform step S102 or step S103 instead of step S106 and re-determine whether the user has an intention to scan a code. The threshold number of times may be a value from 1 to 10, for example 5, or may be another value, which is not limited in the embodiments of the present application. Consecutive execution of steps S106, S107 and S120 means that, based on the first depth image and the first grayscale image acquired at one time, step S107 is performed after step S106, and step S120 is performed thereafter.
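A minimal Kotlin sketch of the retry bookkeeping described above follows, using the example threshold of 5 consecutive rounds; the class, method, and enum names are illustrative assumptions.

class ScanRetryController(private val maxConsecutiveMisses: Int = 5) {   // threshold may be 1 to 10, e.g. 5
    private var h = 0   // consecutive S106 -> S107 -> S120 rounds that found no code scanning device

    enum class Action { REACQUIRE_IMAGES /* run S106 again */, RECHECK_INTENT /* go back to S102 or S103 */ }

    // Called each time step S120 reports that the first grayscale image contains no code scanning device.
    fun onNoDeviceFound(): Action {
        h++
        return if (h >= maxConsecutiveMisses) { h = 0; Action.RECHECK_INTENT } else Action.REACQUIRE_IMAGES
    }

    // Called when a two-dimensional code or a code scanning device is found, which breaks the streak.
    fun reset() { h = 0 }
}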
In some possible cases, after determining that no code scanning device is included in the first grayscale image, the terminal may further delete the first depth image and the first grayscale image to release the memory of the terminal.
When the terminal determines that the first grayscale image includes a code scanning device, the terminal may perform the following step S121 to display the second interface and implement fast code scanning (passive code scanning).
S121, the terminal displays a second interface, where the second interface may include one or more of a second indicator and a first preset two-dimensional code of a first size.
In some possible cases, the second interface may include one or more of a second indicator and a first preset two-dimensional code of a first size. One example of this second interface may be the user interface 60 shown in fig. 6A described previously.
The second interface including the first preset two-dimensional code of the first size means that the terminal displays the first preset two-dimensional code at the first size. For example, the terminal may set a two-dimensional code preview frame and display the first preset two-dimensional code in the two-dimensional code preview frame at the first size; reference may be made to the two-dimensional code preview frame 531 mentioned above.
The second indicator corresponds to a second preset two-dimensional code and can be used for triggering display of the second preset two-dimensional code. The form of the second indicator includes but is not limited to a link, an icon or a text. For other descriptions of the second indicator, reference may be made to the foregoing description, and details are not repeated here.
The second indicator is preset by the user, and the setting process of the second indicator may refer to the foregoing description of fig. 5A to 5L, which is not described herein again.
S122, the terminal detects an operation of selecting a target first preset two-dimensional code in the second interface, and displays the target first preset two-dimensional code at a second size in response to the operation.
Here, the case where the second interface includes the first preset two-dimensional code of the first size is taken as an example for explanation.
The process involved in step S122 may refer to the foregoing description of fig. 6A.
Here, the step S122 will be described by taking fig. 6A as an example.
The operation of selecting the target first preset two-dimensional code in the second interface, as detected by the terminal, may be regarded as the user's operation on the two-dimensional code preview frame 531 (in the user interface 60) shown in (1) in fig. 6A. The first preset two-dimensional code displayed at the first size in the two-dimensional code preview frame 531 may be regarded as the target first preset two-dimensional code.
In response to the operation of selecting the target first preset two-dimensional code in the second interface, the terminal may display the target first preset two-dimensional code at the second size. At this time, the user interface displayed by the terminal may refer to the user interface 61 shown in (2) in fig. 6A.
It should be understood that the second size is larger than the first size; when displaying the target first preset two-dimensional code at the second size, the terminal may not display other two-dimensional codes, so as to prevent the code scanning device from scanning another two-dimensional code instead of the target first preset two-dimensional code.
It should be further understood that step S122 is described above by taking, as an example, the terminal detecting the operation of selecting the target first preset two-dimensional code in the second interface. In some possible cases, the terminal may instead detect an operation of selecting a target second indicator in the second interface; in response to this operation, the terminal may start the function corresponding to the target second indicator, that is, display (at the second size) the second preset two-dimensional code corresponding to the target second indicator. This process may refer to the foregoing description of fig. 6B, where the indicator 591 may be regarded as a target second indicator.
It should be understood that, in some possible cases, a close control may be included in the second interface. In response to an operation (e.g., a click operation) on the close control, the terminal may close the second interface. In other possible cases, the terminal may close the second interface in other ways, for example, by sliding the screen. The embodiment of the present application does not limit the way in which the terminal closes the second interface.
In the embodiment of the present application, the first preset threshold (e.g., 100 cm) may also be referred to as a second threshold; the second preset threshold (e.g., 60 cm) may also be referred to as the first threshold. The closing control of the first interface can also be referred to as a first control, and the operation for the closing control in the first interface can also be referred to as a first operation; the closing control of the second interface may also be referred to as a second control, and the operation for the closing control in the second interface may also be referred to as a second operation. The first posture and the second posture in the embodiment of the present application may be collectively referred to as a preset posture.
It should be understood that the first preset threshold (for example, 100 cm) mentioned above should be larger than the second preset threshold (for example, 60 cm). The first preset threshold applies to the case where the target two-dimensional code is another type of two-dimensional code, which is generally larger than a payment-type two-dimensional code (for example, a location code), so the user generally scans it through the terminal from a longer distance. A payment-type two-dimensional code, by contrast, is small, and the user usually scans it through the terminal at a short distance. The first vibration and the first prompt sound may be collectively referred to as first prompt information, that is, the first prompt information may be one of the first vibration or the first prompt sound. The second vibration and the second prompt sound may be collectively referred to as second prompt information, that is, the second prompt information may be one of the second vibration or the second prompt sound; the second prompt information is different from the first prompt information. The first posture may also be referred to as a first motion posture; the second posture may also be referred to as a second motion posture.
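To make the threshold relationship concrete, the Kotlin sketch below gates the display of the first interface on the code type and on the distance obtained from the first depth image, using the example values of 60 cm (first threshold) and 100 cm (second threshold) given above; the type and function names are illustrative assumptions.

enum class CodeType { PAYMENT, OTHER }

fun shouldShowFirstInterface(
    type: CodeType,
    distanceCm: Double,
    firstThresholdCm: Double = 60.0,     // example first threshold (the second preset threshold above)
    secondThresholdCm: Double = 100.0    // example second threshold (the first preset threshold above)
): Boolean = when (type) {
    CodeType.PAYMENT -> distanceCm < firstThresholdCm    // payment codes are small and scanned up close
    CodeType.OTHER   -> distanceCm < secondThresholdCm   // larger codes such as location codes are scanned farther away
}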
An exemplary terminal provided in an embodiment of the present application is described below.
Fig. 9 is a schematic structural diagram of a terminal according to an embodiment of the present application.
The following describes embodiments in detail by taking a terminal as an example. It should be understood that a terminal may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The terminal may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to the terminal. In other embodiments of the present application, the terminal may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
Wherein, the controller can be the neural center and the command center of the terminal. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, and the like.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not form a limitation on the structure of the terminal. In other embodiments of the present application, the terminal may also adopt different interface connection manners or a combination of multiple interface connection manners in the foregoing embodiments.
The wireless communication function of the terminal can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in a terminal may be used to cover a single or multiple communication bands.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication and the like applied on the terminal.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal.
The wireless communication module 160 may provide solutions for wireless communication applied to a terminal, including Wireless Local Area Networks (WLANs) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the terminal can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), and the like.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a Liquid Crystal Display (LCD). The display panel may also employ organic light-emitting diodes (OLEDs), active-matrix organic light-emitting diodes (AMOLEDs), or the like. In some embodiments, the terminal may include 1 or N display screens 194, where N is a positive integer greater than 1.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the terminal may include 1 or N cameras 193, where N is a positive integer greater than 1. The camera of the terminal may include at least one of a front-mounted TOF camera and a rear-mounted TOF camera.
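As one possible, assumed way of turning the first depth image from such a TOF camera into the code-to-terminal distance used in the method above, the Kotlin sketch below takes the median depth inside the detected code's bounding box; the embodiments do not prescribe this particular computation, and the names are illustrative.

// Bounding box of the target two-dimensional code within the first grayscale image.
data class Box(val x: Int, val y: Int, val w: Int, val h: Int)

// depth: per-pixel distances in centimetres, row-major, of size width * height (aligned with the grayscale image).
fun codeDistanceCm(depth: FloatArray, width: Int, box: Box): Float {
    require(box.w > 0 && box.h > 0) { "empty bounding box" }
    val samples = ArrayList<Float>(box.w * box.h)
    for (row in box.y until box.y + box.h)
        for (col in box.x until box.x + box.w)
            samples.add(depth[row * width + col])
    samples.sort()
    return samples[samples.size / 2]   // the median is robust to stray background pixels inside the box
}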
The terminal can implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194.
The gyroscope sensor 180B may be used to determine the motion attitude of the terminal. In some embodiments, the angular velocity of the terminal about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyroscope sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the terminal, calculates the distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the terminal through reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the terminal calculates altitude from the barometric pressure measured by barometric pressure sensor 180C to assist in positioning and navigation.
The magnetic sensor 180D includes a hall sensor. The terminal may detect the opening and closing of the flip holster using the magnetic sensor 180D.
The acceleration sensor 180E can detect the magnitude of the terminal's acceleration in various directions (typically along three axes). When the terminal is stationary, it can detect the magnitude and direction of gravity. It can also be used to recognize the terminal's posture, and is applied in scenarios such as landscape/portrait screen switching and pedometers.
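For illustration, readings of the kind provided by the acceleration sensor 180E and the gyroscope sensor 180B could be reduced to the two preset postures described earlier (vertical movement with or without upward turning) roughly as in the Kotlin sketch below; the thresholds, feature names, and classification rule are assumptions, not the claimed detection method.

// Aggregated motion features over a short window of sensor readings.
data class MotionSample(val verticalDisplacementCm: Double, val pitchChangeDeg: Double)

// FIRST: lifted and turned upward (rear camera is used); SECOND: lifted without turning upward (front camera is used).
enum class Posture { FIRST, SECOND, NONE }

fun classifyPosture(
    s: MotionSample,
    liftThresholdCm: Double = 10.0,   // assumed minimum vertical movement
    flipThresholdDeg: Double = 30.0   // assumed minimum upward pitch change
): Posture = when {
    s.verticalDisplacementCm < liftThresholdCm -> Posture.NONE
    s.pitchChangeDeg >= flipThresholdDeg -> Posture.FIRST
    else -> Posture.SECOND
}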
A distance sensor 180F for measuring a distance. The terminal may measure the distance by infrared or laser. In some embodiments, a scene is photographed and the terminal may range using the distance sensor 180F to achieve fast focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode.
The ambient light sensor 180L is used to sense the ambient light level. The terminal may adaptively adjust the brightness of the display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the terminal is in a pocket, to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The terminal can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also referred to as a "touch panel".
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The terminal may receive key inputs and generate key signal inputs related to user settings and function control of the terminal.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card.
In this embodiment, the processor 110 may call a computer instruction stored in the internal memory 121, so that the terminal executes the automatic code scanning method in this embodiment.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to a determination of …" or "in response to a detection of …", depending on the context. Similarly, depending on the context, the phrase "at the time of determination …" or "if (a stated condition or event) is detected" may be interpreted to mean "if the determination …" or "in response to the determination …" or "upon detection (a stated condition or event)" or "in response to detection (a stated condition or event)".
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions described in accordance with the embodiments of the application are all or partially generated when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, digital subscriber line) or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.

Claims (15)

1. An automatic code scanning method, characterized in that the method comprises:
when the terminal detects that the terminal generates a preset gesture, the terminal starts a target camera to obtain a first depth image and a first gray image; the first gray image comprises a shot object, and the first depth image is used for indicating the distance from a three-dimensional point in the shot object to the terminal;
under the condition that the terminal determines that the first gray image comprises the two-dimension code, the terminal takes the two-dimension code in the first gray image as a target two-dimension code;
when the terminal determines that the target two-dimensional code is a payment-type two-dimensional code and the terminal determines that the distance between the target two-dimensional code and the terminal is smaller than a first threshold value based on the first depth image; or, when the terminal determines that the target two-dimensional code is not a payment-type two-dimensional code and the terminal determines that the distance between the target two-dimensional code and the terminal is smaller than a second threshold value based on the first depth image; the terminal displays a first interface; the first interface comprises one or more first indicators; wherein a first indicator corresponds to a code scanning function in an application; the first threshold is less than the second threshold;
responding to the operation of selecting a target first indicator in the first interface, and enabling a code scanning function corresponding to the target first indicator by the terminal;
and the terminal scans the target two-dimensional code based on the first gray level image.
2. The method of claim 1, further comprising:
under the condition that the terminal determines that the first gray-scale image does not comprise the two-dimensional code and determines that the first gray-scale image comprises the code scanning device, the terminal displays a second interface; the second interface comprises one or more second indicators or one or more first preset two-dimensional codes with a first size; wherein, a second indicator corresponds to a second preset two-dimensional code in an application;
under the condition that one or more second indicators are included in the second interface, in response to an operation of selecting a target second indicator in the second interface, the terminal displays a second preset two-dimensional code corresponding to the target second indicator so that the code scanning device scans the code based on the second preset two-dimensional code; or, in the case that the second interface includes one or more first preset two-dimensional codes, in response to an operation of selecting a target first preset two-dimensional code in the second interface, the terminal displays the target first preset two-dimensional code in a second size, so that the code scanning device scans codes based on the target first preset two-dimensional code; wherein the second size is greater than the first size.
3. The method according to claim 2, wherein after the terminal takes the two-dimensional code included in the first grayscale image as a target two-dimensional code, before the terminal determines that the target two-dimensional code is a payment-type two-dimensional code, the method further comprises:
the terminal determines whether the target two-dimensional code is a complete two-dimensional code;
under the condition that the target two-dimensional code is determined to be a complete two-dimensional code, the terminal determines the type of the target two-dimensional code;
under the condition that the target two-dimensional code is determined not to be a complete two-dimensional code, the terminal generates first prompt information to prompt a user to align the two-dimensional code; the first prompt message is one of first vibration or first prompt sound;
after the terminal generates the first prompt message, the terminal acquires the first depth image and the first gray image again based on the target camera.
4. The method according to any one of claims 1-3, further comprising:
when the terminal determines that the target two-dimensional code is a payment-type two-dimensional code and determines that the distance between the target two-dimensional code and the terminal is greater than or equal to a first threshold value based on the first depth image, the terminal generates second prompt information to prompt a user to approach the two-dimensional code; the second prompt message is one of second vibration or second prompt sound, and the second prompt message is different from the first prompt message;
and after the terminal generates second prompt information, the terminal acquires the first depth image and the first gray-scale image again based on the target camera.
5. The method of any one of claims 1-3, wherein the preset gesture comprises one of a first motion gesture or a second motion gesture; the first motion posture is vertical movement and upward turning; the second motion posture is vertical movement or the second motion posture is vertical movement and non-upturning.
6. The method according to claim 5, wherein the terminal determines that the target camera is a rear camera if the terminal generates the first motion gesture.
7. The method of claim 5, wherein the target camera is a front-facing camera if the terminal determines that the terminal generates the second motion gesture.
8. The method of any of claims 1-3, 6, 7, further comprising:
the first interface comprises a first control;
detecting a first operation on the first control;
and responding to the first operation, and closing the first interface by the terminal.
9. The method of claim 2, 3, 6 or 7, further comprising:
the second interface comprises a second control;
detecting a second operation for the second control;
and responding to the second operation, and closing the second interface by the terminal.
10. The method of claim 2, 3, 6 or 7, further comprising:
in a case where the terminal determines that the two-dimensional code is not included in the first grayscale image and determines that the scannable code device is not included in the first grayscale image, the terminal deletes the first depth image and the first grayscale image.
11. The method of claim 2, 3, 6 or 7, wherein the first indicator, the second indicator and the first preset two-dimensional code are preset.
12. The method of claim 2, 3, 6 or 7, wherein the first predetermined two-dimensional code is not time sensitive, the second predetermined two-dimensional code is time sensitive, and the second predetermined two-dimensional code is invalid after exceeding a third threshold.
13. A terminal, characterized in that the terminal comprises: one or more processors and memory; the memory coupled with the one or more processors, the memory for storing computer program code, the computer program code comprising computer instructions, the one or more processors invoking the computer instructions to cause the terminal to perform the method of any of claims 1-12.
14. A system-on-chip for application to a terminal, the system-on-chip comprising one or more processors configured to invoke computer instructions to cause the terminal to perform the method of any one of claims 1 to 12.
15. A computer-readable storage medium comprising instructions that, when executed on a terminal, cause the terminal to perform the method of any one of claims 1 to 12.
CN202211038629.6A 2022-08-29 2022-08-29 Automatic code scanning method and terminal Active CN115130491B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211038629.6A CN115130491B (en) 2022-08-29 2022-08-29 Automatic code scanning method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211038629.6A CN115130491B (en) 2022-08-29 2022-08-29 Automatic code scanning method and terminal

Publications (2)

Publication Number Publication Date
CN115130491A true CN115130491A (en) 2022-09-30
CN115130491B CN115130491B (en) 2023-01-31

Family

ID=83387338

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211038629.6A Active CN115130491B (en) 2022-08-29 2022-08-29 Automatic code scanning method and terminal

Country Status (1)

Country Link
CN (1) CN115130491B (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110069152A1 (en) * 2009-09-24 2011-03-24 Shenzhen Tcl New Technology Ltd. 2D to 3D video conversion
US20120162366A1 (en) * 2010-12-27 2012-06-28 Dolby Laboratories Licensing Corporation 3D Cameras for HDR
CN107665324A (en) * 2016-07-27 2018-02-06 腾讯科技(深圳)有限公司 A kind of image-recognizing method and terminal
CN106875191A (en) * 2017-02-27 2017-06-20 努比亚技术有限公司 One kind scanning payment processing method, device and terminal
US20180322214A1 (en) * 2017-05-08 2018-11-08 Shenzhen Youbess Tech Service Co., Ltd Data display method
US20200043130A1 (en) * 2018-08-04 2020-02-06 Beijing Jingdong Shangke Information Technology Co., Ltd. System and method for scan-matching oriented visual slam
CN109241807A (en) * 2018-08-17 2019-01-18 湖南大学 A kind of remote two dimensional code localization method
CN109376830A (en) * 2018-10-17 2019-02-22 京东方科技集团股份有限公司 Two-dimensional code generation method and device
CN111311244A (en) * 2018-12-11 2020-06-19 北京意锐新创科技有限公司 Passive code scanning method and device based on QR (quick response) code
CN111311233A (en) * 2018-12-11 2020-06-19 北京意锐新创科技有限公司 Passive code scanning method and device based on multi-trigger mode
CN114330400A (en) * 2020-10-12 2022-04-12 珠海格力电器股份有限公司 Two-dimensional code image processing method, system, device, electronic equipment and storage medium
CN112465092A (en) * 2020-10-29 2021-03-09 深圳大学 Two-dimensional code sample generation method and device, server and storage medium
CN112585613A (en) * 2020-11-30 2021-03-30 华为技术有限公司 Code scanning method and device
WO2022110106A1 (en) * 2020-11-30 2022-06-02 华为技术有限公司 Code scanning method and apparatus
CN113065374A (en) * 2021-04-01 2021-07-02 支付宝(杭州)信息技术有限公司 Two-dimensional code identification method, device and equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116306733A (en) * 2023-02-27 2023-06-23 荣耀终端有限公司 Method for amplifying two-dimensional code and electronic equipment
CN116306733B (en) * 2023-02-27 2024-03-19 荣耀终端有限公司 Method for amplifying two-dimensional code and electronic equipment

Also Published As

Publication number Publication date
CN115130491B (en) 2023-01-31

Similar Documents

Publication Publication Date Title
WO2020041952A1 (en) Method and electronic apparatus for controlling express delivery cabinet on the basis of express delivery message
CN108399349B (en) Image recognition method and device
CN114365476A (en) Shooting method and equipment
CN110636174B (en) Bus code calling method and mobile terminal
KR101725533B1 (en) Method and terminal for acquiring panoramic image
CN112492193B (en) Method and equipment for processing callback stream
CN110006340B (en) Object size measuring method and electronic equipment
CN110300267B (en) Photographing method and terminal equipment
CN107124556B (en) Focusing method, focusing device, computer readable storage medium and mobile terminal
CN110519503B (en) Method for acquiring scanned image and mobile terminal
CN110839128A (en) Photographing behavior detection method and device and storage medium
CN115130491B (en) Automatic code scanning method and terminal
CN108616687B (en) Photographing method and device and mobile terminal
WO2022100219A1 (en) Data transfer method and related device
CN111897465B (en) Popup display method, device, equipment and storage medium
CN113473372B (en) Equipment positioning method and related device
CN110677537B (en) Note information display method, note information sending method and electronic equipment
US20220417447A1 (en) Imaging Method for Non-Line-of-Sight Object and Electronic Device
EP3872753A1 (en) Wrinkle detection method and terminal device
CN112991439A (en) Method, apparatus, electronic device, and medium for positioning target object
CN115032640A (en) Gesture recognition method and terminal equipment
CN111147745B (en) Shooting method, shooting device, electronic equipment and storage medium
CN107194363B (en) Image saturation processing method and device, storage medium and computer equipment
EP3855358A1 (en) Object recognition method and terminal device
CN113364970A (en) Imaging method of non-line-of-sight object and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant