CN115016712A - Method and device for exiting two-dimensional code - Google Patents

Method and device for exiting two-dimensional code

Info

Publication number
CN115016712A
CN115016712A
Authority
CN
China
Prior art keywords
interface
terminal equipment
wrist
code
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111136786.6A
Other languages
Chinese (zh)
Other versions
CN115016712B (en)
Inventor
李丹洪
邸皓轩
张晓武
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202311129985.3A (CN117453105A)
Priority to CN202111136786.6A (CN115016712B)
Publication of CN115016712A
Priority to PCT/CN2022/118319 (WO2023045789A1)
Application granted
Publication of CN115016712B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this application provide a method and a device for exiting a two-dimensional code, relating to the field of terminal technologies. The method includes: a terminal device displays a first interface, where the first interface includes a first two-dimensional code; the terminal device acquires first data during a wrist-flip process, where the first data includes acceleration data collected by an acceleration sensor and angular acceleration data collected by a gyroscope sensor; and when the first data indicates that a wrist-flip action has occurred, the terminal device exits the first interface. In this way, the terminal device can decide whether to exit the two-dimensional code by recognizing the user's wrist-flip action during code scanning, improving the flexibility of exiting the code-scanning interface.

Description

Method and device for exiting two-dimensional code
Technical Field
This application relates to the field of terminal technologies, and in particular, to a method and a device for exiting a two-dimensional code.
Background
With the wide adoption of the two-dimensional code, more and more terminal devices use it to implement functions such as transit and payment. For example, a user can travel without a physical card: before entering a subway station, the user opens the riding two-dimensional code (or simply, ride code) in a smart travel application (APP) on the terminal device and holds the ride code up to the scanning port of the gate to scan in.
In general, after the user finishes scanning, the user's terminal device displays a scan-completion interface prompting that the ride scan is complete. When the user wants to exit the ride code, the user can do so by triggering an exit button on the scan-completion interface.
However, this way of exiting the ride code offers little flexibility.
Disclosure of Invention
Embodiments of this application provide a method and a device for exiting a two-dimensional code, so that a terminal device can recognize a wrist-flip action during the user's code scanning and, based on that action, automatically exit the ride-code interface after the user finishes scanning, enhancing the flexibility of exiting the code-scanning interface.
In a first aspect, an embodiment of this application provides a method for exiting a two-dimensional code, applied to a terminal device that includes an acceleration sensor and a gyroscope sensor. The method includes: the terminal device displays a first interface, where the first interface includes a first two-dimensional code; the terminal device acquires first data during a wrist-flip process, where the first data includes acceleration data collected by the acceleration sensor and angular acceleration data collected by the gyroscope sensor; and when the first data indicates that a wrist-flip action has occurred, the terminal device exits the first interface. In this way, the terminal device can decide whether to exit the two-dimensional code by recognizing the user's wrist-flip action during code scanning, improving the flexibility of exiting the code-scanning interface.
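The flow of the first aspect can be sketched as follows. This is a minimal illustration only, not the patented implementation: the function names are hypothetical, and the threshold-based `wrist_flip_detected` stand-in replaces the neural network classifier described later in this text.

```python
from dataclasses import dataclass


@dataclass
class MotionSample:
    """One reading of the first data: 3-axis acceleration from the
    acceleration sensor and 3-axis angular rate from the gyroscope."""
    ax: float
    ay: float
    az: float
    gx: float
    gy: float
    gz: float


def wrist_flip_detected(samples, gyro_threshold=3.0):
    # Stand-in for the recognition step: the patent uses a trained neural
    # network; here a sustained large angular rate about one axis is
    # treated as a wrist flip. The threshold value is illustrative only.
    if not samples:
        return False
    mean_rate = sum(abs(s.gx) for s in samples) / len(samples)
    return mean_rate > gyro_threshold


def should_exit_first_interface(samples):
    # Exit the interface showing the first two-dimensional code when the
    # first data indicates that a wrist-flip action has occurred.
    return wrist_flip_detected(samples)
```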
In the embodiments of this application, the first two-dimensional code is a ride code, and the first interface is used for ride-code scanning.
In a possible implementation, exiting the first interface when the first data indicates that a wrist-flip action has occurred includes: when the first data indicates that a wrist-flip action has occurred and the time interval between the wrist-flip action and the completed scan of the first two-dimensional code is less than a first preset threshold, the terminal device exits the first interface. In this way, the terminal device recognizes the wrist-flip action during code scanning and, based on that action together with the detected interval between the action and scan completion, automatically exits the two-dimensional code interface after the user finishes scanning, enhancing the flexibility of exiting the code-scanning interface.
The first preset threshold may be, for example, 5 seconds.
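The time gate above can be sketched as a single predicate. The function name is hypothetical, and the 5-second value is only the example given in this description:

```python
FIRST_PRESET_THRESHOLD_S = 5.0  # example value from the description


def exit_after_scan(flip_detected: bool, interval_s: float,
                    threshold_s: float = FIRST_PRESET_THRESHOLD_S) -> bool:
    # Exit the first interface only when a wrist-flip action occurred AND
    # the interval between the wrist flip and the completed scan of the
    # first two-dimensional code is below the first preset threshold.
    return flip_detected and interval_s < threshold_s
```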
In a possible implementation, displaying the first interface by the terminal device includes: the terminal device receives a first operation of opening a first application from the desktop, where the first application provides the first two-dimensional code for the terminal device; in response to the first operation, the terminal device displays the home page of the first application, where the home page includes a first control for opening the first two-dimensional code; the terminal device receives a second operation on the first control; and in response to the second operation, the terminal device displays the first interface. In this way, the terminal device can conveniently enter the two-dimensional code interface from the home page of the first application.
In the embodiments of this application, the first application may be a smart travel application, the home page of the first application may be the home page of the smart travel application, and the first control may be a two-dimensional code control on that home page.
In a possible implementation, exiting the first interface by the terminal device includes: the terminal device displays the home page of the first application. In this way, when the terminal device detects the user's wrist-flip action, and/or the detected interval between the wrist-flip action and scan completion satisfies a preset condition, the terminal device can exit the two-dimensional code interface and display the home page of the first application, increasing the flexibility of exiting the two-dimensional code.
In a possible implementation, displaying the first interface by the terminal device includes: the terminal device receives a third operation on the first application from the desktop, where the first application provides the first two-dimensional code for the terminal device; in response to the third operation, the terminal device displays an interface containing a functional component of the first application, where the functional component includes a second control for opening the first two-dimensional code; the terminal device receives a fourth operation on the second control; and in response to the fourth operation, the terminal device displays the first interface. In this way, the terminal device can conveniently enter the two-dimensional code interface from the functional component of the first application.
In a possible implementation, exiting the first interface by the terminal device includes: the terminal device displays the interface containing the functional component of the first application. In this way, when the terminal device detects the user's wrist-flip action, and/or the detected interval between the wrist-flip action and scan completion satisfies a preset condition, the terminal device can exit the two-dimensional code interface and display the interface containing the functional component of the first application; this increases the flexibility of exiting the two-dimensional code, and the interface containing the functional component makes it convenient for the user to use other functions of the first application.
In a possible implementation, displaying the first interface by the terminal device includes: the terminal device receives a fifth operation on a third control on the desktop, where the third control is used to open the first two-dimensional code; and in response to the fifth operation, the terminal device displays the first interface. In this way, the terminal device can conveniently enter the two-dimensional code interface from the desktop.
In a possible implementation, exiting the first interface by the terminal device includes: the terminal device displays the desktop interface. In this way, when the terminal device detects the user's wrist-flip action, and/or the detected interval between the wrist-flip action and scan completion satisfies a preset condition, the terminal device can exit the two-dimensional code interface and display the desktop interface, increasing the flexibility of exiting the two-dimensional code.
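Across the implementations above, the interface restored on exit mirrors the path through which the code interface was entered. A minimal dispatch sketch, with all names hypothetical:

```python
# Map each entry path described above to the interface shown after exiting
# the two-dimensional code interface. Keys and values are illustrative.
EXIT_TARGET = {
    "app_home": "first_app_home",        # entered via the first control on the home page
    "component": "component_interface",  # entered via the second control in the functional component
    "desktop": "desktop_interface",      # entered via the third control on the desktop
}


def interface_after_exit(entry_path: str) -> str:
    # Fall back to the desktop when the entry path is unknown.
    return EXIT_TARGET.get(entry_path, "desktop_interface")
```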
In a possible implementation, displaying the first interface by the terminal device includes: the terminal device displays a second interface, where the second interface is an interface running in a second application; the terminal device receives a sixth operation of switching from the second interface to the first application and opening the first interface in the first application, where the first application provides the first two-dimensional code for the terminal device; and in response to the sixth operation, the terminal device displays the first interface. In this way, the terminal device can conveniently enter the two-dimensional code interface in the first application while another application is open.
In the embodiments of this application, the second application may be a video application, and the second interface may be an interface for watching a video.
In a possible implementation, exiting the first interface by the terminal device includes: the terminal device displays the second interface. In this way, when the terminal device detects the user's wrist-flip action, and/or the detected interval between the wrist-flip action and scan completion satisfies a preset condition, the terminal device can exit the two-dimensional code interface and return to the second application's interface, increasing the flexibility of exiting the two-dimensional code and making it convenient for the user to continue using the second application.
In a possible implementation, exiting the first interface when the first data indicates that a wrist-flip action has occurred includes: when the first data indicates that a wrist-flip action has occurred and the time interval since switching from the second interface to the first interface is less than a second preset threshold, the terminal device displays the second interface. In this way, when the terminal device detects the user's wrist-flip action and the interval since switching from the second interface to the first interface satisfies the preset condition, the terminal device can exit the two-dimensional code interface and display the second interface, improving the flexibility of exiting the two-dimensional code and making it convenient for the user to return to the second interface.
In a possible implementation, displaying the first interface by the terminal device includes: the terminal device displays a third interface, where the third interface is an interface running in the first application, and the first application provides the first two-dimensional code for the terminal device; the terminal device receives a seventh operation of switching from the third interface to the first interface; and in response to the seventh operation, the terminal device displays the first interface. In this way, the terminal device can conveniently enter the two-dimensional code interface while another interface of the first application is open.
In a possible implementation, exiting the first interface by the terminal device includes: the terminal device displays the third interface. In this way, when the terminal device detects the user's wrist-flip action, and/or the detected interval between the wrist-flip action and scan completion satisfies a preset condition, the terminal device can exit the two-dimensional code interface and display another interface of the first application, increasing the flexibility of exiting the two-dimensional code and making it convenient for the user to use other functions of the first application.
In a possible implementation, exiting the first interface when the first data indicates that a wrist-flip action has occurred includes: when the first data indicates that a wrist-flip action has occurred and the time interval since switching from the third interface to the first interface is less than a third preset threshold, the terminal device displays the third interface. In this way, when the terminal device detects the user's wrist-flip action and the interval since switching from the third interface to the first interface satisfies the preset condition, the terminal device can exit the two-dimensional code interface and display the third interface, improving the flexibility of exiting the two-dimensional code and making it convenient for the user to return to the third interface.
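The second and third preset thresholds gate the return to the previous interface in the same way. A sketch with hypothetical names; the source does not give values for these thresholds:

```python
def return_to_previous_interface(flip_detected: bool,
                                 switch_interval_s: float,
                                 preset_threshold_s: float) -> bool:
    # When a wrist-flip action occurs and the interval since switching
    # from the previous interface (the second or third interface) to the
    # code interface is below the corresponding preset threshold, return
    # to that previous interface instead of another exit target.
    return flip_detected and switch_interval_s < preset_threshold_s
```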
In a possible implementation, the wrist-flip action is obtained by the terminal device recognizing the first data using a neural network model. The neural network model is trained by the terminal device on second data, where the second data includes acceleration sample data and angular acceleration sample data and is related to one or more of the following: the state of the terminal device, or the state of the code scanning port. In this way, the terminal device can accurately recognize the wrist-flip action using the neural network model.
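A minimal forward pass of the kind of classifier the description names. The architecture, feature encoding, and weights here are illustrative placeholders, not the trained model; the source does not specify the network's input representation or topology.

```python
import math


def window_features(window):
    # window: list of 6-tuples (ax, ay, az, gx, gy, gz) covering the
    # wrist-flip period. Per-axis mean and range are illustrative
    # features; the source does not specify the input encoding.
    feats = []
    for axis in zip(*window):
        feats.append(sum(axis) / len(axis))   # mean
        feats.append(max(axis) - min(axis))   # range
    return feats


class TinyFlipNet:
    """Two-layer network standing in for the trained neural network model."""

    def __init__(self, w1, b1, w2, b2):
        self.w1, self.b1, self.w2, self.b2 = w1, b1, w2, b2

    def predict(self, x):
        # Hidden layer with tanh activation, then a sigmoid output giving
        # the probability that the window contains a wrist flip.
        hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        z = sum(w * h for w, h in zip(self.w2, hidden)) + self.b2
        return 1.0 / (1.0 + math.exp(-z))
```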
In a possible implementation, the state of the code scanning port includes at least one of: the scanning port being vertical, horizontal, tilted sideways, or tilted backwards. In this way, based on a neural network model trained on second data related to the state of the scanning port, the terminal device can recognize the user's wrist-flip action when the user scans a ride code at different scanning ports, improving the accuracy of wrist-flip recognition.
In a possible implementation, the state of the terminal device includes at least one of: the terminal device being in a portrait state, a tilted portrait state, a tilted landscape state, tilted to the upper right, tilted to the upper left, tilted upward, tilted downward, or horizontal and facing down. In this way, based on a neural network model trained on second data related to the state of the terminal device, the terminal device can recognize the user's wrist-flip action in different motion states during ride-code scanning, improving the accuracy of wrist-flip recognition.
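The enumerated conditions can be captured as labels attached to the second data used for training; the class and member names below are illustrative only:

```python
from enum import Enum, auto


class ScanPortState(Enum):
    # States of the code scanning port listed in the description.
    VERTICAL = auto()
    HORIZONTAL = auto()
    TILTED_SIDEWAYS = auto()
    TILTED_BACKWARDS = auto()


class DeviceState(Enum):
    # States of the terminal device listed in the description.
    PORTRAIT = auto()
    PORTRAIT_TILTED = auto()
    LANDSCAPE_TILTED = auto()
    TILTED_UPPER_RIGHT = auto()
    TILTED_UPPER_LEFT = auto()
    TILTED_UP = auto()
    TILTED_DOWN = auto()
    FACE_DOWN = auto()
```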
In a possible implementation, the method further includes: when the first data indicates that no wrist-flip action has occurred, and/or the time interval between the wrist-flip action and the completed scan of the first two-dimensional code is greater than or equal to the first preset threshold, the terminal device continues to display the first interface. In this way, the terminal device does not exit the two-dimensional code interface, so the user can continue to use it.
In a second aspect, an embodiment of this application provides a device for exiting a two-dimensional code. The device includes an acceleration sensor, a gyroscope sensor, a display unit, and a processing unit. The display unit is configured to display a first interface, where the first interface includes a first two-dimensional code. The processing unit is configured to acquire first data during a wrist-flip process, where the first data includes acceleration data collected by the acceleration sensor and angular acceleration data collected by the gyroscope sensor. When the first data indicates that a wrist-flip action has occurred, the display unit is further configured to exit the first interface.
In a possible implementation, when the first data indicates that a wrist-flip action has occurred and the time interval between the wrist-flip action and the completed scan of the first two-dimensional code is less than a first preset threshold, the display unit is specifically configured to exit the first interface.
In a possible implementation, the processing unit is specifically configured to receive a first operation of opening a first application from the desktop, where the first application provides the first two-dimensional code for the terminal device; in response to the first operation, the display unit is specifically configured to display the home page of the first application, where the home page includes a first control for opening the first two-dimensional code; the processing unit is further specifically configured to receive a second operation on the first control; and in response to the second operation, the display unit is further specifically configured to display the first interface.
In a possible implementation, the display unit is specifically configured to display the home page of the first application.
In a possible implementation, the processing unit is specifically configured to receive a third operation on the first application from the desktop, where the first application provides the first two-dimensional code for the terminal device; the display unit is specifically configured to display an interface containing a functional component of the first application, where the functional component includes a second control for opening the first two-dimensional code; the processing unit is further specifically configured to receive a fourth operation on the second control; and in response to the fourth operation, the display unit is configured to display the first interface.
In a possible implementation, the display unit is specifically configured to display the interface containing the functional component of the first application.
In a possible implementation, the processing unit is specifically configured to receive a fifth operation on a third control on the desktop, where the third control is used to open the first two-dimensional code; and in response to the fifth operation, the display unit is configured to display the first interface.
In a possible implementation, the display unit is specifically configured to display the desktop interface.
In a possible implementation, the display unit is specifically configured to display a second interface, where the second interface is an interface running in a second application; the processing unit is specifically configured to receive a sixth operation of switching from the second interface to the first application and opening the first interface in the first application, where the first application provides the first two-dimensional code for the terminal device; and in response to the sixth operation, the display unit is further specifically configured to display the first interface.
In a possible implementation, the display unit is specifically configured to display the second interface.
In a possible implementation, when the first data indicates that a wrist-flip action has occurred and the time interval since switching from the second interface to the first interface is less than a second preset threshold, the display unit is specifically configured to display the second interface.
In a possible implementation, the display unit is specifically configured to display a third interface, where the third interface is an interface running in the first application, and the first application provides the first two-dimensional code for the terminal device; the processing unit is specifically configured to receive a seventh operation of switching from the third interface to the first interface; and in response to the seventh operation, the display unit is configured to display the first interface.
In a possible implementation, the display unit is specifically configured to display the third interface.
In a possible implementation, when the first data indicates that a wrist-flip action has occurred and the time interval since switching from the third interface to the first interface is less than a third preset threshold, the display unit is specifically configured to display the third interface.
In a possible implementation, the wrist-flip action is obtained by the terminal device recognizing the first data using a neural network model; the neural network model is trained by the terminal device on second data, where the second data includes acceleration sample data and angular acceleration sample data and is related to one or more of the following: the state of the terminal device, or the state of the code scanning port.
In a possible implementation, the state of the code scanning port includes at least one of: the scanning port being vertical, horizontal, tilted sideways, or tilted backwards.
In a possible implementation, the state of the terminal device includes at least one of: the terminal device being in a portrait state, a tilted portrait state, a tilted landscape state, tilted to the upper right, tilted to the upper left, tilted upward, tilted downward, or horizontal and facing down.
In a possible implementation, when the first data indicates that no wrist-flip action has occurred, and/or the time interval between the wrist-flip action and the completed scan of the first two-dimensional code is greater than or equal to the first preset threshold, the display unit is further configured to display the first interface.
In a third aspect, an embodiment of this application provides a device for exiting a two-dimensional code, including a processor and a memory, where the memory is configured to store code instructions, and the processor is configured to execute the code instructions to cause the electronic device to perform the method for exiting a two-dimensional code described in the first aspect or any implementation thereof.
In a fourth aspect, an embodiment of this application provides a computer-readable storage medium storing instructions that, when executed, cause a computer to perform the method for exiting a two-dimensional code described in the first aspect or any implementation thereof.
In a fifth aspect, an embodiment of this application provides a computer program product including a computer program that, when executed, causes a computer to perform the method for exiting a two-dimensional code described in the first aspect or any implementation thereof.
It should be understood that the second through fifth aspects of this application correspond to the technical solution of the first aspect; the beneficial effects of these aspects and of their possible implementations are similar and are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a ride-code scanning interface according to an embodiment of this application;
Fig. 2 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of this application;
Fig. 3 is a schematic diagram of a software structure of a terminal device according to an embodiment of this application;
Fig. 4 is a schematic flowchart of a method for exiting a two-dimensional code according to an embodiment of this application;
Fig. 5 is a schematic diagram of an interface for opening a ride code according to an embodiment of this application;
Fig. 6 is a schematic diagram of another interface for opening a ride code according to an embodiment of this application;
Fig. 7 is a schematic flowchart of constructing a neural network model according to an embodiment of this application;
Fig. 8 is a schematic diagram of an interface for exiting a ride code according to an embodiment of this application;
Fig. 9 is a schematic diagram of another interface for exiting a ride code according to an embodiment of this application;
Fig. 10 is a schematic diagram of yet another interface for exiting a ride code according to an embodiment of this application;
Fig. 11 is a schematic diagram of an interface displaying prompt information according to an embodiment of this application;
Fig. 12 is a schematic flowchart of another method for exiting a two-dimensional code according to an embodiment of this application;
Fig. 13 is a schematic structural diagram of a device for exiting a two-dimensional code according to an embodiment of this application;
Fig. 14 is a schematic diagram of a hardware structure of a control device according to an embodiment of this application;
Fig. 15 is a schematic structural diagram of a chip according to an embodiment of this application.
Detailed Description
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same or similar items having substantially the same function and action. For example, the first value and the second value are only used to distinguish different values, and the order of the values is not limited. Those skilled in the art will appreciate that the terms "first," "second," and the like do not limit the quantity or execution order of the items they modify, and do not denote any order or importance.
It is noted that the words "exemplary" or "such as" are used herein to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present relevant concepts in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein a, b and c can be single or multiple.
In recent years, with the wide popularization of two-dimensional codes, users increasingly travel without physical cards: most users are accustomed to taking public transport such as buses, subways, and light rail by presenting a ride code. Improving the user experience of the ride code is therefore of great significance.
Fig. 1 is a schematic interface diagram of ride code scanning according to an embodiment of the present application. In the embodiment corresponding to fig. 1, the terminal device being a mobile phone is taken as an example for illustration; the example does not limit the embodiment of the present application.
When the user uses the smart travel APP in the mobile phone to scan a code to ride, the ride code in the smart travel APP can be opened through the interface shown as a in fig. 1. For example, when the mobile phone receives an operation of the user long-pressing the smart travel APP 101 in the interface shown as a in fig. 1, the mobile phone may display the functional component 102 corresponding to the smart travel APP, where the functional component 102 may include one or more of the following: a control for scanning, a riding code control 103 for riding, or a control for adding other functionality, etc. As shown in a in fig. 1, the interface may also include one or more of the following application controls, for example: file management, email, music, gallery, smart travel APP 101, camera, contact list, phone, or information, etc.
When the mobile phone receives that the user triggers the operation of the riding code control 103 in the interface shown as a in fig. 1, the mobile phone may display the interface shown as b in fig. 1. The interface, as shown at b in fig. 1, may include one or more of the following, for example: the bus control 104 is used for displaying a two-dimensional code used when taking a bus, the subway control 105 is used for displaying a two-dimensional code used when taking a subway, an identifier of an interface of a bus taking code, text information of an electronic subway card in an X city, and a two-dimensional code 106 is used for taking a subway. Wherein the subway control 105 is in a selected state.
When the user directs the two-dimensional code 106 shown as b in fig. 1 toward the code scanning port of the gate, the mobile phone can display the interface shown as c in fig. 1 after a few seconds. The interface shown as c in fig. 1 may include one or more of the following: a prompt message 107 for prompting that the ride is successful, an exit control 108 for exiting the ride code, a control for returning to the ride code, and the like.
Further, when the mobile phone receives that the user triggers the exit control 108 in the interface shown in c in fig. 1, the mobile phone may exit the riding code interface and display a main interface or other interfaces corresponding to the smart travel APP.
However, the flexibility of this way of exiting the ride code interface is low; in particular, when the user needs to use other functions in the smart travel APP, exiting the ride code interface is inconvenient, which affects the user's experience of using the smart travel APP.
In view of this, the embodiment of the present application provides a method for exiting a two-dimensional code, so that the terminal device can recognize a wrist-flipping action during the user's code scanning process, and, based on the wrist-flipping action and the time interval from the wrist flip to the code scan, automatically exit the ride code interface after the user completes scanning, thereby enhancing the flexibility of exiting the ride code.
It can be understood that the terminal device may be a smart phone, a tablet, or the like, or the terminal device may also be a wearable device, such as a smart watch, a smart bracelet, a wearable Virtual Reality (VR) device, or a wearable Augmented Reality (AR) device. The specific technology and the specific device form adopted by the terminal device are not limited in the embodiment of the application.
It can be understood that the method for exiting the two-dimensional code provided in the embodiment of the present application may be applied not only to a scene of scanning a code by bus, but also to a scene of payment, and the like.
Therefore, in order to better understand the embodiments of the present application, the following describes the structure of the terminal device according to the embodiments of the present application. Exemplarily, fig. 2 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, an indicator 192, a camera 193, a display screen 194, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to the terminal device. In other embodiments of the present application, a terminal device may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. The different processing units may be separate devices or may be integrated into one or more processors. A memory may also be provided in the processor 110 for storing instructions and data.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device, and may also be used to transmit data between the terminal device and the peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
The charging management module 140 is configured to receive charging input from a charger. The charger can be a wireless charger or a wired charger. The power management module 141 is used for connecting the charging management module 140 and the processor 110.
The wireless communication function of the terminal device can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Antennas in terminal devices may be used to cover single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation.
The wireless communication module 160 may provide solutions for wireless communication applied to a terminal device, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), and the like.
The terminal device realizes the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1. In the embodiment of the application, when the terminal device receives the user's triggering operation for the ride code in the smart travel APP, the display screen 194 can be used to display the ride code interface.
The terminal device can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The camera 193 is used to capture still images or video. In some embodiments, the terminal device may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area.
The terminal device may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into analog audio signals for output, and also used to convert analog audio inputs into digital audio signals. The speaker 170A, also called a "horn", is used to convert the audio electrical signal into a sound signal. The terminal device can listen to music through the speaker 170A, or listen to a handsfree call. The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the terminal device answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear. The headphone interface 170D is used to connect a wired headphone. The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194.
The gyro sensor 180B may be used to determine the motion attitude of the terminal device. In this embodiment, the gyro sensor 180B may be a three-axis (including x-axis, y-axis, and z-axis) gyro sensor, and is configured to measure angular acceleration data of the terminal device when the user turns the wrist.
The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a hall sensor.
The acceleration sensor 180E can detect the magnitude of acceleration of the terminal device in various directions (generally, three axes). In this embodiment, the acceleration sensor 180E may be a three-axis (including x-axis, y-axis, and z-axis) acceleration sensor, and is configured to measure acceleration data of the terminal device when the user turns the wrist.
It is understood that the gyro sensor 180B and the acceleration sensor 180E may be used together for detecting the wrist-flipping motion of the user.
A distance sensor 180F for measuring a distance. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense the ambient light level. The fingerprint sensor 180H is used to collect a fingerprint. The temperature sensor 180J is used to detect temperature. The touch sensor 180K is also called a "touch device". The bone conduction sensor 180M may acquire a vibration signal.
The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, or "touch screen". In this embodiment, the touch sensor 180K is configured to detect a touch operation of a user on the display screen.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys.
The software system of the terminal device may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture, which will not be described herein again.
Exemplarily, fig. 3 is a schematic diagram of a software structure of a terminal device according to an embodiment of the present application. As shown in fig. 3, the layered architecture may divide the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into three layers, an application layer, an application framework (framework) layer, and a hardware (hardware) layer from top to bottom. In a possible implementation manner, a smart sensor hub (sensorhub) may be further included in the hierarchical architecture.
In particular, the application layer may include a series of application packages. As shown in fig. 3, the application package may include one or more of the following, for example: camera, phone, smart travel APP, ride code APP, etc.
In the embodiment of the application, the smart travel APP is used for implementing functions such as ride code scanning; the ride code APP can be understood as a third-party application within the smart travel APP, which provides the ride code service for the smart travel APP. In a possible implementation manner, when the embodiment of the present application is applied to a payment scenario, an application for payment may also be included in the application layer.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a resource manager, a notification manager, and a gesture service module, among others.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, touch the screen, drag the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a brief dwell, and does not require user interaction. Such as a notification manager used to notify download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scrollbar text in a status bar at the top of the system, such as a notification of a running application in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is given, the terminal device vibrates, an indicator light flickers, and the like.
The gesture service module is used for monitoring the user's wrist-flipping action. In the embodiment of the application, when the gesture service module monitors a message indicating the wrist-flip recognition result, it can transmit the message to the smart travel APP.
Acceleration sensors and gyroscope sensors may be included in the hardware layer. In the embodiment of the application, the acceleration sensor is used for acquiring the acceleration data of the terminal equipment of a user in the wrist turning process; the gyroscope sensor is used for acquiring angular acceleration data of the terminal equipment of a user in the wrist turning process. The gyroscope sensor and the acceleration sensor can be jointly used for detecting the wrist turning action of the user.
The smart sensor hub may be a solution based on a combination of software and hardware on a low power consumption Micro Control Unit (MCU) and a lightweight real-time operating system (RTOS), and its main function is to connect and process data from various sensor devices.
In the embodiment of the application, the smart sensor hub may include a wrist-flip recognition algorithm, which performs algorithmic recognition on the data acquired from the sensors to obtain the wrist-flip state during the user's ride code scanning process.
The following describes the technical solution of the present application and how to solve the above technical problems in detail by specific embodiments. The following embodiments may be implemented independently or in combination, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Exemplarily, when the user takes the subway using the smart travel APP in the terminal device, the user can open the ride code in the smart travel APP and direct the ride code toward the code scanning port of the gate. The terminal device can then automatically exit the ride code interface after the user finishes scanning, based on the user's wrist-flipping action during code scanning and the time interval from the wrist flip to the code scan, enhancing the flexibility of exiting the ride code.
Exemplarily, fig. 4 is a schematic flowchart of a method for exiting a two-dimensional code according to an embodiment of the present application. In the embodiment corresponding to fig. 4, the two-dimensional code is taken as an example to illustrate the riding code, and the example does not limit the embodiment itself. As shown in fig. 4, the method for exiting the two-dimensional code may include the following steps:
S401, when the terminal device receives an operation of the user opening the ride code in the smart travel APP, the terminal device may display a ride code interface.
In the embodiment of the present application, the operation of opening the ride code may include one or more of the following: touch operation, key operation, air gesture operation, voice operation, and the like.
For example, the user may open the ride code in the smart travel APP in various manners (e.g., the embodiment corresponding to fig. 1, or the embodiments corresponding to fig. 5 to fig. 6). In the embodiments corresponding to fig. 5 to fig. 6, the terminal device being a mobile phone is taken as an example for description; the example does not limit the embodiments of the present application.
In one implementation, the mobile phone may trigger the display of the functional component of the smart travel APP by the operation of pressing the smart travel APP for a long time by the user as shown in a in fig. 1 based on the embodiment corresponding to fig. 1, and then open the riding code in the smart travel APP by the operation of the user on the riding code control 103 in the functional component.
In another implementation, the mobile phone can open the riding code through the operation of the user on the control corresponding to the riding code function preset in the desktop. Exemplarily, fig. 5 is a schematic interface diagram for opening a ride code according to an embodiment of the present disclosure.
In the desktop state shown in a in fig. 5, when the mobile phone receives an operation that the user triggers the riding code control 501, the mobile phone may display the interface shown in b in fig. 5. As shown in the interface a in fig. 5, the riding code control 501 may be a control created when the user adds the ride code function to the desktop through the smart travel APP (or a settings function); other contents shown in a in fig. 5 are similar to those shown in a in fig. 1 and are not repeated here. The content shown in b in fig. 5 is similar to that shown in b in fig. 1 and is not described in detail here.
In another implementation, the mobile phone may receive an operation of the user opening the smart travel APP, and then open the ride code through the user's operation on the control used for opening the ride code within the smart travel APP. For example, fig. 6 is a schematic interface diagram of another way of opening a ride code according to an embodiment of the present application.
In the desktop state shown in a in fig. 6, when the mobile phone receives an operation that the user triggers the smart travel control 601, the mobile phone may display an interface shown in b in fig. 6. As shown in b in fig. 6, the interface may be a home page of the smart trip APP, and the interface may include one or more of the following, for example: the system comprises a sweeping control, a riding code control 602, a lift control, a control for adding other functions, news 1 or news 2 corresponding to the current news function, and the like. The other contents displayed in the interface shown by a in fig. 6 are similar to the contents displayed in the interface shown by a in fig. 1, and are not repeated herein.
Further, in the interface shown as b in fig. 6, when the mobile phone receives an operation that the user triggers the riding code control 602, the mobile phone may display the interface shown as c in fig. 6. As shown in c in fig. 6, the content displayed in the interface is similar to that displayed in the interface shown in b in fig. 1, and is not described herein again.
It can be understood that the manner of opening the ride code may include other contents according to an actual scenario, which is not limited in the embodiment of the present application.
S402, the terminal device collects data corresponding to the wrist-flipping action during the user's ride code scanning process, and inputs the data corresponding to the wrist-flipping action into a neural network model for wrist-flip recognition to obtain a wrist-flip recognition result.
In this embodiment, the data corresponding to the wrist flipping motion may include: acceleration data acquired by an acceleration sensor, angular acceleration data acquired by a gyro sensor, and the like.
For example, the frequency of the data collected by the acceleration sensor and the gyroscope sensor may be 100 hertz (Hz), which may be understood as collecting one set of data every 10 milliseconds (ms), i.e., 100 sets of data per second. Each set of data may include 3 acceleration values (for the x, y, and z axes) and 3 angular acceleration values (for the x, y, and z axes).
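As a rough illustration of the data layout described above, the following Python sketch buffers six-axis samples into a one-second rolling window, the shape a wrist-flip classifier would consume. All names and the use of a deque are assumptions for illustration; the patent does not specify an implementation.

```python
from collections import deque

# Assumed parameters, derived from the 100 Hz sampling rate in the text.
SAMPLE_RATE_HZ = 100          # one sample every 10 ms
WINDOW_SECONDS = 1
WINDOW_SIZE = SAMPLE_RATE_HZ * WINDOW_SECONDS


class ImuWindow:
    """Rolling window of (ax, ay, az, gx, gy, gz) samples (hypothetical helper)."""

    def __init__(self):
        # maxlen makes the window roll: old samples drop off automatically.
        self.samples = deque(maxlen=WINDOW_SIZE)

    def push(self, accel_xyz, gyro_xyz):
        # Each set of data holds 3 acceleration and 3 angular-acceleration values.
        self.samples.append(tuple(accel_xyz) + tuple(gyro_xyz))

    def ready(self):
        # A full one-second window is available for recognition.
        return len(self.samples) == WINDOW_SIZE

    def as_rows(self):
        return list(self.samples)


# Feed one second of dummy samples.
win = ImuWindow()
for i in range(WINDOW_SIZE):
    win.push((0.0, 0.0, 9.8), (0.01 * i, 0.0, 0.0))
print(win.ready(), len(win.as_rows()), len(win.as_rows()[0]))  # True 100 6
```

In a real pipeline the `push` calls would be driven by sensor callbacks, and the window contents handed to the recognition step whenever `ready()` holds.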
In this embodiment, the terminal device may construct the neural network model for wrist-flip recognition based on one or more of the following, for example: a recurrent neural network based on gated recurrent units (GRU), a back-propagation (BP) neural network, a convolutional neural network (CNN), a residual shrinkage network, a support vector machine (SVM), or a deep residual network (DRN), etc.
Fig. 7 is a schematic flowchart of a process for constructing a neural network model according to an embodiment of the present disclosure.
As shown in fig. 7, for the model training process corresponding to the dashed box 701, the training process may include: acquiring training data, preprocessing, training a neural network model for wrist-flipping recognition and the like. The preprocessing process may be used to filter out the influence of high-frequency noise, for example, the preprocessing process may include mean filtering or the like.
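The mean filtering named above can be sketched as a moving-average filter. The window length `k` and the edge handling below are assumptions; the patent names the technique but not its parameters.

```python
def mean_filter(signal, k=5):
    """Moving-average (mean) filter to suppress high-frequency noise.

    k is the averaging window length (an assumed parameter); near the start
    of the signal, only the samples available so far are averaged.
    """
    out = []
    for i in range(len(signal)):
        lo = max(0, i - k + 1)
        window = signal[lo:i + 1]
        out.append(sum(window) / len(window))
    return out


# A spiky signal is flattened toward its local mean.
noisy = [0.0, 10.0, 0.0, 10.0, 0.0, 10.0]
print(mean_filter(noisy, k=2))  # [0.0, 5.0, 5.0, 5.0, 5.0, 5.0]
```

The same filter would be applied channel-by-channel to the acceleration and angular-acceleration streams before they reach the model.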
Specifically, the training data may include: wrist-flip sample data collected at gates whose code scanning ports have different orientations, with the user in a wrist-flipping state, for example when the user flips the ride code toward the gate's code scanning port during scanning; and non-wrist-flip sample data with the user in a non-wrist-flipping state, for example a static standing state (e.g., the user stands still and opens the ride code) or a walking state (e.g., the user opens the ride code while walking). The code scanning ports with different orientations may include at least one of the following: a code scanning port in a vertical state, a horizontal state, a side-tilted state, or a backward-tilted state. Based on the code scanning ports in different orientations, the user's wrist-flip scanning action may include at least one of the following: the terminal device in a portrait state, a tilted portrait state, a tilted landscape state, tilted to the upper right, tilted to the upper left, screen tilted upward, tilted downward, and the like. For a horizontal code scanning port, the user's wrist-flip scanning action may also be a state in which the user flips the wrist so that the screen faces downward.
It can be understood that the training data may be considered comprehensively, and include gates facing different code scanning ports, different code scanning actions during code scanning by a user, and other factors, and the training data may also include other contents according to an actual scenario, which is not limited in this embodiment of the application.
Illustratively, according to the wrist-flip sample data and the non-wrist-flip sample data, one possible implementation of training the neural network model is as follows: input the wrist-flip sample data (user in a wrist-flipping state) and the non-wrist-flip sample data (user in a non-wrist-flipping state) into the neural network model to be trained; the model outputs a predicted wrist-flip result; a loss function is used to compare the predicted wrist-flip result with the real wrist-flip condition, for example by calculating the recall rate or false recognition rate of the predictions; model training ends when the difference between the predicted result output by the model and the real wrist-flip condition satisfies the loss function, yielding the neural network model for wrist-flip recognition. Further, the terminal device may identify whether the user is in a wrist-flipping state based on the detected acceleration data and angular acceleration data.
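The recall rate and false recognition rate mentioned above can be computed as in this sketch, using the standard definitions of the two metrics (the patent names the metrics but not their formulas; the toy labels are illustrative only).

```python
def recall_and_false_rate(y_true, y_pred):
    """Recall and false-recognition rate for a binary wrist-flip classifier.

    y_true / y_pred: 1 = wrist flip, 0 = no wrist flip.
    Recall = correctly detected flips / all real flips.
    False-recognition rate = non-flips wrongly flagged / all real non-flips.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    recall = tp / (tp + fn) if tp + fn else 0.0
    false_rate = fp / (fp + tn) if fp + tn else 0.0
    return recall, false_rate


# Toy evaluation: 4 real flips (3 caught), 4 real non-flips (1 wrongly flagged).
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 1, 0]
print(recall_and_false_rate(y_true, y_pred))  # (0.75, 0.25)
```

Training would stop once these metrics (or the loss they feed into) reach an acceptable level on held-out samples.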
As shown in fig. 7, for the wrist-flipping recognition process, the wrist-flipping recognition process may include: the method comprises the steps of obtaining real-time data, preprocessing, performing wrist-turning recognition based on a neural network model, outputting a recognition result and the like.
It is to be understood that the wrist-flip recognition process may be implemented in the smart sensor hub of the terminal device, or may also be implemented in a server. For example, the terminal device may upload the acceleration data and the angular acceleration data obtained in the step shown in S402 to the server; the server may execute the process of recognizing the acceleration data and the angular acceleration data with the neural network model in the step shown in S402, and then send the wrist-flip recognition result to the terminal device. It can be understood that implementing the wrist-flip recognition process on the server can effectively reduce the running memory and power consumption of the terminal device.
In a possible implementation manner, the wrist-flip recognition result may be information indicating whether the wrist-flipping action is satisfied, or may be a wrist-flip score; the specific form of the wrist-flip recognition result is not limited in the embodiment of the present application.
S403, when the terminal device receives the code scanning operation of the user, the terminal device may calculate the time interval from wrist flipping to code scanning.
In the embodiment of the application, wrist flipping can be understood as the wrist-flipping action performed by the user, after opening the riding code, to aim the riding code at the code scanning port of the gate; the code scanning time can be understood as the code scanning completion time; and the time interval from wrist flipping to code scanning can be understood as the interval between the wrist-flipping action and the moment at which the gate finishes scanning the riding code.
Illustratively, after the user opens the riding code and aims it at the code scanning port of the gate, the terminal device may detect the wrist-flipping action based on the neural network model in the step shown in S402. For example, the terminal device may detect, based on the data generated when the user aims the riding code at the code scanning port of the gate, that a wrist-flipping action occurred at 08:30; further, when the user moves the riding code close enough to the code scanning port, the gate can scan the riding code and the terminal device can detect that the code scanning is completed. If the scanning completion time is detected to be 08:33, the terminal device can calculate that the time interval from wrist flipping to code scanning is 3 seconds.
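The interval calculation in this example can be sketched as follows. This is an illustration only: the "MM:SS" timestamp format and the function name are assumptions made for the 08:30/08:33 example, while a real terminal device would subtract event timestamps directly.

```python
from datetime import datetime

def flip_to_scan_interval(flip_time: str, scan_done_time: str) -> float:
    """Seconds between the detected wrist-flipping action and the
    scan-completion moment, both given as "MM:SS" strings."""
    fmt = "%M:%S"
    delta = (datetime.strptime(scan_done_time, fmt)
             - datetime.strptime(flip_time, fmt))
    return delta.total_seconds()
```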
S404, the terminal device determines whether to exit the riding code interface based on the wrist-flipping recognition result and the time interval from wrist flipping to code scanning.
In one implementation, when the terminal device determines that the wrist-flipping recognition result satisfies the wrist-flipping state (or the wrist-flipping score exceeds the score threshold), and the time interval from wrist flipping to code scanning is less than (or less than or equal to) the time threshold, the terminal device may exit the riding code interface. The time threshold may be, for example, 5 seconds.
The wrist-flipping recognition result or the wrist-flipping score can be understood as the result output by the neural network model in the step shown in S402 based on recognition of the data corresponding to the wrist-flipping action during the user's code scanning for riding. For example, the wrist-flipping recognition result may be 0 (the wrist-flipping action is not satisfied) or 1 (the wrist-flipping action is satisfied): when the terminal device obtains a recognition result of 1, it may determine that the wrist-flipping action is detected, and when it obtains a recognition result of 0, it may determine that the wrist-flipping action is not detected. Alternatively, the wrist-flipping score may be a numerical value; for example, with a score threshold of 90, a score of 95 means the wrist-flipping action is detected, while a score of 75 means it is not.
In another implementation, the terminal device may continue to display the riding code interface (in other words, not exit it) when the terminal device determines that the wrist-flipping recognition result does not satisfy the wrist-flipping state (or the wrist-flipping score is less than or equal to the score threshold); or when the time interval from wrist flipping to code scanning is greater than or equal to (or greater than) the time threshold; or when both conditions hold.
For example, when the time interval from wrist flipping to code scanning is greater than or equal to (or greater than) the time threshold, it can be understood that the user may have interrupted the code scanning for various reasons, and the riding code interface should not be exited, so that the user can scan the code next time. For example, when the user answers a phone call while aiming the riding code at the code scanning port of the gate after wrist flipping, the terminal device detects that the time interval from wrist flipping to code scanning is greater than or equal to (or greater than) the time threshold; the riding code interface is not exited, and the user can continue to scan the code based on the riding code interface after answering or hanging up the call. For another example, when the user notices, while aiming the riding code at the code scanning port after wrist flipping, that the wrong riding code may have been opened, the user flips the wrist back to face the screen and switches to the correct riding code; the terminal device detects that the time interval from wrist flipping to code scanning is greater than or equal to (or greater than) the time threshold, so the riding code interface is not exited, and the user can switch to the correct riding code in the riding code interface.
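The decision in S404, covering both forms of the recognition result described above, can be sketched as follows. This is a hedged illustration: the function name is hypothetical, and the 90-point and 5-second thresholds are the example values from the text.

```python
from typing import Optional

def should_exit_code_interface(flip_detected: bool,
                               flip_score: Optional[float],
                               interval_s: float,
                               score_threshold: float = 90.0,
                               time_threshold_s: float = 5.0) -> bool:
    """Exit the riding-code interface only when a wrist flip was recognized
    (either a boolean result of 1, or a score above the threshold) AND the
    flip-to-scan interval is below the time threshold; otherwise keep
    displaying the riding-code interface."""
    flip_ok = flip_detected or (flip_score is not None
                                and flip_score > score_threshold)
    return flip_ok and interval_s < time_threshold_s
```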
For example, fig. 8 is a schematic interface diagram of exiting the riding code according to an embodiment of the present application. In the embodiment corresponding to fig. 8, the terminal device being a mobile phone is taken as an example for illustration; this example does not limit the embodiment of the present application.
The user can scan the code for riding based on the interface shown as a in fig. 8. When the mobile phone determines to exit the riding code interface based on the wrist-flipping recognition action of the user and the time interval from wrist flipping to code scanning, the mobile phone may display the interface shown as b in fig. 8, which may be the home page of the smart travel APP. Alternatively, when the mobile phone determines not to exit the riding code interface based on the wrist-flipping recognition action of the user and the time interval from wrist flipping to code scanning, the mobile phone may continue to display the interface shown as a in fig. 8 (or display the interface shown as c in fig. 1). The content displayed in the interface shown as a in fig. 8 is similar to that shown as c in fig. 6, and the content displayed in the interface shown as b in fig. 8 is similar to that shown as b in fig. 6, which are not repeated herein.
In a possible implementation manner, when the mobile phone opens the riding code from the functional component corresponding to the smart travel APP in the desktop state as in the embodiment corresponding to fig. 1, and the mobile phone determines to exit the riding code interface based on the wrist-flipping recognition action of the user and the time interval from wrist flipping to code scanning, the mobile phone may display the interface including the functional component.
In a possible implementation manner, when the mobile phone opens the riding code in the desktop state as in the embodiment corresponding to fig. 5, and the mobile phone determines to exit the riding code interface based on the wrist-flipping recognition action of the user and the time interval from wrist flipping to code scanning, the mobile phone may display the desktop.
In a possible implementation manner, when the user needs to ride while using another APP, for example a video APP, the riding code interface can be opened promptly in the smart travel APP from the background or from the desktop to scan the code for riding; when the mobile phone determines to exit the riding code interface based on the wrist-flipping recognition action of the user and the time interval from wrist flipping to code scanning, the mobile phone may display the interface corresponding to the video APP.
For example, fig. 9 is a schematic interface diagram of another example of exiting the riding code provided in the embodiment of the present application. In the embodiment corresponding to fig. 9, the terminal device being a mobile phone is taken as an example for description; this example does not limit the embodiment of the present application.
When the user needs to scan the riding code while watching a video in the interface of the video APP shown as a in fig. 9, the mobile phone may receive an operation of the user opening the riding code interface in the smart travel APP from the background multitask interface or from the desktop, and then display the interface shown as b in fig. 9. Further, the user may scan the code for riding using the riding code interface shown as b in fig. 9, and when the mobile phone determines to exit the riding code interface based on the wrist-flipping recognition action of the user and the time interval from wrist flipping to code scanning, the mobile phone may display the interface corresponding to the video APP shown as a in fig. 9. The interface shown as a in fig. 9 may include video content and content recommended by the video APP, such as video 902, video 903, video 904, and video 905; the content displayed in the interface shown as b in fig. 9 is similar to that shown as c in fig. 6, and is not described herein again.
Further, the mobile phone may also determine whether to exit the riding code interface based on the wrist-flipping recognition action of the user, the time interval from wrist flipping to code scanning, and the time taken for the user to switch from the video APP to the riding code interface. For example, when the mobile phone determines that the wrist-flipping recognition result satisfies the wrist-flipping state, the time interval from wrist flipping to code scanning is less than (or less than or equal to) a time threshold, and the time taken to switch from the video APP to the riding code interface is less than (or less than or equal to) another time threshold, the mobile phone may display the interface corresponding to the video APP.
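This extended decision, which returns the user to the interface they came from, can be sketched as follows. All names are illustrative, and the 30-second switch threshold is a hypothetical placeholder for the "another time threshold" mentioned above.

```python
def interface_after_scan(flip_detected: bool,
                         interval_s: float,
                         switch_time_s: float,
                         previous_interface: str,
                         time_threshold_s: float = 5.0,
                         switch_threshold_s: float = 30.0) -> str:
    """If the wrist flip was recognized, the flip-to-scan interval is under
    the time threshold, AND the user switched to the riding-code interface
    recently (e.g. from a video APP), return to the previous interface;
    otherwise keep showing the riding-code interface."""
    if (flip_detected and interval_s < time_threshold_s
            and switch_time_s < switch_threshold_s):
        return previous_interface
    return "riding_code_interface"
```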
In a possible implementation manner, when the user needs to ride while using another function in the smart travel APP, such as checking news, the user can quickly switch to the riding code interface and scan the code; when the mobile phone determines to exit the riding code interface based on the wrist-flipping recognition action of the user and the time interval from wrist flipping to code scanning, the mobile phone may display the news interface.
For example, fig. 10 is a schematic interface diagram of another example of exiting the riding code provided in the embodiment of the present application. In the embodiment corresponding to fig. 10, the terminal device being a mobile phone is taken as an example for illustration; this example does not limit the embodiment of the present application.
When the user needs to scan the riding code while viewing news in the smart travel APP as shown as a in fig. 10, the mobile phone may receive an operation of the user exiting the news interface and opening the riding code interface, and then display the interface shown as b in fig. 10. Further, the user may scan the code for riding using the riding code interface shown as b in fig. 10, and when the mobile phone determines to exit the riding code interface based on the wrist-flipping recognition action of the user and the time interval from wrist flipping to code scanning, the mobile phone may display the news interface shown as a in fig. 10. The interface shown as a in fig. 10 may include the content corresponding to news 1; the content displayed in the interface shown as b in fig. 10 is similar to that shown as c in fig. 6, and is not described in detail here.
Furthermore, the mobile phone may determine whether to exit the riding code interface based on the wrist-flipping recognition action of the user, the time interval from wrist flipping to code scanning, and the time taken for the user to switch quickly from the news page in the smart travel APP to the riding code interface. For example, when the mobile phone determines that the wrist-flipping recognition result satisfies the wrist-flipping state, the time interval from wrist flipping to code scanning is less than (or less than or equal to) a time threshold, and the time taken to switch from the news interface in the smart travel APP to the riding code interface is less than (or less than or equal to) another time threshold, the mobile phone may display the news interface.
It can be understood that the interface displayed when the riding code interface exits may include other contents according to an actual scene, which is not limited in this embodiment of the application.
Based on the method, the terminal device can recognize the wrist-flipping action during the user's code scanning process and, based on the wrist-flipping action and the time from wrist flipping to code scanning, quickly exit the riding code interface after the user completes scanning, thereby enhancing the flexibility of exiting the riding code.
On the basis of the embodiment corresponding to fig. 4, in a possible implementation manner, when the terminal device determines to exit the riding code interface based on the wrist-flipping action of the user and the time from wrist flipping to code scanning completion, the terminal device may further display prompt information, where the prompt information is used to prompt the user that the riding code interface has been exited.
Exemplarily, fig. 11 is a schematic view of an interface for displaying a prompt message according to an embodiment of the present application. In the embodiment corresponding to fig. 11, the terminal device being a mobile phone is taken as an example for description; this example does not limit the embodiment of the present application.
When the mobile phone determines to exit the riding code interface based on the wrist-flipping action of the user and the time from wrist flipping to code scanning completion, the mobile phone may display the interface shown in fig. 11, which may include a prompt message 1101, for example: "Wrist-flipping action detected; the riding code has been exited for you." The specific form of the prompt information is not limited in the embodiment of the present application.
Based on the method, the user can perceive the state change of the terminal device in time according to the prompt information, which enhances the intelligence of the terminal device.
On the basis of the embodiment corresponding to fig. 4, in a possible implementation manner, the terminal device may include an APP layer, a framework layer, a sensorhub layer, a hardware layer, and the like. The APP layer may include a riding code APP and a smart travel APP, the framework layer may include a gesture service module, the sensorhub layer may include a wrist-flipping recognition algorithm, and the hardware layer may include an acceleration sensor, a gyroscope sensor, and the like. It is understood that the acceleration sensor and the gyroscope sensor may correspond to two separate modules; for brevity, the embodiment of the present application describes them as one module.
Exemplarily, fig. 12 is a schematic flowchart of another method for exiting a two-dimensional code according to an embodiment of the present application. As shown in fig. 12, the method for exiting the two-dimensional code may include the following steps:
In the embodiment of the present application, the user may open the riding code through the embodiment corresponding to fig. 1 (or the embodiment corresponding to fig. 5 or fig. 6), so that the riding code APP in the terminal device may perform the step shown in S1201.
S1201, when the riding code APP receives the operation of the user opening the riding code in the smart travel APP, the smart travel APP may obtain, from the riding code APP, a broadcast indicating that the user has opened the riding code.
S1202, the gesture service module may obtain, from the smart travel APP, an indication message for monitoring the wrist-flipping action.
In a possible implementation, when the smart travel APP is not present in the terminal device, that is, when the riding code APP acts as an independent APP that provides the riding code for the terminal device, then when the riding code APP detects that the user opens the riding code, the riding code APP itself may act as the subject that issues the request to monitor the wrist-flipping action. For example, in S1202 the gesture service module may instead obtain the indication message for monitoring the wrist-flipping action from the riding code APP.
S1203, the wrist-flipping recognition algorithm may obtain, from the smart travel APP, an indication message for starting wrist-flipping recognition.
It can be understood that, when the smart travel APP detects that the user opens the riding code, the steps shown in S1202 and S1203 may be executed simultaneously.
S1204, the gesture service module may register a listener for monitoring the wrist-flipping action.
In the embodiment of the application, the listener can be used for listening for the message indicating whether the wrist-flipping action is recognized (or for the wrist-flipping score). For example, the gesture service module may receive a plurality of messages and may include a plurality of listeners for listening to different types of messages. When a listener in the gesture service module receives a message that the wrist-flipping action is recognized (or a message that it is not recognized, or a wrist-flipping score), the gesture service module may report that message to the smart travel APP in time. The message indicating whether the wrist-flipping action is recognized may be 0 (the wrist-flipping action is not satisfied) or 1 (the wrist-flipping action is satisfied), and the wrist-flipping score may be a numerical value.
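The listener mechanism described above can be sketched as a simple observer pattern. This is an illustration only; the class and method names are hypothetical and do not reflect an actual framework API.

```python
class GestureService:
    """Minimal sketch of the gesture service's listener mechanism: modules
    register listeners per message type, and the service forwards each
    reported message (e.g. a wrist-flip result of 0/1, or a score) to the
    matching listeners."""

    def __init__(self):
        self._listeners = {}  # message type -> list of callbacks

    def register_listener(self, msg_type, callback):
        self._listeners.setdefault(msg_type, []).append(callback)

    def report(self, msg_type, value):
        # Called by the recognition algorithm; notifies e.g. the travel APP.
        for cb in self._listeners.get(msg_type, []):
            cb(value)

received = []
svc = GestureService()
svc.register_listener("wrist_flip", received.append)
svc.report("wrist_flip", 1)  # wrist-flip recognized -> forwarded to the APP
```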
In the embodiment of the application, when the user needs to ride, the user may perform a wrist-flipping action, so that the wrist-flipping recognition algorithm can execute the step shown in S1205. The wrist-flipping action serves to turn the riding code displayed by the terminal device toward the code scanning port of the gate.
It is understood that the wrist flipping action of the user may be performed in any step from after S1201 to before S1205, which is not limited in the embodiment of the present application.
S1205, the wrist-turning identification algorithm can acquire acceleration data from the acceleration sensor and angular acceleration data from the gyroscope sensor.
S1206, the wrist-flipping recognition algorithm performs wrist-flipping recognition based on the acceleration data and the angular acceleration data to obtain a wrist-flipping recognition result.
For example, for the process by which the wrist-flipping recognition algorithm performs wrist-flipping recognition based on the acceleration data and the angular acceleration data, reference may be made to the process by which the terminal device obtains the wrist-flipping recognition result based on the neural network model in the step shown in S402, which is not described herein again.
S1207, the gesture service module can obtain a message for indicating the wrist-flipping recognition result from the wrist-flipping recognition algorithm.
S1208, the smart travel APP may obtain, from the gesture service module, a message indicating the wrist-flipping recognition result.
In the embodiment of the application, after performing the wrist-flipping action, the user may bring the riding code close to the code scanning port of the gate and scan the code to enter the station, so that the smart travel APP can execute the step shown in S1209.
It is understood that the code scanning action of the user may be performed after the wrist flipping action, for example, may be performed at any step after S1201 and before S1209, which is not limited in this embodiment of the present application.
S1209, the smart travel APP determines whether to exit the riding code interface based on the wrist-flipping recognition result and the time from wrist flipping to code scanning.
In this embodiment of the application, the smart travel APP can detect the code scanning completion time based on a change of the riding code page, since the riding code in the page may be refreshed at the moment the code scanning is completed. For example, before the code is scanned, the riding code in the riding code interface is a first riding code; when the user aims the first riding code at the code scanning port of the gate and the scanning is completed, the riding code in the interface may be switched from the first riding code to a second riding code. The smart travel APP can detect this change in the current riding code page and record the time of the change, which is understood as the code scanning completion time.
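The page-change detection described above can be sketched as follows. This is an illustration only: in practice the APP would be notified of the code refresh rather than scanning a list of recorded samples, and the function name is hypothetical.

```python
def detect_scan_completion(code_samples):
    """code_samples: list of (timestamp_s, code_string) observations of the
    riding-code page. The displayed code is refreshed once the gate scans
    it, so the first timestamp at which the code differs from the previous
    observation is taken as the scan-completion time. Returns None if the
    code never changes (scanning was not completed)."""
    for (t_prev, prev), (t_cur, cur) in zip(code_samples, code_samples[1:]):
        if cur != prev:
            return t_cur
    return None
```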
For example, for the process by which the smart travel APP determines whether to exit the riding code interface based on the wrist-flipping recognition result and the time from wrist flipping to code scanning, reference may be made to the process by which the terminal device determines whether to exit the riding code interface in the step shown in S404, which is not described herein again.
Based on the method, the terminal device can recognize the wrist-flipping action during the user's code scanning process through data interaction between the modules in the device and, based on the wrist-flipping action and the time from wrist flipping to code scanning, quickly exit the riding code interface after the user completes scanning, thereby enhancing the flexibility of exiting the riding code.
It should be understood that the interface of the terminal device provided in the embodiment of the present application is only an example, and is not limited to the embodiment of the present application.
The method provided by the embodiment of the present application is explained above with reference to fig. 4 to fig. 12, and the apparatus provided by the embodiment of the present application for performing the method is described below. As shown in fig. 13, fig. 13 is a schematic structural diagram of an apparatus for exiting a two-dimensional code according to an embodiment of the present application, where the apparatus for exiting a two-dimensional code may be a terminal device in the embodiment of the present application, and may also be a chip or a chip system in the terminal device.
As shown in fig. 13, the apparatus 130 for exiting the two-dimensional code can be used in a communication device, a circuit, a hardware component, or a chip, and includes: a display unit 1301 and a processing unit 1302. The display unit 1301 is configured to support the display steps performed in the method for exiting the two-dimensional code; the processing unit 1302 is configured to support the information processing steps performed by the apparatus for exiting the two-dimensional code.
Specifically, the embodiment of the present application provides a device 130 for exiting a two-dimensional code, which includes a display unit 1301, configured to display a first interface; the first interface comprises a first two-dimensional code; the processing unit 1302 is configured to obtain first data in a wrist flipping process; the first data includes acceleration data and angular acceleration data; the acceleration data is acquired by an acceleration sensor, and the angular acceleration data is acquired by a gyroscope sensor; when the first data indicates that the wrist-flipping action occurs, the display unit 1301 is further configured to quit displaying the first interface.
In a possible implementation manner, when the first data indicates that the wrist flipping action occurs, and a time interval from the occurrence of the wrist flipping action to the scanning of the first two-dimensional code is smaller than a first preset threshold, the display unit 1301 is specifically configured to exit from displaying the first interface.
In a possible implementation manner, the processing unit 1302 is specifically configured to receive a first operation of opening a first application in a desktop; the first application is used for providing a first two-dimensional code for the terminal equipment; in response to the first operation, the display unit 1301 is specifically configured to display a home page of the first application; the home page of the first application comprises a first control for opening a first two-dimensional code; the processing unit 1302 is further specifically configured to receive a second operation for the first control; in response to the second operation, the display unit 1301 is further specifically configured to display the first interface.
In a possible implementation manner, the display unit 1301 is specifically configured to display a home page of the first application.
In a possible implementation, the processing unit 1302 is specifically configured to receive a third operation for the first application in the desktop; the first application is used for providing a first two-dimensional code for the terminal equipment; a display unit 1301, specifically configured to display an interface including a function component of a first application; the functional component comprises a second control for opening the first two-dimensional code; the processing unit 1302 is further specifically configured to receive a fourth operation for the second control; in response to the fourth operation, the display unit 1301 is further specifically configured to display the first interface.
In a possible implementation manner, the display unit 1301 is specifically configured to display an interface including a function component of the first application.
In a possible implementation, the processing unit 1302 is specifically configured to receive a fifth operation for a third control in the desktop; the third control is used for opening the first two-dimensional code; in response to the fifth operation, the display unit 1301 is specifically configured to display the first interface.
In a possible implementation manner, the display unit 1301 is specifically configured to display a desktop interface.
In a possible implementation manner, the display unit 1301 is specifically configured to display a second interface; the second interface is an interface operated in the second application; the processing unit 1302 is specifically configured to receive a sixth operation of switching from the second interface to the first application and opening the first interface in the first application; the first application is used for providing a first two-dimensional code for the terminal equipment; in response to the sixth operation, the display unit 1301 is further specifically configured to display the first interface.
In a possible implementation manner, the display unit 1301 is specifically configured to display the second interface.
In a possible implementation manner, when the first data indicates that the wrist flipping action occurs and a time interval for switching from the second interface to the first interface is smaller than a second preset threshold, the display unit 1301 is specifically configured to display the second interface.
In a possible implementation manner, the display unit 1301 is specifically configured to display a third interface; the third interface is an interface operated in the first application; the first application is used for providing a first two-dimensional code for the terminal equipment; the processing unit 1302 is specifically configured to receive a seventh operation of switching from the third interface to the first interface; in response to the seventh operation, the display unit 1301 is specifically configured to display the first interface.
In a possible implementation manner, the display unit 1301 is specifically configured to display the third interface.
In a possible implementation manner, when the first data indicates that a wrist flipping action occurs and a time interval for switching from the third interface to the first interface is smaller than a third preset threshold, the display unit 1301 is specifically configured to display the third interface.
In one possible implementation manner, the wrist-flipping action is recognized by the terminal device from the first data using a neural network model; the neural network model is obtained by the terminal device through training based on second data, the second data includes acceleration sample data and angular acceleration sample data, and the second data is related to one or more of the following states: the state of the terminal device or the state of the code scanning port.
In one possible implementation, the state of the code scanning port includes at least one of: the code scanning port is in a vertical state, the code scanning port is in a horizontal state, the code scanning port is in a side-tipping state or the code scanning port is in a backward tipping state.
In one possible implementation, the state of the terminal device includes at least one of: the terminal device is in a vertical screen state, the terminal device is in a vertical screen inclined state, the terminal device is in a transverse screen inclined state, the terminal device is in a right inclined state, the terminal device is in a left inclined state, the terminal device is in an upward inclined state, the terminal device is in a downward inclined state, or the terminal device is in a horizontal downward state.
In a possible implementation manner, when the first data indicates that the wrist flipping action does not occur, and/or a time interval from the wrist flipping action to the scanning of the first two-dimensional code is greater than or equal to a first preset threshold, the display unit 1301 is further configured to display the first interface.
In a possible embodiment, the apparatus for exiting the two-dimensional code may further include: a storage unit 1304. The processing unit 1302 and the storage unit 1304 are connected by a line.
The storage unit 1304 may include one or more memories, which may be devices in one or more devices or circuits for storing programs or data.
The storage unit 1304 may be independent and connected, through a communication line, to the processing unit 1302 included in the apparatus for exiting the two-dimensional code. The storage unit 1304 may also be integrated with the processing unit 1302.
In a possible embodiment, the apparatus for exiting the two-dimensional code may further include a communication unit 1303. The communication unit 1303 may be an input or output interface, a pin, a circuit, or the like.
The storage unit 1304 may store computer-executable instructions of the methods in the terminal device, so that the processing unit 1302 performs the methods in the foregoing embodiments.
The storage unit 1304 may be a register, a cache, a RAM, or the like, in which case the storage unit 1304 may be integrated with the processing unit 1302. The storage unit 1304 may alternatively be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, in which case the storage unit 1304 may be separate from the processing unit 1302.
Fig. 14 is a schematic diagram of a hardware structure of a control device according to an embodiment of the present application. As shown in fig. 14, the control device includes a processor 1401, a communication line 1404, and at least one communication interface (communication interface 1403 is used as an example in fig. 14).
The processor 1401 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling execution of the programs in the solutions of the present application.
The communication line 1404 may include circuitry to communicate information between the above-described components.
The communication interface 1403 uses any apparatus of a transceiver type to communicate with another device or a communication network, such as an ethernet or a wireless local area network (WLAN).
Optionally, the control device may further include a memory 1402.
The memory 1402 may be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including a compact disc, a laser disc, an optical disc, a digital versatile disc, a blu-ray disc, or the like), a magnetic disk storage medium or another magnetic storage device, or any other medium that can be used to carry or store expected program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory may be independent and coupled to the processor through the communication line 1404. The memory may alternatively be integrated with the processor.
The memory 1402 is configured to store computer-executable instructions for executing the solutions of the present application, and the processor 1401 controls the execution. The processor 1401 is configured to execute the computer-executable instructions stored in the memory 1402, to implement the method for exiting a two-dimensional code provided in the embodiments of the present application.
Optionally, the computer-executable instructions in the embodiments of the present application may also be referred to as application program code; this is not specifically limited in the embodiments of the present application.
In specific implementation, the processor 1401 may include one or more CPUs, for example, CPU0 and CPU1 in fig. 14.
In specific implementation, in an embodiment, the control device may include a plurality of processors, for example, the processor 1401 and the processor 1405 in fig. 14. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (for example, computer program instructions).
For example, fig. 15 is a schematic structural diagram of a chip according to an embodiment of the present application. The chip 150 includes one or more (including two) processors 1520 and a communication interface 1530.
In some embodiments, memory 1540 stores the following elements: an executable module or a data structure, or a subset thereof, or an expanded set thereof.
In an embodiment of the present application, memory 1540 may include a read-only memory and a random access memory and provides instructions and data to processor 1520. A portion of memory 1540 may also include non-volatile random access memory (NVRAM).
In the illustrated embodiment, the processor 1520, the communication interface 1530, and the memory 1540 are coupled together through a bus system 1510. In addition to a data bus, the bus system 1510 may further include a power bus, a control bus, a status signal bus, and the like. For ease of description, the various buses are labeled as the bus system 1510 in fig. 15.
The methods described in the embodiments of the present application may be applied to the processor 1520 or implemented by the processor 1520. The processor 1520 may be an integrated circuit chip having a signal processing capability. In an implementation process, the steps of the foregoing methods may be completed by using an integrated logic circuit of hardware in the processor 1520 or instructions in the form of software. The processor 1520 may be a general-purpose processor (for example, a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate, a transistor logic device, or a discrete hardware component, and the processor 1520 may implement or execute the methods, steps, and logical blocks disclosed in the embodiments of the present application.
The steps of the methods disclosed with reference to the embodiments of the present application may be directly executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 1540, and the processor 1520 reads information in the memory 1540 and completes the steps of the foregoing methods in combination with hardware of the processor.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are all or partially generated. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, from one website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any available medium that can be stored by a computer, or a data storage device, such as a server or a data center, integrating one or more available media. The available medium may include, for example, a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a digital versatile disc (DVD)), or a semiconductor medium (for example, a solid state disk (SSD)).
The embodiments of the present application further provide a computer-readable storage medium. The methods described in the foregoing embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer-readable media may include computer storage media and communication media, and may include any medium that can transfer a computer program from one place to another. A storage medium may be any target medium that can be accessed by a computer.
As a possible design, the computer-readable medium may include a compact disc read-only memory (CD-ROM), a RAM, a ROM, an EEPROM, or other optical disc storage; the computer-readable medium may include a magnetic disk memory or another magnetic disk storage device. In addition, any connection line may also be properly termed a computer-readable medium. For example, if software is transmitted from a website, a server, or another remote source by using a coaxial cable, an optical fiber cable, a twisted pair, a DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, the optical fiber cable, the twisted pair, the DSL, or the wireless technologies such as infrared, radio, and microwave are included in the definition of the medium. Disk and disc, as used herein, include a compact disc (CD), a laser disc, an optical disc, a digital versatile disc (DVD), a floppy disk, and a blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The foregoing descriptions are merely specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (22)

1. A method for exiting a two-dimensional code, applied to a terminal device, wherein the terminal device comprises an acceleration sensor and a gyroscope sensor, and the method comprises:
the terminal device displays a first interface; the first interface comprises a first two-dimensional code;
the terminal device acquires first data in a wrist turning process; the first data comprises acceleration data and angular acceleration data; the acceleration data is acquired by the acceleration sensor, and the angular acceleration data is acquired by the gyroscope sensor;
and when the first data indicates that a wrist turning action occurs, the terminal device quits displaying the first interface.
2. The method according to claim 1, wherein when the first data indicates that a wrist turning action occurs, the terminal device quitting displaying the first interface comprises:
when the first data indicates that a wrist turning action occurs and a time interval from the occurrence of the wrist turning action to the scanning of the first two-dimensional code is smaller than a first preset threshold, the terminal device quits displaying the first interface.
3. The method according to claim 1 or 2, wherein the terminal device displaying the first interface comprises:
the terminal device receives a first operation of opening a first application in a desktop; the first application is used for providing the first two-dimensional code for the terminal device;
in response to the first operation, the terminal device displays a home page of the first application; the home page of the first application comprises a first control for opening the first two-dimensional code;
the terminal device receives a second operation for the first control;
and in response to the second operation, the terminal device displays the first interface.
4. The method of claim 3, wherein the terminal device quitting displaying the first interface comprises:
and the terminal equipment displays the home page of the first application.
5. The method according to claim 1 or 2, wherein the terminal device displaying the first interface comprises:
the terminal device receives a third operation for a first application in the desktop; the first application is used for providing the first two-dimensional code for the terminal device;
in response to the third operation, the terminal device displays an interface containing a functional component of the first application; the functional component comprises a second control for opening the first two-dimensional code;
the terminal device receives a fourth operation for the second control;
and in response to the fourth operation, the terminal device displays the first interface.
6. The method of claim 5, wherein the terminal device quitting displaying the first interface comprises:
and the terminal equipment displays the interface containing the functional component of the first application.
7. The method according to claim 1 or 2, wherein the terminal device displaying the first interface comprises:
the terminal device receives a fifth operation for a third control in the desktop; the third control is used for opening the first two-dimensional code;
and in response to the fifth operation, the terminal device displays the first interface.
8. The method of claim 7, wherein the terminal device quitting displaying the first interface comprises:
and the terminal equipment displays a desktop interface.
9. The method according to claim 1 or 2, wherein the terminal device displaying the first interface comprises:
the terminal device displays a second interface; the second interface is an interface running in a second application;
the terminal device receives a sixth operation of switching from the second interface to a first application and opening the first interface in the first application; the first application is used for providing the first two-dimensional code for the terminal device;
and in response to the sixth operation, the terminal device displays the first interface.
10. The method of claim 9, wherein the terminal device quitting displaying the first interface comprises:
and the terminal equipment displays the second interface.
11. The method of claim 10, wherein when the first data indicates that a wrist turning action occurs, the terminal device quitting displaying the first interface comprises:
when the first data indicates that the wrist turning action occurs and a time interval of switching from the second interface to the first interface is smaller than a second preset threshold, the terminal device displays the second interface.
12. The method according to claim 1 or 2, wherein the terminal device displaying the first interface comprises:
the terminal device displays a third interface; the third interface is an interface running in the first application; the first application is used for providing the first two-dimensional code for the terminal device;
the terminal device receives a seventh operation of switching from the third interface to the first interface;
and in response to the seventh operation, the terminal device displays the first interface.
13. The method of claim 12, wherein the terminal device quitting displaying the first interface comprises:
and the terminal equipment displays the third interface.
14. The method of claim 13, wherein when the first data indicates that a wrist turning action occurs, the terminal device quitting displaying the first interface comprises:
when the first data indicates that the wrist turning action occurs and a time interval of switching from the third interface to the first interface is smaller than a third preset threshold, the terminal device displays the third interface.
15. The method of claim 1, wherein the wrist turning action is obtained by the terminal device recognizing the first data using a neural network model; the neural network model is obtained by the terminal device through training based on second data, the second data comprises acceleration sample data and angular acceleration sample data, and the second data is related to one or more of the following states: the state of the terminal device or the state of a code scanning port.
16. The method of claim 15, wherein the state of the code scanning port comprises at least one of the following: the code scanning port is in an upright state, the code scanning port is in a horizontal state, the code scanning port is in a sideways-tilted state, or the code scanning port is in a backward-tilted state.
17. The method of claim 15, wherein the state of the terminal device comprises at least one of the following: the terminal device is in a portrait state, the terminal device is in a portrait tilted state, the terminal device is in a landscape tilted state, the terminal device is in a right-tilted state, the terminal device is in a left-tilted state, the terminal device is in an upward-tilted state, the terminal device is in a downward-tilted state, or the terminal device is in a horizontal face-down state.
18. The method of claim 2, further comprising:
and when the first data indicate that no wrist turning action occurs, and/or the time interval from the wrist turning action to the scanning of the first two-dimensional code is larger than or equal to the first preset threshold, the terminal equipment displays the first interface.
19. An apparatus for exiting a two-dimensional code, wherein a terminal device comprises an acceleration sensor and a gyroscope sensor, and the apparatus comprises:
a display unit, configured to display a first interface; the first interface comprises a first two-dimensional code;
a processing unit, configured to acquire first data in a wrist turning process; the first data comprises acceleration data and angular acceleration data; the acceleration data is acquired by the acceleration sensor, and the angular acceleration data is acquired by the gyroscope sensor;
wherein when the first data indicates that a wrist turning action occurs, the display unit is further configured to quit displaying the first interface.
20. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, causes the electronic device to perform the method of any of claims 1 to 18.
21. A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, causes a computer to carry out the method according to any one of claims 1 to 18.
22. A computer program product, comprising a computer program which, when executed, causes a computer to perform the method of any one of claims 1 to 18.
CN202111136786.6A 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code Active CN115016712B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202311129985.3A CN117453105A (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code
CN202111136786.6A CN115016712B (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code
PCT/CN2022/118319 WO2023045789A1 (en) 2021-09-27 2022-09-13 Method and apparatus for exiting quick-response code

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111136786.6A CN115016712B (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311129985.3A Division CN117453105A (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code

Publications (2)

Publication Number Publication Date
CN115016712A true CN115016712A (en) 2022-09-06
CN115016712B CN115016712B (en) 2024-05-14

Family

ID=83064915

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311129985.3A Pending CN117453105A (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code
CN202111136786.6A Active CN115016712B (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202311129985.3A Pending CN117453105A (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code

Country Status (2)

Country Link
CN (2) CN117453105A (en)
WO (1) WO2023045789A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023045789A1 (en) * 2021-09-27 2023-03-30 荣耀终端有限公司 Method and apparatus for exiting quick-response code

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104090649A (en) * 2014-05-20 2014-10-08 上海翰临电子科技有限公司 Intelligent watchband and operating control method thereof
CN206833458U (en) * 2017-06-02 2018-01-02 南京易自助网络科技有限公司 A kind of barcode scanning payment mechanism of Self-help car washer
CN109725699A (en) * 2017-10-20 2019-05-07 华为终端(东莞)有限公司 Recognition methods, device and the equipment of identification code
CN109918006A (en) * 2019-01-28 2019-06-21 维沃移动通信有限公司 A kind of screen control method and mobile terminal
CN109933191A (en) * 2019-02-13 2019-06-25 苏鹏程 Gesture identification and control method and its system
CN110058767A (en) * 2019-03-26 2019-07-26 努比亚技术有限公司 Interface operation method, wearable terminal and computer readable storage medium
CN110187759A (en) * 2019-05-08 2019-08-30 安徽华米信息科技有限公司 Display methods, device, intelligent wearable device and storage medium
CN111209904A (en) * 2018-11-21 2020-05-29 华为技术有限公司 Service processing method and related device
CN112613475A (en) * 2020-12-31 2021-04-06 Oppo广东移动通信有限公司 Code scanning interface display method and device, mobile terminal and storage medium
CN112817443A (en) * 2021-01-22 2021-05-18 歌尔科技有限公司 Display interface control method, device and equipment based on gestures and storage medium
WO2021164551A1 (en) * 2020-02-19 2021-08-26 中国银联股份有限公司 Control method, system and apparatus for gate access, and gate

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107734191A (en) * 2017-11-15 2018-02-23 深圳市沃特沃德股份有限公司 Utilize the method and apparatus of acceleration transducer control mobile phone
EP3757739A4 (en) * 2018-04-19 2021-03-10 Huawei Technologies Co., Ltd. Method for display when exiting an application, and terminal
CN111190563A (en) * 2019-12-31 2020-05-22 华为技术有限公司 Interface display method and related device
CN117453105A (en) * 2021-09-27 2024-01-26 荣耀终端有限公司 Method and device for exiting two-dimensional code


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SHANHE YI et al.: "Exploring head gesture interface of smart glasses", IEEE INFOCOM 2016 - The 35th Annual IEEE International Conference on Computer Communications, 28 July 2016 (2016-07-28), pages 1 - 9 *
XIE Renqiang et al.: "Scalable Gesture Recognition Based on Acceleration Sensors", Chinese Journal of Sensors and Actuators, 31 May 2016 (2016-05-31), pages 659 - 664 *


Also Published As

Publication number Publication date
WO2023045789A1 (en) 2023-03-30
CN115016712B (en) 2024-05-14
WO2023045789A9 (en) 2023-08-03
CN117453105A (en) 2024-01-26

Similar Documents

Publication Publication Date Title
CN109445577B (en) Virtual room switching method and device, electronic equipment and storage medium
CN111983559A (en) Indoor positioning navigation method and device
CN110288689B (en) Method and device for rendering electronic map
CN115904208A (en) Split screen display method and device
CN111569435A (en) Ranking list generation method, system, server and storage medium
CN113282355A (en) Instruction execution method and device based on state machine, terminal and storage medium
CN111311155A (en) Method, apparatus, system, device and storage medium for modifying distribution position
CN111241499A (en) Application program login method, device, terminal and storage medium
CN111437600A (en) Plot showing method, plot showing device, plot showing equipment and storage medium
CN113688019B (en) Response time duration detection method and device
CN109005102B (en) Information processing method and device and electronic device
CN112749590A (en) Object detection method, device, computer equipment and computer readable storage medium
CN115016712B (en) Method and device for exiting two-dimensional code
CN111249728B (en) Image processing method, device and storage medium
CN112118353A (en) Information display method, device, terminal and computer readable storage medium
CN110231049B (en) Navigation route display method, device, terminal and storage medium
CN109107163B (en) Analog key detection method and device, computer equipment and storage medium
CN114201738A (en) Unlocking method and electronic equipment
CN113781731B (en) Alarm method and device
CN114598992A (en) Information interaction method, device, equipment and computer readable storage medium
CN111159168A (en) Data processing method and device
CN115016666B (en) Touch processing method, terminal equipment and storage medium
CN116755977B (en) Motion monitoring method, electronic device and computer readable storage medium
CN111666214B (en) Client fault tolerance test method, device, terminal, server and storage medium
CN113064537B (en) Media resource playing method, device, equipment, medium and product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant