CN115016712B - Method and device for exiting two-dimensional code - Google Patents

Method and device for exiting a two-dimensional code

Info

Publication number
CN115016712B
CN115016712B (application CN202111136786.6A)
Authority
CN
China
Prior art keywords
interface
wrist
code
terminal equipment
turning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111136786.6A
Other languages
Chinese (zh)
Other versions
CN115016712A (en)
Inventor
李丹洪
邸皓轩
张晓武
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202311129985.3A priority Critical patent/CN117453105A/en
Priority to CN202111136786.6A priority patent/CN115016712B/en
Publication of CN115016712A publication Critical patent/CN115016712A/en
Priority to PCT/CN2022/118319 priority patent/WO2023045789A1/en
Application granted granted Critical
Publication of CN115016712B publication Critical patent/CN115016712B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present application provides a method and a device for exiting a two-dimensional code, relating to the field of terminal technology. The method includes: the terminal device displays a first interface, where the first interface includes a first two-dimensional code; the terminal device obtains first data during the wrist-turning process, where the first data includes acceleration data collected by an acceleration sensor and angular-acceleration data collected by a gyroscope sensor; and when the first data indicates that a wrist-turning action has occurred, the terminal device exits the first interface. In this way, the terminal device can decide whether to exit the two-dimensional code by recognizing the user's wrist-turning action during code scanning, which improves the flexibility of exiting the code-scanning interface.

Description

Method and device for exiting a two-dimensional code
Technical Field
The application relates to the technical field of terminals, in particular to a method and a device for exiting a two-dimensional code.
Background
With the wide popularization of the two-dimensional code, more and more terminal devices can use a two-dimensional code to implement functions such as transit riding and payment. For example, a user can travel without a physical card by using a two-dimensional code: before boarding public transport, the user can open the two-dimensional code (or simply the riding code) in a smart-travel application (app) on the terminal device, and hold the riding code toward the code-scanning port of a gate to scan the code and board.
In general, after the user finishes scanning the code, the user's terminal device displays a code-scanning-completion interface, which prompts the user that the riding-code scan has completed. When the user needs to exit the riding code, the user can do so by triggering an exit button in the code-scanning-completion interface.
However, this way of exiting the riding code has low flexibility.
Disclosure of Invention
An embodiment of the present application provides a method and a device for exiting a two-dimensional code, which enable the terminal device to recognize a wrist-turning action while the user scans the code, automatically exit the riding-code interface after the user finishes scanning based on the wrist-turning action, and thereby enhance the flexibility of exiting the code-scanning interface.
In a first aspect, an embodiment of the present application provides a method for exiting a two-dimensional code, applied to a terminal device that includes an acceleration sensor and a gyroscope sensor. The method includes: the terminal device displays a first interface, where the first interface includes a first two-dimensional code; the terminal device obtains first data during the wrist-turning process, where the first data includes acceleration data collected by the acceleration sensor and angular-acceleration data collected by the gyroscope sensor; and when the first data indicates that a wrist-turning action has occurred, the terminal device exits the first interface. In this way, the terminal device can decide whether to exit the two-dimensional code by recognizing the user's wrist-turning action during code scanning, which improves the flexibility of exiting the code-scanning interface.
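The first-aspect flow can be sketched in a few lines. This is only an illustrative stand-in, not the patent's method: the sample type, field names, axis choice, and threshold are invented here, and the description's actual recognizer is a trained neural network model.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ImuSample:
    """One reading of the 'first data': acceleration (m/s^2) from the
    acceleration sensor and angular rate (rad/s) from the gyroscope."""
    ax: float
    ay: float
    az: float
    gx: float
    gy: float
    gz: float


def wrist_flip_detected(window: List[ImuSample], gyro_peak: float = 3.0) -> bool:
    # Crude stand-in for the recognizer: treat a large peak in angular rate
    # about the wrist axis (gx here, by assumption) as a wrist-turning action.
    return any(abs(s.gx) > gyro_peak for s in window)
```

When the detector fires, the terminal device would exit the first interface (the code-scanning screen); the described method replaces this fixed threshold with neural-network inference over the raw sensor window.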
In this embodiment of the application, the first two-dimensional code is the riding code, and the first interface is the interface for scanning the riding code.
In one possible implementation, when the first data indicates that the wrist-turning action has occurred, the terminal device exiting the first interface includes: when the first data indicates that the wrist-turning action has occurred, and the time interval from the occurrence of the wrist-turning action to the scanning of the first two-dimensional code is smaller than a first preset threshold, the terminal device exits the first interface. In this way, the terminal device can recognize the user's wrist-turning action during code scanning and, based on the wrist-turning action together with the time interval between the wrist-turning action and the completion of code scanning, automatically exit the two-dimensional-code interface after the user finishes scanning, enhancing the flexibility of exiting the code-scanning interface.
The first preset threshold may be, for example, 5 seconds.
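A minimal sketch of that timing gate follows. The timestamps, names, and the use of an absolute difference (the description's ordering of scan and flip is ambiguous in translation) are our assumptions; 5 seconds is just the example value above.

```python
FIRST_PRESET_THRESHOLD_S = 5.0  # example value from the description above


def should_exit(wrist_flip_s: float, scan_done_s: float,
                threshold_s: float = FIRST_PRESET_THRESHOLD_S) -> bool:
    """Return True when the interval between the wrist-turning action and
    the scanning of the first two-dimensional code is below the first
    preset threshold (timestamps in seconds)."""
    return abs(wrist_flip_s - scan_done_s) < threshold_s
```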
In one possible implementation, the terminal device displaying the first interface includes: the terminal device receives a first operation of opening a first application from the desktop, where the first application is used to provide the first two-dimensional code for the terminal device; in response to the first operation, the terminal device displays the home page of the first application, where the home page includes a first control for opening the first two-dimensional code; the terminal device receives a second operation on the first control; and in response to the second operation, the terminal device displays the first interface. In this way, the terminal device can conveniently enter the two-dimensional-code interface from the home page of the first application.
In this embodiment, the first application may be the smart-travel application, the home page of the first application may be the home page of the smart-travel application, and the first control may be the two-dimensional-code control on that home page.
In one possible implementation, the terminal device exiting the first interface includes: the terminal device displays the home page of the first application. In this way, when the terminal device detects that the user's wrist-turning action and/or the time interval from the wrist-turning action to the completion of code scanning satisfies the preset condition, the terminal device can exit the two-dimensional-code interface and display the home page of the first application, improving the flexibility of exiting the two-dimensional code.
In one possible implementation, the terminal device displaying the first interface includes: the terminal device receives a third operation on the first application in the desktop, where the first application is used to provide the first two-dimensional code for the terminal device; in response to the third operation, the terminal device displays an interface containing a functional component of the first application, where the functional component includes a second control for opening the first two-dimensional code; the terminal device receives a fourth operation on the second control; and in response to the fourth operation, the terminal device displays the first interface. In this way, the terminal device can conveniently enter the two-dimensional-code interface from the functional component of the first application.
In one possible implementation, the terminal device exiting the first interface includes: the terminal device displays the interface containing the functional component of the first application. In this way, when the terminal device detects that the user's wrist-turning action and/or the time interval from the wrist-turning action to the completion of code scanning satisfies the preset condition, the terminal device can exit the two-dimensional-code interface and display the interface containing the functional component of the first application, which improves the flexibility of exiting the two-dimensional code and makes it convenient for the user to use other functions of the first application.
In one possible implementation, the terminal device displaying the first interface includes: the terminal device receives a fifth operation on a third control in the desktop, where the third control is used to open the first two-dimensional code; and in response to the fifth operation, the terminal device displays the first interface. In this way, the terminal device can conveniently enter the two-dimensional-code interface from the desktop.
In one possible implementation, the terminal device exiting the first interface includes: the terminal device displays the desktop interface. In this way, when the terminal device detects that the user's wrist-turning action and/or the time interval from the wrist-turning action to the completion of code scanning satisfies the preset condition, the terminal device can exit the two-dimensional-code interface and display the desktop interface, improving the flexibility of exiting the two-dimensional code.
In one possible implementation, the terminal device displaying the first interface includes: the terminal device displays a second interface, where the second interface is an interface running in a second application; the terminal device receives a sixth operation of switching from the second interface to the first application and opening the first interface in the first application, where the first application is used to provide the first two-dimensional code for the terminal device; and in response to the sixth operation, the terminal device displays the first interface. In this way, the terminal device can conveniently enter the two-dimensional-code interface in the first application while another application is open.
In this embodiment, the second application may be a video application, and the second interface may be an interface for watching a video.
In one possible implementation, the terminal device exiting the first interface includes: the terminal device displays the second interface. In this way, when the terminal device detects that the user's wrist-turning action and/or the time interval from the wrist-turning action to the completion of code scanning satisfies the preset condition, the terminal device can exit the two-dimensional-code interface and display the interface of the second application, which improves the flexibility of exiting the two-dimensional code and makes it convenient for the user to use the second application.
In one possible implementation, when the first data indicates that the wrist-turning action has occurred, the terminal device exiting the first interface includes: when the first data indicates that the wrist-turning action has occurred and the time interval since switching from the second interface to the first interface is smaller than a second preset threshold, the terminal device displays the second interface. In this way, when the terminal device detects the user's wrist-turning action and the time interval from the second interface to the first interface satisfies the preset condition, the terminal device can exit the two-dimensional-code interface and display the second interface, improving the flexibility of exiting the two-dimensional code and making it convenient for the user to view the second interface.
In one possible implementation, the terminal device displaying the first interface includes: the terminal device displays a third interface, where the third interface is an interface running in the first application and the first application is used to provide the first two-dimensional code for the terminal device; the terminal device receives a seventh operation of switching from the third interface to the first interface; and in response to the seventh operation, the terminal device displays the first interface. In this way, the terminal device can conveniently enter the two-dimensional-code interface while another interface of the first application is open.
In one possible implementation, the terminal device exiting the first interface includes: the terminal device displays the third interface. In this way, when the terminal device detects that the user's wrist-turning action and/or the time interval from the wrist-turning action to the completion of code scanning satisfies the preset condition, the terminal device can exit the two-dimensional-code interface and display another interface of the first application, which improves the flexibility of exiting the two-dimensional code and makes it convenient for the user to use other functions of the first application.
In one possible implementation, when the first data indicates that the wrist-turning action has occurred, the terminal device exiting the first interface includes: when the first data indicates that the wrist-turning action has occurred and the time interval since switching from the third interface to the first interface is smaller than a third preset threshold, the terminal device displays the third interface. In this way, when the terminal device detects the user's wrist-turning action and the time interval from the third interface to the first interface satisfies the preset condition, the terminal device can exit the two-dimensional-code interface and display the third interface, improving the flexibility of exiting the two-dimensional code and making it convenient for the user to view the third interface.
In one possible implementation, the wrist-turning action is obtained by the terminal device recognizing the first data with a neural network model. The neural network model is obtained by the terminal device through training based on second data, where the second data includes acceleration sample data and angular-acceleration sample data and is related to one or more of the following states: the state of the terminal device or the state of the code-scanning port. In this way, the terminal device can accurately recognize the wrist-turning action using the neural network model.
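The patent does not disclose the network's architecture. Purely to make the idea concrete, here is a toy one-hidden-layer scorer over a flattened window of sensor values; the layer sizes, ReLU/sigmoid choices, and calling convention are all assumptions, and in the described method the weights would come from training on the labelled second data.

```python
import math
from typing import List, Sequence


def score_window(x: Sequence[float],
                 w1: List[List[float]], b1: List[float],
                 w2: List[float], b2: float) -> float:
    """One hidden ReLU layer followed by a sigmoid output, interpreted as
    P(wrist-turning action) for one window of acceleration and
    angular-acceleration values."""
    hidden = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    z = sum(w * h for w, h in zip(w2, hidden)) + b2
    return 1.0 / (1.0 + math.exp(-z))
```

The terminal device would exit the first interface when this score exceeds a decision threshold, optionally combined with the time-interval condition of the first preset threshold.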
In one possible implementation, the state of the code-scanning port includes at least one of the following: the code-scanning port is upright, horizontal, tilted to the side, or tilted backward. In this way, based on a neural network model trained on second data related to the states of code-scanning ports, the terminal device can recognize the user's wrist-turning action when scanning the riding code at different code-scanning ports, improving the accuracy of wrist-turning recognition.
In one possible implementation, the state of the terminal device includes at least one of the following: the terminal device is in a portrait state, a tilted-portrait state, a tilted-landscape state, tilted to the upper right, tilted to the upper left, tilted upward, tilted downward, or horizontal and facing down. In this way, based on a neural network model trained on second data related to the states of the terminal device, the terminal device can recognize the user's wrist-turning action when scanning the riding code in different motion states, improving the accuracy of wrist-turning recognition.
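Hypothetically, the scanner and device states enumerated in the last two paragraphs could be encoded as labels attached to the second data during collection; the enum names below are ours, not the patent's.

```python
from enum import Enum, auto


class ScannerState(Enum):
    """Code-scanning-port conditions under which second data is collected."""
    UPRIGHT = auto()
    HORIZONTAL = auto()
    SIDE_TILTED = auto()
    BACK_TILTED = auto()


class DeviceState(Enum):
    """Terminal-device orientations under which second data is collected."""
    PORTRAIT = auto()
    PORTRAIT_TILTED = auto()
    LANDSCAPE_TILTED = auto()
    TILTED_UPPER_RIGHT = auto()
    TILTED_UPPER_LEFT = auto()
    TILTED_UP = auto()
    TILTED_DOWN = auto()
    FACE_DOWN = auto()
```

Covering the cross product of these conditions in the training set is what lets the model generalize across gates and holding postures.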
In one possible implementation, the method further includes: when the first data indicates that no wrist-turning action has occurred, and/or the time interval from the occurrence of the wrist-turning action to the scanning of the first two-dimensional code is greater than or equal to the first preset threshold, the terminal device continues to display the first interface. In this way, based on the wrist-turning action during code scanning and/or the time interval from the wrist-turning action to the completion of code scanning, the terminal device can also refrain from exiting the two-dimensional-code interface, so that the user can continue to use it.
In a second aspect, an embodiment of the present application provides a device for exiting a two-dimensional code. The device includes an acceleration sensor, a gyroscope sensor, a display unit, and a processing unit. The display unit is configured to display a first interface, where the first interface includes a first two-dimensional code. The processing unit is configured to obtain first data during the wrist-turning process, where the first data includes acceleration data collected by the acceleration sensor and angular-acceleration data collected by the gyroscope sensor. When the first data indicates that a wrist-turning action has occurred, the display unit is further configured to exit the first interface.
In one possible implementation manner, when the first data indicates that the wrist turning action occurs, and a time interval from the occurrence of the wrist turning action to the scanning of the first two-dimensional code is smaller than a first preset threshold, the display unit is specifically configured to exit to display the first interface.
In one possible implementation, the processing unit is specifically configured to receive a first operation of opening a first application in the desktop; the first application is used for providing a first two-dimensional code for the terminal equipment; the display unit is used for displaying a home page of a first application in response to a first operation; the first control for opening the first two-dimensional code is included in the home page of the first application; the processing unit is further specifically configured to receive a second operation for the first control; the display unit is also specifically configured to display the first interface in response to the second operation.
In a possible implementation, the display unit is specifically configured to display the home page of the first application.
In one possible implementation, the processing unit is specifically configured to receive a third operation for the first application in the desktop; the first application is used for providing a first two-dimensional code for the terminal equipment; the display unit is specifically used for displaying an interface containing the functional components of the first application; the functional component comprises a second control for opening the first two-dimensional code; the processing unit is further specifically configured to receive a fourth operation for the second control; the display unit is further specifically configured to display the first interface in response to the fourth operation.
In a possible implementation, the display unit is specifically configured to display an interface containing the functional components of the first application.
In one possible implementation, the processing unit is specifically configured to receive a fifth operation on a third control in the desktop, where the third control is used to open the first two-dimensional code; and the display unit is specifically configured to display the first interface in response to the fifth operation.
In a possible implementation, the display unit is specifically configured to display a desktop interface.
In a possible implementation, the display unit is specifically configured to display the second interface; the second interface is an interface operated in the second application; the processing unit is specifically used for receiving a sixth operation of switching from the second interface to the first application and opening the first interface in the first application; the first application is used for providing a first two-dimensional code for the terminal equipment; the display unit is further specifically configured to display the first interface in response to the sixth operation.
In a possible implementation, the display unit is specifically configured to display the second interface.
In one possible implementation manner, when the first data indicates that the wrist turning action occurs and a time interval between switching from the second interface to the first interface is smaller than a second preset threshold, the display unit is specifically configured to display the second interface.
In a possible implementation, the display unit is specifically configured to display the third interface, where the third interface is an interface running in the first application and the first application is used to provide the first two-dimensional code for the terminal device; the processing unit is specifically configured to receive a seventh operation of switching from the third interface to the first interface; and the display unit is specifically configured to display the first interface in response to the seventh operation.
In a possible implementation, the display unit is specifically configured to display the third interface.
In one possible implementation manner, when the first data indicates that the wrist turning action occurs and a time interval between switching from the third interface to the first interface is smaller than a third preset threshold, the display unit is specifically configured to display the third interface.
In one possible implementation manner, the wrist turning action is obtained by the terminal equipment identifying the first data by using a neural network model; the neural network model is obtained by training the terminal equipment based on second data, the second data comprises acceleration sample data and angular acceleration sample data, and the second data is related to one or more of the following states: the state of the terminal equipment or the state of the code scanning port.
In one possible implementation, the state of the code scanning port includes at least one of the following: the code scanning port is in an upright state, the code scanning port is in a horizontal state, the code scanning port is in a side tilting state or the code scanning port is in a backward tilting state.
In one possible implementation, the state of the terminal device includes at least one of: the terminal equipment is in a vertical screen state, the terminal equipment is in a vertical screen inclined state, the terminal equipment is in a horizontal screen inclined state, the terminal equipment is in an upper right inclined state, the terminal equipment is in an upper left inclined state, the terminal equipment is in an upward inclined state, the terminal equipment is in a downward inclined state or the terminal equipment is in a horizontal downward state.
In one possible implementation manner, when the first data indicates that the wrist turning action does not occur, and/or a time interval from the occurrence of the wrist turning action to the scanning of the first two-dimensional code is greater than or equal to a first preset threshold, the display unit is further configured to display the first interface.
In a third aspect, an embodiment of the present application provides a device for exiting a two-dimensional code, including a processor and a memory, where the memory is configured to store code instructions and the processor is configured to execute the code instructions to cause the electronic device to perform the method for exiting a two-dimensional code described in the first aspect or any implementation of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing instructions that, when executed, cause a computer to perform a method of exiting two-dimensional codes as described in the first aspect or any implementation of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product comprising a computer program which, when run, causes a computer to perform the method for exiting a two-dimensional code described in the first aspect or any implementation of the first aspect.
It should be understood that the second to fifth aspects of the present application correspond to the technical solution of the first aspect; the advantages obtained by each aspect and its corresponding possible implementations are similar and are not repeated here.
Drawings
Fig. 1 is a schematic diagram of an interface of a riding code according to an embodiment of the present application;
Fig. 2 is a schematic diagram of the hardware structure of a terminal device according to an embodiment of the present application;
Fig. 3 is a schematic diagram of the software structure of a terminal device according to an embodiment of the present application;
Fig. 4 is a flow chart of a method for exiting a two-dimensional code according to an embodiment of the present application;
Fig. 5 is a schematic diagram of an interface for opening a riding code according to an embodiment of the present application;
Fig. 6 is a schematic diagram of another interface for opening a riding code according to an embodiment of the present application;
Fig. 7 is a schematic flow chart of a neural network model according to an embodiment of the present application;
Fig. 8 is a schematic diagram of an interface for exiting a riding code according to an embodiment of the present application;
Fig. 9 is a schematic diagram of another interface for exiting a riding code according to an embodiment of the present application;
Fig. 10 is a schematic diagram of another interface for exiting a riding code according to an embodiment of the present application;
Fig. 11 is a schematic diagram of an interface for displaying a prompt message according to an embodiment of the present application;
Fig. 12 is a flow chart of another method for exiting a two-dimensional code according to an embodiment of the present application;
Fig. 13 is a schematic diagram of the structure of a device for exiting a two-dimensional code according to an embodiment of the present application;
Fig. 14 is a schematic diagram of the hardware structure of a control device according to an embodiment of the present application;
Fig. 15 is a schematic diagram of the structure of a chip according to an embodiment of the present application.
Detailed Description
In order to clearly describe the technical solutions of the embodiments of the present application, in the embodiments of the present application, the words "first", "second", etc. are used to distinguish identical or similar items having substantially the same function and effect. For example, the first value and the second value are merely used to distinguish different values, and their order is not limited. Those skilled in the art will appreciate that the words "first", "second", etc. do not limit the number or order of execution, and that items described with "first" and "second" are not necessarily different.
In the present application, the words "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of" the following items or the like means any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b and c; where a, b, and c may each be single or plural.
In recent years, with the wide popularization of two-dimensional codes, user travel has become increasingly seamless, and most users are accustomed to taking public transportation such as buses, subways, and light rail by scanning a riding code. Improving the experience of users who use the riding code is therefore of great significance.
For example, fig. 1 is an interface schematic diagram of a riding code according to an embodiment of the present application. In the embodiment corresponding to fig. 1, a mobile phone is taken as an example of the terminal device; this example does not limit the embodiments of the present application.
When a user scans a code to ride using the smart travel APP in the mobile phone, the riding code in the smart travel APP can be opened through the interface shown as a in fig. 1. For example, when the mobile phone receives the user's long-press operation on the smart travel APP 101 in the interface shown as a in fig. 1, the mobile phone may display a function component 102 corresponding to the smart travel APP, where the function component 102 may include one or more of the following: a control for scanning, a riding code control 103 for riding, or a control for adding other functions, etc. As shown as a in fig. 1, the interface may also include one or more of the following application controls, for example: file management, email, music, gallery, smart travel APP 101, camera, address book, phone, or messages, etc.
When the mobile phone receives the user's operation triggering the riding code control 103 in the interface shown as a in fig. 1, the mobile phone may display the interface shown as b in fig. 1. The interface shown as b in fig. 1 may include one or more of the following, for example: a bus control 104 for displaying the two-dimensional code used when taking a bus, a subway control 105 for displaying the two-dimensional code used when taking a subway, an identifier of the riding code interface, text information of an X city electronic subway card, and a two-dimensional code 106 for taking the subway. The subway control 105 is in a selected state.
When the user points the two-dimensional code 106 shown as b in fig. 1 at the code scanning port of the gate, the mobile phone may display the interface shown as c in fig. 1 after several seconds. The interface shown as c in fig. 1 may include one or more of the following: prompt information 107 for prompting that the ride is successful, an exit control 108 for exiting the riding code, or a control for returning to the riding code, etc.
Further, when the mobile phone receives the user's operation triggering the exit control 108 in the interface shown as c in fig. 1, the mobile phone can exit the riding code interface and display the main interface or another interface corresponding to the smart travel APP.
However, this way of exiting the riding code interface is inflexible; in particular, when the user needs to use other functions in the smart travel APP, exiting the riding code interface in this way is inconvenient, which affects the user's experience of the smart travel APP.
In view of this, an embodiment of the present application provides a method for exiting a two-dimensional code, so that a terminal device can identify a wrist-turning action during the user's code scanning and, based on the wrist-turning action and the time interval from the wrist turning to the code scanning, exit the riding code interface quickly after the user finishes scanning, thereby enhancing the flexibility of exiting the riding code.
It is understood that the terminal device may be a smart phone or a tablet, or the terminal device may be a wearable device, such as a smart watch, a smart bracelet, a wearable Virtual Reality (VR) device, or a wearable augmented reality (augmented reality, AR) device. The specific technology and the specific equipment form adopted by the terminal equipment are not limited in the embodiment of the application.
It can be understood that the method for exiting the two-dimensional code provided by the embodiment of the present application can be applied to riding code-scanning scenarios as well as scenarios such as payment; the scenarios to which the method can be applied are not specifically limited in the embodiment of the present application.
Therefore, in order to better understand the embodiments of the present application, the structure of the terminal device of the embodiments of the present application will be described below. Fig. 2 is a schematic hardware structure of a terminal device according to an embodiment of the present application.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, an indicator 192, a camera 193, a display 194, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the application, the terminal device may include more or fewer components than illustrated, or certain components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. Wherein the different processing units may be separate devices or may be integrated in one or more processors. A memory may also be provided in the processor 110 for storing instructions and data.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge a terminal device, or may be used to transfer data between the terminal device and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. The power management module 141 is used for connecting the charge management module 140 and the processor 110.
The wireless communication function of the terminal device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Antennas in the terminal device may be used to cover single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G or the like applied on a terminal device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation.
The wireless communication module 160 may provide solutions for wireless communication applied on the terminal device, including wireless local area network (wireless local area networks, WLAN) such as a wireless fidelity (wireless fidelity, Wi-Fi) network, bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), etc.
The terminal device implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1. In the embodiment of the application, when the terminal device receives the triggering operation of the user for the riding code in the smart trip APP, the display screen 194 can be used for displaying the riding code interface.
The terminal device may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The camera 193 is used to capture still images or video. In some embodiments, the terminal device may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area.
The terminal device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals. The terminal device can listen to music or hands-free calls through the speaker 170A. The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals. When the terminal device answers a call or plays a voice message, the voice can be heard by placing the receiver 170B close to the ear. The earphone interface 170D is used to connect a wired earphone. The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194.
The gyro sensor 180B may be used to determine a motion gesture of the terminal device. In the embodiment of the present application, the gyro sensor 180B may be a three-axis (including x-axis, y-axis and z-axis) gyro sensor for measuring angular acceleration data of the terminal device when the user turns his wrist.
The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a hall sensor.
The acceleration sensor 180E may detect the magnitude of acceleration of the terminal device in various directions (typically three axes). In the embodiment of the present application, the acceleration sensor 180E may be a three-axis (including x-axis, y-axis and z-axis) acceleration sensor for measuring acceleration data of the terminal device when the user turns his wrist.
It will be appreciated that the gyroscopic sensor 180B and the acceleration sensor 180E may be used together to detect a user's wrist-turning motion.
A distance sensor 180F for measuring a distance. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense ambient light level. The fingerprint sensor 180H is used to collect a fingerprint. The temperature sensor 180J is for detecting temperature. The touch sensor 180K, also referred to as a "touch device". The bone conduction sensor 180M may acquire a vibration signal.
The touch sensor 180K may be disposed on the display 194, and the touch sensor 180K and the display 194 form a touch screen, or "touch screen". In the embodiment of the present application, the touch sensor 180K is configured to detect a touch operation performed by a user on the display screen.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key.
The software system of the terminal device may adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, a cloud architecture, or the like, which will not be described herein.
Fig. 3 is a schematic software structure of a terminal device according to an embodiment of the present application. As shown in fig. 3, the layered architecture divides the software into several layers, each with its own role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into three layers: from top to bottom, an application layer, an application framework layer, and a hardware layer. In a possible implementation, the layered architecture may also include an intelligent sensor hub (sensorhub).
In particular, the application layer may include a series of application packages. As shown in fig. 3, the application package may include one or more of the following, for example: camera, phone, smart travel APP, riding code APP, etc.
In the embodiment of the present application, the smart travel APP is used to implement functions such as riding and code scanning; the riding code APP can be understood as a third-party application within the smart travel APP that provides the riding code service for the smart travel APP. In a possible implementation, when the embodiment of the present application is applied to a payment scenario, the application layer may also include an application for payment.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a resource manager, a notification manager, and a gesture service module, among others.
The window manager is used for managing window programs. The window manager can obtain the display size, determine whether there is a status bar, lock the screen, touch the screen, drag the screen, capture the screen, etc.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short stay without user interaction, for example to notify that a download is complete, or to give a message alert. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the terminal device vibrates, or an indicator light blinks.
The gesture service module is used for monitoring the user's wrist-turning action. In the embodiment of the present application, when the gesture service module receives a message indicating the wrist-turning identification result, it can transmit the message indicating the wrist-turning identification result to the smart travel APP.
The hardware layer may include an acceleration sensor and a gyroscope sensor. In the embodiment of the present application, the acceleration sensor is used to collect acceleration data of the terminal device during the user's wrist turning; the gyroscope sensor is used to collect angular acceleration data of the terminal device during the user's wrist turning. The gyroscope sensor and the acceleration sensor can be used together to detect the user's wrist-turning action.
The smart sensor hub may be a solution based on a combination of software and hardware on a low power micro control unit (microcontroller unit, MCU) and a lightweight real-time operating system (real-time operating system, RTOS), the main function of which is to connect and process data from various sensor devices.
In the embodiment of the present application, the intelligent sensor hub may include a wrist-turning recognition algorithm, which performs algorithmic identification on the data acquired from the sensors to obtain the wrist-turning state during the user's riding code scanning.
The following describes the technical solutions of the present application, and how they solve the above technical problems, in detail with specific embodiments. The following embodiments may be implemented independently or combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments.
When the user takes the subway using the smart travel APP in the terminal device, the user can open the riding code in the smart travel APP and point it at the code scanning port of the gate. Based on the wrist-turning action during the user's code scanning and the time interval from the wrist turning to the code scanning, the terminal device can automatically exit the riding code interface after the user finishes scanning, thereby enhancing the flexibility of exiting the riding code.
Fig. 4 is a schematic flow chart of a method for exiting a two-dimensional code according to an embodiment of the present application. In the embodiment corresponding to fig. 4, the riding code is taken as an example of the two-dimensional code; this example does not limit the embodiments of the present application. As shown in fig. 4, the method for exiting the two-dimensional code may include the following steps:
S401, when the terminal device receives the user's operation of opening the riding code in the smart travel APP, the terminal device can display the riding code interface.
In an embodiment of the present application, the operation of opening the riding code may include one or more of the following: touch operation, key operation, space gesture operation, voice operation, or the like.
By way of example, the user may open the riding code in the smart travel APP in a number of ways (e.g., the embodiment of fig. 1, or the embodiments of fig. 5 to 6). In the embodiments corresponding to fig. 5 to fig. 6, a mobile phone is taken as an example of the terminal device; this does not limit the embodiments of the present application.
In one implementation, based on the embodiment corresponding to fig. 1, the mobile phone may display the function component of the smart travel APP in response to the user's long-press operation on the smart travel APP shown as a in fig. 1, and then open the riding code in the smart travel APP in response to the user's operation on the riding code control 103 in the function component.
In another implementation, the mobile phone can open the riding code in response to the user's operation on a control corresponding to the riding code function preset on the desktop. For example, fig. 5 is an interface schematic diagram for opening a riding code according to an embodiment of the present application.
In the desktop state shown as a in fig. 5, when the mobile phone receives the user's operation triggering the riding code control 501, the mobile phone may display the interface shown as b in fig. 5. In the interface shown as a in fig. 5, the riding code control 501 may be a control formed when the user adds the riding code function (or another set function) of the smart travel APP to the desktop; other contents shown as a in fig. 5 are similar to those shown as a in fig. 1 and are not described here. The content shown as b in fig. 5 is similar to the content shown as b in fig. 1 and is not described again.
In still another implementation, the mobile phone may receive the user's operation of opening the smart travel APP, and open the riding code in response to the user's operation on an opening control in the smart travel APP. Fig. 6 is a schematic diagram of another interface for opening a riding code according to an embodiment of the present application.
In the desktop state shown as a in fig. 6, when the mobile phone receives the user's operation triggering the smart travel control 601, the mobile phone may display the interface shown as b in fig. 6. The interface shown as b in fig. 6 may be the home page of the smart travel APP and may include one or more of the following, for example: a scan control, a riding code control 602, a lift control, a control for adding other functions, or news 1 and news 2 corresponding to a today's-news function. Other contents displayed in the interface shown as a in fig. 6 are similar to those displayed in the interface shown as a in fig. 1 and are not described here.
Further, in the interface shown as b in fig. 6, when the mobile phone receives the user's operation triggering the riding code control 602, the mobile phone may display the interface shown as c in fig. 6. The interface shown as c in fig. 6 is similar to the interface shown as b in fig. 1 and is not described here.
It will be appreciated that the manner of opening the riding code may include other contents according to the actual scenario, which is not limited in the embodiment of the present application.
S402, the terminal device collects data corresponding to the user's wrist-turning action during riding code scanning, and inputs the data corresponding to the wrist-turning action into a neural network model for wrist-turning identification, so as to obtain a wrist-turning identification result.
In the embodiment of the present application, the data corresponding to the wrist turning action may include: acceleration data acquired by an acceleration sensor, angular acceleration data acquired by a gyro sensor, and the like.
By way of example, the acceleration sensor and the gyroscope sensor may collect data at a frequency of 100 hertz (Hz), which can be understood as 100 groups of data collected per second, one group every 10 milliseconds (ms). A group of data may include 3 acceleration values (for the x-axis, y-axis, and z-axis) and 3 angular acceleration values (for the x-axis, y-axis, and z-axis).
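The sampling scheme above can be sketched as follows. This is a minimal illustration: `read_group` stands in for whatever driver call returns one group of six readings, which is an assumption for illustration rather than a real sensor API.

```python
SAMPLE_RATE_HZ = 100   # one group of readings every 10 ms, as described above
GROUP_SIZE = 6         # 3 acceleration values + 3 angular acceleration values

def collect_window(read_group, seconds=1):
    """Collect `seconds` worth of sensor groups.

    `read_group` is a placeholder callable returning one
    (ax, ay, az, gx, gy, gz) group; real code would read the
    accelerometer and gyroscope drivers here.
    """
    return [read_group() for _ in range(SAMPLE_RATE_HZ * seconds)]

# One second of dummy data: 100 groups of 6 readings each.
window = collect_window(lambda: (0.0,) * GROUP_SIZE)
print(len(window))     # 100 groups in a 1-second window
print(len(window[0]))  # 6 values per group
```

A 1-second window of this shape is a natural input unit for the wrist-turning identification described below.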
In the embodiment of the present application, the terminal device may construct the neural network model for wrist-turning identification based on one of the following neural network models, for example: a recurrent neural network based on gated recurrent units (gated recurrent unit, GRU), a back-propagation neural network (back propagation network), a convolutional neural network (convolutional neural networks, CNN), a residual shrinkage network, a generative adversarial network (generative adversarial networks, GAN), a support vector machine (support vector machines, SVM), or a deep residual network (deep residual networks, DRN), etc.
Fig. 7 is a schematic flow chart of constructing a neural network model according to an embodiment of the present application.
As shown in fig. 7, for the model training process corresponding to the dashed box 701, the training process may include: acquiring training data, preprocessing, training a neural network model for wrist turning identification and the like. The preprocessing may be used to filter out the effects of high frequency noise, for example, the preprocessing may include mean filtering, etc.
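The mean-filtering preprocessing mentioned above can be illustrated with a simple moving-average filter that suppresses high-frequency noise in one sensor channel. The window size `k` and the sample values are illustrative assumptions, not parameters from the patent.

```python
def mean_filter(samples, k=5):
    """Moving-average filter over a 1-D signal to suppress high-frequency noise.

    Uses a centered window of nominal size `k`; near the edges, only the
    available neighbours are averaged.
    """
    out = []
    n = len(samples)
    for i in range(n):
        lo = max(0, i - k // 2)
        hi = min(n, i + k // 2 + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

# A rapidly alternating (high-frequency) signal is smoothed toward its mean.
noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
print(mean_filter(noisy, k=3))
```

The same filter would be applied per axis to the acceleration and angular acceleration channels before they are fed to the model.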
Specifically, the training data may include: wrist-turning sample data collected when the user is in a wrist-turning state at gates whose code scanning ports have different orientations, for example when the user turns the wrist so that the riding code faces the code scanning port of the gate during riding code scanning; and non-wrist-turning sample data collected when the user is in a non-wrist-turning state, for example a stationary standing state (e.g., the user stands still and opens the riding code) or a walking state (e.g., the user opens the riding code while walking). The differently oriented code scanning ports may include at least one of the following: a code scanning port in an upright state, in a horizontal state, in a side-tilted state, or in a backward-tilted state, etc. Corresponding to the differently oriented code scanning ports, the user's wrist-turning and code scanning action may place the terminal device in at least one of the following states: a portrait state, a tilted portrait state, a tilted landscape state, a state tilted toward the upper right, a state tilted toward the upper left, a state tilted with the screen facing up, a state tilted with the screen facing down, etc. For a horizontal code scanning port, the user's wrist-turning and code scanning action may be a state in which the user turns the wrist until the screen faces downward.
It can be understood that the training data may comprehensively consider factors including gates with different directions towards the code scanning ports, different code scanning actions in the code scanning process of the user, and the like, and the training data may also include other contents according to the actual scene, which is not limited in the embodiment of the present application.
Illustratively, one possible implementation of training the above neural network model based on the wrist-turning sample data and the non-wrist-turning sample data is as follows: the wrist-turning sample data of the user in the wrist-turning state and the non-wrist-turning sample data in the non-wrist-turning state are input into the neural network model to be trained, and the model outputs a predicted wrist-turning condition; a loss function is used to compare the difference between the predicted wrist-turning condition and the real wrist-turning condition, for example by calculating the recall rate or the false recognition rate of the predicted wrist-turning condition. When the predicted wrist-turning condition output by the model and the real wrist-turning condition do not satisfy the loss function, the model parameters are adjusted and training continues; when the difference between the predicted wrist-turning condition and the real wrist-turning condition output by the model satisfies the loss function, model training ends, and the neural network model for wrist-turning identification is obtained. The terminal device can then identify whether the user is in the wrist-turning state based on the detected acceleration data and angular acceleration data.
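The predict-compare-adjust loop described above can be sketched with a minimal stand-in classifier. This is not the patent's neural network: a two-weight logistic model on assumed toy features (1 = wrist-turn sample, 0 = non-wrist-turn sample) is used only to show how parameters are adjusted until the prediction matches the real wrist-turning condition.

```python
import math

def predict(w, b, x):
    """Logistic prediction: probability that `x` is a wrist-turn sample."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

def train_step(w, b, x, label, lr=0.1):
    """One predict-compare-adjust step of the training loop described above."""
    p = predict(w, b, x)
    grad = p - label  # dLoss/dz for cross-entropy loss: prediction vs. truth
    w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
    return w, b - lr * grad

# Assumed toy features: large motion magnitudes -> wrist turn (1),
# small magnitudes -> non-wrist-turn (0). Real inputs would be sensor windows.
data = [([3.0, 2.5], 1), ([0.1, 0.2], 0)] * 50
w, b = [0.0, 0.0], 0.0
for x, y in data:
    w, b = train_step(w, b, x, y)

print(predict(w, b, [3.0, 2.5]) > 0.5)   # recognised as wrist turn
print(predict(w, b, [0.1, 0.2]) < 0.5)   # recognised as non-wrist-turn
```

In the patent's setting, a GRU or CNN over the preprocessed sensor windows would replace this toy model, but the loop of predicting, measuring the loss, and adjusting parameters is the same.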
As shown in fig. 7, for the wrist-turning recognition process, the wrist-turning recognition process may include: acquiring real-time data, preprocessing, performing wrist turning recognition based on a neural network model, outputting a recognition result and the like.
It will be appreciated that the wrist-turning identification process may be implemented in the intelligent sensor hub of the terminal device, or may be implemented in a server. For example, the terminal device may upload the acceleration data and the angular acceleration data collected in the step shown in S402 to a server; the server performs the identification of the acceleration data and the angular acceleration data using the neural network model in the step shown in S402, and then sends the wrist-turning identification result to the terminal device. It can be understood that placing the wrist-turning identification process in the server can effectively reduce the running memory and power consumption of the terminal device.
In a possible implementation manner, the wrist turning identification result may be information whether the wrist turning action is satisfied, or the wrist turning identification result may also be information for scoring the wrist turning, and in the embodiment of the present application, the specific form of the wrist turning identification result is not specifically limited.
S403, when the terminal equipment receives the code scanning operation of the user, the terminal equipment can calculate the time interval from wrist turning to code scanning.
In the embodiments of the present application, the wrist-turning operation can be understood as the operation performed by the user to aim the riding code at the code-scanning port of the gate after opening the code; the code-scanning time can be understood as the time at which code scanning completes; and the time interval from wrist turning to code scanning can be understood as the interval between the moment the user turns the wrist to aim the opened riding code at the code-scanning port of the gate and the moment the gate finishes scanning the riding code.
For example, after the user opens the riding code and aims it at the code-scanning port of the gate, the terminal device may detect the wrist-turning action based on the neural network model in the step shown in S402; for instance, the terminal device may detect, based on the data generated when the user aims the riding code at the code-scanning port, that a wrist-turning action occurred at 08:30. Further, when the user moves the riding code of the terminal device close enough to the code-scanning port, the gate can scan the riding code and the terminal device can detect that scanning is complete; if scanning is detected as complete at 08:33, the terminal device can calculate that the time interval from wrist turning to code scanning is 3 seconds.
S404, the terminal equipment determines whether to exit the riding code interface based on the wrist turning identification result and the time interval from wrist turning to code scanning.
In one implementation, when the terminal device determines that the wrist-turning recognition result satisfies the wrist-turning state (or that the wrist-turning score exceeds the scoring threshold), and the time interval from wrist turning to code scanning is smaller than (or smaller than or equal to) the time threshold, the terminal device can exit the riding code interface. The time threshold may be, for example, 5 seconds.
The wrist-turning recognition result or wrist-turning score can be understood as the output of the neural network model in the step shown in S402 when it recognizes the data corresponding to the wrist-turning action during code scanning. For example, the wrist-turning recognition result may be 0 (which may be understood as not satisfying the wrist-turning action) or 1 (satisfying the wrist-turning action): when the terminal device obtains a result of 1 it may determine that the wrist-turning action is detected, and when it obtains a result of 0 it may determine that the wrist-turning action is not detected. Alternatively, the wrist-turning score may be a numerical value: for example, with a scoring threshold of 90, a score of 95 means the terminal device may determine that the wrist-turning action is detected, while a score of 75 means it is not.
In another implementation, the terminal device can continue to display the riding code interface (that is, not exit it) when the wrist-turning recognition result does not satisfy the wrist-turning state (or the wrist-turning score is smaller than or equal to the scoring threshold); or when the time interval from wrist turning to code scanning is greater than or equal to (or greater than) the time threshold; or when both of these conditions hold.
The time interval from wrist turning to code scanning can be used to determine whether the user actually scanned the code. For example, when the interval is greater than or equal to (or greater than) the time threshold, it can be understood that the user may have interrupted scanning for various reasons, and the riding code interface need not be exited, which makes the next scan convenient. For instance, if a call comes in while the riding code is facing the code-scanning port of the gate after the wrist turn, the terminal device detects that the interval is greater than or equal to (or greater than) the time threshold and does not exit the riding code interface, so that after answering or hanging up, the user can continue to scan and ride based on that interface. For another example, if the user realizes that the wrong riding code may have been opened while aiming it at the code-scanning port, the user can turn the wrist back, face the code toward themselves, and switch to the correct code; the terminal device again detects that the interval exceeds the time threshold and does not exit the riding code interface, so that the user can switch to the correct riding code within it.
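The exit decision of S404 can be summarized in a short sketch. The function name and the concrete threshold values are illustrative (the 5-second and 90-point thresholds are the examples from the text), not the patent's actual implementation.

```python
# Illustrative sketch of the S404 exit decision; thresholds are assumptions
# taken from the examples in the text.
TIME_THRESHOLD_S = 5.0   # example time threshold from the text
SCORE_THRESHOLD = 90     # example scoring threshold from the text

def should_exit_ride_code(wrist_score, wrist_time, scan_time):
    """Exit only if a wrist turn was recognized AND scanning followed quickly."""
    if wrist_score <= SCORE_THRESHOLD:     # wrist-turning state not satisfied
        return False
    interval = scan_time - wrist_time      # time from wrist turn to scan
    return interval < TIME_THRESHOLD_S     # interrupted scans keep the page
```

With the numbers from the examples above, a score of 95 and a 3-second interval exits the interface, while a 75-point score or an interval above the threshold keeps the riding code displayed.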
Fig. 8 is an interface schematic diagram of exiting a riding code according to an embodiment of the present application. In the embodiment corresponding to fig. 8, the terminal device is described by taking a mobile phone as an example, which does not limit the embodiments of the present application.
The user can scan the code based on the interface shown as a in fig. 8. When the mobile phone determines to exit the riding code interface based on the user's wrist-turning recognition result and the time interval from wrist turning to code scanning, the mobile phone can display the interface shown as b in fig. 8, which may be the home page of the smart trip APP; or, when the mobile phone determines not to exit the riding code interface, it can continue to display the interface shown as a in fig. 8 (or the interface shown as c in fig. 1). The content displayed in the interface shown as a in fig. 8 is similar to that shown as c in fig. 6, and the content shown as b in fig. 8 is similar to that shown as b in fig. 6, and will not be described again.
In a possible implementation manner, when the mobile phone opened the riding code from the functional component corresponding to the smart trip APP on the desktop according to the embodiment corresponding to fig. 1, and the mobile phone determines to exit the riding code interface based on the user's wrist-turning recognition result and the time interval from wrist turning to code scanning, the mobile phone may display the interface including that functional component.

In a possible implementation manner, when the mobile phone opened the riding code from the desktop according to the embodiment corresponding to fig. 5, and the mobile phone determines to exit the riding code interface based on the user's wrist-turning recognition result and the time interval from wrist turning to code scanning, the mobile phone may display the desktop.
In a possible implementation manner, when the user needs to ride while using another APP, for example a video APP, the user can open the smart trip APP from the background or from the desktop in time, open the riding code interface, and scan the code to ride; when the mobile phone determines to exit the riding code interface based on the user's wrist-turning recognition result and the time interval from wrist turning to code scanning, the mobile phone can display the interface corresponding to the video APP.
Fig. 9 is an interface schematic diagram of another exit from a riding code according to an embodiment of the present application. In the embodiment corresponding to fig. 9, the terminal device is described by taking a mobile phone as an example, which does not limit the embodiments of the present application.
When the user needs to ride and scan a code while watching a video in the video APP interface shown as a in fig. 9, the mobile phone can receive the user's operation of opening the riding code interface in the smart trip APP from the background multitasking interface or from the desktop, and then display the interface shown as b in fig. 9. Further, the user can scan the code based on the riding code interface shown as b in fig. 9, and when the mobile phone determines to exit the riding code interface based on the user's wrist-turning recognition result and the time interval from wrist turning to code scanning, the mobile phone can display the interface corresponding to the video APP shown as a in fig. 9. The interface shown as a in fig. 9 may include video content, as well as content recommended by the video APP such as video 902, video 903, video 904, and video 905; the interface shown as b in fig. 9 is similar to that shown as c in fig. 6, and will not be described again.
Furthermore, the mobile phone can also determine whether to exit the riding code interface based on the user's wrist-turning recognition result, the time interval from wrist turning to code scanning, and the time since the user switched from the video APP to the riding code interface. For example, when the mobile phone determines that the wrist-turning recognition result satisfies the wrist-turning state, the time interval from wrist turning to code scanning is smaller than (or smaller than or equal to) the time threshold, and the time since switching from the video APP to the riding code interface is smaller than (or smaller than or equal to) another time threshold, the mobile phone may display the interface corresponding to the video APP.
In a possible implementation manner, when the user uses other functions in the smart trip APP, for example needs to ride while reading news, the user can quickly switch to the riding code interface and scan the code to ride; when the mobile phone determines to exit the riding code interface based on the user's wrist-turning recognition result and the time interval from wrist turning to code scanning, the mobile phone can display the news interface.
Fig. 10 is an interface schematic diagram of another exit from a riding code according to an embodiment of the present application. In the embodiment corresponding to fig. 10, the terminal device is described by taking a mobile phone as an example, which does not limit the embodiments of the present application.
When the user needs to ride while viewing news in the smart travel APP as shown as a in fig. 10, the mobile phone can receive the user's operation of leaving the news interface and opening the riding code interface, and then display the interface shown as b in fig. 10. Further, the user can scan the code based on the riding code interface shown as b in fig. 10, and when the mobile phone determines to exit the riding code interface based on the user's wrist-turning recognition result and the time interval from wrist turning to code scanning, the mobile phone can display the news interface shown as a in fig. 10. The interface shown as a in fig. 10 may include content corresponding to news 1; the interface shown as b in fig. 10 is similar to that shown as c in fig. 6, and will not be described again.
Furthermore, the mobile phone can also determine whether to exit the riding code interface based on the user's wrist-turning recognition result, the time interval from wrist turning to code scanning, and the time since the user quickly switched from the news page in the smart trip APP to the riding code interface. For example, when the mobile phone determines that the wrist-turning recognition result satisfies the wrist-turning state, the time interval from wrist turning to code scanning is smaller than (or smaller than or equal to) the time threshold, and the time since switching from the news interface in the smart trip APP to the riding code interface is smaller than (or smaller than or equal to) another time threshold, the mobile phone can display the news interface.
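The extended decision described for figs. 9 and 10 (also considering how recently the user switched from a previous interface) can be sketched as below. The function, interface names, and the value of the "another time threshold" are illustrative assumptions.

```python
# Sketch of the extended exit decision for figs. 9-10; names and the
# switch-time threshold are assumptions, not values from the patent.
SCAN_THRESHOLD_S = 5.0      # example time threshold from the text
SWITCH_THRESHOLD_S = 60.0   # "another time threshold" - assumed value

def interface_after_scan(wrist_ok, scan_interval, switch_interval,
                         previous_interface, home_interface):
    if not (wrist_ok and scan_interval < SCAN_THRESHOLD_S):
        return "ride_code"            # stay on the riding code interface
    if switch_interval < SWITCH_THRESHOLD_S:
        return previous_interface     # e.g. the video APP or news page
    return home_interface             # e.g. the smart trip APP home page
```

For example, a recognized wrist turn with a 3-second scan interval shortly after switching from the video APP returns the user to the video APP, while the same scan long after switching falls back to the home page.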
It can be understood that the interface displayed when the vehicle-taking-out code interface exits may include other contents according to the actual scene, which is not limited in the embodiment of the present application.
Based on this method, the terminal device can recognize the user's wrist-turning action during code scanning and, based on the wrist-turning action and the time from wrist turning to code scanning, quickly exit the riding code interface after the user finishes scanning, thereby enhancing the flexibility of exiting the riding code.
In a possible implementation manner based on the embodiment corresponding to fig. 4, when the terminal device determines to exit the riding code interface based on the wrist turning action and the time from the wrist turning to the code scanning completion of the user, the terminal device may further display a prompt message, where the prompt message is used to prompt the user to exit the riding code interface currently.
Fig. 11 is a schematic diagram of an interface for displaying prompt information according to an embodiment of the present application. In the embodiment corresponding to fig. 11, the terminal device is described by taking a mobile phone as an example, which does not limit the embodiments of the present application.
When the mobile phone determines to exit the riding code interface based on the user's wrist-turning action and the time from wrist turning to code-scanning completion, the mobile phone may display the interface shown in fig. 11, which may include prompt information 1101, for example: "A wrist-turning action is detected; the riding code has been exited for you." The specific form of the prompt information is not limited in the embodiments of the present application.
Based on the method, the user can timely perceive the state change of the terminal equipment according to the prompt information, and the prompt information can enhance the intelligence of the terminal equipment.
On the basis of the embodiment corresponding to fig. 4, in a possible implementation manner, the terminal device may include an APP layer, a framework layer, a sensor hub layer, a hardware layer, and the like. The APP layer may include the riding code APP and the smart trip APP, the framework layer may include a gesture service module, the sensor hub layer may include the wrist-turning recognition algorithm, and the hardware layer may include an acceleration sensor, a gyroscope sensor, and the like. It will be appreciated that the acceleration sensor and the gyroscope sensor may correspond to two modules; for brevity, they are described as one module in the embodiments of the present application.
Fig. 12 is a schematic flow chart of another method for exiting two-dimensional codes according to an embodiment of the present application. As shown in fig. 12, the method for exiting the two-dimensional code may include the following steps:
in the embodiment of the present application, the user may open the riding code through the embodiment corresponding to fig. 1 (or the embodiment corresponding to fig. 5 or fig. 6), so that the riding code APP in the terminal device may execute the step shown in S1201.
S1201, when the riding code APP receives the user's operation of opening the riding code in the smart travel APP, the smart travel APP may obtain, from the riding code APP, a broadcast indicating that the user has opened the riding code.
S1202, the gesture service module can obtain an indication message for monitoring the wrist turning action from the intelligent trip APP.
In a possible implementation manner, when the terminal device does not have the smart travel APP, or in other words when the riding code APP is used as an independent APP that provides the riding code for the terminal device, the riding code APP may itself issue the instruction to monitor the wrist-turning action upon detecting that the user opens the riding code. For example, in S1202 the gesture service module may then obtain the indication message for monitoring the wrist-turning action from the riding code APP.
And S1203, the wrist turning recognition algorithm can obtain an indication message for starting wrist turning recognition from the intelligent trip APP.
It will be appreciated that the steps shown in S1202 and S1203 may be performed simultaneously when the smart trip APP detects that the user opens the ride code.
S1204, the gesture service module may set a monitor for monitoring the wrist turning motion.
In the embodiments of the present application, the monitor can be used to monitor whether information indicating a recognized wrist-turning action (or a wrist-turning score) is received. The gesture service module may receive various messages, and may include various listeners for listening to different types of messages. For example, when the monitor in the gesture service module receives a message indicating that the wrist-turning action is recognized (or not recognized, or a wrist-turning score), the gesture service module may report that message to the smart trip APP in time. The message indicating whether the wrist-turning action is recognized may be 0 (the wrist-turning action is not satisfied) or 1 (the wrist-turning action is satisfied), and the wrist-turning score may be a numerical value.
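The listener mechanism above can be sketched as a simple publish/subscribe pattern. Class and method names here are illustrative, not the actual framework API of the terminal device.

```python
# Minimal sketch of the gesture service's listener mechanism; all names are
# illustrative assumptions.
class WristTurnListener:
    """Listens for wrist-turning recognition messages (0, 1, or a score)."""
    def __init__(self, on_result):
        self.on_result = on_result   # callback into the smart trip APP

    def on_message(self, result):
        self.on_result(result)       # report the result upward in time

class GestureService:
    """Gesture service module: holds listeners, dispatches messages."""
    def __init__(self):
        self.listeners = []

    def register(self, listener):    # S1204: set a monitor
        self.listeners.append(listener)

    def dispatch(self, result):      # called when the algorithm reports (S1207)
        for listener in self.listeners:
            listener.on_message(result)

received = []
svc = GestureService()
svc.register(WristTurnListener(received.append))
svc.dispatch(1)   # 1 = wrist-turning action recognized
```

Here `received` ends up holding `[1]`, standing in for the smart trip APP receiving the recognition result in S1208.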
In the embodiments of the present application, when the user needs to ride, the wrist-turning action may be performed, so the wrist-turning recognition algorithm may perform the step shown in S1205. The wrist-turning action serves to make the riding code displayed by the terminal device face the code-scanning port of the gate.
It is understood that the wrist turning action of the user may be performed at any step after S1201 to before S1205, which is not limited in the embodiment of the present application.
S1205, the wrist-turning recognition algorithm acquires acceleration data from the acceleration sensor and angular acceleration data from the gyroscope sensor.

S1206, the wrist-turning recognition algorithm recognizes the wrist-turning action based on the acceleration data and the angular acceleration data, and obtains a wrist-turning recognition result.

For the process by which the wrist-turning recognition algorithm performs recognition based on the acceleration data and the angular acceleration data, reference may be made to the process by which the terminal device obtains the wrist-turning recognition result based on the neural network model in the step shown in S402, which is not described again here.
S1207, the gesture service module may obtain a message indicating the wrist turning recognition result from the wrist turning recognition algorithm.
S1208, the intelligent trip APP can obtain a message for indicating the wrist turning recognition result from the gesture service module.
In the embodiment of the application, after the user performs the wrist turning action, the user can approach the riding code to the code scanning port of the gate and scan the code to get in, so that the smart trip APP can execute the step shown in S1209.
It is to be understood that the code scanning action of the user may be performed after the wrist turning action, for example, may be performed at any step after S1201-before S1209, which is not limited in the embodiment of the present application.
S1209, the smart travel APP determines whether to exit the riding code interface based on the wrist-turning recognition result and the time interval from wrist turning to completion of code scanning.
In the embodiments of the present application, the smart travel APP can detect the code-scanning completion time based on a change in the riding code page: the riding code in the page may be updated at the moment scanning completes. For example, if the code in the riding code interface is a first code, then when the user faces the first code toward the code-scanning port of the gate and scanning completes, the code in the interface may switch from the first code to a second code; the smart trip APP can detect this page change, record the moment it occurs, and treat that moment as the code-scanning completion time.
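Detecting scan completion from the page change can be sketched as follows. The class, method names, and time source are illustrative assumptions; the patent only specifies that the moment the code switches is treated as the completion time.

```python
# Sketch of inferring the code-scanning completion time from a page change;
# names and the time source are illustrative assumptions.
import time

class RideCodePage:
    def __init__(self, code):
        self.code = code                 # the "first code" currently shown
        self.scan_completed_at = None    # code-scanning completion time

    def update_code(self, new_code, now=None):
        """Called whenever the riding code page refreshes its code."""
        if new_code != self.code:        # page changed: code rotated
            self.scan_completed_at = now if now is not None else time.time()
            self.code = new_code

page = RideCodePage("first-code")
page.update_code("second-code", now=100.0)  # gate scanned; code switches
```

After the update, `page.scan_completed_at` holds the moment of the change, which the exit decision in S1209 can compare against the wrist-turning moment.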
For example, the process of determining whether to exit the riding code interface by the smart trip APP based on the wrist turning recognition result and the time from the wrist turning to the code scanning can be referred to the process of determining whether to exit the riding code interface by the terminal device in the step shown in S404, which is not described herein.
Based on this method, through data interaction among the modules in the device, the terminal device can recognize the user's wrist-turning action during code scanning and, based on the wrist-turning action and the time from wrist turning to code scanning, quickly exit the riding code interface after the user finishes scanning, thereby enhancing the flexibility of exiting the riding code.
It may be understood that the interface of the terminal device provided in the embodiment of the present application is only used as an example, and is not limited to the embodiment of the present application.
The method provided by the embodiment of the present application is described above with reference to fig. 4 to 12, and the device for performing the method provided by the embodiment of the present application is described below. As shown in fig. 13, fig. 13 is a schematic structural diagram of a device for exiting two-dimensional codes according to an embodiment of the present application, where the device for exiting two-dimensional codes may be a terminal device in an embodiment of the present application, or may be a chip or a chip system in the terminal device.
As shown in fig. 13, the device 130 for exiting the two-dimensional code may be used in a communication device, a circuit, a hardware component, or a chip, where the device for exiting the two-dimensional code includes: a display unit 1301, a processing unit 1302. The display unit 1301 is configured to support a step of exiting display performed by the two-dimensional code method; the processing unit 1302 is configured to support a device that exits the two-dimensional code to perform the step of information processing.
Specifically, the embodiments of the present application provide a device 130 for exiting a two-dimensional code: the display unit 1301 is configured to display a first interface, where the first interface includes a first two-dimensional code; the processing unit 1302 is configured to obtain first data during the wrist turn, where the first data includes acceleration data and angular acceleration data, the acceleration data being acquired by an acceleration sensor and the angular acceleration data by a gyroscope sensor; and when the first data indicates that the wrist-turning action occurs, the display unit 1301 is further configured to exit from displaying the first interface.
In one possible implementation manner, when the first data indicates that the wrist-turning action occurs, and the time interval from the occurrence of the wrist-turning action to the scanning of the first two-dimensional code is less than a first preset threshold, the display unit 1301 is specifically configured to exit from displaying the first interface.
In one possible implementation, the processing unit 1302 is specifically configured to receive a first operation of opening a first application in a desktop; the first application is used for providing a first two-dimensional code for the terminal equipment; a display unit 1301, in response to a first operation, specifically configured to display a home page of a first application; the first control for opening the first two-dimensional code is included in the home page of the first application; the processing unit 1302 is further specifically configured to receive a second operation for the first control; the display unit 1301 is also specifically configured to display the first interface in response to the second operation.
In one possible implementation, the display unit 1301 is specifically configured to display a home page of the first application.
In one possible implementation, the processing unit 1302 is specifically configured to receive a third operation for the first application in the desktop; the first application is used for providing a first two-dimensional code for the terminal equipment; a display unit 1301, configured to display an interface including a functional component of the first application; the functional component comprises a second control for opening the first two-dimensional code; the processing unit 1302 is further specifically configured to receive a fourth operation for the second control; the display unit 1301 is also specifically configured to display the first interface in response to the fourth operation.
In one possible implementation, the display unit 1301 is specifically configured to display an interface including a functional component of the first application.
In one possible implementation, the processing unit 1302 is specifically configured to receive a fifth operation for the third control in the desktop; the third control is used for opening the first two-dimensional code; in response to the fifth operation, the display unit 1301 is specifically configured to display the first interface.
In one possible implementation, the display unit 1301 is specifically configured to display a desktop interface.
In one possible implementation, the display unit 1301 is specifically configured to display the second interface; the second interface is an interface operated in the second application; the processing unit 1302 is specifically configured to receive a sixth operation of switching from the second interface to the first application and opening the first interface in the first application; the first application is used for providing a first two-dimensional code for the terminal equipment; in response to the sixth operation, the display unit 1301 is also specifically configured to display the first interface.
In one possible implementation, the display unit 1301 is specifically configured to display the second interface.
In one possible implementation manner, when the first data indicates that the wrist turning action occurs and the time interval between switching from the second interface to the first interface is smaller than the second preset threshold, the display unit 1301 is specifically configured to display the second interface.
In one possible implementation, the display unit 1301 is specifically configured to display a third interface; the third interface is an interface running in the first application; the first application is used for providing a first two-dimensional code for the terminal equipment; a processing unit 1302, specifically configured to receive a seventh operation of switching from the third interface to the first interface; in response to the seventh operation, the display unit 1301 is specifically configured to display the first interface.
In one possible implementation, the display unit 1301 is specifically configured to display the third interface.
In one possible implementation manner, when the first data indicates that the wrist turning action occurs and the time interval between switching from the third interface to the first interface is smaller than the third preset threshold, the display unit 1301 is specifically configured to display the third interface.
In one possible implementation manner, the wrist turning action is obtained by the terminal equipment identifying the first data by using a neural network model; the neural network model is obtained by training the terminal equipment based on second data, the second data comprises acceleration sample data and angular acceleration sample data, and the second data is related to one or more of the following states: the state of the terminal equipment or the state of the code scanning port.
In one possible implementation, the state of the code scanning port includes at least one of the following: the code scanning port is in an upright state, the code scanning port is in a horizontal state, the code scanning port is in a side tilting state or the code scanning port is in a backward tilting state.
In one possible implementation, the state of the terminal device includes at least one of: the terminal equipment is in a vertical screen state, the terminal equipment is in a vertical screen inclined state, the terminal equipment is in a horizontal screen inclined state, the terminal equipment is in an upper right inclined state, the terminal equipment is in an upper left inclined state, the terminal equipment is in an upward inclined state, the terminal equipment is in a downward inclined state or the terminal equipment is in a horizontal downward state.
In one possible implementation manner, when the first data indicates that the wrist-turning action does not occur, and/or the time interval from the occurrence of the wrist-turning action to the scanning of the first two-dimensional code is greater than or equal to the first preset threshold, the display unit 1301 is further configured to continue displaying the first interface.
In a possible embodiment, the device for exiting the two-dimensional code may further include: a storage unit 1304. The processing unit 1302 and the storage unit 1304 are connected by a line.
The memory unit 1304 may include one or more memories, which may be one or more devices, circuits, or means for storing programs or data.
The storage unit 1304 may exist independently and be connected to the processing unit 1302 of the device for exiting the two-dimensional code through a communication line. The memory unit 1304 may also be integrated with the processing unit 1302.
In a possible embodiment, the device for exiting the two-dimensional code may further include a communication unit 1303. The communication unit 1303 may be an input or output interface, a pin, a circuit, or the like.
The storage unit 1304 may store computer-executable instructions of a method in the terminal device to cause the processing unit 1302 to perform the method in the above-described embodiment.
The storage unit 1304 may be a register, a cache, a RAM, or the like, in which case it may be integrated with the processing unit 1302. Alternatively, the storage unit 1304 may be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, in which case it may be independent of the processing unit 1302.
Fig. 14 is a schematic diagram of a hardware structure of a control device according to an embodiment of the present application. As shown in fig. 14, the control device includes a processor 1401, a communication line 1404, and at least one communication interface (communication interface 1403 is used as an example in fig. 14).
The processor 1401 may be a general purpose central processing unit (central processing unit, CPU), microprocessor, application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the present application.
Communication line 1404 may include circuitry for communicating information between the components described above.
Communication interface 1403 uses any apparatus such as a transceiver for communicating with other devices or a communication network, such as Ethernet or a wireless local area network (WLAN).
Possibly, the control device may also comprise a memory 1402.
Memory 1402 may be, but is not limited to, a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor via communication line 1404. The memory may also be integrated with the processor.
The memory 1402 is configured to store computer-executable instructions for executing the solutions of the present application, and execution is controlled by the processor 1401. The processor 1401 is configured to execute the computer-executable instructions stored in the memory 1402, thereby implementing the method for exiting the two-dimensional code provided by the embodiments of the present application.
Possibly, the computer-executable instructions in the embodiments of the present application may also be referred to as application program codes, which are not limited in particular.
In a particular implementation, processor 1401 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 14, as an example.
In a specific implementation, as an embodiment, the control device may include a plurality of processors, such as processor 1401 and processor 1405 in fig. 14. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
Fig. 15 is a schematic structural diagram of a chip according to an embodiment of the present application. The chip 150 includes one or more (including two) processors 1520 and a communication interface 1530.
In some implementations, memory 1540 stores the following elements: executable modules or data structures, or a subset thereof, or an extended set thereof.
In an embodiment of the application, memory 1540 may include read-only memory and random access memory and provide instructions and data to processor 1520. A portion of memory 1540 may also include non-volatile random access memory (non-volatile random access memory, NVRAM).
In an embodiment of the application, the processor 1520, the communication interface 1530, and the memory 1540 are coupled together by a bus system 1510. The bus system 1510 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For ease of description, the various buses are labeled as bus system 1510 in fig. 15.
The methods described above for embodiments of the present application may be applied to the processor 1520 or implemented by the processor 1520. The processor 1520 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the methods described above may be completed by integrated logic circuitry in hardware or by software instructions in the processor 1520. The processor 1520 may be a general-purpose processor (e.g., a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and the processor 1520 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application.
The steps of the methods disclosed in connection with the embodiments of the present application may be directly executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a mature storage medium in the art, such as a random access memory, a read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 1540, and the processor 1520 reads the information in the memory 1540 and completes the steps of the above methods in combination with its hardware.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center integrating one or more available media; the available media may include, for example, magnetic media, optical media, or semiconductor media (e.g., a solid state disk (SSD)).
The embodiment of the application also provides a computer readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
As one possible design, the computer-readable medium may include a compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disc storage; the computer-readable medium may include magnetic disk storage or other magnetic storage devices. Moreover, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or the wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The foregoing is merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or substitution readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (9)

1. A method for exiting a two-dimensional code, applied to a terminal device, wherein the terminal device comprises an acceleration sensor and a gyroscope sensor, the method comprising:
the terminal device receives a fifth operation for a third control in a desktop, wherein the third control is used for opening a first two-dimensional code;
in response to the fifth operation, the terminal device displays a first interface, wherein the first interface comprises the first two-dimensional code;
the terminal device acquires first data in a wrist-turning process, wherein the first data includes acceleration data acquired by the acceleration sensor and angular acceleration data acquired by the gyroscope sensor;
when the first data indicates that a wrist-turning action occurs and a time interval from the occurrence of the wrist-turning action to the scanning of the first two-dimensional code is smaller than a first preset threshold, the terminal device displays a desktop interface;
wherein the wrist-turning action is obtained by the terminal device identifying the first data using a neural network model; the neural network model is obtained by the terminal device through training based on wrist-turning sample data collected when a user turns a ride code toward a code scanning port of a gate, the gate comprising code scanning ports with different orientations, and non-wrist-turning sample data collected when the user is in a non-wrist-turning state; the wrist-turning sample data is related to the state of the terminal device and/or the state of the code scanning port, and the non-wrist-turning sample data is related to the state of the terminal device; the code scanning ports with different orientations include at least one of the following: a code scanning port in an upright state, a code scanning port in a horizontal state, a code scanning port in a side-tilted state, or a code scanning port in a backward-tilted state.
2. A method for exiting a two-dimensional code, applied to a terminal device, wherein the terminal device comprises an acceleration sensor and a gyroscope sensor, the method comprising:
the terminal device displays a second interface, wherein the second interface is an interface operated in a second application, the second application is a video application, and the second interface comprises video content;
the terminal device receives a sixth operation of switching from the second interface to a first application and opening a first interface in the first application, wherein the first application is configured to provide a first two-dimensional code for the terminal device;
in response to the sixth operation, the terminal device displays the first interface, wherein the first interface comprises the first two-dimensional code;
the terminal device acquires first data in a wrist-turning process, wherein the first data includes acceleration data acquired by the acceleration sensor and angular acceleration data acquired by the gyroscope sensor;
when the first data indicates that a wrist-turning action occurs, a time interval from the occurrence of the wrist-turning action to the scanning of the first two-dimensional code is smaller than a first preset threshold, and a time interval of switching from the second interface to the first interface is smaller than a second preset threshold, the terminal device displays the second interface;
wherein the wrist-turning action is obtained by the terminal device identifying the first data using a neural network model; the neural network model is obtained by the terminal device through training based on wrist-turning sample data collected when a user turns a ride code toward a code scanning port of a gate, the gate comprising code scanning ports with different orientations, and non-wrist-turning sample data collected when the user is in a non-wrist-turning state; the wrist-turning sample data is related to the state of the terminal device and/or the state of the code scanning port, and the non-wrist-turning sample data is related to the state of the terminal device; the code scanning ports with different orientations include at least one of the following: a code scanning port in an upright state, a code scanning port in a horizontal state, a code scanning port in a side-tilted state, or a code scanning port in a backward-tilted state.
3. The method according to claim 1 or 2, wherein the state of the terminal device includes at least one of the following: the terminal device is in a portrait state, a tilted portrait state, a tilted landscape state, an upper-right tilted state, an upper-left tilted state, an upward tilted state, a downward tilted state, or a horizontal face-down state.
4. The method according to claim 1 or 2, further comprising:
when the first data indicates that no wrist-turning action occurs, and/or the time interval from the occurrence of the wrist-turning action to the scanning of the first two-dimensional code is greater than or equal to the first preset threshold, the terminal device displays the first interface.
5. A device for exiting a two-dimensional code, wherein a terminal device comprises an acceleration sensor and a gyroscope sensor, the device comprising:
a display unit, configured to receive a fifth operation for a third control in a desktop, wherein the third control is used for opening a first two-dimensional code; and display a first interface in response to the fifth operation, wherein the first interface comprises the first two-dimensional code;
a processing unit, configured to acquire first data in a wrist-turning process, wherein the first data includes acceleration data acquired by the acceleration sensor and angular acceleration data acquired by the gyroscope sensor;
wherein, when the first data indicates that a wrist-turning action occurs and a time interval from the occurrence of the wrist-turning action to the scanning of the first two-dimensional code is smaller than a first preset threshold, the display unit is further configured to display a desktop interface;
the wrist-turning action is obtained by the terminal device identifying the first data using a neural network model; the neural network model is obtained by the terminal device through training based on wrist-turning sample data collected when a user turns a ride code toward a code scanning port of a gate, the gate comprising code scanning ports with different orientations, and non-wrist-turning sample data collected when the user is in a non-wrist-turning state; the wrist-turning sample data is related to the state of the terminal device and/or the state of the code scanning port, and the non-wrist-turning sample data is related to the state of the terminal device; the code scanning ports with different orientations include at least one of the following: a code scanning port in an upright state, a code scanning port in a horizontal state, a code scanning port in a side-tilted state, or a code scanning port in a backward-tilted state.
6. A device for exiting a two-dimensional code, wherein a terminal device comprises an acceleration sensor and a gyroscope sensor, the device comprising:
a display unit, configured to display a second interface, wherein the second interface is an interface operated in a second application, the second application is a video application, and the second interface comprises video content; receive a sixth operation of switching from the second interface to a first application and opening a first interface in the first application, wherein the first application is configured to provide a first two-dimensional code for the terminal device; and display the first interface in response to the sixth operation, wherein the first interface comprises the first two-dimensional code;
a processing unit, configured to acquire first data in a wrist-turning process, wherein the first data includes acceleration data acquired by the acceleration sensor and angular acceleration data acquired by the gyroscope sensor;
wherein, when the first data indicates that a wrist-turning action occurs, a time interval from the occurrence of the wrist-turning action to the scanning of the first two-dimensional code is smaller than a first preset threshold, and a time interval of switching from the second interface to the first interface is smaller than a second preset threshold, the display unit is further configured to display the second interface;
the wrist-turning action is obtained by the terminal device identifying the first data using a neural network model; the neural network model is obtained by the terminal device through training based on wrist-turning sample data collected when a user turns a ride code toward a code scanning port of a gate, the gate comprising code scanning ports with different orientations, and non-wrist-turning sample data collected when the user is in a non-wrist-turning state; the wrist-turning sample data is related to the state of the terminal device and/or the state of the code scanning port, and the non-wrist-turning sample data is related to the state of the terminal device; the code scanning ports with different orientations include at least one of the following: a code scanning port in an upright state, a code scanning port in a horizontal state, a code scanning port in a side-tilted state, or a code scanning port in a backward-tilted state.
7. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, causes the electronic device to perform the method of any one of claims 1 to 5.
8. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, causes a computer to perform the method of any one of claims 1 to 5.
9. A computer program product comprising a computer program which, when run, causes a computer to perform the method of any one of claims 1 to 5.
CN202111136786.6A 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code Active CN115016712B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202311129985.3A CN117453105A (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code
CN202111136786.6A CN115016712B (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code
PCT/CN2022/118319 WO2023045789A1 (en) 2021-09-27 2022-09-13 Method and apparatus for exiting quick-response code

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111136786.6A CN115016712B (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311129985.3A Division CN117453105A (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code

Publications (2)

Publication Number Publication Date
CN115016712A CN115016712A (en) 2022-09-06
CN115016712B true CN115016712B (en) 2024-05-14

Family

ID=83064915

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311129985.3A Pending CN117453105A (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code
CN202111136786.6A Active CN115016712B (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202311129985.3A Pending CN117453105A (en) 2021-09-27 2021-09-27 Method and device for exiting two-dimensional code

Country Status (2)

Country Link
CN (2) CN117453105A (en)
WO (1) WO2023045789A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117453105A (en) * 2021-09-27 2024-01-26 荣耀终端有限公司 Method and device for exiting two-dimensional code

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104090649A (en) * 2014-05-20 2014-10-08 上海翰临电子科技有限公司 Intelligent watchband and operating control method thereof
CN206833458U (en) * 2017-06-02 2018-01-02 南京易自助网络科技有限公司 A kind of barcode scanning payment mechanism of Self-help car washer
CN109725699A (en) * 2017-10-20 2019-05-07 华为终端(东莞)有限公司 Recognition methods, device and the equipment of identification code
CN109918006A (en) * 2019-01-28 2019-06-21 维沃移动通信有限公司 A kind of screen control method and mobile terminal
CN109933191A (en) * 2019-02-13 2019-06-25 苏鹏程 Gesture identification and control method and its system
CN110058767A (en) * 2019-03-26 2019-07-26 努比亚技术有限公司 Interface operation method, wearable terminal and computer readable storage medium
CN110187759A (en) * 2019-05-08 2019-08-30 安徽华米信息科技有限公司 Display methods, device, intelligent wearable device and storage medium
CN111209904A (en) * 2018-11-21 2020-05-29 华为技术有限公司 Service processing method and related device
CN112613475A (en) * 2020-12-31 2021-04-06 Oppo广东移动通信有限公司 Code scanning interface display method and device, mobile terminal and storage medium
CN112817443A (en) * 2021-01-22 2021-05-18 歌尔科技有限公司 Display interface control method, device and equipment based on gestures and storage medium
WO2021164551A1 (en) * 2020-02-19 2021-08-26 中国银联股份有限公司 Control method, system and apparatus for gate access, and gate

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107734191A (en) * 2017-11-15 2018-02-23 深圳市沃特沃德股份有限公司 Utilize the method and apparatus of acceleration transducer control mobile phone
EP3757739A4 (en) * 2018-04-19 2021-03-10 Huawei Technologies Co., Ltd. Method for display when exiting an application, and terminal
CN111190563A (en) * 2019-12-31 2020-05-22 华为技术有限公司 Interface display method and related device
CN117453105A (en) * 2021-09-27 2024-01-26 荣耀终端有限公司 Method and device for exiting two-dimensional code


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Exploring head gesture interface of smart glasses;Shanhe Yi等;《 IEEE INFOCOM 2016 - The 35th Annual IEEE International Conference on Computer Communications》;20160728;第1-9页 *
基于加速度传感器的可扩展手势识别;谢仁强等;《传感技术学报》;20160531;第659-664页 *

Also Published As

Publication number Publication date
WO2023045789A1 (en) 2023-03-30
WO2023045789A9 (en) 2023-08-03
CN117453105A (en) 2024-01-26
CN115016712A (en) 2022-09-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant