CN117911019A - Payment method and related device

Payment method and related device

Info

Publication number
CN117911019A
CN117911019A
Authority
CN
China
Prior art keywords
vehicle
payment
camera
electronic device
user
Prior art date
Legal status
Pending
Application number
CN202210901709.3A
Other languages
Chinese (zh)
Inventor
高澍阳
游子婷
黄思雨
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202210901709.3A
Priority to PCT/CN2023/109377
Publication of CN117911019A
Legal status: Pending

Classifications

    • G06Q20/08 Payment architectures
    • G06Q20/14 Payment architectures specially adapted for billing systems
    • G06Q20/32 Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/40 Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; review and approval of payers, e.g. check credit lines or negative lists
    • G07B15/02 Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points, taking into account a variable factor such as distance or time, e.g. for passenger transport, parking systems or car rental systems
    • G07B15/06 Arrangements for road pricing or congestion charging of vehicles or vehicle users, e.g. automatic toll systems

Landscapes

  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Physics & Mathematics (AREA)
  • Finance (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application provides a payment method and a related device. The method comprises the following steps: a vehicle identifies and acquires a payment code, or the vehicle receives payment information sent by a merchant device. After the vehicle acquires the payment code or the payment information, if the vehicle has payment capability, the vehicle prompts the user to perform identity authentication and completes the payment; after the payment is completed, the vehicle prompts the user that the payment was successful. If the vehicle does not have payment capability, the vehicle sends a payment authentication request to an electronic device, and the electronic device, after receiving the request, prompts the user to perform identity authentication and completes the payment. In this way, the vehicle and the electronic device cooperate to complete the outfield payment of the vehicle-mounted user, which improves the convenience of outfield payment and the user experience.

Description

Payment method and related device
Technical Field
The application relates to the technical field of in-vehicle (vehicle-machine) payment, and in particular to a payment method and a related device.
Background
In recent years, with the development of the internet-of-vehicles industry, in-vehicle service providers have built vehicle-owner service ecosystems centered on the vehicle machine (head unit) to meet the service demands of vehicle owners. Payment is an essential link in the paid use of such services, so its convenience and security are particularly important.
At present, when a vehicle-mounted user makes an outfield payment, the user needs to park and use a mobile phone to complete the related payment operations; scanning, identification, payment and the like all depend on the mobile phone, which is inconvenient. How to improve the convenience of outfield payment for vehicle-mounted users therefore needs further study.
Disclosure of Invention
The application provides a payment method and a related device, in which a vehicle and an electronic device cooperatively complete the outfield payment of a vehicle-mounted user, improving the convenience of outfield payment and the user experience.
In a first aspect, the present application provides a payment method, the method comprising: the vehicle scans an image through a camera on the vehicle to obtain a payment code, or the vehicle receives payment information sent by a collection device; the vehicle obtains payment details based on the payment code or the payment information, the payment details including one or more of: a payment amount, a commodity name, and a payee name; and the vehicle completes payment through a first electronic device based on the payment details.
In this way, the vehicle can complete payment through the electronic device without first judging whether the vehicle itself has payment capability.
With the payment method provided by the first aspect, the vehicle and the electronic device cooperate to complete the outfield payment of the vehicle-mounted user, which improves the convenience of outfield payment and the user experience.
With reference to the first aspect, in one possible implementation manner, the vehicle completing payment through the first electronic device based on the payment details specifically includes: in the case that the vehicle does not have payment capability, the vehicle completes payment through the first electronic device based on the payment details.
In this way, after the vehicle acquires the payment details, it can judge whether it has payment capability, and in the case that it does not, the vehicle completes the payment through the electronic device.
With reference to the first aspect, in one possible implementation manner, the method further includes: in the case that the vehicle has payment capability, the vehicle completes payment based on the payment details.
In this way, after the vehicle acquires the payment details, it can judge whether it has payment capability, and in the case that it does, the vehicle can complete the payment directly without relying on another electronic device.
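Purely as an illustration of the dispatch just described, the following minimal Python sketch shows the capability check; the helper names (has_payment_capability, pay, request_payment) are assumptions introduced for the sketch, not an API defined by this application.

```python
def complete_payment(vehicle, first_device, payment_details):
    """Pay on the vehicle if it can; otherwise delegate to the first electronic device."""
    if vehicle.has_payment_capability():        # assumption: capability flag exposed by the vehicle
        return vehicle.pay(payment_details)     # identity authentication + payment on the vehicle side
    # no local payment capability: send a payment authentication request to the paired device
    return first_device.request_payment(payment_details)
```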
With reference to the first aspect, in one possible implementation manner, the camera on the vehicle includes cameras in a plurality of directions, each direction including one or more cameras, and a first direction includes a first camera and a second camera; the vehicle obtaining the payment code by scanning an image through a camera on the vehicle specifically includes: the vehicle scans images through the first camera and the second camera in the first direction and obtains the payment code, wherein the first camera is different from the second camera.
Thus, a plurality of different cameras may be included in the same direction, and these cameras may be turned on simultaneously or in a time-sharing manner. Scanning images through a plurality of different cameras in one direction improves the accuracy with which the payment code is acquired from the scanned images.
With reference to the first aspect, in one possible implementation manner, the vehicle scanning images through the first camera and the second camera in the first direction and obtaining the payment code specifically includes: the vehicle turns on the first camera in the first direction and scans an image through the first camera; in the case that the first camera does not acquire the payment code within a first time, the vehicle turns on the second camera in the first direction and acquires the payment code through an image scanned by the second camera.
In this way, a plurality of different cameras in the same direction can be turned on in a time-sharing manner, avoiding the increased power consumption of turning them on simultaneously.
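A minimal sketch of this time-shared scanning is given below, assuming a hypothetical camera object with open/read_frame/close methods and an external decoder; it is illustrative only, not the claimed implementation.

```python
import time

SCAN_WINDOW_S = 5  # the "first time" window; the value is an assumption for illustration

def scan_with_fallback(cameras, decode_fn):
    """Open cameras one at a time (time-shared) and return the first payment code found.

    `cameras` may be several cameras in the same direction or cameras in different
    orientations; `decode_fn` is a hypothetical frame -> payment-code decoder.
    """
    for cam in cameras:
        cam.open()
        deadline = time.monotonic() + SCAN_WINDOW_S
        try:
            while time.monotonic() < deadline:
                code = decode_fn(cam.read_frame())
                if code:
                    return code
        finally:
            cam.close()  # avoid the extra power draw of keeping several cameras on
    return None
```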
With reference to the first aspect, in one possible implementation manner, the camera on the vehicle includes cameras in a plurality of orientations, the plurality of orientations including a first orientation and a second orientation; the vehicle obtaining the payment code by scanning an image through a camera on the vehicle specifically includes: the vehicle scans images through the camera in the first orientation and the camera in the second orientation and obtains the payment code, wherein the first orientation is different from the second orientation.
Thus, the vehicle may include cameras in a plurality of different orientations, and these cameras may be turned on simultaneously or in a time-sharing manner. Scanning images through cameras in different orientations improves the accuracy with which the payment code is acquired from the scanned images.
With reference to the first aspect, in one possible implementation manner, the vehicle scanning images through the camera in the first orientation and the camera in the second orientation and obtaining the payment code specifically includes: the vehicle turns on the camera in the first orientation and scans an image through it; in the case that the camera in the first orientation does not acquire the payment code within a first time, the vehicle turns on the camera in the second orientation and acquires the payment code through the image it scans.
In this way, cameras in a plurality of different orientations can be turned on in a time-sharing manner, avoiding the increased power consumption of turning them all on simultaneously.
With reference to the first aspect, in a possible implementation manner, in the case that the vehicle obtains the payment code by scanning an image through a camera on the vehicle, the method further includes, before that step: the vehicle turns on a camera on the vehicle based on one or more of the vehicle state, the vehicle position, and environmental information.
In this way, the vehicle can automatically turn on a camera to scan images and acquire the payment code, reducing user operations. For example, the vehicle may automatically turn on the camera in the first orientation and/or the camera in the second orientation, or the first camera and/or the second camera in the first orientation.
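The trigger condition can be pictured with a sketch such as the following; the thresholds, field names, and point-of-interest types are assumptions for illustration only.

```python
# Illustrative sketch only: decide whether to auto-start scanning cameras based on
# vehicle state, position and environment. Thresholds and field names are assumptions.

PAYMENT_POI_TYPES = {"gas_station", "parking_lot", "charging_station", "toll_gate", "car_wash"}

def should_open_payment_camera(vehicle_state, position_info, environment_info):
    stopped = vehicle_state.speed_kmh < 5 and vehicle_state.gear in ("P", "N")
    near_payment_node = (position_info.nearest_poi_type in PAYMENT_POI_TYPES
                         and position_info.distance_to_poi_m < 30)
    code_like_object_seen = environment_info.detected_qr_like_pattern  # e.g. coarse detector result
    return stopped and (near_payment_node or code_like_object_seen)
```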
With reference to the first aspect, in a possible implementation manner, in the case that the vehicle obtains the payment code by scanning an image through a camera on the vehicle, the method further includes, before that step: the vehicle receives and responds to a first operation of the user and turns on a camera on the vehicle.
The first operation may be a voice operation, an operation on a shortcut key, or the like.
The first operation may be used to turn on the camera in the first orientation and/or the camera in the second orientation on the vehicle, or to turn on the first camera and/or the second camera in the first orientation.
With reference to the first aspect, in one possible implementation manner, the vehicle receiving the payment information sent by the collection device specifically includes: the vehicle establishes a short-range communication connection with the collection device and receives the payment information sent by the collection device; the short-range communication connection includes a Bluetooth connection, a Wi-Fi connection, or a near field communication (NFC) connection.
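For illustration, receiving the payment information over such a link might look like the sketch below; the link object, its recv signature, and the JSON encoding are assumptions rather than part of the claimed method.

```python
import json

def receive_payment_information(link, timeout_s=10):
    """Receive payment information over an established short-range link (Bluetooth / Wi-Fi / NFC).

    `link` is a hypothetical wrapper around whichever transport the communication
    module actually provides; the JSON encoding is also an assumption.
    """
    raw = link.recv(timeout=timeout_s)       # assumption: blocking receive with timeout
    if raw is None:
        return None
    info = json.loads(raw.decode("utf-8"))
    # keep only the payment details named in the first aspect
    return {k: info.get(k) for k in ("payment_amount", "commodity_name", "payee_name")}
```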
With reference to the first aspect, in one possible implementation manner, the first electronic device is an electronic device that logs in to the same account as the vehicle. The vehicle and the electronic device are thus trusted devices, which ensures the reliability of the cooperative payment process.
With reference to the first aspect, in one possible implementation manner, the vehicle completing payment through the first electronic device based on the payment details specifically includes: in the case that the vehicle has established a communication connection with the first electronic device, the vehicle completes payment through the first electronic device based on the payment details.
With reference to the first aspect, in one possible implementation manner, in the case that the vehicle has not established a communication connection with the first electronic device, the vehicle displays a first image based on the payment code or the payment information, and the first image is used to prompt the user to scan the first image with an electronic device and complete the payment.
For example, the first image may be a two-dimensional code. In this way, the vehicle displays the first image to prompt the user to complete payment by scanning the first image with another electronic device.
With reference to the first aspect, in one possible implementation manner, the vehicle obtaining the payment details based on the payment code or the payment information specifically includes: in the case that the vehicle has parsing capability, the vehicle parses the payment code or the payment information to obtain the payment details; in the case that the vehicle has no parsing capability, the vehicle has the first electronic device parse the payment details from the payment code or the payment information and receives the payment details sent by the first electronic device.
With reference to the first aspect, in one possible implementation manner, after the vehicle obtains the payment details based on the payment code or the payment information and before the vehicle completes payment through the first electronic device based on the payment details, the method further includes: in the case that the vehicle has permission to display sensitive information, the vehicle displays a payment interface based on the payment details; in the case that the vehicle does not have permission to display sensitive information, the vehicle displays the payment interface through the first electronic device.
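The two fallbacks above (parsing capability and permission to display sensitive information) can be pictured together in a sketch like the following, with all helper names assumed for illustration.

```python
def prepare_payment_ui(vehicle, first_device, payment_code_or_info):
    """Route parsing and display according to the vehicle's capability and permission."""
    if vehicle.can_parse(payment_code_or_info):                 # parsing capability check
        details = vehicle.parse(payment_code_or_info)
    else:                                                       # delegate parsing to the device
        details = first_device.parse_and_return(payment_code_or_info)

    if vehicle.may_display_sensitive_info():                    # permission to show sensitive information
        vehicle.show_payment_interface(details)
    else:
        first_device.show_payment_interface(details)
    return details
```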
In a second aspect, the present application provides a vehicle, comprising: one or more processors and one or more memories; the one or more memories are coupled to the one or more processors and store computer program code comprising computer instructions, and the one or more processors invoke the computer instructions to cause the vehicle to perform the payment method provided in any one of the possible implementations of the first aspect.
In a third aspect, the application provides a computer-readable storage medium having instructions stored therein which, when run on a vehicle, cause the vehicle to perform the payment method provided in any one of the possible implementations of the first aspect.
In a fourth aspect, the application provides a computer program product which, when executed by a vehicle, causes the vehicle to perform the payment method provided in any one of the possible implementations of the first aspect.
For the advantageous effects of the second to fourth aspects, reference may be made to the description of the advantageous effects of the first aspect, which is not repeated here.
Drawings
Fig. 1 and Fig. 2 are schematic views of scenes provided by an embodiment of the present application;
Fig. 3 is a schematic diagram of a system architecture according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a vehicle 100 according to an embodiment of the present application;
Fig. 5 is a schematic functional block diagram of a vehicle 100 according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of an electronic device 200 according to an embodiment of the present application;
Fig. 7 is a schematic view of a scenario provided by an embodiment of the present application;
Fig. 8A-8B are schematic diagrams of a payment interface displayed on a vehicle 100 according to an embodiment of the present application;
Fig. 8C is a schematic diagram of a vehicle 100 displaying a two-dimensional code according to an embodiment of the present application;
Fig. 9A-9B are schematic diagrams of a payment interface displayed by the electronic device 200 according to an embodiment of the present application;
Fig. 10A-10C are schematic diagrams of a payment interface displayed by another vehicle 100 according to an embodiment of the present application;
Fig. 10D is a schematic diagram of a payment interface displayed by a further vehicle 100 according to an embodiment of the present application;
Fig. 10E-10G are schematic diagrams of a payment interface displayed on an electronic device 200 according to an embodiment of the present application;
Fig. 11 is a schematic flow chart of a payment method according to an embodiment of the present application;
Fig. 12 is a flow chart of another payment method according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may indicate the three cases where only A exists, both A and B exist, and only B exists. Furthermore, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first", "second" and the like below are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of the application, unless otherwise indicated, "a plurality of" means two or more.
The term "user interface (UI)" in the following embodiments of the application refers to a medium interface for interaction and information exchange between an application program or an operating system and a user; it enables conversion between an internal form of information and a form acceptable to the user. A user interface is source code written in a specific computer language such as Java or the extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and finally presented as content that the user can recognize. A commonly used presentation form of a user interface is the graphical user interface (GUI), which refers to a graphically displayed user interface related to computer operations. It may include visual interface elements such as text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets displayed on a display of the electronic device.
The application provides a payment method, which includes the following steps: the vehicle 100 identifies and acquires a payment code, or the vehicle 100 receives payment information sent by a merchant device.
After the vehicle 100 acquires the payment code or the payment information, the vehicle 100 displays a payment user interface in the case that the vehicle 100 has permission to display sensitive information. In the case that the vehicle 100 does not have permission to display sensitive information, the vehicle 100 sends the payment code or the payment information to the electronic device 200, and the electronic device 200 displays the payment user interface.
In the case that the vehicle 100 has payment capability, the vehicle 100 prompts the user to perform identity authentication and completes the payment. After the payment is completed, the vehicle 100 prompts the user that the payment was successful.
In the case that the vehicle 100 has no payment capability, the vehicle 100 sends a payment authentication request to the electronic device 200; after receiving the request, the electronic device 200 prompts the user to perform identity authentication and completes the payment. After the payment is completed, the electronic device 200 prompts the user that the payment was successful, or the electronic device 200 sends a payment-success message to the vehicle 100 and the vehicle 100 prompts the user that the payment was successful.
In this way, the vehicle 100 and the electronic device 200 cooperatively complete the outfield payment of the vehicle-mounted user, which improves the convenience of outfield payment and the user experience.
It should be noted that the embodiments of the present application are not limited to the vehicle 100; they are also applicable to other types of electronic devices, such as a smart screen or a smart watch, cooperating with the electronic device 200.
The payment method provided by the embodiments of the application can be applied to application scenarios such as paying parking fees in a parking lot, paying fueling fees at a gas station, paying charging fees at a charging station, paying car-washing fees at a car-wash station, paying road tolls, and paying bridge tolls with the vehicle 100. It is not limited to these scenarios and can also be applied to other vehicle payment scenarios.
In each of the above payment scenarios, when the vehicle 100 finishes a service such as parking, refueling, charging, passing a road, or crossing a bridge and needs to pay at a payment node, in one possible implementation the vehicle 100 scans the code image at the payment node of the respective scenario and acquires the payment code, and the payment process is completed on the vehicle 100 or the electronic device 200. In other possible implementations, a charging device is provided at the payment node of the respective scenario; the vehicle 100 may establish a short-range communication connection with the charging device, the charging device sends payment information to the vehicle 100 through the short-range communication connection, and after the vehicle 100 acquires the payment information, the payment process may be completed on the vehicle 100 or the electronic device 200.
For example, in the scenario of paying fueling fees at a gas station, the charging device may be a fueling pile or a fuel gun. In the scenario of paying charging fees at a charging station, the charging device may be a charging pile or a charging gun. In the scenario of paying car-washing fees at a car-wash station, the charging device may be an automatic charging device provided in the station. It should be noted that in different scenarios the charging devices may take the same or different forms, and the number of charging devices may be one or more.
Fig. 1 illustrates a schematic diagram of the vehicle 100 acquiring a payment code in the scenario of paying fueling fees at a gas station.
For example, in this scenario a payment code may be located on top of each fueling pile, and the payment code of each fueling pile may be different. It should be noted that the payment code used for identification by the vehicle 100 in the gas station may take other display forms, which is not limited in the embodiments of the present application.
As shown in Fig. 1, a camera on the vehicle 100 is turned on, and the vehicle 100 can acquire the payment code of fueling pile 2 while the vehicle 100 is being fueled through fueling pile 2.
In one possible implementation, after the vehicle user informs the operator of the fuel type and amount, a payment code may be displayed on fueling pile 2. The vehicle 100 scans the payment code through a camera on the vehicle 100, and the payment is then completed by the vehicle 100 or by the electronic device 200.
Optionally, the payment code is displayed on a fueling pile only when a fueling service exists on that pile; when there is no fueling service, the payment code may not be displayed.
Alternatively, the fueling pile may update its payment code each time it handles a different fueling service, for example for different amounts and types of fuel for different users.
In other possible implementations, the payment code on a fueling pile may be fixed. The vehicle 100 may scan the payment code through a camera on the vehicle 100 and prompt the user to input the fueling pile number, the fuel type, and the fuel amount to determine the amount to be paid. In this way the payment codes on different fueling piles can be the same and need not change for each fueling service, so the scheme is more general. After the amount is determined, the payment is completed by the vehicle 100 or by the electronic device 200.
Fig. 2 illustrates a schematic diagram of the vehicle 100 acquiring payment information sent by a charging device in the scenario of paying fueling fees at a gas station.
For example, in this scenario the charging device may be the fuel gun on the corresponding fueling pile. It should be noted that the charging device in the gas station may also take other forms, which is not limited in the embodiments of the present application.
As shown in Fig. 2, when the vehicle 100 is fueled through fueling pile 2, after the vehicle user informs the operator of the fuel type and amount, the operator brings the fuel gun of fueling pile 2 close to the vehicle 100. When the fuel gun of fueling pile 2 is within a preset distance from the vehicle 100, the fuel gun may establish a short-range communication connection with the vehicle 100, for example through near field communication (NFC). After the connection is established, the fuel gun of fueling pile 2 sends payment information to the vehicle 100.
After the vehicle 100 acquires the payment information, the vehicle 100 may complete payment based on the payment information, or the vehicle 100 sends the payment information to the electronic device 200 and the payment is completed through the electronic device 200.
The manners in which the vehicle 100 obtains the payment code or the payment information shown in Fig. 1 and Fig. 2 are also applicable to other application scenarios and are not described again here.
Fig. 3 schematically illustrates a system architecture according to an embodiment of the present application.
As shown in Fig. 3, the vehicle 100 may establish a communication connection with the electronic device 200. The electronic device 200 may be of various device types, and the embodiments of the present application do not specifically limit them. For example, the electronic device 200 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, a smart screen, a wearable device (e.g., a smart watch), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a vehicle machine, a smart headphone, or a game machine, and may also be an internet of things (IoT) device or a smart home device such as a smart water heater, a smart lamp, or a smart air conditioner. Without limitation, the devices in the system may also include non-portable terminal devices such as laptop or desktop computers having a touch-sensitive surface or touch panel.
The vehicle 100 is not limited to establishing a communication connection with the electronic device 200 only; the vehicle 100 may also establish communication connections with more devices at the same time, which is not limited by the embodiments of the present application.
The following embodiments of the present application will be described with reference to the electronic device 200 as a mobile phone.
The vehicle 100 and the electronic device 200 may establish a connection in any one of the following ways.
Mode one: the vehicle 100 and the electronic device 200 may be connected to the same network, for example the same local area network, to establish a cooperative connection.
Mode two: the vehicle 100 and the electronic device 200 may also log in to the same system account to establish a cooperative connection. For example, the system account with which both the vehicle 100 and the electronic device 200 log in may be "HW1234".
Mode three: the system accounts logged in on the vehicle 100 and the electronic device 200 may belong to the same account group. For example, the system accounts logged in on the vehicle 100 and the electronic device 200 include "HW001", "HW002" and "HW003", and these accounts belong to the account group "Huazhi".
Mode four: the vehicle 100 and the electronic device 200 may establish a connection through near field communication (NFC), Bluetooth (BT), a wireless local area network (WLAN) technology such as wireless fidelity point-to-point (Wi-Fi P2P), infrared (IR), or the like.
Mode five: the vehicle 100 and the electronic device 200 can establish a temporary account group by scanning the same two-dimensional code and establish a cooperative connection to communicate.
The vehicle 100 and the electronic device 200 may also establish a communication connection in other manners; the embodiments of the present application are not limited to the five manners described above.
In addition, the vehicle 100 and the electronic device 200 may also establish a connection and communicate by combining any several of the manners described above, which is not limited in the embodiments of the application.
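As an illustration of how modes two and three can serve as a trust check before cooperative payment, consider the following sketch; the data structures are assumptions, not part of the claimed method.

```python
def is_trusted_pair(vehicle_account, device_account, account_groups):
    """Return True if the two devices share an account (mode two) or an account group (mode three)."""
    if vehicle_account == device_account:
        return True
    return any(vehicle_account in group and device_account in group
               for group in account_groups)

# Example: accounts "HW001" and "HW002" in the group {"HW001", "HW002", "HW003"} are trusted.
```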
After the vehicle 100 establishes a communication connection with the electronic device 200 and acquires the payment code or payment information, the vehicle 100 may complete the payment in cooperation with the electronic device 200.
Specifically, in the case that the vehicle 100 has display capability, the vehicle 100 may display a payment user interface based on the payment code or the payment information. In the case that the vehicle 100 does not have display capability, the vehicle 100 sends the payment code or the payment information to the electronic device 200, and the payment user interface is displayed by the electronic device 200.
In the case that the vehicle 100 has payment authentication capability, the vehicle 100 prompts the user to perform identity authentication and completes the payment. After the payment is completed, the vehicle 100 prompts the user that the payment was successful.
In the case that the vehicle 100 has no payment capability, the vehicle 100 sends a payment authentication request to the electronic device 200; after receiving the request, the electronic device 200 prompts the user to perform identity authentication and completes the payment. After the payment is completed, the electronic device 200 prompts the user that the payment was successful, or the electronic device 200 sends a payment-success message to the vehicle 100 and the vehicle 100 prompts the user that the payment was successful.
In this way, the vehicle 100 and the electronic device 200 cooperatively complete the outfield payment of the vehicle-mounted user, which improves the convenience of outfield payment and the user experience.
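The cooperative exchange can be sketched as a simple request/result pair; the message and method names below are assumptions used only to show the direction of the flow (vehicle requests authentication, device authenticates and pays, result is reported back).

```python
from dataclasses import dataclass

@dataclass
class PaymentAuthRequest:
    payment_amount: float
    commodity_name: str
    payee_name: str

@dataclass
class PaymentResult:
    success: bool
    order_id: str = ""

def cooperative_payment(vehicle, device, details):
    """Vehicle sends a payment authentication request; the device authenticates, pays, and reports back."""
    request = PaymentAuthRequest(**details)
    result: PaymentResult = device.authenticate_and_pay(request)   # hypothetical device-side API
    if result.success:
        vehicle.notify_user("Payment successful")                  # voice / text / light / vibration
    return result
```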
Next, a vehicle 100 provided by an embodiment of the present application is described.
The vehicle 100 in the embodiment of the present application may include a large automobile, a small automobile, an electric vehicle, a motorcycle, a tractor, and the like.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a vehicle 100 according to an embodiment of the present application.
As shown in Fig. 4, the vehicle 100 includes: a bus 11, a plurality of electronic control units (ECUs), an engine 13, a telematics box (T-BOX) 14, a transmission 15, an intelligent cockpit 16, an anti-lock braking system (ABS) 17, a sensor system 18, a camera system 19, a microphone 20, and so forth.
The bus 11 may be, for example, a controller area network (CAN) bus or an IP bus.
The bus 11 is a serial communication network that supports distributed control or real-time control and connects the various components of the vehicle 100. Any component on the bus 11 can snoop all data transmitted on the bus 11. The frames transmitted on the bus 11 may include data frames, remote frames, error frames, and overload frames, and different frames carry different types of data. In an embodiment of the application, the bus 11 may be used to transmit data relating to the various components in a voice-command-based control method, the specific implementation of which is described in the method embodiments below.
The components of the vehicle 100 are not limited to connecting through the bus 11; in other embodiments, they may be connected and communicate in other ways. For example, the components may also communicate via in-vehicle Ethernet, a local interconnect network (LIN) bus, FlexRay, or a media oriented systems transport (MOST) bus, among others; the embodiments of the application are not limited in this regard. The following embodiments are described in terms of the various components communicating via the bus 11.
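As background context only (not part of the claimed method), test code can put such a data frame on a CAN bus with the third-party python-can library; the channel name, identifier, and payload below are placeholders.

```python
import can  # third-party "python-can" package, used here only as a generic illustration

# Put one data frame on a CAN bus and listen for traffic. The channel name,
# arbitration ID and payload bytes are placeholders, not values from this application.
bus = can.interface.Bus(channel="can0", bustype="socketcan")
frame = can.Message(arbitration_id=0x123, data=[0x01, 0x02, 0x03], is_extended_id=False)
bus.send(frame)

reply = bus.recv(timeout=1.0)   # any node on the bus can snoop transmitted frames
if reply is not None:
    print(hex(reply.arbitration_id), list(reply.data))
```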
The ECU corresponds to the processor or brain of the vehicle 100 and is used to instruct the corresponding components to perform corresponding actions according to instructions acquired from the bus 11 or according to operations input by the user. An ECU may be composed of a security chip, a microcontroller unit (MCU), a random access memory (RAM), a read-only memory (ROM), input/output interfaces (I/O), an analog-to-digital (A/D) converter, and large-scale integrated circuits for input, output, shaping, driving, and the like.
ECUs come in a wide variety, and different kinds of ECUs can be used to realize different functions.
The plurality of ECUs in the vehicle 100 may include, for example: an engine ECU 121, a telematics box (T-BOX) ECU 122, a transmission ECU 123, an anti-lock braking system (ABS) ECU 125, and the like.
The engine ECU 121 is used to manage the engine and coordinate its various functions, for example starting and shutting down the engine. The engine is the device that powers the vehicle 100; it is a machine that converts some form of energy into mechanical energy, for example by burning the chemical energy of a liquid or gas fuel, or by converting electrical energy into mechanical energy, and outputting power. The engine may include a crank and connecting rod mechanism, a valve mechanism, and five systems: a cooling system, a lubricating system, an ignition system, an energy supply system, and a starting system. Its main components are the cylinder block, cylinder head, piston pin, connecting rod, crankshaft, flywheel, and the like.
The T-BOX ECU 122 is configured to manage the T-BOX 14.
The T-BOX 14 is primarily responsible for communicating with the internet and provides a remote communication interface for the vehicle 100, offering services including navigation, entertainment, driving data collection, driving track recording, vehicle fault monitoring, remote vehicle query and control (e.g., locking, air-conditioning control, window control, engine torque limitation, engine start-stop, seat adjustment, battery level query, fuel level, door status, and the like), driving behavior analysis, wireless hotspot sharing, road rescue, anomaly notification, and so on.
The T-BOX 14 may be used to communicate with a telematics service provider (TSP) and with the electronic device on the user (e.g., driver) side, enabling vehicle status display and control on the electronic device. After the user sends a control command through a vehicle management application on the electronic device, the TSP sends a request instruction to the T-BOX 14; after obtaining the control command, the T-BOX 14 sends a control message through the CAN bus to control the vehicle 100, and finally feeds the operation result back to the vehicle management application on the user-side electronic device. In other words, the data read by the T-BOX 14 through the bus 11, such as vehicle condition reports, driving reports, fuel consumption statistics, violation queries, location tracks, and driving behavior, may be transmitted to the TSP back-end system over the network and forwarded by the TSP back end to the user-side electronic device for the user to check.
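A rough sketch of one pass through this remote-control loop is given below; every object and method name is an assumption introduced purely for illustration.

```python
def handle_tsp_command(tsp_link, can_bus, command_table):
    """One iteration of the remote-control loop (all objects and names are assumptions)."""
    command = tsp_link.receive_command()            # e.g. "lock_doors", "set_air_conditioner"
    frame = command_table.get(command.name)         # map the command to a CAN control message
    if frame is None:
        tsp_link.report(command.id, success=False, reason="unsupported command")
        return
    can_bus.send(frame)
    ack = can_bus.wait_for_ack(command.name, timeout=2.0)
    tsp_link.report(command.id, success=bool(ack))  # the TSP forwards the result to the phone app
```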
The T-BOX 14 may include, in particular, a communication module and a display screen.
The communication module may be used to provide wireless communication functions, supporting the vehicle 100 in communicating with other devices through wireless communication technologies such as a wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and ultra-wideband (UWB). The communication module may also be used to provide mobile communication functions, supporting the vehicle 100 in communicating with other devices through communication technologies such as the global system for mobile communications (GSM), the universal mobile telecommunications system (UMTS), wideband code division multiple access (WCDMA), time division-synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), 5G, and future 6G.
The communication module may establish connections and communicate with other devices such as servers and user-side electronic devices through cellular-network-based vehicle-to-everything (V2X) communication technologies (cellular V2X, C-V2X). C-V2X may include, for example, long term evolution based V2X (LTE-V2X), 5G-V2X, and the like.
In some embodiments, the vehicle 100 may establish a short-range communication connection with the charging device based on the communication module, so that the charging device can send payment information to the vehicle 100. Drivers and passengers such as the vehicle user can thus obtain the payment information and complete payment without getting out of the vehicle or taking out a mobile phone to scan a payment code outside the vehicle. This improves the convenience of outfield payment and the user experience, and also helps ensure driving safety for vehicle-mounted users.
The display screen is used to provide a visual interface for the driver. The vehicle 100 may include one or more displays, such as a vehicle-mounted display disposed in front of the driver's seat, a display disposed above a seat for displaying the surrounding conditions, and a head-up display (HUD) that projects information onto the windshield. The display screen used to display the user interface in the vehicle 100 in the subsequent embodiments may be a vehicle-mounted display disposed beside a seat, a display disposed above a seat, a HUD, a display on the instrument panel, or a central control screen, which is not limited here. The user interface displayed on the display screen in the vehicle 100 is described in detail in the following embodiments and is not detailed here. The embodiments of the application do not limit the size or display style of the HUD or the instrument panel.
Optionally, the display may display payment information or payment codes acquired by the vehicle 100 to prompt the user to complete the payment.
The T-BOX 14 may also be referred to as a vehicle-machine system, a telematics device, a vehicle gateway, or the like; the embodiments of the application are not limited in this regard.
The transmission ECU 123 is used to manage the transmission 15.
The transmission 15 is a mechanism for varying the rotational speed and torque of the engine; it can fix or change the ratio between the output shaft and the input shaft. The transmission 15 may include a variable-speed transmission mechanism, an operating mechanism, a power take-off mechanism, and the like. The main function of the variable-speed transmission mechanism is to change the value and direction of torque and rotational speed; the main function of the operating mechanism is to control the transmission mechanism to change the transmission ratio, i.e., to shift gears, so as to achieve speed change and torque conversion.
The intelligent cockpit 16 may include, but is not limited to, a driving recorder and a central control screen. In some embodiments, the intelligent cockpit 16 may also include a driving recorder ECU for managing the driving recorder; in other embodiments, the vehicle 100 may not include a driving recorder ECU. The driving recorder may include a host, a vehicle speed sensor, data analysis software, and the like; it is an instrument that records images and sounds during driving together with related information such as the travel time, speed, and position. In the embodiment of the application, when the vehicle is running, the vehicle speed sensor acquires the wheel rotation speed and sends the vehicle speed information to the driving recorder through the bus 11. The central control screen may be an intelligent large screen; the embodiments of the application do not limit its size or display style.
The ABS ECU 125 is used to manage the ABS 17.
The ABS 17 is configured to automatically control the braking force of the brakes when the vehicle brakes, so that the wheels are not locked but remain in a state of simultaneous rolling and sliding, ensuring maximum adhesion between the wheels and the ground. During braking, when the electronic control device judges from the wheel speed signals input by the wheel speed sensors that a wheel tends to lock, the ABS enters the anti-lock braking pressure adjustment process.
The sensor system 18 may include an acceleration sensor, a vehicle speed sensor, a vibration sensor, a gyroscope sensor, a radar sensor, a signal transmitter, a signal receiver, a pressure sensor, and the like. The acceleration sensor and the vehicle speed sensor are used to detect the speed of the vehicle 100. The vibration sensor may be disposed under a seat, in a seat belt, in a seat back, in the operator panel, in an air bag, or in other locations, and is used to detect whether the vehicle 100 has crashed and where the user is located. The gyroscope sensor may be used to determine the motion posture of the vehicle 100. The radar sensor may include a lidar, an ultrasonic radar, a millimeter-wave radar, or the like; it emits electromagnetic waves to irradiate a target and receives the echoes, thereby obtaining information such as the distance, range rate (radial velocity), azimuth, and altitude of the target relative to the emission point, and thus identifying other vehicles, pedestrians, roadblocks, and the like near the vehicle 100. The signal transmitter and the signal receiver are used to transmit and receive signals, which can be used to detect the position of the user; the signals may be ultrasonic waves, millimeter waves, laser light, and the like. The pressure sensor is used to collect pressure data, based on which the vehicle 100 can monitor whether passengers are seated in the intelligent cockpit, so as to judge the number of people in the vehicle 100.
The camera system 19 may include a plurality of cameras for capturing still images or video. The cameras in the camera system 19 may be disposed at one or more positions such as the top, bottom, front, rear, left, right, or interior of the vehicle, so as to support functions such as driving assistance, driving recording, panoramic surround view, and in-vehicle monitoring.
The number of cameras in each direction may be one, two, or more.
In some embodiments, a camera on the vehicle may be used to collect the payment code used in the corresponding scenario, so that drivers and passengers such as the vehicle user can complete payment without getting out of the vehicle or taking out a mobile phone to scan the payment code outside the vehicle. This improves the convenience of outfield payment and the user experience, and also helps ensure driving safety for vehicle-mounted users.
The sensor system 18 and the camera system 19 may be used to detect the surrounding environment, so that the vehicle 100 can make corresponding decisions to cope with environmental changes; for example, they may be used in an autonomous driving phase to perform tasks that rely on awareness of the surrounding environment.
The microphone 20, also called a "mic", is used to convert sound signals into electrical signals. When making a call or issuing a voice command, the user can speak close to the microphone 20 to input a sound signal. The vehicle 100 may be provided with at least one microphone 20. In other embodiments, the vehicle 100 may be provided with two microphones 20, which can also perform noise reduction in addition to collecting sound signals. In still other embodiments, the vehicle 100 may be provided with three, four, or more microphones 20 forming a microphone array, enabling sound signal collection, noise reduction, sound source identification, directional recording, and the like.
In addition, the vehicle 100 may also include multiple interfaces, such as a USB interface, an RS-232 interface, an RS485 interface, etc., that may be externally connected to a camera, a microphone, an earphone, and a user-side electronic device.
In an embodiment of the present application, the microphone 20 may be used to detect voice commands input by a user. The sensor system 18, the camera system 19, the T-BOX 14, and the like may be used to obtain the role information of the user who inputs the voice command. For the manner in which the various components in the vehicle 100 obtain the user's role information, reference may be made to the related descriptions in the subsequent method embodiments. The T-BOX ECU 122 may be configured to determine, according to the role information, whether the user currently has the authority corresponding to the voice command, and only if the user has the authority does the T-BOX ECU 122 schedule the corresponding component in the vehicle 100 to respond to the voice command.
In some embodiments, sensor system 18, camera system 19, T-BOX14, etc. are used to obtain not only the role information of the user inputting the voice instruction, but also the role information of other users. The T-BOX ECU122 may be configured to determine whether the user currently has the authority corresponding to the voice command in combination with the role information of the user inputting the voice command and the role information of other users.
In some embodiments, sensor system 18, camera system 19, T-BOX14, etc. may be used to obtain a vehicle state of vehicle 100. The T-BOX ECU122 may be configured to determine whether the user currently has the authority corresponding to the voice command in combination with the vehicle status and the role information of the user.
In some embodiments, memory in the vehicle 100 may be used to store binding relationships between the vehicle and the user.
It will be appreciated that the configuration illustrated in the embodiments of the present application does not constitute a specific limitation on the vehicle system. The vehicle 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
For example, the vehicle 100 may also include a separate memory, a battery, lights, wipers, a dashboard, audio, a vehicle terminal (transmission control unit, TCU), an auxiliary control unit (ACU), a passive entry passive start (PEPS) system, an on-board unit (OBU), a body control module (BCM), a charging interface, and the like. The memory may be configured to store authority information for different roles in the vehicle 100, the authority information indicating which usage permissions of the vehicle 100 a role has or does not have. In some embodiments, the memory may be used to store authority information for different roles in the vehicle 100 in different vehicle states. In some embodiments, the memory may be used to store authority information for different roles in the vehicle 100 when different other roles are present.
The specific roles of the various components of the vehicle 100 are also described with reference to the following method embodiments, and are not repeated here.
Fig. 5 schematically shows functional modules of the vehicle 100.
The vehicle 100 includes, but is not limited to, the following functional modules: a payment sensing module 501, a display module 502, an identity authentication module 503, and a notification module 504.
The payment sensing module 501 is configured to obtain a payment code or payment information.
The payment sensing module 501 may collect an image with a camera on the vehicle 100 and obtain the payment code, from which the payment sensing module 501 may obtain payment information. The payment sensing module 501 may also obtain the payment information through a payment request input by the user after the payment code is obtained.
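For the camera-based path, one possible (purely illustrative) way to decode a QR-style payment code from a captured frame is OpenCV's QR detector; the actual code format used at a payment node may differ, and the camera handle is hypothetical.

```python
import cv2  # OpenCV, used here only as one possible way to decode a QR-style payment code

def decode_payment_code(frame):
    """Return the text encoded in a QR payment code found in `frame`, or None."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    return data if points is not None and data else None

# Example use with a hypothetical vehicle camera handle `cam`:
# ok, frame = cam.read()
# code = decode_payment_code(frame) if ok else None
```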
The payment sensing module 501 may also establish a short-range communication connection with the charging device in the corresponding scenario based on the communication module on the vehicle 100, in which case the payment sensing module 501 directly obtains the payment information sent by the charging device through that connection. Short-range communication includes, but is not limited to, Bluetooth, Wi-Fi, NFC, and the like.
After the payment sensing module 501 obtains the payment code or the payment information, it is further configured to send the payment code or the payment information to the display module 502.
After the payment code or the payment information is acquired, the display module 502 is configured to display a payment interface based on it in the case that the vehicle 100 has the capability to parse the information and the permission to display sensitive information. In the case that the vehicle 100 does not have the capability to parse the information and the permission to display sensitive information, the display module 502 is configured to send the payment code or the payment information to the electronic device 200 with which a communication connection has been established, and the payment interface is displayed by the electronic device 200 based on the payment code or the payment information.
If a payment interface is displayed on the vehicle 100, the display module 502 is further configured to send an authentication command to the authentication module 503.
After receiving the authentication instruction, the authentication module 503 verifies the identity of the user based on the biometric information stored on the vehicle 100 to complete payment in the case where the vehicle 100 has payment capability. In the event that the vehicle 100 does not have payment capabilities, the identity authentication module 503 may verify the identity of the user through the electronic device 200 with which the communication connection is established to complete the payment.
If a payment interface is displayed on the electronic device 200, the electronic device 200 may send an authentication instruction to the vehicle 100. After the vehicle 100 receives the authentication instruction transmitted by the electronic device 200, in the case where the vehicle 100 has a payment capability, the authentication module 503 verifies the identity of the user based on the biometric information stored on the vehicle 100 to complete the payment. In the event that the vehicle 100 does not have payment capabilities, the identity authentication module 503 may verify the identity of the user through the electronic device 200 with which the communication connection is established to complete the payment.
If the vehicle 100 verifies the user's identity and completes the payment, the identity authentication module 503 is further configured to send a payment completion notification to the notification module 504.
After receiving the payment completion notification, the notification module 504 may prompt the vehicle user that the vehicle 100 has completed payment by one or more means such as voice, text, pictures, flashing lights, vibration, etc.
If the electronic device 200 verifies the user's identity and completes the payment, in one possible implementation, the electronic device 200 may prompt the vehicle user that the vehicle 100 has completed payment by one or more means such as voice, text, pictures, flashing lights, vibration, etc. In other possible implementations, the electronic device 200 sends a payment completion notification to the vehicle 100, and after receiving the payment completion notification, the vehicle 100 may prompt the vehicle user that the vehicle 100 has completed payment by one or more means such as voice, text, pictures, flashing lights, vibration, etc.
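Purely as an illustrative sketch (not the claimed implementation), the routing behaviour described for the display module 502 and the identity authentication module 503 can be summarized as follows; `vehicle` and `phone` are hypothetical objects exposing boolean capability flags that are assumptions of this sketch.

```python
def route_payment(vehicle, phone):
    """Decide where to display the payment interface and where to verify identity.

    Assumed flags: can_parse, may_show_sensitive_info, has_payment_capability."""
    # Display side: the vehicle shows the interface only if it can parse the
    # payment code / information and is permitted to display sensitive data.
    if vehicle.can_parse and vehicle.may_show_sensitive_info:
        display_side = "vehicle"
    elif phone is not None:
        display_side = "phone"     # forward the payment code / info to the phone
    else:
        display_side = None        # nothing to display; rely on other prompts

    # Verification side: use the vehicle's stored biometrics if it can pay,
    # otherwise delegate verification to the connected electronic device.
    if vehicle.has_payment_capability:
        verify_side = "vehicle"
    elif phone is not None:
        verify_side = "phone"
    else:
        verify_side = None

    return display_side, verify_side
```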
It should be noted that, the one or more functional modules may exist alone to implement a preset function, or two or more functional modules may be combined together to implement a preset function, which is not limited in the embodiment of the present application.
Fig. 6 exemplarily shows a schematic structural diagram of the electronic device 200.
The electronic device 200 may be a cell phone, tablet computer, desktop computer, laptop computer, handheld computer, notebook computer, ultra-mobile personal computer (UMPC), netbook, cellular telephone, personal digital assistant (PDA), augmented reality (AR) device, virtual reality (VR) device, artificial intelligence (AI) device, wearable device, vehicle-mounted device, smart home device, and/or smart city device; the specific type of the electronic device is not particularly limited by the embodiments of the present application. The following embodiments of the present application are described with the electronic device 200 being a mobile phone.
The electronic device 200 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 200. In other embodiments of the application, electronic device 200 may include more or fewer components than shown, or certain components may be combined, or certain components may be separated, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only illustrative and does not limit the structure of the electronic device 200. In other embodiments of the present application, the electronic device 200 may also employ an interfacing manner different from that in the above embodiment, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 200. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charge management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to supply power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be provided in the same device.
The wireless communication function of the electronic device 200 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 200 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 200 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 200 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device 200 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device 200 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 200 may support one or more video codecs. In this way, the electronic device 200 may play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of the electronic device 200 may be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The electronic device 200 may implement audio functions through the audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, and application processor, etc. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 200 may listen to music, or to hands-free conversations, through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 200 answers a telephone call or a voice message, the voice can be heard by placing the receiver 170B close to the human ear.
Microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C to input a sound signal into the microphone 170C. The electronic device 200 may be provided with at least one microphone 170C. In other embodiments, the electronic device 200 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 200 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 200.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 200 calculates altitude from barometric pressure values measured by the barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 200 may detect the opening and closing of a flip leather case using the magnetic sensor 180D. In some embodiments, when the electronic device 200 is a flip phone, the electronic device 200 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening may then be set according to the detected opening/closing state of the leather case or the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 200 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 200 is stationary. The acceleration sensor 180E may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 200 may measure the distance by infrared or laser. In some embodiments, the electronic device 200 may range using the distance sensor 180F to achieve fast focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 200 emits infrared light outward through the light emitting diode. The electronic device 200 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that an object is in the vicinity of the electronic device 200. When insufficient reflected light is detected, the electronic device 200 may determine that there is no object in the vicinity of the electronic device 200. The electronic device 200 can detect that the user holds the electronic device 200 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 200 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 200 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 200 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 200 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 200 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 200 at a different location than the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of a vibrating bone block of a human voice part. The bone conduction sensor 180M may also contact the pulse of the human body to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone block of the voice part obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure beat signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 200 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 200.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light and may be used to indicate a charging state or a change in battery level, or to indicate a message, a missed call, a notification, and the like.
The SIM card interface 195 is used to connect a SIM card.
In the case that the vehicle 100 obtains the payment code by scanning an image with the camera, the vehicle 100 needs to start the camera to capture images at a proper time so as to obtain the payment code. This avoids keeping the camera on at all times, which would lead to excessive power consumption, and also avoids starting the camera too late, which would lengthen the time needed to acquire the payment code and make the vehicle user wait longer, degrading the user experience.
The vehicle 100 may be triggered to turn on a camera on the vehicle 100 for scanning the payment code based on any one or more of the following manners.
Mode one: the vehicle 100 may determine to turn on a camera for scanning the pay codes based on one or more of vehicle status, vehicle location, and environmental information.
Vehicle conditions include, but are not limited to: power up/down of the vehicle 100, running/parking of the vehicle 100, and gear of the vehicle 100.
The vehicle location may be global positioning system (global positioning system, GPS) information of the vehicle 100.
The environmental information may be surrounding landmark objects acquired by the vehicle 100. For example, in a parking lot, the environmental information may be a parking lot exit gate or a parking lot exit sign, and the parking lot exit gate may be at least one of a straight bar gate, a fence gate, and a curved bar gate. The environmental information may be different at different locations, which is not limited in the embodiment of the present application.
The vehicle 100 may determine a scene in which the vehicle 100 is located based on one or more of a vehicle status, a vehicle location, and environmental information, and turn on a camera for scanning a payment code.
For example, the vehicle 100 determines that the vehicle 100 has switched from powered-off to powered-on, the vehicle 100 is driving slowly, the position of the vehicle 100 is a parking lot, and a parking lot exit gate or a parking lot exit sign is identified; the vehicle 100 can therefore determine that it is in a scene of paying a parking fee at a parking lot, and at this time the vehicle 100 can automatically turn on the camera for scanning the payment code (an illustrative sketch of this trigger is given after the three modes below).
Mode two: the vehicle user or a driver in the vehicle may control, by voice, the vehicle 100 to turn on the camera for scanning the payment code.
As shown in fig. 7, when the vehicle user or a driver in the vehicle sees that a payment needs to be made, the vehicle user or the driver wakes up the vehicle 100 by voice, so that the vehicle 100 turns on the camera on the vehicle 100 for scanning the payment code. For example, the user says "Small E, start scanning the off-vehicle payment code". After the vehicle 100 recognizes the user's voice and turns on the camera on the vehicle 100 for scanning the payment code, the vehicle 100 may reply with the voice "payment-code scanning has started" to prompt the user that the camera for scanning the payment code has been turned on.
Mode three: the vehicle user or a driver in the vehicle may control the vehicle 100 to turn on the camera for scanning the payment code through a shortcut key.
The shortcut key may be a key on the vehicle 100; the key may be a physical key or a virtual key, and the physical key may be a push key, a rotary key, or a rocker key. The embodiment of the application does not limit the specific implementation of the shortcut key.
When the camera needs to be turned on to scan the payment code outside the vehicle, the vehicle 100 may receive the user's input operation (e.g., a single press) on the shortcut key, and in response to the input operation, the vehicle 100 turns on the camera on the vehicle 100 for scanning the payment code. When the vehicle 100 receives the user's input operation (e.g., a single press) on the shortcut key again, the camera on the vehicle 100 for scanning the payment code is turned off.
The ways of turning on the camera for scanning the payment code on the vehicle 100 are not limited to the three ways above; other ways are also possible, which is not limited in the embodiment of the present application.
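As a purely illustrative sketch of the mode-one trigger above, the following Python snippet infers a parking-lot payment scene from vehicle status, location and detected landmarks; the field names and the speed threshold are assumptions of this sketch, not values defined by this application.

```python
def should_open_payment_camera(state: dict) -> bool:
    """Mode-one style trigger (illustrative field names and threshold)."""
    powered_on     = state.get("power") == "on"
    driving_slowly = state.get("speed_kmh", 0.0) < 10.0          # assumed threshold
    in_parking_lot = state.get("location_type") == "parking_lot"
    landmarks      = state.get("landmarks", [])
    exit_seen      = "exit_gate" in landmarks or "exit_sign" in landmarks
    return powered_on and driving_slowly and in_parking_lot and exit_seen


# Example: a vehicle crawling toward a parking-lot exit gate triggers the camera.
state = {"power": "on", "speed_kmh": 6.0,
         "location_type": "parking_lot", "landmarks": ["exit_gate"]}
if should_open_payment_camera(state):
    print("turn on the camera for scanning the payment code")
```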
Optionally, after the vehicle 100 completes the payment and the payment is successful, the vehicle 100 may automatically turn off the camera for scanning the payment code to reduce the power consumption of the vehicle 100.
After the camera on the vehicle 100 for scanning the payment code is turned on, the vehicle 100 acquires the payment code by scanning an image with the camera. Alternatively, the vehicle 100 establishes a short-range communication connection with the collection device, and the vehicle 100 receives payment information sent by the collection device through the short-range communication connection.
After the vehicle 100 acquires the payment code or payment information, the vehicle 100 needs to obtain payment details through the payment code or payment information. Wherein the payment code may be in an image format and the payment information may be in a message format. Payment details include, but are not limited to: payment amount, commodity name, payee name, etc.
In the case where the vehicle 100 has the capability of parsing the payment code or payment information, i.e., the vehicle 100 can parse the image format or the message format, the vehicle 100 may directly obtain the payment details from the payment code or the payment information.
In the case where the vehicle 100 cannot parse the payment code or the payment information, the vehicle 100 may send the payment code or the payment information to the electronic device 200, and the electronic device 200 parses the image format or the message format to obtain the payment details. The electronic device 200 then sends the payment details back to the vehicle 100.
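The branch just described can be summarized by the following illustrative sketch; `vehicle.parse`, `phone.parse`, and `vehicle.receive_details` are hypothetical methods standing in for the parsing and transfer steps.

```python
def obtain_payment_details(vehicle, phone, payload):
    """Return the payment details (payment amount, commodity name, payee name, ...).

    `payload` is either an image-format payment code or message-format payment
    information; whether the vehicle parses it itself depends on its capability."""
    if vehicle.can_parse:
        return vehicle.parse(payload)      # the vehicle extracts the details itself

    # The vehicle cannot parse the payload: forward it to the connected
    # electronic device, which parses it and returns the payment details.
    details = phone.parse(payload)
    vehicle.receive_details(details)
    return details
```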
After the vehicle 100 acquires the payment details, in the case where the vehicle 100 has the permission to display sensitive information, the vehicle 100 may display a payment interface based on the payment details to prompt the vehicle user of the payment amount.
Optionally, before the vehicle 100 presents the payment interface, the vehicle 100 may obtain the user's authorization by voice, for example: "A payment has been identified. Continue?" If the user replies yes, the vehicle 100 may output a further voice prompt with the content of the payment details, for example, "You need to pay a parking fee of 15 yuan." After that, the vehicle 100 may output the voice "Please verify your identity." If the user replies no, the vehicle 100 may output the voice "Payment suspended."
Fig. 8A schematically illustrates a vehicle 100 displaying a payment interface.
As shown in fig. 8A, the vehicle 100 displays a payment interface 801 on the central control screen. The payment interface 801 includes information of the payment details. For example, in a gas-station fueling payment scenario, the payment details include, but are not limited to, the fueling amount, the fuel grade, the fuel gun number, and the like. The vehicle 100 may display the information of the payment details in the payment interface 801. For example, the fuel grade is 95#, the fuel gun number is 3, the fueling amount is 300 yuan, the discount amount is 6.5 yuan, and the amount actually to be paid is 293.5 yuan.
The vehicle 100 is not limited to the central control screen; the vehicle 100 may display the payment interface 801 at other locations of the vehicle 100, for example on the dashboard or a head-up display, so that the driver can see the payment interface 801.
Optionally, while the vehicle 100 displays the payment interface, the vehicle 100 may output the voice "The fuel grade is 95#, the fuel gun number is 3, the fueling amount is 300 yuan, the discount amount is 6.5 yuan, and the amount actually to be paid is 293.5 yuan", thereby prompting the user that the vehicle 100 is conducting a payment transaction.
After the vehicle 100 displays the payment interface, in the event that the vehicle 100 has payment capability, the vehicle 100 may prompt the user for identity authentication and complete the payment. The user may verify identity by one or more of making a sound, touching the screen, touching a button, entering a password, and providing physiological identification features such as a fingerprint, face, or eyes, to complete the payment.
After the vehicle 100 verifies the user's identity, the vehicle 100 may display a prompt 802, as shown in fig. 8B, for prompting the user that payment is complete. Alternatively, the vehicle 100 may prompt the user that payment is complete by voice, vibration, flashing lights, pictures, etc.
Optionally, after the vehicle 100 displays the payment interface, in the event that the vehicle 100 does not have payment capability but there is another electronic device that has established a communication connection with the vehicle 100, the vehicle 100 may send a verification instruction to the electronic device 200, the verification instruction being used to instruct the electronic device 200 to verify the identity of the user. After the electronic device 200 verifies the identity of the user, the electronic device 200 may send a verification passing instruction to the vehicle 100, and after the vehicle 100 receives the verification passing instruction sent by the electronic device 200, the vehicle 100 may display the prompt 802 shown in fig. 8B. The vehicle 100 may also prompt the user that payment is complete by voice, vibration, flashing lights, pictures, etc.
Optionally, after the vehicle 100 acquires the payment code or payment information, in the case that the vehicle 100 has no payment capability and there is no other electronic device that has established a communication connection with the vehicle 100, that is, the vehicle 100 cannot send a verification instruction to another electronic device, then, as shown in fig. 8C, the vehicle 100 may display a two-dimensional code based on the payment code or payment information, where the two-dimensional code is used to prompt the user to scan the two-dimensional code with another device to complete the payment.
In some embodiments, after the vehicle 100 scans the payment code, the payment code may be displayed directly on the central control screen, i.e., the payment code is the same as the two-dimensional code shown in fig. 8C. In other embodiments, after the vehicle 100 scans the payment code, the vehicle 100 may acquire the payment details from the payment code and regenerate a two-dimensional code based on the payment details, i.e., the payment code is different from the two-dimensional code shown in fig. 8C.
In some embodiments, after the vehicle 100 receives the payment information sent by the collection device, the vehicle 100 may obtain the payment details from the payment information, and the vehicle 100 may generate the two-dimensional code based on the payment details.
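The fallback of fig. 8C can be sketched as follows; whether the displayed two-dimensional code equals the scanned payment code depends on the embodiment, and `generate_qr` is a hypothetical placeholder rather than any specific QR library.

```python
def build_fallback_code(scanned_code_image=None, payment_details=None):
    """Choose what to show when the vehicle can neither pay nor reach a device."""
    if payment_details is None:
        # Embodiment 1: display the scanned payment code as-is on the screen.
        return scanned_code_image
    # Embodiment 2: regenerate a two-dimensional code from the payment details
    # (obtained from the payment code or from the collection device).
    payload = f"{payment_details['payee']}|{payment_details['amount']}"
    return generate_qr(payload)


def generate_qr(payload: str) -> str:
    """Placeholder for an actual two-dimensional code encoder."""
    return f"<qr:{payload}>"
```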
In this way, the vehicle 100 automatically acquires the payment code or the payment information, so that occupants such as the vehicle user and passengers can obtain the payment information and complete the payment without getting out of the vehicle or taking out a mobile phone to scan the payment code outside the vehicle. This improves the convenience of off-vehicle payment, improves the user experience, and also ensures the driving safety of the vehicle user.
In some embodiments, after the vehicle 100 obtains the payment details, in the case where the vehicle 100 does not have the permission to display sensitive information but there is another electronic device that has established a communication connection with the vehicle 100, the vehicle 100 may display a payment interface through the electronic device 200 with which the communication connection is established, to prompt the vehicle user of the payment amount.
Fig. 9A-9B schematically illustrate a schematic diagram of the electronic device 200 displaying a payment interface.
In the case where the vehicle 100 has the capability of parsing the payment code or payment information, i.e., the vehicle 100 can obtain the payment details directly from the payment code or payment information, but the vehicle 100 does not have the capability or permission to display them, the vehicle 100 may send the payment details to the electronic device 200. Alternatively, in the case where the vehicle 100 does not have the capability of parsing the payment code or payment information, the vehicle 100 sends the payment code or payment information to the electronic device 200, and the electronic device 200 parses the payment code or payment information to obtain the payment details.
After the electronic device 200 obtains the payment details, the electronic device 200 may display a payment interface based on the payment details.
As shown in fig. 9A, the electronic device 200 may display a payment interface 901. The payment interface 901 is similar to the payment interface 801 shown in fig. 8A; for details, reference may be made to the description of the payment interface 801, which is not repeated here.
Alternatively, the interface layout of the payment interface displayed on the electronic device 200 and the interface layout of the payment interface displayed on the vehicle 100 may also be different, which is not limited by the embodiment of the present application.
After the electronic device 200 displays the payment interface 901, in the case that the vehicle 100 has payment capability, the vehicle 100 may prompt the user for identity authentication and complete the payment. The user may verify identity by one or more of making a sound, touching the screen, touching a button, entering a password, and providing physiological identification features such as a fingerprint, face, or eyes, to complete the payment. After the vehicle 100 verifies that the user's identity passes, the vehicle 100 may send a verification passing instruction to the electronic device 200. After receiving the verification passing instruction, the electronic device 200 may display the prompt 802 shown in fig. 9B. The electronic device 200 may also prompt the user that payment is complete by voice, vibration, flashing lights, pictures, etc.
Optionally, after the vehicle 100 verifies that the user's identity passes, the vehicle 100 may prompt the user that payment is complete by voice, vibration, flashing lights, or pictures.
After the electronic device 200 displays the payment interface 901, in the event that the vehicle 100 has no payment capability, the vehicle 100 may verify the identity of the user through the electronic device 200. Specifically, the vehicle 100 may send a verification instruction to the electronic device 200, the verification instruction being used to instruct the electronic device 200 to verify the identity of the user. After the electronic device 200 verifies the identity of the user, the electronic device 200 may send a verification passing instruction to the vehicle 100, and after the vehicle 100 receives the verification passing instruction sent by the electronic device 200, the vehicle 100 may display the prompt 802 shown in fig. 8B. The vehicle 100 may also prompt the user that payment is complete by voice, vibration, flashing lights, pictures, etc.
In some embodiments, after the vehicle 100 obtains the payment details, in the case that the vehicle 100 does not have the permission to display sensitive information and there is no other electronic device that has established a communication connection with the vehicle 100, the vehicle 100 may not display any information, and may prompt the user to verify his or her identity and complete the payment by vibration, flashing lights, pictures, and the like.
Optionally, in some embodiments, after the vehicle 100 acquires the payment code or payment information, where the payment code or payment information is used to display a commodity purchase detail interface, the vehicle 100 may receive the user's input operation on the commodity purchase interface and then display the payment interface. Thus, for some self-service scenarios, such as a self-service gas station, a self-service charging station, or a self-service car wash, the vehicle 100 may first obtain the payment code or payment information by scanning, display the commodity details on the vehicle, and complete the payment on the vehicle after the user selects the commodity details. At present, the user needs to get out of the vehicle and complete payment with his or her own mobile phone before the goods or corresponding services are provided. The embodiment of the application enables the user to complete this series of operations through the head unit without getting out of the vehicle, improving the user experience.
Fig. 10A-10C illustrate schematic diagrams of another vehicle 100 displaying a payment interface.
The following examples of the application are presented by way of example of self-service gasoline stations.
The vehicle 100 scans the payment code through a camera on the vehicle, or the vehicle 100 establishes a short-range communication connection with the collection device and receives the payment information sent by the collection device.
After the vehicle 100 acquires the payment code or payment information, the vehicle 100 may display a user interface 1001 as shown in fig. 10A.
As shown in fig. 10A, information of a first gas station is shown on the user interface 1001. The information includes, but is not limited to, the address of the first gas station, the business hours of the first gas station, and the fueling activity information of the first gas station. The address of the first gas station may be "No. XX, XX Street, XX District, XX City, XX Province", the business hours of the first gas station may be "08:00-22:00", and the fueling activity information of the first gas station may be "discount on 92# gasoline", "gift with fueling", "free car wash", "no discount on 95# gasoline", etc. The vehicle 100 may highlight (e.g., darken) the display area where activity information with a larger discount is located. For example, the vehicle 100 may highlight the display areas of "discount on 92# gasoline", "gift with fueling", and "free car wash" in the fueling activity information of the first gas station.
In some embodiments, the vehicle 100 may not display the user interface 1001.
After the vehicle 100 has displayed the user interface 1001 for a certain time, or the vehicle 100 receives the user's input operation (e.g., a click) on the user interface 1001, or the vehicle 100 receives the user's voice operation, an input operation on a shortcut key, or the like, the vehicle 100 may display a user interface 1002 as shown in fig. 10B.
As shown in fig. 10B, the user interface 1002 shows fuel grades including "92#" gasoline, "95#" gasoline, "98#" gasoline, and "0#". The fuel gun numbers shown include the No. 1 through No. 15 fuel guns, and others. The user interface 1002 is used to prompt the user to select a fuel grade and a fuel gun number. The vehicle 100 receives the user's input operation (e.g., a click) on the user interface 1002 to select the fuel grade and the fuel gun number, or the vehicle 100 receives the user's voice operation, an input operation on a shortcut key, or the like to select the fuel grade and the fuel gun number. Illustratively, if the user selects 95# and the No. 3 fuel gun, the vehicle 100 may highlight (e.g., darken) the display area of 95# and the display area of the No. 3 fuel gun in the user interface 1002, thereby prompting the user which fuel grade and fuel gun number are selected.
After the fuel grade and the fuel gun number are selected, the vehicle 100 may display a user interface 1003 as shown in fig. 10C. The fueling amounts shown in the user interface 1003 include 100, 200, 300, 400, and 500. If none of these is the amount desired by the user, the user interface 1003 also shows a fueling amount input area, in which the user may input the fueling amount to be selected. For example, if the user selects a fueling amount of 300, the vehicle 100 may highlight (e.g., darken) the display area in the user interface 1003 where the fueling amount of 300 is located, thereby prompting the user which fueling amount is selected. As described above, the vehicle 100 may also receive the user's selection of the amount by voice or by an input operation on a shortcut key.
After the user confirms the payment amount (e.g., 300 yuan), in the event that the vehicle 100 has the capability to verify the user's identity, the vehicle 100 then verifies the user's identity and completes the payment. In the event that the vehicle 100 does not have the capability to verify the user's identity, the vehicle 100 may verify the user's identity and complete the payment through the electronic device 200. The details are similar to the foregoing embodiments and are not repeated here.
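A minimal sketch of the three-step selection shown in figs. 10A-10C (fuel grade, fuel gun number, fueling amount) and the subsequent verification hand-off is given below; all field names and values are illustrative assumptions rather than an interface defined by this application.

```python
def self_service_fueling_order(selection: dict, vehicle_can_verify: bool) -> str:
    """Assemble a fueling order from the user's selections and pick the verifier."""
    grade  = selection["fuel_grade"]     # e.g. "95#"
    gun    = selection["gun_number"]     # e.g. 3
    amount = selection["amount_yuan"]    # e.g. 300, or a user-entered amount

    verifier = "vehicle" if vehicle_can_verify else "connected electronic device"
    return (f"fuel grade {grade}, gun No. {gun}, amount {amount} yuan; "
            f"identity verified by the {verifier}")


print(self_service_fueling_order(
    {"fuel_grade": "95#", "gun_number": 3, "amount_yuan": 300},
    vehicle_can_verify=True))
```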
Fig. 10D shows a schematic diagram of yet another vehicle 100 displaying a payment interface.
A payment user interface 1004 is shown in fig. 10D; that is, the vehicle 100 may simultaneously display the content of the user interface 1001, the user interface 1002, and the user interface 1003 shown in figs. 10A-10C. For the description of the user interface 1004, reference may be made to the descriptions of the user interfaces 1001, 1002, and 1003 in figs. 10A-10C, which are not repeated here.
Fig. 10E-10G show schematic diagrams of a payment interface displayed on the electronic device 200.
As shown in fig. 10E, the electronic device 200 first displays a user interface 1005, which is similar to the user interface 1001 and is not described again here.
After the electronic device 200 displays the user interface 1005 for a certain time, or the electronic device 200 receives an input operation (e.g., a click) by the user with respect to the user interface 1005, or the electronic device 200 receives a voice operation by the user, etc., the electronic device 200 may display the user interface 1006 as shown in fig. 10F.
As shown in fig. 10F, fuel grade options and fuel gun number options are shown on the user interface 1006. For the fuel grade options and the fuel gun number options shown in the user interface 1006, reference may be made to the related description in the foregoing embodiment of fig. 10B, which is not repeated here. Upon receiving the user's selection of a fuel grade (e.g., "95#" gasoline) and a fuel gun number (e.g., the No. 3 fuel gun), the electronic device 200 may display a user interface 1007 as shown in fig. 10G.
As shown in fig. 10G, a fueling amount option is shown on the user interface 1007. For the fueling amount option shown on the user interface 1007, reference may be made to the related description in the foregoing embodiment of fig. 10C, which is not repeated here. After the user has selected the fueling amount, the electronic device 200 may receive the user's input operation (e.g., a single click) on a payment control on the user interface 1007, and in response to the input operation, the mobile phone may complete the payment. Alternatively, the user may select the amount by voice.
After the electronic device 200 displays the payment interface, if the vehicle 100 has payment capability, the vehicle 100 may prompt the user to verify his or her identity and complete the payment. In the event that the vehicle 100 does not have payment capability, the vehicle 100 may verify the user's identity and complete the payment through the electronic device 200. For details, reference may be made to the description of the foregoing embodiments, which is not repeated here.
It should be noted that the user interfaces shown in fig. 10A to 10G are only for explaining the present application, and the interface layout of the user interfaces may be different in practical applications, which is not limited in the embodiments of the present application.
The foregoing embodiments describe that, in the event that the vehicle 100 does not have the permission to display sensitive information or has no payment capability, the vehicle 100 may complete the payment by displaying a payment interface on, or verifying the user's identity through, the electronic device 200 with which a communication connection is established.
If there are a plurality of electronic devices that are simultaneously connected to the vehicle 100 in a communication manner, the vehicle 100 needs to determine one electronic device from the plurality of electronic devices to display a payment interface or verify the identity of the user to complete payment. For example, if a plurality of electronic devices are connected to the wireless network of the vehicle 100 at the same time, the vehicle 100 needs to determine one electronic device from the plurality of electronic devices to display the payment interface or verify the identity of the user to complete the payment.
The vehicle 100 may screen out one device from among a plurality of electronic devices that establish communication connection based on any one or several of the following.
Mode one: the vehicle 100 may screen out one electronic device with the same account number as the account number registered by the vehicle 100 from a plurality of electronic devices that establish communication connection.
Mode two: the vehicle 100 may screen out one electronic device having the largest number of times of establishing a connection with the vehicle 100 from among a plurality of electronic devices that establish communication connection.
Mode three: the vehicle 100 may screen out one electronic device having the longest time to establish a connection with the vehicle 100 from among a plurality of electronic devices that establish communication connection.
Mode four: the vehicle 100 may screen out one electronic device that first establishes a communication connection with the vehicle 100 from among a plurality of electronic devices that establish communication connections.
Mode five: the vehicle 100 may screen out, from the plurality of electronic devices that have established a communication connection, the electronic device to which the user pays the most attention.
The vehicle 100 may score the user's attention to the plurality of electronic devices based on the state of the vehicle 100, the device states of the plurality of electronic devices, the user state, and the like.
The attention of the user to the device can be obtained based on the following information: the working state of the electronic equipment and the state of the user detected by the electronic equipment. The operating state of the electronic device includes, but is not limited to: whether the electronic device is on, whether the electronic device is in an operational state (e.g., whether there is an image display or an audio output). The status of the user detected by the electronic device includes, but is not limited to: whether the user is looking at the electronic device or operating the electronic device.
The vehicle 100 may score the electronic device based on the status of the electronic device, resulting in a user's attention score to the electronic device.
Specifically, the vehicle 100 may divide the user's attention score for an electronic device into N levels. The lower the level, the less clear the user's attention to the electronic device. The higher the level, the clearer the user's attention to the electronic device, and the higher the user's attention to the electronic device. Illustratively, when the user's attention to the electronic device is at the first level, the attention score is 1/N; when the user's attention to the electronic device is at the second level, the attention score is 2/N.
Assuming N equals 3, in a vehicle fueling scenario, first-level scoring events include, but are not limited to: the electronic device has audio output; the attention score of a first-level scoring event is 0.3. Second-level scoring events include, but are not limited to: the electronic device has its screen on; the attention score of a second-level scoring event is 0.6. Third-level scoring events include, but are not limited to: the electronic device has its screen on and the screen remains on continuously.
It should be noted that, the vehicle 100 may also determine the attention score of the user to each electronic device based on other manners, which is not limited in the embodiment of the present application.
After the vehicle 100 obtains the attention score of the user for each electronic device, the payment interface may be displayed, or the user's identity verified, through the device to which the user pays the most attention (an illustrative sketch combining several of these screening manners is given after the modes below).
Mode six: the vehicle 100 may screen out, from the plurality of electronic devices that have established a communication connection, the electronic device whose position is closest to the vehicle 100.
The manner of selecting one device from the plurality of electronic devices that have established a communication connection is not limited to the first to sixth manners above; the vehicle 100 may also select one device based on other manners, which is not limited in the embodiment of the present application.
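As one possible (and purely illustrative) combination of the screening manners above, the sketch below first applies the account criterion of mode one and then falls back to the attention score of mode five, using the 0.3/0.6 level scores from the fueling example; the dictionary fields are assumptions of this sketch.

```python
def pick_payment_device(devices, vehicle_account: str):
    """Select one electronic device from several that are connected to the vehicle."""
    # Mode one: prefer a device logged in with the same account as the vehicle.
    same_account = [d for d in devices if d.get("account") == vehicle_account]
    if same_account:
        return same_account[0]

    # Mode five: otherwise pick the device with the highest attention score.
    def attention_score(device: dict) -> float:
        if device.get("screen_on"):      # second-level scoring event
            return 0.6
        if device.get("audio_output"):   # first-level scoring event
            return 0.3
        return 0.0

    return max(devices, key=attention_score) if devices else None
```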
Fig. 11 is a schematic flow chart of a payment method according to an embodiment of the present application.
S1101, the vehicle 100 acquires the payment code through a camera on the vehicle, or the vehicle 100 establishes short-distance communication connection with the collection device, and receives payment information sent by the collection device.
The vehicle 100 may include a plurality of cameras in different orientations, such as upward, downward, forward, rearward, leftward, rightward, in-vehicle, etc. This facilitates the vehicle 100 acquiring the payment code based on images scanned by cameras in different orientations.
Optionally, the camera on the vehicle 100 for scanning images and acquiring the payment code may be started at a suitable time to capture images and thereby acquire the payment code. For example, when the vehicle 100 determines, based on one or more of the vehicle status, the vehicle location, and the environmental information, that it is in a payment scenario, the vehicle 100 may automatically turn on the camera for scanning images and acquiring the payment code. Alternatively, the vehicle 100 turns on the camera for scanning images and acquiring the payment code in response to the user's operation.
Optionally, the vehicle 100 is provided with a plurality of cameras in different orientations for scanning images and acquiring the payment code. When scanning images and acquiring the payment code, the camera in a certain orientation can be turned on without turning on the cameras in all orientations, which saves power consumption of the vehicle 100. For example, the vehicle user may determine by visual inspection or the like that the payment code is located in a first orientation (e.g., on the left) of the vehicle 100, and the vehicle user may control the vehicle 100 by voice or a shortcut key to turn on the camera in the first orientation. If the payment code is not scanned within a certain time after the camera in the first orientation is turned on, the vehicle 100 may prompt the vehicle user by text, voice, pictures, vibration, flashing lights, etc. that the payment code cannot be scanned. The vehicle user may then control the vehicle 100 by voice or a shortcut key to turn on the camera in a second orientation. In this way, cameras in different orientations can be turned on in stages until the vehicle 100 acquires the payment code.
Optionally, there may be a plurality of cameras in one orientation. The vehicle user may first turn on one of the cameras in the first orientation to start scanning images and acquiring the payment code. If that camera in the first orientation does not scan the payment code within a certain time, the vehicle user may control another camera in the first orientation to be turned on, or the vehicle 100 automatically turns on another camera in the first orientation, until all cameras in the first orientation have been turned on. If the payment code is still not acquired after all cameras in the first orientation have been turned on, the vehicle 100 may prompt the user to turn on the cameras in other orientations. In this way, different cameras in the same orientation can be turned on in stages until the vehicle 100 acquires the payment code.
It should be noted that the method is not limited to turning on one camera at a time; a plurality of cameras in one orientation may be turned on at a time, or a plurality of cameras in different orientations may be turned on at a time, which is not limited in the embodiment of the present application.
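The staged strategy described above can be sketched as follows; the `scan_payment_code()` method on each camera object and the 5-second per-camera timeout are assumptions made for illustration.

```python
import time


def staged_scan(cameras_by_orientation: dict, first_orientation: str,
                timeout_s: float = 5.0):
    """Open cameras in stages until a payment code is found, or return None.

    `cameras_by_orientation` maps an orientation (e.g. "left") to a list of
    camera objects whose scan_payment_code() returns the code or None."""
    orientations = [first_orientation] + [
        o for o in cameras_by_orientation if o != first_orientation]

    for orientation in orientations:
        for camera in cameras_by_orientation.get(orientation, []):
            deadline = time.monotonic() + timeout_s
            while time.monotonic() < deadline:
                code = camera.scan_payment_code()
                if code is not None:
                    return code
                time.sleep(0.2)   # poll at a modest rate to limit power use
            # This camera timed out; the vehicle may prompt the user here
            # before moving on to the next camera or orientation.
    return None
```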
In some embodiments, the vehicle 100 may also recognize the payment code through a camera that is not dedicated to scanning images and acquiring the payment code (e.g., a driving recorder), so that the vehicle 100 can acquire the payment code without turning on the camera dedicated to scanning images and acquiring the payment code.
The above-described manner of acquiring the payment code is merely for explaining the present application, and the vehicle 100 may acquire the payment code based on other manners, which is not limited in the embodiment of the present application.
Optionally, before the vehicle 100 establishes the short-range communication connection with the collection device, when the vehicle 100 is within a preset distance of the collection device, the collection device may send a connection establishment request to the vehicle 100 for requesting to establish a short-range communication connection with the vehicle 100. After the vehicle 100 receives the connection establishment request sent by the collection device, the vehicle 100 may display a prompt message for asking whether a communication connection with the collection device is required. After the user confirms, the vehicle 100 may establish the communication connection with the collection device.
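A minimal sketch of this optional handshake is given below, assuming a distance check and a user confirmation callback; the function and parameter names are illustrative only.

```python
def handle_connection_request(distance_m: float, preset_distance_m: float,
                              user_confirms) -> bool:
    """Establish the short-range connection only within range and after consent.

    `user_confirms` is a callable standing in for the on-screen prompt."""
    if distance_m > preset_distance_m:
        return False                      # out of range: no request is handled
    # The vehicle receives the request and shows a prompt to the user.
    if user_confirms("Establish a communication connection with the collection device?"):
        return True                        # short-range connection established
    return False


# Example: within 5 m (assumed preset distance) and the user agrees.
print(handle_connection_request(3.0, 5.0, lambda msg: True))
```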
In this way, the driver and passengers, such as the vehicle user, can complete payment without getting out of the vehicle or holding a mobile phone out of the vehicle to scan the payment code. This improves the convenience of outfield payment, improves the user experience, and also avoids the driving-safety risk for the in-vehicle user.
S1102, is the vehicle 100 capable of parsing the payment code and the payment information?
In the case where the vehicle 100 has the capability to parse the payment code and the payment information, the vehicle 100 performs S1103.
In the case where the vehicle 100 does not have the capability to parse the payment code and the payment information, the vehicle 100 performs S1104 to S1106.
S1103, the vehicle 100 acquires payment details based on the payment code or the payment information.
Alternatively, the payment code may be in an image format and the payment information may be in a message format. In the case where the vehicle 100 has the capability to parse the image or parse the message, the vehicle 100 may obtain the payment details based on the payment code or the payment information.
The payment details include, but are not limited to: the payment amount, the commodity name, the payee name, and the like.
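For illustration, the payment details could be carried in a small structure like the one below; the field names and types are assumptions, as the application does not prescribe a concrete data format.

from dataclasses import dataclass
from decimal import Decimal


@dataclass
class PaymentDetails:
    """Illustrative container for the payment details listed above."""
    amount: Decimal        # payment amount
    commodity_name: str    # name of the goods or service
    payee_name: str        # name of the payee / merchant


example = PaymentDetails(Decimal("35.00"), "Car wash", "Example Car Care")
print(example)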
S1104, the vehicle 100 transmits the payment code and the payment information to the electronic device 200.
S1105, the electronic device 200 acquires payment details based on the payment code or the payment information.
S1106, the electronic device 200 transmits the payment details to the vehicle 100.
In the case where the vehicle 100 does not have the capability to parse the image or parse the message, the vehicle 100 may parse the payment code or the payment information via the electronic device 200 with which it has established a communication connection, and thereby obtain the payment details.
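Steps S1102 to S1106 amount to a parse-or-delegate decision, sketched below with stand-in Vehicle and Phone classes; the class names, the payload dictionary, and the parsing logic are illustrative assumptions, not the application's interfaces.

class Phone:
    """Stand-in for the electronic device 200; here it can always parse."""

    def parse_for_vehicle(self, payload: dict) -> dict:
        # S1105: the phone parses the payload and returns the payment details.
        return {"amount": payload["amount"],
                "commodity_name": payload["item"],
                "payee_name": payload["payee"]}


class Vehicle:
    """Stand-in for the vehicle 100 with an optional local parser."""

    def __init__(self, can_parse: bool, phone: Phone):
        self.can_parse = can_parse
        self.phone = phone

    def parse_locally(self, payload: dict) -> dict:
        # S1103: the vehicle parses the payment code / payment information itself.
        return {"amount": payload["amount"],
                "commodity_name": payload["item"],
                "payee_name": payload["payee"]}

    def get_payment_details(self, payload: dict) -> dict:
        if self.can_parse:                            # S1102: parsing capability?
            return self.parse_locally(payload)        # S1103
        return self.phone.parse_for_vehicle(payload)  # S1104 -> S1106


if __name__ == "__main__":
    payload = {"amount": "12.50", "item": "Parking fee", "payee": "Lot A"}
    print(Vehicle(can_parse=False, phone=Phone()).get_payment_details(payload))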
S1107, is the vehicle 100 authorized to display sensitive information?
In the case where the vehicle 100 has the authority to display sensitive information, the vehicle 100 performs S1108.
In the case where the vehicle 100 does not have the authority to display sensitive information, the vehicle 100 performs S1109 to S1110.
S1108, the vehicle 100 displays a payment interface based on the payment details.
For details, reference may be made to the embodiments shown in fig. 10A to 10D, which are not described again here.
S1109, the vehicle 100 transmits a presentation notification to the electronic device 200.
The presentation notification is used to instruct the electronic device 200 to display a payment interface based on the payment details.
S1110, the electronic device 200 displays a payment interface based on the payment details.
For details, reference may be made to the embodiments shown in fig. 10E to 10G, which are not described again here.
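Steps S1107 to S1110 reduce to a single display-routing decision; a minimal sketch follows, where the two display callbacks are hypothetical stand-ins for the central control screen and the presentation notification sent to the electronic device 200.

def show_payment_interface(vehicle_can_show_sensitive: bool,
                           show_on_vehicle, notify_phone,
                           payment_details: dict) -> None:
    """Render the payment interface on the vehicle only when it is authorized to
    display sensitive information; otherwise send a presentation notification so
    the connected electronic device displays the interface instead."""
    if vehicle_can_show_sensitive:   # S1107 -> S1108
        show_on_vehicle(payment_details)
    else:                            # S1109 -> S1110
        notify_phone(payment_details)


if __name__ == "__main__":
    details = {"amount": "12.50", "commodity_name": "Parking fee", "payee_name": "Lot A"}
    show_payment_interface(
        vehicle_can_show_sensitive=False,
        show_on_vehicle=lambda d: print("[vehicle screen]", d),
        notify_phone=lambda d: print("[presentation notification -> phone]", d),
        payment_details=details,
    )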
S1111, does the vehicle 100 have payment capability?
In the case where the vehicle 100 has the payment capability, the vehicle 100 executes S1112.
In the case where the vehicle 100 has no payment capability, the vehicle 100 executes S1113.
S1112, the vehicle 100 verifies the identity of the paying user and completes the payment.
S1113, is there an electronic device that has established a communication connection with the vehicle 100?
In the case where there is an electronic device that has established a communication connection with the vehicle 100, the vehicle 100 performs S1114 to S1115.
In the case where there is no electronic device that has established a communication connection with the vehicle 100, the vehicle 100 performs S1116.
S1114, the vehicle 100 sends an identity verification notification to the electronic device 200.
S1115, after receiving the identity verification notification, the electronic device 200 verifies the identity of the paying user and completes the payment.
Optionally, after the electronic device 200 completes the payment, the electronic device 200 may send a payment complete notification to the vehicle 100 to prompt the user that the payment has been completed.
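Steps S1114 and S1115, together with the optional payment-complete notification, can be sketched as follows; the verification and notification callbacks are hypothetical placeholders for the real identity check and the message back to the vehicle.

def pay_via_phone(verify_and_pay, notify_vehicle) -> None:
    """The electronic device verifies the paying user's identity, completes the
    payment, and then optionally tells the vehicle so it can prompt the user."""
    if verify_and_pay():                      # e.g. fingerprint, face, or PIN check
        notify_vehicle("Payment completed")   # optional payment-complete notification


if __name__ == "__main__":
    pay_via_phone(
        verify_and_pay=lambda: True,  # stand-in for the real identity verification
        notify_vehicle=lambda msg: print("[vehicle prompt]", msg),
    )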
S1116, the vehicle 100 displays the two-dimensional code.
In the case where there is no electronic device that has established a communication connection with the vehicle 100, the vehicle 100 may display a two-dimensional code based on the payment code or the payment information. The two-dimensional code prompts the user to scan it with another device to complete the payment.
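The whole of S1111 to S1116 is a three-way routing decision, sketched below; the three callbacks are hypothetical stand-ins for paying on the vehicle, handing off to the connected electronic device, and displaying a scannable code.

def complete_payment(vehicle_can_pay: bool, phone_connected: bool,
                     pay_on_vehicle, pay_on_phone, show_qr_code) -> None:
    """Pay on the vehicle when it has payment capability, otherwise delegate to a
    connected electronic device, and fall back to showing a two-dimensional code
    when no such device is connected."""
    if vehicle_can_pay:        # S1111 -> S1112
        pay_on_vehicle()
    elif phone_connected:      # S1113 -> S1114 / S1115
        pay_on_phone()
    else:                      # S1116
        show_qr_code()


if __name__ == "__main__":
    complete_payment(
        vehicle_can_pay=False,
        phone_connected=False,
        pay_on_vehicle=lambda: print("vehicle verifies identity and pays"),
        pay_on_phone=lambda: print("phone verifies identity and pays"),
        show_qr_code=lambda: print("vehicle shows a two-dimensional code on its screen"),
    )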
In some embodiments, after the vehicle 100 scans the payment code, the payment code may be displayed directly on the central control screen, i.e., the payment code is the same as the two-dimensional code shown in fig. 8C. In other embodiments, after the vehicle 100 scans the payment code, the vehicle 100 may acquire payment details from the payment code, and the vehicle 100 regenerates the two-dimensional code based on the payment details, that is, the payment code is different from the two-dimensional code shown in fig. 8C.
In some embodiments, after the vehicle 100 receives the payment information sent by the checkout device, the vehicle 100 may obtain payment details from the payment information, and the vehicle 100 may generate the two-dimensional code based on the payment details.
In particular, reference may be made to the embodiment shown in fig. 8C.
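Regenerating a scannable code from the parsed payment details could look like the sketch below, assuming the third-party qrcode package (with Pillow) is installed; the encoded text layout is purely illustrative and not defined by the application.

import qrcode

details = {"amount": "12.50", "commodity_name": "Parking fee", "payee_name": "Lot A"}
payload = f"pay?amount={details['amount']}&payee={details['payee_name']}"

img = qrcode.make(payload)   # build the two-dimensional code image in memory
img.save("payment_qr.png")   # a real vehicle would render it on the central control screen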
Fig. 12 is a flow chart of a payment method according to an embodiment of the present application.
S1201, the vehicle scans the image through a camera on the vehicle to obtain a payment code, or the vehicle receives payment information sent by the collection device.
In one possible implementation manner, the vehicle receives the payment information sent by the collection device, which specifically includes: the vehicle establishes a short-range communication connection with the collection device and receives the payment information sent by the collection device. The short-range communication connection includes: a Bluetooth communication connection, a Wi-Fi communication connection, or a near field communication (NFC) communication connection.
S1202, the vehicle acquires payment details based on the payment code or the payment information, where the payment details include one or more of the following: payment amount, commodity name, payee name.
In one possible implementation manner, the vehicle obtains the payment details based on the payment code or the payment information, which specifically includes: in the case where the vehicle has parsing capability, the vehicle parses the payment code or the payment information to obtain the payment details; in the case where the vehicle does not have parsing capability, the vehicle parses the payment details from the payment code or the payment information via the first electronic device and receives the payment details sent by the first electronic device.
In one possible implementation, after the vehicle obtains the payment details based on the payment code or the payment information and before the vehicle completes payment based on the payment details via the first electronic device, the method further includes: in the case where the vehicle has the authority to display sensitive information, the vehicle displays a payment interface based on the payment details; in the case where the vehicle does not have the authority to display sensitive information, the vehicle displays the payment interface via the first electronic device.
The payment interface displayed by the vehicle may be the user interface shown in fig. 8A-8B, or the user interface shown in fig. 10A-10D.
The payment interface displayed by the first electronic device may be the user interface shown in fig. 9A-9B, or the user interface shown in fig. 10E-10G.
S1203, the vehicle completes the payment based on the payment details via the first electronic device.
The vehicle may be the vehicle 100 and the first electronic device may be the electronic device 200.
In one possible implementation, the first electronic device is an electronic device that logs in to the same account as the vehicle. Because the vehicle and the electronic device are trusted devices of each other, the reliability of the collaborative payment process is ensured.
In one possible implementation, the vehicle completes the payment based on the payment details via the first electronic device, which specifically includes: in the case where the vehicle has established a communication connection with the first electronic device, the vehicle completes the payment based on the payment details via the first electronic device.
In one possible implementation, in the event that the vehicle does not establish a communication connection with the first electronic device, the vehicle displays a first image based on the payment code or payment information, the first image being used to prompt the user to scan the first image through the electronic device and complete the payment.
The first image may be the user interface shown in fig. 8C.
For example, the first image may be a two-dimensional code. In this way, the vehicle displays the first image to prompt the user to scan it with another electronic device and complete the payment. The first image is not limited to a two-dimensional code; it may be any other image that can be scanned to obtain the payment information.
In this way, the vehicle can complete payment through the electronic device without first determining whether the vehicle itself has payment capability.
According to the payment method provided by the first aspect, the vehicle and the electronic device cooperate to complete the outfield payment of the in-vehicle user, which improves the convenience of outfield payment and improves the user experience.
In one possible implementation manner, the vehicle completes payment based on the payment details through the first electronic device, and specifically includes: in the event that the vehicle does not have payment capabilities, the vehicle completes payment based on the payment details via the first electronic device.
In this way, after the vehicle acquires the payment details, it may determine whether it has payment capability, and in the case where it does not, the vehicle completes the payment through the electronic device.
In one possible implementation, the method further includes: in the case of a vehicle having payment capability, the vehicle completes payment based on the payment details.
In this way, after the vehicle acquires the payment details, it may determine whether it has payment capability, and in the case where it does, the vehicle can complete the payment directly without relying on another electronic device.
In one possible implementation, the cameras on the vehicle include cameras in a plurality of orientations, each orientation including one or more cameras, wherein a first orientation includes a first camera and a second camera. The vehicle obtaining the payment code by scanning an image with a camera on the vehicle specifically includes: the vehicle scans images through the first camera and the second camera in the first orientation and obtains the payment code, wherein the first camera is different from the second camera.
In this way, a plurality of different cameras may be included in the same orientation, and they may be turned on simultaneously or in stages. Scanning images with a plurality of different cameras in one orientation can improve the accuracy with which the payment code is acquired from the scanned images.
In one possible implementation, the vehicle scanning an image through the first camera and the second camera in the first orientation and obtaining the payment code specifically includes: the vehicle turns on the first camera in the first orientation and scans images through the first camera; in the case where the first camera does not acquire the payment code within a first time, the vehicle turns on the second camera in the first orientation and scans images through the second camera to acquire the payment code.
In this way, a plurality of different cameras in the same orientation can be turned on in stages, avoiding the increased power consumption caused by turning them all on simultaneously.
In one possible implementation, the cameras on the vehicle include cameras in a plurality of orientations, the plurality of orientations including a first orientation and a second orientation. The vehicle obtaining the payment code by scanning an image with a camera on the vehicle specifically includes: the vehicle scans images through the camera in the first orientation and the camera in the second orientation and obtains the payment code, wherein the first orientation is different from the second orientation.
In this way, the vehicle may include cameras in a plurality of different orientations, and they may be turned on simultaneously or in stages. Scanning images with cameras in different orientations can improve the accuracy with which the payment code is acquired from the scanned images.
In one possible implementation, the vehicle scanning images through the camera in the first orientation and the camera in the second orientation and obtaining the payment code specifically includes: the vehicle turns on the camera in the first orientation and scans images through it; in the case where the camera in the first orientation does not acquire the payment code within a first time, the vehicle turns on the camera in the second orientation and acquires the payment code by scanning images through it.
In this way, cameras in a plurality of different orientations can be turned on in stages, avoiding the increased power consumption caused by turning them all on simultaneously.
In one possible implementation, in the case where the vehicle obtains the payment code by scanning an image with a camera on the vehicle, before the vehicle obtains the payment code, the method further includes: the vehicle turns on a camera on the vehicle based on one or more of the vehicle state, the vehicle position, and environmental information.
In this way, the vehicle can automatically turn on a camera to scan images and acquire the payment code, reducing user operations. For example, the vehicle may automatically turn on the camera in the first orientation and/or the camera in the second orientation, or automatically turn on the first camera and/or the second camera in the first orientation.
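An automatic trigger of this kind could be as simple as the sketch below; the particular conditions (stopped near a payment point of interest, or a barrier detected by the sensors) are illustrative assumptions rather than conditions specified by the application.

def should_auto_open_camera(vehicle_state: str,
                            near_payment_poi: bool,
                            barrier_detected: bool) -> bool:
    """Combine vehicle state, vehicle position, and environmental information
    into a single decision on whether to open the scanning camera."""
    stopped = vehicle_state in ("parked", "stopped")
    return stopped and (near_payment_poi or barrier_detected)


if __name__ == "__main__":
    # e.g. the vehicle has stopped at a toll lane whose barrier the sensors detected
    print(should_auto_open_camera("stopped", near_payment_poi=False, barrier_detected=True))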
In one possible implementation, in the case where the vehicle obtains the payment code by scanning an image with a camera on the vehicle, before the vehicle obtains the payment code, the method further includes: the vehicle receives and responds to a first operation of the user and turns on a camera on the vehicle.
The first operation may be a voice operation, an operation on a shortcut key, or the like.
The first operation may be used to turn on the camera in the first orientation and/or the camera in the second orientation on the vehicle, or to turn on the first camera and/or the second camera in the first orientation.
The embodiments of the present application may be arbitrarily combined to achieve different technical effects.
In the above embodiments, the implementation may be wholly or partly realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Drive (SSD)), among others.
Those of ordinary skill in the art will appreciate that all or part of the processes in the above-described method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above-described method embodiments. The aforementioned storage medium includes: a ROM, a random access memory (RAM), a magnetic disk, an optical disc, or the like.
In summary, the foregoing description is only of exemplary embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made according to the disclosure of the present invention shall be included in the protection scope of the present invention.

Claims (18)

1. A method of payment, the method comprising:
The vehicle obtains a payment code through scanning an image by a camera on the vehicle, or the vehicle receives payment information sent by a collection device;
The vehicle obtains payment details based on the payment code or the payment information, the payment details including one or more of: payment amount, commodity name, payee name;
the vehicle completes payment based on the payment details through a first electronic device.
2. The method according to claim 1, characterized in that the vehicle completes the payment by means of a first electronic device based on the payment details, in particular comprising:
In the event that the vehicle has no payment capabilities, the vehicle completes payment based on the payment details via a first electronic device.
3. The method according to claim 2, wherein the method further comprises:
in the case where the vehicle has payment capability, the vehicle completes payment based on the payment details.
4. A method according to any one of claims 1 to 3, wherein the cameras on the vehicle comprise cameras of a plurality of orientations, each orientation comprising one or more cameras, wherein a first orientation comprises a first camera and a second camera; the vehicle obtains a payment code by scanning an image through a camera on the vehicle, and specifically comprises the following steps:
The vehicle scans images through the first camera and the second camera in the first direction and acquires the payment code;
Wherein the first camera is different from the second camera.
5. The method of claim 4, wherein the vehicle scans images through the first camera and the second camera in the first orientation and obtains the payment code, specifically comprising:
the vehicle starts the first camera in a first direction, and images are scanned through the first camera;
And under the condition that the first camera does not acquire the payment code in the first time, the vehicle starts the second camera in the first direction, and the payment code is acquired through the scanning image of the second camera.
6. The method of any one of claims 1-5, wherein the camera on the vehicle comprises a plurality of orientations of the camera, the plurality of orientations comprising a first orientation and a second orientation; the vehicle obtains a payment code by scanning an image through a camera on the vehicle, and specifically comprises the following steps:
the vehicle scans images through the camera in the first direction and the camera in the second direction and acquires the payment code;
wherein the first orientation is different from the second orientation.
7. The method of claim 6, wherein the vehicle scans images through the camera in the first orientation and the camera in the second orientation and obtains the payment code, specifically comprising:
the vehicle starts a camera in the first direction, and images are scanned through the camera in the first direction;
Under the condition that the camera in the first direction does not acquire the payment code in the first time, the vehicle starts the camera in the second direction, and the payment code is acquired through the scanning image of the camera in the second direction.
8. The method of any one of claims 1-7, wherein in the event that the vehicle obtains a payment code via a camera-scanned image on the vehicle, the method further comprises, prior to the vehicle obtaining a payment code via the camera-scanned image on the vehicle:
the vehicle receives and responds to a first operation of a user, and a camera on the vehicle is started.
9. The method of any one of claims 1-7, wherein in the event that the vehicle obtains a payment code via a camera-scanned image on the vehicle, the method further comprises, prior to the vehicle obtaining a payment code via the camera-scanned image on the vehicle:
The vehicle starts a camera on the vehicle based on one or more of the vehicle state, the vehicle position and the environment information.
10. The method according to any one of claims 1 to 9, wherein the vehicle receives payment information sent by a collection device, in particular comprising:
The vehicle establishes short-distance communication connection with the collection equipment and receives the payment information sent by the collection equipment;
The short-range communication connection includes: Bluetooth communication connection, Wi-Fi communication connection, or near field communication (NFC) communication connection.
11. The method of any one of claims 1-10, wherein the first electronic device is an electronic device that logs into the same account as the vehicle.
12. The method according to any one of claims 1-11, characterized in that the vehicle completes payment by means of a first electronic device based on the payment details, in particular comprising:
And under the condition that the vehicle establishes communication connection with the first electronic equipment, the vehicle completes payment based on the payment details through the first electronic equipment.
13. The method according to claim 12, wherein the method further comprises:
And under the condition that the vehicle does not establish communication connection with the first electronic device, the vehicle displays a first image based on the payment code or the payment information, and the first image is used for prompting a user to scan the first image through the electronic device and complete payment.
14. The method according to any one of claims 1-13, wherein the vehicle obtains payment details based on a payment code or payment information, in particular comprising:
in the case where the vehicle has parsing capability, the vehicle parses the payment details from the payment code or the payment information;
in the case where the vehicle does not have parsing capability, the vehicle parses the payment details from the payment code or the payment information via the first electronic device, and receives the payment details sent by the first electronic device.
15. The method of any of claims 1-14, wherein after the vehicle obtains payment details based on a payment code or payment information, before the vehicle completes payment based on the payment details via a first electronic device, the method further comprises:
in the case where the vehicle has the authority to display sensitive information, the vehicle displays a payment interface based on the payment details;
and in the case where the vehicle does not have the authority to display sensitive information, the vehicle displays the payment interface via the first electronic device.
16. A vehicle, characterized in that the vehicle comprises: one or more processors, one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories for storing computer program code comprising computer instructions that the one or more processors invoke to cause the vehicle to perform the method of any of the above claims 1-15.
17. A computer readable storage medium having instructions stored therein which, when run on a vehicle, cause the vehicle to perform the method of any of claims 1-15.
18. A computer program product, characterized in that the computer program product, when executed by a vehicle, causes the vehicle to perform the method of any of claims 1-15.