CN115545812A - Equipment control method and related device - Google Patents

Equipment control method and related device

Info

Publication number
CN115545812A
CN115545812A (application CN202110738497.7A)
Authority
CN
China
Prior art keywords
electronic device
information
passenger
driver
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110738497.7A
Other languages
Chinese (zh)
Inventor
李帅
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110738497.7A priority Critical patent/CN115545812A/en
Priority to PCT/CN2022/101471 priority patent/WO2023274136A1/en
Publication of CN115545812A publication Critical patent/CN115545812A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0633Lists, e.g. purchase orders, compilation or processing
    • G06Q30/0635Processing of requisition or of purchase orders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/40Business processes related to the transportation industry
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Physics & Mathematics (AREA)
  • Development Economics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Primary Health Care (AREA)
  • Tourism & Hospitality (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

Disclosed is a device control method, comprising: a first electronic device and a second electronic device establish a ride-order association; the first electronic device obtains first information and, based on the first information, invokes a vehicle-mounted electronic device to perform a first operation, where the first operation may be, for example, displaying a first color or a first pattern, or outputting preset audio. In this way, the user of the second electronic device can confirm the position of the first electronic device, so that the two devices' users can identify each other quickly. In an embodiment of the application, the first electronic device is a driver-side device, the second electronic device is a passenger-side device, and the vehicle-mounted electronic device is a display or audio device on the vehicle; through the first operation performed by the vehicle, the passenger can quickly pick out the vehicle matched to them, improving the efficiency with which passengers identify their drivers and enabling a faster pickup.

Description

Equipment control method and related device
Technical Field
The present application relates to the field of electronic technologies, and in particular, to an apparatus control method and a related device.
Background
Hailing a ride through a ride-hailing app is now a common way to take a taxi: a passenger places an order in the app, the ride-hailing server pushes the passenger's request to the driver-side app, and when a driver accepts the order, a pairing between driver and passenger is established. The server then provides the passenger's location to the driver, and provides the driver's information and vehicle information (such as the vehicle's color and license plate) to the passenger; when the driver arrives at the passenger's location, the passenger can find the target vehicle from the driver's vehicle information and board it.
In practice, however, when there are many vehicles on the road, or oncoming vehicles have their high beams on, it is difficult for a passenger to make out each vehicle's license plate at night and to tell which vehicle belongs to their driver. In heavy rain with dense traffic the problem is even worse, leading to the awkward situation in which the ride-hailing driver cannot find the passenger and the passenger cannot find the driver, wasting both parties' time.
Therefore, how to improve the efficiency with which a ride-hailing driver and a passenger identify each other is a problem being studied by those skilled in the art.
Disclosure of Invention
The embodiments of the present application provide a device control method and a related device, which can improve the efficiency with which a ride-hailing driver and a passenger identify each other.
In a first aspect, the present application provides a device control method, in which a first electronic device establishes a connection with a vehicle-mounted electronic device, the method comprising: the first electronic device establishes a ride-order association with a second electronic device; the first electronic device acquires first information from a server, where the first information is used to trigger the vehicle-mounted electronic device to perform a first operation, and the first operation comprises at least one of a display operation, a lighting operation, or an audio playback operation; and the first electronic device controls the vehicle-mounted electronic device to perform the first operation based on the first information.
In this embodiment, the first electronic device is a driver-side device and the second electronic device is a passenger-side device; the driver-side device is connected to the vehicle-mounted electronic device and can invoke its hardware. The first electronic device and the second electronic device establish a ride-order association, where the ride order includes information relevant to both devices, such as the passenger's position, the driver's position, the pickup time, and the pickup location. After the association is established, the first electronic device obtains the first information and, based on it, invokes the vehicle-mounted electronic device to perform the first operation, which may be, for example, displaying a first color or a first pattern, or outputting preset audio. In this way, the user of the second electronic device can confirm the position of the first electronic device and quickly identify its user.
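The driver-side flow just described can be sketched as follows. This is an illustrative model only: the patent specifies no concrete API, so all class names, fields, and message shapes (`FirstInfo`, `VehicleElectronics`, `DriverDevice`) are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FirstInfo:
    """The 'first information' delivered by the server (hypothetical shape)."""
    operation: str                      # "display" | "light" | "audio"
    color: Optional[str] = None
    pattern: Optional[str] = None
    audio_clip: Optional[str] = None

class VehicleElectronics:
    """Stand-in for the vehicle-mounted electronic device (e.g. headlights)."""
    def __init__(self):
        self.log = []

    def perform(self, info: FirstInfo):
        if info.operation == "light":
            self.log.append(f"headlights show {info.color}")
        elif info.operation == "display":
            self.log.append(f"display shows {info.pattern}")
        elif info.operation == "audio":
            self.log.append(f"play {info.audio_clip}")

class DriverDevice:
    """First electronic device: holds the ride-order association and is
    connected to the vehicle-mounted electronics."""
    def __init__(self, vehicle: VehicleElectronics):
        self.vehicle = vehicle
        self.order = None

    def associate(self, order_id: str):
        self.order = order_id           # ride-order association established

    def on_first_info(self, info: FirstInfo):
        # Control the vehicle-mounted device based on the first information,
        # but only while a ride-order association exists.
        if self.order is not None:
            self.vehicle.perform(info)

vehicle = VehicleElectronics()
driver = DriverDevice(vehicle)
driver.associate("order-001")
driver.on_first_info(FirstInfo(operation="light", color="red"))
print(vehicle.log[0])  # headlights show red
```

The guard on `self.order` mirrors the claim structure: the first operation is only triggered for a device that has established the ride-order association.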
In some scenarios, the passenger-side device (second electronic device) places an order in the ride-hailing application and the driver-side device (first electronic device) accepts it, establishing a ride-order association between the two devices; after the passenger boards, the driver and passenger confirm each other, and either one can indicate that the pickup is complete by tapping a designated button in the application. Before the passenger boards, the driver-side device may instruct the headlights (vehicle-mounted electronic device) to display red (the first operation) so that the passenger can recognize the driver-side device's vehicle, enabling quick mutual identification of the passenger-side and driver-side devices.
In some possible implementations, the first electronic device controlling the vehicle-mounted electronic device to perform the first operation based on the first information includes: upon detecting that the first electronic device satisfies a first condition, the first electronic device controls the vehicle-mounted electronic device to perform the first operation based on the first information. That is, after acquiring the first information from the server, the first electronic device controls the vehicle-mounted electronic device to perform the first operation only when it detects that the first condition is satisfied.
In some possible implementations, the first condition includes the position of the first electronic device reaching the pickup location associated with the ride order. When the first electronic device detects that its position has reached the pickup location, it controls the vehicle-mounted electronic device to perform the first operation, so that the user of the second electronic device can recognize the position of the vehicle-mounted electronic device, and thereby the position of the first electronic device.
In some possible implementations, the first condition includes the current time reaching a pickup time associated with the ride order. The pickup time may be a time preset by the second electronic device, or a time at which the first electronic device estimates, based on the pickup location, that it will reach the second electronic device's pickup location. When the first electronic device detects that its time has reached the pickup time, it controls the vehicle-mounted electronic device to perform the first operation, so that the user of the second electronic device can recognize the position of the vehicle-mounted electronic device, and thereby the position of the first electronic device. The time of the first electronic device may be the time displayed on the device, or standard local time obtained through real-time networking.
In some possible implementations, the first condition includes the expected remaining time for the first electronic device to reach the pickup location associated with the ride order falling within a preset value. The first electronic device may determine in real time the remaining time to reach the pickup location based on the distance between its position and the pickup location, and when it detects that it is about to arrive (for example, when the expected remaining time is one minute), it may control the vehicle-mounted electronic device to perform the first operation, so that the user of the second electronic device can recognize the position of the vehicle-mounted electronic device, and thereby the position of the first electronic device.
In some possible implementations, the first condition includes a control instruction sent by the server, the control instruction being associated with the position of the second electronic device. For example, when the server detects that the position of the second electronic device has reached the pickup location, the server sends the control instruction to the first electronic device; as another example, if the server detects that the second electronic device is within a proximity range of the first electronic device (e.g., within ten meters), the server sends the control instruction to the first electronic device. In this way, receipt of the server's control instruction means the first condition is satisfied, and the first electronic device controls the vehicle-mounted electronic device to perform the first operation, so that the user of the second electronic device can recognize the position of the vehicle-mounted electronic device, and thereby the position of the first electronic device.
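The four example first conditions above (pickup location reached, pickup time reached, expected remaining time within a preset value, server-sent control instruction) can be combined into a single check. The thresholds, field names, and the use of a haversine distance are illustrative assumptions; the patent does not prescribe any of them.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def first_condition_met(device, order, now_s, server_instruction=False,
                        arrive_radius_m=30.0, eta_threshold_s=60.0):
    """True when any of the example first conditions holds."""
    dist = haversine_m(device["lat"], device["lon"],
                       order["pickup_lat"], order["pickup_lon"])
    if dist <= arrive_radius_m:                  # pickup location reached
        return True
    if now_s >= order["pickup_time_s"]:          # pickup time reached
        return True
    if device["speed_mps"] > 0 and dist / device["speed_mps"] <= eta_threshold_s:
        return True                              # expected remaining time within preset
    return server_instruction                    # control instruction from the server

order = {"pickup_lat": 39.9087, "pickup_lon": 116.3975, "pickup_time_s": 3600}
far_device = {"lat": 39.92, "lon": 116.42, "speed_mps": 0.0}
print(first_condition_met(far_device, order, now_s=0))     # False: ~2.3 km away, before pickup time
print(first_condition_met(far_device, order, now_s=3600))  # True: pickup time reached
```

Note that the conditions are disjunctive here; an implementation could equally require only one configured condition, as each implementation paragraph above describes them independently.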
In some possible implementations, the first information is sent to the first electronic device by the server when the server detects that the first electronic device satisfies the first condition, where the first condition may be any of the first conditions in the implementations described above.
In some possible implementations, the first electronic device acquiring the first information from the server includes: the first electronic device receives a user operation directed at the ride order; and, in response to the user operation, the first electronic device acquires the first information from the server. That is, acquisition of the first information may be actively triggered by the user.
In some possible implementations, the first electronic device and the vehicle-mounted electronic device are integrated in the same electronic device. That is, the first electronic device may itself be a vehicle-mounted electronic device, such as an in-vehicle head unit or a smart cockpit.
In a second aspect, an embodiment of the present application provides a device control method, the method comprising: the second electronic device establishes a ride-order association with the first electronic device; the second electronic device acquires second information from the server, where the second information is used to trigger the second electronic device to perform a second operation, and the second operation comprises at least one of a display operation, a lighting operation, or an audio playback operation; and the second electronic device performs the second operation based on the second information.
In this embodiment, the first electronic device is a driver-side device and the second electronic device is a passenger-side device. The two devices establish a ride-order association, where the ride order includes information relevant to both devices, such as the passenger's position, the driver's position, the pickup time, and the pickup location. After the association is established, the second electronic device obtains the second information and, based on it, performs the second operation, which may be, for example, displaying a first color or a first pattern, or outputting preset audio. In this way, the user of the first electronic device can confirm the position of the second electronic device and quickly identify its user.
In some scenarios, the passenger-side device (second electronic device) places an order in the ride-hailing application and the driver-side device (first electronic device) accepts it, establishing a ride-order association between the two devices; after the passenger boards, the driver and passenger confirm each other, and either one can indicate that the pickup is complete by tapping a designated button in the application. Before the passenger boards, the passenger-side device may, for example, flash its own flashlight red (the second operation) so that the driver can recognize the user of the passenger-side device, enabling quick mutual identification of the passenger-side and driver-side devices.
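The passenger-side counterpart is symmetric: on receiving the "second information", the second electronic device performs the second operation itself. A minimal sketch, with all class and field names assumed for illustration:

```python
class PassengerDevice:
    """Second electronic device: performs the second operation locally
    (e.g. turning its flashlight red or showing a pattern on screen)."""
    def __init__(self):
        self.order = None
        self.flashlight = "off"
        self.screen = None

    def associate(self, order_id):
        self.order = order_id                      # ride-order association

    def on_second_info(self, info):
        # Perform the second operation based on the second information.
        if self.order is None:
            return
        if info.get("operation") == "light":
            self.flashlight = info.get("color", "white")
        elif info.get("operation") == "display":
            self.screen = info.get("pattern")

p = PassengerDevice()
p.associate("order-001")
p.on_second_info({"operation": "light", "color": "red"})
print(p.flashlight)  # red
```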
In some possible implementations, the second information is sent to the second electronic device by the server when the server detects that the first electronic device satisfies a first condition.
In some possible implementations, the first condition includes the position of the first electronic device reaching the pickup location associated with the ride order. When the server detects that the position of the first electronic device has reached the pickup location, the server sends the second information to the second electronic device, instructing it to perform the second operation based on the second information, so that the user of the first electronic device can identify the position of the second electronic device based on the second operation, and thereby the position of its user.
In some possible implementations, the first condition includes the current time reaching a pickup time associated with the ride order. The pickup time may be a time preset by the second electronic device, or a time at which the first electronic device estimates, based on the pickup location, that it will reach the second electronic device's pickup location. When the server detects that the time of the first electronic device has reached the pickup time, the server sends the second information to the second electronic device, instructing it to perform the second operation based on the second information, so that the user of the first electronic device can recognize the position of the second electronic device, and thereby the position of its user. The time of the first electronic device may be the time displayed on the device, or standard local time obtained through real-time networking.
In some possible implementations, the first condition includes the expected remaining time for the first electronic device to reach the pickup location associated with the ride order falling within a preset value. The server may determine in real time the remaining time for the first electronic device to reach the pickup location based on the distance between its position and the pickup location, and when the server detects that the first electronic device is about to arrive (for example, when the expected remaining time is one minute), it may send the second information to the second electronic device, instructing it to perform the second operation based on the second information, so that the user of the first electronic device can recognize the position of the second electronic device, and thereby the position of its user.
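The server-side monitoring described in the implementations above can be sketched as follows: the server watches the driver's reported position and, once the expected remaining time falls within the preset value, pushes the "second information" to the passenger-side device. Class names, the push payload, and the one-minute threshold are illustrative assumptions.

```python
class RideServer:
    """Server sketch: evaluates the ETA-based first condition and pushes the
    second information to the passenger-side device when it is met."""
    def __init__(self, eta_threshold_s=60.0):
        self.eta_threshold_s = eta_threshold_s
        self.pushed = []          # (passenger_addr, second_info) records

    def on_driver_position(self, order_id, distance_m, speed_mps, passenger_addr):
        """Called whenever the driver-side device reports its position."""
        if speed_mps <= 0:
            return                # no ETA can be estimated while stationary
        eta_s = distance_m / speed_mps
        if eta_s <= self.eta_threshold_s:
            # Second information: triggers the passenger device's second operation.
            self.pushed.append(
                (passenger_addr,
                 {"order": order_id, "operation": "light", "color": "red"}))

server = RideServer(eta_threshold_s=60.0)
server.on_driver_position("order-001", distance_m=2000, speed_mps=10,
                          passenger_addr="p1")
print(len(server.pushed))  # 0  (ETA 200 s, above the preset value)
server.on_driver_position("order-001", distance_m=500, speed_mps=10,
                          passenger_addr="p1")
print(len(server.pushed))  # 1  (ETA 50 s, within the preset value)
```

A real server would deduplicate pushes per order; this sketch only shows the condition-to-push step.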
In some possible implementations, the second electronic device performing the second operation based on the second information includes: upon detecting that the second electronic device satisfies a second condition, the second electronic device performs the second operation based on the second information. That is, after acquiring the second information from the server, the second electronic device performs the second operation only when it detects that the second condition is satisfied.
In some possible implementations, the second condition includes the position of the second electronic device reaching the pickup location associated with the ride order. When the second electronic device detects that its position has reached the pickup location, it performs the second operation, so that the user of the first electronic device can recognize the position of the second electronic device, and thereby the position of its user.
In some possible implementations, the second electronic device acquiring the second information from the server includes: the second electronic device receives a user operation directed at the ride order; and, in response to the user operation, the second electronic device acquires the second information from the server. That is, acquisition of the second information may be actively triggered by the user.
In some possible implementations, the second electronic device performing the second operation based on the second information includes: the second electronic device instructs a third electronic device to perform the second operation based on the second information. The third electronic device may be a wearable device connected to the second electronic device; the second electronic device may be connected to one or more wearable devices, such as a watch, a band, or earphones. When the second electronic device is connected to a wearable device (the third electronic device) and receives the second information, it may instruct the third electronic device to perform the second operation based on the second information, so that the user of the first electronic device can recognize the position of the third electronic device, and thereby the position of the user of the second electronic device.
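The delegation to a wearable can be sketched like this. The device classes, the fallback behavior when no wearable is connected, and the payload shape are all assumptions for illustration; the patent only requires that the second electronic device can instruct the third electronic device to perform the second operation.

```python
class Wearable:
    """Third electronic device: e.g. a watch whose face turns red."""
    def __init__(self, name):
        self.name = name
        self.state = None

    def perform(self, info):
        self.state = info          # carry out the second operation

class PassengerPhone:
    """Second electronic device with zero or more connected wearables."""
    def __init__(self):
        self.wearables = []
        self.local_state = None

    def connect(self, wearable):
        self.wearables.append(wearable)

    def on_second_info(self, info):
        if self.wearables:
            # Instruct the connected third electronic device(s).
            for w in self.wearables:
                w.perform(info)
        else:
            self.local_state = info   # assumed fallback: perform it locally

watch = Wearable("watch")
phone = PassengerPhone()
phone.connect(watch)
phone.on_second_info({"operation": "light", "color": "red"})
print(watch.state["color"])  # red
```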
In a third aspect, an embodiment of the present application provides an electronic device, corresponding to the first electronic device described above, comprising: one or more processors and one or more memories, the one or more memories being coupled to the one or more processors and storing computer program code comprising computer instructions which, when executed on the processors, cause the electronic device to perform: establishing a ride-order association with the second electronic device; acquiring first information from a server, where the first information is used to trigger the vehicle-mounted electronic device to perform a first operation, and the first operation comprises at least one of a display operation, a lighting operation, or an audio playback operation; and controlling the vehicle-mounted electronic device to perform the first operation based on the first information.
In this embodiment, the first electronic device is a driver-side device and the second electronic device is a passenger-side device; the driver-side device is connected to the vehicle-mounted electronic device and can invoke its hardware. The first electronic device and the second electronic device establish a ride-order association, where the ride order includes information relevant to both devices, such as the passenger's position, the driver's position, the pickup time, and the pickup location. After the association is established, the first electronic device obtains the first information and, based on it, invokes the vehicle-mounted electronic device to perform the first operation, which may be, for example, displaying a first color or a first pattern, or outputting preset audio. In this way, the user of the second electronic device can confirm the position of the first electronic device and quickly identify its user.
In some scenarios, the passenger-side device (second electronic device) places an order in the ride-hailing application and the driver-side device (first electronic device) accepts it, establishing a ride-order association between the two devices; after the passenger boards, the driver and passenger confirm each other, and either one can indicate that the pickup is complete by tapping a designated button in the application. Before the passenger boards, the driver-side device may instruct the headlights (vehicle-mounted electronic device) to display red (the first operation) so that the passenger can recognize the driver-side device's vehicle, enabling quick mutual identification of the passenger-side and driver-side devices.
In some possible implementations, controlling the vehicle-mounted electronic device to perform the first operation based on the first information includes: controlling the vehicle-mounted electronic device to perform the first operation based on the first information upon detecting that the first electronic device satisfies a first condition. That is, after acquiring the first information from the server, the first electronic device controls the vehicle-mounted electronic device to perform the first operation only when it detects that the first condition is satisfied.
In some possible implementations, the first condition includes the position of the first electronic device reaching the pickup location associated with the ride order. When the first electronic device detects that its position has reached the pickup location, it controls the vehicle-mounted electronic device to perform the first operation, so that the user of the second electronic device can recognize the position of the vehicle-mounted electronic device, and thereby the position of the first electronic device.
In some possible implementations, the first condition includes the current time reaching a pickup time associated with the ride order. The pickup time may be a time preset by the second electronic device, or a time at which the first electronic device estimates, based on the pickup location, that it will reach the second electronic device's pickup location. When the first electronic device detects that its time has reached the pickup time, it controls the vehicle-mounted electronic device to perform the first operation, so that the user of the second electronic device can recognize the position of the vehicle-mounted electronic device, and thereby the position of the first electronic device. The time of the first electronic device may be the time displayed on the device, or standard local time obtained through real-time networking.
In some possible implementations, the first condition includes the expected remaining time for the first electronic device to reach the pickup location associated with the ride order falling within a preset value. The first electronic device may determine in real time the remaining time to reach the pickup location based on the distance between its position and the pickup location, and when it detects that it is about to arrive (for example, when the expected remaining time is one minute), it may control the vehicle-mounted electronic device to perform the first operation, so that the user of the second electronic device can recognize the position of the vehicle-mounted electronic device, and thereby the position of the first electronic device.
In some possible implementations, the first condition includes a control instruction sent by the server, the control instruction being associated with the position of the second electronic device. For example, when the server detects that the position of the second electronic device has reached the pickup location, the server sends the control instruction to the first electronic device; as another example, if the server detects that the second electronic device is within a proximity range of the first electronic device (e.g., within ten meters), the server sends the control instruction to the first electronic device. In this way, receipt of the server's control instruction means the first condition is satisfied, and the first electronic device controls the vehicle-mounted electronic device to perform the first operation, so that the user of the second electronic device can recognize the position of the vehicle-mounted electronic device, and thereby the position of the first electronic device.
In some possible implementations, the first information is sent to the first electronic device by the server when the server detects that the first electronic device satisfies the first condition, where the first condition may be any of the first conditions in the implementations described above.
In some possible implementations, obtaining the first information from the server includes: receiving a user operation for the ride order; and acquiring the first information from the server in response to the user operation. That is, the first information may be obtained through an active trigger by the user.
In some possible implementations, the first electronic device and the in-vehicle electronic device are integrated in the same electronic device. That is, the first electronic device may itself be a vehicle-mounted electronic device, such as a car machine (in-vehicle head unit), an intelligent cabin, or the like.
In a fourth aspect, an embodiment of the present application provides an electronic device, where the electronic device includes the second electronic device and includes: one or more processors and one or more memories, the one or more memories being coupled to the one or more processors and configured to store computer program code, the computer program code including computer instructions. The computer instructions, when executed on the one or more processors, cause the electronic device to perform: establishing a ride-order association relationship with the first electronic device; acquiring second information from the server, where the second information is used to trigger the second electronic device to perform a second operation, the second operation including at least one of a display operation, a lighting operation, or an audio playing operation; and performing the second operation based on the second information.
In this embodiment of the application, the first electronic device is a driver-side device and the second electronic device is a passenger-side device. The two devices are associated through a ride order, which includes information related to the driver-side device and the passenger-side device, such as the passenger position, driver position, boarding time, and boarding position. After the ride-order association relationship is established, the second electronic device obtains the second information and, triggered by it, performs the second operation, which may be, for example, displaying a first color or a first pattern, or outputting preset audio. In this way, the user of the first electronic device can confirm the position of the second electronic device and quickly identify its user.
In some scenarios, the passenger-side device (second electronic device) places an order in the ride-hailing application and the driver-side device (first electronic device) accepts the order, establishing a ride-order association relationship between the two devices. After the passenger boards, the driver and the passenger confirm each other, and either of them can indicate that the driver has picked up the passenger by manually triggering a designated button in the application. Before the passenger boards, the passenger-side device may, for example, light its flash in red (the second operation), so that the driver can recognize the user of the passenger-side device, thereby achieving quick mutual recognition between the passenger-side device and the driver-side device.
In some possible implementations, the second information is sent to the second electronic device by the server when the server detects that the first electronic device satisfies the first condition.
In some possible implementations, the first condition includes the location of the first electronic device reaching a pickup location associated with the ride order. When the server detects that the location of the first electronic device has reached the boarding location, the server sends the second information to the second electronic device to instruct it to perform the second operation based on the second information, so that the user of the first electronic device can recognize the position of the second electronic device and, in turn, the position of its user.
In some possible implementations, the first condition includes a boarding time associated with the ride order being reached. The boarding time associated with the ride order may be a time preset by the second electronic device, or the time at which the first electronic device is estimated to reach the boarding position of the second electronic device. When the server detects that the time of the first electronic device has reached the boarding time, the server sends the second information to the second electronic device to instruct it to perform the second operation based on the second information, so that the user of the first electronic device can recognize the position of the second electronic device and, in turn, the position of its user. The time of the first electronic device may be the time displayed on the first electronic device, or the standard local time acquired by the first electronic device through real-time networking.
In some possible implementations, the first condition includes the expected remaining time for the first electronic device to reach a pickup location associated with the ride order being within a preset value. The server may determine, in real time, the remaining time for the first electronic device to reach the boarding location based on the distance between the location of the first electronic device and the boarding location. When the server detects that the first electronic device is about to reach the boarding location (for example, when the expected remaining time is one minute), the server may send the second information to the second electronic device to instruct it to perform the second operation based on the second information, so that the user of the first electronic device can recognize the location of the second electronic device and, in turn, the location of its user.
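The three variants of the first condition in the implementations above (pickup location reached, boarding time reached, or expected remaining time within the preset value) can be combined into a single server-side check. The sketch below is illustrative only; the `RideOrder` fields and their sources are assumptions, not part of the embodiment:

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class RideOrder:
    boarding_time: datetime   # time preset or estimated for boarding
    pickup_reached: bool      # server-side geofence result for the driver device
    remaining_minutes: float  # server's live estimate of time to arrival


def first_condition_met(order: RideOrder, now: datetime,
                        preset_minutes: float = 1.0) -> bool:
    """True when any variant of the first condition holds, i.e. when the
    server would send the second information to the second electronic device."""
    return (order.pickup_reached
            or now >= order.boarding_time
            or order.remaining_minutes <= preset_minutes)
```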
In some possible implementations, performing the second operation based on the second information includes: detecting that the second electronic device satisfies a second condition, and performing the second operation based on the second information. That is, after acquiring the second information from the server, the second electronic device performs the second operation only when it detects that the second condition is satisfied.
In some possible implementations, the second condition includes the location of the second electronic device reaching a pickup location associated with the ride order. When the second electronic device detects that its location has reached the boarding location, it performs the second operation, so that the user of the first electronic device can recognize the position of the second electronic device and, in turn, the position of its user.
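A minimal sketch of this gating on the passenger side, with `perform_second_operation` standing in for the display, lighting, or audio action (all names here are illustrative assumptions):

```python
def maybe_perform_second_operation(has_second_info: bool,
                                   at_pickup_location: bool,
                                   perform_second_operation) -> bool:
    """Run the operation callback only when the second information has been
    received AND the second condition (pickup location reached) holds.
    Returns True if the second operation was performed."""
    if has_second_info and at_pickup_location:
        perform_second_operation()
        return True
    return False
```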
In some possible implementations, obtaining the second information from the server includes: receiving a user operation for the ride order; and acquiring the second information from the server in response to the user operation. That is, the second information may be obtained through an active trigger by the user.
In some possible implementations, performing the second operation based on the second information includes: instructing a third electronic device to perform the second operation based on the second information. The third electronic device may be a wearable device connected to the second electronic device; the second electronic device may be connected to one or more wearable devices, such as a watch, a bracelet, or earphones. When the second electronic device is connected to a wearable device (the third electronic device) and receives the second information, it may instruct the third electronic device to perform the second operation based on the second information, so that the user of the first electronic device can recognize the location of the third electronic device and, in turn, the location of the user of the second electronic device.
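As one possible sketch of this delegation (the patent names no concrete interface, so the `Wearable` class and its `perform` method are invented for illustration), the second electronic device could fan the second operation out to every connected wearable:

```python
class Wearable:
    """Minimal stand-in for a wearable device (watch, bracelet, earphone)
    connected to the second electronic device."""

    def __init__(self, name: str):
        self.name = name
        self.performed = []  # record of operations this wearable has performed

    def perform(self, operation: str) -> None:
        self.performed.append(operation)


def delegate_second_operation(wearables, operation: str) -> int:
    """Instruct every connected wearable (third electronic device) to perform
    the second operation; returns how many devices were instructed."""
    for wearable in wearables:
        wearable.perform(operation)
    return len(wearables)
```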
In a fifth aspect, an embodiment of the present application provides a computer storage medium including computer instructions that, when executed on an electronic device, cause the electronic device to perform the device control method in any possible implementation of any of the foregoing aspects.
In a sixth aspect, the present application provides a computer program product that, when run on a computer, causes the computer to perform the device control method in any possible implementation of any of the foregoing aspects.
Drawings
Fig. 1 is a schematic diagram of a system architecture according to an embodiment of the present application;
fig. 2 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
fig. 4 is a flowchart of a method of controlling a device according to an embodiment of the present application;
fig. 5 is a flowchart of a method of controlling a device according to another embodiment of the present application;
FIG. 6 is a schematic diagram of a set of application interfaces provided by an embodiment of the present application;
fig. 7 is a schematic diagram of another group of application interfaces provided in the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings. In the description of the embodiments of the present application, "/" indicates an "or" relationship; for example, A/B may indicate A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "a plurality of" means two or more.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature; in the description of the embodiments of this application, "a plurality of" means two or more unless otherwise indicated. Terms such as "middle," "left," "right," "upper," and "lower" indicate orientations or positional relationships based on those shown in the drawings; they are used merely for convenience and simplicity of description, do not indicate or imply that the referenced devices or elements must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be construed as limiting the application.
As shown in fig. 1, fig. 1 illustrates an architectural diagram of a system 10 provided by the present application. The system 10 may include: an electronic device 100 (first electronic device), an electronic device 101 (second electronic device), a server 102, and an in-vehicle electronic device 200. The electronic device 100 and the in-vehicle electronic device 200 may communicate with each other through wireless communication methods such as Bluetooth (BT), near field communication (NFC), wireless fidelity (WiFi), WiFi direct, and ZigBee.
In some embodiments, the system 10 further includes an electronic device 103 (third electronic device), and the electronic device 103 and the electronic device 101 may communicate through Bluetooth (BT), near field communication (NFC), wireless fidelity (WiFi), WiFi direct, ZigBee, or other wireless communication methods. Optionally, the electronic device 103 is a wearable device of the electronic device 101, such as an earphone, a smart watch, a smart bracelet, smart glasses, a head-mounted display (HMD), an electronic garment, an electronic bracelet, or an electronic necklace.
The electronic device 100 and the electronic device 101 may perform data communication through the server 102. The server 102 may be an application server, a cloud server, a background server, or the like, and may establish network connections with multiple electronic devices, such as the electronic device 100 and the electronic device 101, to communicate data with them. In some embodiments, the server 102 may send an instruction to the electronic device 100 instructing it to perform a corresponding operation, or may invoke and control the in-vehicle electronic device 200 through the electronic device 100. In some embodiments, the server 102 may send an instruction to the electronic device 101 instructing it to perform a corresponding operation, or may invoke and control the electronic device 103 through the electronic device 101.
The electronic device 100 and the electronic device 101 in this embodiment may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR)/virtual reality (VR) device, an intelligent cabin, a vehicle-mounted electronic device, a vehicle-mounted terminal, a watch, a bracelet, or another electronic device having a touch screen (or a display screen); the specific type of the electronic device is not particularly limited in this embodiment.
The in-vehicle electronic device 200 includes the display devices and audio devices of a vehicle, such as various lights (low beams, high beams, position lights, daytime running lights, warning lights, fog lights, brake lights, backup lights, turn signals, etc.), the sound system, the horn, an in-vehicle bracket, a rearview mirror, a display screen, an intelligent front windshield, an intelligent head-up display (HUD), and the like.
In some embodiments, the electronic device 100 and the in-vehicle electronic device 200 each integrate the HiCar SDK, which supports connecting the electronic device 100 and the in-vehicle electronic device 200 to the HiCar ecosystem. The electronic device 100 may invoke the in-vehicle electronic device 200 to perform a corresponding action through the HiCar application.
Based on the above system architecture, the following exemplarily describes an application scenario, applicable to the embodiments of the present application, in which a passenger hails a ride through a ride-hailing application. The electronic device 100 may be a driver-side device (for example, a mobile phone or a car machine), the electronic device 101 may be a passenger-side device (for example, a mobile phone or a watch), and the in-vehicle electronic device 200 may be a device that establishes a communication connection with the driver-side device, such as a headlight of the driver's vehicle. The ride-hailing application is installed on both the driver-side device and the passenger-side device, and the server 102 is the server of the ride-hailing application.
The passenger-side device starts the ride-hailing application, the passenger places an order through the application, and the application server delivers the passenger's order request to the ride-hailing application on the driver-side device. When the driver accepts the order, a ride-order association relationship between the driver-side device and the passenger-side device is established.
When the driver-side device arrives near the position of the passenger-side device, the ride-hailing application server, based on the positioning of both devices, detects this and sends indication information to the driver-side device. The indication information may indicate a color or a pattern, and the driver-side device controls a headlight on the driver's vehicle to display the first color or the first pattern based on the indication information. The passenger can then easily find the driver's vehicle through the first color or the first pattern. In some embodiments, while sending the indication information to the driver-side device, the server may also send indication information to the passenger-side device, which displays the first color or the first pattern based on it. The driver can then easily find the position of the passenger through the first color or the first pattern, achieving mutual recognition between the driver and the passenger.
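The indication information in this scenario could, purely for illustration, be a small message carrying the color or pattern, which the driver-side device translates into a command for the headlight. The message format and the `HeadlightCommand` type below are assumptions, not part of the embodiment:

```python
from dataclasses import dataclass


@dataclass
class HeadlightCommand:
    color: str    # e.g. "red" -- the first color
    pattern: str  # e.g. "flash" -- the first pattern


def build_headlight_command(indication: dict) -> HeadlightCommand:
    """Translate the server's indication information into a command the
    driver-side device can issue to the headlight; defaults are assumed."""
    return HeadlightCommand(color=indication.get("color", "white"),
                            pattern=indication.get("pattern", "steady"))
```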
In the above ride-hailing scenario, the in-vehicle electronic device 200 may be a headlight, a sound box, a speaker, an in-vehicle bracket, a rearview mirror, a display device on the roof of the vehicle, or another device capable of displaying content or outputting audio. Through these various forms of presentation on the vehicle, the passenger can quickly lock onto the matched vehicle, improving the efficiency with which passengers identify the driver's vehicle and enabling quick pickup.
In some embodiments, the ride-hailing application on the driver-side device is application software adapted to CarKit, which includes one or more defined Session interfaces and callable methods/functions. The driver-side device needs to implement the interfaces declared in CarKit.aar, through which it can access the HiCar application. Both the driver-side device and the headlights on the driver's vehicle integrate the HiCar software development kit (SDK), and the driver-side device can invoke the headlights on the driver's vehicle through the HiCar application, enabling the ride-hailing application to control them.
The electronic apparatus 100 according to the embodiment of the present application will be described first.
Referring to fig. 2, fig. 2 shows a schematic structural diagram of an exemplary electronic device 100 provided in an embodiment of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of answering a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 and the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to implement the function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 194, the camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, to transmit data between the electronic device 100 and a peripheral device, or to connect earphones and play audio through them. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including UWB, wireless Local Area Networks (WLANs) (e.g., wireless fidelity (WiFi) network), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
In some embodiments of the present application, the interface content currently output by the system is displayed in the display screen 194. For example, the interface content is an interface provided by an instant messaging application.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image. The ISP can also optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; it can process digital image signals as well as other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, it processes input information rapidly and can also continuously learn by itself. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it is possible to receive voice by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking close to it. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The earphone interface 170D is used to connect a wired earphone. The headset interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. In some alternative embodiments of the present application, the pressure sensor 180A may be configured to capture a pressure value generated when the user's finger part contacts the display screen and transmit the pressure value to the processor, so that the processor can identify the finger part through which the user entered the operation.
The pressure sensor 180A can be of a variety of types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to different touch positions may correspond to different operation instructions. In some alternative embodiments, the pressure sensor 180A may also calculate the number of touch points from the detected signals and transmit the calculated values to the processor, so that the processor can recognize whether the user's operation is a single-finger or multi-finger input.
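The capacitance-to-pressure relationship described above is ordinary parallel-plate physics. The following sketch (not code from the patent) illustrates why a firmer press, which narrows the gap between the plates, registers as a larger capacitance:

```python
def plate_capacitance(eps_r, area_m2, gap_m):
    """Parallel-plate capacitance C = eps0 * eps_r * A / d (farads).
    As pressure narrows the gap d, C increases; the device maps the
    capacitance change to touch strength."""
    EPS0 = 8.854e-12  # vacuum permittivity, F/m
    return EPS0 * eps_r * area_m2 / gap_m

# A harder press (smaller gap) yields a larger capacitance:
light_press = plate_capacitance(3.0, 1e-4, 2e-5)
hard_press = plate_capacitance(3.0, 1e-4, 1e-5)
```

The material and geometry values are arbitrary illustrations; a real sensor would calibrate the measured capacitance against known forces.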
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (the X, Y, and Z axes of the electronic device) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
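The pressure-to-altitude conversion mentioned above can be illustrated with the international barometric formula; this is a standard troposphere approximation, not a formula given in the patent:

```python
def altitude_from_pressure(pressure_hpa, sea_level_hpa=1013.25):
    """Estimate altitude in meters from barometric pressure using the
    international barometric formula (troposphere approximation).
    sea_level_hpa is the reference sea-level pressure."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# Roughly 1000 m of elevation corresponds to ~115 hPa of pressure drop:
approx_1km = altitude_from_pressure(898.7)
```

In practice the device would also correct for the current local sea-level pressure, since weather shifts the baseline by tens of hectopascals.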
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Then, according to the detected open/closed state of the holster or the flip cover, features such as automatic unlocking upon flipping open can be set.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The sensor can also be used to identify the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and the like. In some optional embodiments of the present application, the acceleration sensor 180E may be configured to capture an acceleration value generated when a finger part of the user touches the display screen (or when the user's finger strikes the rear frame of the rear case of the electronic device 100) and transmit the acceleration value to the processor, so that the processor can identify the finger part through which the user entered the operation.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode. The electronic device 100 detects infrared light reflected from a nearby object using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near it. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the display screen and save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L can also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint-based unlocking, application-lock access, photographing, incoming-call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown caused by low temperature. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
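The tiered strategy above can be sketched as follows. The concrete threshold values are assumptions for illustration only, since the patent speaks only of "a threshold", "another threshold", and "a further threshold":

```python
def thermal_policy(temp_c, throttle_above=45.0, heat_below=0.0, boost_below=-10.0):
    """Sketch of a tiered temperature-processing strategy.
    All three threshold values are illustrative assumptions."""
    actions = []
    if temp_c > throttle_above:
        actions.append("reduce_processor_performance")  # thermal protection
    if temp_c < heat_below:
        actions.append("heat_battery")                  # avoid cold shutdown
    if temp_c < boost_below:
        actions.append("boost_battery_output_voltage")  # avoid cold shutdown
    return actions
```

Note that the two low-temperature tiers stack: a sufficiently cold reading triggers both battery heating and voltage boosting.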
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194; together they form a touchscreen. The touch sensor 180K is used to detect a touch operation applied on or near it, that is, an operation of a user's hand, elbow, stylus, or the like contacting the display screen 194. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a bone mass vibrated by the human voice. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive a key input and generate a key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195.
The software system of the electronic device 100 may employ a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a hierarchical architecture as an example to exemplarily illustrate the software structure of the electronic device 100. The Android system is only an example of the system of the electronic device 100 in the embodiment of the present application; the present application may also be applied to other types of operating systems, such as iOS, Windows, and HarmonyOS, which is not limited in this application. The following description uses the Android system only as an example of the operating system of the electronic device 100.
Fig. 3 shows a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into three layers, which are an application layer, an application framework layer, and an operating system layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 3, the application packages may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, game, shopping, travel (e.g., a taxi-taking application), instant messaging (e.g., short message), hicar, etc. applications. The hicar application is used to control and manage a device (e.g., the in-vehicle electronic device 200 in fig. 1) connected to the electronic device 100. In addition, the application package may further include: a main screen (i.e., desktop), a minus-one screen, a control center, a notification center, and other system applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 3, the application framework layer may include an input manager, a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, a display manager, an activity manager, and the like. For illustrative purposes, in FIG. 3, the application framework layers are illustrated as including a window manager, an activity manager, a content provider, a view system, and a hicar module.
The activity manager is used for managing the active services running in the system, and comprises processes (processes), applications, services (services), task information and the like. Generally speaking, each time an application is run, the activity manager starts a task stack, and one task stack includes one or more activities.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. And may also be used to control the appearance, location, and manner in which the user operates the window programs. In the present application, the window manager of the electronic device 100 obtains the screen size of the display screen of the in-vehicle electronic device 200, and determines the size and position of the window displayed on the display screen of the in-vehicle electronic device 200.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, and the like.
The view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and the like. The view system may be used to build applications. A display interface may be composed of one or more views. For example, a display interface including a short-message notification icon may include a view for displaying text and a view for displaying pictures. In the present application, the content provider acquires display content provided by an application program, and the display content is drawn on the display screen of the electronic device 100 and the display screen of the in-vehicle electronic device 200 by the view system.
Optionally, the hicar module provides a corresponding interface for the application program of the application layer; through this interface, the application program can call the in-vehicle electronic device 200 and provide data to it. In the embodiment of the application, taking the taxi-taking application as an example, the hicar module includes a Carkit service module of the taxi-taking application; the Carkit service module provides a corresponding interface for the taxi-taking application, and the interaction between the taxi-taking application and the vehicle-mounted electronic device 200 is realized through this interface. For example, the electronic device 100 acquires instruction information sent by the taxi-taking application through an interface provided by the Carkit service module, and the electronic device 100 instructs the in-vehicle electronic device 200 to execute a corresponding action based on the instruction information.
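The register-and-dispatch pattern this paragraph describes can be sketched as below. The patent does not define the actual Carkit/hicar API, so the class, method, and instruction names here are all hypothetical:

```python
class CarkitSession:
    """Hypothetical sketch of a Session-style service interface through
    which a taxi-taking application reaches the in-vehicle device.
    Names are invented for illustration; the patent defines no API."""

    def __init__(self):
        self._handlers = {}

    def register(self, instruction, handler):
        # The service module exposes an interface per instruction type.
        self._handlers[instruction] = handler

    def dispatch(self, instruction, payload):
        # Instruction information from the application is forwarded to
        # the handler acting on the in-vehicle electronic device.
        handler = self._handlers.get(instruction)
        return handler(payload) if handler else None

session = CarkitSession()
session.register("set_light", lambda p: "lights:" + p["color"])
result = session.dispatch("set_light", {"color": "red"})
```

Unregistered instructions simply return `None`, mirroring the idea that the hicar module only forwards calls for interfaces an application has registered.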
The input manager is used for receiving instructions or requests reported by the operating system layer.
The display manager is used to transmit display content to the operating system layer.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. For example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
The operating system layer provides management functions for hardware for managing and allocating the application programs to use various resources (CPU, memory, input-output devices, display screen, speaker, microphone, etc.) of the electronic device 100. For example, when an application needs to run, the operating system calls the application into a memory, and allocates a memory space for the application to run and store data. If an application needs to display data content, the operating system calls the display device (for example, the in-vehicle electronic device 200) to provide the application with a control service for the display device. In the embodiment of the present application, the electronic device 100 establishes a connection with the in-vehicle electronic device 200, and an application program in the electronic device 100 may use a hardware resource of the in-vehicle electronic device 200.
A specific implementation principle of the device control method provided in the embodiment of the present application is described below with reference to the software structure diagram shown in fig. 3. Taking the electronic device 100 as the driver-side device, the electronic device 101 as the passenger-side device, and the server 102 as the taxi-taking application server as an example, fig. 4 shows a flowchart of a device control method according to an embodiment of the present application.
In fig. 4, the passenger side device and the driver side device are both installed with a target application, and are in data communication with a server of the target application through a network; the passenger end equipment and the driver end equipment can comprise mobile phones, watches, intelligent glasses, tablets, computers, car machines and other equipment. The target application may be a taxi-taking type application, the passenger-side device may issue a taxi taking order based on the target application, and the driver-side device may receive the passenger order based on the target application.
Step S101: the passenger terminal equipment runs the target application; step S102: the driver side equipment runs the target application;
In some embodiments, the target application run by the passenger-side device and the driver-side device is a different version of the target application. The target application installed on the driver-side device is application software adapted to the CarKit, and the CarKit includes one or more defined Session interfaces and callable methods/functions. The driver-side device needs to register the Session interfaces declared in the CarKit. The target application can access the HiCar application through these Session interfaces, and the functions provided in the HiCar application can be called through the callable methods/functions defined in the CarKit. Both the electronic device 100 and the vehicle-mounted electronic device 200 are integrated with a HiCar SDK, and the electronic device 100 can call the vehicle-mounted electronic device 200 through the HiCar application, so that the target application controls the vehicle-mounted electronic device 200.
HiCar can be understood as a communication mode between the driver-side device and the vehicle-mounted terminal, providing an interface for data communication between the two. The embodiment of the application does not limit the connection mode between the driver-side device and the vehicle-mounted electronic device. Optionally, the driver-side device and the vehicle-mounted terminal may be connected in a wired or wireless manner, where the wireless connection manners include Bluetooth (BT), near field communication (NFC), wireless fidelity (Wi-Fi), Wi-Fi direct, ZigBee, and the like.
In some embodiments, the target application run by the passenger-side device and the driver-side device is an application program including a driver version and a passenger version; before using the target application, a user can select whether to be the driver or the passenger, and the target application can provide different application interfaces for the driver version and the passenger version.
Step S103: and the passenger end equipment and the driver end equipment establish a riding order incidence relation based on the target application.
In some embodiments, the target application is a taxi-taking application; the passenger-side device issues a riding order on the target application, and the driver-side device receives the riding order on the target application, so that a riding order association relationship between the passenger-side device and the driver-side device is established. When the passenger gets on the vehicle, the driver and the passenger confirm each other, and the passenger or the driver can indicate that the driver has picked up the passenger by manually triggering a designated button in the target application.
After the passenger-side device and the driver-side device establish a riding order association relationship in the target application, the server 102 of the target application may provide driver information, driver vehicle information, route information, time information, and the like to the passenger-side device, and the server 102 of the target application may provide passenger information, route information, time information, and the like to the driver-side device.
In some embodiments, the server 102 can also provide the passenger-side device with related information to display when the driver-side device moves into proximity with the passenger-side device. For example, when the driver-side device moves to the vicinity of the passenger-side device, the driver's vehicle, or a headlight of the driver's vehicle, may be displayed in red, so as to prompt the passenger and make the driver's vehicle easier to identify, thereby improving identification efficiency.
The server 102 may also provide the driver-side device with related information to display when the driver-side device moves into proximity with the passenger-side device. For example, when the driver-side device moves to the vicinity of the passenger-side device, the passenger-side device may be displayed in red, so as to prompt the driver and make the passenger-side device easier to identify, thereby improving identification efficiency.
Step S104: the server 102 detects that the electronic device 100 satisfies the first condition, and transmits first information to the electronic device 100.
The first condition may be that the position of the driver-side device has reached the pick-up location associated with the riding order; the first condition may be that the distance between the position of the driver-side device and the position of the passenger-side device is within a preset range; the first condition may also be that the predicted remaining time for the driver-side device to arrive at the position of the passenger-side device is within a preset value. The position of the passenger-side device may be the real-time geographic position of the passenger-side device, or a geographic position preset by the user. The server 102 may detect the geographic positions of the driver-side device and the passenger-side device in real time; alternatively, the driver-side device and the passenger-side device detect their own geographic positions in real time and report the detected positions to the server 102.
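The distance-based variant of the first condition can be sketched with a great-circle distance check; the 200 m preset range is an assumed value, not one given in the patent:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius, m
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def first_condition_met(driver_pos, passenger_pos, preset_range_m=200.0):
    """One variant of the first condition: the driver-side device's position
    is within a preset range of the passenger-side device's position."""
    return haversine_m(*driver_pos, *passenger_pos) <= preset_range_m
```

The same check works whether the passenger position is the device's real-time GPS fix or a user-preset boarding point; the server (or either device) would re-evaluate it on each position report.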
In some embodiments, the first condition may also be detection of a user operation on the target application. Through such an operation, the user can indicate that the distance between the current location of the driver-side device and the location of the passenger-side device is within a preset range, indicate that the time for the driver-side device to reach the location of the passenger-side device is within a preset time, or actively trigger the determination that the electronic device 100 meets the first condition.
In some embodiments, the first condition may also be that the driver-side device receives a control command sent by the server, where the control command is associated with the location of the passenger-side device, for example, when the location of the passenger-side device reaches the boarding location associated with the riding order, or the distance between the location of the passenger-side device and the boarding location is within a preset range.
In some embodiments, the first condition may also be that the time of the driver-side device reaches an arrival time associated with the ride order, which is a preset arrival time of the passenger.
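The distance and remaining-time variants of the first condition above can be sketched in a few lines. This is a minimal Python sketch, not part of the patent; the threshold values, function names, and coordinate handling are all illustrative assumptions (the patent only says "preset range" and "preset value").

```python
import math

# Hypothetical thresholds -- the patent leaves these as "preset" values.
PICKUP_RADIUS_M = 200         # driver counts as "nearby" within this distance
MAX_REMAINING_TIME_S = 120    # or when the predicted remaining time drops below this

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def first_condition_met(driver_pos, passenger_pos, eta_s=None):
    """True when the driver-side device is within the preset range of the
    passenger-side device, or the predicted remaining travel time is
    within the preset value."""
    if haversine_m(*driver_pos, *passenger_pos) <= PICKUP_RADIUS_M:
        return True
    return eta_s is not None and eta_s <= MAX_REMAINING_TIME_S
```

The reported positions could come either from the server's own real-time detection or from the devices reporting their locations, as described above.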
The server 102 detects that the electronic device 100 satisfies the first condition, and the server 102 transmits first information, which may include color information, pattern information, frequency information, and the like, to the electronic device 100.
In some embodiments, the first information may instruct the electronic device 100 to display a corresponding color, pattern, or the like.
In some embodiments, the first information may indicate that the electronic device 100 should trigger execution of a preset operation, where the preset operation is preconfigured in the electronic device 100. When the server 102 detects that the electronic device 100 meets the first condition, it sends the first information to the electronic device 100, instructing the electronic device 100 to trigger execution of the preconfigured operation.
In some embodiments, step S104 is optional: after the passenger-side device and the driver-side device establish the riding order association relationship in the target application, when the electronic device 100 receives an operation triggered by the user, the electronic device 100 skips the detection in step S104 and directly executes the following step S105. That is, the user may actively trigger the target application of the electronic device 100 to call, through HiCar, the method that implements the first operation.
Step S105: the electronic apparatus 100 instructs the in-vehicle electronic apparatus 200 to perform the first operation based on the first information.
The first information may include color information, pattern information, frequency information, etc., and the first operation may be to display a corresponding color, pattern, or display a corresponding color, pattern with a certain frequency, etc.
In some embodiments, the first information may be a trigger instruction, and the first operation is a preset operation in the target application of the electronic device 100. A preset mapping relationship exists between the first operation and the first information; when the electronic device 100 receives the first information, it determines, based on the first information, the first operation corresponding to the first information. The first operation may be factory set or preset by the user.
The electronic apparatus 100 determines a first operation based on the first information, and instructs the in-vehicle electronic apparatus 200 to perform the first operation.
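The mapping from first information to first operation could look like the following minimal Python sketch. The trigger names and operation payloads are hypothetical, since the patent only states that the mapping is factory set or preset by the user.

```python
# Hypothetical mapping between a bare trigger instruction carried in the
# first information and a preconfigured first operation.
PRESET_OPERATIONS = {
    "trigger_1": {"type": "display", "color": "red"},
    "trigger_2": {"type": "display", "pattern": "heart"},
    "trigger_3": {"type": "audio", "clip": "pickup_chime"},
}

def resolve_first_operation(first_info):
    """Return the first operation for a piece of first information.

    Explicit color/pattern/audio payloads pass through unchanged; a bare
    trigger instruction is looked up in the preset mapping."""
    if isinstance(first_info, dict):        # explicit color/pattern/audio info
        return first_info
    return PRESET_OPERATIONS[first_info]    # trigger instruction -> preset op
```

Either form then drives the instruction sent to the in-vehicle electronic device 200.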
Illustratively, the first information includes audio information, and the electronic device 100 receives the first information and instructs the in-vehicle electronic device 200 to play audio corresponding to the audio information.
Illustratively, the first information includes color information, and the electronic device 100 receives the first information and instructs the in-vehicle electronic device 200 to display a color corresponding to the color information.
Illustratively, the first information is a trigger instruction; the electronic device 100 receives the first information, determines that the first operation corresponding to it is displaying red, and instructs the in-vehicle electronic device 200 to display red on its display device.
In some embodiments, the target application installed on the driver-side device is application software adapted to the CarKit, and the electronic device 100 may invoke the in-vehicle electronic device 200 through the HiCar application. Based on the first information, the target application of the electronic device 100 calls a corresponding method in HiCar, and HiCar calls an interface of the corresponding in-vehicle electronic device 200 to instruct the in-vehicle electronic device 200 to perform the first operation.
Illustratively, after the electronic device 100 receives the first information, the target application calls, through HiCar, a corresponding method to implement the first operation corresponding to the first information. Since the target application is adapted to the CarKit, the CarKit includes one or more defined interfaces and callable methods/functions.
Illustratively, the first information includes color information, such as red. When the electronic device 100 receives the first information, the electronic device 100 calls a corresponding method in HiCar, where the method may be a display method defined in the CarKit, or a method for displaying red, etc.; HiCar instructs the in-vehicle electronic device 200 to display red on its display device by calling a corresponding interface, which is an interface of the in-vehicle electronic device 200 or an interface of a display of the in-vehicle electronic device 200.
Also illustratively, the first information includes pattern information, such as a heart pattern. When the electronic device 100 receives the first information, the electronic device 100 calls a corresponding method in HiCar, where the method may be a display method defined in the CarKit, or a method for displaying a heart pattern, etc.; HiCar instructs the in-vehicle electronic device 200 to display the heart pattern on its display device by calling a corresponding interface, which is an interface of the in-vehicle electronic device 200 or an interface of a display of the in-vehicle electronic device 200.
Further illustratively, the first information includes audio information. The electronic device 100 receives the first information and calls a corresponding method in HiCar (a defined audio-output method); HiCar instructs the in-vehicle electronic device 200 to output the corresponding audio data by calling a corresponding interface, which is an interface of the in-vehicle electronic device 200 or an interface of an audio output apparatus of the in-vehicle electronic device 200.
Further illustratively, the first information is a trigger instruction. The electronic device 100 receives the first information, determines that the first operation corresponding to it is displaying red, and calls a corresponding method in HiCar (a defined display method, or a red-display method, etc.); HiCar instructs the in-vehicle electronic device 200 to display red on its display device by calling a corresponding interface, which is an interface of the in-vehicle electronic device 200 or an interface of a display of the in-vehicle electronic device 200.
In some embodiments, when the target application of the electronic device 100 calls the display method/function, HiCar may determine the interface to call based on the priority of the display device and instruct the in-vehicle electronic device 200 to display the corresponding content. When the target application calls the audio-output method/function, HiCar may determine the interface to call based on the priority of the audio device and instruct the in-vehicle electronic device 200 to output the corresponding audio content. The priority of the display device and the priority of the audio device may be preset by the user in the HiCar application, or may be determined by the electronic device 100 based on the current operating states (e.g., idle state, network state) of the display device and the audio device.
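The priority-based selection described here could be sketched as follows. The device records, field names, and tie-breaking order are assumptions; the patent only says the called interface may be chosen from a user-set priority or from the devices' current operating states.

```python
def pick_device(devices):
    """Choose the output device to route the first operation to:
    prefer devices that are currently idle, then the lowest priority
    number (1 = highest priority)."""
    return min(devices, key=lambda d: (not d["idle"], d["priority"]))

# Hypothetical in-vehicle display devices with user-set priorities.
devices = [
    {"name": "headlight", "priority": 2, "idle": True},
    {"name": "hud",       "priority": 1, "idle": False},  # busy -> skipped
    {"name": "screen",    "priority": 3, "idle": True},
]
```

Here the head-up display has the highest user-set priority but is busy, so the sketch falls back to the idle headlight.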
Step S106: the in-vehicle electronic device 200 performs the first operation.
The electronic device 100 instructs the in-vehicle electronic device 200 to perform the first operation, and the in-vehicle electronic device 200 receives the indication information and performs it. The in-vehicle electronic device 200 may be a headlight, a display screen, a rearview mirror, a vehicle-mounted bracket, a speaker, or the like of the driver's vehicle. When the in-vehicle electronic device 200 has a display function (e.g., a headlight, a display screen, or a rearview mirror on the driver's vehicle), the first operation includes displaying a corresponding color, pattern, etc.; when the in-vehicle electronic device 200 has an audio output function (e.g., a speaker or horn on the driver's vehicle), the first operation includes outputting corresponding audio data. Thus, when the passenger sees the in-vehicle electronic device 200 perform the first operation, the passenger can quickly identify the vehicle corresponding to the matched driver-side device. After the passenger identifies the driver's vehicle and gets in, the driver and passenger confirm each other, and the passenger or driver indicates that the driver has picked up the passenger by manually triggering a designated button in the target application.
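The capability split described above (display devices show colors or patterns, audio devices output audio) can be sketched as a small dispatch function; the device names and payload keys are illustrative assumptions, not part of the patent.

```python
def dispatch_first_operation(device_kind, first_info):
    """Map the first information onto the operation a given in-vehicle
    device can actually perform: display devices show a color or pattern,
    audio devices play audio data."""
    display_devices = {"headlight", "screen", "rearview_mirror"}
    audio_devices = {"speaker", "horn"}
    if device_kind in display_devices:
        return ("display", first_info.get("color") or first_info.get("pattern"))
    if device_kind in audio_devices:
        return ("play", first_info.get("audio"))
    raise ValueError(f"unsupported device: {device_kind}")
```

A bracket-mounted light or car-machine screen would slot into the display set the same way.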
In some application scenarios, the passenger-side device starts a target application (a taxi taking application), the passenger places an order through the target application, the server 102 of the target application issues the passenger's order requirement to the target application of the driver-side device (the electronic device 100), and when a driver accepts the order, a riding order association relationship between the driver-side device and the passenger-side device is established. When the server 102 detects, based on the positioning of the driver-side device and the passenger-side device, that the driver has arrived near the passenger's location, the server 102 sends indication information (the first information) to the driver-side device. The indication information may indicate a color or a pattern, and the driver-side device controls the in-vehicle electronic device 200 to display the first color or the first pattern based on the indication information (the first operation). The passenger can then quickly find the driver's vehicle through the first color or the first pattern, completing the pickup.
In some application scenarios, a riding order association relationship is established between the driver-side device and the passenger-side device, and the boarding time associated with the riding order is noon the next day. When the driver arrives at the boarding location associated with the riding order in advance, the server 102 detects the arrival but does not trigger the driver-side device to execute the first operation. When the server 102 detects that the current time has reached the boarding time, the server 102 sends indication information (the first information) to the driver-side device, and the driver-side device controls the in-vehicle electronic device 200 to execute the first operation based on the indication information. The passenger can then quickly find the driver's vehicle through the first operation, completing the pickup.
It can be seen that in steps S101 to S106 above, the target application enables the passenger to identify the driver by invoking and controlling the in-vehicle electronic device 200. In some embodiments, through step S107 and step S108, the embodiment of the present application can likewise enable the driver to identify the passenger.
Optionally, step S107: the server 102 detects that the electronic device 100 satisfies the first condition, and sends the second information to the passenger-side device.
The first condition here is the same as the first condition in step S104; reference may be made to the description in step S104, which is not repeated here.
The server 102 detects that the electronic device 100 satisfies the first condition and sends second information to the passenger-side device, where the second information may include color information, pattern information, frequency information, and the like. In some embodiments, the color information, pattern information, frequency information, and the like included in the second information are the same as those included in the first information of step S104. That is, the server 102 provides the same elements (color, pattern, audio, etc.) to the passenger-side device and the electronic device 100, so that the two match each other.
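A server-side sketch of this matching behavior might look as follows. The color list, function name, and message format are hypothetical; the only point carried over from the text is that the same element is sent to both sides of one riding order.

```python
import random

# Hypothetical palette the server can assign per riding order.
COLORS = ["red", "green", "blue", "purple"]

def issue_match_info(order_id):
    """Pick one element per riding order and send the same color to the
    driver side (first information) and the passenger side (second
    information), so the two can visually match each other."""
    color = random.choice(COLORS)
    first_info = {"order": order_id, "color": color}    # -> driver-side device
    second_info = {"order": order_id, "color": color}   # -> passenger-side device
    return first_info, second_info
```

A real deployment would also need to avoid handing the same color to two orders waiting at the same pick-up location, which the patent does not address.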
In some embodiments, the second information may instruct the passenger-side device to display a corresponding color, pattern, or the like.
In some embodiments, the second information may instruct the passenger-side device to trigger execution of a preset operation, where the preset operation is preconfigured in the passenger-side device. When the server 102 detects that the electronic device 100 meets the first condition, it sends the second information to the passenger-side device, instructing the passenger-side device to trigger execution of the preconfigured operation.
Optionally, step S108: the passenger-side device executes a second operation based on the second information.
The passenger-side device receives the second information and executes the second operation. The second operation includes displaying a corresponding color or pattern, outputting corresponding audio data, and the like. Thus, when the driver sees the passenger-side device execute the second operation, the driver can quickly identify the passenger corresponding to the matched passenger-side device, completing the pickup.
In some embodiments, the passenger-side device instructs a device that has established a connection with it (the electronic device 103) to perform the second operation based on the second information. For example, the passenger-side device is a mobile phone connected to a smart watch; when the passenger-side device receives the second information and the second operation corresponding to the second information is displaying red, the passenger-side device may instruct or invoke the smart watch to display red. The device establishing the connection with the passenger-side device (the electronic device 103) may include a smart watch, a bracelet, smart glasses, headphones, a speaker, and so on.
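The delegation to a connected device described here can be sketched as follows; the class names and the pairing details are illustrative assumptions, not the patent's implementation.

```python
class Wearable:
    """Hypothetical stand-in for a smart watch or bracelet paired with
    the passenger-side phone (the electronic device 103)."""
    def __init__(self):
        self.shown = None

    def show_color(self, color):
        self.shown = color

class PassengerDevice:
    """If a connected device such as a watch is available, forward the
    second operation to it; otherwise perform the operation locally."""
    def __init__(self, wearable=None):
        self.wearable = wearable
        self.shown = None

    def perform_second_operation(self, second_info):
        color = second_info["color"]
        if self.wearable is not None:
            self.wearable.show_color(color)   # delegate to the connected device
        else:
            self.shown = color                # display locally (e.g., flash/screen)
```

A watch raised on the passenger's wrist may be easier for an approaching driver to spot than a phone screen, which motivates the delegation.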
In the embodiment of the application, the server 102 triggers the driver-side device to acquire the first information, where the first information may indicate a color or a pattern, and the driver-side device controls the in-vehicle electronic device 200 to display that color or pattern based on the first information. The passenger can quickly find the driver's vehicle through the color or pattern, so the passenger identifies the driver. Likewise, the server 102 triggers the passenger-side device to acquire the second information, where the second information may indicate a color or a pattern, and the passenger-side device displays that color or pattern. The driver can quickly find the passenger-side device through the color or pattern, so the driver identifies the passenger. Thus, the efficiency of mutual recognition between driver and passenger is improved.
In the following, a specific implementation principle of another device control method provided in the embodiment of the present application is described with reference to the foregoing implementation scenario, that is, the electronic device 100 is a driver-side device, the electronic device 101 is a passenger-side device, and the server 102 is a taxi taking application server. As shown in fig. 5, fig. 5 is a schematic flowchart of another apparatus control method according to an embodiment of the present application.
Step S201: the passenger-side device runs the target application; step S202: the driver-side device runs the target application;
step S203: the passenger-side device and the driver-side device establish a riding order association relationship in the target application.
For specific implementation of steps S201 to S203, reference may be made to the related description of steps S101 to S103, which is not described herein again.
Step S204: the server 102 transmits the first information to the electronic device 100.
After the passenger side device and the driver side device establish a riding order association relationship in the target application, the server 102 of the target application sends first information to the electronic device 100 (the driver side device). The first information may include color information, pattern information, frequency information, and the like. In some embodiments, the first information may instruct the electronic device 100 to display a corresponding color, pattern, or the like after a preset condition is satisfied.
In some embodiments, the first information may instruct the electronic device 100 to perform a preset operation after a preset condition is satisfied. The preset operation is preconfigured in the electronic device 100 and has a mapping relationship with the first information; when the electronic device 100 detects that it meets the preset condition, it executes the preconfigured operation based on the first information.
In some embodiments, step S204 is optional, the electronic device 100 does not necessarily obtain the first information through the server 102, and the first information may also be preset in the electronic device 100, for example, the first information may be factory set, and may also be preset by a user.
Step S205: the electronic device 100 detects that the electronic device 100 satisfies the first condition.
The electronic device 100 detects that the electronic device 100 satisfies the first condition. The description of the first condition may refer to the description of the first condition in step S104, and is not repeated here.
Step S206: the target application of the electronic apparatus 100 instructs the in-vehicle electronic apparatus 200 to perform the first operation based on the first information.
Step S207: the in-vehicle electronic device 200 performs the first operation indicated by the first information.
For specific implementation of step S206 to step S207, reference may be made to the related description of step S105 to step S106, which is not described herein again.
In some embodiments, optionally, step S208: the server 102 sends the second information to the passenger-side device.
After the passenger-side device and the driver-side device establish the riding order association relationship in the target application, the server 102 of the target application sends second information to the passenger-side device. The second information may include color information, pattern information, frequency information, and the like. In some embodiments, the color information, pattern information, frequency information, and the like included in the second information are the same as those included in the first information of step S204. That is, the server 102 provides the same elements (color, pattern, audio, etc.) to the passenger-side device and the electronic device 100, so as to achieve the effect of matching with each other.
In some embodiments, the second information may instruct the passenger-side device to display a corresponding color, pattern, or the like.
In some embodiments, the second information may instruct the passenger-side device to perform a preset operation. The preset operation is preconfigured in the passenger-side device, and a mapping relationship exists between the second information and the preset operation. The passenger-side device executes the preset operation based on the second information.
In some embodiments, step S208 is optional, the passenger-side device does not necessarily obtain the second information through the server 102, and the second information may also be preset in the passenger-side device, for example, the second information may be factory set, and may also be preset by a user.
Optionally, step S209: the passenger-side device executes a second operation based on the second information.
For a specific implementation of step S209, reference may be made to the related description of step S108, which is not described herein again.
In some embodiments, when the passenger-side device receives the instruction, a second operation is performed based on the second information. The instruction may be sent by the server 102, sent by the electronic device 100, or triggered by the user.
In the embodiment of the application, the driver-side device (the electronic device 100) controls the in-vehicle electronic device 200 to display a color or a pattern based on the first information. The passenger can quickly find the driver's vehicle through the color or pattern, so the passenger identifies the driver. Likewise, the passenger-side device acquires second information, which may indicate a color or a pattern, and displays it. The driver can quickly find the passenger-side device through the color or pattern, so the driver identifies the passenger. Thus, the efficiency of mutual recognition between driver and passenger is improved.
The specific implementation principle of the device control method provided in the embodiment of the present application is explained above, and the embodiment of the present application is further described below with reference to a display interface in an application scenario of driving a vehicle.
The passenger places an order through the taxi taking application, and when the driver accepts the order, the association between the driver and the passenger is established. When the driver arrives near the passenger's location, the taxi taking application server detects this based on the positioning of the driver-side device and the passenger-side device; the server then sends indication information to the driver-side device, which may indicate a color or a pattern, and the driver-side device controls the headlights of the driver's vehicle to display the first color or the first pattern. As shown in fig. 6, fig. 6 illustratively shows a user interface 410 in the target application of the driver-side device when the driver arrives near the passenger's location. The user interface 410 may include a map 401, a status bar 402, a toolbar 403, a prompt box 404, and a control 405.
the status bar 402 may include: one or more signal strength indicators for mobile communication signals (which may also be referred to as cellular signals), one or more signal strength indicators for wireless fidelity (Wi-Fi) signals, a battery status indicator, and a time indicator. The toolbar 403 includes one or more functionality controls that provide functionality such as making a call, playing music, and the like. The control 405 is used to zoom in and out on the size of the map 401.
The prompt box 404 indicates "You have arrived near the passenger; your vehicle's headlights now display red". Accordingly, the in-vehicle electronic device 200 (the headlight of the driver's vehicle) displays red.
In some embodiments, the display form of the in-vehicle electronic device 200 may also be that the display screen of the in-vehicle infotainment system displays red, the head-up display of a smart car displays red or shows characters or patterns, a speaker plays designated audio, and the like.
Also shown in fig. 6 is a user interface 510 in the target application of the passenger-side device. The prompt box 501 in the user interface 510 indicates "The driver has arrived near you; the driver's vehicle headlights display red". In this way, the passenger can identify the driver.
In some embodiments, when the taxi taking application server detects that the driver is approaching the passenger, the server sends indication information to the driver-side device instructing it to control the headlights of the driver's vehicle to display a first color or a first pattern, and sends indication information to the passenger-side device instructing it to display the first color or the first pattern. As shown in fig. 7, fig. 7 illustratively shows a user interface 610 in the target application of the driver-side device when the driver approaches the passenger's location. The prompt box 601 in the user interface 610 indicates "You have arrived near the passenger; your headlights now display red, and the passenger-side device also displays red". As can be seen in fig. 7, the in-vehicle electronic device 200 (the headlight of the driver's vehicle) displays red.
Also shown in fig. 7 is a user interface 620 in the target application of the passenger-side device. The prompt box 602 in the user interface 620 indicates "The driver has arrived near you; the driver's vehicle headlights display red, and your flash displays red". As can be seen in fig. 7, the flash of the passenger-side device displays red.
In some embodiments, the display form of the passenger-side device may also be that the display screen displays red, the display screen displays characters or patterns, the speaker plays designated audio, or a wearable device (e.g., a watch or bracelet) connected to the passenger-side device displays a designated color, characters, or pattern, and the like.
In this way, mutual recognition of the passenger and the driver can be achieved.
In the above embodiments, all or part of the implementation may be realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.

Claims (20)

1. A device control method, characterized in that a first electronic device is connected to a vehicle-mounted electronic device, and the method comprises:
the first electronic device and the second electronic device establish a riding order association relationship;
the first electronic device acquires first information from a server, wherein the first information is used for triggering the vehicle-mounted electronic device to execute a first operation, and the first operation comprises at least one of a display operation, a light operation, or an audio playing operation;
the first electronic device controls the vehicle-mounted electronic device to execute the first operation based on the first information.
2. The method of claim 1, wherein the controlling, by the first electronic device, the vehicle-mounted electronic device to perform the first operation based on the first information comprises:
detecting that the first electronic device meets a first condition, and controlling, by the first electronic device, the vehicle-mounted electronic device to perform the first operation based on the first information.
3. The method of claim 2, wherein the first condition comprises a location of the first electronic device reaching a pick-up location associated with the ride order.
4. The method of claim 2, wherein the first condition comprises a time of the first electronic device reaching a pick-up time associated with the ride order.
5. The method of claim 2, wherein the first condition comprises an estimated remaining time for the first electronic device to reach a pick-up location associated with the ride order being within a preset value.
6. The method of claim 2, wherein the first condition comprises the first electronic device receiving a control instruction sent by the server, and wherein the control instruction is associated with a location of the second electronic device.
7. The method according to any one of claims 3-5, wherein the first information is sent to the first electronic device by the server when the server detects that the first electronic device satisfies the first condition.
8. The method of claim 1, wherein the first electronic device obtaining the first information from the server comprises:
the first electronic device receives a user operation for the riding order;
in response to the user operation, the first electronic device acquires the first information from the server.
9. The method of any of claims 1-8, wherein the first electronic device and the in-vehicle electronic device are integrated on the same electronic device.
10. A device control method, characterized in that the method comprises:
the second electronic device and the first electronic device establish a riding order association relationship;
the second electronic device acquires second information from a server, wherein the second information is used for triggering the second electronic device to execute a second operation, and the second operation comprises at least one of a display operation, a light operation, or an audio playing operation;
the second electronic device performs the second operation based on the second information.
11. The method according to claim 10, wherein the second information is sent to the second electronic device by the server when the server detects that the first electronic device satisfies a first condition.
12. The method of claim 11, wherein the first condition comprises a location of the first electronic device reaching a pickup location associated with the ride order.
13. The method of claim 11, wherein the first condition comprises reaching a pick-up time associated with the ride order.
14. The method of claim 11, wherein the first condition comprises an estimated remaining time for the first electronic device to reach a pick-up location associated with the ride order being within a preset value.
15. The method of claim 10, wherein the second electronic device performs the second operation based on the second information, comprising:
upon detecting that the second electronic device satisfies a second condition, performing, by the second electronic device, the second operation based on the second information.
16. The method of claim 15, wherein the second condition comprises a location of the second electronic device reaching a pick-up location associated with the ride order.
17. The method of claim 10, wherein the second electronic device acquiring the second information from the server comprises:
the second electronic device receives a user operation on the ride order; and
in response to the user operation, the second electronic device acquires the second information from the server.
18. The method of any of claims 10-17, wherein the second electronic device performs the second operation based on the second information, comprising:
the second electronic device instructs a third electronic device to perform the second operation based on the second information.
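The passenger-side flow of claims 10-18 can likewise be sketched: the server pushes the second information once the driver-side condition holds (claim 11), the passenger device may gate the operation on its own second condition such as having reached the pick-up location (claims 15-16), and it may delegate the operation to a third device (claim 18). Again an illustrative sketch only; `PassengerDevice`, `on_second_information`, and the `require_second_condition` flag are names invented here, not taken from the application.

```python
class PassengerDevice:
    """Illustrative second electronic device; all names are invented here."""

    def __init__(self, delegate=None):
        self.delegate = delegate  # optional third electronic device (claim 18)
        self.performed = []       # record of operations actually performed

    def on_second_information(self, second_info: dict,
                              at_pickup_location: bool) -> bool:
        # Claims 15-16: optionally wait for a second condition, e.g. the
        # passenger device itself having reached the pick-up location.
        if second_info.get("require_second_condition") and not at_pickup_location:
            return False
        # Claim 18: perform locally, or instruct a third device to perform.
        target = self.delegate if self.delegate is not None else self
        target.perform(second_info["operation"])
        return True

    def perform(self, operation: str) -> None:
        # One of: display / lighting / audio playing operation (claim 10).
        self.performed.append(operation)
```

A phone could, for instance, forward an audio-playing operation to a paired watch by constructing `PassengerDevice(delegate=watch)`, which matches the delegation structure of claim 18.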
19. An electronic device, comprising: one or more processors and one or more memories, wherein the one or more memories are coupled to the one or more processors; the one or more memories are configured to store computer program code, the computer program code comprising computer instructions; and the computer instructions, when executed on the one or more processors, cause the electronic device to perform the method of any one of claims 1-18.
20. A computer-readable medium storing one or more programs, wherein the one or more programs are configured to be executed by one or more processors, the one or more programs comprising instructions for performing the method of any one of claims 1-18.
CN202110738497.7A 2021-06-30 2021-06-30 Equipment control method and related device Pending CN115545812A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110738497.7A CN115545812A (en) 2021-06-30 2021-06-30 Equipment control method and related device
PCT/CN2022/101471 WO2023274136A1 (en) 2021-06-30 2022-06-27 Device control method and related device

Publications (1)

Publication Number Publication Date
CN115545812A true CN115545812A (en) 2022-12-30

Family

ID=84690075

Country Status (2)

Country Link
CN (1) CN115545812A (en)
WO (1) WO2023274136A1 (en)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103489306B * 2012-06-15 2015-09-23 上海飞田通信技术有限公司 System and method for enhancing driver-passenger contact efficiency
CA2868649A1 * 2013-10-25 2015-04-25 Creative Mobile Technologies, LLC System and method for interacting between passenger and in-vehicle equipment
CN106469514A * 2015-08-21 2017-03-01 阿里巴巴集团控股有限公司 Location reminding method and device
CN105818735A * 2016-04-01 2016-08-03 蔡洪斌 Vehicle-mounted electronic display screen prompting method for directing a passenger to the reserved vehicle
CN107784368A * 2016-08-25 2018-03-09 大连楼兰科技股份有限公司 Platform for online ride-hailing management and auxiliary control of vehicle lights
CN107844284B * 2016-09-19 2020-12-04 北京嘀嘀无限科技发展有限公司 Passenger positioning processing method and server
CN109089208A * 2018-08-10 2018-12-25 珠海格力电器股份有限公司 Method, system and mobile terminal for prompting a user to board the correct vehicle
CN110782051A * 2018-11-09 2020-02-11 北京嘀嘀无限科技发展有限公司 Method and system for reminding a service requester
CN112738197A * 2018-12-04 2021-04-30 北京嘀嘀无限科技发展有限公司 Prompting method and apparatus, electronic device and storage medium
CN110853334A * 2020-01-07 2020-02-28 芜湖应天光电科技有限责任公司 Taxi booking system and top-light display method thereof
CN112067012B * 2020-11-12 2021-03-02 南京领行科技股份有限公司 Method and device for determining online ride-hailing pick-up

Also Published As

Publication number Publication date
WO2023274136A1 (en) 2023-01-05

Similar Documents

Publication Publication Date Title
WO2020177619A1 (en) Method, device and apparatus for providing reminder to charge terminal, and storage medium
CN111602108B (en) Application icon display method and terminal
CN114125130B (en) Method for controlling communication service state, terminal device and readable storage medium
CN114095599B (en) Message display method and electronic equipment
CN114466107A (en) Sound effect control method and device, electronic equipment and computer readable storage medium
CN111368765A (en) Vehicle position determining method and device, electronic equipment and vehicle-mounted equipment
CN112543447A (en) Device discovery method based on address list, audio and video communication method and electronic device
CN111542802A (en) Method for shielding touch event and electronic equipment
CN114090102B (en) Method, device, electronic equipment and medium for starting application program
CN114727220B (en) Equipment searching method and electronic equipment
CN111249728B (en) Image processing method, device and storage medium
CN114915721A (en) Method for establishing connection and electronic equipment
CN112532508B (en) Video communication method and video communication device
CN114691248B (en) Method, device, equipment and readable storage medium for displaying virtual reality interface
CN113890929B (en) Method and device for switching audio output channel and electronic equipment
CN113822643A (en) Method and device for setting travel reminding
WO2023274136A1 (en) Device control method and related device
CN114828098A (en) Data transmission method and electronic equipment
CN111339513A (en) Data sharing method and device
CN113672454B (en) Screen freezing monitoring method, electronic equipment and computer readable storage medium
CN116048236B (en) Communication method and related device
CN115022807B (en) Express information reminding method and electronic equipment
CN116346982B (en) Method for processing audio, electronic device and readable storage medium
EP4184298A1 (en) Method and apparatus for starting application, and electronic device and medium
CN115705565A (en) Payment method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination