CN111158549B - Travel order processing method and electronic equipment

Info

Publication number
CN111158549B
Authority
CN
China
Prior art keywords
input
information
user
target
travel
Prior art date
Legal status
Active
Application number
CN201911402591.4A
Other languages
Chinese (zh)
Other versions
CN111158549A (en)
Inventor
高桦
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911402591.4A
Publication of CN111158549A
Application granted
Publication of CN111158549B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on GUIs, based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487 Interaction techniques based on GUIs, using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on GUIs, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/06 Buying, selling or leasing transactions
    • G06Q 30/0601 Electronic shopping [e-shopping]
    • G06Q 30/0633 Lists, e.g. purchase orders, compilation or processing
    • G06Q 30/0635 Processing of requisition or of purchase orders
    • G06Q 30/0641 Shopping interfaces
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Development Economics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Navigation (AREA)

Abstract

The embodiment of the invention discloses a travel order processing method and electronic equipment, which are used to solve the problem that the operation process is complex when people travel by public transportation. The method comprises the following steps: receiving a first input of a first user on a shooting preview picture; displaying target travel mode information in response to the first input, the target travel mode information being determined based on the input trajectory of the first input; receiving a second input of the first user for target travel information; and outputting the travel order in response to the second input. With this technical scheme, the travel mode is determined more simply and quickly when a travel order is processed; because the travel mode is entered on the shooting preview picture, its determination is more intuitive and visual, and the interactive experience between the user and the electronic equipment while the travel mode is determined is improved. Meanwhile, the second input of the first user for the target travel information can be responded to and the travel order output, which simplifies the generation of the travel order and makes travel more convenient for the user.

Description

Travel order processing method and electronic equipment
Technical Field
The invention relates to the technical fields of augmented reality (AR) and information processing, and in particular to a travel order processing method and electronic equipment.
Background
At present, people increasingly travel by public transportation, for example by hailing a car or riding a shared bicycle. However, in the prior art, transportation-related applications suffer from complex operation processes, inaccurate positioning information, difficulty for drivers and passengers in communicating geographic information, and the like, which consumes travel time and reduces people's travel efficiency. In addition, when vehicles (such as shared bicycles) are searched for with such applications, problems such as untimely information updates, non-intuitive navigation and inaccurate positioning mean that people cannot find the vehicles, or find only faulty ones, which also reduces travel efficiency.
Disclosure of Invention
The embodiment of the invention provides a travel order processing method and electronic equipment, and aims to solve the problem that the operation process is complex when people travel by using public transportation.
To solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides a method for processing a travel order, including:
receiving a first input of a first user on a shooting preview picture;
displaying target travel mode information in response to the first input, the target travel mode information being determined based on an input trajectory of the first input;
receiving a second input of the first user for target trip information;
outputting the travel order in response to the second input.
In a second aspect, an embodiment of the present invention further provides an electronic device, including:
the first receiving module is used for receiving a first input of a first user on a shooting preview picture;
the display module is used for responding to the first input and displaying the target travel mode information; the target travel mode information is determined based on the input track of the first input;
the second receiving module is used for receiving second input of the first user on the target trip information;
and the output module is used for responding to the second input and outputting the travel order.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
a memory storing computer program instructions;
a processor, wherein the computer program instructions, when executed by the processor, implement the travel order processing method according to any one of the above.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium includes instructions, and when the instructions are executed on a computer, the instructions cause the computer to execute the method for processing a travel order as described in any one of the above.
In the embodiment of the invention, when a travel order is processed, a first input of a first user on a shooting preview picture is received, and target travel mode information is displayed in response to the first input, so that the travel mode is determined more simply and quickly; because the travel mode is entered on the shooting preview picture, its determination is more intuitive and visual, and the interactive experience between the user and the electronic equipment while the travel mode is determined is improved. Meanwhile, a second input of the first user for the target travel information can be received and the travel order output in response to the second input; compared with the traditional way of determining a travel order, several links (such as APP login, skipping advertisements and confirming information) are omitted, which simplifies the generation of the travel order and makes travel more convenient for the user. In addition, the technical scheme can be combined with existing functions (such as a map) on the electronic equipment to issue various tasks and directly close the loop of a scene (such as reserving a restaurant and directly hailing a car).
Drawings
Fig. 1 is a schematic flow chart of a travel order processing method according to an embodiment of the present invention.
Fig. 2 is a schematic flow chart of a travel order processing method according to another embodiment of the present invention.
FIG. 3 is a schematic interface diagram of an input trace in one embodiment of the invention.
FIG. 4 is a schematic interface diagram of an input trace in another embodiment of the invention.
Fig. 5 is a schematic interface diagram of opening a live-action map according to an embodiment of the present invention.
Fig. 6 is a schematic interface diagram for determining target travel information according to an embodiment of the present invention.
FIG. 7 is a schematic interface diagram of a shared bicycle search interface in one embodiment of the invention.
FIG. 8 is a schematic interface diagram of a shared bicycle search interface in another embodiment of the present invention.
FIG. 9 is a schematic interface diagram of a shared bicycle unlock interface in an embodiment of the present invention.
FIG. 10 is a schematic interface diagram of one method of selecting attribute information in an embodiment of the invention.
FIG. 11 is a schematic interface diagram of a recommended travel mode in an embodiment of the invention.
Fig. 12 is a schematic structural diagram of an electronic device in an embodiment of the invention.
Fig. 13 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a schematic flow chart of a travel order processing method according to an embodiment of the present invention. The method of fig. 1, which is applied to an electronic device, may include:
step 102, receiving a first input of a first user on a shooting preview picture.
The input content of the first input may be an input track used to determine the target travel mode. The first input may be: a click input on a virtual key on the screen of the electronic device; a double-click input or a long-press input on a physical key of the electronic device; a long-press input in a blank area of the screen of the electronic device; and so on.
The shooting preview picture is captured by a camera of the electronic equipment, and may be either the interface currently displayed on the screen of the electronic equipment or the scene in front of the camera that the camera can capture. Therefore, when performing the first input on the shooting preview picture, the first user may input either on the screen or in front of the camera at a position the camera can capture.
In this embodiment, before the first input of the first user on the shooting preview picture is received, an AR mode may be entered through a first specified operation; in the AR mode, the camera is turned on and the shooting preview picture is displayed.
And 104, responding to the first input, and displaying target travel mode information, wherein the target travel mode information is determined based on the input track of the first input.
The target travel modes may include taking a taxi, riding a shared bicycle, taking a subway, taking a bus, and the like.
And 106, receiving a second input of the target trip information by the first user.
The target travel information may include information such as a travel mode, a starting position, a destination, and a departure time. The second input is used for determining target travel information. The second input may be: clicking input on the live-action map; a drag input on the live-action map; long press input on the live-action map; and so on.
And step 108, responding to the second input, and outputting the travel order.
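The four steps above can be read as one event-driven flow. The Kotlin sketch below makes that flow concrete under stated assumptions: the patent does not prescribe any concrete API, so every type name and trajectory label here is a hypothetical placeholder.

```kotlin
// A minimal sketch of steps 102-108; all names are illustrative assumptions.
data class TravelOrder(val mode: String, val start: String,
                       val destination: String, val departure: String)

class TravelOrderFlow {
    private var targetMode: String? = null

    // Steps 102/104: receive the first input on the shooting preview picture
    // and display the travel mode determined from its input track.
    fun onFirstInput(trajectoryLabel: String) {
        targetMode = when (trajectoryLabel) {
            "car_shape", "word_car" -> "taxi"
            "bicycle_shape", "two_wheels" -> "shared bicycle"
            else -> null
        }
        targetMode?.let { println("Displaying target travel mode: $it") }
    }

    // Steps 106/108: receive the second input carrying the target travel
    // information and output the travel order.
    fun onSecondInput(start: String, destination: String, departure: String): TravelOrder? =
        targetMode?.let { TravelOrder(it, start, destination, departure) }
}

fun main() {
    val flow = TravelOrderFlow()
    flow.onFirstInput("two_wheels")
    println(flow.onSecondInput("XX Mall", "XX Hospital", "in 5 minutes"))
}
```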
In the embodiment of the invention, when a travel order is processed, a first input of a first user on a shooting preview picture is received, and target travel mode information is displayed in response to the first input, so that the travel mode is determined more simply and quickly; because the travel mode is entered on the shooting preview picture, its determination is more intuitive and visual, and the interactive experience between the user and the electronic equipment while the travel mode is determined is improved. Meanwhile, a second input of the first user for the target travel information can be received and the travel order output in response to the second input; compared with the traditional way of determining a travel order, several links (such as APP login, skipping advertisements and confirming information) are omitted, which simplifies the generation of the travel order and makes travel more convenient for the user. In addition, the technical scheme can be combined with existing functions (such as a map) on the electronic equipment to issue various tasks and directly close the loop of a scene (such as reserving a restaurant and directly hailing a car).
In one embodiment, the input track of the first input may be acquired, and the target travel mode determined based on that track.
The input track can comprise a track formed by at least one of a vehicle shape, a symbol corresponding to the vehicle and a character corresponding to the vehicle.
Optionally, the corresponding relationship between the input trajectory and the travel mode may be preset. The corresponding relationship can be established by any one of the following methods:
(1) Establishing the correspondence between each input track and a travel mode according to the travel mode that the user submits for each input track on the electronic device.
(2) Determining the correspondence between each input track and a travel mode according to the travel modes stored for the input tracks on the network side by a worker with certain authority.
For example, the input track corresponding to the shared-bicycle riding mode may be preset as a track formed by a bicycle shape, two wheels, the symbol "bicycle", the character "bike" and the like, and the input track corresponding to the taxi mode as a track formed by a car shape, four wheels, the symbol "car", the character "car" and the like.
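A preset correspondence of this kind is naturally a lookup table. The Kotlin sketch below shows one way to hold it, assuming a separate recognizer (not shown) has already reduced the raw touch points of the first input to a label; the labels, the enum and the function names are all illustrative assumptions, and the mutable map reflects method (1), under which the user can submit new entries.

```kotlin
enum class TravelMode { TAXI, SHARED_BICYCLE, SUBWAY, BUS }

// Preset track-to-mode entries; method (1) above lets the user extend this
// table with tracks submitted on the electronic device.
val trajectoryToMode: MutableMap<String, TravelMode> = mutableMapOf(
    "car_shape" to TravelMode.TAXI,
    "four_wheels" to TravelMode.TAXI,
    "word_car" to TravelMode.TAXI,
    "bicycle_shape" to TravelMode.SHARED_BICYCLE,
    "two_wheels" to TravelMode.SHARED_BICYCLE,
    "word_bike" to TravelMode.SHARED_BICYCLE,
)

// Returns null for an unrecognized track, in which case no travel mode
// information would be displayed.
fun resolveTravelMode(label: String): TravelMode? = trajectoryToMode[label]

fun main() {
    trajectoryToMode["word_ride"] = TravelMode.SHARED_BICYCLE // user-submitted entry
    println(resolveTravelMode("word_ride")) // SHARED_BICYCLE
}
```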
In this embodiment, the input track of the first input is acquired and the user's travel mode is determined based on it. Because input tracks can represent the various travel modes pictorially, the operation of determining the user's travel mode becomes more intuitive, visual and engaging, as well as more convenient.
In one embodiment, the target travel information may include information such as a starting position and a destination. When a second input of the first user is received, the live-action map is displayed in response to the second input; when a third input of the first user on a first building and a second building on the live-action map is received, the position of the first building is determined as the starting position, and the position of the second building as the destination, in response to the third input.
Wherein the second input is used to open the live-action map. The second input may be: clicking input on the shooting preview picture; long press input on the shooting preview picture; two-finger zoom input on a shooting preview picture; and so on. The third input is used to determine a starting location and a destination. The third input may be: clicking input on the live-action map; an operation input on the live-action map; and so on.
Alternatively, the live-action map may be opened by performing a gesture motion on the shooting preview screen. Wherein the gesture motion can be preset. Assuming that the preset gesture is a two-finger zooming action, the user performs the two-finger zooming action on the shooting preview picture, so that the electronic equipment can open the live-action map.
In this embodiment, the live-action map can be opened by a gesture the user performs on the shooting preview picture. This removes links such as skipping advertisements and APP login that a traditional method requires when opening a map, saves time, and makes the whole human-computer interaction more engaging.
Optionally, the third input of the user on a building on the live-action map may be performed as follows: when the user clicks any position on the live-action map, a positioning icon appears; the user can then drag the positioning icon manually, and when the icon rests on a building for long enough, or the camera captures the user's confirmation gesture at the spatial position corresponding to the icon in front of the camera, the position of that building can be determined as the starting position or the destination.
If the starting position is confirmed by letting the positioning icon rest on a building, a preset dwell time can be set. Assuming the preset dwell time is 3 seconds, the user drags the positioning icon and lets it rest on a building for at least 3 seconds, and the electronic device determines the position of that building as the starting position. If the starting position is confirmed by a gesture captured by the camera of the electronic device, the type of confirmation gesture can be preset. Assuming the confirmation gesture is drawing a check mark (√) by hand, when the camera of the electronic device captures the user drawing a check mark at the spatial position corresponding to the positioning icon in front of the camera, the electronic device determines the position of the positioning icon as the starting position.
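The dwell-time confirmation reduces to a small timer check. The Kotlin sketch below is one minimal rendering of it, with the 3-second threshold taken from the example above and every type name an assumption.

```kotlin
// A minimal sketch of dwell-time confirmation: the building under the dragged
// positioning icon is confirmed as the starting position once the icon has
// rested on it for the preset dwell time (3 s in the example above).
data class Building(val name: String)

class DwellConfirmer(private val dwellMillis: Long = 3_000) {
    private var current: Building? = null
    private var enteredAt: Long = 0

    // Called each time the positioning icon is reported over a building.
    // Returns the confirmed building, or null while still waiting.
    fun onIconOver(building: Building, nowMillis: Long): Building? {
        if (building != current) {   // icon moved to a new building: restart the timer
            current = building
            enteredAt = nowMillis
            return null
        }
        return if (nowMillis - enteredAt >= dwellMillis) building else null
    }
}

fun main() {
    val confirmer = DwellConfirmer()
    val cinema = Building("Patent Cinema")
    confirmer.onIconOver(cinema, nowMillis = 0)
    println(confirmer.onIconOver(cinema, nowMillis = 3_500)?.name) // Patent Cinema
}
```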
Optionally, after the starting position and the destination of the user are determined, on the live-action map, the building where the starting position and the destination are located may be highlighted according to a preset display manner. The preset display mode can be a highlight display mode, a thickening display mode and the like.
In this embodiment, highlighting the buildings at the starting position and the destination according to the preset display manner effectively prompts the user to check whether the selected positions are correct, which improves the accuracy of geographic positioning and thus travel efficiency.
In this embodiment, the target travel information is determined by the third input of the user to the building displayed on the live-action map, so that the determination process of the target travel information of the user is simplified, and the determination mode of the target travel information is more convenient and faster.
In one embodiment, after the live-action map is displayed, when a fourth input of the first user to a second building on the live-action map is received, at least one item of attribute information of the second building is displayed in response to the fourth input; when a fifth input of the first user to the target attribute information in the at least one item of attribute information is received, a target operation corresponding to the target attribute information is executed in response to the fifth input.
Wherein the fourth input is used to display the attribute information of the second building. The fourth input may be: a click input on the live-action map; a long-press input on the live-action map; a two-finger zoom input on the live-action map; and so on. The attribute information may include encyclopedia information, travel information, associated information of the second building, and the like; the associated information of the second building may be smart identification information of the building (for example, a restaurant's smart identification information may be reservation, and a hospital's may be appointment registration). The fifth input is used to select the target attribute information. The fifth input may be: a click input on the target attribute information; a long-press input on the target attribute information; and so on.
For example, the second building is a restaurant, and when it is received that the user performs a two-finger zoom operation on the restaurant on the live-action map, attribute information (encyclopedia, travel, reservation) of the restaurant is displayed. If the clicking operation on the encyclopedia is received, opening the encyclopedia and displaying the details of the restaurant; if the clicking operation on the trip is received, pushing a trip mode option; and if the click operation for the reservation is received, opening a reservation interface of the restaurant.
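In code, selecting target attribute information is a dispatch from attribute item to target operation. Below is a hedged Kotlin sketch of that dispatch for the restaurant example; the enum and the printed operations are illustrative only.

```kotlin
// A sketch of the fifth-input dispatch for the restaurant example above:
// each attribute item maps to one target operation. All names are assumed.
enum class Attribute { ENCYCLOPEDIA, TRAVEL, RESERVATION }

fun onAttributeSelected(attribute: Attribute) = when (attribute) {
    Attribute.ENCYCLOPEDIA -> println("Open the encyclopedia page with the restaurant's details")
    Attribute.TRAVEL -> println("Push travel mode options")
    Attribute.RESERVATION -> println("Open the restaurant's reservation interface")
}

fun main() {
    onAttributeSelected(Attribute.TRAVEL) // user clicked the travel item
}
```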
In this embodiment, the attribute information of a building can be displayed through the user's operation on that building on the live-action map, and the corresponding target operation is executed according to the user's selection of target attribute information. This offers the user a variety of choices, lets the user select target attribute information as needed, and makes the whole human-computer interaction more engaging.
In one embodiment, the target travel information further includes a departure time, and the target travel information may be determined by receiving a second input from the first user into the preset input box.
And the second input is used for inputting the target trip information.
For example, when the user clicks an input box corresponding to the departure time on the live-action map, the user may determine the departure time by inputting time information or selecting a preset time point, and the start position and the destination information may be directly input in the input box.
In this embodiment, the target travel information is determined through the second input of the user in the preset input box, so that the target travel information determination process of the user is simplified, and the determination mode of the target travel information is more convenient.
In one embodiment, in the case that the target travel mode is determined to be the shared-bicycle riding mode, a virtual live-action view and at least one piece of shared bicycle information within a preset range are displayed, and when a sixth input of the first user on a target shared bicycle is received, live-action navigation information is displayed in response to the sixth input.
The shared bicycle information may include information such as a current first location of the shared bicycle, a distance to a current second location of the electronic device, and a route to the second location. The sixth input is for selecting a target shared bicycle. The navigation destination in the live-action navigation information is the current first position of the target sharing bicycle.
Optionally, the sixth input may be: clicking and inputting the position of the target sharing bicycle; inputting the long press of the position of the target sharing bicycle; and so on.
For example, when the user's input track on the shooting preview picture forms two wheel shapes, the target travel mode is determined to be the shared-bicycle riding mode; a virtual live-action view of the area centered on the user with a radius of 200 meters is displayed together with the shared bicycle information in that area, and when the user clicks the first position of any shared bicycle, the electronic device performs live-action navigation to the first position of the clicked shared bicycle.
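The preset-range display amounts to a radius filter around the user's position. The Kotlin sketch below shows one way to write that filter; the flat-earth distance approximation and all names are assumptions made for illustration.

```kotlin
import kotlin.math.cos
import kotlin.math.hypot

// A sketch of the preset-range filter: keep every shared bicycle within
// 200 m of the user, per the example above. The flat-earth distance
// approximation is an assumption; a production system would use a proper
// geodesic distance.
data class Point(val latDeg: Double, val lonDeg: Double)
data class SharedBike(val id: String, val position: Point)

fun approxDistanceMeters(a: Point, b: Point): Double {
    val metersPerDegLat = 111_320.0
    val metersPerDegLon = metersPerDegLat * cos(Math.toRadians(a.latDeg))
    return hypot((a.latDeg - b.latDeg) * metersPerDegLat,
                 (a.lonDeg - b.lonDeg) * metersPerDegLon)
}

fun bikesInRange(user: Point, bikes: List<SharedBike>, radiusMeters: Double = 200.0): List<SharedBike> =
    bikes.filter { approxDistanceMeters(user, it.position) <= radiusMeters }
```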
In the embodiment, the position of the shared bicycle is subjected to live-action navigation, so that the shared bicycle is quickly positioned, a user can quickly find the shared bicycle, and the user can conveniently go out.
In one embodiment, after the position of the first building is determined as the starting position and the position of the second building as the destination, first information about the starting position and the destination can be obtained, and at least one travel mode option pushed according to that first information. When a seventh input of the user on a first travel mode option among the at least one travel mode option is received, the travel mode corresponding to the first travel mode option is determined as the target travel mode in response to the seventh input.
The first information comprises at least one of the position information of the starting position and the destination, the distance between them, and road condition information. Different first information corresponds to different travel mode options. The travel mode options may include options corresponding to riding a shared bicycle, taking a taxi, taking a subway, taking a bus, and the like. The seventh input is used to select a travel mode option.
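Since the patent leaves the mapping from first information to options unspecified, the thresholds in this Kotlin sketch are invented purely for illustration; only the shape of the rule (different first information yields different options) comes from the text.

```kotlin
// One plausible (invented) rule set for pushing travel mode options from the
// first information: distance between the two positions and road conditions.
fun pushTravelModeOptions(distanceMeters: Double, congested: Boolean): List<String> = when {
    distanceMeters < 1_000.0 -> listOf("walk", "shared bicycle")
    distanceMeters < 5_000.0 && !congested -> listOf("shared bicycle", "taxi", "bus")
    congested -> listOf("subway", "bus")
    else -> listOf("taxi", "subway", "bus")
}

fun main() {
    println(pushTravelModeOptions(800.0, congested = false))  // [walk, shared bicycle]
    println(pushTravelModeOptions(8_000.0, congested = true)) // [subway, bus]
}
```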
In this embodiment, the travel mode options are pushed according to the first information of the starting position and the destination, so that the effect of intelligently pushing the travel mode options for the user is achieved, the user can select a proper travel mode to travel, and the travel is more convenient and faster.
In one embodiment, after the travel order is output, when an eighth input of the first user is received, a travel information sharing message is generated and sent to a target electronic device associated with the electronic device corresponding to the first user; when a feedback message sent by the target electronic device is received, the target travel information is sent to the target electronic device for display.
The feedback message indicates that sharing the first user's target trip information has been confirmed. The eighth input instructs generation of the travel information sharing message, and the sharing content corresponding to that message is the target travel information.
The embodiment can be applied to taxi taking scenes. In the taxi taking scene, after the electronic device generates the target trip information of the first user, the first user inputs a taxi taking command (i.e., an eighth input) on the electronic device to trigger the electronic device to generate a taxi taking order (i.e., a trip information sharing message). The electronic device sends the taxi-taking order to the target electronic device associated with the electronic device. In this scenario, the associated target electronic devices refer to devices that use the same taxi-taking software. When receiving a receiving message (namely a feedback message) of a taxi taking order sent by the target electronic equipment, the electronic equipment sends the target trip information of the first user to the target electronic equipment for displaying, so that a second user (namely a taxi taking driver) of the target electronic equipment can view the target trip information of the first user, and the trip information is shared.
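The exchange in this taxi-hailing scenario is a simple two-phase handshake. The Kotlin sketch below models it with hypothetical message types; the patent defines no wire format, so every name and field here is an assumption.

```kotlin
// A schematic two-phase exchange for the taxi-hailing scenario: the rider's
// device sends a sharing message (the taxi order), and the target travel
// information is released only after a positive feedback message arrives.
data class TripInfo(val start: String, val destination: String, val departure: String)
data class ShareMessage(val orderId: String)
data class FeedbackMessage(val orderId: String, val accepted: Boolean)

class RiderDevice(private val trip: TripInfo) {
    // Eighth input: generate the travel information sharing message.
    fun onEighthInput(): ShareMessage = ShareMessage(orderId = "order-001")

    // The trip details are shared only when the driver's device confirms.
    fun onFeedback(feedback: FeedbackMessage): TripInfo? =
        if (feedback.accepted) trip else null
}

fun main() {
    val rider = RiderDevice(TripInfo("Patent Cinema", "XX Hospital", "in 5 minutes"))
    val order = rider.onEighthInput()
    println(rider.onFeedback(FeedbackMessage(order.orderId, accepted = true)))
}
```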
Applied to a taxi-hailing scene, the technical scheme of this embodiment can generate the travel information sharing message according to the user's operation and, upon receiving the feedback message from the target electronic device, send the target travel information to the target electronic device for display. The driver therefore sees the user's target travel information after accepting the order; because the target travel information is received accurately, travel time is saved and travel efficiency improved.
Fig. 2 is a schematic flow chart of a travel order processing method according to another embodiment of the present invention. The method for processing the travel order is applied to the mobile phone. The method of fig. 2 may include:
step 201, entering an AR mode based on a first input of a user, and opening a shooting preview screen of a camera in the AR mode.
Optionally, the first input may be: clicking input on a virtual key on a mobile phone screen; double-click input or long-press input on physical keys on the mobile phone; inputting a long press in a blank area on a mobile phone screen; and so on.
For example, the preset long press time is 5 seconds, and when the physical key on the mobile phone is pressed for at least 5 seconds, the AR mode can be triggered.
Step 202, receiving an input track of a user on a shooting preview picture of the mobile phone.
And step 203, determining a target travel mode based on the input track.
The target travel mode can comprise a taxi taking mode, a shared bicycle riding mode and the like.
As shown in fig. 3, the input trajectory of the taxi mode may be a trajectory of a preset type, such as a shape-type trajectory or a character-type trajectory. For example, shape-type trajectories may include those corresponding to the car shape 310, the tire shape 320, and the like; character-type trajectories may include the word "car" 330 and the like. In addition, trajectories of other shapes can be entered through the custom icon 340 according to the user's needs.
For example, when the user's input trajectory on the photographing preview screen forms a car shape, it may be determined that the user's target travel style is a taxi taking style.
Optionally, when the user aims the mobile phone camera at any surrounding building in the AR mode, it may also be determined that the target trip mode is a taxi taking mode.
As shown in fig. 4, the input trajectory of the shared-bicycle mode may be a trajectory of a preset type, such as a shape-type trajectory or a character-type trajectory. For example, shape-type trajectories may include those corresponding to the bicycle shape 410, the two wheels 420, and the like; character-type trajectories may include the symbol "Bic" 430, the word "bicycle" 440, the word "ride" 450, and the like. In addition, trajectories of other shapes can be entered through the custom icon 460 according to the user's needs.
For example, when the input trajectory of the user on the photographing preview screen forms a bicycle shape, it may be determined that the target travel mode of the user is a share-bike mode.
And step 204, opening the live-action map based on the gesture action of the user on the shooting preview picture.
The gesture actions can be preset, such as two-finger zooming, long-pressing and other actions.
For example, a gesture motion may be preset as a two-finger zoom motion, and as shown in fig. 5, when the user performs the two-finger zoom motion on the shooting preview screen, the live-action map may be opened.
Step 205, receiving the determination operation of the user on the target trip information on the live-action map.
When the target travel mode is the taxi mode, as shown in fig. 6(a), a target building and a target travel information confirmation text box are displayed on the live-action map. The default starting position is the position of the target building, namely Patent Supermarket (Patent Avenue), which is highlighted on the screen. The default departure time is 3 minutes later, and the vehicle closest to the user is pushed (operator XX's taxi). The user can preset the departure time and the vehicle operator.
Specifically, the user can click the input box corresponding to the departure time in the target travel information confirmation text box to determine or modify the departure time, and can click the input box corresponding to the vehicle in the same text box to enter the vehicle operator in the pop-up input box.
For example, the user may set the departure time to 5 minutes later and enter taxi operator M.
In one embodiment, the starting position may be modified by clicking the highlighted area on the live-action map or the input box corresponding to the starting position in the target travel information confirmation text box. When the user clicks the highlighted area on the live-action map, a positioning icon appears, as shown in fig. 6(b); the user can drag the positioning icon manually, and when the icon rests on a building for long enough, the position of the icon can be determined as the starting position. When the user clicks the input box corresponding to the starting position in the target travel information confirmation text box, a search box pops up on the live-action map, and the user can enter the starting position in the search box.
For example, if the initial position of the user is determined by the long-time stay of the positioning icon on a building, the preset stay time can be set to be 5 seconds, and when the user manually drags the positioning icon to stay on a building for at least 5 seconds, the position of the building can be determined to be the initial position.
In one embodiment, the user may also determine the destination in the manner described above. In addition, the user can input the track corresponding to a building on the shooting preview picture, and the place where that building is located is determined as the destination according to the preset correspondence between building tracks and destinations.
The correspondence between each building's track and a destination can be stored on the network side by a worker with certain authority, or the user can submit it on the mobile phone.
For example, a user may submit on the mobile phone a track formed by "a house with a fish" whose destination is the N Ocean Hall; when the user inputs the track corresponding to the house with the fish on the shooting preview picture, the destination can be determined to be the N Ocean Hall according to the preset correspondence between building tracks and destinations.
In addition, the user can modify the destination when it deviates. When the deviation is large, the user can directly click the input box corresponding to the destination in the target travel information confirmation text box on the live-action map and enter the correct destination; when the deviation is small, the user can click the positioning icon on the live-action map to adjust the destination.
Assuming the starting position is Patent Cinema (Patent Avenue), the destination is XX Hospital (XX Road), the departure time is 5 minutes later, and the vehicle closest to the user is pushed (operator XX's taxi), the target travel information can be determined as shown in fig. 6(c).
When the target travel mode is the shared-bicycle riding mode, the categories and number of shared bicycles and their distances from the current position are displayed on the live-action map. By default, all categories are searched and one shared bicycle is shown, as in fig. 7. In addition, the user can preset the categories of shared bicycles and the number required.
On the live-action map, the user can view shared bicycle information for all surrounding streets by sliding the mobile phone screen. If the user searches all categories and multiple shared bicycles, the live-action map displays information such as the position of each target shared bicycle, its distance from the user's current position, and the route between the two.
In addition, if the user selects to search the shared bicycle at a specific place, an overhead live-action route map between the position of the shared bicycle and the current position of the user is displayed on the live-action map.
Optionally, when the user searches for shared bicycles of all categories, the shared bicycles are highlighted on the live-action map in a virtual live-action manner. In this case, the user does not need to select the number of vehicles: information such as the positions and number of shared bicycles within a preset range is shown on the screen automatically, and the displayed scale of each vehicle shrinks in proportion to its distance. For example, as shown in fig. 8, if the preset range is 1000 meters, the screen only shows the positions and number of vehicles within 1000 meters: two shared bicycles 810 at 300 meters and one shared bicycle 820 at 900 meters, with the vehicles drawn proportionally smaller as the distance grows.
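The distance-proportional scaling can be captured by a single clamped function. The inverse-linear rule in this Kotlin sketch is an assumption; the text only states that the displayed proportion shrinks as distance grows.

```kotlin
// An assumed inverse-linear rule for the distance-proportional icon scale:
// within the 1000 m range of fig. 8, a bike 300 m away is drawn larger than
// one 900 m away. The clamp bounds are illustrative.
fun iconScale(distanceMeters: Double, rangeMeters: Double = 1_000.0): Double =
    (1.0 - distanceMeters / rangeMeters).coerceIn(0.1, 1.0)

fun main() {
    println(iconScale(300.0)) // 0.7: nearer bikes drawn larger
    println(iconScale(900.0)) // 0.1: farther bikes drawn smaller (clamped)
}
```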
In addition, if the implementation method is applied to walking navigation, the destination can be highlighted after the destination is selected, so that the method is helpful for a user to determine whether the destination is accurate, and the travel efficiency is improved.
Optionally, after the user has found the target shared bicycle on the screen, it may be unlocked in either of the following ways: performing a gesture in front of the camera of the mobile phone, or clicking the target shared bicycle on the live-action map. The user can preset the gesture, such as a click or a swipe.
For example, the user searches all categories and one sharing bicycle, and sees the target sharing bicycle by live-action navigation, as shown in fig. 9, the user clicks on the target sharing bicycle 910 on the live-action map, so as to unlock the bicycle.
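Both unlock paths funnel into the same unlock action, so they can be modeled as one dispatch. In the Kotlin sketch below, all types and the preset gesture name are invented for illustration.

```kotlin
// A sketch of the two unlock paths: a click on the target shared bicycle on
// the live-action map, or a preset gesture captured in front of the camera.
sealed interface UnlockInput
data class MapClick(val bikeId: String) : UnlockInput
data class CameraGesture(val gesture: String, val bikeId: String) : UnlockInput

// Returns the id of the bicycle to unlock, or null if the captured gesture
// does not match the preset one.
fun tryUnlock(input: UnlockInput, presetGesture: String = "swipe"): String? = when (input) {
    is MapClick -> input.bikeId
    is CameraGesture -> if (input.gesture == presetGesture) input.bikeId else null
}

fun main() {
    println(tryUnlock(MapClick("bike-910")))               // bike-910 (fig. 9 example)
    println(tryUnlock(CameraGesture("swipe", "bike-910"))) // bike-910
}
```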
Optionally, the user may click on a building on the live-action map, open attribute information of the building, and execute a corresponding operation when clicking on the attribute information.
When the user clicks the XX hospital (XX road) on the live-action map as shown in fig. 10(a), the attribute information (encyclopedia, travel, smart identification) of the hospital appears on the screen as shown in fig. 10 (b). When the user clicks the encyclopedia, displaying an encyclopedia page of the hospital; when a user clicks a trip, pushing a trip mode option; and when the user clicks the intelligent identification, displaying an appointment registration interface of the hospital.
Optionally, the travel mode may be pushed to the user according to the target travel destination determined by the user, for example, options corresponding to travel modes such as walking, riding a shared bicycle, taking a car, and the like are displayed on the current interface. If the user clicks the walk option, the live view navigation is performed as shown in fig. 11 (a); if the user clicks the option of riding the shared bicycle, the item class and the required number of the shared bicycle are provided as shown in fig. 11(b), and the position of the shared bicycle is subjected to live-action navigation; if the user clicks the taxi taking option, a target trip information confirmation text box is displayed as shown in fig. 11 (c).
In addition, if the user presets the travel mode, the travel mode preset by the user can be pushed only; if the user opens the attribute information of the building at the destination, the travel mode option is not pushed first, and when the user clicks the travel option, the travel mode option is pushed again.
And step 206, generating a travel order according to the target travel information.
The travel order processing method can be applied to a shared-bicycle travel scene: the target shared bicycle information the user needs is determined through the user's operations on the live-action map, which makes determining that information more intuitive and visual and makes geographic positioning more accurate. Live-action navigation to the position of the shared bicycle locates it quickly, so the user can find it quickly, which facilitates the user's travel.
The travel order processing method can also be applied to a taxi travel scene: the user's target travel mode is determined through the input track on the shooting preview picture, and information such as the user's starting position, destination and departure time is determined through operations performed on the live-action map. The target travel information is thus determined more simply and quickly; entering the travel mode on the shooting preview picture makes its determination more intuitive and visual, and improves the interactive experience between the user and the electronic equipment while the target travel information is determined. More accurate geographic positioning gives the user and the driver higher confidence in the agreed meeting place, which improves travel efficiency, especially when one party is unfamiliar with the other's location. In addition, the travel order can be generated automatically from the travel information, which simplifies its generation and makes travel more convenient for the user.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Fig. 12 is a schematic structural diagram of an electronic device in an embodiment of the invention. Referring to fig. 12, an electronic apparatus 1200 may include:
a first receiving module 1210, configured to receive a first input of a first user on a shooting preview screen;
a display module 1220, configured to display the target trip mode information in response to the first input; the target travel mode information is determined based on the input track of the first input;
a second receiving module 1230, configured to receive a second input of the target trip information from the first user;
and an output module 1240 for outputting the travel order in response to the second input.
In one embodiment, the display module 1220 includes:
a first acquisition unit configured to acquire an input trajectory of a first input;
the first determining unit is used for determining a target trip mode based on the input track;
the input track comprises a track formed by at least one of a vehicle shape, a symbol corresponding to the vehicle and a character corresponding to the vehicle.
In one embodiment, the target travel information includes at least one of a starting location, a destination; the second receiving module 1230 includes:
the first receiving unit is used for receiving a second input of the live-action map from the first user;
a first display unit for displaying a live-action map in response to a second input;
a second receiving unit, configured to receive a third input of the first user to the first building and the second building on the live-action map;
and a second determination unit for determining the position of the first building as a starting position and the position of the second building as a destination in response to a third input.
In one embodiment, after displaying the live-action map, the second receiving module 1230 further includes:
a third receiving unit, configured to receive a fourth input of the second building on the live-action map from the first user;
a second display unit for displaying at least one item of attribute information of the second building in response to a fourth input, the attribute information including at least one item of encyclopedia information, trip information, and associated information of the second building;
a fourth receiving unit, configured to receive a fifth input of the target attribute information in the at least one item of attribute information by the first user;
and the execution unit is used for responding to the fifth input and executing the target operation corresponding to the target attribute information.
In one embodiment, the target travel information further includes a departure time; the second receiving module 1230 further includes:
and the fifth receiving unit is used for receiving a second input of the first user in the preset input box, wherein the second input is used for inputting the target trip information.
In one embodiment, the display module 1220 further includes:
the third display unit is used for displaying the virtual reality and at least one piece of shared bicycle information within a preset range under the condition that the travel mode is the shared bicycle riding mode, wherein the shared bicycle information comprises at least one of a current first position of the shared bicycle, a distance between the current first position of the shared bicycle and a current second position of the electronic equipment, and a route between the current first position of the shared bicycle and the second position of the electronic equipment;
a sixth receiving unit, configured to receive a sixth input of the target shared bicycle from the first user;
and the fourth display unit is used for responding to the sixth input and displaying the live-action navigation information, and the navigation destination in the live-action navigation information is the first position of the target shared bicycle.
In one embodiment, the second receiving module 1230 further comprises:
a second acquisition unit configured to acquire first information of a start position and a destination; the first information comprises at least one item of position information of a starting position and a destination, a distance between the starting position and the destination and road condition information; wherein, different first information corresponds to different travel mode options;
the pushing unit is used for pushing at least one travel mode option according to the first information;
a seventh receiving unit, configured to receive a seventh input of the first travel mode option of the at least one travel mode option from the user;
and the third determining unit is used for responding to the seventh input and determining the travel mode corresponding to the first travel mode option as the target travel mode.
In one embodiment, the output module 1240 further includes:
an eighth receiving unit, configured to receive an eighth input of the first user;
a generating and sending unit, configured to generate a travel information sharing message in response to the eighth input, and send the travel information sharing message to a target electronic device associated with an electronic device corresponding to the first user;
the transmitting unit is used for transmitting the target trip information to the target electronic equipment for displaying under the condition of receiving the feedback message transmitted by the target electronic equipment;
the feedback message is used for indicating that the sharing target trip information is determined.
The electronic device provided by the embodiment of the present invention can implement each process implemented by the method for processing a travel order in the foregoing method embodiments, and details are not described here to avoid repetition.
In the embodiment of the invention, when a travel order is processed, a first input of a first user on a shooting preview picture is received, and target travel mode information is displayed in response to the first input, so that the travel mode is determined more simply and quickly; because the travel mode is entered on the shooting preview picture, its determination is more intuitive and visual, and the interactive experience between the user and the electronic equipment while the travel mode is determined is improved. Meanwhile, a second input of the first user for the target travel information can be received and the travel order output in response to the second input; compared with the traditional way of determining a travel order, several links (such as APP login, skipping advertisements and confirming information) are omitted, which simplifies the generation of the travel order and makes travel more convenient for the user. In addition, the technical scheme can be combined with existing functions (such as a map) on the electronic equipment to issue various tasks and directly close the loop of a scene (such as reserving a restaurant and directly hailing a car).
Fig. 13 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention.
The electronic device 1300 includes, but is not limited to: a radio frequency unit 1301, a network module 1302, an audio output unit 1303, an input unit 1304, a sensor 1305, a display unit 1306, a user input unit 1307, an interface unit 1308, a memory 1309, a processor 1310, a power supply 1311, and a shooting unit 1312. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 13 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the processor 1310 is configured to receive a first input of a first user on the shooting preview screen; a display unit 1306 for displaying the target travel mode information in response to the first input; the target travel mode information is determined based on the input track of the first input; a processor 1310, further configured to receive a second input of target travel information from the first user; in response to the second input, the travel order is output.
In the embodiment of the invention, when a travel order is processed, a first input of a first user on a shooting preview picture is received, and target travel mode information is displayed in response to the first input, so that the travel mode is determined more simply and quickly; because the travel mode is entered on the shooting preview picture, its determination is more intuitive and visual, and the interactive experience between the user and the electronic equipment while the travel mode is determined is improved. Meanwhile, a second input of the first user for the target travel information can be received and the travel order output in response to the second input; compared with the traditional way of determining a travel order, several links (such as APP login, skipping advertisements and confirming information) are omitted, which simplifies the generation of the travel order and makes travel more convenient for the user. In addition, the technical scheme can be combined with existing functions (such as a map) on the electronic equipment to issue various tasks and directly close the loop of a scene (such as reserving a restaurant and directly hailing a car).
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1301 may be used to receive and send signals during message transmission or a call; specifically, it receives downlink data from a base station and forwards it to the processor 1310 for processing, and sends uplink data to the base station. In general, the radio frequency unit 1301 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1301 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 1302, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 1303 may convert audio data received by the radio frequency unit 1301 or the network module 1302, or stored in the memory 1309, into an audio signal and output it as sound. The audio output unit 1303 may also provide audio output related to a specific function performed by the electronic device 1300 (e.g., a call signal reception sound, a message reception sound, and the like). The audio output unit 1303 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1304 is used to receive audio or video signals. The input unit 1304 may include a graphics processing unit (GPU) 13041 and a microphone 13042. The graphics processor 13041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 1306, stored in the memory 1309 (or other storage medium), or transmitted via the radio frequency unit 1301 or the network module 1302. The microphone 13042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 1301.
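For context, the "shooting preview screen" referred to throughout is the live camera feed rendered on the display. The patent does not name any camera API; below is a minimal sketch, assuming Android's CameraX library, of one common way such a preview is produced.

import android.content.Context
import androidx.camera.core.CameraSelector
import androidx.camera.core.Preview
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.camera.view.PreviewView
import androidx.core.content.ContextCompat
import androidx.lifecycle.LifecycleOwner

// Hedged sketch: bind the rear camera to an on-screen preview surface.
// CameraX is an assumption here; the patent only requires that a preview
// captured by the camera be displayed.
fun startPreview(context: Context, owner: LifecycleOwner, previewView: PreviewView) {
    val providerFuture = ProcessCameraProvider.getInstance(context)
    providerFuture.addListener({
        val cameraProvider = providerFuture.get()
        val preview = Preview.Builder().build().also {
            // Route camera frames to the preview surface on the display unit.
            it.setSurfaceProvider(previewView.surfaceProvider)
        }
        cameraProvider.unbindAll()
        cameraProvider.bindToLifecycle(owner, CameraSelector.DEFAULT_BACK_CAMERA, preview)
    }, ContextCompat.getMainExecutor(context))
}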
The electronic device 1300 also includes at least one sensor 1305, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 13061 according to the brightness of ambient light, and a proximity sensor, which can turn off the display panel 13061 and/or the backlight when the electronic device 1300 is moved to the ear. As one type of motion sensor, an accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used to identify the posture of the electronic device (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The sensor 1305 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail herein.
The display unit 1306 is used to display information input by the user or information provided to the user. The display unit 1306 may include a display panel 13061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 1307 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 1307 includes a touch panel 13071 and other input devices 13072. The touch panel 13071, also referred to as a touch screen, can collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 13071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 13071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user and the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1310, and receives and executes commands sent by the processor 1310. The touch panel 13071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 13071, the user input unit 1307 may include other input devices 13072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick; these are not described in detail herein.
Further, the touch panel 13071 may be overlaid on the display panel 13061. When the touch panel 13071 detects a touch operation on or near it, the operation is transmitted to the processor 1310 to determine the type of the touch event, and the processor 1310 then provides a corresponding visual output on the display panel 13061 according to that type. Although in fig. 13 the touch panel 13071 and the display panel 13061 are shown as two separate components implementing the input and output functions of the electronic device, in some embodiments the two may be integrated to implement these functions; this is not limited here.
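On a platform such as Android, the coordinate stream described above reaches application code as MotionEvent objects. The sketch below is likewise only an assumption-laden illustration: a transparent overlay view that records the first input's trajectory and hands it to a recognizer such as the hypothetical classifyTrajectory above.

import android.content.Context
import android.view.MotionEvent
import android.view.View

// Hedged sketch: an overlay on top of the camera preview that records the
// user's touch trajectory. Point and classifyTrajectory are the hypothetical
// helpers from the earlier sketch, not APIs from the patent.
class TrajectoryOverlay(context: Context) : View(context) {

    private val trajectory = mutableListOf<Point>()

    // Hypothetical callback for the hosting screen.
    var onTrajectoryClassified: ((String) -> Unit)? = null

    override fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                trajectory.clear()
                trajectory.add(Point(event.x, event.y))
            }
            MotionEvent.ACTION_MOVE -> trajectory.add(Point(event.x, event.y))
            MotionEvent.ACTION_UP -> {
                // Completed first input: classify the drawn shape and let the
                // caller display the corresponding travel mode information.
                val label = classifyTrajectory(trajectory.toList())
                onTrajectoryClassified?.invoke(label)
            }
        }
        return true  // consume the gesture rather than passing it through
    }
}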
The interface unit 1308 is an interface for connecting an external device to the electronic apparatus 1300. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1308 may be used to receive input from an external device (e.g., data information, power, etc.) and transmit the received input to one or more elements within the electronic device 1300 or may be used to transmit data between the electronic device 1300 and an external device.
The memory 1309 may be used to store software programs as well as various data. The memory 1309 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook). Further, the memory 1309 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 1310 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 1309 and calling data stored in the memory 1309, thereby performing overall monitoring of the electronic device. Processor 1310 may include one or more processing units; preferably, the processor 1310 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1310.
The electronic device 1300 may also include a power supply 1311 (e.g., a battery) for powering the various components, and preferably, the power supply 1311 may be logically coupled to the processor 1310 via a power management system that may be configured to manage charging, discharging, and power consumption management.
The shooting unit 1312 may be a camera mounted on the electronic device 1300 and may be used to capture the shooting preview screen.
In addition, the electronic device 1300 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, including a processor 1310, a memory 1309, and a computer program stored in the memory 1309 and executable on the processor 1310. When executed by the processor 1310, the computer program implements each process of the above travel order processing method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the above travel order processing method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware, but in many cases the former is the better implementation. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A travel order processing method, applied to an electronic device, wherein the method comprises the following steps:
receiving a first input of a first user on a shooting preview picture; the shooting preview picture is captured and displayed through a camera of the electronic equipment;
displaying target travel mode information in response to the first input, the target travel mode information being determined based on an input trajectory of the first input;
receiving a second input of the first user for target travel information;
outputting a travel order in response to the second input.
2. The method of claim 1, wherein the displaying target travel mode information in response to the first input comprises:
acquiring the input trajectory of the first input;
determining a target travel mode based on the input trajectory;
wherein the input trajectory comprises a trajectory formed by at least one of: a vehicle shape, a symbol corresponding to a vehicle, and a character corresponding to a vehicle.
3. The method of claim 2, wherein the target travel information comprises at least one of a starting location and a destination;
wherein the receiving a second input of the first user for target travel information comprises:
receiving a second input of the first user on a live-action map;
displaying the live-action map in response to the second input;
receiving a third input of the first user on a first building and a second building on the live-action map;
in response to the third input, determining the location of the first building as the starting location and the location of the second building as the destination.
4. The method of claim 3, wherein after the displaying the live-action map, the method further comprises:
receiving a fourth input of the first user on the second building on the live-action map;
displaying at least one item of attribute information of the second building in response to the fourth input, the attribute information including at least one of encyclopedia information, travel information, and associated information of the second building;
receiving a fifth input of the first user for target attribute information in the at least one item of attribute information;
in response to the fifth input, executing a target operation corresponding to the target attribute information.
5. The method of claim 3, wherein the target travel information further comprises a departure time;
wherein the receiving a second input of the first user for target travel information further comprises:
receiving a second input of the first user in a preset input box, the second input being used to input the target travel information.
6. The method of claim 2, wherein after the determining a target travel mode based on the input trajectory, the method further comprises:
in a case that the target travel mode is a shared-bicycle riding mode, displaying a virtual reality view and at least one piece of shared bicycle information within a preset range, wherein the shared bicycle information comprises at least one of: a current first position of the shared bicycle, a distance between the current first position and a current second position of the electronic device, and a route between the current first position and the second position;
receiving a sixth input of the first user on a target shared bicycle;
in response to the sixth input, displaying live-action navigation information, wherein a navigation destination in the live-action navigation information is the first position of the target shared bicycle.
7. The method of claim 3, wherein after the determining the location of the first building as the starting location and the location of the second building as the destination in response to the third input, the method further comprises:
acquiring first information of the starting location and the destination, wherein the first information comprises at least one of: location information of the starting location and the destination, a distance between the starting location and the destination, and road condition information; and wherein different first information corresponds to different travel mode options;
pushing at least one travel mode option according to the first information;
receiving a seventh input of the user to a first travel mode option of the at least one travel mode option;
in response to the seventh input, determining the travel mode corresponding to the first travel mode option as the target travel mode.
8. The method of claim 1, wherein after the outputting the travel order, the method further comprises:
receiving an eighth input of the first user;
generating a travel information sharing message in response to the eighth input, and transmitting the travel information sharing message to a target electronic device associated with the electronic device corresponding to the first user;
in a case that a feedback message sent by the target electronic device is received, sending the target travel information to the target electronic device for display;
wherein the feedback message indicates that sharing of the target travel information has been confirmed.
9. An electronic device, comprising:
a first receiving module, configured to receive a first input of a first user on a shooting preview picture, wherein the shooting preview picture is captured and displayed through a camera of the electronic device;
a display module, configured to display target travel mode information in response to the first input, the target travel mode information being determined based on an input trajectory of the first input;
a second receiving module, configured to receive a second input of the first user for target travel information; and
an output module, configured to output a travel order in response to the second input.
10. An electronic device, comprising:
a memory storing computer program instructions; and
a processor, wherein the computer program instructions, when executed by the processor, implement the travel order processing method according to any one of claims 1 to 8.
CN201911402591.4A 2019-12-31 2019-12-31 Travel order processing method and electronic equipment Active CN111158549B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911402591.4A CN111158549B (en) 2019-12-31 2019-12-31 Travel order processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN111158549A (en) 2020-05-15
CN111158549B (en) 2021-11-16

Family

ID=70559481

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911402591.4A Active CN111158549B (en) 2019-12-31 2019-12-31 Travel order processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN111158549B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106910292A (en) * 2017-02-28 2017-06-30 上海量明科技发展有限公司 Method, client and the system of shared vehicle are checked by augmented reality
CN108830674A (en) * 2018-05-23 2018-11-16 杭州优行科技有限公司 Trip order production method, device and terminal device
CN109307518A (en) * 2017-07-28 2019-02-05 丰田自动车株式会社 Navigation equipment, air navigation aid and navigation system
JP2019087071A (en) * 2017-11-08 2019-06-06 ミライフーズ株式会社 Local gift order acceptance device

Similar Documents

Publication Publication Date Title
CN108415652B (en) Text processing method and mobile terminal
CN108519080B (en) Navigation route planning method and terminal
CN108132752B (en) Text editing method and mobile terminal
CN107943390B (en) Character copying method and mobile terminal
CN110618969B (en) Icon display method and electronic equipment
CN111491211B (en) Video processing method, video processing device and electronic equipment
CN112068752B (en) Space display method and device, electronic equipment and storage medium
CN110519699B (en) Navigation method and electronic equipment
CN107707762A (en) A kind of method for operating application program and mobile terminal
CN110944139B (en) Display control method and electronic equipment
CN111107219B (en) Control method and electronic equipment
CN111126995A (en) Payment method and electronic equipment
CN107846518A (en) A kind of navigational state switching method, mobile terminal and computer-readable recording medium
CN108362303B (en) Navigation method and mobile terminal
CN108061557A (en) A kind of air navigation aid and mobile terminal
CN110544287A (en) Picture matching processing method and electronic equipment
CN110990679A (en) Information searching method and electronic equipment
CN109669710B (en) Note processing method and terminal
CN111061530A (en) Image processing method, electronic device and storage medium
CN111145582A (en) Information control method and electronic equipment
CN108958579B (en) Red packet sending and receiving method and red packet sending and receiving device
CN111158556B (en) Display control method and electronic equipment
CN111256678A (en) Navigation method and electronic equipment
CN111078819A (en) Application sharing method and electronic equipment
CN111158549B (en) Travel order processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant