CN113722029A - Context-based notification display method and apparatus - Google Patents

Context-based notification display method and apparatus

Info

Publication number
CN113722029A
Authority
CN
China
Prior art keywords
user
state
card
content
flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110606593.6A
Other languages
Chinese (zh)
Inventor
丁一晏
陈佳子
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110606593.6A priority Critical patent/CN113722029A/en
Priority to CN202210679572.1A priority patent/CN115334193B/en
Publication of CN113722029A publication Critical patent/CN113722029A/en
Priority to PCT/CN2022/073332 priority patent/WO2022247326A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions

Abstract

A context-based notification display method and apparatus can be applied to an electronic device such as a mobile phone. In the method, the electronic device can split the content of one notification into different parts according to the different contexts of the user, and then, according to the current context of the user, selectively display part of the content of the notification. In this way, the user can obtain rich and comprehensive information, and can quickly and accurately obtain the information the user most wants to know in the current context, avoiding the clutter and inconvenience caused by information piling up.

Description

Context-based notification display method and apparatus
Technical Field
The present application relates to the field of terminals, and in particular, to a method and an apparatus for displaying a notification based on a context.
Background
An existing card typically summarizes all basic information and functions on a single card. The information and functions on the card are tiled, making it difficult for users to find important information. When such a card is to display as much information and functionality as possible, the card becomes larger and longer. It is then difficult for the user to clearly and directly obtain the information most needed in the current scene, which reduces the readability of the card.
Disclosure of Invention
The present application provides a context-based notification display method and apparatus. By implementing the method, an electronic device such as a mobile phone can display, in a card, the content that the user cares about most in the current context, according to the different contexts of the user. In this way, the user can obtain rich and comprehensive information and can quickly and accurately obtain the information the user most wants to know in the current context.
In a first aspect, an embodiment of the present application provides a notification display method, where the method is applied to an electronic device, and the method includes: displaying a first user interface, wherein a first card is displayed on the first user interface, the first card comprises a first area and a second area, the first area does not overlap the second area, the first area displays first content, the second area displays second content, the second content is used for indicating that the state of a user is a first state, and the second content is associated with the first content; detecting that the state of the user is a second state, wherein the second state is different from the first state; in response to the state of the user being the second state, the first area displays the first content and the second area displays third content, the third content is used for indicating that the state of the user is the second state, the third content is different from the second content, and the third content is associated with the first content; wherein the first state and the second state are states associated with the geographic location, and/or the time, at which the user is located.
By implementing the method provided by the first aspect, the electronic device may detect a change in the state of the user, determine the information of interest to the user in the changed state, and then display that information in the card. In this way, the user can obtain rich and comprehensive information, and can quickly and accurately obtain the information the user most wants to know in the current context, avoiding the clutter and inconvenience caused by information piling up.
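The first-aspect behavior can be sketched as a simple state-to-content mapping: the first area keeps the stable trip information while the second area is re-rendered whenever the detected user state changes. This is a hypothetical illustration only, not the patented implementation; all names and sample values below are invented:

```python
# Illustrative sketch of the first-aspect card behavior. The first area
# holds stable trip information; the second area shows only the content
# associated with the currently detected user state.

FIRST_AREA = {"flight": "CA1314", "departs": "8:00", "arrives": "11:20"}

# Second-area content keyed by detected user state (geography/time based).
SECOND_AREA_BY_STATE = {
    "check_in_open": "Check-in is open at counter K11-K18",
    "go_to_airport": "Leave now: ~45 min to Shenzhen Bao'an T3",
    "ready_to_board": "Boarding at gate 23, seat 11A",
    "arrived": "Baggage at carousel 7",
}

class FlightCard:
    def __init__(self):
        self.state = None
        self.first_area = FIRST_AREA
        self.second_area = None

    def on_state_changed(self, new_state: str) -> None:
        # Only the second area is updated; the first area stays put, so
        # the user always sees the core trip info plus one
        # context-relevant line instead of a pile of everything.
        if new_state != self.state and new_state in SECOND_AREA_BY_STATE:
            self.state = new_state
            self.second_area = SECOND_AREA_BY_STATE[new_state]

card = FlightCard()
card.on_state_changed("check_in_open")
print(card.second_area)  # Check-in is open at counter K11-K18
```

The point of the design is that each state change replaces only the context-dependent portion of the card, which keeps the card small while the total information it can surface over the trip remains large.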
In combination with some embodiments of the first aspect, in some embodiments, the method further comprises: detecting that the state of the user is a third state, wherein the third state is different from the second state; in response to the state of the user being the third state, the first area displays fourth content and the second area displays fifth content, the fourth content is different from the first content, the fifth content is different from the third content, the fifth content is used for indicating that the state of the user is the third state, the fifth content is associated with the fourth content, the fourth content is associated with the first content, the display form of the fourth content is different from the display form of the first content, and the text content of the fourth content is the same as the text content of the first content; the third state is a state associated with the geographic location, and/or the time, at which the user is located.
By implementing the method provided by the above embodiment, the electronic device may further change the content displayed in the first area in the process of changing the content displayed in the second area. This makes the content displayed by the card more flexible.
With reference to some embodiments of the first aspect, in some embodiments, after detecting that the state of the user is the second state, the method further includes: displaying a message notification, wherein the message notification is one or more of a banner notification, a lock-screen notification, and a pull-down notification, the message notification comprises sixth content, the sixth content is used for indicating that the state of the user is the second state, and the sixth content is associated with the third content.
By implementing the method provided by the above embodiment, the electronic device can not only prompt the user about the state change by updating the card, but also remind the user of the state change through notification forms such as a banner notification and a lock-screen notification. In this way, even when the electronic device is not displaying the user interface containing the card, the electronic device can still remind the user in time through these other notification forms.
With reference to some embodiments of the first aspect, in some embodiments, after detecting that the state of the user is the second state, the method further includes: displaying a second card at the first user interface, the second card including controls to obtain a health code, and/or a nucleic acid record, and/or a vaccine record.
By implementing the method provided by the above embodiment, the electronic device may further display, in the second state, a card including shortcuts for acquiring a health code, a nucleic acid test record, and a vaccine record. In this way, the electronic device can automatically display the card when the user needs it, providing a more convenient use experience.
With reference to some embodiments of the first aspect, in some embodiments, the first card is a flight card; and the first state, the second state, and the third state are each one of: a state of reminding the user to check in, a state of reminding the user to go to the airport, a state reflecting that the user is ready to board, and a state reflecting that the user has arrived at the destination.
With reference to some embodiments of the first aspect, in some embodiments, the state of reminding the user to check in comprises: a state of displaying a check-in opening forecast, a state of prompting the user to check in, a state of displaying a check-in countdown, or a state of displaying a check-in counter; or, the state of reminding the user to go to the airport comprises: a state of displaying a departure reminder or a state of displaying the travel time; or, the state reflecting that the user is ready to board comprises: a state of displaying the health code, a state of indicating that boarding has started, or a state of displaying the gate; or, the state reflecting that the user has arrived at the destination comprises: a state of displaying the baggage carousel, a state of displaying the hotel location, or a state of displaying tourist attractions.
In combination with some embodiments of the first aspect, in some embodiments, the first content includes: one or more of the flight number, travel date, origin, departure time, destination, and arrival time of the flight; the second content, the third content, and the fifth content respectively include: one or more of a check-in opening forecast, a check-in reminder, a check-in countdown, a check-in counter, a departure reminder, the travel time, a seat number, a gate, a baggage carousel, a hotel location, and a tourist attraction location for the flight; and the fourth content includes: the flight number, travel date, origin, departure time, destination, arrival time, check-in counter, seat number, and gate of the flight.
In combination with some embodiments of the first aspect, in some embodiments, the method further comprises: when the first state, the second state, or the third state is a state reflecting that the user is ready to board, and it is determined that the electronic device is running an immersive application, displaying a floating window, wherein the floating window is used for displaying a boarding reminder; and the immersive application is one or more of a video application, a game application, a music application, or a call application.
Implementing the methods provided by the above embodiments, the electronic device may display the floating window upon detecting that the user is using the immersive application. The floating window can display prompt information and shortcuts. Thus, the user can quickly acquire the notification when using the immersive application program, and delay is avoided.
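The floating-window rule described above can be sketched as a small routing decision: when a time-critical reminder fires while an immersive (full-screen) application is in the foreground, the reminder is surfaced as a floating window instead of only in the card. This is an illustrative sketch under invented names, not the patented implementation:

```python
# Illustrative sketch of the floating-window rule: pick the surface for
# a boarding reminder based on what the user is currently doing.

IMMERSIVE_APPS = {"video", "game", "music", "call"}

def notify_boarding(foreground_app: str) -> str:
    """Return which surface should carry the boarding reminder."""
    if foreground_app in IMMERSIVE_APPS:
        # The user likely will not see the card; overlay a floating
        # window carrying the reminder and a boarding-pass shortcut.
        return "floating_window"
    return "card"

print(notify_boarding("game"))   # floating_window
print(notify_boarding("email"))  # card
```

On Android-like systems such an overlay would typically require a dedicated overlay permission; that platform detail is outside the sketch.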
In combination with some embodiments of the first aspect, in some embodiments, the method further comprises: detecting a first operation of the user on the floating window; and in response to the first operation, displaying the electronic boarding pass, wherein the first operation is one of a click operation, a long-press operation, a sliding operation, or a voice control operation.
By implementing the method provided by the above embodiment, the user can click the floating window when the electronic boarding pass needs to be shown, so as to quickly obtain the electronic boarding pass.
In combination with some embodiments of the first aspect, in some embodiments, the first card is a card showing a high-speed rail trip; and the first state, the second state, and the third state are each one of: a state of reminding the user that the high-speed rail train is about to depart, a state reflecting that the user has arrived at the high-speed rail station, and a state reflecting that the high-speed rail train has departed.
In combination with some embodiments of the first aspect, in some embodiments, the first content includes: one or more of the train number, travel date, departure place, departure time, destination, and arrival time; the second content, the third content, and the fifth content respectively include: one or more of a departure reminder, a carriage number, a seat number, and a ticket gate; and the fourth content includes: one or more of the train number, travel date, departure place, departure time, destination, arrival time, departure reminder, carriage number, seat number, and ticket gate.
In combination with some embodiments of the first aspect, in some embodiments, the method further comprises: when the first state, the second state, or the third state is a state reflecting that the user has arrived at the high-speed rail station, and it is determined that the electronic device is running an immersive application, displaying a floating window, wherein the floating window is used for displaying the two-dimensional code of the electronic ticket; and the immersive application is one or more of a video application, a game application, a music application, or a call application.
By implementing the method provided by the above embodiment, the electronic device may display the floating window when it is detected that the user is waiting in the waiting room and using an immersive application. The floating window allows the user to rapidly access the two-dimensional code of the electronic ticket, so that the user can quickly pass the security check. In this way, the user can promptly notice the notification while using the immersive application, avoiding delays.
In combination with some embodiments of the first aspect, in some embodiments, the method further comprises: detecting a first operation of the user on the floating window; and in response to the first operation, displaying the two-dimensional code of the electronic ticket, wherein the first operation is one of a click operation, a long-press operation, a sliding operation, or a voice control operation.
In this way, the user can click the floating window when the two-dimensional code of the electronic ticket needs to be displayed, so as to quickly obtain it.
In combination with some embodiments of the first aspect, in some embodiments, the first card is an attendance clock-in card. In this way, the electronic device can update the clock-in notification in the card in time through state detection, and the user can quickly complete clocking in through the card.
With reference to some embodiments of the first aspect, in some embodiments, the first state is a state of displaying an on-duty clock-in reminder, and the second state is a state of displaying an off-duty clock-out reminder.
In combination with some embodiments of the first aspect, in some embodiments, the first content includes a clock-in reminder; the second content includes one or more of an on-duty clock-in control, the on-duty clock-in time, or whether the user has clocked in; and the third content includes one or more of an off-duty clock-out control, the off-duty clock-out time, or whether the user has clocked out.
In a second aspect, embodiments of the present application provide an electronic device, which includes one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors for storing computer program code comprising computer instructions which, when executed by the one or more processors, cause the electronic device to perform the method as described in the first aspect and any possible implementation of the first aspect.
In a third aspect, embodiments of the present application provide an electronic device, which includes one or more processors and one or more memories; a memory coupled to the one or more processors, the memory for storing computer program code, the computer program code including computer instructions, the one or more processors invoking the computer instructions to cause the electronic device to perform:
displaying a first user interface, wherein a first card is displayed on the first user interface, the first card comprises a first area and a second area, the first area does not overlap the second area, the first area displays first content, the second area displays second content, the second content is used for indicating that the state of a user is a first state, and the second content is associated with the first content; detecting that the state of the user is a second state, wherein the second state is different from the first state; in response to the state of the user being the second state, the first area displays the first content and the second area displays third content, the third content is used for indicating that the state of the user is the second state, the third content is different from the second content, and the third content is associated with the first content; wherein the first state and the second state are states associated with the geographic location, and/or the time, at which the user is located.
With reference to some embodiments of the third aspect, in some embodiments, the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to perform: detecting that the state of the user is a third state, wherein the third state is different from the second state; in response to the state of the user being the third state, the first area displays fourth content and the second area displays fifth content, the fourth content is different from the first content, the fifth content is different from the third content, the fifth content is used for indicating that the state of the user is the third state, the fifth content is associated with the fourth content, the fourth content is associated with the first content, the display form of the fourth content is different from the display form of the first content, and the text content of the fourth content is the same as the text content of the first content; the third state is a state associated with the geographic location, and/or the time, at which the user is located.
In combination with some embodiments of the third aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: displaying a message notification, wherein the message notification is one or more of a banner notification, a lock-screen notification, and a pull-down notification, the message notification comprises third content, and the third content is used for indicating that the state of the user is the second state.
In some embodiments, in combination with some embodiments of the third aspect, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: displaying a second card at the first user interface, the second card including controls to obtain a health code, and/or a nucleic acid record, and/or a vaccine record.
With reference to some embodiments of the third aspect, in some embodiments, the first card is a flight card; and the first state, the second state, and the third state are each one of: a state of reminding the user to check in, a state of reminding the user to go to the airport, a state reflecting that the user is ready to board, and a state reflecting that the user has arrived at the destination.
With reference to some embodiments of the third aspect, in some embodiments, the state of reminding the user to check in comprises: a state of displaying a check-in opening forecast, a state of prompting the user to check in, a state of displaying a check-in countdown, or a state of displaying a check-in counter; or, the state of reminding the user to go to the airport comprises: a state of displaying a departure reminder or a state of displaying the travel time; or, the state reflecting that the user is ready to board comprises: a state of displaying the health code, a state of indicating that boarding has started, or a state of displaying the gate; or, the state reflecting that the user has arrived at the destination comprises: a state of displaying the baggage carousel, a state of displaying the hotel location, or a state of displaying tourist attractions.
In combination with some embodiments of the third aspect, in some embodiments, the first content includes: one or more of the flight number, travel date, origin, departure time, destination, and arrival time of the flight; the second content, the third content, and the fifth content respectively include: one or more of a check-in opening forecast, a check-in reminder, a check-in countdown, a check-in counter, a departure reminder, the travel time, a seat number, a gate, a baggage carousel, a hotel location, and a tourist attraction location for the flight; and the fourth content includes: the flight number, travel date, origin, departure time, destination, arrival time, check-in counter, seat number, and gate of the flight.
In combination with some embodiments of the third aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: when the first state, the second state, or the third state is a state reflecting that the user is ready to board, and it is determined that the electronic device is running an immersive application, displaying a floating window, wherein the floating window is used for displaying a boarding reminder; and the immersive application is one or more of a video application, a game application, a music application, or a call application.
In combination with some embodiments of the third aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: detecting a first operation of the user on the floating window; and in response to the first operation, displaying the electronic boarding pass, wherein the first operation is one of a click operation, a long-press operation, a sliding operation, or a voice control operation.
With reference to some embodiments of the third aspect, in some embodiments, the first card is a card showing a high-speed rail trip; and the first state, the second state, and the third state are each one of: a state of reminding the user that the high-speed rail train is about to depart, a state reflecting that the user has arrived at the high-speed rail station, and a state reflecting that the high-speed rail train has departed.
In combination with some embodiments of the third aspect, in some embodiments, the first content includes: one or more of the train number, travel date, departure place, departure time, destination, and arrival time; the second content, the third content, and the fifth content respectively include: one or more of a departure reminder, a carriage number, a seat number, and a ticket gate; and the fourth content includes: one or more of the train number, travel date, departure place, departure time, destination, arrival time, departure reminder, carriage number, seat number, and ticket gate.
In combination with some embodiments of the third aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: when the first state, the second state, or the third state is a state reflecting that the user has arrived at the high-speed rail station, and it is determined that the electronic device is running an immersive application, displaying a floating window, wherein the floating window is used for displaying the two-dimensional code of the electronic ticket; and the immersive application is one or more of a video application, a game application, a music application, or a call application.
In combination with some embodiments of the third aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: detecting a first operation of the user on the floating window; and in response to the first operation, displaying the two-dimensional code of the electronic ticket, wherein the first operation is one of a click operation, a long-press operation, a sliding operation, or a voice control operation.
In combination with some embodiments of the third aspect, in some embodiments, the first card is an attendance clock-in card. In this way, the electronic device can update the clock-in notification in the card in time through state detection, and the user can quickly complete clocking in through the card.
With reference to some embodiments of the third aspect, in some embodiments, the first state is a state of displaying an on-duty clock-in reminder, and the second state is a state of displaying an off-duty clock-out reminder.
In combination with some embodiments of the third aspect, in some embodiments, the first content includes a clock-in reminder; the second content includes one or more of an on-duty clock-in control, the on-duty clock-in time, or whether the user has clocked in; and the third content includes one or more of an off-duty clock-out control, the off-duty clock-out time, or whether the user has clocked out.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, which includes instructions that, when executed on an electronic device, cause the electronic device to perform the method described in the first aspect and any possible implementation manner of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product including instructions, which, when run on an electronic device, cause the electronic device to perform the method described in the first aspect and any possible implementation manner of the first aspect.
It is to be understood that the electronic device provided by the second aspect, the electronic device provided by the third aspect, the computer storage medium provided by the fourth aspect, and the computer program product provided by the fifth aspect are all configured to execute the methods provided by the embodiments of the present application. For the beneficial effects achieved, reference may be made to the beneficial effects of the corresponding methods, and details are not repeated here.
Drawings
FIGS. 1A-1B are a set of user interfaces displaying existing cards provided by embodiments of the present application;
fig. 1C is a flight travel flow chart provided in the embodiment of the present application;
FIGS. 2A-2M are user interfaces for a set of display cards provided by embodiments of the present application;
FIGS. 2N-2Q are schematic diagrams of other possible forms of the card provided by the embodiments of the present application;
FIGS. 3A-3H are user interfaces for displaying other types of notifications provided by the embodiments of the present application;
FIGS. 4A-4B are system diagrams of a context-based notification display method provided by an embodiment of the present application;
FIGS. 5A-5E are flow diagrams of a method for context-based notification display provided by an embodiment of the present application;
FIGS. 6A-6D are user interfaces of another application scenario provided by embodiments of the present application;
FIG. 7 is a user interface of another application scenario provided by an embodiment of the present application;
FIGS. 8A-8D are user interfaces of another application scenario provided by embodiments of the present application;
fig. 9 is a hardware configuration diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application.
Taking flight cards as an example, FIG. 1A and FIG. 1B show two forms in which a mobile phone displays a flight card.
As shown in FIG. 1A, a card 101 may include a flight number 101A, a date 101B, and a time and place 101C. The time includes a departure time ("8:00") and an arrival time ("11:20"), and the place includes a departure place ("Shenzhen Bao'an T3") and a destination ("Beijing Capital T3").
The card shown in FIG. 1A is small and displays little information. The advantage of such a card is that the user can quickly locate the information the card is intended to display, for example, the departure time and arrival time. However, its drawback is also evident: the user gets no further information, such as when check-in starts, which counters handle check-in, or which gate to use.
The card shown in FIG. 1B shows more information about the flight. For example, the card 111 may include a flight number 111A, a date 111B, a time and place 111C, a check-in counter 112, a gate 113, and a baggage carousel 114, among others. For the flight number 111A, the date 111B, and the time and place 111C, refer to the description of FIG. 1A; details are not repeated here.
When check-in needs to be handled, the user can learn the check-in information through the check-in counter 112. Before boarding, the user can learn the gate information through the gate 113. Upon arrival at the destination, the user can learn the checked-baggage information through the baggage carousel 114. Of course, the card 111 may also include a button 115. When a user operation on the button 115 is detected, the mobile phone may display more information about the flight presented by the card 111. In this way, the card 111 presents substantially all of the information about the flight.
However, the card 111 is now significantly larger and longer. This also results in information and functions piling up on the card, so the user cannot locate the desired information at a glance. For example, when a user wants to obtain the baggage carousel information, the user may first see the takeoff time and landing time.
In order to enable a card to display as much information as possible while keeping the card concise, the embodiments of the present application provide a context-based notification display method and apparatus. The method can be applied to mobile phones. By implementing the method provided by the embodiments of the present application, the card displayed on the mobile phone may include more information, such as the check-in counter 112 and the gate 113 shown in FIG. 1B. Meanwhile, the card can selectively display this information according to the current scene. For example, when check-in opens, the card may display the information shown on the check-in counter 112. When boarding begins, the card may display the information shown at the gate 113 and no longer display the information shown on the check-in counter 112.
Therefore, within one card, the user can obtain comprehensive information and can quickly and accurately find the information of interest, avoiding the clutter and inconvenience caused by information stacking.
Not limited to a cell phone, the electronic device (electronic device 100) providing the flight card display may also be a tablet computer, a desktop computer, a laptop computer, a notebook computer, a virtual reality (VR) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device, and so on. This embodiment of the application does not particularly limit the specific type of the electronic device 100. It can be understood that the use of flight cards on other types of electronic devices may refer to the description for the cell phone. The method provided by this embodiment of the application is mainly described below by taking a mobile phone as an example.
FIG. 1C shows the flow of a user traveling on a flight. This flow may also be referred to as the life cycle of the flight. As shown in FIG. 1C, the flow of taking a flight can be divided into 3 parts: pre-trip, during-trip, and post-trip. Each of the nodes shown in FIG. 1C may be referred to as a context.
Pre-trip refers to the process from the user purchasing an air ticket to arriving at the origin airport, and includes subdivided contexts such as: booking tickets, flight change/cancellation, destination weather, check-in/seat selection, takeoff reminders, departing for the airport, and arriving at the airport. During-trip refers to the process from the user arriving at the airport to completing boarding, and includes subdivided contexts such as: health code, check-in/baggage consignment, security check, flight delay, gate change, waiting in the departure hall, and boarding. Post-trip refers to the process after the user arrives at the destination airport, and includes subdivided contexts such as: baggage claim, taking a taxi to the hotel, and business/travel.
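As a non-authoritative sketch, the three phases and their subdivided contexts can be modeled as an ordered state list; all identifier names below are illustrative and not from the application:

```python
from enum import Enum, auto

class FlightContext(Enum):
    """Subdivided contexts of a flight trip (illustrative subset)."""
    TICKET_BOOKED = auto()       # pre-trip
    CHECK_IN_OPEN = auto()       # pre-trip
    DEPART_FOR_AIRPORT = auto()  # pre-trip
    AT_AIRPORT = auto()          # during-trip
    BOARDING = auto()            # during-trip
    BAGGAGE_CLAIM = auto()       # post-trip

# The three phases of the trip, each grouping its subdivided contexts
PHASES = {
    "pre-trip": [FlightContext.TICKET_BOOKED, FlightContext.CHECK_IN_OPEN,
                 FlightContext.DEPART_FOR_AIRPORT],
    "during-trip": [FlightContext.AT_AIRPORT, FlightContext.BOARDING],
    "post-trip": [FlightContext.BAGGAGE_CLAIM],
}
```

Modeling the contexts as an ordered enumeration lets the card logic dispatch on the current node without string comparison.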
When the mobile phone displays the flight card, it can, according to the specific context the user is in, display the information the user cares about most in that context, i.e., adjust the content displayed in the flight card.
Of course, not all of the above contexts require the flight card to adjust the content it displays. For example, in the check-in/seat-selection context, the card may display the time at which check-in opens, together with a button for the user to transact check-in; in the security-check context, the content displayed in the card may be left unchanged. That is, the contexts given above are optional: the mobile phone can selectively change the content displayed in the card in a given context.
When the context changes, i.e., the trip moves from one node to another, the information the user cares about changes. For example, when purchasing a ticket, the user may care most about the date, origin, and destination of the flight; in the check-in/seat-selection context, the information of interest includes when check-in starts and whether check-in can be transacted online; upon arrival at the airport, the user may care more about the flight's check-in counter, gate, and similar information.
The context-based notification display method provided by this embodiment of the application senses the context the user is in and displays the key information for that stage, namely the information the user cares about most.
In conjunction with the flight travel flow shown in FIG. 1C, the following describes the process by which the mobile phone changes the content displayed in the flight card in different contexts. FIGS. 2A-2J illustrate a set of user interfaces for context-based flight cards. The context-based notification display method provided by the embodiments of this application will be described below with reference to the user interfaces shown in FIGS. 2A-2J.
The contexts corresponding to the flight cards shown in FIGS. 2A-2J include: booking tickets, check-in/seat selection, departing for the airport, arriving at the airport, boarding, and baggage claim.
After the user purchases the air ticket, the mobile phone can obtain the user's travel plan, which includes the flight number, date, departure place, destination, time, and the like. The ways in which the mobile phone acquires the travel plan may include a short message (SMS) and a push notification.
The short message way means that the mobile phone acquires the user's travel plan by monitoring the ticketing short message sent by the airline. The push notification way means that, after confirming the user's travel plan, the airline pushes a notification containing the travel plan to the user's mobile phone. The following embodiments will describe in detail how the mobile phone obtains the travel plan in these two ways, which is not expanded here.
After the mobile phone obtains the travel plan, it can display the travel plan in the flight card. At this time, the card displayed by the mobile phone may be called a flight card in the ticket-booking context. FIG. 2A shows the user interface 21 of the handset displaying the flight card in the ticket-booking context.
As shown in FIG. 2A, the user interface 21 may include a status bar 211, a page indicator 212, a frequently used application icon tray 213, and a plurality of other application icons 214. The status bar may include: one or more signal strength indicators for mobile communication signals (also referred to as cellular signals), such as a signal strength indicator 211A and a signal strength indicator 211B; a wireless fidelity (Wi-Fi) signal strength indicator 211C; a battery status indicator 211D; and a time indicator 211E.
Page indicator 212 may be used to indicate the positional relationship of the currently displayed page with other pages.
The frequently used application icon tray 213 may include a plurality of tray icons (e.g., a camera application icon, an address book application icon, a phone application icon, an information application icon) that remain displayed when pages are switched. The tray icons are optional, and this embodiment of the application does not limit this.
The other application icons 214 may include multiple application icons, for example, a settings application icon, an application marketplace application icon, a gallery application icon, a browser application icon, and the like. The other application icons may also include third-party application icons.
The other application icons 214 may be distributed across multiple pages, and the page indicator may also be used to indicate which page of applications the user is currently browsing. The user may slide left and right across the area of the other application icons to browse the application icons on other pages. It can be understood that FIG. 2A merely illustrates one user interface of the electronic device 100 and should not be construed as a limitation on the embodiments of this application.
The user interface 21 may include a card 215. The content displayed on the card 215 is the travel plan (i.e., flight information), including the flight number, date, departure place, destination, time, and the like. For example, the card 215 shows the travel plan "January 9, flying from Shenzhen Baoan T3 to Beijing Capital T3, with an expected departure time of 8:00, arrival time of 11:20, and flight number CA1314".
When viewing the flight card shown in FIG. 2A, the user can immediately know his or her travel plan, without having to log in to the ticket-purchasing website or application to query it.
Here, the flight card (card 215) is placed on the main page (home page) of the cell phone. In other embodiments, the card may also be displayed on the minus one screen. Here, minus one screen refers to the leftmost page of the mobile phone. The embodiment of the application does not limit the placement position of the card.
The airline opens the check-in/seat-selection channel some time before the flight takes off, for example, one day before takeoff. The user can transact check-in, seat selection, and similar matters through the check-in/seat-selection channel opened by the airline. The check-in/seat-selection channel includes an online channel and an offline channel. The online channel refers to the online check-in channel provided by electronic equipment such as mobile phones and personal computers. The offline channel refers to the check-in channel provided by the airport check-in counter.
The phone may obtain the time at which check-in opens for the flight, detect whether that time has been reached, and adjust the information displayed in the card 215 accordingly.
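The check just described can be sketched as follows; the function name and the 2-hour reminder window (taken from the example later in this section) are assumptions:

```python
from datetime import datetime, timedelta

def check_in_area_text(now: datetime, open_time: datetime,
                       remind_ahead: timedelta = timedelta(hours=2)):
    """Pick the right-area text of the flight card around check-in opening."""
    if now >= open_time:
        return "Check-in opened"
    if open_time - now <= remind_ahead:
        return f"Expected to open check-in at {open_time:%H:%M} today"
    return None  # too early: keep the current card content
```

With an 18:00 opening time, for example, the card would show the "about to open" reminder starting at 16:00 and switch to "Check-in opened" once 18:00 is reached.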
Some time before check-in opens, the flight card may display a reminder that check-in is about to open. The flight card at this time may be referred to as a flight card in the check-in/seat-selection context. This period of time is predetermined, such as 2 hours before the check-in/seat-selection channel opens. After seeing the reminder, the user knows at what time check-in can be started.
FIG. 2B shows the user interface 22 of the handset displaying the flight card in the check-in/seat-selection context. As shown in FIG. 2B, the card 215 can be divided into two parts: a left area 221 and a right area 222.
The area 221 (left area) may display basic information of the flight, including origin, departure time, destination, and arrival time, such as "Shenzhen Baoan T3, 8:00, Beijing Capital T3, 11:20" shown in the area 221.
The area 222 (right area) may display a reminder that check-in is about to open, for example, "Expected to open check-in at 18:00 today".
Through this reminder, the user learns in time that the check-in/seat-selection channel for the purchased flight is about to open, and can schedule check-in accordingly.
Further, the handset can detect whether the airline has opened the check-in/seat-selection channel. When it detects that the channel is open, the mobile phone can display the user interface 23 shown in FIG. 2C. At this point, the flight card may display a message prompting the user to transact check-in.
As shown in FIG. 2C, the user interface 23 includes the card 215. The card 215 includes the area 221 (left area) and the area 222 (right area). The area 221 still displays the basic information of the flight (origin, departure time, destination, and arrival time), referring to FIG. 2C. However, the area 222 no longer displays the "Expected to open check-in at 18:00 today" prompt, but displays a prompt that check-in has opened, for example, "Check-in opened".
Meanwhile, the area 222 may further include a check-in button 231. The mobile phone may detect a user operation, for example a click, acting on the check-in button 231, and in response the mobile phone may display a user interface for transacting check-in. The following embodiments will detail the user interface for the check-in procedure, which is not expanded here.
In the user interface for handling the check-in procedure, the mobile phone can receive the seat number data selected by the user and send the seat number data to the server of the airline company. After the airline's server confirms the seat number selected by the user, the cell phone can display a feedback of the success of the selection.
Here, confirming the seat number selected by the user means confirming whether the seat number is selectable; if it is, the user flag corresponding to the seat number is updated to this user, and the state of the seat number is then changed to unselectable (that is, other users can no longer select the seat). This application does not limit the user interface for the check-in procedure.
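The server-side confirmation just described can be sketched as below; the data layout (`seats` mapping a seat number to a selectable flag and an owner) is an assumption for illustration:

```python
def confirm_seat(seats: dict, seat_no: str, user: str) -> bool:
    """Confirm a seat choice: grant it only if still selectable, then
    record the user and mark the seat unselectable for other users."""
    selectable, _owner = seats.get(seat_no, (False, None))
    if not selectable:
        return False  # already taken (or unknown seat): selection fails
    seats[seat_no] = (False, user)  # update user flag, make unselectable
    return True
```

The second caller asking for the same seat fails, which is exactly the "other users can no longer select the seat" behavior described above.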
It can be understood that, if the airline to which the user's flight belongs does not support the online check-in procedure, the flight card may not display the check-in button 231.
When detecting that the user has transacted check-in, the mobile phone can update the "check-in opened" reminder displayed in the card 215 to a "check-in completed" reminder. FIG. 2D shows the user interface 24 of the handset displaying the flight card after the user transacts check-in.
As shown in FIG. 2D, the card 215 includes the area 221 (left area) and the area 222 (right area). Likewise, the area 221 still displays the basic information of the flight (origin, departure time, destination, and arrival time). At this point, the area 222 no longer displays the "check-in opened" reminder, but rather a completed-check-in reminder such as "Check-in completed".
Meanwhile, the check-in button 231 in the user interface 23 may be replaced with a boarding pass button 241. When a user operation acting on the boarding pass button 241 is detected, the cell phone may, in response, display a user interface including the user's electronic boarding pass. The information recorded in the electronic boarding pass includes: passenger name, flight number, date, destination, cabin class, seat number, gate, health code, and the like. The following embodiments will describe in detail the user interface for displaying the electronic boarding pass, which is not expanded here.
In other embodiments, the area 222 (right area) may also directly display the user-selected seat number. Thus, the user can more conveniently and quickly acquire the seat number.
After the mobile phone displays the user interface 23 shown in FIG. 2C, if the check-in deadline approaches and the user still has not transacted check-in, the mobile phone can further display a countdown to the check-in deadline. The countdown gives the user a stronger reminder, so that the user can transact check-in as soon as possible after seeing it.
Figure 2E shows the handset displaying a user interface 25 including a countdown flight card.
As shown in FIG. 2E, the card 215 includes the area 221 (left area) and the area 222 (right area). At this point, the area 222 may display the remaining check-in time (i.e., a countdown). Here, the remaining check-in time may be obtained by calculating the difference between the current time and the takeoff time.
For example, the user interface 25 shows that the current time is 6:00 a.m. and the departure time of the flight corresponding to the card 215 is 8:00 a.m., i.e., 2 hours before departure. At this point, the area 222 may display the reminder "2 hours remain from the cutoff".
As the current time changes, the handset may continually update the countdown displayed in the area 222, such as "1 hour remains from the cutoff" and "30 minutes remain from the cutoff". The mobile phone can also display the reminder text in red to further alert the user.
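A minimal sketch of this countdown text, computed as the difference between the current time and the takeoff time as described above (wording mirrors the figures; a real check-in cutoff would normally precede takeoff by a fixed margin):

```python
from datetime import datetime

def countdown_text(now: datetime, takeoff: datetime) -> str:
    """Remaining check-in time shown in area 222."""
    minutes = int((takeoff - now).total_seconds() // 60)
    if minutes <= 0:
        return "Check-in ended"
    if minutes % 60 == 0:
        h = minutes // 60
        unit = "hour remains" if h == 1 else "hours remain"
        return f"{h} {unit} from the cutoff"
    return f"{minutes} minutes remain from the cutoff"
```

Re-running the function each minute yields the sequence of reminders described above, ending with "Check-in ended".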
While the countdown is displayed, the area 222 may also display the check-in button 231 shown in FIG. 2C. When a user operation on the check-in button 231 is detected, the mobile phone may, in response, display the user interface for transacting check-in (refer to the foregoing description, not repeated here). Thus, upon seeing the check-in countdown, the user can transact check-in through the check-in button 231.
When check-in ends, the area 222 may no longer display the countdown, but display a prompt that check-in has ended, such as the words "Check-in ended", referring to the card on the right side of FIG. 2E. This text can also be displayed in red or another color, giving the user a stronger prompt.
While displaying any of the flight cards shown in FIGS. 2D-2E, the mobile phone can also obtain location data. The location data may be used to determine whether the user has entered the geographic range of the airport (i.e., whether the user has reached the airport). The following embodiments will describe in detail the specific method by which the mobile phone obtains the location data, which is not expanded here. When detecting that the user has entered the geographic range of the airport, the mobile phone can update the content displayed in the flight card.
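One common way to test whether position data falls inside an airport's geographic range is a radius check using the haversine great-circle distance. The sketch below illustrates the idea; the 2 km radius is an assumed value, as the application does not specify one:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def in_airport_range(user, airport, radius_km=2.0):
    """True if the (lat, lon) position is within the airport geofence."""
    return haversine_km(*user, *airport) <= radius_km
```

A production implementation would more likely use the platform's geofencing API, but the decision it feeds into the card logic is the same boolean.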
Suppose that, while the mobile phone displays the flight card shown in FIG. 2D, the position data it acquires indicates that the user has entered the geographic range of the airport. At this point, the handset may display the user interface 26 shown in FIG. 2F.
As shown in FIG. 2F, the card 215 may include the area 221 (left area) and the area 222 (right area). At this point, the area 222 of the card 215 may display the check-in counters, such as "Check-in counters G07-G11". Seeing this, the user can go to any counter numbered "G07" through "G11" to transact check-in. The user thus knows in time which counter to go to, avoiding wasted time and delays to the journey.
Meanwhile, the area 222 may further include the check-in button 231. When a user operation acting on the check-in button 231 is detected, the cell phone may, in response, display a user interface for transacting check-in. That is, after arriving at the airport, the user may still transact check-in through the online check-in/seat-selection channel provided by the check-in button 231. Similarly, if the airline to which the user's flight belongs does not support online check-in/seat selection, the flight card may not display the check-in button 231.
Suppose instead that, while the mobile phone displays the flight card shown in FIG. 2D, the acquired position data indicates that the user has not entered the geographic range of the airport, while the current time indicates that the takeoff time is approaching, for example, 2 hours or 1 hour before takeoff. At this point, the handset may display the user interface 27 shown in FIG. 2G.
As shown in FIG. 2G, the card 215 may include the area 221 (left area) and the area 222 (right area). At this point, the area 222 of the card 215 may display a departure alert. Specifically, the mobile phone can determine whether the user should set out for the airport from the current time, the departure (takeoff) time, the expected journey time, and the expected waiting time. For example, the user interface 27 shows a current time of 6:00, a flight departure time of 8:00, an expected journey time of 1 hour, and an expected airport waiting time of 30 minutes. From these 4 time values, the handset can determine that the user has 30 minutes left. At this point the area 222 may display a reminder such as "Suggest departing within 30 minutes".
Here, the expected journey time may be calculated from the position data (the current position of the mobile phone), the airport position data, and the moving speed of the mobile phone (the user's current moving speed). Besides using these data, the mobile phone can calculate the expected journey time through a map interface provided by a third party.
The mobile phone can judge, from its moving speed (the user's current moving speed), whether the user is riding in a vehicle, and thereby whether the user is on the road to the airport. Generally, a person's walking speed is 3 km/h-5 km/h, so a phone carried on foot also moves at 3 km/h-5 km/h. When the user is riding in a vehicle, the moving speed increases substantially, for example to 20 km/h, 30 km/h, or even faster. Therefore, from the moving speed, the mobile phone can infer whether the user is riding in a vehicle toward the airport.
In this way, the user learns from the prompt message that it is time to set out for the airport, so as to avoid missing the flight.
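The departure reminder and the speed heuristic above can be sketched together; the walking-speed threshold comes from the range quoted in the text, and the function names are illustrative:

```python
def minutes_before_departure_needed(now_min, takeoff_min, journey_min, wait_min):
    """Minutes left before the user should set out, from the four time
    values named above (all expressed in minutes since midnight)."""
    return takeoff_min - now_min - journey_min - wait_min

def likely_in_vehicle(speed_kmh, walking_max_kmh=5.0):
    """Speed heuristic: walking is about 3-5 km/h, so a device moving
    clearly faster is assumed to be riding in a vehicle."""
    return speed_kmh > walking_max_kmh

# Example from FIG. 2G: now 6:00, takeoff 8:00, 1 h journey, 30 min waiting
left = minutes_before_departure_needed(6 * 60, 8 * 60, 60, 30)  # 30 minutes
```

When `left` reaches a small positive value the card shows the "suggest departing" reminder; once `likely_in_vehicle` holds, the phone treats the user as on the way to the airport.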
In particular, at this time the cell phone can also highlight the displayed departure place ("Shenzhen Baoan T3"), for example by setting the text of "Shenzhen Baoan T3" in red. The user can thus identify the departure place more intuitively and clearly, making it easier to set out for it.
The area 222 may also display a boarding pass button 251. When the mobile phone detects a user operation acting on the boarding pass button 251, it can, in response, display the electronic boarding pass shown in FIG. 2L. If the airline to which the user's flight belongs does not provide an electronic boarding pass, the area 222 may not display the boarding pass button 251, referring to the flight card shown on the right side of FIG. 2G.
When it is confirmed that the user is already on the way to the airport, the area 222 may display an update of the expected journey time described above, referring to FIG. 2H.
The mobile phone may periodically acquire the device's location data (i.e., the user's location data) at a preset interval, and then calculate a new expected journey time from the new location data, the airport location data, and the device's current moving speed. The handset may then display the new expected journey time in the area 222. For example, FIG. 2H shows "projected 20 minute arrival".
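A naive recomputation of the expected journey time from fresh position data and current speed might look like this straight-line estimate (the application also mentions delegating the calculation to a third-party map interface):

```python
def eta_minutes(distance_km: float, speed_kmh: float):
    """New expected journey time, rounded to whole minutes."""
    if speed_kmh <= 0:
        return None  # stationary: cannot estimate from speed alone
    return round(distance_km / speed_kmh * 60)
```

At 30 km/h with 10 km still to go, this yields the 20 minutes shown in FIG. 2H.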
The area 222 may also display the boarding pass button 251. The cell phone can detect a user operation acting on the boarding pass button 251 and, in response, display the user's electronic boarding pass. Similarly, if the airline to which the user's flight belongs does not provide an electronic boarding pass, the area 222 may not display the boarding pass button 251, as shown by the flight card on the right side of FIG. 2H.
If the user has not transacted check-in by the time of departing for and traveling to the airport, the boarding pass button 251 may be replaced with the check-in button 231 described above. In response to a user operation on the check-in button 231, the handset may display the check-in user interface, through which the user can transact online check-in.
When it is confirmed that the user has entered the geographic range of the airport and has transacted check-in, the handset may display the user interface 28 shown in FIG. 2I. The user interface 28 displays the card 215. Likewise, the card 215 may include the area 221 (left area) and the area 222 (right area).
At this time, the area 221 may display the basic information of the flight, here including the departure time ("8:00"), the arrival time ("11:00"), the destination ("Beijing Capital T3"), and the check-in counters ("G07-G11"). The area 222 may display the gate ("K50").
The area 222 may also include the boarding pass button 251. The user may click this button to get more information about the flight. In response to the user's click, the mobile phone can display the electronic boarding pass of the flight, from which the user can obtain more information.
At this time, if a change of gate is detected, the mobile phone may update the gate displayed on the card 215 ("K50"). For example, when the gate is changed to "K52", the cell phone may change the originally displayed "K50" to "K52". In particular, the changed gate may be displayed in a different color, for example red, so as to be clearly distinguished from the other information. The user can thus see more intuitively and clearly which content has changed.
Optionally, the content displayed in the area 221, such as the departure time ("8:00"), the arrival time ("11:00"), and the destination ("Beijing Capital T3"), may be adjusted according to a preset form, such as a preset color or font weight, for emphasis and differentiation. For example, the takeoff time "8:00" may be displayed in a red font, further reminding the user.
Similarly, if the airline to which the user's flight belongs does not provide an electronic boarding pass, or does not allow it to be displayed on this card (e.g., only allows access on the official website), the area 222 may not display the boarding pass button 251. Optionally, as an alternative, the area 222 may display a flight icon 252, which may indicate that the current flight does not support displaying an electronic boarding pass.
By checking the system time and location data, the handset can confirm whether the user has arrived at the destination airport. Specifically, the mobile phone may periodically obtain the current system time and location data at a preset interval. When the system time is after the arrival time and the location data indicates that the cell phone is within the geographic range of the destination airport, the cell phone can confirm that the user has arrived at the destination airport. At this time, the mobile phone may display the user interface shown in FIG. 2J.
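The two-condition arrival check in this paragraph is a simple conjunction; a sketch with illustrative names (times in minutes since midnight, the geofence result computed separately from the location data):

```python
def arrived_at_destination(now_min, arrival_min, inside_dest_geofence):
    """True only when the system time is past the scheduled arrival time
    AND the location data places the device inside the destination
    airport's geographic range."""
    return now_min >= arrival_min and inside_dest_geofence
```

Requiring both conditions avoids false positives, e.g. passing near the destination airport before the flight, or the arrival time elapsing while the flight is delayed in the air.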
As shown in FIG. 2J, the card 215 may include the area 221 (left area) and the area 222 (right area). At this point, the area 221 may display the basic information of the user's flight, including the departure time ("8:00"), the arrival time ("11:00"), the origin ("Shenzhen Baoan T3"), and the destination ("Beijing Capital T3"). Here, the card 215 may emphasize the geographic location information (the origin and destination). Emphasizing the geographic location information means bolding or enlarging its font, or using a specific color, so that the user can see it more clearly and intuitively. This is because the user cares most about where he or she is when arriving at the destination airport.
The area 222 may display the baggage carousel for the user's checked baggage, such as "Baggage carousel 19" shown on the card 215 in FIG. 3C. After seeing this information, the user can go to the baggage carousel numbered 19 to collect his or her baggage.
The cell phone may then close the card 215 when it detects that the user has left the destination airport. In other embodiments, the mobile phone can also acquire the system time. When the system time exceeds a preset time after the arrival time and the user is not within the geographic range of the destination airport, the mobile phone can confirm that the journey indicated by the card 215 is finished. At this point, the handset may close the card 215.
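The end-of-journey test in this alternative embodiment can be sketched as follows; the 2-hour grace period is an assumed value standing in for the unspecified "preset time":

```python
def should_close_card(now_min, arrival_min, inside_dest_geofence,
                      grace_min=120):
    """Close the flight card once the preset time after arrival has passed
    and the user is no longer within the destination airport."""
    return now_min > arrival_min + grace_min and not inside_dest_geofence
```

The geofence condition keeps the card alive for a passenger still waiting at the carousel long after the scheduled arrival time.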
FIGS. 2A-2J show a series of user interfaces in which the mobile phone splits flight information across contexts while displaying the flight card. By implementing this method, the mobile phone can display, according to the user's context, the information the user cares about most in that context. The user can thus obtain as much information as possible through the flight card while avoiding the inconvenience of information stacking.
The user interface for check-in/seat selection mentioned in FIGS. 2A-2J is described below. When a user operation acting on a check-in button is detected, the mobile phone can display a user interface for transacting check-in/seat selection. The above operation is, for example: a user operation acting on the check-in button 231 shown in FIG. 2C, FIG. 2E, or FIG. 2F.
FIG. 2K illustrates the user interface in which the handset displays check-in/seat selection.
As shown in FIG. 2K, the page may include an area 261, which displays a plurality of seat icons. The seat icons distinguish the state of each seat by color. For example, referring to the icon 262, a white icon may indicate that the seat is free, i.e., not selected by another passenger. Referring to the icon 263, a light gray icon may indicate that the seat is occupied, i.e., selected by another passenger. Thus, seats corresponding to white icons are selectable by the user, and seats corresponding to light gray icons are not.
When a user operation on a white icon is detected, the handset may, in response, change the white icon to dark gray, referring to the icon 264. A dark gray icon may indicate the seat selected by the user. At this point, the cell phone may display a passenger label 265 and a button 266. The passenger label 265 may represent identity information of the user. The button 266 may be used to lock the seat. When a user operation on the button 266 is detected, the handset may, in response, send the data of that seat to the airline's server. After the airline's server confirms the seat number selected by the user, the cell phone can display feedback that the selection succeeded. At this time, the seat corresponding to the icon 264 is the seat selected by the user ("Lisa").
The user interface for displaying the electronic boarding pass mentioned in FIGS. 2A-2J is described below. When a user operation acting on an electronic boarding pass button is detected, the mobile phone can display the electronic boarding pass. The above operation is, for example: a user operation acting on the boarding pass button 241 shown in FIG. 2D, or on the boarding pass button 251 shown in FIG. 2H or FIG. 2I.
Fig. 2L illustrates a user interface for a cell phone to display an electronic boarding pass.
As shown in fig. 2L, the interface includes a boarding pass 271, buttons 272. Boarding pass 271 may be used to display information needed for boarding. Button 272 may be used to close boarding pass 271.
Boarding pass 271 displays information such as the origin ("Shenzhen Baoan T3"), destination ("Beijing capital T3"), name of the user ("Lisa"), cabin class ("K"), seat number ("23D"), boarding time ("8: 00"), gate ("pending"), boarding sequence number ("67"), health code, and the like.
Boarding pass 271 may also include a button 273 and a control 274. When a user operation on the button 273 is detected, the cell phone can, in response, save boarding pass 271 to the wallet application, i.e., the user can later open the electronic boarding pass in the wallet. The control 274 may be used to share the boarding pass. Upon detecting a user operation on the control 274, the handset may, in response, display icons of a plurality of applications, representing the targets the boarding pass can be shared to.
In the flight cards shown in FIGS. 2A-2J, in addition to detecting user operations on particular buttons or controls, the cell phone may also detect user operations on other areas of the card. The particular buttons or controls include: the check-in button 231 shown in FIG. 2C, the boarding pass button 241 shown in FIG. 2D, and so on. The areas of the card other than these buttons or controls may be referred to as other areas.
When a user operation acting on the other area is detected, the mobile phone can display a flight detail page in response to the operation. Fig. 2M illustrates a user interface for a cell phone to display a flight details page. The flight information displayed by the interface can refer to the descriptions in fig. 2A-2J, and is not described in detail here.
The interface distinguishes between the planned departure time and the actual departure time, and between the planned arrival time and the estimated arrival time. The planned departure time and planned arrival time are determined when the flight is scheduled. Flight delays caused by weather and other factors change the actual departure time and, correspondingly, the estimated arrival time.
The cards in the different situations described in fig. 2A-2J are all cards displayed when the flight proceeds as scheduled. When the user's flight is delayed, cancelled, or the like, the card may display a delay or cancellation tag.
Fig. 2N shows how the flight cards shown in fig. 2A-2J look differently in the event of a delay. As shown in fig. 2N, when a flight of the user is delayed, the cell phone may add a delay tag to the currently displayed card.
Fig. 2O shows how the flight cards shown in fig. 2A-2J look in the case of cancellation. As shown in fig. 2O, when the user's flight is cancelled, the phone may display a "flight cancelled" tag on top of what the card originally displayed.
When the departure time and the arrival time of the user's flight are not on the same day, the mobile phone can display a "+1" tag after the arrival time on the flight card, indicating that the arrival time falls on the next day. Fig. 2P shows a flight card displaying the "+1" tag.
Referring to fig. 2Q, the flight card may display an early-warning tag when the user's flight may be delayed (e.g., a preceding flight may not arrive on time, or a delay is expected due to thunderstorm weather). When the user's flight cannot take off due to a mechanical failure or another fault, the flight card can also display a fault tag; tags for other scenarios may be displayed similarly.
The context-based notification display method provided by the embodiment of the present application can be applied to banner notifications, pull-down notifications, lock screen notifications, and negative screens, without being limited to the cards described in fig. 2A to 2Q.
Banner notification refers to: a mode of displaying a notification at the top of the screen while the mobile phone is in the running state (i.e., unlocked and displaying the desktop or another application interface). Typically, a banner notification lasts for a relatively short time. A pull-down notification is: a notification displayed on the pull-down interface. Here, the pull-down interface is the interface the mobile phone displays in response to a swipe-down operation. Lock-screen notification refers to: a mode of displaying a notification on the lock screen interface. Banner notifications that the user does not handle in time may be archived as pull-down notifications and lock-screen notifications.
The minus-one screen refers to the leftmost page of the mobile phone. The minus-one screen can display the applications the user commonly uses, or shortcut functions provided by applications, such as the play/pause/skip functions provided by a music application. In the embodiment of the present application, the flight card can also be displayed on the minus-one screen.
It will be appreciated that displaying the banner notification, the pull-down notification, the lock-screen notification, and the minus-one-screen card described above is optional. That is, during the transition from one context to another, the content added to the cards shown in fig. 2A-2Q may also be displayed in one or more of the banner notification, pull-down notification, lock-screen notification, and minus-one-screen card.
For example, the mobile phone may also display a banner notification during a transition from the scenario of displaying the check-in advance notice to the scenario of displaying the check-in procedure reminder (i.e., the card is updated from the state shown in fig. 2B to the state shown in fig. 2C). The banner notification includes the content displayed in area 222 shown in fig. 2C. When the mobile phone is in the screen locking state, the mobile phone can display a screen locking notice. Likewise, the lock screen notification includes the content displayed in the area 222 shown in fig. 2C.
The following describes the user interfaces in which the mobile phone displays the other forms of notification mentioned above during the context transition shown in fig. 2B to 2C. The transitions between the other scenarios shown in fig. 2A-2Q can be handled with reference to the following description, and the embodiments of the present application will not describe them in detail.
Fig. 3A shows a user interface 31 for a handset to display notification content in the form of a banner notification.
When detecting that the user's flight has started check-in, the mobile phone can receive a notification reminding the user to check in. The handset may then display the user interface 31 shown in fig. 3A. The user interface 31 may include a notification window 311. Notification window 311 shows a banner notification. The notification window 311 may include a control 312 and some prompt messages, such as a prompt indicating that the notification is a flight notification, flight number information indicating which flight it concerns, and so on.
The control 312 may prompt the user to check-in. The cell phone may detect a user action on control 312, in response to which the cell phone may display a page for transacting check-in procedures.
The mobile phone can set a preset time; when the preset time expires, the mobile phone can close the notification window 311.
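The timeout behaviour described above can be sketched as follows. This is an illustrative Python sketch; the class name, the 5-second default, and the injectable clock are assumptions for testability, not details from the patent:

```python
import time

class BannerNotification:
    """Minimal sketch of a banner notification that closes itself once a
    preset display time has elapsed (assumed names, not from the patent)."""

    def __init__(self, content, timeout_s=5.0, clock=time.monotonic):
        self.content = content
        self.timeout_s = timeout_s
        self._clock = clock          # injectable clock, eases testing
        self._shown_at = clock()
        self.visible = True

    def tick(self):
        """Poll the clock; hide the window after the preset time."""
        if self.visible and self._clock() - self._shown_at >= self.timeout_s:
            self.visible = False
        return self.visible
```

A real implementation would schedule a one-shot timer rather than poll, but the visibility rule is the same.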
Fig. 3B shows the user interface 32 for the handset to display a notification on the lock screen interface.
Similarly, the mobile phone may receive a notification to remind the user to check in after detecting that the flight of the user starts checking in. If the mobile phone is in the screen-off state at the moment, the mobile phone can light the screen and display the screen-locking interface. The handset may then display the notification on the lock screen interface, referring to the user interface 32 shown in FIG. 3B.
As shown in FIG. 3B, the user interface 32 may include a notification window 321. The notification window 321 may be used to show a notification that a flight is starting to check in. The specific content shown in the notification window 321 can refer to the description of fig. 3A, and is not described herein again. One notification shown in notification window 321 may be referred to as a lock screen notification.
In particular, after the user enters the airport, the notification displayed on the lock screen interface may include an entry for acquiring the electronic boarding pass. For example, the prompt in the notification window 321 may be replaced with "tap to get electronic boarding pass". When a user operation on "tap to get electronic boarding pass" is detected, the mobile phone may display the user interface showing the electronic boarding pass shown in fig. 2L in response to the operation.
The cell phone may skip the step of verifying the user's identity before displaying the user interface showing the electronic boarding pass shown in fig. 2L. For example, the mobile phone may not display a password keyboard, a fingerprint-unlocking interface, or the like. In this way, the user can obtain the electronic boarding pass more quickly.
Fig. 3C and 3D show a set of user interfaces for displaying notifications on the drop-down interface by the handset.
Fig. 3C shows a home page (home page) on the handset for exposing the installed application. When the mobile phone displays the home page, the mobile phone can detect the sliding down operation acting on the home page, referring to the gesture operation shown in fig. 3C. In response to the slide down operation, the mobile phone may display the pull down interface shown in fig. 3D.
As shown in FIG. 3D, the drop-down interface may include a notification window 331. The notification window 331 can be used to show the notification of the flight starting check-in (refer to the descriptions of fig. 3A and fig. 3B, which are not described herein again). One notification shown in the notification window 331 may be referred to as a drop down notification.
The pull-down interface may also include a control 332. Control 332 can be used to clear the notifications on the pull-down interface. The cell phone can detect a user operation on control 332, in response to which the cell phone can clear all notifications on the pull-down interface.
It will be appreciated that when the cell phone displays a banner notification and does not detect a user operation on it, such as a click, the cell phone may archive the notification as a lock-screen notification and a pull-down notification. Specifically, for a banner notification that the user does not handle in time, when the mobile phone displays the lock screen interface, it may display the content of the banner notification there, i.e., display a lock-screen notification; when the mobile phone displays the pull-down interface, it may display the content of the banner notification there, i.e., display a pull-down notification.
Take the banner notification 311 shown in fig. 3A as an example. When the mobile phone does not detect a user operation on the banner notification 311 within the preset time, it may close the notification. Then, when the mobile phone displays the lock screen interface shown in fig. 3B, it may display the check-in reminder contained in the banner notification there, referring to the lock-screen notification 321. When the mobile phone displays the pull-down interface shown in fig. 3D, it may display the check-in reminder contained in the banner notification there, referring to the pull-down notification 331.
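The archiving rule described above can be sketched as follows. This is an illustrative Python sketch; the class and method names are assumptions, not the patent's internal interfaces:

```python
class NotificationCenter:
    """Sketch of the archiving behaviour: a banner notification the user
    does not tap within the preset time is filed into both the lock-screen
    and pull-down notification lists (assumed names)."""

    def __init__(self):
        self.lock_screen = []   # notifications shown on the lock screen
        self.pull_down = []     # notifications shown on the pull-down interface

    def on_banner_closed(self, banner, user_tapped):
        # A tapped banner counts as handled and is not archived.
        if not user_tapped:
            self.lock_screen.append(banner)
            self.pull_down.append(banner)
```

The same content object feeds both surfaces, which matches the description that the lock-screen and pull-down notifications show the content contained in the banner.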
Fig. 3E shows the user interface 35 in which the handset displays the minus-one screen. As shown in fig. 3E, the minus-one screen may display a plurality of cards, including a flight card 341, a schedule card 342, a weather card 343, a music card 344, a gallery card 345, and so on. When it is detected that the user's flight has started check-in, the flight card 341 may display a check-in prompt message, such as "Check-in has started".
Displaying the flight card on the minus-one screen and displaying it on the desktop are both optional. That is, the mobile phone may display the flight card on the minus-one screen and the desktop at the same time, or display it on only one of them. The method of displaying the flight card on the minus-one screen is the same as the method of displaying it on the desktop shown in fig. 2A to 2J, and details are omitted here.
When the user is in the boarding stage (i.e. boarding situation), the flight card may present the appearance shown in fig. 2I, and display information of interest to the user in the boarding stage, i.e. information of gate, seat number, etc.
Here, the boarding phase refers to a situation in which the user has arrived at the departure airport and the takeoff time is approaching. Whether the takeoff time is approaching can be judged by checking whether the current time falls within a preset window before takeoff. For example, if the takeoff time is 8:00, the handset may treat 7:30-8:00 as the approach window; if the current time falls within it, the mobile phone can confirm that takeoff is approaching.
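The window check above can be sketched in a few lines. This is an illustrative Python sketch; the function name and the 30-minute default (mirroring the 7:30-8:00 example) are assumptions:

```python
from datetime import datetime, timedelta

def near_takeoff(now, takeoff, window=timedelta(minutes=30)):
    """Return True when `now` falls inside the preset window before the
    departure time (assumed 30-minute window, per the 7:30-8:00 example)."""
    return takeoff - window <= now <= takeoff
```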
At this time, the mobile phone can display a floating window while displaying the card shown in fig. 2I. Specifically, after determining that the user is in a boarding situation, the cell phone may detect whether an immersive application is currently running. Optionally, an immersive application is an application with full-screen content presentation capability. While experiencing an immersive application, the user is typically not receptive to system notifications, particularly weak notifications. The immersive application may be a video application, a game application, a music application, a call application, or the like, or may be an applet running inside another application (for example, a video, game, or music applet).
When it is determined that the cell phone is running an immersive application, the cell phone may display a floating window on top of the layer of the user interface the application is currently displaying. The floating window can display a control through which the user can open the electronic boarding pass at any time. While the user is using the immersive application, the floating window always stays above the layer the application currently displays, i.e., it is placed on the top layer and cannot be covered. The floating window is movable: in response to the user's drag operation on it, the mobile phone can display the floating window in the area the user designates, avoiding occlusion that would affect the user experience. Optionally, the floating window may also be fixed.
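The decision above can be condensed into a single predicate. This is an illustrative Python sketch; the category names and function name are assumptions, not the patent's detection mechanism:

```python
# Assumed category labels for immersive (full-screen) applications.
IMMERSIVE_CATEGORIES = {"video", "game", "music", "call"}

def should_show_floating_window(in_boarding_context, foreground_category):
    """Sketch: once the boarding situation is confirmed, show the floating
    boarding-pass window only while an immersive application is in the
    foreground; otherwise the card and banner reminders suffice."""
    return in_boarding_context and foreground_category in IMMERSIVE_CATEGORIES
```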
Fig. 3F-3H show a set of user interfaces for a cell phone displaying a floating window at a user interface provided by an immersive application in the case where the application is running.
Fig. 3F shows the user interface 36 for the handset to play the video. The user interface 36 may include a floating window 351. The floating window 351 may be used to display an electronic boarding pass. The cell phone may detect a user operation acting on the floating window 351, in response to which the cell phone may display the user interface 37 of fig. 3G.
In the user interface 37, the floating window 351 may expand into the floating window 352. The floating window 352 may include a control 353 ("check-in boarding pass"). When a user operation on control 353 is detected, the cell phone may display the electronic boarding pass shown in fig. 2L in response to the operation. The operation is, for example, a click operation, a slide operation, or voice control, which is not limited in the embodiments of this application.
When the floating window 352 is displayed during boarding time, it may also display a control 354. Control 354 may be used to prompt the user to go to the gate to prepare for boarding. Specifically, control 354 may display the boarding prompt ("Start boarding") and the gate ("K50") in a carousel manner, the two alternating at a preset interval.
For example, the mobile phone may set the preset interval to 2 seconds. Within the first 2 seconds after the floating window 352 is displayed, the phone may display "Start boarding" and control 353. When the first 2 seconds have elapsed, "Start boarding" can be replaced with "Gate K50", as shown in fig. 3H. The phone then keeps alternating between the floating windows 352 shown in fig. 3G and fig. 3H.
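The 2-second carousel above is a pure function of elapsed time. This is an illustrative Python sketch; the function name and parameters are assumptions:

```python
def carousel_text(elapsed_s, items=("Start boarding", "Gate K50"),
                  interval_s=2.0):
    """Sketch of the carousel: the boarding prompt and the gate number
    alternate every preset interval (2 seconds in the example above)."""
    return items[int(elapsed_s // interval_s) % len(items)]
```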
In a period from arrival at an airport to departure of an airplane, when it is not detected that the mobile phone runs the immersive application, the mobile phone may remind the user to prepare to board through a flight card (refer to fig. 2I), a banner notification, and the like. When the mobile phone is detected to be running the immersive application program, the floating window can be displayed by the mobile phone.
Therefore, when the user is watching videos or playing games on the mobile phone, the phone can remind the user to board through the floating window. Meanwhile, the floating window is small and its position can be freely adjusted by the user, so it does not cause much interference.
Fig. 3A-3H show the different forms of notification displayed by the handset in different scenarios. It will be appreciated that these notification forms are not mutually exclusive, i.e., the handset may display a notification in multiple forms simultaneously. Therefore, the user can see the notification in time regardless of the scenario, respond in time, and avoid delaying the travel plan.
In the following, an embodiment of the present application will be described with reference to fig. 4A, which illustrates a system 10 for implementing a context-based notification display method.
As shown in fig. 4A, the system 10 may include an electronic device 100, a cloud 200. In the embodiment of the present application, the electronic device 100 is the mobile phone described above.
The cloud 200 stores all the data the card needs for display. Data stored in the cloud 200 may be transmitted to the electronic device 100 through a push mechanism. A notification received by the electronic device through the push mechanism may be referred to as a push notification. In other embodiments, the electronic device 100 may send a query request to the cloud 200 and then obtain the data required for card presentation from the cloud 200. For example, in the cards shown in fig. 2A-2J and 3A-3F, the data displayed by the cards may be obtained from the cloud 200.
The cloud 200 includes two interfaces (APIs): a push (push) interface and a response interface. The cloud 200 may call the push interface to send a push notification to the electronic device 100. In response to a request for the electronic device 100 to obtain data, the cloud 200 may invoke the response interface to transmit the data to the electronic device 100.
The electronic device 100 includes a notification display application, a notification reception module.
The notification display application is a system level application installed on the electronic device 100. The application comprises a data acquisition module, a cloud tool kit (cloud SDK), a decision module and a display module.
The data acquisition module may be used to acquire status data of the electronic device 100. The state data includes: time data, location data, short messages, cellular signals.
The time data refers to the current time acquired by the electronic device 100. The electronic device 100 may acquire the current time by reading the system time. In other embodiments, the current time may also be obtained over the network through network time synchronization. In the flight cards shown in fig. 2A to 2J, determinations by the electronic device 100 such as whether the check-in time has been reached and whether the departure time is approaching can all be made from the time data.
Location data refers to data acquired by the electronic device 100 that indicates the geographic location of the device. The electronic device 100 may obtain it through the Global Positioning System (GPS), the wireless fidelity (Wi-Fi) network the device is connected to, the cellular signal the device is using, or the like. The embodiment of the present application does not limit the method of acquiring location data. In the flight cards shown in fig. 2A to 2J, determinations by the electronic device 100 such as whether the user has arrived at the departure airport or the destination airport can be made from the location data.
The short message refers to a short message received by the electronic device 100. The electronic device 100 may detect whether it has received a new short message. If a newly received short message is detected, the electronic device 100 may identify its source. If the short message comes from a specific source, the electronic device 100 can read its content. For example, the electronic device 100 may obtain the user's flight itinerary from the ticketing short message. The following embodiments will describe in detail the method of obtaining the user's flight through the ticketing short message, which is not expanded here.
The electronic device 100 may also detect whether the device is using a cellular network. The electronic device 100 may detect whether the device uses a cellular network by detecting cellular data (cellular data). In the process of detecting whether the user boards the airplane or not and in the process of detecting whether the user arrives at the destination or not, the electronic device 100 may perform the determination through the cellular data. Specifically, if it is confirmed that no cellular data is generated for a period of time, the electronic device 100 may confirm that the user has boarded the airplane. The period of time is a preset time. If cellular data generation is detected after confirming that the user boards the airplane, the electronic device 100 may confirm that the user has arrived at the destination airport.
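The cellular-data heuristic above is a small state transition. This is an illustrative Python sketch; the state names and the 30-minute silence threshold are assumptions (the patent only says "a preset time"):

```python
def infer_flight_state(prev_state, cellular_active, silent_s,
                       silence_threshold_s=1800):
    """Sketch of the heuristic: a long stretch with no cellular traffic
    suggests the user has boarded, and traffic resuming afterwards suggests
    arrival at the destination airport (assumed names and threshold)."""
    if prev_state == "pre_boarding" and not cellular_active \
            and silent_s >= silence_threshold_s:
        return "boarded"
    if prev_state == "boarded" and cellular_active:
        return "arrived"
    return prev_state
```

As the next paragraph notes, this signal alone is imperfect, so it would be combined with time and location data.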
Of course, the above method for confirming the situation of the user through the cellular data is not perfect. Referring to the descriptions of fig. 2A to 2J, the electronic device 100 also acquires time data, position data, and the like in the process of detecting whether the user boards an airplane or not, and in the process of detecting whether the user arrives at a destination.
After the data acquisition module acquires the state information, the decision module can judge the scene where the user is located according to the state information. The decision module may then instruct the display module to update the displayed content according to the identified scene.
For example, when the time indicated by the time data reaches the check-in start time, the decision module may confirm that the user has entered the check-in scenario. At this point, the decision module may instruct the cloud SDK to obtain the check-in data from the cloud 200. When the time indicated by the time data is near the takeoff time, e.g., 20 minutes before takeoff, and the location data indicates that the user is within the geographic range of the airport, the decision module may confirm that the user has entered the boarding preparation phase. At this point, the decision module may instruct the cloud SDK to obtain data such as the gate and seat number from the cloud 200.
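The decision module's mapping from state data to scenario can be sketched as follows. This is an illustrative Python sketch; the scenario names are assumptions, and the 20-minute window follows the example above:

```python
from datetime import datetime, timedelta

def decide_scenario(now, checkin_open, takeoff, at_airport):
    """Sketch of the decision module: map state data (time, location) to
    the scenario that drives the card content (assumed scenario names)."""
    # Boarding preparation: near takeoff and within the airport's range.
    if at_airport and takeoff - timedelta(minutes=20) <= now < takeoff:
        return "boarding_preparation"
    # Check-in: check-in has opened but the flight has not departed.
    if checkin_open <= now < takeoff:
        return "check_in"
    return "idle"
```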
The cloud SDK is a tool kit provided by the cloud 200 for the electronic device 100 to access the cloud 200, and includes a plurality of data read-write interfaces. The electronic device 100 may request data required in the card presentation process, such as date, departure place, destination, and the like of the flight displayed in the card 215, from the cloud 200 through the read interface provided by the cloud SDK. The electronic device 100 may write the changed data into the cloud 200 through a write interface provided by the cloud SDK, for example, after the check-in is completed, the electronic device 100 may write the seat number into the cloud 200 through the write interface provided by the cloud SDK.
Data acquired by the cloud SDK from the cloud 200 may be sent to the display module. The display module may display the data in the flight card (card 215).
For example, upon detecting a start of check-in, the display module may display the card 215 shown in FIG. 2C. The card 215 may display a reminder message to start check-in, such as "check-in started". Meanwhile, the display module may display a control (check-in button 231) for checking in the check-in procedure provided by the cloud SDK. In response to user operations acting on the above-described controls, electronic device 100 may display the user interface of the check-in procedure shown in FIG. 2K.
The following embodiments will describe in detail the detailed flow of updating the card display content by the decision module according to the status data acquired by the data acquisition module, which is not expanded first.
The electronic device 100 also includes a notification reception module. The notification receiving module may be configured to receive a notification actively sent by the cloud 200 to the electronic device 100. In the method described in the foregoing embodiment, the electronic device 100 may obtain the flight information of the user by detecting the ticketing short message. In this embodiment, the electronic device 100 may further generate a flight card by receiving a push notification (push notification) of the cloud 200 through the notification receiving module. The push notification may include flight information for the user.
Specifically, when a user purchases a ticket via a third-party application, the cloud 200 may generate a flight record for the user. The record includes the passenger name, passenger identification card, phone number, order time, date (date the flight departed), origin, departure time, destination, arrival time, flight number, and the like.
Then, the cloud 200 may call the push interface to send a push notification to the electronic device 100. The push notification may include information such as the passenger's name, date, origin, departure time, destination, arrival time, flight number, etc.
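The push payload described above carries only a subset of the stored flight record. This is an illustrative Python sketch; the field names are assumptions chosen to match the fields listed above:

```python
# Card-relevant fields the push notification carries (assumed names);
# the phone number, identity card, and order time stay on the server.
PUSH_FIELDS = ("passenger_name", "date", "origin", "departure_time",
               "destination", "arrival_time", "flight_number")

def build_push_notification(flight_record):
    """Sketch: the cloud extracts only the card-relevant fields from the
    stored flight record before calling the push interface."""
    return {k: flight_record[k] for k in PUSH_FIELDS}
```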
The notification receiving module of the electronic device 100 may receive the push notification, which may then be further sent to the data acquisition module. After detecting that the data acquisition module has received the push notification, the decision module may instruct the display module to display the content of the push notification in the flight card.
That is, there are two ways for the electronic device 100 to obtain data and services from the cloud 200: firstly, a request for acquiring data or service is sent to the cloud 200 through the cloud SDK; secondly, the cloud 200 detects a data change of the stored flight record and actively sends a notification to the electronic device 100.
In the former case, the electronic device 100 acquires data or services required by itself from the cloud 200 according to its own needs. For example, when the mobile phone determines that the user's seat number should be presented in the card, the mobile phone may send a request to the cloud 200 to obtain the user's seat number. In response to the request sent by the cell phone, the cloud 200 can send the user's seat number to the cell phone. The phone can then display the user's seat number.
In the latter case, the electronic device 100 may know the change of the trip of the user in time through the notification pushed by the cloud 200. For example, when the user changes his/her flight, the cloud 200 may update the itinerary of the user recorded in the cloud 200, that is, replace the changed date, flight number, and other information with the original itinerary date, flight number, and other information. Then, the cloud 200 may send the updated trip information to the cell phone. Therefore, the mobile phone can update the information displayed by the card in time, so that the user can be reminded in time, and the user is prevented from missing a stroke.
The cloud 200 is typically provided by a third party. For example, in the process of presenting a flight card by the electronic device 100, the information involved usually has to be provided by an airline, or by a third-party platform that aggregates flight information from numerous airlines. Considering permission issues and the management of the electronic device 100, having the electronic device 100 acquire data or services directly from a third-party cloud is not preferable.
Thus, in another embodiment, the system 10 may also be represented as shown in FIG. 4B. As shown in fig. 4B, the system 10 may also include a cloud 300. The cloud 300 is a data set constructed for the electronic device 100, that is, all data required by the electronic device 100 are stored in the cloud 300. It is understood that data stored in cloud 300 is obtained from cloud 200.
The cloud 200 may detect a change in the data it stores, and then send the changed data to the cloud 300 in a push notification. Further, the cloud 300 may send the changed data to the electronic device 100. For example, when the user rebooks a flight, the cloud 200 may detect that data such as the user's flight number and departure time have changed. The cloud 200 can then send the changed data to the cloud 300 in a push notification. After receiving the changed data, the cloud 300 may update the stored flight number, departure time, and other data, and then transmit the changed data to the electronic device 100. In this way, the electronic device 100 can display the changed flight number, departure time, and so on.
In this way, the electronic device 100 can obtain data from the cloud 300 at any time, thereby avoiding direct contact with the third party cloud (cloud 200). Meanwhile, the cloud 300 may also centrally manage data of the electronic device 100.
The process of the electronic device 100 splitting the display notification according to the context change in which the user is located will be described below in conjunction with a set of timing diagrams shown in fig. 5A-5E.
First, fig. 5A shows a flow diagram of a flight card in a ticket ordering context of the electronic device 100.
S501: the cloud 200 generates a ticketing short message.
The user may purchase airline tickets through an airline's official website or a third-party travel application. After the purchase is completed, the cloud 200 may generate a flight record for the user. The record includes the passenger name, passenger identity card number, phone number, order time, date (the date the flight departs), origin, departure time, destination, arrival time, flight number, and the like.
Then, the electronic device 100 may receive the ticketing short message sent by the cloud 200. Specifically, the cloud 200 may obtain the phone number of the user from the flight record. The cloud 200 may set the phone number as a receiver of the short message.
Meanwhile, the cloud 200 can extract data required by the ticketing short message from the flight record. Generally, the data required by the ticketing short message includes: passenger name, flight number, origin and departure time, destination and arrival time, etc. Of course, the ticketing information may also include more information, and is not limited herein.
S502: the cloud 200 sends the ticket issuing letter to the electronic device 100.
Based on the information such as the passenger name, flight number, origin, and departure time, the cloud 200 may generate the ticketing short message. Then, the cloud 200 may send the ticketing short message to the electronic device corresponding to the recipient phone number. In the embodiment of the present application, the user is the user of the electronic device 100, and the user's phone number corresponds to the electronic device 100.
Therefore, the electronic device 100 can receive the ticketing short message sent by the cloud 200, that is, the user can receive the ticketing short message.
S503: the electronic device 100 identifies the source of the short message and extracts the content of the ticketing short message.
The data acquisition module of the card application may sense the ticketing short message received by the electronic device 100, and further, the data acquisition module may extract data such as a passenger name, a flight number, a departure place and a departure time, a destination and an arrival time from the ticketing short message.
Specifically, the electronic device 100 can determine whether the short message was sent by an airline company through the source of the short message. After confirming that the short message was sent by an airline company, the electronic device 100 may parse the content of the short message to check whether it is a ticketing short message. If data such as the passenger name and the flight number cannot be obtained after the short message is parsed, the electronic device 100 may confirm that the short message is not a ticketing short message. Otherwise, the electronic device 100 may confirm that the message is a ticketing short message, and the electronic device 100 may display data such as the passenger name and the flight number in the flight card.
For example, the electronic device 100 may receive a short message from "95583" (Air China). The short message is, for example: "Dear member: Hello. Your booked Air China flight CA1314 Shenzhen-Beijing departs at 2021-01-09 8:00 from Shenzhen Bao'an T3 and arrives at 2021-01-09 11:20 at Beijing Capital T3; passenger: Lisa. Please go to the airport two hours in advance with valid identification to avoid missing the flight. We wish you a pleasant journey!"
Through the sender number "95583", the electronic device 100 can confirm that the short message was sent by Air China. Then, the electronic device 100 may parse the content of the short message. Taking the content exemplarily shown in the above short message as an example, the electronic device 100 may obtain: flight number ("CA1314"), departure time ("2021-01-09 8:00"), origin ("Shenzhen Bao'an T3"), arrival time ("2021-01-09 11:20"), destination ("Beijing Capital T3"), and the like.
In addition, the short message may carry a tag indicating the type of the short message, such as express-delivery short message, advertisement short message, ticketing short message, and the like. Therefore, the electronic device 100 can also distinguish the ticketing short messages sent by the airline company through the tags. After confirming that the short message is a ticketing short message through the tag, the electronic device 100 can extract the content of the ticketing short message; reference is made to the above description, which is not repeated herein.
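The source-identification and extraction logic of S503 can be sketched as follows. This is a minimal illustration in Python: the sender table, regular expressions, and field names are assumptions made for this sketch, not the actual implementation of the embodiment.

```python
import re

# Illustrative sender table; "95583" is the Air China service number used
# in the example above. A real table would cover many airlines.
AIRLINE_SENDERS = {"95583": "Air China"}

def parse_ticketing_sms(sender: str, body: str):
    """Return extracted flight data if the SMS is a ticketing SMS, else None."""
    if sender not in AIRLINE_SENDERS:
        return None  # source check failed: not sent by a known airline
    # A ticketing SMS is expected to contain a flight number and two
    # date-time stamps (departure and arrival).
    flight = re.search(r"\b([A-Z]{2}\d{3,4})\b", body)
    times = re.findall(r"(\d{4}-\d{2}-\d{2} \d{1,2}:\d{2})", body)
    if not flight or len(times) < 2:
        return None  # parsing failed: treat as a non-ticketing SMS
    return {
        "airline": AIRLINE_SENDERS[sender],
        "flight_number": flight.group(1),
        "departure_time": times[0],
        "arrival_time": times[1],
    }

sms = ("Dear member: your booked Air China flight CA1314 Shenzhen-Beijing "
       "departs at 2021-01-09 8:00 from Shenzhen Bao'an T3 and arrives at "
       "2021-01-09 11:20 at Beijing Capital T3.")
card_data = parse_ticketing_sms("95583", sms)
```

The point is only the two-stage check described above: first verify the source, then require that parsing yields the mandatory fields before generating a flight card.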
S504: the electronic device 100 generates and presents flight cards.
After obtaining the data (flight number, departure time, departure place, etc.) from the ticketing short message, the electronic device 100 can generate a flight card. The card may include the data described above. The electronic device 100 may then display the above flight card; refer to the card 215 shown in fig. 2A. In the card shown in fig. 2A, the ticketing information displayed by the card includes: "CA1314, January 9, Monday, Shenzhen Bao'an T3, 8:00, Beijing Capital T3, 11:20". The card shown in fig. 2A may be referred to as the flight card in the ticketing stage.
In this way, the electronic device 100 can determine whether the user has scheduled a flight for travel by monitoring the short messages of the device. In addition, the electronic device 100 may further extract the flight data of the user by parsing the ticketing short message, so as to generate a flight card. Therefore, the user can check the scheduled flight information at any time through the flight card, avoiding delay of the journey.
In other embodiments, the electronic device 100 may also obtain the data (flight number, departure time, departure place, etc.) by receiving a push notification.
In this embodiment, after the user completes purchasing the ticket, the cloud 200 generates a push notification. The specific content included in the push notification may refer to the above-mentioned ticketing short message, which is not described herein again.
After generating the push notification, the cloud 200 may call the push interface to send the push notification to the electronic device 100. The electronic device 100 may receive the notification. Specifically, the electronic device 100 includes a notification reception module. This module may be used to receive push notifications. Therefore, this module can receive the push notification sent by the cloud 200.
Upon receiving the push notification sent by the cloud 200, the electronic device 100 may parse the notification and then obtain the flight information included in the notification. Thus, the electronic device 100 may extract the user's flight data from the notification, including flight number, departure time, origin, and so forth.
Specifically, after the notification receiving module receives the push notification, the data obtaining module may obtain the push notification from the notification receiving module. Then, the cloud SDK may parse the push notification, and then obtain flight data of the user. Further, the electronic device 100 may generate a flight card according to the flight data, and display the flight card. The card may be referred to as card 215 shown in fig. 2A.
It will be appreciated that when a flight card is generated for the first time, the mobile phone may display an authentication page. The authentication page may display a plurality of text input boxes. The user can fill in identity information such as name, ID number, and mobile phone number through the input boxes. After receiving the information, the mobile phone may use the identity information to request from the cloud 200 the schedule (i.e., flight) of the user corresponding to that identity. The mobile phone may store the identity information to facilitate subsequently acquiring the flight data of the user from the cloud 200. Therefore, the user does not need to repeatedly input his or her identity information into the mobile phone.
By using the push notification method, the electronic device 100 can avoid recognizing and extracting the short message. In this way, the electronic device 100 can more timely and conveniently acquire flight data of the user.
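The push-notification path described above can be sketched in the same way. The JSON payload shape and field names below are assumptions made for illustration; the embodiment does not specify a payload format.

```python
import json

def handle_push_notification(raw: bytes):
    """Parse a ticketing push notification into the fields the card
    application needs before generating a flight card."""
    payload = json.loads(raw)
    if payload.get("type") != "ticketing":
        return None  # not a ticketing notification
    required = ("flight_number", "departure_time", "origin",
                "arrival_time", "destination")
    if not all(key in payload for key in required):
        return None  # incomplete payload: cannot build a flight card
    return {key: payload[key] for key in required}

raw = json.dumps({
    "type": "ticketing",
    "flight_number": "CA1314",
    "departure_time": "2021-01-09 8:00",
    "origin": "Shenzhen Bao'an T3",
    "arrival_time": "2021-01-09 11:20",
    "destination": "Beijing Capital T3",
}).encode()
flight_data = handle_push_notification(raw)
```

Compared with the SMS path, no regular-expression extraction is needed: the notification already carries structured fields, which is why this path is described as more timely and convenient.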
Referring to fig. 5B, the flow of the electronic device 100 displaying the flight card in the check-in context, before and after the user completes the check-in procedure, will be described below.
S511: before starting the check-in, the electronic device 100 may display a notice card.
The electronic device 100 may replace the flight card displayed in the ticketing period with the notice card within a preset time period before the flight opens check-in. The notice card includes a check-in notice. The check-in notice may prompt the user as to when the flight is expected to open check-in.
The card 215 shown in FIG. 2B may be referred to as a notice card. At this point, region 222 may display "Check-in expected to open today at 18:00". Upon seeing the above prompt, the user knows that he or she can handle the check-in procedure after 18:00 today.
S512: the cloud 200 detects that the user's flight starts to check in, and generates a push notification.
The cloud 200 may detect the event that the user's flight begins check-in. Specifically, the cloud 200 may obtain the current time. When it is determined that the current time coincides with the check-in time of the user's flight, the cloud 200 determines that the user's flight has started check-in. Then, the cloud 200 may call the push interface and send a push notification indicating that check-in has started to the electronic device 100.
S513: the electronic device 100 generates and displays a flight card containing a check-in reminder.
The notification receiving module of the electronic device 100 may receive the notification. Further, the notification may be sent to a data acquisition module. The data acquisition module can report the notification to the decision module. At this point, the decision module may know that the user's flight is starting to check-in. The decision module may then instruct the display module to display the flight card containing the check-in reminder.
An example of the above check-in reminder is "Check-in open". As shown in FIG. 2C, the area 222 of the card 215 may display "Check-in open". After seeing the prompt message, the user knows that the check-in procedure can now be handled.
Meanwhile, the flight card containing the check-in reminder may also include a check-in button. The check-in button can be used for handling the check-in procedure. Specifically, the electronic device 100 may display the check-in user interface when a user operation acting on the check-in button is detected. As shown in fig. 2C, the area 222 of the card 215 may also include a check-in button 231. Upon detecting a user operation on the check-in button 231, the electronic device 100 may display the check-in user interface.
In another embodiment, confirming whether the user's flight has started check-in may also be accomplished by the electronic device 100. Specifically, the electronic device 100 may obtain time data. Here, the time data includes the time when the flight starts check-in and the current time. When the current time obtained by the electronic device 100 is the time check-in starts, or is after that time, the electronic device 100 may confirm that the flight of the user has been detected to start check-in. At this point, the electronic device 100 may generate and display a flight card containing a check-in reminder; see card 215 shown in fig. 2C.
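The device-side variant above reduces to a time comparison. A minimal sketch, assuming naive local datetimes and illustrative state names:

```python
from datetime import datetime

def checkin_state(now: datetime, checkin_opens: datetime) -> str:
    """Decide which check-in phase the flight card should show by comparing
    the current time with the time check-in opens."""
    if now >= checkin_opens:
        return "CHECKIN_OPEN"    # show "Check-in open" and the check-in button
    return "CHECKIN_NOTICE"      # show "Check-in expected to open at ..."

opens = datetime(2021, 1, 8, 18, 0)
before = checkin_state(datetime(2021, 1, 8, 12, 0), opens)
after = checkin_state(datetime(2021, 1, 8, 18, 30), opens)
```

A real implementation would re-evaluate this state periodically or on a timer set to the check-in opening time.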
S514: the user transacts the check-in procedure through the electronic device 100.
After displaying the flight card containing the check-in reminder, the electronic device 100 may detect whether there is a user operation applied to the check-in button. Upon detecting a user operation on the check-in button, the electronic device 100 may display the check-in user interface in response to the operation.
Specifically, upon detecting a user operation on the check-in button, the electronic device 100 may send a check-in request to the cloud 200 through a read interface provided by the cloud SDK. In response to the request, the cloud 200 may send the seat distribution data of the flight to the electronic device 100. The seat distribution data includes the spatial location of each seat and whether the seat is selectable.
Upon receiving the seat distribution data, the electronic device 100 may display a user interface for seat selection. The user interface displays all seats for the flight. Some of the seats are not selectable (have been selected by others) and some are selectable (have not been selected by others). The user can select own seat from the seats displayed on the interface.
After the user confirms the seat selected by the user, the electronic device 100 may transmit the user-selected seat data, i.e., the seat number, to the cloud 200.
When the cloud 200 receives the seat data sent by the electronic device 100, the cloud 200 may store the seat data in the flight record. Of course, before that, the cloud 200 needs to check whether the seat data of the user meets the requirement. For example, when the seat selected by the user is still selectable, the seat data of the user meets the requirement; otherwise, the seat data of the user does not meet the requirement, i.e., there is a conflict.
After the cloud 200 determines that the seat data of the user is satisfactory, the cloud 200 may send a confirmation signal to the electronic device 100, that is, the check-in is successful. At this point, the user's selected seat is essentially locked as the user's seat.
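The cloud-side seat check described above can be sketched as a check-then-lock step over the seat distribution data. The data structure and function name below are assumptions made for illustration:

```python
def try_assign_seat(seat_map: dict, seat: str, passenger: str) -> bool:
    """Verify that the submitted seat number exists and is still free,
    then lock it for the passenger; seat_map maps seat number to the
    occupying passenger name, or None if the seat is free."""
    if seat not in seat_map:
        return False              # no such seat on this flight
    if seat_map[seat] is not None:
        return False              # already taken by another passenger: conflict
    seat_map[seat] = passenger    # lock the seat into the flight record
    return True

seats = {"23C": "Alice", "23D": None}
ok = try_assign_seat(seats, "23D", "Lisa")
conflict = try_assign_seat(seats, "23C", "Bob")
```

On success the cloud would return the check-in confirmation signal; on conflict it would ask the device to re-select, since the seat may have been taken between display and submission.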
S515: the electronic device 100 updates the content in the card and displays the flight card after the check-in is completed.
After receiving the confirmation signal of successful check-in sent by the cloud 200, the electronic device 100 may display the flight card after check-in is completed. The flight card after check-in includes the basic information of the flight (date, departure time, departure place, arrival time, and destination) and also includes the seat number of the user.
Specifically, after receiving the confirmation signal that check-in is successful, which is sent by the cloud 200, the decision module of the notification display application may confirm that the user has completed the check-in operation. At this time, the decision module may instruct the display module to display the flight card after check-in is completed.
The display module may obtain the user-selected seat number from the cloud SDK. Then, the display module can replace the check-in prompt message displayed on the right side of the card with the seat number.
Referring to FIG. 2D, the left area (area 221) of the card 215 may display the basic information of the flight. The right area of the card 215 (area 222) may display a prompt message indicating that the user has completed the check-in procedure, along with the seat number selected by the user during check-in. The prompt message is, for example, "Preferred seat selected". The seat number is, for example, the "23D" shown in fig. 2D.
Fig. 5C shows a flowchart of the electronic device 100 changing the flight card in the boarding stage.
S521: the electronic device 100 detects that the user is in a boarding context.
After confirming that the user has completed the check-in operation, the electronic device 100 may detect that the user is in a boarding situation. The boarding scenario refers to a scenario in which a user arrives at a departure airport and approaches a departure time. Specifically, the electronic device 100 may determine whether the user is in a boarding situation according to the status information. The status information here includes: time data, location data.
The time data includes the takeoff time and the current time. The current time may be obtained by acquiring the system time of the electronic device 100. Optionally, the electronic device 100 may obtain the current time through network time synchronization. The location data may be obtained via GPS, the Wi-Fi to which the electronic device 100 is connected, the cellular signals used by the electronic device 100, and the like.
The electronic device 100 may set a preset time. If the current time is within the preset time before the takeoff time, the electronic device 100 may determine that the current time is close to the takeoff time. For example, if the takeoff time of the user's flight is "8:00", the electronic device 100 may set the 20 minutes before takeoff, i.e., "7:40-8:00", as the period approaching takeoff. If the current time is between "7:40-8:00", the electronic device 100 may confirm that the user's flight is about to take off.
Meanwhile, when the electronic device 100 obtains the position data indicating that the electronic device 100 is within the airport territory, i.e., the user has arrived at the airport, the electronic device 100 may confirm that the user is about to prepare to board the airplane.
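Combining the two conditions above, the boarding-context decision of S521 can be sketched as follows; the 20-minute window comes from the example above, and the geofence check is abstracted into a boolean for brevity:

```python
from datetime import datetime, timedelta

def in_boarding_context(now: datetime, takeoff: datetime,
                        at_airport: bool, window_minutes: int = 20) -> bool:
    """The user is in the boarding context when the current time falls inside
    the preset window before takeoff AND the location data places the device
    at the departure airport."""
    window_start = takeoff - timedelta(minutes=window_minutes)
    return at_airport and window_start <= now <= takeoff

takeoff = datetime(2021, 1, 9, 8, 0)
boarding = in_boarding_context(datetime(2021, 1, 9, 7, 50), takeoff, True)
too_early = in_boarding_context(datetime(2021, 1, 9, 7, 30), takeoff, True)
not_at_airport = in_boarding_context(datetime(2021, 1, 9, 7, 50), takeoff, False)
```

Both signals must hold: time alone would trigger for a user still in transit, and location alone would trigger for a user at the airport hours early.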
S522: the electronic device 100 acquires data required for boarding the card from the cloud 200.
Upon confirming that the user is at the stage of preparing to board the airplane, the electronic device 100 may display the flight card of the boarding stage (the boarding card). The content displayed in the boarding card includes the basic information of the flight as well as the gate and seat number; refer to fig. 2I.
Specifically, after confirming that the user is in a stage of preparing to board the airplane, the electronic device 100 may request data required for boarding the card from the cloud 200, including basic information of the flight (flight number, date, departure time, arrival time, etc.) and a gate, a seat number, etc.
In response to the request sent by the electronic device 100, the cloud 200 may send data required for the boarding card to the electronic device 100.
S523: electronic device 100 generates and presents the boarding card.
After receiving data required for the boarding card transmitted by the cloud 200, the electronic device 100 may generate the boarding card. Then, the display module may display the boarding card on the screen of the electronic device 100, referring to the user interface illustrated in fig. 2I.
At this point, upon seeing the flight card, the user can quickly learn information such as the boarding gate and seat number, and can therefore complete boarding quickly, avoiding going to the wrong gate or seat.
S524: the electronic device 100 presents an electronic boarding pass.
Optionally, the boarding card may also include a boarding pass button. When a user operation acting on the button is detected, the electronic apparatus 100 may present the electronic boarding pass of the user in response to the operation.
Specifically, when a user operation acting on the button is detected, in response to the operation, the electronic device 100 may send a request for acquiring the data required for the electronic boarding pass to the cloud 200 through the read interface provided by the cloud SDK. The data required for the electronic boarding pass includes: passenger name, flight number, date, destination, cabin class, seat number, gate, health code, and the like.
In response to the request, the cloud 200 may extract the data from the user's flight record. Then, the cloud 200 may transmit the above data to the electronic device 100.
The cloud SDK of the electronic device 100 may receive the data. Then, the decision module can instruct the display module to display the data, that is, to display the electronic boarding pass.
If the user performs operations such as changing or refunding a ticket, the flight record of the user stored in the cloud 200 changes. At this time, the content displayed on the flight card of the electronic device 100 should also change accordingly so as not to mislead the user.
Fig. 5D shows a flowchart of the electronic device 100 refreshing the display content of the flight card according to flight change or the like.
S531: the cloud 200 detects a flight change transaction.
When the schedule of the user changes, the user can handle procedures such as refunding or changing a ticket. When the user performs a refund or change operation, the flight record of the user stored in the cloud 200 changes accordingly in response to the operation. Specifically, when the user performs a refund operation, the cloud 200 may mark the user's flight record as invalid in response to the operation. When the user performs a change operation, in response to the operation, the cloud 200 may change the information such as the flight number and departure time stored in the flight record to the information of the changed flight.
For example, the user originally scheduled a flight of "8:00 am from Shenzhen Bao' an T3, 11:20 am to Beijing capital T3". At this time, the departure time and arrival time of the user recorded in the cloud 200 are "8: 00" and "11: 20". When the user cannot complete the flight on time, the user can change the flight to "11:00 am from Shenzhen Bao' an T3 and 13:20 pm to Beijing capital T3". At this time, the departure time and arrival time of the user recorded in the cloud 200 may be changed to "11: 00" and "13: 20".
S532: the electronic device 100 receives the change notification.
When a change occurs to the flight record of the user stored in the cloud 200, the cloud 200 may send a change notification to the electronic device 100. The cloud 200 may call the push interface to send a push notification (change notification) to the electronic apparatus 100. The push notification may instruct electronic device 100 to request data for the user's flight from cloud 200. The notification receiving module of the electronic device 100 may receive the push notification.
S533: according to the change notification, the electronic device 100 acquires data of the flight of the user to the cloud 200.
After receiving the push notification, the electronic device 100 may parse the notification. Upon parsing, the electronic device 100 may confirm that the notification indicates that the electronic device 100 requests data for the user's flight from the cloud 200.
In response to the notification, the electronic device 100 may send a request to the cloud 200 to obtain flight data for the user. The flight data of the user may include: flight number, departure time, origin, arrival time, destination, etc. In response to the request, the cloud 200 may transmit the data to the electronic device 100.
The electronic device 100 may then obtain the refunded or changed flight data. For example, after the user changes the flight to "11:00 am from Shenzhen Bao'an T3 and 13:20 pm to Beijing Capital T3", the departure time acquired by the electronic device 100 changes to "11:00" and the arrival time changes to "13:20".
S534: the electronic device 100 updates the contents of the flight card presentation.
Upon receiving the changed data, the electronic device 100 may generate a new flight card. Further, the electronic device 100 may display the new flight card.
In other embodiments, the push notification sent by the cloud 200 may also include the changed data. Thus, the electronic device 100 can directly obtain the changed flight data through analysis. Then, the electronic device 100 may generate a new flight card according to the changed flight data, and then the electronic device 100 may display the new flight card.
When the user arrives at the destination, the flight card may display the user's baggage information and close the card after a period of time. Fig. 5E shows a flowchart of the electronic device 100 displaying the flight card after arriving at the destination.
S541: the cloud 200 detects a flight arriving at the destination.
The cloud 200 can detect whether the aircraft has landed at the destination airport, i.e., whether the user has arrived at the destination. Specifically, when the current time is the arrival time of the flight, the cloud 200 confirms that the airplane on which the user is seated has landed at the destination airport, that is, that the user has arrived at the destination.
S542: the electronic device 100 receives the landing notification.
Upon detecting that the flight has arrived at the destination, the cloud 200 may generate a push notification. The notification may be referred to as a landing notification. The notification may be used to instruct the electronic device 100 to request from the cloud 200 the data (flight data) needed to present the flight card. Here, the data required for the flight card includes baggage carousel information. The baggage carousel information indicates the location where the user's baggage can be claimed.
Then, the cloud 200 may call the push interface to send the push notification to the electronic device 100.
The electronic device 100 may receive the landing notification. Specifically, the electronic device 100 includes a notification receiving module. This module may be used to receive push notifications. Accordingly, the notification receiving module may receive the landing notification sent by the cloud 200.
S543: the electronic device 100 requests flight data from the cloud 200.
In response to the landing notification, the electronic device 100 may send a request to the cloud 200. Specifically, the cloud SDK may provide a read interface for the electronic device 100 to read data from the cloud 200. The notification display application may read the user's flight record from the cloud 200 via the cloud SDK.
In response to a read operation of the electronic device 100, the cloud 200 may send flight data of the user to the electronic device 100. Here, the flight data includes baggage carousel information.
S544: the electronic device 100 displays the arriving flight card.
After receiving the flight data sent by the cloud 200, the electronic device 100 may generate a new flight card and then display the card.
Specifically, after the cloud SDK acquires the flight data through the read interface, the cloud SDK may generate a new flight card according to the flight data. The flight card now includes the baggage carousel information. The decision module may then instruct the display module to display the new flight card on the screen of the electronic device 100. Referring to fig. 2J, after detecting that the flight has landed, the electronic device 100 may display the user interface shown in fig. 2J. At this point, the area 222 of the flight card 215 may display the baggage carousel.
Therefore, after landing, the user can see the baggage information upon opening the mobile phone, and can then go to the location indicated by the baggage carousel to retrieve the baggage. This spares the user the operation of querying the baggage information, saving user operations and improving the user experience.
S545: the electronic device 100 closes the flight card.
After displaying the arriving flight card, the electronic device 100 may close the flight card in a preset situation. The electronic device 100 closes the flight card as the itinerary indicated by the flight card is complete.
Specifically, the electronic device 100 may set a preset time. After the preset time has elapsed, the electronic device 100 may confirm that the user has completed the flight itinerary and left the destination airport. For example, the preset time may be 30 minutes. The electronic device 100 may regard the 30th minute after the arrival time of the flight as the time when the user completes the trip, i.e., the time of departure from the airport.
At the same time, electronic device 100 may also obtain location data. The electronic device 100 may combine the time and location data to determine whether the user has left the destination airport. If the current time exceeds the arrival time by 30 minutes or more, and the current location indicates that the electronic device 100 is not within geographic range of the destination airport, the electronic device 100 confirms that the user has completed the trip and has left the destination airport.
After confirming that the user has completed the trip and has left the destination airport, the electronic device 100 may close the flight card.
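The close condition of S545 can be sketched as the conjunction of the two signals described above (preset time elapsed after arrival, and device outside the destination airport). The names and the 30-minute grace period follow the example above:

```python
from datetime import datetime, timedelta

def should_close_card(now: datetime, arrival: datetime,
                      inside_airport: bool, grace_minutes: int = 30) -> bool:
    """The trip is treated as complete, and the flight card closed, once the
    preset time has elapsed after arrival and the location data shows the
    device has left the destination airport."""
    trip_done = now >= arrival + timedelta(minutes=grace_minutes)
    return trip_done and not inside_airport

arrival = datetime(2021, 1, 9, 11, 20)
close_now = should_close_card(datetime(2021, 1, 9, 11, 55), arrival, False)
too_soon = should_close_card(datetime(2021, 1, 9, 11, 40), arrival, False)
still_there = should_close_card(datetime(2021, 1, 9, 12, 0), arrival, True)
```

Requiring both signals avoids closing the card while the user is still waiting at the baggage carousel past the grace period.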
In other embodiments, the electronic device 100 may itself detect whether the user arrives at the destination airport. Specifically, the electronic device 100 may periodically acquire time data and location data. When the time data indicates that the current time is after the arrival time and the location data indicates that the current location is within the geographic range of the destination airport, the electronic device 100 may confirm that the user has arrived at the destination airport.
In addition, the electronic device 100 may also periodically detect whether the user is using cellular data after the takeoff time. If the user is detected to have generated cellular traffic, the electronic device 100 may determine that the user has arrived at the destination airport. Of course, the two methods can be combined to determine whether the user arrives at the destination airport.
Upon detecting that the user has arrived at the destination airport, the electronic device 100 may obtain the flight data from the cloud 200; further, the electronic device 100 may generate and display the flight card after arrival at the destination airport. For the subsequent steps, refer to S543-S545, which are not described herein again.
Not limited to flight notifications, the context-based notification display method provided by the embodiment of the present application can also be applied to high-speed rail, health code, and office clock-in scenarios. Next, the application of the context-based notification display method in the high-speed rail and other scenarios will be described in turn in the embodiments of the present application.
First, FIGS. 6A-6D illustrate a set of user interfaces for a context-based notification display method applied in high-speed rail.
After purchasing a high-speed rail ticket, the mobile phone of the user can receive the ticketing short message for the high-speed rail ticket. Then, by reading the ticketing short message, the mobile phone can obtain the data of the high-speed rail ticket reserved by the user. The data may be referred to as high-speed rail data. The high-speed rail data includes: train number, date, departure place, departure time, destination, arrival time, etc. Then, the mobile phone can present the high-speed rail data in the form of a high-speed rail card.
Similarly, the mobile phone can also obtain the high-speed rail data by receiving a push notification sent by the server. Reference is made here to the description of flight cards, which will not be repeated.
As shown in fig. 6A, card 1011 may be referred to as a high-speed rail card. Card 1011 shows a high-speed rail trip, including the train number ("G1314"), date ("January 9"), origin ("Shenzhen North"), departure time ("8:00"), destination ("Guangzhou South"), and arrival time ("11:20").
Then, the mobile phone can monitor the current time and judge whether the current time is close to the departure time. Similarly, the mobile phone can judge whether the current time is close to the departure time by setting a preset time. For example, the preset time may be 4 hours. When the current time is within 4 hours before the departure time, the mobile phone can confirm that the departure time is close.
The mobile phone can acquire its system time. The system time may serve as the current time. Then, according to the system time and the preset time, the mobile phone can judge whether the departure time is approaching. For example, if the system time acquired by the mobile phone is 6:00 am on January 9 and the departure time of the high-speed rail train is 8:00 am, the mobile phone can confirm that 2 hours remain until departure. At this point, the handset may display the user interface 62 shown in FIG. 6B.
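The countdown judgment for the high-speed rail card can be sketched in the same way as the flight-card time checks; the 4-hour window and the prompt text follow the examples above and are otherwise assumptions of this sketch:

```python
from datetime import datetime

def departure_prompt(now: datetime, departure: datetime,
                     window_hours: int = 4):
    """When the current (system) time falls within the preset window before
    departure, the right area of the high-speed rail card switches to a
    countdown prompt; otherwise the ticketing-stage card is kept."""
    hours_left = (departure - now).total_seconds() / 3600
    if 0 < hours_left <= window_hours:
        return f"{int(hours_left)} hours from departure"
    return None  # outside the window: keep the ticketing-stage card

dep = datetime(2021, 1, 9, 8, 0)
prompt = departure_prompt(datetime(2021, 1, 9, 6, 0), dep)
early = departure_prompt(datetime(2021, 1, 9, 3, 0), dep)
```

With the example above (system time 6:00 am, departure 8:00 am), the card would display the "2 hours from departure" countdown.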
In the user interface 62, the mobile phone can adjust the content displayed in the original card 1011. Adjusted card 1011 may appear as shown for card 1012.
Card 1012 may include a left side area and a right side area. Similar to the airline card, the left area may display basic information in a high-speed ticket, including: departure place, departure time, destination, arrival time, etc. The right area may display a prompt for the time of the approaching departure, for example, a departure countdown ("2 hours from departure"). After the user sees the prompt, the user can know the approaching departure time, and then the user can go to a high-speed rail station, so that delay of a trip is avoided.
Meanwhile, the mobile phone can also acquire position data. Through the position data, the mobile phone can judge whether the user arrives at the high-speed rail station. The method for obtaining the position data by the mobile phone can refer to the flight card described above, and details are not repeated here.
Upon confirming that the user arrives at the high-speed rail station, the cell phone may update the content shown in card 1012. Specifically, the mobile phone can display information such as a ticket checking port, a seat number and the like in the high-speed rail card. As shown in fig. 6C, updated card 1012 may take the form of card 1013.
The content displayed by card 1013 may include: train number ("G1314"), date ("January 9"), departure time ("8:00"), arrival time ("11:20"), destination ("Guangzhou South"), seat number ("Car 8, seat 11A"), ticket gate ("A12"). The seat number and ticket gate are newly added upon detecting that the user has arrived at the high-speed rail station.
Therefore, after seeing the card, the user can immediately know at which ticket gate his or her train checks tickets, without having to look up the electronic ticket. This saves user operations and improves the user experience.
Then, the mobile phone can determine from the current time whether the train has departed. The mobile phone can acquire the system time and the departure time. When the system time reaches or exceeds the departure time, the mobile phone can confirm that the train has departed. After confirming that the high-speed rail on which the user is seated has departed, the cell phone may display the user interface 64 shown in fig. 6D. User interface 64 shows the appearance of the high-speed rail card after departure (card 1014).
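Taken together, the time and position judgments above amount to selecting one of the card appearances 1011 to 1014. A minimal sketch, with illustrative state names that are not terms from the patent:

```python
from datetime import datetime, timedelta

# Hedged sketch of the high-speed rail card's state transitions across
# Figs. 6A-6D; the state names are assumptions for illustration.
def rail_card_state(now: datetime, departure: datetime, at_station: bool,
                    near_window: timedelta = timedelta(hours=4)) -> str:
    """Pick which appearance of the card to show for the current context."""
    if now >= departure:
        return "departed"        # card 1014: seat number shown, no ticket gate
    if at_station:
        return "at_station"      # card 1013: ticket gate and seat number added
    if departure - now <= near_window:
        return "departure_near"  # card 1012: departure countdown shown
    return "basic"               # card 1011: basic trip information
```

Ordering matters: arrival at the station takes precedence over the plain countdown, and actual departure takes precedence over both.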
The content displayed by card 1014 may include: train number ("G1314"), date ("January 9"), departure time ("8:00"), arrival time ("11:20"), origin ("Shenzhen North"), destination ("Guangzhou South"), seat number ("Car 8, seat 11A"). The right area of card 1014 may display the user's seat number. Meanwhile, the content displayed by card 1014 no longer includes the ticket gate information.
When the flight card (or the high-speed rail card) is displayed, the mobile phone can display or close the health code card while updating the flight card (or the high-speed rail card). The application of the context-based notification display method in the health code scenario will be described below. Figure 7 shows a user interface for the handset to display a health code along with a flight card (or high-speed rail card).
When the mobile phone detects through the position data that the user has arrived at an airport (or a high-speed rail station), the content displayed by the flight card can be extended with information such as the check-in counter and boarding gate; the content displayed by the high-speed rail card can be extended with information such as the ticket gate, car number, and seat number. At this time, the mobile phone can display the health code card below the flight card (or the high-speed rail card).
Fig. 7 shows a user interface 71 for a handset to display a health code card. The user interface 71 may include a flight card 1101 and a health code card 1102. The flight card 1101 is the card shown after the mobile phone detects that the user has arrived at the airport. The health code card 1102 is the card displayed after the mobile phone detects that the user has arrived at the airport.
For example, after arriving at an airport, a mobile phone may acquire the location data of the device by means of GPS or the like. Through the position data, the mobile phone can confirm that the user arrives at the airport. The phone can then change the flight card to the appearance shown in the flight card 1101, and at the same time, the phone can also display the health code card 1102 below the flight card 1101.
Health code card 1102 may include controls 1103, 1104, 1105, 1106. When a user operation on control 1103 is detected, the handset may display a user interface containing a health code in response to the operation. Upon detecting a user operation on control 1104, the handset may display a page for filling out the travel registration in response to the operation. When a user operation on control 1105 is detected, the handset can display an interface containing the user's nucleic acid test records in response to the operation. Upon detecting a user operation on control 1106, the handset may display an interface containing the user's vaccination records in response to the operation.
Thus, when a user enters an airport and needs to present information such as a health code, a nucleic acid detection record and the like, the user can click a corresponding control, and then the user can acquire the information. For example, when performing a security check into an airport, the user can click on control 1103. In response to the user operation, the cell phone may display a user interface containing the health code. The user may then present the health code to a security inspector.
The handset can then detect whether the current time has reached the takeoff time. When the takeoff time is reached, the mobile phone can close the health code card. When the current time is detected to reach the landing time, the mobile phone can display the health code card 1102 again. Details are not repeated here.
The cell phone may again display the health code card 1102 when it detects that the user has arrived at the destination airport. Here, the confirmation of whether the user arrives at the destination airport may refer to the description of fig. 2J, that is, the judgment of whether the user arrives at the destination airport by the current time and the current location.
When the mobile phone closes the flight card (or the high-speed rail card), the mobile phone can also close the health code card. The method for confirming closing of the flight card (or the high-speed rail card) by the mobile phone can refer to the description of fig. 5E, and is not described herein again.
Of course, the cell phone may also keep displaying the health code card throughout the period from preparing to board the aircraft until leaving the destination airport. That is, the mobile phone may choose not to close the health code card after detecting takeoff; then, after detecting arrival at the destination airport, the mobile phone does not need to display the health code card again. In that case, the mobile phone may close the health code card when it closes the flight card (or high-speed rail card).
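The two policies above (hide the health code card during the flight, or keep it visible until the flight card closes) can be summarized in a small decision function. The phase names and the `always_show` flag are assumptions for illustration, not terms from the patent:

```python
# Hedged sketch of the health code card's visibility across trip phases.
# With always_show=False the card is hidden at takeoff and re-shown at the
# destination airport; with always_show=True it stays visible for the whole
# trip and is only closed together with the flight card.
def health_code_visible(phase: str, always_show: bool = False) -> bool:
    """phase: one of 'before_airport', 'at_airport', 'in_flight',
    'at_destination', 'trip_closed' (illustrative names)."""
    if phase in ("at_airport", "at_destination"):
        return True
    if phase == "in_flight":
        return always_show
    return False  # before reaching the airport, or after the card is closed
```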
In other embodiments, after displaying the health code card, the cell phone may also determine whether to display the health code hover window based on whether the immersive application is currently running.
Specifically, referring to fig. 7, after the health code card is displayed, the mobile phone may detect whether an immersive application program is currently running (for the detection method, refer to the description of fig. 3F, which is not repeated here). After detecting that an immersive application is currently running, the mobile phone may display a health code floating window in the user interface of the immersive application. When a user operation acting on the health code floating window is detected, the mobile phone can display a user interface including the health code, vaccination records, nucleic acid test records, and the like.
In this way, while the mobile phone is running an immersive application, the user can quickly obtain information such as the health code through the health code floating window (for example, by clicking the floating window; the selection manner is the same as in the foregoing embodiments and is not repeated here), so that the code can be presented quickly and conveniently for inspection by security staff. This saves time and improves the user experience.
The application of the context-based notification display method in the work attendance punch-in scenario will be described below in conjunction with the user interfaces shown in FIGS. 8A-8B.
The mobile phone can acquire the current time and the current position. For the acquisition methods, refer to the foregoing description. From the current time and the current position, the mobile phone can judge whether the user arrives at the office area before the designated start-of-work time (i.e., on-duty punch-in) or leaves the office area after the designated end-of-work time (i.e., off-duty punch-out).
Within the preset time for completing the attendance punch-in, the mobile phone can display the punch-in notification in the punch card. The preset time for completing the punch-in may include: a default time set by the developer when designing the card, a time preset by the user, and a time obtained by the mobile phone learning the user's historical punch records.
Fig. 8A shows the user interface 81 of the cell phone displaying the on-duty punch-in notification in the punch card. The user interface 81 may comprise a card 1201 and a time indicator 1202. The card 1201 may include a button 1203.
Time indicator 1202 indicates that the current system time is 8:45 (the current time). The designated on-duty time is 9:00. That is, the user has arrived at the office area before the designated working hours. The designated working hours are preset.
Button 1203 may be used to receive the user's punch-in operation. When a user operation on the button 1203 is detected, the mobile phone can acquire the current position. When the current location is within the geographic range of the office area, the handset can confirm that the user arrived at the office area before the designated work hour. At this time, the mobile phone can generate the user's on-duty punch-in record; that is, the user has completed the on-duty punch-in.
After confirming that the user has completed the on-duty punch-in, the mobile phone can display the user interface shown in fig. 8B. At this point, button 1203 may indicate that the user has completed the punch-in. Optionally, button 1203 may also display the time the user completed the on-duty punch-in, e.g., "8:45".
In some embodiments, the cell phone may continue to display the card shown in fig. 8B after the user completes the on-duty punch-in. While the card continues to be displayed, the mobile phone can also display a switch control. When a user operation acting on the switch control is detected, the mobile phone can close the card in response to the operation. In other embodiments, the phone may close the card after displaying the interface shown in fig. 8B for a preset duration.
Then, the mobile phone can detect whether the off-duty punch-out condition is met. First, the mobile phone can judge from the current time whether the off-duty time has been reached; when the current time reaches or passes the off-duty time, the mobile phone can display the off-duty punch-out notification. Similarly, the off-duty time is preset.
Fig. 8C shows the user interface 83 for the handset to display the off-duty punch-out notification in the punch card. The user interface 83 may include a card 1201 and a time indicator 1212. The card 1201 may include a button 1213.
Time indicator 1212 indicates that the current system time is 17:00. The designated off-duty time is 17:00. That is, the current time already satisfies the off-duty punch-out condition. At this point, the mobile phone can detect whether the user performs the off-duty punch-out operation.
Button 1213 may be used to receive the user's punch-out operation. When the cellular phone detects a user operation acting on the button 1213, the cellular phone can acquire the current position. If the current position is confirmed to be within the geographical range of the office area, the off-duty punch-out succeeds. The mobile phone can generate the user's off-duty punch record; that is, the user has completed the off-duty punch-out.
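The on-duty and off-duty punch checks described above share the same shape: a time comparison plus an office-area geofence test. A minimal sketch, with an assumed record format and function names not taken from the patent:

```python
from datetime import time

# Hedged sketch of the punch checks: a record is generated only when the
# punch happens at the right time AND inside the office area. The dict
# record format is an assumption for illustration.
def punch_in(now: time, work_start: time, in_office: bool):
    """On-duty punch: must be in the office area at or before the start time."""
    if in_office and now <= work_start:
        return {"type": "on_duty", "time": now.isoformat(timespec="minutes")}
    return None

def punch_out(now: time, work_end: time, in_office: bool):
    """Off-duty punch: must be in the office area at or after the end time."""
    if in_office and now >= work_end:
        return {"type": "off_duty", "time": now.isoformat(timespec="minutes")}
    return None
```

Matching the text's examples, a punch at 8:45 before a 9:00 start succeeds, and a punch at exactly the 17:00 end time also succeeds.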
After confirming that the user has completed the off-duty punch-out operation, the mobile phone can display the user interface shown in fig. 8D. At this point, button 1213 may indicate that the user has completed the punch-out. Optionally, button 1213 may also display the time the user completed the off-duty punch-out, e.g., "17:00".
In some embodiments, if the current location is not within the geographic range of the office area after the designated off-duty time, the cell phone may close the off-duty punch-out notification. When the current position is again within the geographical range of the office area, the mobile phone can redisplay the off-duty punch-out notification. In other embodiments, if the user is not detected to have completed the off-duty punch-out operation, the mobile phone may also keep displaying the off-duty punch-out notification.
In an embodiment of the present application, the first card may be a card for displaying flight itineraries shown in fig. 2A-2Q; the first card can be the card for displaying the high-speed rail travel shown in fig. 6A-6D; the first card may also be a punch card as shown in fig. 8A-8B.
Taking a card showing a flight itinerary (flight card) as an example, the area 221 (left area) may be referred to as a first area, and the area 222 (right area) may be referred to as a second area.
Referring to the flight card shown in FIG. 2B, the content displayed in the first region may be referred to as first content, e.g., "Shenzhen Bao'an T3 8:00, Beijing Capital T3 11:20". The content displayed in the second area may be referred to as second content, such as "check-in expected to open at 18:00 today". At this time, the first state is the state in which the check-in forecast is displayed.
After the first state, the state in which the check-in reminder is displayed may be referred to as a second state. Referring to the user interface shown in FIG. 2C, in the second state, the content displayed in the first region can be referred to as first content, e.g., "Shenzhen Bao'an T3 8:00, Beijing Capital T3 11:20". The content displayed in the second area may be referred to as third content. At this time, the third content is "check-in is open" and the check-in/seat-selection button 231.
If the card shown in fig. 2C is in the first state, the card shown in fig. 2D can be referred to as the card in the second state. At this time, the content displayed in the right area in the card shown in fig. 2C may be referred to as second content, and the content displayed in the right area in the card shown in fig. 2D may be referred to as third content. The other states are the same.
The state in which the electronic device 100 displays the flight card shown in fig. 2I may be referred to as a third state. The content displayed in the area 221 of the card shown in FIG. 2I may be referred to as fourth content; the display content shown in the area 222 may be referred to as fifth content. In the user interfaces shown in fig. 3A to 3D, the content displayed in the notification bar (banner notification, lock screen notification, pull-down notification) may be referred to as sixth content.
The state in which the electronic device 100 displays the flight card shown in fig. 2B may be referred to as the state of displaying a check-in forecast. The state of the electronic device 100 displaying the flight card shown in fig. 2C may be referred to as the state of displaying a check-in reminder. The state of the electronic device 100 displaying the flight card shown in fig. 2D may be referred to as the state of prompting the user that check-in is open. The state of the electronic device 100 displaying the flight card shown in fig. 2E may be referred to as the state of displaying the check-in countdown. The state of the electronic device 100 displaying the flight card shown in fig. 2F may be referred to as the state of displaying the check-in counter.
The status of the electronic device 100 displaying the flight card shown in fig. 2G may be referred to as a status of displaying a departure alert. The state in which the electronic device 100 displays the flight card shown in fig. 2H may be referred to as a state in which the travel time is displayed.
The status of the electronic device 100 displaying the flight card shown in fig. 2L may be referred to as a status displaying a health code. The status of the electronic device 100 displaying the flight card shown in fig. 2I may be referred to as a status of starting to board an airplane. The status of the electronic device 100 displaying the flight card shown in fig. 2J may be referred to as displaying the status of the baggage carousel.
The state of the electronic device 100 displaying the high-speed rail card shown in fig. 6B may be referred to as the state of reminding the user that the high-speed rail is about to depart. The state of the electronic device 100 displaying the high-speed rail card shown in fig. 6C may be referred to as the state reflecting that the user has arrived at the high-speed rail station. The state of the electronic device 100 displaying the high-speed rail card shown in fig. 6D may be referred to as the state reflecting that the high-speed rail has departed.
Next, the hardware structure of the electronic device 100 will be described in the embodiment of the present application with reference to fig. 9.
In the embodiment of the present application, the electronic device 100 is described as a mobile phone. Alternatively, the electronic device 100 may also be a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device, and the embodiment of the present application does not particularly limit the specific type of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect earphones and play audio through them. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160, so that electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
In the embodiment of the present application, the capabilities provided by the mobile communication module 150 and the wireless communication module 160 for the electronic device 100 include: acquiring location data (the current location) via GPS, acquiring location data via cellular data, and upload and download services via cellular data, among others.
Electronic device 100 may determine the context in which the user is based on the location data and other status information (e.g., time, etc.). Further, the electronic device 100 may update the content presented in the card or other type of notification according to the context in which the user is located. The upload and download services may enable the electronic device 100 to obtain content presented in other types of notifications, such as cards.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
In the embodiment of the present application, the user interfaces shown in fig. 2A to 2J, fig. 3A to 3F, fig. 6A to 6D, fig. 7, and fig. 8A to 8B may be displayed through the GPU and the display screen 194.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when a photo is taken, the shutter opens, light is transmitted to the camera's photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin tone of the image by algorithm, and can optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it processes input information quickly and can also learn continuously by itself. Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The internal memory 121 may include one or more Random Access Memories (RAMs) and one or more non-volatile memories (NVMs).
The random access memory may include static random-access memory (SRAM), dynamic random-access memory (DRAM), synchronous dynamic random-access memory (SDRAM), double data rate synchronous dynamic random-access memory (DDR SDRAM), such as fifth generation DDR SDRAM generally referred to as DDR5 SDRAM, and the like;
the nonvolatile memory may include a magnetic disk storage device and a flash memory.
By operating principle, the flash memory may include NOR flash, NAND flash, 3D NAND flash, etc.; by the number of levels per memory cell, it may include single-level cells (SLC), multi-level cells (MLC), triple-level cells (TLC), quad-level cells (QLC), etc.; and by storage specification, it may include universal flash storage (UFS), embedded multimedia cards (eMMC), etc.
The random access memory may be read and written directly by the processor 110, may be used to store executable programs (e.g., machine instructions) of an operating system or other programs in operation, and may also be used to store data of users and applications, etc.
The nonvolatile memory may also store executable programs, data of users and application programs, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
The external memory interface 120 may be used to connect an external nonvolatile memory to extend the storage capability of the electronic device 100. The external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are saved in an external nonvolatile memory.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The electronic device 100 can be used to listen to music, or to a hands-free call, through the speaker 170A.
The receiver 170B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 100 answers a call or receives voice information, the voice can be heard by placing the receiver 170B close to the ear.
The microphone 170C, also called a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking close to it. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The headphone interface 170D is used to connect wired headphones. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations acting on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction to view messages is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction to create a new message is executed.
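The intensity-dependent dispatch in the example above can be sketched minimally as follows; the threshold value, the normalized intensity scale, and the instruction names are hypothetical, since the embodiment does not fix concrete values:

```python
# Hypothetical first pressure threshold on a normalized [0, 1] intensity scale.
FIRST_PRESSURE_THRESHOLD = 0.5

def dispatch_touch_on_sms_icon(intensity: float) -> str:
    """Map a touch on the SMS application icon to an instruction by intensity."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_messages"        # light press: view the message list
    return "compose_new_message"      # firm press: create a new message
```

The same touch position thus yields different operation instructions depending only on the measured pressure.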
The gyro sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for anti-shake photography. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate for according to the shake angle, and lets the lens counteract the shake of the electronic device 100 through reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used for navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates the altitude from the barometric pressure value measured by the air pressure sensor 180C, to assist positioning and navigation.
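The altitude calculation can be illustrated with the standard international barometric formula, which is one common way (not specified by this application) to convert a pressure reading into an altitude estimate:

```python
# Standard-atmosphere reference pressure at sea level, in hPa.
SEA_LEVEL_PRESSURE_HPA = 1013.25

def altitude_from_pressure(pressure_hpa: float,
                           p0: float = SEA_LEVEL_PRESSURE_HPA) -> float:
    """Estimate altitude in metres from barometric pressure using the
    international barometric formula: h = 44330 * (1 - (p/p0)^(1/5.255))."""
    return 44330.0 * (1.0 - (pressure_hpa / p0) ** (1.0 / 5.255))
```

A pressure of about 900 hPa, for example, corresponds to roughly one kilometre above sea level under standard conditions.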
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening can then be set according to the detected opening and closing state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically three axes), and can detect the magnitude and direction of gravity when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, for applications such as landscape/portrait switching and pedometers.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scene, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear during a call, and automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature handling policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
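The tiered temperature handling policy described above can be sketched as follows; the three thresholds and the action names are hypothetical placeholders, as the embodiment does not disclose concrete values:

```python
# Hypothetical thresholds, in degrees Celsius.
HIGH_TEMP_C = 45      # above this: throttle the nearby processor
LOW_TEMP_C = 0        # below this: heat the battery
CRITICAL_LOW_C = -10  # below this: also boost the battery output voltage

def thermal_actions(temp_c: float) -> list:
    """Return the temperature-handling actions triggered at this temperature."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("throttle_processor")     # reduce power, thermal protection
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")           # avoid low-temperature shutdown
    if temp_c < CRITICAL_LOW_C:
        actions.append("boost_battery_voltage")  # keep output voltage stable
    return actions
```

Note that the two low-temperature actions are cumulative: below the critical threshold the device both heats the battery and boosts its output voltage.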
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
In the embodiment of the present application, the touch sensor 180K may support operations such as clicking and sliding performed by the user on the screen, and in response to the operations, the electronic device 100 may display a user interface corresponding to the operations. For example, referring to fig. 2I, touch sensor 180K may detect a user operation by a user on button 251, and in response to the operation, electronic device 100 may display a user interface including an electronic boarding pass.
The bone conduction sensor 180M may acquire vibration signals. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the bone mass vibrated by the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset to form a bone conduction headset. The audio module 170 may parse a voice signal from the vibration signal of the bone mass acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse heart rate information from the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (such as time reminders, receiving messages, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
By implementing the method provided by the embodiments of this application, an electronic device such as a mobile phone can divide the content in one notification into different parts according to the user's different contexts, and then selectively present part of the content in the notification according to the user's current context. In this way, the user can obtain rich and comprehensive information, and can quickly and accurately obtain the information most relevant in the current context, avoiding the clutter and inconvenience caused by stacking information.
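As a minimal sketch of this core idea (not part of the patent's disclosure), one notification's content can be partitioned so that only the subset relevant to the current context is presented; all field values and context names below are hypothetical:

```python
# One flight notification holds all fields; only a context-relevant
# subset is shown at a time.
FLIGHT_NOTIFICATION = {
    "flight_number": "CA1234",   # always shown (first area)
    "departure_time": "12:00",
    "check_in_counter": "K11",   # relevant before departure
    "boarding_gate": "C21",      # relevant at the airport
    "baggage_carousel": "7",     # relevant after arrival
}

# Partition of the notification's fields by context.
VISIBLE_FIELDS = {
    "check_in": ["flight_number", "departure_time", "check_in_counter"],
    "ready_to_board": ["flight_number", "boarding_gate"],
    "arrived": ["flight_number", "baggage_carousel"],
}

def visible_content(context: str) -> dict:
    """Return only the part of the notification matching the context."""
    return {k: FLIGHT_NOTIFICATION[k] for k in VISIBLE_FIELDS[context]}
```

The user always sees the same underlying notification, but the check-in counter, boarding gate, and baggage carousel each surface only in the context where they matter.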
The term "user interface (UI)" in the specification, claims, and drawings of this application refers to a medium interface for interaction and information exchange between an application program or operating system and a user; it implements the conversion between an internal form of information and a form acceptable to the user. The user interface of an application program is source code written in a specific computer language such as Java or the extensible markup language (XML); the interface source code is parsed and rendered on the terminal device and is finally presented as content that the user can recognize, such as pictures, text, buttons, and other controls. A control, also called a widget, is a basic element of a user interface; typical controls include a toolbar, menu bar, text box, button, scrollbar, picture, and text. The properties and contents of the controls in an interface are defined by tags or nodes; for example, XML defines the controls contained in an interface by nodes such as <Textview>, <ImgView>, and <VideoView>. A node corresponds to a control or property in the interface, and after parsing and rendering the node is presented as user-visible content. In addition, the interfaces of many applications, such as hybrid applications, usually include web pages. A web page, also called a page, may be understood as a special control embedded in an application interface. A web page is source code written in a specific computer language, such as the hypertext markup language (HTML), cascading style sheets (CSS), and JavaScript (JS); the web page source code can be loaded and displayed as user-recognizable content by a browser or a web page display component with browser-like functionality.
The specific content contained in a web page is also defined by tags or nodes in its source code; for example, HTML defines the elements and attributes of a web page by <p>, <img>, <video>, and <canvas>.
A commonly used presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
As used in the specification of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the listed items. As used in the above embodiments, the term "when" may be interpreted to mean "if", "after", "in response to determining", or "in response to detecting", depending on the context. Similarly, depending on the context, the phrase "when it is determined" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined", "in response to determining", "upon detecting (the stated condition or event)", or "in response to detecting (the stated condition or event)".
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), among others.
Those of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as ROM, RAM, magnetic disks, or optical disks.

Claims (19)

1. A notification display method applied to electronic equipment is characterized by comprising the following steps:
displaying a first user interface, wherein a first card is displayed on the first user interface, the first card comprises a first area and a second area, the first area is not overlapped with the second area, the first area displays first content, the second area displays second content, the second content is used for indicating that the state of a user is a first state, and the second content is associated with the first content;
detecting that the state of the user is a second state, wherein the second state is different from the first state;
in response to the state of the user being the second state, the first area displays the first content, the second area displays third content, the third content is used for indicating that the state of the user is the second state, the third content is different from the second content, and the third content is associated with the first content;
wherein the first state and the second state are states associated with a geographic location, and/or a time, at which the user is located.
2. The method of claim 1, further comprising:
detecting that the state of the user is a third state, wherein the third state is different from the second state;
in response to the state of the user being the third state, the first area displays fourth content, the second area displays fifth content, the fourth content is different from the first content, the fifth content is different from the third content, the fifth content is used for indicating that the state of the user is the third state, and the fifth content is associated with the fourth content;
wherein the fourth content is associated with the first content, the display form of the fourth content is different from that of the first content, and the text content of the fourth content is the same as that of the first content;
the third state is a state associated with the geographic location, and/or time, at which the user is located.
3. The method according to claim 1 or 2, wherein after detecting that the state of the user is the second state, the method further comprises:
displaying a message notification, wherein the message notification is one or more of a banner notification, a lock screen notification, and a pull-down notification, and the message notification includes sixth content, and the sixth content is used for indicating that the state of the user is the second state, and is associated with third content.
4. The method according to any one of claims 1-3, wherein after detecting that the state of the user is the second state, the method further comprises: displaying a second card on the first user interface, the second card including a control for obtaining a health code, and/or a nucleic acid test record, and/or a vaccination record.
5. The method of any of claims 1-4, wherein the first card is an airline card;
the first state, the second state, and the third state are respectively one of: a state of reminding the user to check in, a state of reminding the user to go to the airport, a state reflecting that the user is ready to board, and a state reflecting that the user has arrived at a destination.
6. The method of claim 5,
the state of reminding the user to check in comprises: a state of displaying a check-in forecast, a state of prompting the user to check in, a state of displaying a check-in countdown, or a state of displaying a check-in counter; or
the state of reminding the user to go to the airport comprises: a state of displaying a departure reminder or a state of displaying the journey time; or
the state reflecting that the user is ready to board comprises: a state of displaying the health code, a state of indicating that boarding has started, or a state of displaying the boarding gate; or
the state reflecting that the user has arrived at the destination comprises: a state of displaying the luggage carousel, a state of displaying the hotel location, or a state of displaying tourist attractions.
7. The method according to any one of claims 1 to 6,
the first content comprises: one or more of a flight number, a travel date, a departure place, a departure time, a destination, and an arrival time of the flight; or
the second content, the third content, and the fifth content respectively comprise: one or more of a check-in forecast, a check-in prompt, a checked-in prompt, a check-in countdown, a check-in counter, a departure reminder, a journey time, a seat number, a boarding gate, a luggage carousel, a hotel location, and a tourist attraction of the flight; or
the fourth content comprises: one or more of a flight number, a travel date, a departure place, a departure time, a destination, an arrival time, a check-in counter, a seat number, and a boarding gate of the flight.
8. The method according to any one of claims 1-7, further comprising:
when the first state, the second state, or the third state is the state reflecting that the user is ready to board,
determining that the electronic equipment is running an immersive application program, and displaying a floating window, wherein the floating window is used for displaying boarding reminders; wherein the immersive application is one or more of a video-class application, a game-class application, a music-class application, or a call-class application.
9. The method of claim 8, wherein after displaying the floating window, the method further comprises:
detecting a first operation of a user on the floating window;
displaying the electronic boarding pass in response to the first operation, wherein the first operation is one of a click operation, a long-press operation, a sliding operation, or a voice control operation.
10. The method according to any one of claims 1 to 4,
the first card is a card for displaying high-speed rail travel;
the first state, the second state, and the third state are respectively one of: a state of reminding the user that the high-speed rail is about to depart, a state reflecting that the user has arrived at the high-speed rail station, and a state reflecting that the high-speed rail has departed.
11. The method of claim 10,
the first content includes: one or more of train number, trip date, departure place, departure time, destination, arrival time;
the second content, the third content, and the fifth content respectively include: one or more of a departure reminder, a car number, a seat number, and a ticket gate;
the fourth content includes: one or more of train number, journey date, departure place, departure time, destination, arrival time, departure reminder, carriage number, seat number, ticket gate.
12. The method according to claim 10 or 11, characterized in that the method further comprises:
when the first state, the second state, or the third state is the state reflecting that the user has arrived at the high-speed rail station,
determining that the electronic equipment is running an immersive application program, and displaying a floating window, wherein the floating window is used for displaying an electronic ticket two-dimensional code; wherein the immersive application is one or more of a video-class application, a game-class application, a music-class application, or a call-class application.
13. The method of claim 12, wherein after displaying the floating window, the method further comprises:
detecting a first operation of a user on the floating window; and
displaying the two-dimensional code of the electronic ticket in response to the first operation, wherein the first operation is one of a click operation, a long-press operation, a slide operation, or a voice control operation.
14. The method of claim 1 or 3, wherein the first card is a work punch-card.
15. The method of claim 14, wherein the first state is a state for displaying an on-duty punch-card reminder, and the second state is a state for displaying an off-duty punch-card reminder.
16. The method of claim 15, wherein the first content comprises an on-duty punch-card reminder; the second content comprises one or more of an on-duty punch-card control, an on-duty punch-card time, or whether the user has punched the card; and the third content comprises one or more of an off-duty punch-card control, an off-duty punch-card time, or whether the user has punched the card.
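The punch-card states of claims 14-16 amount to a two-state switch over the card's content; the sketch below models it. All names and reminder strings are illustrative assumptions, not terms from the patent.

```python
# Illustrative sketch of claims 14-16: a work punch-card whose content
# switches between on-duty and off-duty states.

def punch_card_content(state: str, has_punched: bool) -> dict:
    """Return the card content for the given punch-card state."""
    if state == "on_duty_reminder":
        return {"reminder": "time to punch in",    # first/second content
                "control": "punch_in_button",
                "punched": has_punched}
    if state == "off_duty_reminder":
        return {"reminder": "time to punch out",   # third content
                "control": "punch_out_button",
                "punched": has_punched}
    raise ValueError(f"unknown punch-card state: {state}")
```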
17. An electronic device comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors for storing computer program code comprising computer instructions which, when executed by the one or more processors, cause performance of the method recited in any of claims 1-16.
18. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause performance of the method of any of claims 1-16.
19. A computer program product comprising instructions for causing an electronic device to perform the method of any one of claims 1-16 when the computer program product is run on the electronic device.
CN202110606593.6A 2021-05-28 2021-05-28 Context-based notification display method and apparatus Pending CN113722029A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202110606593.6A CN113722029A (en) 2021-05-28 2021-05-28 Context-based notification display method and apparatus
CN202210679572.1A CN115334193B (en) 2021-05-28 2021-05-28 Notification display method and device based on situation
PCT/CN2022/073332 WO2022247326A1 (en) 2021-05-28 2022-01-22 Notification display method and apparatus based on scenario

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110606593.6A CN113722029A (en) 2021-05-28 2021-05-28 Context-based notification display method and apparatus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210679572.1A Division CN115334193B (en) 2021-05-28 2021-05-28 Notification display method and device based on situation

Publications (1)

Publication Number Publication Date
CN113722029A true CN113722029A (en) 2021-11-30

Family

ID=78672841

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210679572.1A Active CN115334193B (en) 2021-05-28 2021-05-28 Notification display method and device based on situation
CN202110606593.6A Pending CN113722029A (en) 2021-05-28 2021-05-28 Context-based notification display method and apparatus

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210679572.1A Active CN115334193B (en) 2021-05-28 2021-05-28 Notification display method and device based on situation

Country Status (2)

Country Link
CN (2) CN115334193B (en)
WO (1) WO2022247326A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022247326A1 (en) * 2021-05-28 2022-12-01 荣耀终端有限公司 Notification display method and apparatus based on scenario
CN114489435A (en) * 2021-12-20 2022-05-13 广东乐心医疗电子股份有限公司 Area display method and device and electronic equipment
CN115659069A (en) * 2022-12-28 2023-01-31 荣耀终端有限公司 Card punching recommendation method and device and terminal equipment
CN116320141A (en) * 2023-05-24 2023-06-23 荣耀终端有限公司 Method for recommending card punching, electronic equipment and computer readable storage medium
CN116320141B (en) * 2023-05-24 2023-10-20 荣耀终端有限公司 Method for recommending card punching, electronic equipment and computer readable storage medium

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN116506547B (en) * 2023-06-30 2023-10-24 荣耀终端有限公司 Information prompting method, electronic equipment and readable storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
CN105916106A (en) * 2015-10-30 2016-08-31 乐视移动智能信息技术(北京)有限公司 Flight information display method based on mobile phone screen and flight information display system thereof
CN108089832A (en) * 2017-12-18 2018-05-29 携程旅游网络技术(上海)有限公司 Flight dynamic information methods of exhibiting, system, equipment and storage medium
CN109691072A (en) * 2016-09-09 2019-04-26 华为技术有限公司 For the method, apparatus of sending out notice, mobile terminal and graphic user interface
CN112333240A (en) * 2020-10-13 2021-02-05 珠海格力电器股份有限公司 Message push display method and device, readable storage medium and computer equipment

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
WO2007091331A1 (en) * 2006-02-10 2007-08-16 Fujitsu Limited Information display device, information display method, and program
CN104423934A (en) * 2013-08-25 2015-03-18 上海莞东拿信息科技有限公司 Android platform system based journey flight dynamic notification system and method
CN108153601B (en) * 2018-01-05 2021-05-18 北京小米移动软件有限公司 Method and device for outputting notification information
CN109819410A (en) * 2019-03-18 2019-05-28 北京小米移动软件有限公司 Short message display method, device and storage medium
CN110543287B (en) * 2019-08-01 2023-07-18 华为技术有限公司 Screen display method and electronic equipment
CN115334193B (en) * 2021-05-28 2023-10-31 荣耀终端有限公司 Notification display method and device based on situation

Also Published As

Publication number Publication date
CN115334193B (en) 2023-10-31
WO2022247326A1 (en) 2022-12-01
CN115334193A (en) 2022-11-11

Similar Documents

Publication Publication Date Title
CN115334193B (en) Notification display method and device based on situation
CN113741781B (en) Notification display method and electronic equipment
CN113722028B (en) Dynamic card display method and device
CN108595634B (en) Short message management method and device and electronic equipment
CN113722581B (en) Information pushing method and electronic equipment
CN107967154A (en) Remind item generation method and device
WO2022088938A1 (en) Sleep monitoring method and apparatus, and electronic device and computer-readable storage medium
WO2022152024A1 (en) Widget display method and electronic device
CN111222836A (en) Arrival reminding method and related device
WO2022022335A1 (en) Method and apparatus for displaying weather information, and electronic device
EP4131032A1 (en) Reminding method and related apparatus
WO2021147653A1 (en) Information sharing method for scenario-intelligence service and related device
WO2022037398A1 (en) Audio control method, device, and system
CN115688743B (en) Short message analysis method and related electronic equipment
WO2022247383A1 (en) Prompt method, graphical user interface, and related apparatus
WO2022121600A1 (en) Activity recognition method, display method, and electronic device
WO2022083328A1 (en) Content pushing method and apparatus, storage medium, and chip system
CN114465975B (en) Content pushing method, device, storage medium and chip system
CN114822543A (en) Lip language identification method, sample labeling method, model training method, device, equipment and storage medium
US20240134491A1 (en) Notification Display Method and Electronic Device
WO2024041180A1 (en) Path planning method and apparatus
CN113507406B (en) Message management method and related equipment
WO2024001906A1 (en) Terminal scene identifying method and device, terminal, storage medium and program product
CN117909536A (en) Photo pushing method and related device
CN115482440A (en) Sample data acquisition method, model training method, electronic device, and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination