CN115563338B - Photo pushing method and related device - Google Patents
- Publication number
- CN115563338B CN115563338B CN202210509811.9A CN202210509811A CN115563338B CN 115563338 B CN115563338 B CN 115563338B CN 202210509811 A CN202210509811 A CN 202210509811A CN 115563338 B CN115563338 B CN 115563338B
- Authority
- CN
- China
- Prior art keywords
- user
- arrival
- interface
- place
- photo
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/587—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/53—Querying
- G06F16/535—Filtering based on additional data, e.g. user or group profiles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/958—Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Library & Information Science (AREA)
- Telephone Function (AREA)
Abstract
The photo pushing method and related device of the present application help generate, in a timely manner, the moments album associated with a user's arrival location, improving the user experience. The method comprises the following steps: obtaining the arrival location of a first trip of a user; and displaying a first notification message on a first interface for notifying the user to view a first photo set associated with the arrival location, the first photo set being generated before the user arrives at the arrival location.
Description
The present application claims priority to Chinese patent application No. 202210116581.X, entitled "Data display method, apparatus, storage medium and program product", filed with the China National Intellectual Property Office on February 7, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of terminals, and in particular to a photo pushing method and a related device.
Background
With the continuous development of technology, terminal devices offer more and more features that enrich the user experience. Currently, the gallery of a terminal device provides a "moments" feature: after the user takes photos, the gallery analyzes them, automatically clusters them based on time, location, and other information, and generates moments albums, so that the user's memories can be presented in a more appealing form.
Because generating a moments album consumes considerable power, strict generation conditions must be met. For example, the terminal device performs intelligent analysis of the photos in the gallery only when the screen is off and the device is charging, and generates the moments album under those conditions.
In one possible scenario, a user travels from a departure location to an arrival location. After the user arrives, the terminal device determines that the user's geographic location has changed, and may generate, at the arrival location, an album of photos the user took there before this trip. However, the terminal device may fail to meet the above generation conditions at the arrival location; for example, the user's stay is short and the device is never charged there. The moments album therefore cannot be generated in time, and the user experience suffers.
Disclosure of Invention
The photo pushing method and related device of the present application help generate, in a timely manner, the moments album associated with a user's arrival location, improving the user experience.
In a first aspect, a photo pushing method applied to a terminal device is provided, comprising: obtaining the arrival location of a first trip of a user; and displaying a first notification message on a first interface for notifying the user to view a first photo set associated with the arrival location, the first photo set being generated before the user arrives at the arrival location.
In the present application, the first trip is an outbound scenario; the user may, for example, purchase a plane ticket in advance before leaving the departure location. The terminal device may obtain the arrival location of the user's first trip from the user's ticket information, so that it can generate the first photo set, which may also be referred to as a moments album, before the user arrives at the arrival location, and push the first photo set to the user.
It should be understood that in the present application the terminal device obtains the arrival location of the current trip in advance from the user's itinerary information, rather than determining that the user has reached the destination from a change in the user's geographic location. The moments album associated with the arrival location can therefore be generated in a more timely manner, improving the user experience.
It should be noted that because generating a moments album consumes considerable power, strict generation conditions must be met; for example, the terminal device performs intelligent analysis of the photos in the gallery only when the screen is off and the device is charging, and only then generates the moments album. In the present application, the user's departure location is the user's place of residence, and the generation conditions are far more likely to be satisfied at the place of residence than at the arrival location. The terminal device can therefore generate the moments album for the user in time, before the user arrives at the arrival location, improving the flexibility of album generation.
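As an illustrative sketch (not part of the claimed method), the strict generation conditions above can be expressed as a simple predicate; `DeviceState` and its field names are assumptions introduced for illustration:

```python
from dataclasses import dataclass


@dataclass
class DeviceState:
    screen_off: bool  # the display is off
    charging: bool    # the device is connected to power


def meets_generation_conditions(state: DeviceState) -> bool:
    # Intelligent analysis of gallery photos (and hence moments-album
    # generation) runs only while the screen is off and the device is
    # charging, as the description above requires.
    return state.screen_off and state.charging
```

These conditions are far more likely to hold overnight at the place of residence than during a short stay at the arrival location, which is why pre-trip generation is feasible.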
With reference to the first aspect, in certain implementations of the first aspect, before displaying the first notification message on the first interface, the method further includes: determining whether the arrival location is the user's place of residence; and if the arrival location is a location other than the user's place of residence, determining to generate the first photo set associated with the arrival location.
In the present application, after the terminal device obtains the arrival location of the first trip, it can determine whether the arrival location is the user's place of residence. If the arrival location is a location other than the place of residence, the terminal device may generate the first photo set from the photos in the gallery that are associated with the arrival location.
A precondition for the terminal device to generate the first photo set associated with the arrival location is that the user has previously visited the arrival location and that photos taken by the user there are stored in the gallery of the terminal device.
With reference to the first aspect, in certain implementations of the first aspect, before determining to generate the first photo set associated with the arrival location, the method further includes: obtaining the departure time of the first trip. Determining to generate the first photo set associated with the arrival location includes: generating the first photo set before the departure time.
In the present application, the terminal device can obtain the departure time of the user's current trip and generate the first photo set before that time. For example, the terminal device may preset a generation time that is determined from the departure time and is earlier than it; when the generation time arrives, the terminal device is triggered to generate the first photo set.
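The scheduling described above can be sketched as follows; the 12-hour lead interval is an assumed value for illustration, not one specified in the application:

```python
from datetime import datetime, timedelta


def generation_time(departure_time: datetime,
                    lead: timedelta = timedelta(hours=12)) -> datetime:
    # The preset generation time is determined from the departure time
    # and is strictly earlier than it.
    return departure_time - lead


def should_trigger(now: datetime, departure_time: datetime) -> bool:
    # Trigger album generation once the preset generation time arrives,
    # while the user has not yet departed.
    return generation_time(departure_time) <= now < departure_time
```

In practice the trigger would also be gated on the device-state conditions (screen off, charging) discussed earlier.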
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: the departure point of the first trip is obtained. Displaying a first notification message on a first interface, including: a first notification message is displayed on a first interface before a user issues from a departure point.
In the present application, the terminal device can push the first photo set to the user before the user leaves the departure location, so that the user can browse the first photo set in advance and relive the time previously spent at the arrival location, improving the user experience.
With reference to the first aspect, in certain implementations of the first aspect, displaying the first notification message on the first interface includes: displaying the first notification message on the first interface after the user arrives at the arrival location.
With reference to the first aspect, in certain implementations of the first aspect, before displaying the first notification message on the first interface, the method further includes: displaying a second notification message on a second interface for notifying the user to view itinerary information, the itinerary information including one or more of: the departure time, the departure location, and the arrival location.
With reference to the first aspect, in certain implementations of the first aspect, the itinerary information is determined based on ticket information or calendar information.
With reference to the first aspect, in certain implementations of the first aspect, after obtaining the arrival location of the user's first trip, the method further includes: obtaining, from a database, the storage paths of target photos in the gallery, the target photos including photos taken by the user at the arrival location on historical trips; retrieving the target photos from the gallery based on the storage paths; and generating the first photo set based on the target photos.
In the present application, the first photo set includes the target photos, i.e., photos taken by the user at the arrival location on historical trips.
Optionally, the first photo set may further include photos related to the arrival location that the terminal device obtains from the network side, for example photos of the arrival location's landmark attractions, buildings, local food, and the like.
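As a minimal sketch of the retrieval step above, assuming an SQLite analysis database with an illustrative `photo_meta(path, place)` table (the schema is not specified in the application):

```python
import sqlite3


def build_first_photo_set(db_path, arrival):
    # Obtain, from the database, the gallery storage paths of the target
    # photos taken at the arrival location on historical trips; the
    # returned paths are the basis of the first photo set.
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT path FROM photo_meta WHERE place = ?", (arrival,)
        ).fetchall()
    finally:
        con.close()
    return [path for (path,) in rows]
```

Looking paths up in a pre-built database avoids re-running full gallery analysis at generation time, which is what makes pre-departure generation cheap.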
With reference to the first aspect, in certain implementations of the first aspect, the aesthetic score of each target photo is above a preset threshold.
In the present application, the terminal device can perform aesthetic assessment of the photos taken by the user at the arrival location on historical trips, and generate the first photo set from the photos whose aesthetic scores are above the preset threshold.
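The threshold filter can be sketched as below; `score_fn` stands in for the device's aesthetic-scoring model, and the threshold of 0.6 is an assumed value:

```python
def filter_by_aesthetics(photos, score_fn, threshold=0.6):
    # Keep only the photos whose aesthetic score is above the preset
    # threshold; the kept photos are used to generate the photo set.
    return [p for p in photos if score_fn(p) > threshold]
```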
With reference to the first aspect, in certain implementations of the first aspect, before obtaining the storage paths of the target photos in the gallery from the database, the method further includes: determining the target photos based on the shooting-location information of a plurality of photos in the gallery.
In the present application, the terminal device can analyze the information of a plurality of photos in the gallery to obtain their shooting-location information, and thereby determine the photos taken at the arrival location as the target photos.
With reference to the first aspect, in certain implementations of the first aspect, before determining the target photos based on the shooting-location information of the plurality of photos in the gallery, the method further includes: obtaining the shooting-location information of a plurality of photos in the gallery, the plurality of photos including the target photos; and storing the shooting-location information of the plurality of photos in the database.
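The storage step can be sketched with the same assumed SQLite schema used above (`photo_meta(path, place)` is illustrative, not taken from the application):

```python
import sqlite3


def record_shooting_locations(db_path, path_to_place):
    # Persist the shooting-location information obtained from gallery
    # analysis, keyed by storage path, so that the target photos can
    # later be selected without re-analyzing the gallery.
    con = sqlite3.connect(db_path)
    try:
        con.execute("CREATE TABLE IF NOT EXISTS photo_meta "
                    "(path TEXT PRIMARY KEY, place TEXT)")
        con.executemany("INSERT OR REPLACE INTO photo_meta VALUES (?, ?)",
                        path_to_place.items())
        con.commit()
    finally:
        con.close()
```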
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: acquiring a departure place of a second journey of a user; and displaying a third notification message on the third interface, wherein the third notification message is used for notifying a user to view a second photo set associated with the departure place, and the second photo set is generated before the user departs from the departure place.
In the present application, the second trip is a return scenario: the departure location of the user's return trip is the same place as the arrival location of the outbound trip. Likewise, the arrival location of the return trip is the same place as the departure location of the outbound trip, namely the user's place of residence.
In the return scenario, the terminal device can obtain the departure location from the user's return itinerary information, generate a second photo set before the user leaves the departure location, and push the second photo set to the user. The second photo set is the photo set associated with the departure location, and may also be referred to as a moments album.
It can be understood that in the present application the terminal device obtains the departure location of the current trip in advance from the user's itinerary information, so that the moments album associated with the departure location can be generated in a more timely manner, improving the user experience.
With reference to the first aspect, in certain implementations of the first aspect, before displaying the third notification message on the third interface, the method further includes: obtaining the arrival location of the second trip; determining whether the arrival location is the user's place of residence; and if the arrival location is the user's place of residence, determining to generate the second photo set associated with the departure location.
In the present application, the terminal device can obtain the arrival location of the user's current trip. If the arrival location is the user's place of residence, the terminal device can infer that the user is about to return home, and therefore that the current departure location is a place other than the place of residence; the terminal device can accordingly generate the second photo set associated with the departure location for the user.
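The outbound/return branching described above can be sketched as a single decision; the names are illustrative:

```python
def album_location(departure, arrival, residence):
    # Outbound trip (leaving the residence): pre-generate the photo set
    # for the arrival location (the first photo set).
    if departure == residence and arrival != residence:
        return arrival
    # Return trip (heading back to the residence): pre-generate the
    # photo set for the departure location (the second photo set).
    if arrival == residence and departure != residence:
        return departure
    # Neither scenario applies: nothing to pre-generate.
    return None
```

In both branches the returned location is one other than the place of residence, matching the observation below.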
It should be appreciated that in both the outbound scenario and the return scenario, the terminal device generates a photo set associated with a location other than the user's place of residence.
In a second aspect, an apparatus for pushing photos is provided, configured to perform the method in any of the possible implementations of the first aspect. In particular, the apparatus comprises units for performing the method in any of the possible implementations of the first aspect.
In a third aspect, there is provided another apparatus for pushing photographs, comprising a processor and a memory, the processor being coupled to the memory, the memory being operable to store a computer program, the processor being operable to invoke and execute the computer program in the memory to implement a method according to any of the possible implementations of the first aspect.
In one implementation, the apparatus for pushing photos is a terminal device. When the apparatus is a terminal device, the communication interface may be a transceiver or an input/output interface.
In another implementation, the apparatus for pushing photos is a chip configured in a terminal device. When the apparatus is a chip configured in a terminal device, the communication interface may be an input/output interface.
In a fourth aspect, there is provided a processor comprising: input circuit, output circuit and processing circuit. The processing circuitry is configured to receive signals via the input circuitry and to transmit signals via the output circuitry such that the processor performs the method of any one of the possible implementations of the first aspect described above.
In a specific implementation, the processor may be a chip, the input circuit may be an input pin, the output circuit may be an output pin, and the processing circuit may be transistors, gate circuits, flip-flops, various logic circuits, and the like. The input signal received by the input circuit may, for example and without limitation, be received and fed in by a receiver, and the signal output by the output circuit may, for example and without limitation, be output to and transmitted by a transmitter. The input circuit and the output circuit may also be the same circuit, acting as the input circuit and the output circuit at different times. The present application does not limit the specific implementation of the processor and the various circuits.
In a fifth aspect, a processing device is provided that includes a processor and a memory. The processor is configured to read instructions stored in the memory and to receive signals via the receiver and to transmit signals via the transmitter to perform the method of any one of the possible implementations of the first aspect.
Optionally, the processor is one or more and the memory is one or more.
Alternatively, the memory may be integrated with the processor or the memory may be separate from the processor.
In a specific implementation process, the memory may be a non-transient (non-transitory) memory, for example, a Read Only Memory (ROM), which may be integrated on the same chip as the processor, or may be separately disposed on different chips.
It should be appreciated that in the related data interaction process, for example, transmitting indication information may be a process of the processor outputting the indication information, and receiving capability information may be a process of the processor receiving the input capability information. Specifically, data output by the processor may be output to the transmitter, and input data received by the processor may come from the receiver. The transmitter and the receiver may be collectively referred to as a transceiver.
The processing apparatus in the fifth aspect may be a chip, and the processor may be implemented by hardware or by software. When implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented by software, the processor may be a general-purpose processor that operates by reading software code stored in a memory, and the memory may be integrated in the processor or may exist separately outside the processor.
In a sixth aspect, there is provided a computer program product comprising: computer program code which, when run, causes a computer to perform the method of any one of the possible implementations of the first aspect described above.
In a seventh aspect, a computer readable storage medium is provided, the computer readable storage medium storing a computer program which, when executed, causes a computer to perform the method of any one of the possible implementations of the first aspect.
Drawings
FIG. 1 is a schematic structural diagram of a terminal device to which an embodiment of the present application is applicable;
FIG. 2 is a schematic diagram of an interface for pushing photos according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an interface for viewing a moments album according to an embodiment of the present application;
FIG. 4 is a schematic diagram of another interface for viewing a moments album according to an embodiment of the present application;
FIG. 5 is a schematic diagram of yet another interface for viewing a moments album according to an embodiment of the present application;
FIG. 6 is a schematic diagram of yet another interface for viewing a moments album according to an embodiment of the present application;
FIG. 7 is a schematic diagram of another interface for pushing photos according to an embodiment of the present application;
FIG. 8 is a schematic flowchart of a method for pushing photos according to an embodiment of the present application;
FIG. 9 is a schematic diagram of pushing photos according to an embodiment of the present application;
FIG. 10 is a schematic flowchart of another method for pushing photos according to an embodiment of the present application;
FIG. 11 is a schematic flowchart of yet another method for pushing photos according to an embodiment of the present application;
FIG. 12 is a schematic flowchart of yet another method for pushing photos according to an embodiment of the present application;
FIG. 13 is a schematic flowchart of yet another method for pushing photos according to an embodiment of the present application;
FIG. 14 is a schematic block diagram of an apparatus for pushing photos according to an embodiment of the present application;
FIG. 15 is a schematic block diagram of another apparatus for pushing photos according to an embodiment of the present application.
Detailed Description
The technical scheme of the application will be described below with reference to the accompanying drawings.
In order to clearly describe the technical solutions of the embodiments of the present application, the words "first", "second", and the like are used in the embodiments to distinguish between identical or similar items having substantially the same function and effect. For example, the first notification message and the second notification message merely distinguish different notification messages, without limiting their order. Those skilled in the art will appreciate that the words "first", "second", and the like neither limit the number or order of execution nor indicate that the items necessarily differ.
In the present application, the words "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
Furthermore, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following" and similar expressions mean any combination of the listed items, including any combination of single items or plural items. For example, "at least one of a, b, and c" may represent: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, and c may each be singular or plural.
Fig. 1 is a schematic structural diagram of a terminal device to which an embodiment of the present application is applicable. As shown in fig. 1, the terminal device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset interface 170D, a sensor 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (SIM) card interface 195, and the like. It is to be understood that the configuration illustrated in this embodiment does not constitute a specific limitation on the terminal device 100. In other embodiments of the present application, the terminal device 100 may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, a display processing unit (DPU), and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors. In some embodiments, the terminal device 100 may also include one or more processors 110. The processor can be the nerve center and command center of the terminal device 100. The processor can generate operation control signals according to instruction operation codes and timing signals, and complete the control of instruction fetching and instruction execution. A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. Repeated accesses are thus avoided, the waiting time of the processor 110 is reduced, and the efficiency of the terminal device 100 is improved.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a USB interface, among others. The USB interface 130 is an interface conforming to the USB standard, and may specifically be a Mini USB interface, a Micro USB interface, a USB type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, or may be used to transfer data between the terminal device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset.
It should be understood that the interfacing relationship between the modules illustrated in the embodiment of the present application is merely a schematic illustration and does not constitute a structural limitation of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt an interfacing manner different from those in the foregoing embodiments, or a combination of multiple interfacing manners.
The wireless communication function of the terminal device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the terminal device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for 2G/3G/4G/5G wireless communication applied to the terminal device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier, etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processing such as filtering and amplifying on the received electromagnetic waves, and transmit the processed waves to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into an electromagnetic wave for radiation through the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate the low-frequency baseband signal to be transmitted into a medium- or high-frequency signal. The demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or another functional module.
The wireless communication module 160 may provide solutions for wireless communication applied to the terminal device 100, including wireless local area network (WLAN), Bluetooth, global navigation satellite system (GNSS), frequency modulation (FM), NFC, infrared (IR) technology, etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency demodulation and filtering on the electromagnetic wave signal, and transmits the processed signal to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 and the mobile communication module 150 of the terminal device 100 are coupled, and the antenna 2 and the wireless communication module 160 are coupled, so that the terminal device 100 can communicate with networks and other devices via wireless communication techniques. The wireless communication techniques may include GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
The terminal device 100 may implement a display function through the GPU, the display screen 194, the application processor, and the like. The application processor may include an NPU and/or a DPU. The GPU is a microprocessor for image processing and connects the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute instructions to generate or change display information. The NPU is a neural-network (NN) computing processor that can rapidly process input information by drawing on the structure of biological neural networks, for example the transmission mode between human brain neurons, and can also continuously self-learn. Applications such as intelligent cognition of the terminal device 100, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented by the NPU. The DPU is also referred to as a display sub-system (DSS) and is used to adjust the color of the display screen 194, which may be adjusted through a three-dimensional color look-up table (3D LUT). The DPU can also perform processing such as scaling, noise reduction, contrast enhancement, backlight brightness management, HDR processing, and Gamma adjustment of display parameters on the picture.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, or quantum dot light-emitting diodes (QLED). In some embodiments, the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The terminal device 100 may implement photographing functions through an ISP, one or more cameras 193, a video codec, a GPU, one or more display screens 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, data files such as music, photos, videos, etc. are stored in an external memory card.
The internal memory 121 may be used to store one or more computer programs, which include instructions. By executing the instructions stored in the internal memory 121, the processor 110 may cause the terminal device 100 to perform various functional applications, data processing, and the like. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system and may also store one or more applications (e.g., Gallery, Contacts, etc.). The data storage area may store data (e.g., photos, contacts, etc.) created during the use of the terminal device 100. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). In some embodiments, the processor 110 may cause the terminal device 100 to perform various functional applications and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in the memory provided in the processor 110.
The terminal device 100 may implement audio functions, such as music playing and recording, through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. The audio module 170 is used to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also referred to as a "horn", is used to convert an audio electrical signal into a sound signal. The terminal device 100 can play music or make a hands-free call through the speaker 170A. The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the terminal device 100 receives a call or a voice message, the user can listen to the voice by placing the receiver 170B close to the ear. The microphone 170C, also referred to as a "mic" or "voice tube", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the terminal device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify the sound source, implement a directional recording function, and the like. The earphone interface 170D is used to connect a wired earphone.
The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The sensors 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The terminal device in the embodiment of the present application may be a handheld device, an in-vehicle device, or the like with a wireless connection function, and the terminal device may also be referred to as a terminal, a user equipment (UE), a mobile station (MS), a mobile terminal (MT), or the like. Currently, examples of terminal devices include: a mobile phone, a tablet computer (Pad), a smart TV, a notebook computer, a palmtop computer, a mobile internet device (MID), a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with a wireless communication function, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, a wearable device, a terminal device in a 5G network, or a terminal device in a future evolved network. The embodiment of the present application does not limit the specific form of the terminal device.
By way of example, and not limitation, in embodiments of the present application, the terminal device may also be a wearable device. A wearable device, also called a wearable smart device, is the general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothes, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not only a hardware device; it can also implement powerful functions through software support, data interaction, and cloud interaction. Wearable smart devices in the broad sense include devices that are full-featured and large-sized and can implement complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only a certain type of application function and need to be used together with other devices such as smartphones, for example various smart bracelets and smart jewelry for physical-sign monitoring.
It should be understood that in the embodiment of the present application, the terminal device may be a device that implements the functions of the terminal device, or a device capable of supporting the terminal device in implementing those functions, for example a chip system, and such a device may be installed in the terminal. In the embodiment of the application, the chip system may be composed of chips, or may include a chip and other discrete devices.
The terminal device in the embodiment of the present application may also be referred to as: a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, or a user equipment, etc.
As can be seen from fig. 1, a gallery is maintained in the terminal device and is used for storing photos. The photos may be obtained in various manners, for example, photos taken by the user or photos obtained from the network side.
It should be understood that the gallery of the terminal device has a "moment" function. After the user takes photos, the gallery may analyze the taken photos, automatically cluster them based on information such as time and place, and generate multiple moment albums, so that the user's memories can be presented in a more vivid form.
The embodiment of the application provides a photo pushing method and a related device. The terminal device can determine in advance, based on the travel information of the user's current trip, whether the user's current state is before departure from the residence or before return to the residence, and generate a moment album associated with the departure place or the arrival place, thereby improving the coupling between the travel information and the moment album, so that the moment album is generated in a more timely manner, better matches the user's current actual state, and improves the user experience.
The embodiment of the application can be specifically applied to the following two scenarios:
Scenario 1: the user's current trip is an outbound trip. For example, the user travels from city A to city B, where city A is the user's residence and the departure place of the current trip, and city B is the arrival place of the current trip. Before departure, the user can purchase a travel ticket from city A to city B through a ticket-purchasing application in the terminal device. After the ticket purchase succeeds, the terminal device can acquire the user's travel information and, based on the travel information of the current trip, generate a photo set associated with city B while still in city A.
It should be understood that in scenario 1, the current trip is an outbound trip, and the terminal device generates, at the departure place of the trip and based on the trip's travel information, a photo set associated with the arrival place of the trip.
Scenario 2: the user's current trip is a return trip. For example, the user previously traveled from city A to city B and is now returning from city B to city A, where city A is the user's residence and the arrival place of the current trip, and city B is the departure place of the current trip. Before returning, the user can purchase a travel ticket from city B to city A through a ticket-purchasing application in the terminal device. After the ticket purchase succeeds, the terminal device can acquire the user's travel information and, based on the travel information of the current trip, generate a photo set associated with city B while still in city B.
It should be understood that, in scenario 2, the current trip is a return trip, and the terminal device generates, at the departure place of the return trip and based on the travel information of the return trip, a photo set associated with the departure place of the return trip.
It should be noted that the arrival place in scenario 1 and the departure place in scenario 2 are actually the same place, namely a place other than the user's residence.
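As a minimal sketch of the place-selection logic above (the `Trip` record and its field names are illustrative assumptions, not the patent's data model), the target album is always tied to the trip endpoint that is not the user's residence:

```python
from dataclasses import dataclass


@dataclass
class Trip:
    """Hypothetical travel-information record (illustrative field names)."""
    departure_place: str
    arrival_place: str


def album_place(trip: Trip, residence: str) -> str:
    """Return the place the target album is associated with.

    Scenario 1 (outbound): departure == residence, so use the arrival place.
    Scenario 2 (return):   arrival == residence, so use the departure place.
    Either way the album is tied to the non-residence endpoint.
    """
    return trip.arrival_place if trip.departure_place == residence else trip.departure_place
```

For example, with residence city A, both the outbound trip A to B and the return trip B to A yield city B, matching the observation that the two scenarios share the same album place.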
Alternatively, the travel information may include a departure place, an arrival place, a departure time, and an arrival time.
Illustratively, the travel tickets may include an airplane ticket, a train ticket, a high-speed rail ticket, and the like, which is not limited by the embodiment of the present application.
In the embodiment of the present application, the photo set associated with the arrival place generated in the scene 1 or the photo set associated with the departure place generated in the scene 2 may be presented in the form of an album, and the photo set associated with the arrival place generated or the photo set associated with the departure place generated may be hereinafter referred to as a target album.
The interface for pushing a photo according to an embodiment of the present application will be described first with respect to scenario 1.
Fig. 2 is an interface schematic diagram of pushing a photo according to an embodiment of the present application. Taking the user purchasing an airplane ticket as an example, before the user purchases the ticket, or after the user purchases the ticket but before the terminal device pushes the travel notification, the "moment" interface in the gallery does not display the target album, as shown in interface A in fig. 2; at this time, the gallery has not acquired the user's travel information, and the target album has not yet been generated. The terminal device may push the travel notification on the desktop in the form of a travel card, as shown in interface B in fig. 2.
After the terminal device pushes the travel notification, as shown in interface C in fig. 2, the "moment" interface in the gallery displays a moment album named "revisit to the old place", which is the target album. The user can click this moment album to enter the photo display interface and view the photos taken at the arrival place.
Optionally, the target album comprises a plurality of photos selected through aesthetic detection, and there is a certain association relationship among the plurality of photos.
Optionally, the terminal device may display the plurality of associated photos in the photo display interface according to a preset rule. The preset rule may be that the plurality of photos are displayed in the photo display interface in order of shooting time, that is, the plurality of photos are associated in time. Alternatively, the preset rule may be that the plurality of photos are displayed in the photo display interface according to shooting location; specifically, the plurality of photos may be divided into a plurality of groups according to their different shooting locations, and the photos in each group are associated by shooting location. The shooting location may specifically be a check-in spot that the user visited. For example, the user took check-in photos at scenic spot a and scenic spot b of city B; when displaying the photos related to city B, the terminal device may, according to the different shooting locations, first display the photos taken at scenic spot a in the photo display interface and then display the photos taken at scenic spot b.
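The two preset rules can be sketched as follows; the photo records, spot names, and `(time, spot, filename)` schema are hypothetical, and the gallery's actual grouping logic is not disclosed:

```python
from datetime import datetime

# Hypothetical photo records: (shooting_time, shooting_spot, filename).
photos = [
    (datetime(2022, 4, 16, 9, 0), "spot_a", "p1.jpg"),
    (datetime(2022, 4, 16, 15, 0), "spot_b", "p2.jpg"),
    (datetime(2022, 4, 16, 11, 0), "spot_a", "p3.jpg"),
]


def order_by_time(photos):
    """Preset rule 1: display photos in shooting-time order."""
    return sorted(photos, key=lambda p: p[0])


def group_by_spot(photos):
    """Preset rule 2: divide photos into groups by shooting spot
    (check-in location), keeping time order within each group."""
    groups = {}
    for p in order_by_time(photos):
        groups.setdefault(p[1], []).append(p)
    return groups
```

With the sample records above, rule 1 yields p1, p3, p2 in time order, and rule 2 yields the spot_a group (p1, p3) before the spot_b group (p2).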
After the terminal device pushes the travel notification, the terminal device may generate the target album when the condition for generating the moment album is satisfied. Illustratively, the user purchases an airplane ticket from city A to city B at 10 a.m. on April 14, 2022, for a flight on April 15, 2022, and the terminal device pushes a travel notification at 11 a.m. on April 14, 2022. The user may charge the terminal device while sleeping on the night of April 14, 2022; during this period, the terminal device remains screen-off and charging, so the condition for generating the moment album can be satisfied. At this time, the gallery has acquired the user's travel information and can determine, based on the travel information, that the user will go to city B on April 15, 2022, so the gallery can generate the moment album related to city B before the user goes to city B.
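A hedged sketch of the trigger described in this example; treating "screen off + charging + travel information acquired" as the full gating condition is an assumption drawn from the example, not a definitive list of the device's conditions:

```python
def may_generate_album(screen_off: bool, charging: bool, has_trip_info: bool) -> bool:
    """Gate for generating the moment album: the device is idle
    (screen off and charging, e.g. while the user sleeps) and the
    gallery has already acquired the user's travel information."""
    return screen_off and charging and has_trip_info
```

During the overnight charging period in the example, all three inputs are true and the album may be generated; if any is false (for instance the trip information has not yet been acquired), generation is deferred.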
Fig. 3 is an interface schematic diagram of viewing a moment album according to an embodiment of the present application. After generating the target album based on the user's travel information, the gallery of the terminal device may push a system notification to the user on the minus-one screen, as shown in interface A in fig. 3, to remind the user that the album related to the arrival place has been generated before departure.
In interface A in fig. 3, the system notification is presented to the user in the form of a card, and the embodiment of the present application refers to such a card-form system notification as a notification card, on which the name of the target album, the cover photo, and the shooting time of the cover photo can be displayed. As shown in interface A in fig. 3, the name of the target album may be "revisit to the old place". The photo currently displayed on the notification card is the cover photo, which may be the photo with the highest aesthetic score in the target album, i.e., the cover photo displayed on the notification card is a fixed photo. Alternatively, the cover photo displayed on the notification card is not fixed, and the photos in the target album are instead displayed cyclically at a preset time interval.
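The two cover-photo strategies above can be sketched as follows; the photo names, score values, and elapsed-time parameter are illustrative, and the aesthetic-scoring model itself is outside the sketch:

```python
def pick_cover(scores):
    """Fixed-cover strategy: pick the photo with the highest aesthetic
    score. `scores` maps photo name -> aesthetic score (made-up values)."""
    return max(scores, key=scores.get)


def cover_at(photo_names, elapsed_s, interval_s):
    """Cycling strategy: rotate through the album's photos, switching
    the cover every `interval_s` seconds of elapsed display time."""
    return photo_names[(elapsed_s // interval_s) % len(photo_names)]
```

For example, `pick_cover({"IMG_1": 0.72, "IMG_2": 0.91, "IMG_3": 0.64})` selects `"IMG_2"`, while the cycling strategy shows each photo in turn at the preset interval.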
If the user wants to view the target album, the user can click the system notification shown in interface A in fig. 3, and the interface of the terminal device jumps to interface B in fig. 3. As can be seen, interface B in fig. 3 displays the selected photos taken by the user of the terminal device at the arrival place, and the user can slide up and down to view the photos.
Fig. 4 is an interface schematic diagram of another way of viewing a moment album according to an embodiment of the present application. After the target album is generated based on the user's travel information, the gallery of the terminal device may display the photos in the target album on a gallery card on the desktop, as shown in interface A in fig. 4, to remind the user that the album associated with the arrival place has been generated before departure. Similar to the embodiment described in fig. 3, the name of the target album, the cover photo, and the shooting time of the cover photo may be displayed on the gallery card. The cover photo may be a fixed photo in the target album, or the gallery card may cyclically display the photos in the target album as its cover photo at a preset time interval.
If the user wants to view the target album, the user can click the gallery card in the interface shown as interface A in fig. 4, and the interface of the terminal device jumps to the photo display interface shown as interface B in fig. 4, which displays the selected photos taken by the user at the arrival place; the user can slide up and down to view the photos.
Fig. 5 is an interface schematic diagram of another way of viewing a moment album according to an embodiment of the present application. After generating the target album based on the user's travel information, as shown in interface A in fig. 5, the gallery of the terminal device may push a system notification to the user in the drop-down menu bar to remind the user that the album associated with the arrival place, that is, the target album, has been generated before departure. As shown in interface A in fig. 5, the system notification may prompt the user that the photos of the arrival place have been sorted out before departure. After the user clicks the system notification, the terminal device displays the "moment" interface shown as interface B in fig. 5, in which a moment album named "revisit to the old place", that is, the target album, is displayed. After the user clicks the target album, the terminal device may display interface C in fig. 5, which shows the selected photos taken by the user at the arrival place, and the user can slide up and down to view the photos.
Optionally, as shown in an interface a in fig. 5, the terminal device may recommend, for the user, food and news of the arrival place based on the travel information of the user, so that the user can know related information of the arrival place in advance, and use experience of the user is improved.
Fig. 6 is an interface schematic diagram of another way of viewing a moment album according to an embodiment of the present application. After being notified that the target album has been generated, the user may click the gallery icon on the desktop of the terminal device as shown in interface A in fig. 6, and in response to the user's operation, the terminal device displays the "moment" interface shown as interface B in fig. 6. Similar to interface B in fig. 5, a moment album named "revisit to the old place", that is, the target album, is displayed in the interface shown as interface B in fig. 6. The user can click the target album to enter interface C in fig. 6, which shows the selected photos taken by the user at the arrival place, and the user can slide up and down to view the photos.
It should be understood that, in the embodiment of the present application, the terminal device informs the user that the target album has been generated, including: the terminal device informs the user that the target album has been generated in a system notification manner (such as the embodiment described in fig. 3 or fig. 5), or informs the user that the target album has been generated in a gallery card manner (such as the embodiment described in fig. 4), which is not limited by the embodiment of the present application.
In the above-described embodiment, in response to the user's operation of clicking the target album, the terminal device enters the photo display page of the target album to display a plurality of static photos for the user, where the plurality of static photos may be arranged according to a preset arrangement rule and may differ in size. Alternatively, the terminal device may generate a video from the photos taken by the user at the arrival place, and after the user clicks the target album to enter the photo display interface, the terminal device may play the dynamic video for the user. Alternatively, the photo display page entered by clicking the target album may include both a video and a plurality of static photos, which is not limited by the embodiment of the present application.
In the above-described embodiment, the terminal device determines the user's travel information based on the user's ticket purchase information, and, before the user starts the trip from the departure place to the arrival place, generates the target album including the photos that the user has taken at the arrival place based on the travel information. It should be understood that the terminal device generates the target album before departure, that is, at the departure place of the current trip, but it may notify the user that the target album has been generated either before the user departs or after the user arrives at the arrival place, which is not limited by the embodiment of the present application. It should also be understood that the above embodiment is implemented on the premise that the user has previously been to the arrival place, so that photos taken by the user at the arrival place are stored in the gallery of the terminal device.
Illustratively, the user purchases an airline ticket from city a to city B on day 10 of day 2022, 4, 14, and 10 am, on day 2022, 4, 15, departure time 16:00, arrival time 19:45, and the terminal device may generate the target album based on the travel information after pushing the travel notification on day 2022, 4, 14, 11 am. The terminal device may prompt the user that the target album has been created at 2022, 4, 15, 14:00, i.e., the terminal device may prompt the user that the target album has been created the first two hours of the user's departure time. Alternatively, the terminal device may be at 2022, 4, 15, 21:45, the terminal device may prompt the user that the target album has been created, i.e., two hours after the user's arrival time.
The above describes, with reference to the accompanying drawings, scenario 1, in which the terminal device generates, at the departure place and based on the travel information, a moment album associated with the arrival place before the user departs for the arrival place. The interface for pushing photos according to the embodiment of the present application is described below with respect to scenario 2.
Based on the above description, scenario 2 is a return scenario. In combination with the above example, the user purchases an airplane ticket from city B to city A, and likewise the terminal device may push a travel notification including the travel information of the user's return trip, where the departure place is city B and the arrival place is city A. The terminal device may combine the travel information with the photos taken by the user in city B to generate a moment album related to city B, and the photos taken by the user in city B may include photos taken in history trips before this trip. After generating the moment album, the terminal device can push a notification to the user that the moment album related to city B has been generated, and the user can view the photos taken in city B based on the pushed notification.
It should be understood that the trip from city A to city B is a trip before the trip from city B to city A, i.e., in scenario 2, the historical trips before the current trip include the user's trip from city A to city B. After arriving in city B, the user may take photos in city B, and the terminal device may generate the target album based on the photos taken in city B that are stored in the gallery.
Fig. 7 is an interface schematic diagram of another pushed photo according to an embodiment of the present application. Similarly, with reference to the embodiment described in fig. 2, the user purchases an airline ticket from city B back to city A. Before the terminal device pushes the travel notification, the target album is not displayed in the "Moments" interface in the gallery, as shown in interface A in fig. 7. After the terminal device pushes the travel notification in interface B in fig. 7, the "Moments" interface in the gallery displays a moment album themed on revisiting the city, i.e., the target album, as shown in interface C in fig. 7.
In the return scenario, the terminal device may perform actions similar to those in the departure scenario: the user is prompted, by way of a system notification or a gallery card, that the target album has been generated; in response to the user's operation on the system notification or the gallery card, the page of the terminal device jumps to the photo display interface in the target album; or, in response to the user's operation on the gallery icon, the terminal device displays the "Moments" interface, and in response to the user's operation on the target album, the terminal device displays the photo display interface, so that the user may view the memories left at the departure place. For the specific interfaces for pushing the photo, reference may be made to fig. 3 to 6, and details are not described herein again.
The interface for pushing the photo is exemplarily described above with reference to the accompanying drawings, and a method for pushing the photo will be described below with reference to fig. 8.
Fig. 8 is a schematic flow chart of a method 800 for pushing photos, where the terminal device includes a plurality of function-class services: an intelligent analysis service, a picture parsing service, a highlight moment analysis service, a computer vision (CV) algorithm service, and a notification service. The steps of method 800 may be applied to scenario 1 or scenario 2 above, but embodiments of the present application are not limited thereto. The method 800 includes steps S801 to S815, which are specifically as follows:
S801, the intelligent analysis service acquires the trip information.
In this step, the trip information includes information such as a departure place, an arrival place, a departure time, and an arrival time.
In one possible implementation, the intelligent analysis service obtains the trip information, including: the intelligent analysis service receives, through a software development kit (SDK), the trip information sent by clients of travel-information service-class applications.
In another possible implementation, the intelligent analysis service obtains the trip information, including: the intelligent analysis service parses a notification short message to acquire the trip information.
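As an illustration only — the message format, pattern, and function name below are assumptions, since real flight short messages vary by carrier — the parsing step could be sketched with a regular expression:

```python
import re
from typing import Optional

# Hypothetical message format; real flight notifications differ by carrier.
SMS_PATTERN = re.compile(
    r"flight from (?P<departure_place>\w+) to (?P<arrival_place>\w+), "
    r"departing (?P<departure_time>[\d\- :]+), "
    r"arriving (?P<arrival_time>[\d\- :]+)"
)

def parse_trip_sms(text: str) -> Optional[dict]:
    """Extract trip information fields from a notification short message."""
    match = SMS_PATTERN.search(text)
    return match.groupdict() if match else None

info = parse_trip_sms(
    "Booked: flight from CityA to CityB, "
    "departing 2022-04-15 16:00, arriving 2022-04-15 19:45"
)
# info -> departure/arrival places and times as strings
```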
S802, the intelligent analysis service sends the trip information to a media provider (MP). Accordingly, the MP database receives the trip information.
The MP is a database of the Android system, which may be referred to as the MP database in the embodiments of the present application. The MP database may store information of multimedia files such as pictures, videos and documents for use by video players, music players, galleries and document editors.
After the trip information is stored in the MP database, various applications in the terminal device may acquire the trip information from the MP database to provide the user with services related to the trip information. For example, a weather application may query the trip information from the MP database, obtain the target location, and provide the user with the weather conditions of the target location.
Wherein, for scenario 1, the target location is the arrival location, and for scenario 2, the target location is the departure location.
In this step, the intelligent analysis service sends the trip information to the MP database, which can store the trip information in preparation for subsequent generation of a trip notification based on the trip information.
S803, the intelligent analysis service acquires the journey information from the MP database to generate journey notification.
In this step, the intelligent analysis service may generate a trip notification based on the trip information before the user starts.
S804, the intelligent analysis service sends a travel notification to the notification service. Accordingly, the notification service receives the travel notification.
S805, the notification service pushes the trip notification in the interface of the terminal device to prompt the user to pay attention to the trip information.
Alternatively, the time at which the notification service pushes the travel notification may be determined according to the user's departure time and a preset advance duration: the notification service determines the time that precedes the departure time by the preset advance duration as the time to push the travel notification.
Illustratively, the travel notification may be presented to the user in the form of a travel card as shown in the B interface in fig. 2.
S806, the picture parsing service parses the information of the photos in the gallery to obtain the shooting location information of the photos.
In this step, the format of a photo may be the exchangeable image file format (Exif). Exif may record information of the photo, including, for example, at least one of global positioning system (GPS) positioning information, shooting time, camera model, author, or shooting parameters. The picture parsing service may obtain the shooting location information of a photo based on the GPS positioning information, where the shooting location information of the photo includes the geographical location where the photo was taken.
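Exif stores a GPS position as degree/minute/second values plus a hemisphere reference. A small sketch of the conversion to the decimal degrees a shooting-location lookup would use (the helper name is assumed, not part of the embodiment):

```python
def dms_to_decimal(degrees: float, minutes: float,
                   seconds: float, ref: str) -> float:
    """Convert an Exif GPS coordinate (degrees/minutes/seconds plus an
    'N'/'S'/'E'/'W' reference) to signed decimal degrees."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

lat = dms_to_decimal(39, 54, 27, "N")   # 39 deg 54' 27" N is about 39.9075
lon = dms_to_decimal(116, 23, 0, "E")
```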
S807, the picture parsing service transmits the shooting location information of the photo to a Media Library (ML). Accordingly, the ML database receives the photographing place information of the photograph.
Where ML is a database of gallery, which may be referred to as an ML database in embodiments of the present application.
In this step, the picture parsing service transmits the photographing place information of the photo to the ML database, which stores the photographing place information of the photo in preparation for generating a target album based on the travel information later.
S808, the CV algorithm service performs aesthetic detection on the photos in the gallery and determines the aesthetic score of each photo.
In this step, the CV algorithm service may perform aesthetic detection on a photo based on dimensions such as its hue, saturation, and brightness to determine its aesthetic score.
Optionally, the CV algorithm service may insert the classifications of photos meeting a preset threshold into tag columns of the ML database, where each tag column corresponds to one type of photo. By way of example, the types of photos may include people, landscapes, food, buildings, and the like.
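Purely as an illustrative stand-in — the real CV algorithm service would use a trained model, and the weights and threshold below are invented — the dimension-based scoring and threshold check might look like:

```python
def aesthetic_score(saturation: float, brightness: float,
                    contrast: float) -> float:
    """Stand-in for the CV aesthetic model: a weighted mix of
    normalised image statistics, each expected in [0, 1].
    The weights are arbitrary illustration values."""
    return round(0.4 * saturation + 0.3 * brightness + 0.3 * contrast, 3)

score = aesthetic_score(0.8, 0.6, 0.7)
qualifies = score >= 0.6   # preset threshold before tagging in the ML database
```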
S809, the highlight moment analysis service obtains the journey information of the user.
In one possible implementation, the highlight moment analysis service obtains the travel information of the user, including: the highlight moment analysis service receives the trip information from the intelligent analysis service. In this manner, an interface between the intelligent analysis service and the highlight moment analysis service is opened up, so that the trip information is acquired more quickly, saving the power consumption of the terminal device.
In another possible implementation, the highlight moment analysis service obtains the travel information of the user, including: the highlight moment analysis service obtains the trip information from the MP database.
S810, the highlight moment analysis service sends a request for inquiring the target photo to the ML database based on the target place in the journey information. Accordingly, the ML database receives the request.
The target photo represents a photo associated with the target location; the target photos may include photos taken by the user at the target location during historical trips.
Optionally, the target photo may further include a photo of the target location acquired from the network side.
In this step, since the picture parsing service in S807 has already stored the photographing place information of the photo into the ML database, the highlight moment analysis service may query the ML database for the photo associated with the target place based on the photographing place information of the photo.
Optionally, the request to query the target photos may instruct the ML database to filter the target photos to determine target photos that meet the photo quality requirement.
S811, the ML database sends a uniform resource locator (URL) of the target photo to the highlight moment analysis service, where the URL indicates the storage path of the target photo in the gallery of the terminal device. Accordingly, the highlight moment analysis service receives the URL of the target photo.
Optionally, the ML database screens the target photos, determines, from the target photos, photos whose aesthetic scores are higher than a preset threshold, and sends the URLs of the screened target photos to the highlight moment analysis service.
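A toy sketch of the query in S810/S811, using an in-memory SQLite table as a stand-in for the ML database (the table, column, and function names are assumptions, not the embodiment's actual schema):

```python
import sqlite3

# In-memory stand-in for the ML database: each row holds a photo's
# storage URL, shooting place, and aesthetic score.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE photos (url TEXT, place TEXT, score REAL)")
db.executemany("INSERT INTO photos VALUES (?, ?, ?)", [
    ("file:///gallery/1.jpg", "CityB", 0.9),
    ("file:///gallery/2.jpg", "CityB", 0.4),   # below threshold
    ("file:///gallery/3.jpg", "CityA", 0.8),   # wrong place
])

def query_target_urls(conn, target_place, threshold=0.6):
    """Return URLs of photos shot at the target place whose
    aesthetic score clears the preset threshold."""
    rows = conn.execute(
        "SELECT url FROM photos WHERE place = ? AND score > ?",
        (target_place, threshold))
    return [url for (url,) in rows]

urls = query_target_urls(db, "CityB")
```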
S812, the highlight moment analysis service obtains the target photo based on the URL of the target photo.
S813, the highlight moment analysis service analyzes the target photo to generate the target album.
S814, the highlight moment analysis service transmits a notification of generating the target album to the notification service. Accordingly, the notification service receives a notification of generating the target album.
S815, the notification service pushes a notification of generating the target album, and prompts the user that the target album has been generated.
The time at which the notification service pushes the notification that the target album has been generated may be determined according to the departure time and a preset adjustment duration; in this step, the preset adjustment duration may be positive or negative. For example, when the preset adjustment duration is positive, the notification service determines the time after the departure time is delayed by the preset adjustment duration as the time to push the notification that the target album has been generated; when the preset adjustment duration is negative, the notification service determines the time after the departure time is advanced by the preset adjustment duration as the time to push the notification. Of course, the preset adjustment duration may also be 0, in which case the departure time itself is determined as the time to push the notification that the target album has been generated.
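The signed-adjustment rule above can be sketched directly (function and variable names assumed):

```python
from datetime import datetime, timedelta

def album_notify_time(departure: datetime,
                      adjustment: timedelta) -> datetime:
    """Push time for the album-generated notification: the departure
    time shifted by a preset adjustment duration, which may be
    positive (later), negative (earlier), or zero (at departure)."""
    return departure + adjustment

dep = datetime(2022, 4, 15, 16, 0)
later = album_notify_time(dep, timedelta(hours=2))      # 18:00
earlier = album_notify_time(dep, timedelta(hours=-2))   # 14:00
at_dep = album_notify_time(dep, timedelta(0))           # 16:00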
Optionally, the method 800 further comprises: the highlight moment analysis service sends the target album to the client of the gallery. Accordingly, the client of the gallery receives and displays the target album.
In response to the user's operation on the target album, the client of the gallery may request from the ML database the photos in the target album, i.e., the storage paths of the target photos. The ML database returns the URL of the target photo to the picture parsing service, and the picture parsing service obtains the target photo based on the URL and returns a thumbnail of the target photo to the client of the gallery for the user to view.
In the embodiment of the present application, S801 to S805 are the process in which the terminal device pushes the travel notification. Before the user departs, the intelligent analysis service may acquire the user's ticket purchase information to obtain the user's travel information, generate a travel notification based on the travel information, and push the travel notification in the interface of the terminal device through the notification service. Illustratively, the travel notification may be presented to the user in the form of a travel card, as shown in interface B in fig. 2.
S806 to S815 are the process in which the terminal device pushes the target album, where the picture parsing service and the highlight moment analysis service are function-class service modules of the gallery. The highlight moment analysis service may acquire the user's travel information from the intelligent analysis service or the MP database, determine the target location of the current trip based on the travel information, then query the ML database for the photos associated with the target location, acquire the target photos, and generate the target album based on the target photos.
Fig. 9 is a schematic diagram of a photo pushing principle provided by an embodiment of the present application. As shown in fig. 9, the method for pushing photos according to the embodiment of the present application involves multiple databases and services in the terminal device. The MP database is used for storing the acquired travel information. Taking the purchase of an airplane ticket as an example, the intelligent analysis service may parse the flight short message to acquire the user's travel information, or acquire the user's travel information from the interface of a travel-information service application. After acquiring the travel information, the intelligent analysis service may store it in the MP database. The location parsing service in the picture parsing service may parse the shooting location information of the photos in the gallery and store it in the ML database. The aesthetic scoring service in the CV algorithm service may perform aesthetic detection on the photos in the gallery and store the aesthetic scores of the photos in the ML database.
Before the user departs, the intelligent analysis service may send a flight message notification to the highlight moment analysis service. The flight message notification may trigger the highlight moment analysis service to analyze the flight message in real time: the highlight moment analysis service may determine the user's residence from a user information server (pengino.UserProfileProvider), and, based on the residence, determine the user's current actual state by combining the flight message and the user's current location information, so as to generate target albums of different types or themes.
For scenario 1, the highlight moment analysis service determines that the departure place in the travel information is the user's residence, and determines, according to the flight message and the user's current location information, that the user's current actual state is "before departing from the residence", i.e., the user is currently at the residence. Therefore, the highlight moment analysis service may generate a target album whose theme or type is destination photos before departure, and when pushing the notification that the target album has been generated, the highlight moment analysis service may display a prompt corresponding to the theme on the notification interface; as shown in interface A in fig. 5, the prompt is "Your destination photos have been organized for you before departure".
For scenario 2, the highlight moment analysis service determines that the departure place in the travel information is not the user's residence, and determines, according to the flight message and the user's current location information, that the user's current actual state is "before returning to the residence", i.e., the user is currently at a place other than the residence. Therefore, the highlight moment analysis service may generate a target album whose theme or type is photos of the visited city before the return, and when pushing the notification that the target album has been generated, the highlight moment analysis service may display a prompt corresponding to the theme on the notification interface, for example, "Before returning, view the traces you left in this city".
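The residence-based branching for the two scenarios can be sketched as follows (the function name and return strings are illustrative, not from the embodiment):

```python
def classify_trip(residence: str, departure_place: str,
                  arrival_place: str) -> str:
    """Pick the album theme from the user's residence and the trip
    endpoints: scenario 1 departs from the residence, scenario 2
    returns to it."""
    if departure_place == residence:
        return "scenario 1: album for the arrival place"
    if arrival_place == residence:
        return "scenario 2: album for the departure place"
    return "no themed album"

outbound = classify_trip("CityA", "CityA", "CityB")   # scenario 1
homeward = classify_trip("CityA", "CityB", "CityA")   # scenario 2
```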
In the process of generating the target album, the highlight moment analysis service may acquire from the ML database the URLs of the target photos whose aesthetic scores exceed the preset threshold, and after acquiring the target photos based on the URLs, the highlight moment analysis service may call the CV clustering album generation service and the cover update service to generate the target album.
In the embodiments described above in connection with fig. 2 to 9, the intelligent analysis service of the terminal device may acquire the travel information of the user from the ticket purchase information of the user, and extract the departure place, arrival place, departure time and arrival time from the travel information.
In addition, in some possible embodiments, the intelligent analysis service in the terminal device may obtain, through the network, schedule information marked by the user in a schedule application in the terminal device, where the schedule information may include schedules and reminders. For example, the schedule application is a calendar, and the user marks in the calendar that he will attend a meeting in city B at 16:00 on April 15, 2022. In combination with the user's current location information (for example, the user is currently in city A), the intelligent analysis service may determine that the departure place of the current trip is city A and the arrival place is city B, and may further send the trip information of the current trip to the highlight moment analysis service, which generates a moment album associated with city B.
In summary, as shown in fig. 10, an embodiment of the present application provides a method 1000 for pushing photos, where the method 1000 is a method for pushing photos for a departure scenario. The steps of method 1000 may be performed by a terminal device that may have the structure shown in fig. 1, but embodiments of the present application are not limited thereto. The method 1000 comprises the steps of:
S1001, obtain the arrival place of the first journey of the user;
S1002, display a first notification message on a first interface.
Wherein the first notification message is for notifying a user to view a first set of photos associated with the arrival location, the first set of photos being generated prior to the user arriving at the arrival location.
In the embodiment of the application, the first journey is a departure scene of a user. The terminal device can acquire the arrival place of the current journey of the user before the user starts the journey, and generates a first photo set associated with the arrival place based on the arrival place. In the embodiment of the application, the terminal equipment can generate the first photo set associated with the arrival place in advance before the user arrives at the arrival place, so that timeliness of generating the photo set associated with the arrival place is improved, and the use experience of the user is improved.
Referring to the previous example, the user starts the journey from city a to city B, where city a is the residence of the user, city a is the departure place of the journey, and city B is the arrival place of the journey. The first set of photos may be the target album generated in the departure scenario described above.
In connection with the foregoing description, the first interface may be a negative screen as shown in the interface a in fig. 3, and the first notification message may be presented to the user in the form of a notification card as shown in the interface a in fig. 3. Alternatively, the first interface may be a desktop as shown in interface a in fig. 4, and the first notification message may be presented to the user in the form of a gallery card as shown in interface a in fig. 4. Alternatively, the first interface may be a drop down menu bar as shown in interface a in fig. 5, and the first notification message may be presented to the user in the form of a system notification as shown in interface a in fig. 5. Reference may be made specifically to the embodiments described above with respect to fig. 3, 4 or 5, and no further description is given here.
Fig. 11 is a schematic flow chart diagram of still another method 1100 for pushing photos according to an embodiment of the present application. Method 1100 is a method of pushing photos for a departure scenario. The steps of method 1100 may be performed by a terminal device, which may have the structure shown in fig. 1, but the embodiment of the present application is not limited thereto. The method 1100 includes S1001, S1101, S1102, and S1002, and the specific steps are as follows:
S1001, the arrival location of the first trip of the user is acquired.
S1101, determine whether the arrival place is the user's residence.
In this step, the user information server in the terminal device stores the user's residence information; the terminal device may determine the user's residence from the user information server and determine whether the arrival place is the user's residence.
S1102, if the arrival place is a place other than the user's residence, determine to generate the first photo set associated with the arrival place.
In one possible implementation, the terminal device may obtain a departure time of the first journey, and generate the first photo set before the departure time. In such an implementation, the terminal device needs to know the departure time and determine a certain time before the departure time as the time at which the first photo set is generated.
In another possible implementation manner, the terminal device may generate the first photo set after acquiring the arrival place of the user, and the terminal device does not need to acquire the departure time of the user.
In this step, if the terminal device determines that the arrival location is another location other than the residence of the user, the terminal device may determine that the current state of the user is before the user starts from the residence, and thus the terminal device may generate the first photo set associated with the arrival location for the user.
S1002, displaying a first notification message on a first interface, the first notification message being for notifying a user to view a first photo set associated with an arrival location, the first photo set being generated before the user arrives at the arrival location.
In the embodiment of the present application, after the terminal device obtains the arrival place in the trip information, the terminal device may perform residence analysis to determine whether the arrival place of the current trip is the user's residence. In the case where the arrival place is not the user's residence, i.e., the arrival place is a place other than the user's residence, the terminal device determines to generate the first photo set associated with the arrival place for the user.
The precondition for the terminal device to generate the first photo set associated with the arrival place for the user is that the user has been to the arrival place before the current trip and has taken photos at the arrival place.
As an alternative embodiment, prior to S1002, method 1000 or method 1100 further comprises: obtaining the departure place of the first trip. S1002 includes: displaying the first notification message on the first interface before the user departs from the departure place.
As an alternative embodiment, S1002 includes: after the user arrives at the arrival location, a first notification message is displayed at a first interface.
As an alternative embodiment, prior to S1002, method 1000 or method 1100 further comprises: and displaying a second notification message on the second interface, wherein the second notification message is used for notifying a user to view the journey information. The trip information includes one or more of the following: departure time, departure place and arrival place. Optionally, the trip information further includes a time of arrival.
In an embodiment of the present application, the second interface in the departure scenario may be shown as the B interface in fig. 2, and the second notification message may be presented to the user in the form of a travel card as shown as the B interface in fig. 2. See in particular the embodiment described above with respect to fig. 2, which is not described in detail here.
As an alternative embodiment, the itinerary information is determined based on ticketing information or calendar information.
As an alternative embodiment, after S1001, the method 1000 or the method 1100 further includes: obtaining a storage path of a target photo in a gallery from a database, wherein the target photo comprises a photo taken by a user at an arrival place in a history journey; acquiring a target photo from a gallery based on the storage path; based on the target photographs, a first set of photographs is generated.
In an embodiment of the present application, the database may be the ML database described above.
As an alternative embodiment, the aesthetic score of the target photograph is above a preset threshold.
As an alternative embodiment, before retrieving the storage path of the target photo in the gallery from the database, the method 1000 or the method 1100 further includes: the target photograph is determined based on the photographing location information of the plurality of photographs in the gallery.
As an alternative embodiment, before determining the target photo based on the shooting location information of the plurality of photos in the gallery, the method 1000 or the method 1100 further includes: acquiring shooting location information of a plurality of pictures in a gallery, wherein the plurality of pictures in the gallery comprise target pictures; the shooting location information of the plurality of photos is stored in a database.
In an embodiment of the present application, the database may be the ML database described above.
It should be appreciated that the description relating to the steps of method 1000 or method 1100 may be found in the embodiments described above in connection with fig. 2-6.
The method 1000 and method 1100 of pushing photos in the departure scenario are described above in connection with fig. 10 and 11. The following describes a method for pushing photos in the return scene with reference to fig. 12 and 13.
Fig. 12 is a schematic flow chart of still another method 1200 for pushing photos provided by an embodiment of the present application. Method 1200 is a method of pushing photos for a return scene. The steps of method 1200 may be performed by a terminal device, which may have the structure shown in fig. 1, but the embodiment of the present application is not limited thereto. The method 1200 includes the steps of:
S1201, acquiring a departure place of a second journey of a user;
S1202, display a third notification message on a third interface.
Wherein the third notification message is for notifying the user to view a second set of photographs associated with the departure location, the second set of photographs being generated prior to departure from the departure location.
In the embodiment of the application, the second journey is a return journey scene of the user. The terminal device may acquire the departure place of the current trip of the user before the user returns the trip, and generate the second photo set associated with the departure place based on the departure place. In the embodiment of the application, the terminal equipment can generate the second photo set associated with the departure place in advance before the user is sent out from the departure place, so that timeliness of generating the photo set associated with the departure place is improved, and the use experience of the user is improved.
Referring to the previous example, the user purchases an airline ticket returning from city B to city A, where, in the travel information of the return trip, the departure place is city B and the arrival place is city A, and city A is the user's residence. Similar to the departure scenario, in the return scenario, the terminal device may obtain in advance that the departure place in the user's travel information is city B, and generate the second photo set associated with city B before the user departs from city B.
In connection with the foregoing description, the third interface may be a negative screen as shown in the interface a in fig. 3, and the third notification message may be presented to the user in the form of a notification card as shown in the interface a in fig. 3. Alternatively, the third interface may be a desktop as shown in interface A in FIG. 4, and the third notification message may be presented to the user in the form of a gallery card as shown in interface A in FIG. 4. Alternatively, the third interface may be a drop down menu bar as shown in interface a in fig. 5, and the third notification message may be presented to the user in the form of a system notification as shown in interface a in fig. 5. Reference may be made specifically to the embodiments described above with respect to fig. 3, 4 or 5, and no further description is given here.
Fig. 13 is a schematic flow chart of still another method 1300 for pushing photos provided by an embodiment of the present application. Method 1300 is a method of pushing photos for a return scene. The steps of method 1300 may be performed by a terminal device, which may have a structure as shown in fig. 1, but the embodiment of the present application is not limited thereto. The method 1300 includes the steps of:
S1201, obtain the departure place of the second journey of the user;
S1301, obtain the arrival place of the second journey of the user;
S1302, determine whether the arrival place is the user's residence;
S1303, if the arrival place is the user's residence, determine to generate a second photo set associated with the departure place;
S1202, display a third notification message on a third interface, the third notification message being for notifying the user to view the second photo set associated with the departure place, the second photo set being generated before departure from the departure place.
In the embodiment of the present application, the terminal device may determine whether the arrival place of the second journey is the user's residence. In the case where the arrival place is determined to be the user's residence, the terminal device may determine that the user's current state is "before returning to the residence", i.e., the user is currently at a place other than the residence, so the terminal device may generate the second photo set associated with the departure place for the user.
As an alternative embodiment, prior to S1303, method 1300 further includes: the departure time of the second trip is obtained. Determining in S1303 to generate a second set of photos associated with the place of departure, including: a second set of photographs is generated prior to the departure time.
As an alternative embodiment, S1202 includes: displaying the third notification message on the third interface before the user departs from the departure place.
As an alternative embodiment, S1202 includes: after the user arrives at the arrival location, a third notification message is displayed at a third interface.
As an alternative embodiment, prior to S1202, method 1200 or method 1300 further comprises: and displaying a second notification message on the second interface, wherein the second notification message is used for notifying a user to view the journey information. The trip information includes one or more of the following: departure time, departure place and arrival place. Optionally, the trip information further includes a time of arrival.
In an embodiment of the present application, the second interface in the return scenario may be interface B shown in fig. 7, and the second notification message may be presented to the user in the form of a travel card as shown in interface B in fig. 7. See in particular the embodiment described above with respect to fig. 7, which is not repeated here.
It should be understood that the sequence numbers of the above processes do not mean the order of execution, and the execution order of the processes should be determined by the functions and internal logic of the processes, and should not be construed as limiting the implementation process of the embodiments of the present application.
The method for pushing the photo according to the embodiment of the present application is described in detail above with reference to fig. 8 to 13, and the apparatus for pushing the photo according to the embodiment of the present application will be described in detail below with reference to fig. 14 and 15.
Fig. 14 shows a schematic block diagram of an apparatus 1400 for pushing photos according to an embodiment of the present application, where the apparatus 1400 includes an obtaining module 1410 and a processing module 1420.
The obtaining module 1410 is configured to: obtain an arrival place of a first journey of the user. The processing module 1420 is configured to: display a first notification message on a first interface, the first notification message being used to notify the user to view a first photo set associated with the arrival place, the first photo set being generated before the user arrives at the arrival place.
Optionally, the processing module 1420 is configured to: judging whether the arrival place is a resident place of the user; and if the arrival location is another location other than the user's resident location, determining to generate a first photo set associated with the arrival location.
Optionally, the obtaining module 1410 is configured to: obtain the departure time of the first journey. The processing module 1420 is configured to: generate the first photo set before the departure time.
Optionally, the obtaining module 1410 is configured to: obtain the departure place of the first journey. The processing module 1420 is configured to: display the first notification message on the first interface before the user departs from the departure place.
Optionally, the processing module 1420 is configured to: after the user arrives at the arrival location, a first notification message is displayed at a first interface.
Optionally, the processing module 1420 is configured to: and displaying a second notification message on a second interface. The second notification message is used for notifying the user to view journey information, and the journey information comprises one or more of the following: departure time, departure place and arrival place.
Optionally, the itinerary information is determined based on ticketing information or calendar information.
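As one hedged illustration of deriving trip information from ticketing or calendar text, a simple pattern match could look like the following. The message format, the `TICKET_PATTERN` expression and the `parse_trip` helper are all assumptions made for the sketch; real ticket messages vary widely and a production parser would be considerably more involved.

```python
import re
from datetime import datetime

# Assumed message shape, e.g. "Flight Beijing -> Sanya on 2022-05-11 08:30".
TICKET_PATTERN = re.compile(
    r"(?P<dep>\w+)\s*->\s*(?P<arr>\w+)\s+on\s+(?P<time>\d{4}-\d{2}-\d{2} \d{2}:\d{2})"
)

def parse_trip(text: str):
    """Extract (departure place, arrival place, departure time) from a
    ticket-style message, as one possible source of the trip information."""
    m = TICKET_PATTERN.search(text)
    if not m:
        return None
    return (
        m.group("dep"),
        m.group("arr"),
        datetime.strptime(m.group("time"), "%Y-%m-%d %H:%M"),
    )
```

The same tuple could equally be filled from a calendar event's location and start-time fields; the embodiment only requires that a departure place, an arrival place and a time be obtainable.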
Optionally, the obtaining module 1410 is configured to: obtain a storage path of a target photo in a gallery from a database, wherein the target photo includes a photo taken by the user at the arrival place in a historical journey; and acquire the target photo from the gallery based on the storage path. The processing module 1420 is configured to: generate the first photo set based on the target photo.
Optionally, the aesthetic score of the target photograph is above a preset threshold.
Optionally, the processing module 1420 is configured to: the target photograph is determined based on the photographing location information of the plurality of photographs in the gallery.
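The selection of target photos described above — storage paths kept in a database, filtered by shooting location and by an aesthetic score above a preset threshold — can be sketched with an in-memory SQLite table. The schema, the threshold value of 0.6 and the helper names are assumptions for the sketch; the embodiment only requires that paths be retrievable by place and score.

```python
import sqlite3

AESTHETIC_THRESHOLD = 0.6  # assumed value; the embodiment only requires a preset threshold

def create_db():
    """In-memory stand-in for the terminal device's photo database."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE photos (path TEXT, location TEXT, aesthetic REAL)")
    return db

def select_target_paths(db, arrival_place: str):
    """Fetch storage paths of photos taken at the arrival place whose
    aesthetic score exceeds the preset threshold; a gallery lookup by
    those paths would then yield the photos for the first photo set."""
    rows = db.execute(
        "SELECT path FROM photos WHERE location = ? AND aesthetic > ?",
        (arrival_place, AESTHETIC_THRESHOLD),
    ).fetchall()
    return [r[0] for r in rows]
```

Querying the database rather than rescanning the gallery keeps the photo-set generation cheap enough to run before the departure time.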
Optionally, the obtaining module 1410 is configured to: and acquiring shooting location information of a plurality of photos in a gallery, wherein the plurality of photos in the gallery comprise target photos. The processing module 1420 is configured to: the shooting location information of the plurality of photos is stored in a database.
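Storing the shooting location information ahead of time can be sketched as a small indexing pass over the gallery. The `read_shooting_location` helper here is purely hypothetical — a real device would read EXIF GPS data and reverse-geocode it to a place name — and for this sketch the place is simply encoded in the file name so the example stays self-contained.

```python
import sqlite3

def read_shooting_location(photo_path: str) -> str:
    """Hypothetical stand-in for reading EXIF GPS data and reverse-geocoding
    it; for this sketch the place name is encoded in the file name."""
    return photo_path.rsplit("_", 1)[-1].removesuffix(".jpg")

def index_gallery(db, photo_paths):
    """Store each photo's shooting location in the database so that later
    journeys can look up target photos by place without rescanning the gallery."""
    db.execute("CREATE TABLE IF NOT EXISTS locations (path TEXT, location TEXT)")
    for path in photo_paths:
        db.execute(
            "INSERT INTO locations (path, location) VALUES (?, ?)",
            (path, read_shooting_location(path)),
        )
```

Running this once per new photo (or in a background batch) is what makes the path lookup in the previous step possible.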
Optionally, the obtaining module 1410 is configured to: the departure point of the second trip of the user is obtained. The processing module 1420 is configured to: and displaying a third notification message on the third interface, wherein the third notification message is used for notifying the user to view a second photo set associated with the departure place, and the second photo set is generated before the user departs from the departure place.
Optionally, the obtaining module 1410 is configured to: obtain the arrival place of the second journey. The processing module 1420 is configured to: judge whether the arrival place is the resident place of the user; and if the arrival place is the resident place of the user, determine to generate the second photo set associated with the departure place.
In an alternative example, it will be appreciated by those skilled in the art that the apparatus 1400 may be embodied as a terminal device in the above embodiment, or the functions of the terminal device in the above embodiment may be integrated in the apparatus 1400. The above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above. The apparatus 1400 may be configured to perform the various processes and/or steps corresponding to the terminal device in the method embodiments described above.
It should be appreciated that the apparatus 1400 herein is embodied in the form of functional modules. The term module herein may refer to an application specific integrated circuit (application specific integrated circuit, ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor, etc.) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an embodiment of the present application, the apparatus 1400 in fig. 14 may also be a chip or a system of chips, for example: system on chip (SoC).
Fig. 15 shows a schematic block diagram of another apparatus 1500 for pushing photos according to an embodiment of the present application. The apparatus 1500 includes a processor 1510, a transceiver 1520, and a memory 1530. Wherein the processor 1510, the transceiver 1520 and the memory 1530 communicate with each other through an internal connection path, the memory 1530 is for storing instructions, and the processor 1510 is for executing the instructions stored in the memory 1530 to control the transceiver 1520 to transmit signals and/or receive signals.
It should be understood that the apparatus 1500 may be specifically a terminal device in the foregoing embodiment, or the functions of the terminal device in the foregoing embodiment may be integrated in the apparatus 1500, and the apparatus 1500 may be configured to perform the steps and/or flows corresponding to the terminal device in the foregoing method embodiment. The memory 1530 may optionally include read only memory and random access memory, and provide instructions and data to the processor. A portion of the memory may also include non-volatile random access memory. For example, the memory may also store information of the device type. The processor 1510 may be configured to execute instructions stored in the memory, and when the processor executes the instructions, the processor may perform various steps and/or flows corresponding to the electronic device in the method embodiments described above.
It is to be appreciated that in embodiments of the application, the processor 1510 may be a central processing unit (central processing unit, CPU), which may also be other general purpose processors, digital signal processors (digital signal processing, DSP), application specific integrated circuits (application specific integrated circuit, ASIC), field programmable gate arrays (field programmable gate array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The application also provides a computer readable storage medium, wherein computer executable instructions are stored in the computer readable storage medium, and when the computer executable instructions are executed by a processor, the method executed by the terminal device in any method embodiment can be realized.
The embodiment of the application also provides a computer program product, which comprises a computer program, wherein the computer program can realize the method executed by the terminal equipment in any method embodiment when being executed by a processor.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor executes instructions in the memory to perform the steps of the method described above in conjunction with its hardware. To avoid repetition, a detailed description is not provided herein.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system, apparatus and module may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a specific implementation of the present application, but the scope of the embodiments of the present application is not limited thereto, and any person skilled in the art may easily think about changes or substitutions within the technical scope of the embodiments of the present application, and all changes and substitutions are included in the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.
Claims (8)
1. A method of pushing a photograph, comprising:
acquiring an arrival place of a first journey of a user;
displaying a first notification message on a first interface, the first notification message being for notifying a user to view a first set of photographs associated with the arrival location, the first set of photographs being generated before the user arrives at the arrival location;
before the first interface displays the first notification message, the method further comprises:
judging whether the arrival place is a resident place of a user or not;
determining to generate the first photo set associated with the arrival location if the arrival location is another location other than the resident place of the user;
after the obtaining the arrival of the first trip of the user, the method further comprises:
obtaining a storage path of a target photo in a gallery from a database, wherein the target photo comprises a photo taken by the user at the arrival place in a historical journey;
acquiring the target photo from the gallery based on the storage path;
generating the first photo set based on the target photo;
before the determining generates the first set of photographs associated with the arrival location, the method further comprises:
acquiring the departure time of the first journey;
the determining generates the first set of photographs associated with the arrival location, comprising:
generating the first set of photographs prior to the departure time;
the method further comprises the steps of:
acquiring a departure place of the first journey;
the displaying the first notification message on the first interface includes:
displaying the first notification message on the first interface before a user starts from the departure place;
the method further comprises the steps of:
acquiring a departure place of a second journey of a user;
displaying a third notification message at a third interface, the third notification message for notifying a user to view a second set of photographs associated with the departure place, the second set of photographs being generated before the user departs from the departure place;
Before the third interface displays the third notification message, the method further comprises:
acquiring the arrival place of the second journey;
judging whether the arrival place is a resident place of a user or not;
and if the arrival place is the resident place of the user, determining to generate the second photo set associated with the departure place.
2. The method of claim 1, wherein prior to the first interface displaying a first notification message, the method further comprises:
displaying a second notification message on a second interface, the second notification message being used to notify a user to view travel information, the travel information including one or more of: departure time, departure place and arrival place.
3. The method of claim 2, wherein the itinerary information is determined based on ticketing information or schedule information.
4. The method of claim 1, wherein the aesthetic score of the target photograph is above a preset threshold.
5. The method of claim 1 or 4, wherein prior to said retrieving the storage path of the target photograph in the gallery from the database, the method further comprises:
and determining the target photo based on the shooting location information of the plurality of photos in the gallery.
6. The method of claim 5, wherein prior to the determining the target photograph based on the shooting location information of the plurality of photographs in the gallery, the method further comprises:
acquiring shooting location information of a plurality of photos in the gallery, wherein the plurality of photos in the gallery comprise the target photo;
and storing the shooting location information of the plurality of photos to the database.
7. An apparatus for pushing a photograph, comprising: a processor and a memory, wherein,
the memory is used for storing a computer program;
the processor is configured to invoke and execute the computer program to cause the apparatus to perform the method of any of claims 1 to 6.
8. A computer readable storage medium for storing a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410074267.9A CN117909536A (en) | 2022-02-07 | 2022-05-11 | Photo pushing method and related device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210116581.X | 2022-02-07 | ||
CN202210116581 | 2022-02-07 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410074267.9A Division CN117909536A (en) | 2022-02-07 | 2022-05-11 | Photo pushing method and related device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115563338A CN115563338A (en) | 2023-01-03 |
CN115563338B true CN115563338B (en) | 2023-11-24 |
Family
ID=84738175
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410074267.9A Pending CN117909536A (en) | 2022-02-07 | 2022-05-11 | Photo pushing method and related device |
CN202210509811.9A Active CN115563338B (en) | 2022-02-07 | 2022-05-11 | Photo pushing method and related device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410074267.9A Pending CN117909536A (en) | 2022-02-07 | 2022-05-11 | Photo pushing method and related device |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN117909536A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108267142A (en) * | 2016-12-30 | 2018-07-10 | 上海博泰悦臻电子设备制造有限公司 | A kind of navigation display method based on address card, system and a kind of vehicle device |
JP2021021970A (en) * | 2019-07-24 | 2021-02-18 | スポットツアー株式会社 | Photobook production system, terminal, photobook production method, and program |
CN113741781A (en) * | 2021-06-15 | 2021-12-03 | 荣耀终端有限公司 | Notification display method and electronic equipment |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110301835A1 (en) * | 2010-06-07 | 2011-12-08 | James Bongiorno | Portable vacation/travel planner, and family tour guide device |
US9618343B2 (en) * | 2013-12-12 | 2017-04-11 | Microsoft Technology Licensing, Llc | Predicted travel intent |
-
2022
- 2022-05-11 CN CN202410074267.9A patent/CN117909536A/en active Pending
- 2022-05-11 CN CN202210509811.9A patent/CN115563338B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN115563338A (en) | 2023-01-03 |
CN117909536A (en) | 2024-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021213120A1 (en) | Screen projection method and apparatus, and electronic device | |
WO2021129688A1 (en) | Display method and related product | |
WO2020177619A1 (en) | Method, device and apparatus for providing reminder to charge terminal, and storage medium | |
WO2020029306A1 (en) | Image capture method and electronic device | |
WO2020140726A1 (en) | Photographing method and electronic device | |
WO2021004527A1 (en) | Countdown display method and electronic device | |
WO2021213031A1 (en) | Image synthesis method and related apparatus | |
WO2022022319A1 (en) | Image processing method, electronic device, image processing system and chip system | |
CN112923943A (en) | Auxiliary navigation method and electronic equipment | |
CN113472861B (en) | File transmission method and electronic equipment | |
CN111835904A (en) | Method for starting application based on context awareness and user portrait and electronic equipment | |
WO2021218837A1 (en) | Reminding method and related apparatus | |
CN115543145A (en) | Folder management method and device | |
CN114077713A (en) | Content recommendation method, electronic device and server | |
WO2023071441A1 (en) | Method and apparatus for displaying letters in contact list, and terminal device | |
CN115563338B (en) | Photo pushing method and related device | |
WO2020051916A1 (en) | Method for transmitting information and electronic device | |
CN115171073A (en) | Vehicle searching method and device and electronic equipment | |
CN114466100B (en) | Method, device and system for adapting accessory theme | |
CN111339513B (en) | Data sharing method and device | |
WO2024067328A1 (en) | Message processing method | |
WO2024021691A9 (en) | Display method and electronic device | |
CN116095219B (en) | Notification display method and terminal device | |
CN115297438B (en) | Express delivery prompt method, equipment and storage medium | |
CN115297530B (en) | Network connection method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||