CN113807055A - Method and apparatus for editing information - Google Patents

Method and apparatus for editing information

Info

Publication number
CN113807055A
CN113807055A
Authority
CN
China
Prior art keywords
information
travel
user
picture
identity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111106025.6A
Other languages
Chinese (zh)
Inventor
Han Yajuan (韩雅娟)
Chen Xiantao (陈宪涛)
Xu Meng (徐濛)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202111106025.6A priority Critical patent/CN113807055A/en
Publication of CN113807055A publication Critical patent/CN113807055A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00: Handling natural language data
    • G06F 40/10: Text processing
    • G06F 40/166: Editing, e.g. inserting or deleting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29: Geographical information databases
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00: Handling natural language data
    • G06F 40/10: Text processing
    • G06F 40/166: Editing, e.g. inserting or deleting
    • G06F 40/186: Templates

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present disclosure provides a method, an apparatus, an electronic device, a storage medium, and a computer program product for editing information, relating to the field of artificial intelligence and, in particular, to the field of map applications. The specific implementation scheme is as follows: acquiring navigation and positioning records; generating travel information according to the navigation and positioning records; acquiring phrase template information according to the types of the places in the travel information; and combining the phrase template information with the travel information to generate travel notes. The embodiment realizes automatic generation of travel notes, reduces the user's burden of recording and writing travel notes, and improves the user's stickiness and loyalty to the map.

Description

Method and apparatus for editing information
Technical Field
The present disclosure relates to the field of artificial intelligence, and more particularly to the field of map applications.
Background
When a user travels away from home, low familiarity with the area leads to a high dependence on maps: the user searches the map for POIs (Points of Interest) such as scenic spots, hotel locations, and restaurant locations, and uses the map for route planning and navigation. From these records it is possible to know when and where the user has been. These user "footprints" recorded away from home are often a unique experience for the user, yet after a long time the user does not necessarily remember each place and time of arrival.
Existing map platforms generate a user's footprint, or "light up" cities the user has visited, but this is merely a record of navigation mileage and of cities reached.
Disclosure of Invention
The present disclosure provides a method, apparatus, device, storage medium, and computer program product for editing information.
According to a first aspect of the present disclosure, there is provided a method for editing information, comprising: acquiring navigation and positioning records; generating travel information according to the navigation and positioning records; acquiring phrase template information according to the type of the place in the travel information; and combining the phrase template information with the travel information to generate travel notes.
According to a second aspect of the present disclosure, there is provided an apparatus for editing information, comprising: a first acquisition unit configured to acquire navigation and positioning records; a generation unit configured to generate travel information from the navigation and positioning records; a second acquisition unit configured to acquire phrase template information according to the type of the place in the travel information; and a synthesis unit configured to combine the phrase template information with the travel information to generate travel notes.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of the first aspect.
According to a fifth aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method of the first aspect.
According to the method and apparatus for editing information of the present disclosure, travel notes are generated and shared with one tap, based on the user's map usage records, album recognition, and travel-related information that the user is guided to actively supplement. By leveraging the functional advantages of the map, the user's burden of recording and drafting travel notes is reduced; at the same time, as a record of a period of the user's travel, the notes carry emotional meaning and improve the user's stickiness and loyalty to the map.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present disclosure may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for editing information, according to the present disclosure;
FIG. 3 is a flow diagram of yet another embodiment of a method for editing information according to the present disclosure;
FIG. 4 is a schematic diagram of one application scenario of a method for editing information according to the present disclosure;
FIG. 5 is a schematic diagram illustrating one embodiment of an apparatus for editing information according to the present disclosure;
FIG. 6 is a schematic block diagram of a computer system suitable for use with an electronic device implementing embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the disclosed method for editing information or apparatus for editing information may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. Various communication client applications, such as a map application, an album application, a web browser application, a shopping application, a search application, an instant messaging tool, a mailbox client, social platform software, etc., may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, and 103 may be hardware or software. When they are hardware, they may be various electronic devices having a display screen and supporting navigation and photographing functions, including but not limited to smartphones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III), MP4 players (Moving Picture Experts Group Audio Layer IV), laptop computers, desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they may be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
The server 105 may be a server that provides various services, such as a background map server that provides support for maps displayed on the terminal devices 101, 102, 103. The background map server may analyze and otherwise process data such as the received navigation request, and feed back a processing result (e.g., line information) to the terminal device.
The server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster formed by multiple servers, or as a single server. When the server is software, it may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is made here. The server may also be a server of a distributed system, or a server incorporating a blockchain. The server can also be a cloud server, or an intelligent cloud computing server or intelligent cloud host with artificial intelligence technology.
It should be noted that the method for editing information provided by the embodiments of the present disclosure is generally executed by the terminal devices 101, 102, 103, and accordingly, the apparatus for editing information is generally set in the terminal devices 101, 102, 103.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for editing information in accordance with the present disclosure is shown. The method for editing information comprises the following steps:
step 201, navigation and positioning records are obtained.
In the present embodiment, an execution subject of the method for editing information (e.g., a terminal device shown in fig. 1) may obtain map navigation information through a server. The map APP running on the terminal device can determine the user's usual residence from the user's positioning records. When the map APP detects that the user is away from the usual residence, it can record the navigation and positioning used in that area to form a navigation and positioning record: for example, the source address (manually entered or determined by positioning), destination address, start time, and arrival time of each navigation session, as well as the location and time of each positioning fix. Here, the source address, the destination address, and the positioned locations are all POIs.
It should be noted that the navigation and positioning record in this embodiment is not a navigation and positioning record for a specific user, and cannot reflect personal information of a specific user.
In this embodiment, the navigation and positioning records are obtained by authorization of the user corresponding to the navigation and positioning records.
In this embodiment, the execution subject for editing information may obtain the navigation and positioning records in various public and legal compliance manners, for example, the navigation and positioning records may be obtained from public data sets or obtained from users after authorization of the users.
And 202, generating travel information according to the navigation and positioning records.
In this embodiment, the POI places reached by the user are chained together on a time line using the user's navigation and positioning records, forming travel information composed of times and places. A "reached" POI here means a place where the user stayed for more than a predetermined time (e.g., 5 minutes), not a place the user merely passed by. The travel information may be, for example: 8:05 the National Museum, 12:24 XX restaurant, 13:15 the Imperial Palace, 18:14 YY hotel.
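A minimal sketch of this step, under the assumption of a hypothetical record shape carrying arrival and departure times per POI; places where the stay is shorter than the threshold are treated as pass-bys and dropped:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class PoiVisit:
    name: str
    arrive: datetime
    leave: datetime

def build_itinerary(records, min_stay_minutes=5):
    """Keep only POIs where the user stayed at least `min_stay_minutes`,
    then order them on a time line."""
    stays = [r for r in records
             if r.leave - r.arrive >= timedelta(minutes=min_stay_minutes)]
    return sorted(stays, key=lambda r: r.arrive)

records = [
    PoiVisit("National Museum", datetime(2021, 10, 8, 8, 5), datetime(2021, 10, 8, 11, 50)),
    PoiVisit("bus stop", datetime(2021, 10, 8, 12, 0), datetime(2021, 10, 8, 12, 2)),
    PoiVisit("XX restaurant", datetime(2021, 10, 8, 12, 24), datetime(2021, 10, 8, 13, 0)),
]
itinerary = build_itinerary(records)
print([v.name for v in itinerary])  # the 2-minute bus stop is filtered out
```

The 5-minute threshold matches the example in the text; the record fields are assumptions for illustration.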
Step 203, obtaining the conversational information according to the type of the place in the journey information.
In this embodiment, the types of places (POIs) include, but are not limited to, restaurants, park attractions, shopping malls, hotels, museums/science museums, and the like. Corresponding phrase template information is preset for different POI types; for example, a restaurant corresponds to "went to xx restaurant and had a delicious dinner". Which meal it was can be determined by the time the user arrived at the restaurant: between 5:00 pm and 8:00 pm, for example, the meal may be labeled "dinner", while between 8:00 pm and 2:00 am it may be labeled a late-night meal. Phrases can also be set according to the characteristics of the restaurant. For example, for a Thai restaurant, the phrase may be "went to the Southeast Asian restaurant xx and ate characteristic Thai dishes".
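The mapping from POI type and arrival time to a phrase can be sketched as a small lookup; the type names, template strings, and exact hour thresholds below are illustrative stand-ins, not taken verbatim from the patent:

```python
def meal_label(hour):
    """Label the meal by arrival hour: roughly 17:00-20:00 is dinner,
    20:00-02:00 a late-night snack (thresholds as described in the text)."""
    if 17 <= hour < 20:
        return "dinner"
    if hour >= 20 or hour < 2:
        return "late-night snack"
    return "meal"

# Hypothetical phrase templates keyed by POI type.
TEMPLATES = {
    "restaurant": "went to {name} and had a delicious {meal}",
    "attraction": "visited {name}",
    "hotel": "stayed at {name}",
}

def phrase_for(poi_type, name, hour=0):
    tpl = TEMPLATES.get(poi_type, "went to {name}")
    # str.format ignores unused keyword arguments, so every template works.
    return tpl.format(name=name, meal=meal_label(hour))

print(phrase_for("restaurant", "XX restaurant", hour=18))
```

New POI types are supported by adding one template entry, which is presumably why the patent pre-sets phrases per type rather than generating free text.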
And step 204, combining the dialoging information with the travel information to generate the travel notes.
In this embodiment, travel notes are generated by using the phrase templates for the different POI types to string the travel information together, for example: "On xx month xx, xxxx you stayed at xx hotel, visited xx attraction for 4 hours, went to xx restaurant and had a delicious dinner", and so on.
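Stringing the per-POI phrases into one narrative line then reduces to a join; a sketch, with the date wording purely illustrative:

```python
def compose_travel_note(date_str, phrases):
    # Join the phrases produced for each stop into one narrative sentence.
    if not phrases:
        return ""
    return f"On {date_str} you " + ", ".join(phrases) + "."

note = compose_travel_note(
    "xx month xx, xxxx",
    ["stayed at xx hotel",
     "visited xx attraction for 4 hours",
     "went to xx restaurant and had a delicious dinner"],
)
print(note)
```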
The generated travel notes may be converted to a downloadable file in a specified format (e.g., PDF). A link to the travel notes may also be generated and then shared with one tap to a specified platform, e.g., the WeChat circle of friends.
The user can actively issue an instruction for the map APP to generate the travel notes. The map APP may also automatically add new content to the travel notes each time the user reaches a POI. The user can likewise actively issue an instruction to publish the travel notes; for example, a user may publish the generated travel notes each day after returning to the hotel. The map APP can also remind the user to publish the travel notes after detecting that the user has returned to the usual residence.
The method provided by the embodiments of the present disclosure automatically generates travel notes from the user's map usage records away from home, recording specific POI information at a fine granularity; this serves as a reference for other users and readily evokes the user's fond memories. By leveraging the functional advantages of the map, the user's burden of recording and drafting travel notes is reduced; at the same time, as a record of a period of the user's travel away from home, the notes carry emotional meaning and improve the user's stickiness and loyalty to the map.
With further reference to FIG. 3, a flow 300 of yet another embodiment of a method for editing information is shown. The process 300 of the method for editing information includes the steps of:
step 301, navigation and positioning records are obtained.
Step 302, generating travel information according to the navigation and positioning records.
Step 303, obtaining phrase template information according to the type of the place in the travel information.
Step 304, combining the phrase template information with the travel information to generate travel notes.
The steps 301-304 are substantially the same as the steps 201-204, and therefore will not be described again.
And 305, selecting pictures from the photo album according to the travel information.
In this embodiment, it should be noted that the acquisition of the album in this embodiment is authorized by the user corresponding to the album.
With the user's authorization, album recognition and synchronization permission is obtained, and pictures in the phone album are retrieved according to time and POI place. The album of the terminal device records the time and place at which each picture was taken; after the user authorizes it, the shot pictures carry this time and place, so pictures matching the times and places in the travel information can be found in the album. If there are multiple pictures from the same time period (e.g., within 1 minute) and the same place, a similarity check may be performed; for a group of pictures whose mutual similarity is greater than a predetermined value, only the one with the best image quality (obtained by a common image quality detection algorithm) may be selected, usually the last picture. For example, when a user takes a picture, other people may appear in the background, and the user retakes the picture until a satisfactory one is obtained, so the last picture taken at the same time and place is usually the most satisfactory.
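The match-then-deduplicate logic described above can be sketched as follows; the photo dictionary fields (`poi`, `ts`, `quality`) and the one-minute retake window are assumptions for illustration, not the patent's actual data model:

```python
from collections import defaultdict

def select_pictures(photos, poi, start_ts, end_ts, window_s=60):
    # 1) Keep photos taken at this POI during the stay.
    matched = [p for p in photos
               if p["poi"] == poi and start_ts <= p["ts"] <= end_ts]
    # 2) Photos inside the same short window count as retakes of one shot;
    #    keep only the highest-quality one per window.
    groups = defaultdict(list)
    for p in matched:
        groups[p["ts"] // window_s].append(p)
    return [max(g, key=lambda p: p["quality"]) for g in groups.values()]

photos = [
    {"poi": "Imperial Palace", "ts": 100, "quality": 0.4},
    {"poi": "Imperial Palace", "ts": 110, "quality": 0.9},  # retake, better shot
    {"poi": "XX restaurant",  "ts": 500, "quality": 0.8},   # different POI
]
picks = select_pictures(photos, "Imperial Palace", 0, 300)
print(picks)  # only the higher-quality retake survives
```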
The terminal equipment can automatically select pictures for direct use, and can also provide a picture list for a user to select which pictures are inserted into the travel notes.
Alternatively, a video may be selected from the album and inserted into the travel notes. The travel notes can be in multimedia form, including sound, images, video, and text.
Step 306, insert the picture into the travel note.
In this embodiment, the picture can be inserted directly at the end of the travel notes; this applies especially when the textual description part is short.
Optionally, the corresponding textual description in the generated travel notes is found according to the place recorded on the picture, and the picture is inserted there rather than at a random position. For example: "On xx month xx, xxxx you stayed at xx hotel (hotel picture inserted), visited xx attraction for 4 hours (scenery picture inserted), went to xx restaurant and had a delicious dinner (dish picture inserted)", and so on.
Optionally, the official website of the place, or its profile on a review APP, can be obtained according to the shooting place. The picture is matched against the pictures in the profile to find a similar one, thereby obtaining a picture description. For example, when the user photographs a snack at the Duyichu snack shop, the map APP can match the snack picture against Duyichu's pictures on the XX review APP; if the picture whose similarity exceeds a predetermined value is one labeled "shaomai", it can be determined that the user ate shaomai, and the travel notes can be modified to "ate delicious shaomai at Duyichu". The user may manually modify the automatically generated travel notes.
In some optional implementations of this embodiment, the method further includes: identifying the identity of the person in the picture; and, if the identity is not the user himself, adding the person to the travel notes as a companion. Whether the picture shows the user himself can be recognized by an image recognition method (the user registers a picture of himself in advance for identification). When other people are identified in the picture, the user can be guided to supplement the person's title, so that the overview of the travel notes includes information on who accompanied the trip. For example, the user is prompted: "There is an unknown person in the photograph, please add a title." If the user adds the title "boss", the travel notes can read "visited the xx attraction for 4 hours with the boss". This enriches the travel information and reflects the travel process more truly.
In some optional implementations of this embodiment, the method further includes: checking whether at least one of the following is missing from the travel notes: person, time, place; outputting information guiding the user to add the missing item; and editing the travel notes according to the information added by the user. Entities in the travel notes can be identified by natural language processing methods; for example, if no person can be extracted from the travel notes, the person is considered missing. Complete travel notes require time, place, and person, so the user needs to be guided and prompted to refine and supplement such important information as times, POIs, and people. For example, if only the time of entering an attraction is detected, and the user left the attraction without navigation so that the departure time cannot be determined, the user may be guided to supplement the time of leaving the attraction. This enriches the travel notes, makes their content more complete, and helps the user recall what was missed.
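A sketch of the completeness check, assuming an upstream NLP extractor has already produced an entity dictionary; the field names and prompt wording are illustrative:

```python
REQUIRED = ("person", "time", "place")

def missing_fields(entities):
    # entities: e.g. {"time": ["8:40"], "place": ["xx attraction"], "person": []}
    return [f for f in REQUIRED if not entities.get(f)]

def prompts_for(entities):
    # One guidance message per missing required field.
    return [f"Please add the missing {f} to your travel notes."
            for f in missing_fields(entities)]

extracted = {"time": ["8:40"], "place": ["xx attraction"], "person": []}
print(prompts_for(extracted))
```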
In some optional implementations of this embodiment, identifying the identity of the person in the picture includes: matching the picture against face images of known identity; if the matching degree is greater than a preset value, the identity of the person in the picture is the identity corresponding to the successfully matched face image. If the user has labeled the titles of some people in the album in advance, faces can be matched against those labeled images, and the person's title determined from the matched face image. For example, the picture of unknown identity is matched against a picture of known identity ("old man") in the album; if the matching degree exceeds 90%, the match succeeds, and the person in the picture can be determined to be the user's "old man". If no successfully matched image is found, the user is prompted to label the identity manually. This method adds titles automatically, without manual addition by the user, improving convenience.
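The thresholded matching can be sketched with face embeddings and cosine similarity; the embedding vectors and the 0.9 threshold here are illustrative stand-ins for whatever face model and preset value an implementation would actually use:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(face_vec, known, threshold=0.9):
    """known: {title: embedding}. Return the best-matching title if its
    similarity exceeds the threshold, else None (prompt for a manual label)."""
    best = max(known, key=lambda k: cosine(face_vec, known[k]))
    return best if cosine(face_vec, known[best]) > threshold else None

known = {"old man": [0.9, 0.1, 0.0], "friend": [0.0, 1.0, 0.0]}
print(identify([0.88, 0.12, 0.01], known))   # very close to "old man"
print(identify([0.5, 0.5, 0.5], known))      # ambiguous, below threshold
```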
In some optional implementations of this embodiment, identifying the identity of the person in the picture includes: determining the identity of the person in the picture from the map's group travel information. Companions can be identified through functions such as the map's grouping and group travel features: each companion has an ID or nickname when logging into the map APP, which can be used to represent that user's identity. This method adds titles automatically, without manual addition by the user, improving convenience.
In some optional implementations of this embodiment, the method further includes: outputting information guiding the user to add evaluations and remarks; if the user inputs voice, converting the voice into text, adding it to the travel notes, and storing the voice in association with the travel notes; if the user inputs text, adding the text to the travel notes. The user is guided to evaluate and annotate the journey, and may also choose to generate the notes directly without adding anything; both text and voice input are supported. If the user inputs voice, the voice is converted into text and added to the travel notes, while the voice itself is also retained in the notes, with playback controlled through a key. This enriches the content of the travel notes, and other people reading them can also benefit.
In some optional implementations of this embodiment, the method further includes: extracting a summary from the travel notes; and publishing the summary and the travel notes to a specified location. After recognizing that the user has returned to the usual residence, the travel is by default considered ended, and a summary of the travel notes is extracted, which may include: time, duration (x days x nights), the number of cities and the route, and who went along. Summary extraction can be realized by a common text summarization model. After the summary is extracted, the user can be reminded that the system has finished generating the illustrated travel notes, and the user can share them with one tap to a specific user or to the circle of friends, or manually modify them before publishing. The user can then publish the travel notes to a specified location (circle of friends, microblog, etc.) that displays the summary content, which can link to the web page hosting the travel notes. Downloading the travel notes in PDF form is also supported. This makes it convenient for the user to store and share the travel notes and to share joys and memories.
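The summary fields named above (duration, city, companions) can be assembled as below; the trip dictionary shape and the summary wording are assumptions for illustration:

```python
from datetime import date

def summarize(trip):
    # Duration in the "x days, x nights" style used by the text.
    days = (trip["end"] - trip["start"]).days + 1
    nights = days - 1
    who = " and ".join(trip["companions"]) or "no companions"
    return (f"{trip['city']} {days}-day {nights}-night tour, "
            f"{trip['start']:%Y-%m-%d} to {trip['end']:%Y-%m-%d}, with {who}")

summary = summarize({"city": "Beijing",
                     "start": date(2021, 10, 8),
                     "end": date(2021, 10, 10),
                     "companions": ["the old man"]})
print(summary)
```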
As can be seen from fig. 3, compared with the embodiment corresponding to fig. 2, the flow 300 of the method for editing information in the present embodiment embodies the step of adding pictures to the travel notes. The scheme described in this embodiment can therefore generate richly illustrated travel notes that are more vivid and of commemorative value.
With continued reference to fig. 4, fig. 4 is a schematic diagram of an application scenario of the method for editing information according to the present embodiment. In the application scenario of fig. 4, when the user uses the map APP, the map APP recognizes that the user has moved from Hangzhou, the usual residence, to Beijing. Presumably the user is traveling in Beijing and needs help recording the trip. Through the user's navigation and positioning records during the stay in Beijing, the POI places reached by the user are chained together on a time line to form a schedule composed of times and places, for example: 5:00 xx Square, 7:02 Qianmen snack street, 8:40 the National Museum, and so on. Story-style concatenation is then performed according to the different POI types. For example: "On xx month xx, xxxx I watched the flag-raising ceremony at xx Square at 5:00, then enjoyed a delicious breakfast in Qianmen snack street at 7:02, and at 8:40 entered the National Museum for a visit." With the user's authorization, the phone can also obtain album recognition and synchronization permission, retrieve phone album pictures according to time and POI place, and supplement them into the travel notes, for example a picture or video of the national flag, a picture of the shaomai, and a picture of the cultural relics. When other people are identified in the album, the user may be guided to supplement the person's title, so that the overview of the travel notes includes information on who accompanied the trip. For example, if the user is guided to add the companion's title in a shaomai photo and writes "the old man", the corresponding content in the travel notes may be modified to: "At 7:02, the old man and I enjoyed a delicious breakfast in Qianmen snack street." The user can also be guided to refine the evaluation of a POI, for example to rate Qianmen snack street.
The user may rate it in voice or text form. The map APP may convert the speech into text and add it to the travel notes; the voice is also retained and can be played via a play key on the travel notes interface. After recognizing that the user has returned to Hangzhou, the usual residence, the travel is by default considered ended, and the summary of the travel notes is extracted (e.g., October 2021, a three-day Beijing tour with the old man). The user may share the travel notes with one tap to a specific user or to the circle of friends, or convert them into PDF form and download them locally.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an apparatus for editing information, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 5, the apparatus 500 for editing information of the present embodiment includes: a first acquisition unit 501, a generation unit 502, a second acquisition unit 503, and a synthesis unit 504. The first acquisition unit 501 is configured to acquire navigation and positioning records; the generation unit 502 is configured to generate travel information from the navigation and positioning records; the second acquisition unit 503 is configured to acquire phrase template information according to the type of the place in the travel information; and the synthesis unit 504 is configured to combine the phrase template information with the travel information to generate travel notes.
In the present embodiment, for the specific processing of the first acquisition unit 501, the generation unit 502, the second acquisition unit 503, and the synthesis unit 504 of the apparatus 500 for editing information, reference may be made to steps 201, 202, 203, and 204 in the embodiment corresponding to fig. 2.
In some optional implementations of this embodiment, the apparatus 500 further comprises an insertion unit (not shown in the drawings) configured to: selecting pictures from the photo album according to the travel information; inserting the picture into the travel note.
In some optional implementations of this embodiment, the apparatus 500 further comprises an identification unit (not shown in the drawings) configured to: identify the identity of the person in a picture, and if the identity is not the user himself or herself, add the person to the travel notes as a companion.
In some optional implementations of this embodiment, the apparatus 500 further comprises a guiding unit (not shown in the drawings) configured to: check whether at least one of the following is missing from the travel notes: person, time, place; output information guiding the user to add the missing item; and edit the travel notes according to the information added by the user.
In some optional implementations of this embodiment, the identification unit is further configured to: match the picture against face images of known identity, and if the matching degree is greater than a preset value, take the identity of the person in the picture to be the identity corresponding to the successfully matched face image.
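This threshold-based matching can be sketched with toy face embeddings. A real identification unit would obtain embeddings from a face-recognition model; the vectors, identities, and threshold below are invented for illustration.

```python
# Hedged sketch of the identification unit: compare a face embedding from a
# picture against embeddings of known identities; accept the best match only
# if its similarity exceeds a preset value.

def cosine(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def identify(face_vec, known_faces, threshold=0.8):
    """Return the identity of the best match above threshold, else None."""
    best_id, best_score = None, threshold
    for identity, vec in known_faces.items():
        score = cosine(face_vec, vec)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

known = {"user": [1.0, 0.0, 0.1], "grandpa": [0.1, 1.0, 0.0]}
print(identify([0.12, 0.98, 0.02], known))  # 'grandpa' -> added as companion
print(identify([0.5, 0.5, 0.5], known))     # below threshold -> None
```

When `identify` returns an identity other than the user's own, that person would be added to the travel notes as a companion, as described above.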
In some optional implementations of this embodiment, the identification unit is further configured to: determine the identity of the person in the picture according to group-travel information of the map APP.
In some optional implementations of this embodiment, the apparatus 500 further comprises an editing unit (not shown in the drawings) configured to: output information guiding the user to add evaluations and remarks; if the user inputs voice, convert the voice into text, add the text to the travel notes, and store the voice in association with the travel notes; and if the user inputs text information, add the text information to the travel notes.
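A sketch of this branching behavior, where `transcribe` is a hypothetical stand-in for a real speech-to-text service and the note structure is invented for illustration:

```python
# Illustrative editing unit: a voice evaluation is transcribed and appended
# to the note while the audio is stored in association with it; a text
# evaluation is appended directly.

def transcribe(audio_bytes):
    """Placeholder for a real ASR call; the output here is hard-coded."""
    return "The Great Wall was spectacular."

def add_evaluation(note, user_input, kind):
    if kind == "voice":
        text = transcribe(user_input)
        note["text"] += "\n" + text
        # keep the audio associated with the note so it can be played back
        note["attachments"].append({"audio": user_input, "caption": text})
    else:  # plain text input
        note["text"] += "\n" + user_input
    return note

note = {"text": "Day 1: Badaling.", "attachments": []}
note = add_evaluation(note, b"<opus bytes>", "voice")
note = add_evaluation(note, "Crowded but worth it.", "text")
print(note["text"])
print(len(note["attachments"]))  # 1
```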
In some optional implementations of this embodiment, the apparatus 500 further comprises a publishing unit (not shown in the drawings) configured to: extract an abstract from the travel notes, and publish the abstract and the travel notes to a specified location.
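A sketch of abstract extraction and publishing. The extraction strategy here (first sentence plus date range) and the destination names are invented for illustration; the patent does not specify a summarization algorithm.

```python
# Hypothetical publishing unit: derive a short abstract from the travel
# note and publish both to a destination.

def extract_abstract(note_text, date_range, max_len=60):
    """Naive abstract: date range plus the note's first sentence."""
    first = note_text.split(".")[0].strip()
    return (date_range + ": " + first)[:max_len]

def publish(abstract, note_text, destination):
    """Stand-in for posting to a friend circle, a user, or PDF export."""
    return {"to": destination, "abstract": abstract, "body": note_text}

note_text = "Three-day Beijing tour with grandpa. Day 1: Forbidden City..."
abstract = extract_abstract(note_text, "Oct 8-14, 2021")
post = publish(abstract, note_text, "friend-circle")
print(post["abstract"])
```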
In the technical solutions of the present disclosure, the collection, storage, use, processing, transmission, provision, and disclosure of the personal information of the users involved all comply with the provisions of relevant laws and regulations and do not violate public order and good morals.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of flows 200 and 300.
A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of flows 200 and 300.
A computer program product comprising a computer program which, when executed by a processor, implements the methods of flows 200 and 300.
FIG. 6 illustrates a schematic block diagram of an example electronic device 600 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the device 600 includes a computing unit 601, which can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 602 or a computer program loaded from a storage unit 608 into a random access memory (RAM) 603. The RAM 603 can also store various programs and data required for the operation of the device 600. The computing unit 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
A number of components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, a mouse, or the like; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 601 may be any of a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, or microcontroller. The computing unit 601 performs the respective methods and processes described above, such as the method for editing information. For example, in some embodiments, the method for editing information may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the computing unit 601, one or more steps of the method for editing information described above may be performed. Alternatively, in other embodiments, the computing unit 601 may be configured by any other suitable means (e.g., by means of firmware) to perform the method for editing information.
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server with a combined blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (19)

1. A method for editing information, comprising:
acquiring navigation and positioning records;
generating travel information according to the navigation and positioning records;
acquiring verbal information according to the type of place in the travel information;
combining the verbal information with the travel information to generate a travel note.
2. The method of claim 1, wherein the method further comprises:
selecting pictures from the photo album according to the travel information;
inserting the picture into the travel note.
3. The method of claim 2, wherein the method further comprises:
identifying the identity of the person in the picture;
and if the identity is not the user himself or herself, adding the person to the travel notes as a companion.
4. The method of claim 3, wherein the method further comprises:
checking whether at least one of the following is missing from the travel notes: person, time, place;
outputting information for guiding a user to add the missing item;
and editing the travel notes according to the information added by the user.
5. The method of claim 3, wherein said identifying the identity of the person in the picture comprises:
matching the picture with a human face image with a known identity;
and if the matching degree is greater than a preset value, taking the identity of the person in the picture as the identity corresponding to the successfully matched face image.
6. The method of claim 3, wherein said identifying the identity of the person in the picture comprises:
and determining the identity of the person in the picture according to group-travel information of the map APP.
7. The method of claim 1, wherein the method further comprises:
outputting information for guiding a user to add evaluation and remarks;
if the user inputs voice, converting the voice into text and adding the text into the travel notes, and storing the voice and the travel notes in an associated manner;
and if the user inputs text information, adding the text information into the travel notes.
8. The method according to any one of claims 1-7, wherein the method further comprises:
extracting an abstract from the travel notes;
the summary and the travel notes are published to a specified location.
9. An apparatus for editing information, comprising:
a first acquisition unit configured to acquire a navigation and positioning record;
a generating unit configured to generate trip information from the navigation and positioning record;
a second acquisition unit configured to acquire verbal information according to the type of place in the travel information;
a composition unit configured to combine the verbal information with the travel information to generate a travel note.
10. The apparatus of claim 9, wherein the apparatus further comprises an insertion unit configured to:
selecting pictures from the photo album according to the travel information;
inserting the picture into the travel note.
11. The apparatus of claim 10, wherein the apparatus further comprises an identification unit configured to:
identifying the identity of the person in the picture;
and if the identity is not the user himself or herself, adding the person to the travel notes as a companion.
12. The apparatus of claim 11, wherein the apparatus further comprises a guiding unit configured to:
checking whether at least one of the following is missing from the travel notes: person, time, place;
outputting information for guiding a user to add the missing item;
and editing the travel notes according to the information added by the user.
13. The apparatus of claim 11, wherein the identification unit is further configured to:
matching the picture with a human face image with a known identity;
and if the matching degree is greater than a preset value, taking the identity of the person in the picture as the identity corresponding to the successfully matched face image.
14. The apparatus of claim 11, wherein the identification unit is further configured to:
and determining the identity of the person in the picture according to group-travel information of the map APP.
15. The apparatus of claim 9, wherein the apparatus further comprises an editing unit configured to:
outputting information for guiding a user to add evaluation and remarks;
if the user inputs voice, converting the voice into text and adding the text into the travel notes, and storing the voice and the travel notes in an associated manner;
and if the user inputs text information, adding the text information into the travel notes.
16. The apparatus according to any one of claims 9-15, wherein the apparatus further comprises a publishing unit configured to:
extracting an abstract from the travel notes;
the summary and the travel notes are published to a specified location.
17. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein the content of the first and second substances,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
18. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-8.
19. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-8.
CN202111106025.6A 2021-09-22 2021-09-22 Method and apparatus for editing information Pending CN113807055A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111106025.6A CN113807055A (en) 2021-09-22 2021-09-22 Method and apparatus for editing information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111106025.6A CN113807055A (en) 2021-09-22 2021-09-22 Method and apparatus for editing information

Publications (1)

Publication Number Publication Date
CN113807055A true CN113807055A (en) 2021-12-17

Family

ID=78939807

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111106025.6A Pending CN113807055A (en) 2021-09-22 2021-09-22 Method and apparatus for editing information

Country Status (1)

Country Link
CN (1) CN113807055A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103246710A (en) * 2013-04-22 2013-08-14 张经纶 Method and device for automatically generating multimedia travel notes
CN104902083A (en) * 2015-05-08 2015-09-09 惠州Tcl移动通信有限公司 Electronic travel note generating method and system based on mobile terminal
US20190147042A1 (en) * 2017-11-14 2019-05-16 Microsoft Technology Licensing, Llc Automated travel diary generation
CN110019599A (en) * 2017-10-13 2019-07-16 阿里巴巴集团控股有限公司 Obtain method, system, device and the electronic equipment of point of interest POI information
CN110245339A (en) * 2019-06-20 2019-09-17 北京百度网讯科技有限公司 Article generation method, device, equipment and storage medium
CN111666430A (en) * 2020-06-08 2020-09-15 东风汽车有限公司 Vehicle-mounted travel note system and method for generating travel notes
CN112069347A (en) * 2020-09-04 2020-12-11 中国平安人寿保险股份有限公司 Travel note generation method and device, computer equipment and readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHENG Zhiwei: "Research and Design of an Intelligent Scenic-Area Guide System in the Context of Smart Tourism", Journal of Yangzhou University (Humanities and Social Sciences Edition), no. 02 *

Similar Documents

Publication Publication Date Title
US10424290B2 (en) Cross device companion application for phone
US11580993B2 (en) Keyword determinations from conversational data
US11146520B2 (en) Sharing images and image albums over a communication network
JP6784308B2 (en) Programs that update facility characteristics, programs that profile facilities, computer systems, and how to update facility characteristics
US11392896B2 (en) Event extraction systems and methods
CN113111026A (en) Gallery of messages with shared interests
US9530067B2 (en) Method and apparatus for storing and retrieving personal contact information
JP2019220194A (en) Information processing device, information processing method and program
US11328716B2 (en) Information processing device, information processing system, and information processing method, and program
CN105359087A (en) Auto-calendaring
KR102637042B1 (en) Messaging system for resurfacing content items
US11769500B2 (en) Augmented reality-based translation of speech in association with travel
EP2896162A1 (en) Determining additional information associated with geographic location information
CN105874452A (en) Point of interest tagging from social feeds
CN113850083A (en) Method, device and equipment for determining broadcast style and computer storage medium
CN113807055A (en) Method and apparatus for editing information
CN108153785A (en) The method and apparatus of generation displaying information
JP6709709B2 (en) Information processing apparatus, information processing system, information processing method, and program
JP2022021316A (en) Information processing device, information processing method and information processing system
JP6063697B2 (en) Apparatus, method and program for image display
CN113220816A (en) Data processing method, device and equipment for POI (Point of interest) of electronic map
JPWO2019098036A1 (en) Information processing equipment, information processing terminals, and information processing methods
KR101918747B1 (en) Server system and computer program for driving the server system
Mazur Catholic Cathedrals in Europe and Representation of Heritage via Mobile Applications
WO2015133009A1 (en) Information processing system, information processing method, and information processing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination