CN111638831A - Content fusion method and device and electronic equipment - Google Patents

Content fusion method and device and electronic equipment

Info

Publication number
CN111638831A
Authority
CN
China
Prior art keywords
content
file
matching degree
determining
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010478325.6A
Other languages
Chinese (zh)
Other versions
CN111638831B (en)
Inventor
帅广应 (Shuai Guangying)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202010478325.6A priority Critical patent/CN111638831B/en
Publication of CN111638831A publication Critical patent/CN111638831A/en
Application granted granted Critical
Publication of CN111638831B publication Critical patent/CN111638831B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/10 Text processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application provides a content fusion method and device and an electronic device, relates to the field of communication technologies, and aims to solve the problems of a tedious and complicated content comparison process and poor human-computer interaction performance. The scheme includes the following steps: determining a first file and a second file to be fused; determining first content and second content to be fused according to a first matching degree between the contents of the first file and the second file; and setting the first content and the second content in adjacent content areas according to a preset fusion mode. The first content is content in the first file, the second content is content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold. The method is applied to a scene in which an electronic device fuses the contents of a plurality of files.

Description

Content fusion method and device and electronic equipment
Technical Field
The embodiment of the application relates to the technical field of communication, in particular to a content fusion method and device and electronic equipment.
Background
Currently, a user can directly read contents such as electronic books, papers, and blogs through related applications (e.g., reading-type applications, browsing-type applications, etc.) in an electronic device.
For example, suppose the user needs to read electronic book 1 through a reading application. The user may click on electronic book 1 (specifically, an identifier indicating electronic book 1) in the main interface of the reading application to trigger the electronic device to display the content of electronic book 1. In this case, if the user needs to compare some content in electronic book 1 with related content in electronic book 2, the user must first trigger the electronic device to return to the main interface and click on electronic book 2 to trigger the electronic device to display the content of electronic book 2; the user may then trigger the electronic device to switch to the chapters of electronic book 2 that include the related content. After reading them, the user must trigger the electronic device to return to the main interface yet again, and click on electronic book 1 once more to trigger the electronic device to continue displaying the content of electronic book 1. The user thus has to trigger the electronic device to switch back and forth between the two electronic books to compare their related content, which makes the content comparison process tedious and complicated and results in poor human-computer interaction performance.
Disclosure of Invention
The embodiment of the application provides a content fusion method and device and an electronic device, which can solve the problems of a tedious content comparison process and poor human-computer interaction performance.
In order to solve the technical problem, the present application is implemented as follows:
In a first aspect, an embodiment of the present application provides a content fusion method. The method includes: determining a first file and a second file to be fused; determining first content and second content to be fused according to a first matching degree between the contents of the first file and the second file; and setting the first content and the second content in adjacent content areas according to a preset fusion mode. The first content is content in the first file, the second content is content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold.
In a second aspect, an embodiment of the present application provides a content fusion apparatus, including a determining module and a setting module. The determining module is configured to determine a first file and a second file to be fused, and to determine, according to a first matching degree between the contents of the first file and the second file, first content and second content to be fused. The setting module is configured to set the first content and the second content in adjacent content areas according to a preset fusion mode. The first content is content in the first file, the second content is content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a computer program stored in the memory and executable on the processor, and when the computer program is executed by the processor, the steps of the content fusion method in the first aspect may be implemented.
In a fourth aspect, the present application provides a readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, may implement the steps of the content fusion method of the first aspect.
In the embodiment of the application, the electronic device may determine a first file and a second file to be fused; determine, according to a first matching degree between the contents of the first file and the second file, first content and second content to be fused; and set the first content and the second content in adjacent content areas according to a preset fusion mode, where the first content is content in the first file, the second content is content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold. With this scheme, the electronic device can place, in adjacent content areas, the first content in the first file together with the second content in the second file whose matching degree with the first content is greater than or equal to the preset threshold, so that the user can view the first content and the second content in adjacent content areas. Moreover, the greater the matching degree between two pieces of content, the higher their degree of correlation; that is, the first content and the second content are highly correlated. The user can therefore view highly correlated content in adjacent content areas, and the related content of the first file and the second file can be compared without triggering the electronic device to switch between the two files, which simplifies the content comparison process and improves human-computer interaction performance.
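The fusion steps of the first aspect can be sketched as follows. This is a hypothetical illustration only: the application does not specify how the matching degree is computed, so a simple Jaccard word-overlap metric stands in for it here, and all names (`matching_degree`, `fuse`, `threshold`) are assumptions of this sketch, not the claimed method.

```python
def matching_degree(a: str, b: str) -> float:
    """Stand-in matching degree: Jaccard word overlap between two segments."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if not wa or not wb:
        return 0.0
    return len(wa & wb) / len(wa | wb)

def fuse(first_file: list[str], second_file: list[str], threshold: float = 0.3):
    """For each first content, pair it with the second content whose matching
    degree is highest, keeping the pair only if it meets the threshold; each
    kept pair models the two contents placed in adjacent content areas."""
    layout = []
    for first_content in first_file:
        best = max(second_file, key=lambda s: matching_degree(first_content, s))
        if matching_degree(first_content, best) >= threshold:
            layout.append((first_content, best))   # adjacent content areas
        else:
            layout.append((first_content, None))   # no sufficiently related content
    return layout

f1 = ["bayes theorem and conditional probability", "history of the author"]
f2 = ["conditional probability and bayes theorem examples", "matrix algebra"]
# pairs the first segment with its related segment; the second has no match
print(fuse(f1, f2))
```

A real implementation would segment files into chapters or paragraphs and use a stronger similarity measure, but the pairing-then-adjacent-placement structure stays the same.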
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present application;
fig. 2 is a schematic diagram of a content fusion method provided in an embodiment of the present application;
fig. 3 is one of schematic interfaces of an application of the content fusion method provided in the embodiment of the present application;
fig. 4 is a second schematic interface diagram of an application of the content fusion method according to the embodiment of the present application;
fig. 5 is a third schematic interface diagram of an application of the content fusion method according to the embodiment of the present application;
fig. 6 is a fourth schematic interface diagram of an application of the content fusion method provided in the embodiment of the present application;
fig. 7 is a fifth schematic interface diagram of an application of the content fusion method according to the embodiment of the present application;
fig. 8 is a sixth schematic interface diagram of an application of the content fusion method according to the embodiment of the present application;
fig. 9 is a seventh schematic interface diagram of an application of the content fusion method provided in the embodiment of the present application;
fig. 10 is a schematic structural diagram of a content fusion apparatus according to an embodiment of the present application;
fig. 11 is a hardware schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The term "and/or" herein describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may mean: only A exists, both A and B exist, or only B exists. The symbol "/" herein denotes an "or" relationship between the associated objects; for example, A/B denotes A or B.
The terms "first" and "second," etc. herein are used to distinguish between different objects and are not used to describe a particular order of objects. For example, the first input and the second input, etc. are for distinguishing different inputs, rather than for describing a particular order of inputs.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as examples, illustrations or descriptions. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise specified, "a plurality" means two or more, for example, a plurality of elements means two or more elements, and the like.
The embodiments of the application provide a content fusion method, a content fusion device, and an electronic device. The method may be applied to an electronic device, and the electronic device may determine a first file and a second file to be fused; determine, according to a first matching degree between the contents of the first file and the second file, first content and second content to be fused; and set the first content and the second content in adjacent content areas according to a preset fusion mode, where the first content is content in the first file, the second content is content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold. With this scheme, the electronic device can place, in adjacent content areas, the first content in the first file together with the second content in the second file whose matching degree with the first content is greater than or equal to the preset threshold, so that the user can view the first content and the second content in adjacent content areas. Moreover, the greater the matching degree between two pieces of content, the higher their degree of correlation; that is, the first content and the second content are highly correlated. The user can therefore view highly correlated content in adjacent content areas, and the related content of the first file and the second file can be compared without triggering the electronic device to switch between the two files, which simplifies the content comparison process and improves human-computer interaction performance.
The electronic device in the embodiment of the present application may be an electronic device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The following describes a software environment to which the content fusion method provided by the embodiment of the present application is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present application. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present application, a developer may develop a software program for implementing the content fusion method provided in the embodiment of the present application based on the system architecture of the android operating system shown in fig. 1, so that the content fusion method may operate based on the android operating system shown in fig. 1. Namely, the processor or the electronic device can implement the content fusion method provided by the embodiment of the application by running the software program in the android operating system.
The electronic device in the embodiment of the application may be a mobile terminal or a non-mobile terminal. Illustratively, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile terminal may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, and the like, and the embodiment of the present application is not particularly limited.
The execution subject of the content fusion method provided in the embodiment of the present application may be the electronic device, or may also be a functional module and/or a functional entity capable of implementing the content fusion method in the electronic device, for example, a content fusion device described below, which may be specifically determined according to actual use requirements, and the embodiment of the present application is not limited. The content fusion method provided by the embodiment of the present application is exemplarily described below by taking an electronic device as an example.
The content fusion method provided by the embodiment of the application can be applied to a scene that a user needs to compare and view the content in one file with the related content in other files.
In the embodiment of the present application, when a user needs to compare content in one file with related content in other files, the electronic device may execute the content fusion method provided in the embodiment of the application before the user views the related content. Specifically, the electronic device may determine, as the files to be fused, a file 1 (e.g., the first file in this embodiment) and a file 2 (e.g., the second file in this embodiment) that the user needs to compare and view. The electronic device may then determine, according to a first matching degree between the contents of file 1 and file 2, a content c in file 1 (e.g., the first content in this embodiment) and a content d in file 2 (e.g., the second content in this embodiment) that have a high degree of correlation, and set content c and content d in adjacent content areas according to a preset fusion mode, so that the user can view content c and the highly matching content d in adjacent content areas. The user can thus view highly correlated content in adjacent content areas and compare the related content of the files without triggering the electronic device to switch among the files, which simplifies the content comparison process and improves human-computer interaction performance.
The content fusion method provided by the embodiment of the present application is described below in detail with reference to the accompanying drawings.
As shown in fig. 2, an embodiment of the present application provides a content fusion method, which may include S201 to S203 described below.
S201, the electronic equipment determines a first file and a second file to be fused.
In the embodiment of the application, the first file and the second file are both files to be fused.
In the embodiment of the application, the first file and the second file are files whose content the user needs to compare and view. That is, the user needs to compare and view related content between the first file and the second file.
Optionally, in this embodiment of the application, the number of first files may be one, and the number of second files may be one or more, which may be determined according to actual use requirements; this embodiment of the application is not limited.
Alternatively, in this embodiment of the present application, the first file may be referred to as a "core file", and the second file may be referred to as a "peripheral file".
Optionally, in this embodiment of the application, the first file and the second file may be files locally stored in the electronic device, or files stored in a server, which may be specifically determined according to actual use requirements, and this embodiment of the application is not limited.
In the embodiment of the application, the file type of the first file may be any possible file type, such as a text type, a video type, an audio type, or an image type; correspondingly, the file type of the second file may also be any of these possible file types.
Optionally, in this embodiment of the application, the file type of the first file and the file type of the second file may be the same or different, and may be determined specifically according to an actual use requirement, and this embodiment of the application is not limited.
For example, when the file type of the first file is a text class, the file type of the second file may be a text class, i.e., the file type of the first file is the same as the file type of the second file. Or the file type of the second file may be a video class, an audio class, or an image class, i.e. the file type of the first file is different from the file type of the second file.
To better describe the content fusion method provided by the embodiments of the present application, unless otherwise specified, the following embodiments are all described using the example in which the first file and the second file are both text-type files.
The following describes an exemplary method for determining a first file and a second file to be merged by an electronic device.
One possible implementation: the user knows all the files which need to be compared and viewed, namely the user knows the first file and the second file.
Optionally, in this embodiment of the application, in this possible implementation manner, S201 may be specifically implemented through S201a and S201b described below.
S201a, the electronic device receives a third input from the user.
S201b, the electronic device responds to the third input and determines a first file and a second file to be fused.
And the third input is input for triggering the electronic equipment to execute the fusion function on the first file and the second file. That is, the third input is used to trigger the electronic device to select the first file and the second file to be merged.
It can be understood that, in the embodiment of the present application, executing, by the electronic device, the fusion function on the first file and the second file may be understood as executing, by the electronic device, the fusion function on the content of the first file and the content of the second file, that is, fusing the content of the first file and the content of the second file. A specific method for the electronic device to perform the fusion function on the content of the first file and the content of the second file will be described in detail in the following embodiments, and details are not repeated herein to avoid repetition.
In this embodiment of the application, after receiving the third input of the user, the electronic device may select a file corresponding to the third input first, and determine the file corresponding to the third input as a file to be merged (i.e., the first file and the second file).
Optionally, in this embodiment of the application, the third input may specifically be an input by the user on the identifier of the first file, on the identifier of the second file, and on a first control. The first control may be used to trigger the fusion function of the electronic device.
Optionally, in this embodiment of the application, the electronic device may set the first control in a setting application, or the electronic device may set the first control in an application (for example, a reading application) that reads the first file and the second file to be merged, which may be determined specifically according to actual usage requirements, and this embodiment of the application is not limited.
Optionally, in this embodiment of the application, the third input may include 3 sub-inputs, which are a first sub-input, a second sub-input, and a third sub-input, respectively, where the first sub-input may specifically be an input to the first control; the second sub-input may specifically be an input to an identifier of the first file, that is, the second sub-input is used to trigger the electronic device to select the first file; the third sub-input may specifically be an input to an identifier of the second file, that is, the third sub-input is used to trigger the electronic device to select the second file.
It can be understood that, in the embodiment of the present application, when the number of second files is multiple, the third sub-input may include multiple sub-inputs, one of which is used to trigger the selection of one second file.
Optionally, in this embodiment of the application, the order in which the 3 sub-inputs are executed is not limited. Specifically, in the first case, the user may execute the first sub-input first, and then execute the second sub-input and the third sub-input respectively. In the second case, the user may execute the second sub-input and the third sub-input first, and then execute the first sub-input. This may be determined according to actual use requirements, and the embodiment of the application is not limited.
Optionally, in this embodiment of the application, the third input may be any possible form of input, such as a click input, a long-press input, or a heavy-press input, which may be determined according to actual use requirements; the embodiment of the application is not limited.
The click input may be a single click, a double click, or an input of continuously clicking a first preset number of times. The long-press input may be an input in which contact is maintained for a first preset duration. The heavy-press input, also referred to as a pressure touch input, is an input in which the user presses with a pressure value greater than or equal to a first pressure threshold.
It should be noted that, in the embodiment of the present application, the first preset number of times, the first preset duration and the first pressure threshold may all be determined according to actual use requirements, and the embodiment of the present application is not limited.
Optionally, in this embodiment of the application, when the second sub-input is a click input, in the second case, the first preset number of times is greater than or equal to 2 times.
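The distinction among the input forms above can be illustrated with a small sketch. The threshold values and the classification order (pressure checked first, then duration, then click count) are hypothetical placeholders chosen for illustration; the application leaves them to actual use requirements.

```python
# Hypothetical thresholds; the embodiment does not fix their values.
FIRST_PRESET_COUNT = 2        # clicks needed for a "continuous click" input
FIRST_PRESET_DURATION = 0.8   # seconds of contact for a long-press input
FIRST_PRESSURE_THRESHOLD = 0.5  # normalized force for a heavy-press input

def classify_input(clicks: int, duration: float, pressure: float) -> str:
    """Classify a touch by its parameters into one of the input forms."""
    if pressure >= FIRST_PRESSURE_THRESHOLD:
        return "heavy press"      # pressure touch input
    if duration >= FIRST_PRESET_DURATION:
        return "long press"
    if clicks >= FIRST_PRESET_COUNT:
        return "continuous click"
    return "click"
```

For example, `classify_input(clicks=1, duration=1.0, pressure=0.2)` would be treated as a long-press input under these assumed thresholds.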
The above-described S201a and S201b are exemplarily described below with reference to specific examples.
Illustratively, assume that the main interface of a reading application displays a menu control and the identifiers of the user's recently read books "Probability Theory 1", "Probability Theory 2", "Statistics", and "Linear Algebra". Then:
In the first case, as shown in (a) of fig. 3, the user may first click a menu control 31 in the main interface 30 of the reading application to trigger the electronic device to display a menu bar of the reading application, where the menu bar includes a book merge option 32 (i.e., the first control). The user may then click the book merge option 32 (i.e., the first sub-input), and in response to the first sub-input, as shown in (b) of fig. 3, the electronic device may display a file merge interface 33, which includes the identifier of "Probability Theory 1", the identifier of "Probability Theory 2", the identifier of "Statistics", the identifier of "Linear Algebra", and a determination control. The user can then click the identifier of "Probability Theory 1" and the identifier of "Probability Theory 2" in sequence (i.e., the second sub-input and the third sub-input), and in response to the second sub-input and the third sub-input, the electronic device determines "Probability Theory 1" and "Probability Theory 2" as the files to be fused, i.e., determines the first file and the second file to be fused.
In the second case, as shown in (a) of fig. 4, the user may long-press the identifier of "Probability Theory 1" displayed on the main interface 40 of the reading application (i.e., the second sub-input). In response to the second sub-input, as shown in (b) of fig. 4, the electronic device determines "Probability Theory 1" as the first file and displays a selection box including a book merge option 41 (i.e., the first control). The user may then click the book merge option 41 (i.e., the first sub-input), and in response to the first sub-input, as shown in (c) of fig. 4, the electronic device may display a file merge interface 42, which includes the identifier of "Probability Theory 1" (in a selected state), the identifier of "Probability Theory 2", the identifier of "Statistics", the identifier of "Linear Algebra", and a determination control. The user can then click the identifier of "Probability Theory 2" (i.e., the third sub-input), and in response to the third sub-input, the electronic device determines "Probability Theory 2" as the second file. At this point, the electronic device has determined "Probability Theory 1" and "Probability Theory 2" as the first file and the second file to be fused.
It will be appreciated that, in the first and second cases described above, after performing the third sub-input the user may also perform a determination input on the determination control, which triggers the electronic device to determine that the user has completed the file selection. The determination input is a sub-input of the third input.
In the embodiment of the invention, the user can directly trigger the electronic equipment to determine the specific file (namely the first file and the second file) as the file to be fused by inputting, so that the file to be fused can be rapidly determined, and the fused file can meet the fusion requirement of the user.
Another possible implementation: the user knows only part of the files to be fused (for example, the description information of one file); for instance, the user knows the first file but does not know the second file.
Optionally, in this embodiment of the application, in another possible implementation manner described above, the above S201 may be specifically implemented by SA to SE described below.
And SA, the electronic equipment receives a first input of a user.
The first input may be an input for triggering the electronic device to execute a fusion function on the first file.
Optionally, in this embodiment of the application, the first input may include 2 sub-inputs, which are a fourth sub-input and a fifth sub-input, respectively, the fourth sub-input may specifically be an input of a user to the second control, and the fifth sub-input may specifically be an input of an identifier of the first file by the user, that is, the fifth sub-input is an input that triggers the electronic device to select the first file.
For other descriptions of the second control and the first input, reference may be specifically made to the related descriptions of the first control and the third input in the foregoing embodiments, and details are not described here again to avoid repetition.
The SB, the electronic device, in response to the first input, sends a first request message to the target device.
The first request message may request the target device to determine a file to be merged with the first file.
In this embodiment of the application, after receiving the first input, the electronic device may determine, in response to the first input, a file corresponding to the first input as a first file, and send a first request message to the target device based on description information of the first file.
Alternatively, in this embodiment of the application, the target device may be a server or another electronic device (hereinafter referred to as an auxiliary device) different from the above-described electronic device.
Optionally, in this embodiment of the application, the first request message may include description information of the first file.
Optionally, in this embodiment of the present application, the description information of a file may be at least one of: file name, author, publication number, version number, or any other information that can uniquely identify the file.
In the embodiment of the application, after the electronic device sends the first request message to the target device, the target device may receive the first request message. After receiving the first request message, the target device may determine at least one candidate file to be merged with the first file according to the first file and the history merging information.
Optionally, in this embodiment of the application, the history fusion information is information of a history fusion first file of the target device.
Optionally, in this embodiment of the application, the history fusion information may indicate all the files fused with the first file within a first preset time period (which may be specifically determined according to actual usage requirements).
For example, assume that the first preset time period is the last month, and the files merged with the first file in the last month are file 3, file 4, file 5, file 6, and file 7; the history fusion information may indicate file 3, file 4, file 5, file 6, and file 7.
Hereinafter, the file merged with the first file within the first preset time period and the first file may be collectively referred to as a merged object. For example, the first file and the file 3 may be referred to as a fusion object, and the first file and the file 4 may be referred to as a fusion object.
Optionally, in this embodiment of the application, the at least one candidate file may be all files indicated by the history fusion information, or may also be a part of files indicated by the history fusion information, which may be specifically determined according to actual use requirements, and this embodiment of the application is not limited.
Optionally, in this embodiment of the application, the at least one candidate file may be a file whose history score is greater than a preset score threshold, among the files indicated by the history fusion information. The historical score may be determined based on a score of the merged result of the first file and the other files by the other users and/or a time length for the merged file to be read by the other users in history (when the target device is a server, the time length may be uploaded to the server by the electronic device at the user end).
The following describes an exemplary relationship between a file indicated by the history fusion information and at least one candidate file, with reference to a specific example.
Exemplarily, it is assumed that the first preset time period is the last month, and the first file is merged with 5 files in the last month, where the 5 files are: file 3, file 4, file 5, file 6, and file 7; if the scores of the other users to the result of the first file and the 5 files after being fused are shown in the following table 1:
TABLE 1

    Fusion object            Historical score
    First file and file 3    80%
    First file and file 4    70%
    First file and file 5    50%
    First file and file 6    30%
    First file and file 7    85%
As shown in table 1, the files indicated by the history fusion information are: file 3, file 4, file 5, file 6, and file 7. And if the preset score threshold value is 60%, the at least one candidate file is file 3, file 4 and file 7 in the files indicated by the history fusion information.
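For illustration, the selection of the at least one candidate file from the history fusion information may be sketched as follows (a minimal sketch: the function name is illustrative, and the 60% threshold and scores follow the example in Table 1):

```python
def select_candidates(history_scores, score_threshold):
    """Return files whose historical score exceeds the threshold,
    ordered from highest to lowest score."""
    candidates = [(f, s) for f, s in history_scores.items() if s > score_threshold]
    candidates.sort(key=lambda pair: pair[1], reverse=True)
    return candidates

# Historical scores of the fusion objects from Table 1.
history = {"file 3": 0.80, "file 4": 0.70, "file 5": 0.50,
           "file 6": 0.30, "file 7": 0.85}
print(select_candidates(history, 0.60))
# [('file 7', 0.85), ('file 3', 0.8), ('file 4', 0.7)]
```

The descending order matches the entry order later displayed on the target interface.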
In this embodiment of the application, after determining at least one candidate file to be merged with the first file, the target device may send a response message (for example, a first response message described below) indicating the at least one candidate file to the electronic device, where for example, the first response message may include description information and a corresponding history score of the at least one candidate file.
And SC, the electronic device receives a first response message sent by the target device.
Wherein the first response message may indicate the at least one alternative file.
Optionally, in this embodiment of the application, after receiving the first response message, the electronic device may display an interface (hereinafter, referred to as a target interface) for indicating the at least one candidate file.
Specifically, the target interface may include a plurality of entries, where the entries correspond one to one to the at least one alternative file.
Optionally, in this embodiment of the application, the electronic device may display an entry indicating the at least one candidate file on the target interface in an order from high to low of the score corresponding to the at least one candidate file.
And SD, the electronic device receives a second input from the user on a target file in the at least one alternative file.
And SE, the electronic device determines, in response to the second input, that the target file is the second file to be fused.
Optionally, in this embodiment of the application, the second input may specifically be a touch input on the entry indicating the target file.
In this embodiment of the application, after the user performs the second input on the target file entry indicated in the target interface, the electronic device may determine, in response to the second input, an alternative file (for example, a target file described below) corresponding to the entry as the second file to be merged.
The SC and SE are exemplarily described below in connection with fig. 5.
Illustratively, as shown in fig. 5, after receiving the first response message, the electronic device may display an interface (i.e., a target interface), where the interface includes 3 entries, where the 3 entries respectively indicate file 7, file 3, and file 4, and the score for file 7 is 85%, the score for file 3 is 80%, and the score for file 4 is 70%. Then, the user may click on the entry indicating file 3 (i.e., the target file), i.e., the electronic device receives a second input by the user to the target file in the at least one alternative file, and then the electronic device may determine that file 3 is the second file to be merged in response to the second input.
In the embodiment of the invention, once the user triggers the selection of the first file, the electronic device can, through interaction with the target device, quickly recommend to the user a list of files previously fused with that file. Even a user who cannot directly determine which file should be fused with the first file can therefore quickly determine it from the recommended list. Thus, the flexibility and convenience of content fusion can be further improved, and the human-computer interaction performance is improved.
Optionally, in this embodiment of the application, the second control and the first control may be the same control. Or the second control and the first control may be different controls, for example, the first control is a book merge control 70 shown in fig. 6, and the second control may be a book find + merge control 71 shown in fig. 6. The method can be determined according to actual use requirements, and the embodiment of the application is not limited.
Optionally, in this embodiment of the application, when the second control and the first control are the same control, after receiving an input from the user (which is either the first input or the third input), the electronic device first determines the number of files corresponding to the input. If the number of files corresponding to the input is greater than 1, the electronic device may directly determine the files corresponding to the input as all the files to be merged (i.e., determine the first file and the second file to be merged), that is, determine the first file and the second file according to the one possible implementation manner above; if the number of files corresponding to the input is 1, the electronic device may determine the file corresponding to the input as the first file to be merged, and send the first request message to the target device based on that file, that is, determine the first file and the second file according to the another possible implementation manner above.
S202, the electronic equipment determines first content and second content to be fused according to the first matching degree between the contents of the first file and the second file.
The first content may be content in a first file, the second content may be content in a second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold.
Optionally, in this embodiment of the application, the first matching degree may include a matching degree between each content in the first file and each content in the second file.
For example, assume that the first file includes 2 contents, content c1 and content c2, and the second file includes 1 content, content d1; then the first matching degree includes: the matching degree between content c1 and content d1, and the matching degree between content c2 and content d1.
Optionally, in this embodiment of the present application, a matching degree between one content in the first file and another content in the second file is a second matching degree between the one content and the another content.
In the embodiment of the present application, the first matching degree may include at least one second matching degree.
In this embodiment, for each content in the first file, if a second matching degree between one content and another content in the second file is greater than or equal to a matching degree threshold, the one content and the another content may form a content group. Wherein the one content is a first content in the content group, and the another content is a second content in the content group. In this way, the electronic device can determine at least one content group according to the first matching degree between the contents of the first file and the second file.
The following describes, by way of example, a method for determining, by an electronic device, first content and second content to be fused, in combination with the first possible implementation and the second possible implementation, respectively.
A first possible implementation: the first content and the second content to be fused are directly determined by the electronic equipment.
Optionally, in this embodiment of the application, after determining the first file and the second file to be merged, the electronic device may copy the first file and the second file first, and split the content of the copied first file into M pieces of content according to a first preset manner; and splitting the content of the copied second file into N contents according to a second preset mode, wherein M and N are positive integers.
It should be noted that, in this embodiment of the application, when the second file is one file, the electronic device splits the copy of that one file to obtain the N contents, and when the second file is P (P greater than 1) files, the electronic device splits the copies of the P files to obtain the N contents.
The first preset mode can be any possible mode such as chapter, section, paragraph or sentence; accordingly, the second predetermined manner may be any possible manner such as a chapter, section, paragraph, or sentence. The first preset mode and the second preset mode may be the same or different, and may be determined specifically according to actual use requirements, and the embodiment of the present application is not limited.
Optionally, in the embodiment of the application, the electronic device may perform cleaning operations such as word segmentation and stop-word removal on the M contents, and then process the M cleaned contents into M first feature vectors, that is, the M contents correspond to the M first feature vectors one to one; the M first feature vectors are denoted as a first vector pool C, C = {a_i}, 1 ≤ i ≤ M, where a_i represents a first feature vector. Correspondingly, the electronic device may also perform cleaning operations such as word segmentation and stop-word removal on the N contents, and then process the N cleaned contents into N second feature vectors, that is, the N contents correspond to the N second feature vectors one to one; the N second feature vectors are denoted as a second vector pool O, O = {b_j}, 1 ≤ j ≤ N, where b_j represents a second feature vector.
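A minimal sketch of the cleaning and vectorization step described above, assuming a bag-of-words representation and a toy English stop-word list (the embodiment does not prescribe a concrete segmentation algorithm or vector form, so all names here are illustrative):

```python
import re
from collections import Counter

# Illustrative stop-word list; a real implementation would use a
# language-appropriate segmenter and stop-word dictionary.
STOP_WORDS = {"the", "a", "of", "and", "is"}

def clean(text):
    """Word segmentation plus stop-word removal (the cleaning operation)."""
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    return [t for t in tokens if t not in STOP_WORDS]

def to_vector(tokens, vocabulary):
    """Bag-of-words feature vector over a shared vocabulary."""
    counts = Counter(tokens)
    return [counts[w] for w in vocabulary]

# The M contents of the (copied) first file, here M = 2 toy strings.
contents_m = ["The law of large numbers", "Bayes theorem and conditioning"]
cleaned = [clean(c) for c in contents_m]
vocab = sorted({w for toks in cleaned for w in toks})
pool_c = [to_vector(toks, vocab) for toks in cleaned]   # first vector pool C
```

The second vector pool O would be built in the same way from the N contents of the second file, over the same shared vocabulary.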
Optionally, in this embodiment of the application, in a first possible implementation manner, assuming that the content of the first file corresponds to M first feature vectors (that is, the first file includes M contents), and the content of the second file corresponds to N second feature vectors (that is, the second file includes N contents), then the above S202 may be specifically implemented by the following S202a to S202 d.
S202a, for each first feature vector in the M first feature vectors, the electronic device obtains a matching degree between one first feature vector and each second feature vector in the N second feature vectors, to obtain M × N matching degrees.
It can be understood that, in the embodiment of the present application, for each of the M × N matching degrees, a value range of one matching degree is [0, 1 ].
Optionally, in this embodiment of the application, the electronic device may use a matching degree algorithm (which may be determined according to actual usage requirements, and is not limited in this embodiment of the application), and respectively calculate a matching degree between each first feature vector in the M first feature vectors and each second feature vector in the N second feature vectors, so as to obtain the M × N matching degrees.
Optionally, in this embodiment of the application, the M × N matching degrees may be S_11, S_12, ..., S_ij, ..., S_MN, where S_ij represents the matching degree between the first feature vector a_i and the second feature vector b_j: S_ij = similarity(a_i, b_j), 0 ≤ S_ij ≤ 1, 1 ≤ i ≤ M, 1 ≤ j ≤ N; M and N are both integers greater than 0.
It can be understood that, in the embodiment of the present application, S_ij may also represent a second matching degree between a third content and a fourth content, where the third content is the content corresponding to the first feature vector a_i in the content of the first file, and the fourth content is the content corresponding to the second feature vector b_j in the content of the second file.
In this embodiment, the first matching degree may include the M × N matching degrees.
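A minimal sketch of S202a, assuming cosine similarity as the matching degree algorithm (the embodiment explicitly leaves the concrete algorithm to actual usage requirements, so this is only one possible choice):

```python
import math

def similarity(a, b):
    """Cosine similarity as one possible matching-degree function;
    its value range is [0, 1] for non-negative feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def matching_matrix(pool_c, pool_o):
    """S[i][j] = similarity(a_i, b_j): the M x N matching degrees."""
    return [[similarity(a, b) for b in pool_o] for a in pool_c]

# Toy pools: M = 2 first feature vectors, N = 1 second feature vector.
S = matching_matrix([[1, 0, 1], [0, 1, 0]], [[1, 0, 1]])
```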
S202b, the electronic device determines at least one matching degree, among the M × N matching degrees, that is greater than or equal to the matching degree threshold.
In this embodiment of the application, after determining the M × N matching degrees, the electronic device may first compare the first of the M × N matching degrees with the matching degree threshold: if the first matching degree is greater than or equal to the matching degree threshold, the electronic device may determine that it belongs to the at least one matching degree; if it is less than the matching degree threshold, the electronic device may determine that it does not. The electronic device then compares the second of the M × N matching degrees with the matching degree threshold in the same way. By analogy, after comparing the last of the M × N matching degrees with the matching degree threshold, the electronic device has determined the at least one matching degree, among the M × N matching degrees, that is greater than or equal to the matching degree threshold, and may mark the at least one matching degree as K matching degrees.
S202c, the electronic device determines the content corresponding to at least one matching degree in the content of the first file as the first content.
S202d, the electronic device determines the content corresponding to at least one matching degree in the content of the second file as the second content.
In this embodiment, for each of the at least one matching degree, one matching degree may determine a content group in the at least one content group.
Specifically, for each matching degree of the at least one matching degree, the electronic device may determine one content corresponding to the one matching degree in the content of the first file and another content corresponding to the one matching degree in the content of the second file, and determine the one content and the another content as a content group.
It can be understood that, in the embodiment of the present application, the number of content groups determined by the electronic device is the same as the number of at least one matching degree.
The following describes exemplary correspondence between M contents, N contents, at least one matching degree, and the first content and the second content, with reference to specific examples.
Exemplarily, assume that the first file includes 3 (M = 3) contents, namely content 1, content 2, and content 3, and the second file includes 2 (N = 2) contents, namely content 5 and content 6; assume further that the matching degree threshold is 0.7. The correspondence between the M contents, the N contents, the at least one matching degree, and the first content and the second content is shown in Table 2 below.
TABLE 2
(Table 2 appears only as an image in the original publication; its key values are restated in the text below.)
As shown in Table 2 above, among the 3 × 2 matching degrees (i.e., the M × N matching degrees, which constitute the first matching degree), the number that are greater than or equal to the matching degree threshold is 3 (i.e., K = 3): 0.7, 0.8, and 0.72. The content group determined by the electronic device according to the matching degree 0.7 is (content 1, content 5); the content group determined according to the matching degree 0.8 is (content 2, content 5); and the content group determined according to the matching degree 0.72 is (content 2, content 6).
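The selection of content groups in this example can be sketched as follows. Note that only the three matching degrees quoted in the text (0.7, 0.8, 0.72) come from the document; the remaining three of the 3 × 2 values are assumed here to lie below the threshold:

```python
# Matching degrees S_ij keyed by (content of first file, content of second file).
match = {
    (1, 5): 0.70, (1, 6): 0.30,   # (1, 6) value assumed
    (2, 5): 0.80, (2, 6): 0.72,
    (3, 5): 0.10, (3, 6): 0.20,   # both values assumed
}
THRESHOLD = 0.7  # matching degree threshold from the example

# Keep every pair whose matching degree is >= the threshold: each such
# pair is one content group (first content, second content).
content_groups = sorted(pair for pair, s in match.items() if s >= THRESHOLD)
print(content_groups)  # [(1, 5), (2, 5), (2, 6)]
```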
In the embodiment of the invention, the electronic device can directly determine the first content and the second content to be fused according to the matching degree between the feature vector corresponding to the content in the first file and the feature vector corresponding to the content in the second file without interacting with other devices (such as target devices), so that the process of determining the first content and the second content to be fused by the electronic device can be simplified.
A second possible implementation: the first content and the second content to be fused are determined by the target device.
Optionally, in this embodiment of the application, in the second possible implementation manner, the S202 may be specifically implemented by the following S202e and S202 f.
S202e, the electronic device sends a second request message to the target device.
Wherein the second request message requests to merge the first file and the second file.
Optionally, in this embodiment of the application, the second request message is specifically used to request the target device to determine the content to be merged in the first file and the second file, that is, to request to determine the first content and the second content to be merged.
Optionally, in this embodiment of the application, the second request message may include description information of the first file and description information of the second file. For the description of the description information of the file, reference may be specifically made to the related description of the description information of the file in the SB, and details are not repeated here to avoid repetition.
In this embodiment, after the electronic device sends the second request message to the target device, the target device may receive the second request message.
In this embodiment of the application, if the target devices are different, the target devices may determine the first content and the second content to be merged differently.
Mode 1: the target device is a server.
Optionally, in this embodiment of the application, in mode 1, the target device may first determine, based on the description information of the first file and the description information of the second file included in the second request message, whether the target device has determined the first content and the second content to be merged within a second preset time period. If the target device has determined the first content and the second content to be merged within the second time period, the target device may directly transmit a history determination result (for example, the following indication information) to the electronic device. If the target device does not determine the first content and the second content to be fused within the second preset time period, the target device may obtain the first file and the second file according to the description information of the first file and the description information of the second file, and after obtaining the first file and the second file, the target device may determine the first content and the second content to be fused according to a first matching degree between the contents of the first file and the second file. For the target device determining the descriptions of the first content and the second content to be fused according to the first matching degree between the contents of the first file and the second file, reference may be specifically made to the related descriptions in the first possible implementation manner, and details are not described here again to avoid repetition.
Optionally, in this embodiment of the application, after determining the first content and the second content to be fused, the target device may obtain first information indicating the first content from the first file, and obtain second information indicating the second content from the second file; then, the target device may associate the first information with the second information and generate an association list including association relationships between all the first information and all the second information. In this way, if another user requests the target device to merge the first file and the second file again through the electronic device of the other user, the target device may directly send the association list to the other electronic device.
The first information may be an index of a content area in which the first content is located in the first file, and the second information may be an index of a content area in which the second content is located in the second file.
Optionally, in this embodiment of the present application, for a content area in a file, the index of the content area may be determined according to an offset of the content area with respect to a target object in the file.
Optionally, in this embodiment of the application, the target object may be at least one of the following: any possible object such as a title page of a file, a last 1 page of a file, a directory of a file, a1 st paragraph of a file, a last 1 paragraph of a file, a1 st sentence of a file, a last sentence of a file, a first character of a file, a last character of a file, an image in a file, a table in a file, and the like may be determined specifically according to actual use requirements, and the embodiment of the present application is not limited.
For example, taking the target object as the directory of a file: assume that file 1 includes content area 1, that the offset of the start position of content area 1 relative to the directory of file 1 is 50 paragraphs, and that the offset of its end position relative to the directory is 53 paragraphs; the index of content area 1 may then be [50 paragraphs, 53 paragraphs].
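A minimal sketch of such an index, assuming paragraph-level offsets measured from the paragraph that holds the target object (the helper names are illustrative, not part of the embodiment):

```python
def make_index(start_offset, end_offset):
    """Index of a content area: [start, end] paragraph offsets
    relative to the target object."""
    return [start_offset, end_offset]

def resolve(paragraphs, directory_at, index):
    """Return the paragraphs covered by an index, counting from the
    paragraph that holds the directory (the target object)."""
    start, end = index
    return paragraphs[directory_at + start : directory_at + end + 1]

# Toy file of 60 paragraphs with the directory in paragraph 0;
# content area 1 spans offsets 50..53, as in the example above.
paras = [f"p{k}" for k in range(60)]
area_1 = resolve(paras, 0, make_index(50, 53))
print(area_1)  # ['p50', 'p51', 'p52', 'p53']
```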
The correspondence relationship between the first content, the second content, the first information, the second information, and the association list is exemplarily described below with reference to specific examples.
For example, assume that the first file includes 3 (M = 3) contents, namely a1, a2, and a3, and the second file includes 2 (N = 2) contents, namely b1 and b2. Assume further that the target device determines 2 pairs of first content and second content to be fused: (a2, b1) and (a1, b2). If the index of the content area where a1 is located in the first file is [paragraph 1, paragraph 23], the index of the content area where a2 is located in the first file is [paragraph 24, paragraph 30], the index of the content area where b1 is located in the second file is [paragraph 1, paragraph 5], and the index of the content area where b2 is located in the second file is [paragraph 6, paragraph 10], then the correspondence between the first content, the second content, the first information, the second information, and the association list is shown in Table 3:
TABLE 3
    First content   First information              Second content   Second information             Association list {first information, second information}
    a1              [paragraph 1, paragraph 23]    b2               [paragraph 6, paragraph 10]    {[paragraph 1, paragraph 23], [paragraph 6, paragraph 10]}
    a2              [paragraph 24, paragraph 30]   b1               [paragraph 1, paragraph 5]     {[paragraph 24, paragraph 30], [paragraph 1, paragraph 5]}

As shown in Table 3 above, the first information includes 2 entries, namely a1: [paragraph 1, paragraph 23] and a2: [paragraph 24, paragraph 30]; the second information also includes 2 entries, namely b1: [paragraph 1, paragraph 5] and b2: [paragraph 6, paragraph 10]. The association list, which includes the association relationships between all the first information and all the second information, may be: {[paragraph 24, paragraph 30], [paragraph 1, paragraph 5]} and {[paragraph 1, paragraph 23], [paragraph 6, paragraph 10]}.
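The association list of this example can be rebuilt as follows, with each entry in {first information, second information} order as defined by the table header (the variable names are illustrative):

```python
# Indexes (first/second information) of each content's area in its file.
first_info = {"a1": ["paragraph 1", "paragraph 23"],
              "a2": ["paragraph 24", "paragraph 30"]}
second_info = {"b1": ["paragraph 1", "paragraph 5"],
               "b2": ["paragraph 6", "paragraph 10"]}

# Content groups to be fused, as determined by the target device.
pairs = [("a2", "b1"), ("a1", "b2")]

# Associate each first information with its matching second information.
association_list = [(first_info[a], second_info[b]) for a, b in pairs]
print(association_list)
# [(['paragraph 24', 'paragraph 30'], ['paragraph 1', 'paragraph 5']),
#  (['paragraph 1', 'paragraph 23'], ['paragraph 6', 'paragraph 10'])]
```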
Optionally, in this embodiment of the application, after determining the first content and the second content, the target device may send a response message (for example, a second response message described below) to the electronic device, where the response message may include indication information indicating the first content and the second content. For example, the association list may be included in the indication information.
It is to be understood that, in this embodiment of the application, in the first possible implementation manner, the electronic device may also generate the association list according to first information indicating the first content and second information indicating the second content.
The second method comprises the following steps: the target device is an auxiliary device.
Optionally, in this embodiment of the application, in mode 2, after the target device receives the second request message, the target device may determine, based on the description information of the first file and the description information of the second file included in the second request message, whether an association list indicating the first content and the second content exists in the target device. If the association list exists in the target device, the target device may send the association list to the electronic device; if the association list does not exist in the target device, the target device may send a prompt message to the electronic device to indicate that the target device cannot determine the first content and the second content to be merged.
S202f, the electronic device receives the second response message sent by the target device.
The second response message may include indication information indicating the first content and the second content, and the indication information may include the association list. In this way, the electronic device may determine the first content and the second content to be merged according to the second response message.
In the embodiment of the invention, the electronic device can determine the first content and the second content to be fused through the target device, so that the requirement on the data processing capacity of the electronic device is reduced, and the speed of determining the first content and the second content to be fused can be improved.
S203, the electronic equipment sets the first content and the second content in adjacent content areas according to a preset fusion mode.
In the embodiment of the application, after the electronic device determines the first content and the second content, the first content and the second content may be rearranged according to a preset fusion mode, that is, the first content and the second content are arranged in adjacent content areas according to the preset fusion mode.
Optionally, in this embodiment of the present application, the preset fusing manner may be that the content to be fused is set in an adjacent content area.
Optionally, in this embodiment of the application, the preset fusion manner may be pre-stored in the electronic device, or may also be sent by the target device, and may specifically be determined according to an actual use requirement, which is not limited in this embodiment of the application.
In this embodiment, in the second possible implementation manner, the electronic device may specifically set the first content and the second content in the adjacent content areas according to the indication information in the second response message and a preset fusion manner.
Optionally, in this embodiment of the present application, if the first content is set in a first content area and the second content is set in a second content area according to the preset fusion mode, the second content area may be a content area above and adjacent to the first content area; or a content area below and adjacent to the first content area; or a content area to the left of and adjacent to the first content area; or a content area to the right of and adjacent to the first content area; and so on. The specific arrangement can be determined according to actual use requirements, and is not limited in this embodiment of the application.
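The four possible adjacent placements above can be sketched as a simple grid computation. This is an illustrative Python sketch only; the patent does not prescribe a grid data model for content areas, so the representation is an assumption.

```python
def adjacent_area(first_area, direction):
    """Return the grid position of the second content area adjacent to the
    first content area, for one of the four placements named in the text.

    first_area: (row, col) position of the first content area in a layout grid.
    direction: "above", "below", "left", or "right" (the preset fusion mode).
    """
    row, col = first_area
    offsets = {"above": (-1, 0), "below": (1, 0), "left": (0, -1), "right": (0, 1)}
    delta_row, delta_col = offsets[direction]
    return (row + delta_row, col + delta_col)
```

For example, with the first content area at grid position (2, 3) and a "below" fusion mode, the second content area would be placed at (3, 3).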
Optionally, in this embodiment of the application, after the electronic device sets the first content and the second content in adjacent content areas, a fused file may be obtained.
Optionally, in this embodiment of the application, the electronic device may generate a fusion index for the fused file based on the content of the fused file, so that the user may trigger the electronic device, through an input of the fusion index, to locate specific content of the fused file.
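One hypothetical way to realize such a fusion index is an inverted index mapping keywords to block positions in the fused file. This is a sketch under that assumption; the patent does not specify the index structure, and the function names are invented for illustration.

```python
def build_fusion_index(fused_blocks):
    """Build a fusion index: each keyword of the fused file maps to the
    positions of the content blocks containing it, so an input of the index
    can locate specific content in the fused file.

    fused_blocks: list of text blocks of the fused file, in display order.
    """
    index = {}
    for position, block in enumerate(fused_blocks):
        for word in set(block.lower().split()):
            index.setdefault(word, []).append(position)
    return index

def locate(index, keyword):
    """Return the block positions for a user-entered keyword (empty if absent)."""
    return index.get(keyword.lower(), [])
```

Each fused file would carry its own index of this kind, matching the one-fused-file-to-one-fusion-index correspondence described below.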
It can be understood that, in the embodiment of the present application, one fused file (for example, a file obtained by fusing the first file and the second file) corresponds to one fusion index. For example, as shown in fig. 7, fused file 1 corresponds to one fusion index, and fused file 2 corresponds to another fusion index.
Optionally, in this embodiment of the present application, in another possible implementation manner, after the electronic device determines the first content and the second content to be fused, if the electronic device does not locally store the first file and/or the second file, the electronic device may display a prompt message to prompt the user to download the corresponding file (the first file and/or the second file) from the server, or to import the corresponding file into the electronic device. After the electronic device detects that both the first file and the second file are locally available, the electronic device may set the first content and the second content in adjacent content areas according to the preset fusion mode.
Optionally, in this embodiment of the application, in a first manner, the electronic device may extract the first content from the first file, extract the second content from the second file, and set the extracted first content and the extracted second content in adjacent content areas according to the preset fusion mode, so as to obtain a fused file. Or, in a second manner, the electronic device may extract the second content from the second file and insert the extracted second content into a target content area in the first file according to the preset fusion mode, where the target content area is adjacent to the content area where the first content in the first file is located.
Optionally, in this embodiment of the present application, in the second manner, step S203 may be specifically implemented by step S203a described below.
S203a, the electronic device inserts the second content into the target content area of the first file according to a preset fusion mode.
The target content area is adjacent to the content area where the first content is located.
For other descriptions of S203a, refer to the relevant descriptions in S203 above, and are not described herein again to avoid repetition.
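Step S203a can be sketched as follows. This is illustrative Python only; representing the first file as an ordered list of content blocks is an assumption, not part of the patent.

```python
def insert_into_first_file(first_blocks, first_content_index, second_content):
    """Sketch of S203a: insert the second content into the target content
    area of the first file, i.e. directly after (adjacent to) the block
    holding the first content.

    first_blocks: ordered content blocks of the first file.
    first_content_index: index of the block containing the first content.
    second_content: the matching content extracted from the second file.
    """
    fused = list(first_blocks)  # leave the original block list untouched
    fused.insert(first_content_index + 1, second_content)  # adjacent area, lower side
    return fused
```

For example, inserting content "d1" after block 0 of `["c1", "c2"]` yields `["c1", "d1", "c2"]`, so the second content sits in the content area adjacent to the first content.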
In this embodiment of the application, because the electronic device can insert the content in the second file (namely the second content) into the first file, in a content area adjacent to the content area where the first content is located and according to the preset fusion mode, the user can directly view the content of the second file within the first file, which further improves the convenience of content comparison and improves the man-machine interaction performance.
Optionally, in this embodiment of the application, when the first file and the second file are both text files, after the electronic device sets the first content and the second content in adjacent content areas according to the preset fusion mode, the electronic device may create, in the reading application program, a new identifier indicating the file obtained by fusing the content of the first file and the content of the second file, so that the user may directly input to the identifier to trigger the electronic device to display the first content and the second content in a fused reading mode. It can be understood that, in the fused reading mode, the electronic device may display the second content in a content area adjacent to the content area where the first content is located.
Optionally, in this embodiment of the application, the electronic device may set, on the reading interface of the first file and the reading interface of the second file, a fused reading control for entering the fused reading mode. In this way, in the process of reading the content of the first file or the content of the second file, if the user needs to compare the related content of the first file and the second file, the user may directly input to the fused reading control to trigger the electronic device to automatically switch to the fused reading mode, that is, the electronic device displays the second content in a content area adjacent to the content area where the first content is located.
For example, as shown in (a) of fig. 8, the electronic device displays content c1 and content c2 of the first file on the first interface 90 (i.e., the reading interface of the first file). If the user clicks on the first interface, the electronic device may display the reading control bar 91; the user may then click the reading menu control 92 in the reading control bar 91 to trigger the electronic device to display a reading menu, which, as shown in (b) of fig. 8, includes a search control for triggering a search for content in the first file, a bookmark control for triggering the setting of a bookmark on the first interface, and a fused reading control for triggering the switch to the fused reading mode. If the user inputs to the fused reading control, the electronic device may display content d1 in a content area adjacent to the content area where content c1 is located, as shown in (c) of fig. 8; that is, the electronic device switches to the fused reading mode. Content d1 is content in the second file whose matching degree with content c1 is greater than the preset threshold.
Optionally, in this embodiment of the application, to facilitate distinguishing the content in the first file from the content in the second file, the electronic device may distinguish the first content and the second content by at least one of different fonts, different font sizes, different colors, and different ground colors (i.e., the background colors of the content areas behind the text).
For example, as shown in (c) of fig. 8, the electronic device displays content c1 and content d1 in different ground colors.
Optionally, in this embodiment, when the electronic device includes a plurality of display screens, for example, two display screens, the electronic device may display the first content in one display screen and display the second content in the other display screen. In this case, the display area of the first display screen and the display area of the second display screen may be regarded as two adjacent content areas.
Optionally, in this embodiment of the application, in the fused reading mode, the electronic device may further display, on the reading interface, a hiding control for hiding the content of the second file, such as the "shrink" control shown in (c) of fig. 8. In this way, the user may trigger the electronic device, by inputting to the hiding control, to hide the second content (e.g., content d1 shown in (c) of fig. 8) and maintain the display of the first content (e.g., content c1 and content c2 shown in (c) of fig. 8).
Optionally, in this embodiment of the application, the hiding control may correspond to all of the second content (case one); or, assuming that the second content includes a plurality of sub-contents and each sub-content belongs to a different content group, one hiding control may correspond to one sub-content of the second content (case two).
For example, assume that content c1 and content c2 are content in the first file, content d1 and content d2 are content in the second file, content c1 and content d1 constitute one content group, and content c2 and content d2 constitute another content group. The electronic device may display content c1 in content area 93, content d1 in content area 94, and content c2 in content area 95, as shown in (a) of fig. 9, where content area 94 is located below and adjacent to content area 93, and content area 95 is located below and adjacent to content area 94. At this time, if the user clicks the "shrink" control, then in case one, as shown in (b) of fig. 9, the electronic device may keep displaying content c1 in content area 93, display content c2 in content area 94, and cancel the display of content in content area 95. In case two, as shown in (c) of fig. 9, the electronic device displays content c1 in content area 93, content c2 in content area 94, and content d2 in content area 95; that is, one hiding control corresponds to one sub-content of the second content.
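The two hiding behaviours in this example can be sketched as follows. This is an illustrative Python sketch; the content-group data structure and function name are assumptions made for the example.

```python
def apply_hide_control(content_groups, hidden, hide_all=False, group_index=None):
    """Sketch of the hiding control in the fused reading mode.

    content_groups: list of (first_sub_content, second_sub_content) pairs,
        e.g. [("c1", "d1"), ("c2", "d2")].
    hidden: set of group indices whose second sub-content is already hidden.
    hide_all=True hides every second sub-content (case one); otherwise only
    the second sub-content of group_index is hidden (case two, one hiding
    control per content group).
    Returns the flat display order of the remaining content areas.
    """
    hidden = set(hidden)
    if hide_all:
        hidden = set(range(len(content_groups)))
    elif group_index is not None:
        hidden.add(group_index)
    display = []
    for i, (first_sub, second_sub) in enumerate(content_groups):
        display.append(first_sub)        # the first content is always kept
        if i not in hidden:
            display.append(second_sub)   # show second content only if not hidden
    return display
```

With the groups of the example above, case one yields the display order of fig. 9 (b) and case two (hiding only d1) yields that of fig. 9 (c).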
In the embodiment of the present application, the dashed boxes in fig. 8 and 9 are only used to illustrate the content area, and in an actual implementation, the dashed boxes in fig. 8 and 9 may not be visible to the user.
In the embodiments of the present application, the content fusion methods shown in the above method drawings are each exemplarily described with reference to one drawing of the embodiments of the present application. In specific implementation, the content fusion method shown in each method drawing may also be implemented in combination with any other drawing illustrated in the above embodiments that can be combined with it; details are not described here again.
As shown in fig. 10, an embodiment of the present application provides a content fusion apparatus, which may include a determining module 121 and a setting module 122. The determining module 121 may be configured to determine a first file and a second file to be fused, and to determine, according to a first matching degree between the contents of the first file and the second file, first content and second content to be fused; the setting module 122 may be configured to set the first content and the second content in adjacent content areas according to a preset fusion mode; the first content may be content in the first file, the second content may be content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold.
Optionally, in this embodiment of the application, the setting module 122 may be specifically configured to insert the second content into a target content area in the first file, where the target content area is adjacent to a content area where the first content is located.
Optionally, in this embodiment of the application, the determining module 121 may specifically include a first receiving submodule, a first sending submodule, a second receiving submodule, and a first determining submodule. The first receiving submodule may be configured to receive a first input, where the first input is an input for triggering execution of the fusion function on the first file; the first sending submodule may be configured to send a first request message to the target device in response to the first input received by the first receiving submodule; the second receiving submodule may be configured to receive a first response message sent by the target device, where the first response message indicates at least one alternative file; the first receiving submodule may be further configured to receive a second input on a target file in the at least one alternative file; and the first determining submodule may be configured to determine, in response to the second input received by the first receiving submodule, the target file as the second file to be fused.
Optionally, in this embodiment of the application, the determining module 121 may include a second sending submodule and a third receiving submodule. The second sending submodule can be used for sending a second request message to the target device, wherein the second request message requests to fuse the first file and the second file; and the third receiving submodule may be configured to receive a second response message sent by the target device, where the second response message may include indication information indicating the first content and the second content.
Optionally, in this embodiment of the application, the content of the first file may correspond to M first feature vectors, the content of the second file may correspond to N second feature vectors, and M and N are positive integers. The determining module 121 may specifically include an obtaining sub-module and a second determining sub-module.
The obtaining submodule may be specifically configured to obtain, for each first feature vector of the M first feature vectors, the matching degree between that first feature vector and each second feature vector of the N second feature vectors, so as to obtain M × N matching degrees.
The second determining submodule may be specifically configured to determine, among the M × N matching degrees obtained by the obtaining submodule, at least one matching degree greater than or equal to the matching degree threshold, recorded as K matching degrees; determine the content corresponding to the K matching degrees in the content of the first file as the first content; and determine the content corresponding to the K matching degrees in the content of the second file as the second content.
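The M × N matching-degree computation described above can be illustrated with a minimal Python sketch. The use of cosine similarity as the matching degree, and all function names, are assumptions made for illustration; the patent does not fix a particular similarity measure.

```python
from math import sqrt

def cosine_similarity(u, v):
    """Matching degree between two feature vectors (cosine similarity,
    in [0, 1] for non-negative vectors)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def select_contents_to_fuse(first_vectors, second_vectors, threshold):
    """Compute M x N matching degrees and keep the index pairs whose
    matching degree is at or above the matching degree threshold.

    first_vectors:  M feature vectors, one per content block of the first file.
    second_vectors: N feature vectors, one per content block of the second file.
    Returns (pairs, degrees): matched (i, j) index pairs into the two files
    and the K retained matching degrees.
    """
    pairs, degrees = [], []
    for i, u in enumerate(first_vectors):          # M iterations
        for j, v in enumerate(second_vectors):     # N iterations -> M * N degrees
            degree = cosine_similarity(u, v)
            if degree >= threshold:                # keep only degrees >= threshold
                pairs.append((i, j))
                degrees.append(degree)
    return pairs, degrees
```

The blocks of the first file indexed by the retained `i` values would then form the first content, and the blocks of the second file indexed by the `j` values the second content.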
The content fusion apparatus 120 provided in the embodiment of the present application can implement each process implemented by the electronic device shown in the foregoing method embodiment, and is not described here again to avoid repetition.
The embodiment of the application provides a content fusion apparatus, which can determine a first file and a second file to be fused; determine, according to a first matching degree between the contents of the first file and the second file, first content and second content to be fused; and set the first content and the second content in adjacent content areas according to a preset fusion mode; the first content is content in the first file, the second content is content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold. With this scheme, the apparatus can set, in adjacent content areas, the first content in the first file and the second content in the second file whose matching degree with the first content is greater than or equal to the matching degree threshold, so that the user can view the first content and the second content in the adjacent content areas. Moreover, the greater the matching degree between contents, the higher the correlation between them; that is, the correlation between the first content and the second content is high. Therefore, the user can view highly correlated content in adjacent content areas, so that the related content of the first file and the second file can be compared without triggering the apparatus to switch between the first file and the second file, which simplifies the content comparison process and improves the man-machine interaction performance.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention. As shown in fig. 11, the electronic device 100 includes but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 11 does not constitute a limitation of electronic devices, which may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present application, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
The processor 110 is configured to determine a first file and a second file to be fused; determine, according to a first matching degree between the contents of the first file and the second file, first content and second content to be fused; and set the first content and the second content in adjacent content areas according to a preset fusion mode; the first content is content in the first file, the second content is content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold.
It can be understood that, in this embodiment of the application, both the determining module 121 and the setting module 122 in the structural schematic diagram of the electronic device (for example, fig. 10) may be implemented by the processor 110.
The embodiment of the application provides an electronic device, which can determine a first file and a second file to be fused; determine, according to a first matching degree between the contents of the first file and the second file, first content and second content to be fused; and set the first content and the second content in adjacent content areas according to a preset fusion mode; the first content is content in the first file, the second content is content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold. With this scheme, the electronic device can set, in adjacent content areas, the first content in the first file and the second content in the second file whose matching degree with the first content is greater than or equal to the matching degree threshold, so that the user can view the first content and the second content in the adjacent content areas. Moreover, the greater the matching degree between contents, the higher the correlation between them; that is, the correlation between the first content and the second content is high. Therefore, the user can view highly correlated content in adjacent content areas, so that the related content of the first file and the second file can be compared without triggering the electronic device to switch between the first file and the second file, which simplifies the content comparison process and improves the man-machine interaction performance.
It should be understood that, in the embodiment of the present application, the radio frequency unit 101 may be used for receiving and sending signals during a message sending/receiving process or a call process; specifically, it receives downlink data from a base station and sends the received downlink data to the processor 110 for processing, and it sends uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic apparatus 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and may be capable of processing such sound into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 for output.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any suitable object or attachment). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 11, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the electronic device. Processor 110 may include one or more processing units; alternatively, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The electronic device 100 may further include a power supply 111 (e.g., a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an electronic device is further provided in an embodiment of the present application, and includes a processor 110, a memory 109, and a computer program that is stored in the memory 109 and is executable on the processor 110, where the computer program is executed by the processor 110 to implement the processes in the foregoing method embodiment, and can achieve the same technical effect, and details are not repeated here to avoid repetition.
The embodiment of the present application further provides a readable storage medium, where a computer program is stored on the readable storage medium, and when the computer program is executed by a processor, the computer program implements the processes of the method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here. The readable storage medium may include a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A method for content fusion, the method comprising:
determining a first file and a second file to be fused;
determining first content and second content to be fused according to a first matching degree between the contents of the first file and the second file;
setting the first content and the second content in adjacent content areas according to a preset fusion mode;
the first content is the content in the first file, the second content is the content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold value.
2. The method according to claim 1, wherein the setting the first content and the second content in adjacent content areas comprises:
and inserting the second content into a target content area in the first file, wherein the target content area is adjacent to the content area where the first content is located.
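As a non-limiting illustration of claim 2, the insertion step could be sketched as follows, under the assumption (not made in the claim itself) that a file is modeled as an ordered list of content areas; all names here are hypothetical:

```python
# Hypothetical model: a file is an ordered list of content areas (e.g.
# paragraphs). The second content is inserted into the target content area,
# i.e. the area immediately after the one holding the first content.

def insert_adjacent(first_file_areas, first_content, second_content):
    idx = first_file_areas.index(first_content)  # area holding the first content
    merged = list(first_file_areas)              # leave the original file intact
    merged.insert(idx + 1, second_content)       # adjacent target content area
    return merged

doc = ["Intro", "Design of module A", "Conclusion"]
print(insert_adjacent(doc, "Design of module A", "Test report for module A"))
# → ['Intro', 'Design of module A', 'Test report for module A', 'Conclusion']
```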
3. The method of claim 1, wherein the determining the first file and the second file to be fused comprises:
receiving a first input, wherein the first input is an input for triggering the execution of the fusion function on the first file;
in response to the first input, sending a first request message to a target device;
receiving a first response message sent by the target device, wherein the first response message indicates at least one alternative file;
receiving a second input of a target file in the at least one alternative file;
and responding to the second input, and determining the target file as the second file to be fused.
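The request/response flow recited in claim 3 can be illustrated with a minimal sketch, assuming a stub standing in for the target device; the class and function names are hypothetical and not part of the claim:

```python
# Hypothetical sketch of claim 3: a first input triggers a first request to
# the target device; the device's first response indicates at least one
# alternative file; a second input selects the target file, which is then
# determined as the second file to be fused.

class TargetDevice:
    """Stub standing in for the remote device that holds candidate files."""
    def __init__(self, files):
        self.files = files

    def handle_first_request(self):
        # First response message: indicates at least one alternative file.
        return {"alternative_files": self.files}


def determine_second_file(device, second_input_index):
    response = device.handle_first_request()   # first request / first response
    candidates = response["alternative_files"]
    return candidates[second_input_index]      # second input selects the target


device = TargetDevice(["notes_v2.txt", "summary.txt"])
print(determine_second_file(device, 1))  # → summary.txt
```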
5. The method according to any one of claims 1 to 3, wherein the determining the first content and the second content to be fused according to the first matching degree between the contents of the first file and the second file comprises:
sending a second request message to the target device, wherein the second request message requests to fuse the first file and the second file;
and receiving a second response message sent by the target device, wherein the second response message comprises indication information indicating the first content and the second content.
5. The method according to claim 1, wherein the content of the first file corresponds to M first feature vectors, the content of the second file corresponds to N second feature vectors, and M and N are positive integers;
the determining the first content and the second content to be fused according to the first matching degree between the contents of the first file and the second file includes:
for each first feature vector of the M first feature vectors, obtaining the matching degree between the first feature vector and each second feature vector of the N second feature vectors, so as to obtain M × N matching degrees;
determining, among the M × N matching degrees, K matching degrees that are greater than or equal to the matching degree threshold value, wherein K is an integer greater than or equal to 0;
determining the content corresponding to the K matching degrees in the content of the first file as the first content;
and determining the content corresponding to the K matching degrees in the content of the second file as the second content.
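The pairwise-matching procedure of claim 5 can be sketched as follows. Note that the claim does not fix a particular metric; cosine similarity is an assumed choice of "matching degree" for illustration only:

```python
# Illustrative sketch of claim 5: compute M x N matching degrees between the
# M first feature vectors and the N second feature vectors, then keep the K
# pairs whose matching degree meets or exceeds the threshold.
import math

def cosine(u, v):
    """Assumed matching-degree metric (cosine similarity)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def match_contents(first_vecs, second_vecs, threshold):
    matches = []
    for i, fv in enumerate(first_vecs):        # M first feature vectors
        for j, sv in enumerate(second_vecs):   # N second feature vectors
            degree = cosine(fv, sv)            # one of the M x N matching degrees
            if degree >= threshold:            # K matching degrees survive
                matches.append((i, j, degree))
    return matches

first = [[1.0, 0.0], [0.0, 1.0]]    # M = 2 content feature vectors (first file)
second = [[0.9, 0.1], [0.0, 0.5]]   # N = 2 content feature vectors (second file)
print(match_contents(first, second, 0.8))
```

The surviving index pairs identify which pieces of the two files' contents are treated as the first content and the second content to be fused.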
6. A content fusion apparatus, characterized in that the apparatus comprises: a determining module and a setting module;
the determining module is used for determining a first file and a second file to be fused, and for determining, according to a first matching degree between the contents of the first file and the second file, first content and second content to be fused;
the setting module is used for setting the first content and the second content in adjacent content areas according to a preset fusion mode;
the first content is the content in the first file, the second content is the content in the second file, and a second matching degree between the first content and the second content is greater than or equal to a matching degree threshold value.
7. The apparatus according to claim 6, wherein the setting module is specifically configured to insert the second content into a target content area in the first file, where the target content area is adjacent to a content area where the first content is located.
8. The apparatus of claim 6, wherein the determining module comprises a first receiving sub-module, a first transmitting sub-module, a second receiving sub-module, and a first determining sub-module;
the first receiving submodule is used for receiving a first input, and the first input is an input for triggering the execution of the fusion function on the first file;
the first sending submodule is used for responding to the first input received by the first receiving submodule and sending a first request message to target equipment;
the second receiving submodule is configured to receive a first response message sent by the target device, where the first response message indicates at least one candidate file;
the first receiving submodule is further used for receiving a second input of a target file in the at least one alternative file;
the first determining submodule is configured to determine, in response to the second input received by the first receiving submodule, that the target file is the second file to be fused.
9. The apparatus according to any one of claims 6 to 8, wherein the determining module comprises a second sending submodule and a third receiving submodule;
the second sending submodule is configured to send a second request message to a target device, where the second request message requests to fuse the first file and the second file;
the third receiving submodule is configured to receive a second response message sent by the target device, where the second response message includes indication information indicating the first content and the second content.
10. The apparatus of claim 6, wherein the content of the first file corresponds to M first feature vectors, the content of the second file corresponds to N second feature vectors, and M and N are positive integers; the determining module comprises an obtaining submodule and a second determining submodule;
the obtaining submodule is specifically configured to obtain, for each first feature vector of the M first feature vectors, the matching degree between the first feature vector and each second feature vector of the N second feature vectors, so as to obtain M × N matching degrees;
the second determining submodule is specifically configured to determine, among the M × N matching degrees acquired by the obtaining submodule, at least one matching degree that is greater than or equal to the matching degree threshold value; determine the content corresponding to the at least one matching degree in the content of the first file as the first content; and determine the content corresponding to the at least one matching degree in the content of the second file as the second content.
CN202010478325.6A 2020-05-29 2020-05-29 Content fusion method and device and electronic equipment Active CN111638831B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010478325.6A CN111638831B (en) 2020-05-29 2020-05-29 Content fusion method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN111638831A true CN111638831A (en) 2020-09-08
CN111638831B CN111638831B (en) 2021-09-28

Family

ID=72329713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010478325.6A Active CN111638831B (en) 2020-05-29 2020-05-29 Content fusion method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111638831B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112905080A (en) * 2021-03-01 2021-06-04 联想(北京)有限公司 Processing method and device
CN115202543A (en) * 2022-07-28 2022-10-18 京东方科技集团股份有限公司 Book type navigation bar generation and switching method, device, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101986302A (en) * 2010-10-28 2011-03-16 华为终端有限公司 Media file association method and device
CN102648609A (en) * 2009-12-07 2012-08-22 三星电子株式会社 Streaming method and apparatus operating by inserting other content into main content
CN103514228A (en) * 2012-06-29 2014-01-15 联想(北京)有限公司 File processing method and electronic device
EP2892015A1 (en) * 2014-01-06 2015-07-08 HTC Corporation Media data processing method and non-transitory computer readable storage medium thereof
CN109302368A (en) * 2017-06-19 2019-02-01 中兴通讯股份有限公司 A kind of document handling method and server




Similar Documents

Publication Publication Date Title
US20220053083A1 (en) Unread message management method and terminal device
CN109543099B (en) Content recommendation method and terminal equipment
CN110069188B (en) Identification display method and terminal equipment
CN111124245B (en) Control method and electronic equipment
CN110099296B (en) Information display method and terminal equipment
CN111061383B (en) Text detection method and electronic equipment
CN108874906B (en) Information recommendation method and terminal
CN110703972B (en) File control method and electronic equipment
CN111274777A (en) Thinking guide graph display method and electronic equipment
CN111459358B (en) Application program control method and electronic equipment
CN111064848B (en) Picture display method and electronic equipment
CN110908555A (en) Icon display method and electronic equipment
CN111459361B (en) Application icon display method and device and electronic equipment
CN111638831B (en) Content fusion method and device and electronic equipment
CN110647277A (en) Control method and terminal equipment
CN111352547A (en) Display method and electronic equipment
CN111274842A (en) Method for identifying coded image and electronic equipment
CN110888569A (en) Content selection control method and electronic equipment
CN111310248B (en) Privacy protection method and electronic equipment
CN111144065B (en) Display control method and electronic equipment
CN111049976B (en) Interface display method, electronic device and computer readable storage medium
CN110166621B (en) Word processing method and terminal equipment
CN109857578B (en) Text copying method and electronic equipment
CN111368151A (en) Display method and electronic equipment
CN111443819A (en) Control method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant