CN112905080A - Processing method and device - Google Patents

Processing method and device

Info

Publication number
CN112905080A
Authority
CN
China
Prior art keywords
target
target object
file
content
display
Prior art date
Legal status
Pending
Application number
CN202110227077.2A
Other languages
Chinese (zh)
Inventor
王惠迎
李翔
沈瑞
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202110227077.2A
Publication of CN112905080A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/10: File systems; File servers
    • G06F 16/16: File or folder operations, e.g. details of user interfaces specifically adapted to file systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to a processing method and apparatus. The method includes: determining a target object from a target file; if an instruction for displaying the target object and its associated object in association is obtained, determining the associated object from the target file in which the target object is located or from a target associated file associated with the target file; and displaying the target object and its associated object in a display output area of an electronic device at least according to the display parameters of the content display interface in which the target object is located, where the display parameters of the target object and the associated object in the display output area differ. After the target object in the target file is determined, its associated object can be quickly found in the target file or a target associated file, and the two can be displayed in association with different display parameters, making file processing operations such as comparison more convenient for the user and improving file processing efficiency and effect.

Description

Processing method and device
Technical Field
The present disclosure relates to the field of information processing technologies, and in particular, to a processing method and apparatus.
Background
In the prior art, users often need to compare context content when reading a document. Comparing back and forth with a scrollbar is inconvenient, and the repeated scrolling becomes especially cumbersome when pictures or tables spanning several pages must be compared. Moreover, scrolling back and forth requires the user to remember part of the content, which easily introduces errors into the comparison.
Disclosure of Invention
According to one aspect of the present disclosure, there is provided a processing method including:
determining a target object from a target file;
if an instruction for performing associated display on the target object and the associated object thereof is obtained, determining the associated object from a target file where the target object is located or a target associated file associated with the target file;
displaying the target object and the related object thereof in a display output area of the electronic equipment at least according to the display parameter of the content display interface where the target object is located;
wherein the display parameters of the target object and the associated object in the display output area are different.
In some embodiments, determining the target object from the target file comprises:
if a selection input acting on a content display interface of the target file is obtained, determining the content selected by the selection input as the target object; or
if a search input acting on the target content of the target file is obtained, determining the target content found by the search input as the target object; or
if a target input acting on an input component of the electronic device is obtained, determining the target content in the target file pointed to by the target input as the target object.
In some embodiments, obtaining the instruction for performing the associated display of the target object and the associated object thereof includes:
obtaining an instruction for comparing the target object with its associated object; or
obtaining a request for a comparison result of the target object and its associated object; or
obtaining an instruction for splitting the target object and its associated object; or
obtaining an instruction for fusing the target object and its associated object.
In some embodiments, determining the associated object from the target file in which the target object is located includes:
obtaining attribute information of the target object, and determining content in the target file that has the same attribute as the target object as the associated object; or
obtaining attribute information of the target object and position information of the target object in the target file, and determining content in the target file that has the same attribute as the target object and satisfies a preset positional relationship as the associated object; or
obtaining content information of the target object, and determining content in the target file whose similarity to the target object reaches a first content similarity threshold as the associated object; or
obtaining content information and attribute information of the target object, and determining content in the target file that has the same attribute as the target object and whose similarity reaches a second content similarity threshold as the associated object.
In some embodiments, determining the associated object from a target associated file associated with the target file comprises:
obtaining attribute information of the target object, and determining content in the target associated file that has the same attribute as the target object as the associated object; or
obtaining content information of the target object, and determining content in the target associated file whose similarity to the target object reaches a third content similarity threshold as the associated object; or
obtaining content information and attribute information of the target object, and determining content in the target associated file that has the same attribute as the target object and whose similarity reaches a fourth content similarity threshold as the associated object.
In some embodiments, displaying the target object and its associated object in a display output area of an electronic device according to at least a display parameter of a content display interface where the target object is located includes:
if the display parameter indicates that the content display interface where the target object is located is in a full-screen display state, displaying the associated object at a position close to the target object in the content display interface, or displaying the target object and the associated object at the forefront of the display output area; or
if the display parameter indicates that the content display interface where the target object is located is in a first position area of the display output area, displaying the target object in the first position area and displaying the associated object of the target object in a second position area different from the first position area, or displaying both the target object and the associated object in the second position area; or
obtaining configuration information of the display output area of the electronic device, and displaying the target object and its associated object in the display output area according to the configuration information and the display parameters.
In some embodiments, displaying the target object and its associated object in a display output area of an electronic device according to the configuration information and the display parameters includes:
if the configuration information indicates that the electronic device includes at least a first display output area and a second display output area, and the display parameter indicates that the content display interface where the target object is located is in a full-screen display state in the first display output area, displaying the target object at the forefront of the first display output area and displaying the associated object of the target object in the second display output area, or displaying both the target object and its associated object in the second display output area; or
if the configuration information indicates that the electronic device includes at least a first display output area and a second display output area, and the display parameter indicates that the content display interface where the target object is located is in a third position area of the first display output area, displaying the target object in the third position area and displaying the associated object of the target object in a fourth position area of the first display output area different from the third position area, or displaying both the target object and its associated object in the second display output area.
In some embodiments, displaying the target object and its associated objects in a display output area of an electronic device includes:
after processing the target object and its associated object into a file of a preset format, displaying the file in the display output area as a floating window or layer; and/or
displaying the comparison result of the target object and its associated object in a preset form in the display output area.
In some embodiments, further comprising:
classifying and/or editing the target object and its associated object.
According to another aspect of the present disclosure, there is also provided a processing apparatus including:
a first determination module configured to determine a target object from a target file;
the second determination module is configured to determine the associated object from a target file where the target object is located or a target associated file associated with the target file if an instruction for displaying the target object and the associated object in an associated manner is obtained;
the display module is configured to display the target object and the related objects thereof in a display output area of the electronic equipment at least according to the display parameters of the content display interface where the target object is located;
wherein the display parameters of the target object and the associated object in the display output area are different.
According to another aspect of the present disclosure, there is also provided an electronic device including a processor and a memory, where the memory is used to store computer-executable instructions, and the processor implements the processing method described above when executing those instructions.
According to one aspect of the present disclosure, a computer-readable storage medium is provided, on which computer-executable instructions are stored, which, when executed by a processor, implement the processing method described above.
According to the processing method and device provided by the embodiments of the present disclosure, after the target object in the target file is determined, its associated object can be quickly found in the target file or a target associated file, and the target object and associated object can be displayed in association in the display output area of the electronic device with different display parameters. This makes file processing operations such as comparison more convenient for the user and improves file processing efficiency and effect.
Drawings
FIG. 1 shows a flow diagram of a processing method of an embodiment of the present disclosure;
FIGS. 2-9 illustrate display diagrams of a target object and its associated objects in embodiments of the present disclosure;
fig. 10 shows a schematic structural diagram of a processing device according to an embodiment of the present disclosure.
Detailed Description
Various aspects and features of the disclosure are described herein with reference to the drawings.
It will be understood that various modifications may be made to the embodiments disclosed herein. Accordingly, the foregoing description should not be construed as limiting, but merely as exemplifications of embodiments. Other modifications will occur to those skilled in the art within the scope and spirit of the disclosure.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the disclosure and, together with a general description of the disclosure given above, and the detailed description of the embodiments given below, serve to explain the principles of the disclosure.
These and other characteristics of the present disclosure will become apparent from the following description of preferred forms of embodiment, given as non-limiting examples, with reference to the attached drawings.
It is also to be understood that although the present disclosure has been described with reference to certain specific examples, those skilled in the art will be able to ascertain many other equivalents to the present disclosure.
The above and other aspects, features and advantages of the present disclosure will become more apparent in view of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present disclosure are described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely examples of the disclosure, which may be embodied in various forms. Well-known and/or repeated functions and structures have not been described in detail so as not to obscure the present disclosure with unnecessary detail. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure.
The processing method and device of the embodiments of the present disclosure are applied to electronic devices equipped with a display unit for displaying information. The electronic device can take various forms, such as a mobile phone, a tablet computer, or a notebook computer.
FIG. 1 shows a flow chart of a processing method of an embodiment of the present disclosure. As shown in fig. 1, an embodiment of the present disclosure provides a processing method, including:
s101: a target object is determined from the target file.
The target file may be a document, a web page, a spreadsheet, a presentation (PPT), a database, or the like, containing text, tables, pictures, video, and so on. In this embodiment, a Word document is taken as the example target file to describe the processing method in detail.
The target object is content in the target file that the user is interested in. For example, it may be a sentence or passage of text the user settles on while browsing a Word document, or content such as a picture, formula, or table in the document.
In step S101, determining a target object from a target file includes:
if a selection input acting on a content display interface of the target file is obtained, determining the content selected by the selection input as the target object; or
if a search input acting on the target content of the target file is obtained, determining the target content found by the search input as the target object; or
if a target input acting on an input component of the electronic device is obtained, determining the target content in the target file pointed to by the target input as the target object.
Specifically, when the Word document is displayed in the display output area of the electronic device, the page of the document shown on the display interface is the content display interface of the target file. The content selected by the user can then be determined as the target object based on the user's selection input on that interface. For example, when the target object is a piece of target text, an underline may be drawn beneath it to select it, or it may be marked with a highlight color (e.g., yellow).
In other embodiments, the target content searched from the word document may be determined as the target object based on a user search. The search input of the target content may include a search of text content in the target file, that is, when the target object is determined to be a certain target text, the target text or a keyword representing the target text may be directly input in a search box of the content display interface, the target text is searched, and the searched target text is determined to be the target object.
The search input for the target content may also include a search of a library of text-based snapshots in the target file. Specifically, the content in the word document may be divided and different snapshots may be generated, the different snapshots may be stored in a snapshot library, then a snapshot including the target content may be searched from the target file by inputting a keyword or scanning a picture, a two-dimensional code, or the like, and the searched snapshot including the target text may be determined as the target object. In the embodiment, the contents such as the text in the target file are stored in the form of the snapshot, so that the target contents can be prevented from being modified, and the accuracy of target content searching is ensured.
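The snapshot-library idea above can be sketched as follows. This is a minimal illustration only, not the patent's implementation; the `SnapshotLibrary` class and the use of a content hash as a stable, tamper-evident snapshot id are assumptions made for the sketch:

```python
import hashlib


class SnapshotLibrary:
    """Minimal snapshot store: content is kept immutable and keyed by its hash."""

    def __init__(self):
        self._snapshots = {}  # snapshot_id -> snapshot text

    def add(self, text):
        # Hashing the content yields a stable id, so a later re-hash can
        # verify that the stored snapshot has not been modified.
        snapshot_id = hashlib.sha256(text.encode("utf-8")).hexdigest()[:12]
        self._snapshots[snapshot_id] = text
        return snapshot_id

    def search(self, keyword):
        # Return the ids of every snapshot whose text contains the keyword.
        return [sid for sid, text in self._snapshots.items() if keyword in text]


library = SnapshotLibrary()
sid = library.add("Price trend of commodity A in 2021")
assert library.search("Price trend") == [sid]
```

Because the snapshot id is derived from the content itself, any modification of the stored text would no longer match its id, which is one way to realize the tamper-resistance described above.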
In the above embodiment, the search mode of the target content is mainly text search, for example, text search is performed by using keywords; in a specific implementation, the search mode may also be a voice search, and may use a microphone of the electronic device to collect voice information of the user, and use the voice information to search for the target content in the target file.
In the above embodiment, the electronic device may be an electronic device which has a touch screen and can directly interact with a user, such as a mobile phone and a tablet computer. In other embodiments, the target object may be determined based on a user action with a target input of an input component of the electronic device when the user is unable to directly interact with the electronic device. The input means may include a mouse, a keyboard, a light pen, a handwriting input board, a voice input recognition device, a lip language input recognition device, and the like.
In the above embodiments, the target object is mainly determined based on user input. In other embodiments, the target object may also be determined automatically: for example, when a user browsing a document stays on a certain page for longer than a preset time length, the content of that page may be automatically determined as the target content, which is then taken as the target object.
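The dwell-time trigger just described can be sketched as follows. This is a hypothetical illustration; the `DwellTracker` class and the 5-second threshold are assumptions, not values from the disclosure:

```python
import time


class DwellTracker:
    """Promote the current page's content to target object once the user's
    dwell time on that page exceeds a preset length."""

    def __init__(self, threshold=5.0):  # threshold in seconds (assumed value)
        self.threshold = threshold
        self._page = None
        self._since = None

    def on_page_shown(self, page_content, now=None):
        # Record which page is visible and when it appeared.
        self._page = page_content
        self._since = time.monotonic() if now is None else now

    def check(self, now=None):
        # Return the page content as the target object once the dwell
        # time reaches the threshold; otherwise return None.
        now = time.monotonic() if now is None else now
        if self._page is not None and now - self._since >= self.threshold:
            return self._page
        return None


tracker = DwellTracker(threshold=5.0)
tracker.on_page_shown("page 3 content", now=0.0)
assert tracker.check(now=3.0) is None          # not dwelt long enough yet
assert tracker.check(now=6.0) == "page 3 content"
```

In a real document viewer the `on_page_shown` calls would come from scroll or page-turn events rather than explicit timestamps.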
S102: and if an instruction for performing associated display on the target object and the associated object thereof is obtained, determining the associated object from the target file where the target object is located or the target associated file associated with the target file.
The associated object is content associated with the target object, so that the user can perform subsequent operations such as comparison and combination with the target object.
When a user needs to display a target object and its associated object in association, the user can perform an input operation on the electronic device to issue an instruction for the associated display. After receiving the instruction, the electronic device searches for the associated object in the target file where the target object is located, or in a target associated file associated with the target file, thereby determining the associated object.
The target associated file may be a file with the same or a similar identification as the target file; for example, if the file name of the target file is "notification about XXX", the target associated file may be a file whose name is the same or similar. It may also be a file whose content matches that of the target file: when the matching degree between the two files' contents reaches a preset threshold, the files can be determined to be associated with each other. It may be a file in a parallel or dependent relationship with the target file, for example a file in the same folder. It may also be a file with a link relationship to the target file, so that after the target file is determined the electronic device can jump to the target associated file via the link. Like the target file, the target associated file may be a document, a web page, a table, a presentation, a database, and so on.
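Two of the association cues above, similar file names and residence in the same folder, can be sketched with the standard library. The `is_associated` helper and the 0.6 name-similarity cutoff are illustrative assumptions:

```python
import difflib
from pathlib import Path


def is_associated(target: Path, candidate: Path, name_cutoff=0.6) -> bool:
    """Heuristic association test: the candidate is treated as a target
    associated file if it lives in the same folder as the target, or if
    its name (without extension) is sufficiently similar."""
    same_folder = target.parent == candidate.parent
    name_similarity = difflib.SequenceMatcher(
        None, target.stem, candidate.stem).ratio()
    return same_folder or name_similarity >= name_cutoff


# Same folder counts as associated regardless of name.
assert is_associated(Path("/docs/notification_about_XXX.docx"),
                     Path("/docs/budget.xlsx"))
# A similar name counts as associated even across folders.
assert is_associated(Path("/a/notification_about_XXX.docx"),
                     Path("/b/notification_about_XXX_v2.docx"))
```

A fuller implementation would also check content matching and link relationships, as the paragraph above describes.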
In this embodiment, the file type of the target associated file may be the same as or different from that of the target file. For example, when the target file is a Word document, the target associated file may be a web page; provided a network connection is available, the target file and the target associated file can still be associated.
In step S102, obtaining an instruction for performing a display of the target object and the associated object thereof, includes:
obtaining an instruction for comparing the target object with its associated object; or
obtaining a request for a comparison result of the target object and its associated object; or
obtaining an instruction for splitting the target object and its associated object; or
obtaining an instruction for fusing the target object and its associated object.
Specifically, the user may directly input an instruction for comparing the target object with its associated object, and the electronic device searches for the associated object after receiving the instruction. When the user wants to obtain the comparison result directly, or when the target object or associated object contains so much content that a direct side-by-side display is not effective, the user can instead input a request for the comparison result of the target object and its associated object; on receiving this request, the electronic device likewise searches for the associated object.
Further, the user may also input an operation instruction for splitting or fusing the target object and its associated object, so that the electronic device searches for the associated object and performs the corresponding splitting or fusing operation. Fusion includes modes such as superposition, splicing, and embedding. For example, when the target object contains a large amount of text and is inconvenient to view, or contains text in different font sizes, it may be split. When the target object and the associated object are similar pictures, they may be superimposed to quickly reveal their differences; when the target object is a chart and the associated object is its detailed textual description, the two may be spliced so that the user can understand the chart and the description together; and when the target object contains a blank area, the associated object may be embedded in that area to fuse the two and ease comparison.
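The splicing and embedding modes can be sketched at the text level as follows. This is a simplified illustration; the `splice`/`embed` helpers and the `[BLANK]` placeholder are assumptions, and real superposition of pictures would operate on image data instead:

```python
def splice(target: str, associated: str) -> str:
    """Splicing: place the target (e.g. a chart) and its associated object
    (e.g. the chart's textual description) together in one view."""
    return target + "\n---\n" + associated


def embed(target: str, associated: str, placeholder="[BLANK]") -> str:
    """Embedding: fill a blank area of the target with the associated object."""
    return target.replace(placeholder, associated, 1)


merged = embed("Q1 report: [BLANK] (see appendix)", "sales chart")
assert merged == "Q1 report: sales chart (see appendix)"
```

Splitting would be the inverse operation: cutting an oversized target object into pieces that are displayed separately.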
In step S102, determining the associated object from the target file in which the target object is located includes:
obtaining attribute information of the target object, and determining content in the target file that has the same attribute as the target object as the associated object; or
obtaining attribute information of the target object and position information of the target object in the target file, and determining content in the target file that has the same attribute as the target object and satisfies a preset positional relationship as the associated object; or
obtaining content information of the target object, and determining content in the target file whose similarity to the target object reaches a first content similarity threshold as the associated object; or
obtaining content information and attribute information of the target object, and determining content in the target file that has the same attribute as the target object and whose similarity reaches a second content similarity threshold as the associated object.
The attribute information of the target object includes its type, such as text, picture, table, formula, or audio content. For example, when the target object in a Word document is determined to be a picture, other pictures in the document may be determined as associated objects, so that the different pictures in the target file can be compared in one place. The attribute information may also include identification information of the target object: for example, the pictures in a Word document are usually labeled FIG. 1, FIG. 2, and so on, and when the picture labeled FIG. 1 is determined to be the target object, the picture labeled FIG. 2 (or one with other identification information) may be determined as the associated object based on the FIG. 1 label.
In some embodiments, both the attribute information of the target object and its position information in the target file may be obtained. For example, when the target object is a piece of target text in a Word document, not only can its attribute be determined to be text, but the page, paragraph, and other position details of that text in the document can also be determined. Particularly when the target file contains several candidate associated objects, obtaining the target object's position information allows the final associated object to be identified accurately, improving the accuracy of the determination. For example, when the associated object of FIG. 1 is determined to be FIG. 2, and FIG. 2 comprises FIG. 2-1 and FIG. 2-2, the one satisfying the preset positional relationship, say FIG. 2-1, may be taken as the final associated object.
In the above embodiments, the associated object is determined from the attribute information of the target object, possibly combined with its position information. Because attribute information is relatively coarse, the associated objects so determined may also be coarse, and a final associated object must be narrowed down further. In other embodiments, to improve both the efficiency and the accuracy of the determination, the content information of the target object may be obtained directly, and content in the target file whose similarity to the target object reaches the first content similarity threshold is determined as the associated object.
For example, when the target object in a Word document is a piece of target text, content in the document whose similarity to that text exceeds 90% may be determined as the associated object. When the target object is a chart or table, a chart with an identical header (a header similarity of 100%) may be determined as the associated object: for instance, if the target object is the current-year price trend chart of a commodity, the associated object is the previous-year price trend chart of the same commodity, both sharing the header "price trend chart". Likewise, if the target object and the associated object are balance sheets, which follow a standard format, they can be matched by their header content. When the target object is a table containing both a header (title) and table contents, a table with the same header and whose cell contents reach a certain similarity threshold may be determined as the associated object. In other words, the first content similarity threshold is chosen according to the type of the target object.
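The threshold-based content matching described above can be sketched with the standard library's `difflib`. This is a minimal illustration under the assumption that content is compared as plain text; the `find_associated` helper and the 0.9 default cutoff mirror the 90% example but are not the disclosure's implementation:

```python
import difflib


def find_associated(target_text, candidates, threshold=0.9):
    """Return every candidate whose similarity to the target text reaches
    the content similarity threshold (90% in the example above)."""
    matches = []
    for text in candidates:
        ratio = difflib.SequenceMatcher(None, target_text, text).ratio()
        if ratio >= threshold:
            matches.append(text)
    return matches


candidates = ["Price trend of commodity A, 2021",
              "Price trend of commodity A, 2020",
              "Staff holiday schedule"]
found = find_associated("Price trend of commodity A, 2021", candidates)
assert "Price trend of commodity A, 2020" in found   # near-identical header
assert "Staff holiday schedule" not in found          # unrelated content
```

A chart or table would instead be matched on its extracted header text, with the threshold raised to 100% for the exact-header case.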
Further, the content information and the attribute information of the target object may be obtained at the same time, and content in the target file that has the same attribute as the target object and satisfies a second content similarity threshold may be determined as the associated object.
In a specific embodiment, for example, the target object is a chart including a header (title). When the associated object is determined using the second content similarity threshold alone, both a chart and a text whose content matches the header of the target object may be obtained; the text has the same content as the header, but the user wants to compare the two charts. Therefore, on the basis of preliminarily determining the associated object using the second content similarity threshold, the associated object to be finally compared is determined using the attribute information of the target object, ensuring the accuracy of determining the associated object. In other embodiments, the content information and the location information of the target object may also be obtained at the same time to determine the associated object. For example, when the target object in a word document is a target text, content in the word document whose similarity with the text content of the target text is 90% or more may be preliminarily determined as the associated object; when multiple associated objects are preliminarily determined at different positions in the word document, the final associated object may be determined further according to the position of the target text.
In still other embodiments, the content information, the attribute information, and the position information of the target object may be obtained at the same time, and the associated object of the target object may then be determined accurately based on all three. As described above, when a plurality of associated objects are determined according to the content information and the attribute information of the target object, the final associated object for comparison is determined based on the position information of the target object.
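One way to combine the three kinds of information is to filter by attribute, then by content similarity, then resolve ties by position. The sketch below follows that order; the `DocObject` data model, the page-distance tiebreak, and all names are assumptions introduced for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class DocObject:
    attr: str      # attribute information, e.g. "text", "chart", "table"
    content: str   # content information
    page: int      # position information within the target file

def find_associated(target: DocObject, candidates: list,
                    similarity: Callable[[str, str], float],
                    threshold: float = 0.90) -> Optional[DocObject]:
    """Filter candidates by same attribute and by content similarity; when
    several remain, keep the one nearest the target's position."""
    same_attr = (c for c in candidates if c.attr == target.attr)
    similar = [c for c in same_attr
               if similarity(target.content, c.content) >= threshold]
    if not similar:
        return None
    return min(similar, key=lambda c: abs(c.page - target.page))
```

Any similarity function can be plugged in; an exact-match comparison already reproduces the "several candidates, nearest position wins" behavior described above.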
It should be noted that the type of the target object may be the same as or different from that of the associated object. For example, the target object is a target text in a word document, the associated object is a certain diagram in the document, and the target text is an explanation of the diagram, so that the text content of the target text and the header or a parameter tag of the diagram satisfy a certain content similarity threshold.
In some embodiments, in step S102, determining the association object from a target association file associated with the target file includes:
acquiring attribute information of the target object, and determining content in the target associated file that has the same attribute as the target object as the associated object; or
obtaining content information of the target object, and determining content in the target associated file that satisfies a third content similarity threshold with the target object as the associated object; or
obtaining the content information and the attribute information of the target object, and determining content in the target associated file that has the same attribute as the target object and satisfies a fourth content similarity threshold as the associated object.
Determining the associated object from the target associated file associated with the target file is similar to the method for determining the associated object from the target file where the target object is located, and is not repeated here.
In particular, when the associated object and the target object are in the same target file, the associated object can be determined accurately based on the position information of the target object in that file. When the associated object is in a target associated file, however, the type, content, and so on of the target file and the target associated file may differ, so determining the associated object from the target associated file based on position information may make the search cumbersome or inconclusive. Therefore, when determining the associated object from the target associated file associated with the target file, the position information of the target object in the target file need not be used.
Since the types of the target associated file and the target file may differ, the third and fourth content similarity thresholds differ from the first and second content similarity thresholds. In this embodiment, the third and fourth content similarity thresholds may be defined as the proportion of identical sub-objects contained in the target object and its associated object. For example, when the target file is a word document and the target associated file is a PPT, their typesetting differs; defining the third and fourth content similarity thresholds as the proportion of identical sub-objects ensures the accuracy of determining the associated object and prevents key associated objects from being missed.
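The cross-file measure described here, the proportion of identical sub-objects shared by the target object and a candidate, might be computed as below. Representing sub-objects as a set of identifiers is an assumption made purely for illustration.

```python
def subobject_overlap(target_subs: set, candidate_subs: set) -> float:
    """Fraction of the target's sub-objects also present in the candidate.
    Because it ignores order and layout, the measure is unaffected by the
    different typesetting of, say, a word document and a PPT slide."""
    if not target_subs:
        return 0.0
    return len(target_subs & candidate_subs) / len(target_subs)
```

A paragraph reused verbatim on a slide would then score highly even though the two files are laid out completely differently.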
In a specific implementation, there may be one or more determined related objects, and the disclosure is not particularly limited.
S103: displaying the target object and the related object thereof in a display output area of the electronic equipment at least according to the display parameter of the content display interface where the target object is located;
wherein the display parameters of the target object and the associated object in the display output area are different.
After the associated object of the target object is determined, the associated object and the target object can be displayed simultaneously. At least according to the display parameter of the content display interface where the target object is located, the target object and the associated object are displayed in the display output area with different display parameters, so that the two are displayed in a differentiated manner.
The display parameters of the target object and the associated object in the display output area include their display style, display format, or display mode in the display output area. The display style includes display position, font, color, size, and the like; the display format includes formats of a specified type, such as a text type or a picture type; the display mode may include full display, partial display, hidden display, or the like.
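As a rough sketch, these three groups of display parameters might be gathered into one record; the field names and default values here are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class DisplayParams:
    # display style: position, font, color, size
    position: str = "near-target"
    font: str = "default"
    color: str = "black"
    size: int = 12
    # display format: "text" or "picture"
    fmt: str = "text"
    # display mode: "full", "partial", or "hidden"
    mode: str = "full"
```

Differentiated display then amounts to giving the target object and the associated object two such records that differ in at least one field.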
For example, when the associated object is determined to be text content, the target object and its associated object can be displayed directly in the display output area in text form, so that the user can conveniently mark the associated object. Alternatively, the target object and its associated object may be displayed in an unmodifiable format such as a picture, preventing the text content from being confused by misaligned text formatting when the two are displayed as text.
In step S103, displaying the target object and its associated object in a display output area of the electronic device according to at least the display parameter of the content display interface where the target object is located, including:
S1031: if the display parameters represent that the content display interface where the target object is located is in a full-screen display state, displaying the associated object at a position close to the target object in the content display interface, or displaying the target object and the associated object at the forefront of the display output area; or,
S1032: if the display parameter represents that the content display interface where the target object is located is in a first position area of the display output area, displaying the target object in the first position area and displaying the associated object of the target object in a second position area different from the first position area, or displaying the target object and the associated object in the second position area; or,
S1033: acquiring configuration information of the display output area of the electronic equipment, and displaying the target object and its associated object in the display output area of the electronic equipment according to the configuration information and the display parameters.
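The branch among S1031, S1032, and S1033 can be viewed as a placement decision. The sketch below outlines that decision under stated assumptions; the returned area labels and the function name are hypothetical, not part of the disclosure.

```python
def choose_areas(fullscreen: bool, second_display: bool,
                 target_region: str = "left") -> tuple:
    """Return (area for target object, area for associated object).

    S1033: with a second display output area configured, it hosts the
    associated object.  S1031: full-screen, so show the associated object
    beside the target.  S1032: windowed, so use another region."""
    if second_display:                        # S1033
        return ("display-1:" + target_region, "display-2")
    if fullscreen:                            # S1031
        return ("near-target", "near-target")
    other = "right" if target_region == "left" else "left"
    return (target_region, other)             # S1032
```

The three steps stay independent: each call exercises exactly one branch.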
As shown in fig. 2, in step S1031, when the display parameter indicates that the content display interface where the target object is located is in a full-screen display state, the content display interface occupies a display output area of the electronic device.
At this time, when the target object is visually displayed on the content display interface, its associated object may be displayed at a position on the content display interface close to the target object. For example, when a word document is displayed full-screen on the display interface of an electronic device, a page of the word document is the content display interface, the target text is displayed on the content display interface, and the text content of the associated object can be displayed around the target text.
Optionally, as shown in fig. 3, when the content display interface further includes other content likely to cause interference, or the target object is displayed at an inconspicuous position in the content display interface that is inconvenient for the user to view and compare, the target object and its associated object may be highlighted at the forefront of the display output area for convenient viewing and comparison.
As shown in fig. 4, in step S1032, the display output area of the electronic device is not fully occupied by the content display interface. The first position area may be the left area of the display output area; when the target object is displayed there, the right area of the display output area may serve as a second position area in which to display the associated object, and displaying the target object and the associated object in different areas makes the comparison effect obvious. Optionally, as shown in fig. 5, when the content display interface further includes other content likely to cause interference, or the target object is displayed at an inconspicuous position such as the upper-left corner of the first position area, the target object and its associated object may both be displayed in a second position area different from the first position area, which is convenient for the user to view and compare.
In step S1033, displaying the target object and the associated object in a display output area of the electronic device according to the configuration information and the display parameter, including:
if the configuration information represents that the electronic equipment includes at least a first display output area and a second display output area, and the display parameter represents that the content display interface where the target object is located is in a full-screen display state in the first display output area, displaying the target object at the forefront of the first display output area and displaying the associated object of the target object in the second display output area, or displaying the target object and its associated object in the second display output area; or,
if the configuration information indicates that the electronic device includes at least a first display output area and a second display output area, and the display parameter indicates that the content display interface where the target object is located is in a third position area of the first display output area, displaying the target object in the third position area and displaying the associated object of the target object in a fourth position area of the first display output area different from the third position area, or displaying the target object and its associated object in the second display output area.
Specifically, as shown in fig. 6, when the electronic device includes different display areas, for example the first display output area is the left area of the electronic device and the second display output area is the right area, and the content display interface where the target object is located occupies the first display output area, then, to prevent interference from other content in the target file and for convenience of comparison, the target object may be displayed at the forefront of the first display output area and the associated object of the target object in the second display output area.
Optionally, as shown in fig. 7, when the content display interface further includes other content likely to cause interference, or the target object is at an inconspicuous position such as the upper-left corner of the first display output area, the target object and its associated object may both be displayed in the second display output area, so that displaying them for comparison does not affect the display of other content in the first display output area.
In other embodiments, as shown in fig. 8, the electronic device includes different display areas; for example, the first display output area is the left area of the electronic device, the second display output area is the right area, and the content display interface where the target object is located is in a third position area of the first display output area, for example its upper half, i.e., the content display interface does not fully occupy the first display output area. The target object may then be displayed in the upper half of the first display output area and its associated object in the lower half, so that the two are displayed for comparison within the first display output area; comparison within the same display output area makes the contrast obvious.
Alternatively, similar to the arrangement of fig. 4, the target object may be displayed in the upper half of the first display output area and the associated object in the second display output area.
Alternatively, when there is much other irrelevant content in the upper half of the first display output area, or the target object is displayed at an inconspicuous position there, the target object and its associated object may be displayed together in the lower half of the first display output area, or, similar to fig. 5, displayed together in the second display output area.
The various display modes in step S103 may be selected according to actual needs, as long as the target object and its associated object can be highlighted and displayed in an associated manner. Steps S1031, S1032, and S1033 are independent steps, each realizing the associated display of the target object and the associated object.
In some embodiments, the step S103 of displaying the target object and the associated object in the display output area of the electronic device includes:
after the target object and the associated object are processed into a file of a predetermined format, displaying the file in the display output area in the form of a floating window or a layer; and/or,
displaying a comparison result of the target object and the associated object in a predetermined form in the display output area.
For example, when the target object and its associated object are both texts, images containing the target object and the associated object may each be captured to generate snapshots, the snapshots saved in picture format and displayed in the display output area in the form of a floating window or a layer.
The floating-window or layer display works as follows: the target object and its associated object, in the predetermined format, are displayed above the content display interface of the previously opened target file, which is not closed but can no longer be edited.
Specifically, as shown in fig. 2, in step S1031, the target text may be displayed on the content display interface in the original text format (no frame line around the target text), and the associated text may be displayed adjacent to the target object in picture format (a frame line around the associated text); alternatively, the target object may be displayed in picture format, overlaid on the original target text. Or, as shown in fig. 3, the target text and the associated text are each displayed in picture format at the forefront of the display output area.
In step S1032 and step S1033, the display forms of the target object and the associated object at other display positions are similar to that in step S1031, and are not described again here.
In other embodiments, after the associated object is determined, its specific content may not be displayed at all; instead, the comparison result between the target object and its associated object may be displayed directly.
The predetermined form of the comparison result display may be a floating window, a pop-up window, or an OSD (on-screen display) menu. For example, the comparison result may be displayed directly in a floating window at a position close to the target object on the content display interface, so that the comparison result and the original target object can be checked against each other to verify the accuracy of the result, or the original target object can be understood more deeply based on the result. The comparison result may also be displayed in a pop-up window; for example, when the target object is displayed on the content display interface, clicking the target object pops up the comparison result. When displayed as an OSD, the comparison result may be a rectangular menu containing the comparison result of each subdivision item, and each subdivision index of the result can be analyzed against the menu, so that the target object and its associated object are compared and analyzed further and more precisely.
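An OSD menu of per-item comparison results, as described, could be built from two tables of subdivision items. The sketch below assumes each object's subdivisions are a mapping from item name to a numeric value; that representation, and the function name, are assumptions for illustration.

```python
def compare_subdivisions(target: dict, assoc: dict) -> dict:
    """Difference per subdivision item shared by both objects, suitable
    for rendering as the rows of a rectangular OSD comparison menu."""
    return {item: target[item] - assoc[item]
            for item in sorted(target.keys() & assoc.keys())}
```

Each row of the resulting mapping corresponds to one subdivision index the user can analyze against the menu.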
The comparison result may be displayed at any position of the display output area, for example in the first display output area, the second display output area, or the first position area. Preferably, whether or not the content display interface fully occupies the display output area, the comparison result is displayed adjacent to the target object displayed in the content display interface, so that the result can be analyzed further against the target object.
In some embodiments, the processing method further comprises:
s104: and classifying and/or editing the target object and the associated object thereof.
Specifically, before step S101, different target objects in the target file may be classified in advance and their attribute information determined, so that a target object can be invoked promptly once it is determined from the target file in step S101. For example, the whole word document may be traversed in advance, its content divided, and a plurality of pictures containing target objects captured to generate snapshots. The pictures containing the target objects are input into a preset classification model and processed with a picture matching algorithm to obtain the classification result of the target object in each picture, generating a preset classification picture library. The classification categories (attribute information) of the target object may include text, formula, table, picture, video, audio, and the like.
Further, in some embodiments, since the picture category may include sub-categories such as people, animals, and scenery, in order to accurately determine the category to which the content of a picture belongs, the broad picture category may be classified further to obtain the pictures of each sub-category, that is, the sub-category attribute information of the target object. As another example, a target object of the formula class may be classified further according to the subject to which the formula belongs, for example, whether it is a mathematical formula or a physical formula.
In step S104, after the target object is determined in step S101, the target object may be classified based on the preset classification result: if the target object belongs to a preset classification category, it is classified into the corresponding category; if not, a new classification category is established based on the target object, the target object is classified into the new category, and the new category is stored in the preset classification picture library, thereby updating the library.
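The classify-or-create update of the preset classification picture library might look like the following. The classification model itself is stubbed out, since the disclosure only requires that some preset model produce a category; the dict-based library and the function name are assumptions for illustration.

```python
def file_snapshot(snapshot_id: str, predicted_category: str,
                  library: dict) -> None:
    """Place a snapshot into its predicted category; if the category is not
    yet in the preset picture library, create it, updating the library."""
    library.setdefault(predicted_category, []).append(snapshot_id)
```

A snapshot whose predicted category already exists simply joins that category; an unseen category is created on first use, which is the library update described above.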
In some embodiments, when the associated object comes from the target file, that is, the preset classification picture library includes both pictures of the target object and pictures of the associated object, in step S102 the classification category of the target object may be identified using the obtained classification result, and the associated object may be matched from the preset classification picture library based on that category.
When the associated object is derived from the target associated file, before step S101, the associated object may be classified based on a preset classification model, and the classification category to which the associated object belongs is determined, so that the associated object having the same attribute information as the target object is determined in time through step S102. Further, after the associated object is determined in step S102, the associated object may be classified to determine the category thereof.
After the target object and its associated objects are classified, as shown in fig. 9, the plurality of pictures containing target objects in the preset classification categories may be displayed, hidden behind the top menu bar of the target file according to the category to which each belongs, so that the user may determine the target object to be compared by a selection input or the like. That is, different target objects may be displayed as an OSD on the display output area of the electronic device; when the user clicks the menu bar, the corresponding target object may be selected based on the identification information of each target object. After the target object is determined, it may be displayed in the display output area in the manner of step S103.
Alternatively, the user may switch among different categories of target objects using a shortcut key (e.g., Ctrl + scroll wheel), the left and right arrow keys, or a mouse, and the target objects under a classification category may be displayed in sequence by scrolling the wheel. Further, the target objects to be compared under a classification category can be selected with a shortcut key (e.g., Shift + click) or the mouse. In this embodiment, a plurality of target objects may be selected simultaneously in the same classification category and displayed in the manner of step S103, so that their associated objects are displayed at the same time.
In particular, in this embodiment, as shown in fig. 9, after a target object is further classified into a sub-category, the identification information of the target object displayed in the top menu bar is the identification information of the most recently recorded sub-category. For example, when the target object and associated object displayed last time were people pictures, the sub-category "people pictures" under the picture category may be displayed on the top menu bar, prompting the user that this sub-category exists and that the most recently recorded picture is a people picture, for convenient comparison.
In this step, editing the target object and the associated object includes: editing the content of the target object and the related object thereof, or adding or deleting the generated picture containing the target object.
For example, the content of the target object may be edited and its attribute information determined anew based on the edited content; that is, editing and classification of the target object are combined so as to re-determine the associated object, enabling timely updating of the target file. For another example, when a plurality of associated objects are displayed in the display output area, one of them may be selected for comparison with the target object and the others deleted to facilitate the comparison.
According to the processing method provided by the embodiment of the disclosure, after the target object in the target file is determined, the associated object of the target object can be quickly searched from the target file or the target associated file thereof, and the target object and the associated object thereof are associated and displayed in the display output area of the electronic device according to different display parameters, so that a user can more conveniently perform file processing operations such as comparison and the like, and the file processing efficiency and effect are improved.
Fig. 10 shows a schematic structural diagram of a processing device according to an embodiment of the disclosure. As shown in fig. 10, an embodiment of the present disclosure provides a processing apparatus, including:
a first determining module 100 configured to determine a target object from a target file;
a second determining module 200, configured to determine the associated object from a target file in which the target object is located or a target associated file associated with the target file if an instruction for displaying the target object and the associated object in an associated manner is obtained;
the display module 300 is configured to display the target object and the related objects thereof in a display output area of the electronic device according to at least the display parameters of the content display interface where the target object is located;
wherein the display parameters of the target object and the associated object in the display output area are different.
According to the processing device provided by the embodiment of the disclosure, after the target object in the target file is determined, the associated object of the target object can be quickly searched from the target file or the target associated file thereof, and the target object and the associated object thereof are associated and displayed in the display output area of the electronic device according to different display parameters, so that a user can more conveniently perform file processing operations such as comparison and the like, and the file processing efficiency and effect are improved.
The processing apparatus provided in the embodiment of the present disclosure corresponds to the processing method in the embodiment described above, and based on the processing method described above, a person skilled in the art can understand the specific implementation manner of the processing apparatus in the embodiment of the present disclosure and various variations thereof, and any optional items in the embodiment of the processing method are also applicable to the processing apparatus, and are not described herein again.
An embodiment of the present disclosure further provides an electronic device, including a processor and a memory, wherein the memory is configured to store computer-executable instructions, and the processor implements the above processing method when executing the computer-executable instructions.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The memory may include Random Access Memory (RAM) and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The embodiment of the present disclosure also provides a computer-readable storage medium, on which computer-executable instructions are stored, and when the computer-executable instructions are executed by a processor, the processing method is implemented.
The above embodiments are merely exemplary embodiments of the present disclosure, which is not intended to limit the present disclosure, and the scope of the present disclosure is defined by the claims. Various modifications and equivalents of the disclosure may occur to those skilled in the art within the spirit and scope of the disclosure, and such modifications and equivalents are considered to be within the scope of the disclosure.

Claims (10)

1. A method of processing, comprising:
determining a target object from a target file;
if an instruction for performing associated display on the target object and the associated object thereof is obtained, determining the associated object from a target file where the target object is located or a target associated file associated with the target file;
displaying the target object and the related object thereof in a display output area of the electronic equipment at least according to the display parameter of the content display interface where the target object is located;
wherein the display parameters of the target object and the associated object in the display output area are different.
2. The method of claim 1, wherein determining a target object from a target file comprises:
if a selection input acting on a content display interface of the target file is obtained, determining the content selected by the selection input as the target object; or
if a search input acting on target content of the target file is obtained, determining the target content searched by the search input as the target object; or
if a target input acting on an input component of the electronic equipment is obtained, determining the target content in the target file pointed to by the target input as the target object.
3. The method of claim 1 or 2, wherein obtaining the instruction to display the target object in association with its associated object comprises:
obtaining an instruction to compare the target object with its associated object; or
obtaining a request for a comparison result of the target object and its associated object; or
obtaining an instruction to split the target object and its associated object; or
obtaining an instruction to fuse the target object and its associated object.
4. The method of claim 3, wherein determining the associated object from a target file in which the target object is located comprises:
acquiring attribute information of the target object, and determining content in the target file that has the same attribute as the target object as the associated object; or
acquiring attribute information of the target object and position information of the target object in the target file, and determining content in the target file that has the same attribute as the target object and satisfies a preset positional relationship with it as the associated object; or
obtaining content information of the target object, and determining content in the target file whose content similarity with the target object reaches a first content similarity threshold as the associated object; or
obtaining the content information and the attribute information of the target object, and determining content in the target file that has the same attribute as the target object and whose content similarity with it reaches a second content similarity threshold as the associated object.
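The matching rules of claim 4 could be sketched as below, assuming dictionary-shaped content items with `attr` and `text` fields and using `difflib` ratios as a stand-in for whatever similarity measure an implementation would actually use; all names and the measure are assumptions, not part of the claims.

```python
from difflib import SequenceMatcher

def find_associated(target, candidates, require_same_attr=True, sim_threshold=None):
    """Return candidates matching the target by attribute equality and/or by a
    content-similarity threshold (the fields and the difflib-based measure are
    illustrative stand-ins for the patent's unspecified implementation)."""
    matches = []
    for c in candidates:
        if c is target:
            continue
        # Branch 1: require the same attribute information.
        if require_same_attr and c["attr"] != target["attr"]:
            continue
        # Branch 2: require the content similarity to reach the threshold.
        if sim_threshold is not None:
            sim = SequenceMatcher(None, target["text"], c["text"]).ratio()
            if sim < sim_threshold:
                continue
        matches.append(c)
    return matches
```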
5. The method of claim 3, wherein determining the associated object from a target associated file associated with the target file comprises:
acquiring attribute information of the target object, and determining content in the target associated file that has the same attribute as the target object as the associated object; or
obtaining content information of the target object, and determining content in the target associated file whose content similarity with the target object reaches a third content similarity threshold as the associated object; or
obtaining the content information and the attribute information of the target object, and determining content in the target associated file that has the same attribute as the target object and whose content similarity with it reaches a fourth content similarity threshold as the associated object.
6. The method according to claim 4 or 5, wherein displaying the target object and the associated object in a display output area of the electronic device according to at least a display parameter of a content display interface where the target object is located comprises:
if the display parameter indicates that the content display interface in which the target object is located is in a full-screen display state, displaying the associated object at a position close to the target object in the content display interface, or displaying the target object and the associated object at the forefront of the display output area; or
if the display parameter indicates that the content display interface in which the target object is located is in a first position area of the display output area, displaying the target object in the first position area and displaying the associated object of the target object in a second position area different from the first position area, or displaying the target object and the associated object in the second position area; or
acquiring configuration information of the display output area of the electronic device, and displaying the target object and its associated object in the display output area according to the configuration information and the display parameter.
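The two placement branches of claim 6 could look like the following sketch; the `state`/`area` keys and the area names are assumptions chosen for illustration.

```python
def place(display_param, target, associated):
    """Choose where to show the target and its associated object, following
    the two branches of claim 6 (all keys and area names are illustrative)."""
    if display_param["state"] == "fullscreen":
        # Full-screen interface: show the associated object adjacent to the
        # target inside the content display interface.
        return {"area": "interface", "layout": "adjacent",
                "objects": [target, associated]}
    # The interface occupies a first position area: keep the target there and
    # put the associated object in a different, second position area.
    first = display_param["area"]
    second = "area2" if first == "area1" else "area1"
    return {"target_area": first, "associated_area": second}
```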
7. The method of claim 6, wherein displaying the target object and its associated objects in a display output area of an electronic device according to the configuration information and the display parameters comprises:
if the configuration information indicates that the electronic device comprises at least a first display output area and a second display output area, and the display parameter indicates that the content display interface in which the target object is located is in a full-screen display state in the first display output area, displaying the target object at the forefront of the first display output area and displaying the associated object of the target object in the second display output area, or displaying the target object and its associated object in the second display output area; or
if the configuration information indicates that the electronic device comprises at least a first display output area and a second display output area, and the display parameter indicates that the content display interface in which the target object is located is in a third position area of the first display output area, displaying the target object in the third position area and displaying the associated object of the target object in a fourth position area of the first display output area different from the third position area, or displaying the target object and its associated object in the second display output area.
8. The method of any of claims 4, 5, or 7, wherein displaying the target object and its associated objects in a display output area of an electronic device comprises:
after processing the target object and the associated object into a file in a preset format, displaying the file in the display output area in the form of a floating window or a layer; and/or
displaying a comparison result of the target object and the associated object in a preset form in the display output area.
9. The method of claim 8, further comprising:
classifying and/or editing the target object and its associated object.
10. A processing apparatus, comprising:
a first determination module configured to determine a target object from a target file;
a second determination module configured to determine the associated object from the target file in which the target object is located or from a target associated file associated with the target file, if an instruction to display the target object in association with its associated object is obtained; and
a display module configured to display the target object and its associated object in a display output area of the electronic device at least according to a display parameter of the content display interface in which the target object is located;
wherein the display parameters of the target object and the associated object in the display output area are different.
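The apparatus of claim 10 maps naturally onto three cooperating components; the sketch below wires them together. All class and method names, and the data shapes, are assumptions for illustration only.

```python
class FirstDeterminationModule:
    """Determines a target object from a target file (index-based here)."""
    def determine_target(self, target_file, index):
        return target_file["contents"][index]

class SecondDeterminationModule:
    """Determines the associated object from the target file or its
    target associated files (attribute equality as an example rule)."""
    def determine_associated(self, target, target_file, associated_files):
        pool = target_file["contents"] + [c for f in associated_files
                                          for c in f["contents"]]
        for c in pool:
            if c is not target and c["attr"] == target["attr"]:
                return c
        return None

class DisplayModule:
    """Emits display records with different parameters for the two objects."""
    def show(self, target, associated):
        return [{"object": target, "params": {"front": True}},
                {"object": associated, "params": {"front": False}}]

class ProcessingApparatus:
    def __init__(self):
        self.first = FirstDeterminationModule()
        self.second = SecondDeterminationModule()
        self.display = DisplayModule()

    def run(self, target_file, index, associated_files):
        target = self.first.determine_target(target_file, index)
        associated = self.second.determine_associated(target, target_file,
                                                      associated_files)
        return self.display.show(target, associated)
```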
CN202110227077.2A, filed 2021-03-01 (priority date 2021-03-01), "Processing method and device", legal status: Pending.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110227077.2A CN112905080A (en) 2021-03-01 2021-03-01 Processing method and device


Publications (1)

Publication Number Publication Date
CN112905080A true CN112905080A (en) 2021-06-04

Family

ID=76108500



Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102436477A (en) * 2011-10-11 2012-05-02 鸿富锦精密工业(深圳)有限公司 Device with related content search function and method
US20130325859A1 (en) * 2012-05-30 2013-12-05 Skychron, Inc. Using chronology as the primary system interface for files, their related meta-data, and their related files
CN107977342A (en) * 2016-10-25 2018-05-01 阿里巴巴集团控股有限公司 A kind of document control methods and device
CN110188178A (en) * 2019-05-30 2019-08-30 深圳龙图腾创新设计有限公司 Across the document information lookup method of one kind, device, computer equipment and storage medium
CN110231907A (en) * 2019-06-19 2019-09-13 京东方科技集团股份有限公司 Display methods, electronic equipment, computer equipment and the medium of electronic reading
CN111552783A (en) * 2020-04-30 2020-08-18 深圳前海微众银行股份有限公司 Content analysis query method, device, equipment and computer storage medium
CN111638831A (en) * 2020-05-29 2020-09-08 维沃移动通信有限公司 Content fusion method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination