CN114547508A - Data processing method, data processing device, computer equipment and storage medium - Google Patents
Info
- Publication number: CN114547508A (application number CN202210169997.8A)
- Authority
- CN
- China
- Prior art keywords
- data
- frame selection
- area
- merged
- displaying
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/957—Browsing optimisation, e.g. caching or content distillation
- G06F16/9577—Optimising the visualization of content, e.g. distillation of HTML documents
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/958—Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
Landscapes
- Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application relates to a data processing method, a data processing apparatus, and a computer device. The method includes: in response to a frame selection operation on a current interface, displaying, in the current interface, the frame selection area selected by the operation, where the current interface includes interface data and the frame selection area includes frame selection data from the interface data; merging the frame selection data to obtain merged data; and displaying, in the current interface, a merged data display area corresponding to the frame selection area, and displaying the merged data in that area. The method improves data processing efficiency.
Description
Technical Field
The present application relates to the field of internet technologies, and in particular, to a data processing method, an apparatus, a computer device, a storage medium, and a computer program product.
Background
With the development of internet technology, data processing techniques have emerged that aggregate data selected in a web page through a page aggregation function. However, the user must first open a toolbar to find the aggregation option, can only select the data to be processed after enabling the aggregation function by clicking that option, and must close the function again before performing any other operation on the data. Processing data therefore requires cumbersome steps, leading to low data processing efficiency and a poor user experience.
Disclosure of Invention
In view of the above, there is a need for a data processing method, an apparatus, a computer device, a computer-readable storage medium, and a computer program product that can improve data processing efficiency.
In a first aspect, the present application provides a data processing method. The method comprises the following steps:
in response to a frame selection operation on a current interface, displaying, in the current interface, a frame selection area selected by the frame selection operation, where the current interface includes interface data and the frame selection area includes frame selection data from the interface data;
merging the frame selection data to obtain merged data;
and displaying, in the current interface, a merged data display area corresponding to the frame selection area, and displaying the merged data in the merged data display area.
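The three claimed steps can be illustrated as a minimal, hypothetical sketch; the names `SelectionRect`, `mergeSelection`, and `displayArea`, and the use of summation as the merge, are illustrative assumptions, not part of the patent:

```typescript
// Hypothetical sketch of the claimed three-step flow:
// 1) a frame selection yields a rectangle and the data inside it,
// 2) the frame-selected data is merged,
// 3) a display area for the merged result is positioned near the selection.

interface SelectionRect { top: number; left: number; width: number; height: number; }

interface Selection { rect: SelectionRect; data: number[]; }

// Step 2: merge the frame-selected data (summation shown as one example strategy).
function mergeSelection(sel: Selection): number {
  return sel.data.reduce((acc, v) => acc + v, 0);
}

// Step 3: place the merged-data display area directly below the selection rectangle.
function displayArea(rect: SelectionRect): SelectionRect {
  return { top: rect.top + rect.height, left: rect.left, width: rect.width, height: 24 };
}
```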
In one embodiment, merging the frame selection data to obtain merged data includes:
obtaining the coordinate points in the frame selection area and determining the cells in the frame selection area based on those coordinate points;
and obtaining the frame selection data in the cells of the frame selection area and merging that data to obtain the merged data corresponding to the frame selection area.
In one embodiment, merging the frame selection data to obtain merged data includes:
obtaining the position information of the frame selection data in the frame selection area, dividing the frame selection data into at least one class of data according to the position information, and merging the data within each class to obtain the merged data corresponding to that class;
and displaying the merged data display area corresponding to the frame selection area in the current interface, and displaying the merged data in the merged data display area, includes:
displaying, in the merged data display area, the merged data of each class in correspondence with the class it was derived from.
In one embodiment, after the frame selection area selected by the frame selection operation is displayed in the current interface, the method further includes:
in response to a sub-frame selection operation on the frame selection area, displaying, in the frame selection area, a sub-frame selection area selected by the sub-frame selection operation, the sub-frame selection area including sub-frame selection data from the frame selection data;
merging the sub-frame selection data to obtain sub-merged data;
and displaying a sub-merged data display area corresponding to the sub-frame selection area, and displaying the sub-merged data in the sub-merged data display area.
In one embodiment, merging the frame selection data to obtain merged data includes:
dividing the frame selection data into at least one class of identified data according to the data type identifier corresponding to the frame selection data, and merging the data within each class to obtain the merged data corresponding to that class;
and displaying the merged data display area corresponding to the frame selection area in the current interface, and displaying the merged data in the merged data display area, includes:
displaying, in the merged data display area, the merged data of each class together with the data type identifier of that class.
In one embodiment, after displaying, in response to the frame selection operation on the current interface, the frame selection area selected by the operation (the current interface including interface data and the frame selection area including frame selection data from the interface data), the method further includes:
displaying, in the current interface, at least one preset merging strategy associated with the frame selection area;
and merging the frame selection data to obtain merged data includes:
in response to a selection of one of the at least one preset merging strategy, taking the selected preset merging strategy as the target merging strategy;
and merging the frame selection data according to the target merging strategy to obtain the merged data.
In one embodiment, displaying the merged data in the merged data display area includes:
displaying the merged data in the merged data display area when the start of a data display operation is detected in the frame selection area;
the method further includes:
hiding the merged data display area and the merged data when the end of the data display operation is detected in the frame selection area.
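The show-on-start, hide-on-end behavior of this embodiment can be sketched as a small state holder; the class and method names are illustrative assumptions:

```typescript
// Sketch of the visibility behavior: the merged-data display area is
// shown only while a "data display" operation is active inside the
// frame selection area, and hidden again when that operation ends.
class MergedDataView {
  private visible = false;

  // Called when the start of a data display operation is detected.
  onDisplayOperationStart(): void { this.visible = true; }

  // Called when the end of the data display operation is detected.
  onDisplayOperationEnd(): void { this.visible = false; }

  isVisible(): boolean { return this.visible; }
}
```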
In one embodiment, the frame selection area comprises an image area, and the method further includes:
performing image recognition on the image area to obtain image display data;
merging the image display data to obtain image merged data corresponding to the image area;
and displaying, in the current interface, an image merged data display area corresponding to the image area, and displaying the image merged data in the image merged data display area.
In a second aspect, the present application further provides a data processing apparatus. The apparatus comprises:
a frame selection module, configured to respond to a frame selection operation on the current interface and display, in the current interface, the frame selection area selected by the operation, the current interface including interface data and the frame selection area including frame selection data from the interface data;
a merging module, configured to merge the frame selection data to obtain merged data;
and a display module, configured to display, in the current interface, the merged data display area corresponding to the frame selection area and to display the merged data in it.
In a third aspect, the present application also provides a computer device comprising a memory storing a computer program and a processor that, when executing the computer program, implements the following steps:
in response to a frame selection operation on a current interface, displaying, in the current interface, a frame selection area selected by the frame selection operation, where the current interface includes interface data and the frame selection area includes frame selection data from the interface data;
merging the frame selection data to obtain merged data;
and displaying, in the current interface, a merged data display area corresponding to the frame selection area, and displaying the merged data in the merged data display area.
In a fourth aspect, the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the following steps:
in response to a frame selection operation on a current interface, displaying, in the current interface, a frame selection area selected by the frame selection operation, where the current interface includes interface data and the frame selection area includes frame selection data from the interface data;
merging the frame selection data to obtain merged data;
and displaying, in the current interface, a merged data display area corresponding to the frame selection area, and displaying the merged data in the merged data display area.
In a fifth aspect, the present application further provides a computer program product comprising a computer program which, when executed by a processor, performs the following steps:
in response to a frame selection operation on a current interface, displaying, in the current interface, a frame selection area selected by the frame selection operation, where the current interface includes interface data and the frame selection area includes frame selection data from the interface data;
merging the frame selection data to obtain merged data;
and displaying, in the current interface, a merged data display area corresponding to the frame selection area, and displaying the merged data in the merged data display area.
With the data processing method, apparatus, computer device, storage medium, and computer program product described above, the frame selection area selected by a frame selection operation is displayed in the current interface in response to that operation; the frame selection data is merged to obtain merged data; and a merged data display area corresponding to the frame selection area is displayed in the current interface, visually distinct from the frame selection area, with the merged data shown inside it. The user therefore does not need to merge the data through a separate summation function in the interface: operation steps are reduced, the merged result for the frame selection area is obtained directly on the current interface, data processing efficiency is improved, and the user experience is improved.
Drawings
FIG. 1 is a diagram of an application environment of a data processing method in one embodiment;
FIG. 2 is a flow diagram illustrating a data processing method according to one embodiment;
FIG. 3 is a schematic flow chart illustrating data merging according to one embodiment;
FIG. 4 is a diagram illustrating consolidated data, according to one embodiment;
- FIG. 5 is a flow diagram illustrating the generation of sub-merged data in one embodiment;
FIG. 6 is a flow diagram of a data processing method in an exemplary embodiment;
FIG. 7 is a block diagram showing the structure of a data processing apparatus according to an embodiment;
FIG. 8 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The data processing method provided by the embodiments of the application can be applied in the environment shown in fig. 1, in which the terminal 102 communicates with the server 104 via a network. The terminal 102 may retrieve the online interface source code and display the interface via the server 104. A data storage system may store the data that the server 104 needs to process; it may be integrated on the server 104 or located on the cloud or another network server. The terminal 102 responds to a frame selection operation on the current interface and displays, in the current interface, the frame selection area selected by the operation, where the current interface includes interface data and the frame selection area includes frame selection data from the interface data; the terminal 102 merges the frame selection data to obtain merged data; and the terminal 102 displays, in the current interface, the merged data display area corresponding to the frame selection area and displays the merged data in it. The terminal 102 may be, but is not limited to, a personal computer, a notebook computer, a smartphone, a tablet computer, an internet-of-things device, or a portable wearable device; internet-of-things devices include smart speakers, smart televisions, smart air conditioners, smart in-vehicle devices, and the like, and portable wearable devices include smart watches, smart bracelets, head-mounted devices, and the like. The server 104 may be implemented as a stand-alone server or as a server cluster composed of multiple servers. In some embodiments, the interface displayed by the terminal 102 may also be an offline interface, in which case the terminal 102 implements the data processing method of the embodiments independently.
In one embodiment, as shown in fig. 2, a data processing method is provided, which is described by taking the application of the method to the terminal in fig. 1 as an example, and includes the following steps:
Step 202: in response to a frame selection operation on the current interface, display, in the current interface, the frame selection area selected by the operation, where the current interface includes interface data and the frame selection area includes frame selection data from the interface data.
Here, the current interface is any interface the terminal can display. It may be an online or an offline interface and, depending on the application scenario, may belong to various business systems such as an enterprise management system, an ERP system, a manufacturing management system, a supply chain system, a personnel system, a financial system, or a tax system. The frame selection operation is an event that selects an area in the current interface and can be realized through various gestures, such as a sliding (drag) operation. The interface data is the data in the current interface that can be frame-selected, including numerical data, text data, image data, table data, and the like. Displaying the frame selection area means rendering it in a style different from the rest of the current interface.
Specifically, upon detecting the user's frame selection operation on the current interface, the terminal responds by displaying the selected frame selection area in the current interface; the frame selection area contains the frame selection data, i.e. the part of the interface data the user has selected. The terminal may distinguish the frame selection area by color, for example by filling it with a specific color or highlighting it.
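A frame selection realized by a sliding (drag) operation reduces to computing a rectangle from the press point and the release point. A minimal sketch follows; the `Point` shape and the function name are assumptions for illustration, not from the patent:

```typescript
interface Point { x: number; y: number; }

// Normalize a drag from `start` to `end` into a rectangle,
// regardless of the direction in which the user dragged.
function selectionRect(start: Point, end: Point) {
  return {
    left: Math.min(start.x, end.x),
    top: Math.min(start.y, end.y),
    width: Math.abs(end.x - start.x),
    height: Math.abs(end.y - start.y),
  };
}
```

In a browser the start and end points would typically come from pointer-down and pointer-up events; the resulting rectangle can then be rendered as the highlighted frame selection area.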
Step 204: merge the frame selection data to obtain merged data.
Here, data merging means operating on the frame selection data according to a preset merging strategy, and the merged data is the data generated by that operation.
Specifically, the terminal obtains the frame selection data from the frame selection area and operates on it according to the preset merging strategy to generate the merged data corresponding to the frame selection area.
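One way to model "preset merging strategies" is a lookup table of reducers, from which the strategy selected by the user (or a default) is applied to the selected values. The strategy names here are illustrative assumptions:

```typescript
// A table of preset merging strategies: each maps the frame-selected
// numeric values to a single merged value.
type Strategy = "sum" | "average" | "max";

const strategies: Record<Strategy, (values: number[]) => number> = {
  sum: (v) => v.reduce((a, b) => a + b, 0),
  average: (v) => v.reduce((a, b) => a + b, 0) / v.length,
  max: (v) => Math.max(...v),
};

// Apply the chosen preset strategy to the frame selection data.
function merge(values: number[], strategy: Strategy): number {
  return strategies[strategy](values);
}
```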
Step 206: display, in the current interface, the merged data display area corresponding to the frame selection area, and display the merged data in it.
Here, the merged data display area is the area in which the merged data is shown; it corresponds to the frame selection area.
Specifically, the correspondence may be positional (the display area sits at a position corresponding to the frame selection area) or data-wise (each merged value is aligned with the frame selection data it was derived from). The terminal displays the merged data display area at a position corresponding to the frame selection area in the current interface, rendered differently from both the current interface and the frame selection area, for example with a distinct fill color, and typically just outside the frame selection area, with the merged data inside it. Each merged value may be displayed at the position corresponding to its source frame selection data. The terminal may keep the merged data display area and the merged data visible until a new frame selection event is detected in the current interface, at which point their display is cancelled.
In this data processing method, the frame selection area selected by the frame selection operation is displayed in the current interface in response to that operation; the frame selection data is merged to obtain merged data; and the merged data display area corresponding to the frame selection area is displayed in the current interface, clearly distinguished from it, with the merged data inside. The user no longer needs to merge data through a summation function in the interface, so operation steps are reduced, the merged result for the frame selection area is obtained directly on the current interface, data processing efficiency is improved, and the user experience is improved.
In one embodiment, as shown in fig. 3, a flow chart of data merging is provided. Step 204, merging the frame selection data to obtain merged data, includes:
step 302, obtaining the coordinate points in the frame selection area and determining the cells in the frame selection area based on those coordinate points;
step 304, obtaining the frame selection data in the cells of the frame selection area and merging it to obtain the merged data corresponding to the frame selection area.
Here, a coordinate point is a two-dimensional point at which a horizontal and a vertical coordinate in the frame selection area intersect; these points make up the frame selection area, and different cells in the area contain different coordinate points. The frame selection area is determined by the terminal from its position coordinates, which comprise the row coordinate and column coordinate of the area in the current interface together with its length and width; the row and column coordinates describe the position of the area in the current interface and are obtained from its distance to the interface boundary.
Specifically, the terminal obtains the row coordinate, column coordinate, length, and width of the area selected by the frame selection event in the current interface as the position coordinates of the frame selection area, and then determines the position and size of the frame selection area in the current interface from those coordinates.
The terminal then traverses the horizontal and vertical coordinate points in the frame selection area to obtain the coordinate points corresponding to each cell, and determines which cell each coordinate point belongs to from the preset association between coordinate points and cells.
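Determining which cell a coordinate point falls into can be sketched as a containment test over cell rectangles. The `Cell` shape is an assumption for illustration; the patent specifies only that coordinate points are associated with cells:

```typescript
interface Cell { row: number; col: number; top: number; left: number; width: number; height: number; }

// Return the cell (if any) whose rectangle contains the point (x, y).
// This plays the role of the preset association between coordinate
// points and cells described in the text.
function cellAt(cells: Cell[], x: number, y: number): Cell | undefined {
  return cells.find(
    (c) => x >= c.left && x < c.left + c.width && y >= c.top && y < c.top + c.height,
  );
}
```

Traversing the coordinate points of the frame selection area and collecting the distinct cells returned by such a lookup yields the set of cells whose data is to be merged.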
The terminal also checks the data type of the frame selection data. The preset data types are the types for which merging is defined, including numerical data, text data, image data, audio data, and the like. When the frame selection data is of a preset type, the data in the cells of the frame selection area is merged, and the terminal obtains the merged data corresponding to the area; when its type is not a preset type, the data cannot be merged and is left unprocessed.
In this embodiment, the cells in the frame selection area are obtained from the coordinate points in the area, and the frame selection data in those cells is then retrieved and merged to obtain the merged data, which speeds up merging and improves data processing efficiency.
In one embodiment, merging the frame selection data to obtain merged data includes:
obtaining the position information of the frame selection data in the frame selection area, dividing the frame selection data into at least one class of data according to the position information, and merging the data within each class to obtain the merged data corresponding to that class;
and displaying the merged data display area corresponding to the frame selection area in the current interface, and displaying the merged data in the merged data display area, includes:
displaying, in the merged data display area, the merged data of each class in correspondence with the class it was derived from.
Here, the frame selection data may be data previously edited into the cells. The position information is the distance from the cell containing the frame selection data to the boundary of the current interface; each cell has a corresponding target distance, expressible as a row coordinate or a column coordinate. A class of data is the data sharing the same row coordinate or the same column coordinate within the frame selection area; such same-class data may be one or more rows with the same row coordinate, or one or more columns with the same column coordinate.
Specifically, the terminal checks whether the target distances of the cells in the frame selection area are consistent. When the row coordinates or column coordinates of cells coincide, the cells can be grouped by that target distance, e.g. into same-column cells (equal column coordinates) or same-row cells (equal row coordinates), yielding at least one class of cells. The frame selection data corresponding to each class of cells is then obtained, and the data within each class (same-row or same-column data) is merged, producing the merged data for that class. The merged data of each class is displayed in the merged data display area, aligned with the class it was derived from. The merged data display area may be displayed in the current interface as a floating window, filled with a color that distinguishes it from the current interface and the frame selection area.
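Grouping the frame-selected cells into same-class ("same-column") data by a shared column coordinate, then merging each group, can be sketched as follows; the `SelectedCell` shape and summation as the merge are illustrative assumptions:

```typescript
interface SelectedCell { left: number; value: number; }

// Group cells by their column coordinate (distance to the interface's
// left boundary), then merge each group by summation, yielding one
// merged value per column of the frame selection area.
function mergeByColumn(cells: SelectedCell[]): Map<number, number> {
  const sums = new Map<number, number>();
  for (const c of cells) {
    sums.set(c.left, (sums.get(c.left) ?? 0) + c.value);
  }
  return sums;
}
```

Grouping by row coordinate instead of `left` gives the same-row variant described in the text.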
In a specific embodiment, the terminal computes the distance from each cell in the frame selection area to the boundary of the current interface: the distance to the upper boundary is the row coordinate, and the distance to the left boundary is the column coordinate. The position coordinates of a cell are thus (top, left), where top is the row coordinate and left is the column coordinate, and the cells in the frame selection area can be grouped by equal column coordinates or by equal row coordinates;
after grouping, at least one class of cells is obtained, which may be at least two columns of cells with different column coordinates or at least two rows of cells with different row coordinates in the frame selection area, and the data of the same-class cells in each class is collected;
the terminal then checks whether the data in each class of cells is of a preset data type. For example, if the data in a class of cells is numerical, a mathematical operation such as summation, subtraction, or multiplication can be applied to it; if the data is text, the strings in that class can be concatenated. The terminal may use a regular expression for the type check, for example /^\d+$/, where the two slashes delimit the regular expression and its content lies between them:
the symbol of "" represents the matching start position;
"$" indicates the matching end position;
"\ d" indicates a matching number;
the "+" sign defines the number of previously matching digits to be more than 1.
The terminal thus obtains the merged data for each class and displays it in the merged data display area. When the merged data corresponds to a column of the frame selection area, the merged data display area may be shown as a floating window just below the frame selection area, with each merged value displayed in the same column as its source frame selection data.
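The numeric check via /^\d+$/ and the type-dependent merge (a mathematical operation for numbers, concatenation for text) can be sketched together; the function name and the choice of summation are illustrative assumptions:

```typescript
// Matches strings consisting entirely of digits, as in the text:
// "^" anchors the start, "\d+" requires one or more digits, "$" anchors the end.
const NUMERIC = /^\d+$/;

// Merge cell contents by detected type: sum purely numeric values,
// otherwise concatenate them as text.
function mergeByType(values: string[]): string {
  if (values.every((v) => NUMERIC.test(v))) {
    return String(values.reduce((acc, v) => acc + Number(v), 0));
  }
  return values.join("");
}
```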
In one embodiment, as shown in fig. 4, a schematic diagram of displaying merged data is provided:
the user frame-selects two columns of cells in the current interface; the terminal determines the frame selection area from the operation, adds up the data of the cells in each column to obtain a sum, and then displays the two sums in a floating frame below the cells sharing the corresponding column coordinate. In this embodiment, the cells in the frame selection area are classified by their position coordinates into same-column or same-row groups, the data in each group is computed, and the resulting merged data is displayed in correspondence with its group. This meets the user's need to merge the cell data of frame-selected table data in the current interface, makes subsequent processing of the selected data convenient, and improves data processing efficiency.
In one embodiment, as shown in FIG. 5, a flow diagram for generating sub-merged data is provided; after the frame selection area framed by the frame selection operation is displayed in the current interface, the method further includes the following steps:
step 502, in response to the subframe selection operation on the frame selection area, displaying a subframe selection area selected by the subframe selection operation in the frame selection area, wherein the subframe selection area comprises subframe selection data in the frame selection data;
step 504, performing data merging on the sub-frame selection data to obtain sub-merged data;
step 506, displaying a sub-merged data display area corresponding to the sub-frame selection area, and displaying the sub-merged data in the sub-merged data display area.
The sub-frame selection area refers to an area framed within an existing frame selection area; a sub-frame selection area may itself be framed within an existing sub-frame selection area. The current interface may include a plurality of sub-frame selection areas. The sub-frame selection data refers to the frame selection data located in the sub-frame selection area, and the sub-merged data refers to the merged data corresponding to that data.
Specifically, the terminal may display an overlap-region frame selection option on the current interface, through which the user selects the sub-frame selection area. When the terminal detects that the user clicks the overlap-region frame selection option, it switches the current interface into an overlap-region frame selection mode, that is, a mode for framing a sub-frame selection area within an existing frame selection area.
When the terminal responds to a sub-frame selection framing event in an existing frame selection area, the position coordinates of the sub-frame selection area are (top, left, width, height): top is the row coordinate, obtained from the distance between the sub-frame selection area and the upper edge of the current interface; left is the column coordinate, obtained from the distance to the left edge of the current interface; width and height are the width and height of the sub-frame selection area. The position and size of the sub-frame selection area on the current interface can be obtained from these coordinates.
The terminal traverses the horizontal coordinates and then the vertical coordinates within the frame selection area to obtain the coordinate points in the sub-frame selection area, and determines the cell to which each coordinate point belongs. For example, for a coordinate point (x, y) in the sub-frame selection area, the terminal may obtain the html (HyperText Markup Language) element at that point using document.elementFromPoint. When the terminal detects that cells in the sub-frame selection area share the same column coordinate or row coordinate, that is, their target distances are consistent, it obtains the sub-frame selection data in those cells; for example, the terminal can read the data in a cell through its textContent attribute, which returns the text content of an html element. The terminal then judges the data type of the sub-frame selection data with a regular expression. When the sub-frame selection data in the cells is numerical data, the terminal merges it to obtain the corresponding merged data, displays a sub-merged data display area corresponding to the sub-frame selection area on the current interface, and displays the sub-merged data in that area. The terminal can simultaneously display the merged data of the existing frame selection area and that of the sub-frame selection area.
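As a pure-logic sketch of this walk: in a browser each sampled point would be resolved with `document.elementFromPoint(x, y)` and the cell text read via `textContent`; here `cellAt` is a hypothetical stand-in that looks up cells by their rectangles, so the logic is self-contained and testable:

```javascript
// Samples points across the sub-region at a fixed step, resolves each point
// to the cell whose rect contains it (stand-in for document.elementFromPoint),
// deduplicates cells, and sums the ones holding numeric text.
function mergeRegion(region, cells, step = 10) {
  const cellAt = (x, y) => cells.find(c =>
    x >= c.left && x < c.left + c.width &&
    y >= c.top  && y < c.top  + c.height);
  const seen = new Set();
  let sum = 0;
  for (let x = region.left; x < region.left + region.width; x += step) {
    for (let y = region.top; y < region.top + region.height; y += step) {
      const cell = cellAt(x, y);
      if (!cell || seen.has(cell)) continue;       // unknown or already counted
      seen.add(cell);
      if (/^\d+$/.test(cell.text)) sum += Number(cell.text); // numeric check
    }
  }
  return sum;
}

const grid = [
  { top: 0, left: 0,  width: 50, height: 20, text: "5" },
  { top: 0, left: 50, width: 50, height: 20, text: "7" },
];
console.log(mergeRegion({ top: 0, left: 0, width: 100, height: 20 }, grid)); // 12
```

The step size trades accuracy for speed; it must be no larger than the smallest cell dimension, or cells can be skipped.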
The terminal can also respond to a user's frame selection of a plurality of non-overlapping frame selection areas in the current interface (for example, using the Shift key and the mouse on a desktop terminal), and merge the frame selection data in each non-overlapping area separately to obtain its merged data; it then displays a merged data display area for each non-overlapping frame selection area on the current page and shows the corresponding merged data in each.
The terminal can also display a polygon-area framing option on the current interface. When the terminal detects a mouse event on this option, it switches the current interface into a polygon-area framing mode, in which the user can frame an arbitrary polygonal area on the current interface, for example polygon-framing a map in the current interface that contains provincial population data. When the terminal detects a polygon framing event, it obtains the data in the polygonal area, performs the merging calculation, and displays the resulting merged data in the merged data display area corresponding to the polygonal area.
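One common way to decide which data points fall inside the framed polygon is a ray-casting point-in-polygon test; the patent does not specify the test, so the following is a sketch under that assumption:

```javascript
// Ray-casting test: cast a horizontal ray from (x, y) and count how many
// polygon edges it crosses; an odd count means the point is inside.
function pointInPolygon([x, y], polygon) {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const [xi, yi] = polygon[i], [xj, yj] = polygon[j];
    if ((yi > y) !== (yj > y) &&
        x < ((xj - xi) * (y - yi)) / (yj - yi) + xi) {
      inside = !inside;
    }
  }
  return inside;
}

const square = [[0, 0], [10, 0], [10, 10], [0, 10]];
console.log(pointInPolygon([5, 5], square));   // true
console.log(pointInPolygon([15, 5], square));  // false
```

Data points that pass the test would then be collected and fed into the same merge calculation as rectangular frame selection.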
In this embodiment, merging the sub-frame selection data of an existing frame selection area and merging the data of frame selection areas in special scenarios can meet the user's data calculation needs in different scenarios, improve the user experience, facilitate subsequent processing of the data in the frame selection area, and improve data processing efficiency.
In one embodiment, the data merging the frame selection data to obtain merged data includes:
dividing the framing data into at least one type of identification data according to the data type identification corresponding to the framing data, and carrying out data merging on the same type of identification data in the at least one type of identification data to obtain merged data corresponding to the same type of identification data;
displaying a merged data display area corresponding to the frame selection area in the current interface and displaying the merged data in the merged data display area includes:
and correspondingly displaying the merged data corresponding to the same type of identification data and the data type identification corresponding to the same type of identification data in the merged data display area.
The identification data refers to frame selection data divided according to its data type identification. Same-type identification data refers to the frame selection data corresponding to the same data type identification.
Specifically, when the terminal detects that the frame selection area contains frame selection data together with corresponding data type identifications, it starts a data classification function: it classifies the frame selection data by data type identification into at least one type of identification data and merges the same-type identification data within it to obtain the corresponding merged data. The terminal then displays a merged data display area corresponding to the frame selection area on the current interface and displays, in that area, the merged data for each type of identification data alongside its data type identification.
In a specific embodiment, the frame selection area contains detection result data whose content includes error identifications with corresponding error numbers and warning identifications with corresponding warning numbers. The terminal classifies the detection result data in the frame selection area by the error and warning identifications to obtain error classification data and warning classification data, sums each class to obtain an error count and a warning count, and sums these two counts to obtain an abnormal sum result; it then displays the error count, the warning count, and the abnormal sum result together in the merged data display area corresponding to the frame selection area. For example, if the frame selection area contains "error: 1", "error: 2", "warning: 3" and "warning: 2", then the error count is 3, the warning count is 5, and the abnormal sum result is 8; the merged data display area shows the corresponding merged data "error count total 3" and "warning count total 5" together with the overall abnormal merged data, that is, "abnormality count total 8".
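The classification and summation in this example can be sketched as follows, assuming each framed entry is a string of the form "label: count" as in the example above (the function name is illustrative):

```javascript
// Groups framed entries by their data type identification (the label before
// the colon), sums each class, and also computes the overall abnormal total.
function classifyAndSum(entries) {
  const totals = {};
  for (const entry of entries) {
    const m = entry.match(/^(\w+):\s*(\d+)$/);   // "label: count"
    if (!m) continue;                             // skip malformed entries
    totals[m[1]] = (totals[m[1]] || 0) + Number(m[2]);
  }
  const total = Object.values(totals).reduce((a, b) => a + b, 0);
  return { ...totals, total };
}

console.log(classifyAndSum(["error: 1", "error: 2", "warning: 3", "warning: 2"]));
// { error: 3, warning: 5, total: 8 }
```

The returned per-class totals correspond to "error count total 3" and "warning count total 5", and `total` to "abnormality count total 8".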
In this embodiment, the data in the user's frame selection area is classified automatically, which meets the user's classification needs, improves the user experience, facilitates subsequent processing of the data in the frame selection area, and improves data processing efficiency.
In one embodiment, after responding to a frame selection operation on the current interface and displaying the frame selection area framed by that operation in the current interface, where the current interface displays interface data and the frame selection area includes frame selection data in the interface data, the method further includes:
correspondingly displaying at least one preset merging strategy based on the frame selection area in the current interface;
performing data merging on the frame selection data to obtain merged data includes:
in response to a selection operation on the at least one preset merging strategy, determining the selected preset merging strategy as the target merging strategy;
and carrying out data merging on the frame selection data according to the target merging strategy to obtain merged data.
The preset merging strategies refer to calculation mode options for multiple calculation modes displayed on the current interface, including numerical calculation options such as summation, subtraction and multiplication, as well as non-numerical options such as text splicing and image fusion.
Specifically, the terminal may display the preset merging strategies on the current interface either before or after the user performs the frame selection operation; the terminal may also add user-defined merging strategies to the preset ones. For example, when the terminal detects a frame selection operation on the current interface, it displays a plurality of calculation mode options; the terminal then responds to the user's mouse event on a target calculation mode option and calculates the frame selection data according to the corresponding calculation mode to obtain the merged data for the frame selection data.
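The selectable merging strategies can be sketched as a map from option name to merge function; the option names and the `strategies` object below are illustrative assumptions, not from the source:

```javascript
// Each preset merging strategy is a function over the framed values;
// selecting an option picks the target strategy to apply.
const strategies = {
  sum:     values => values.reduce((a, b) => a + b, 0),
  product: values => values.reduce((a, b) => a * b, 1),
  concat:  values => values.join(""),          // text splicing
};

function mergeWithStrategy(name, values) {
  const strategy = strategies[name];
  if (!strategy) throw new Error(`unknown merging strategy: ${name}`);
  return strategy(values);
}

console.log(mergeWithStrategy("sum", [1, 2, 3]));     // 6
console.log(mergeWithStrategy("product", [2, 3, 4])); // 24
console.log(mergeWithStrategy("concat", ["a", "b"])); // ab
```

A user-defined strategy, as the text allows, would simply be another entry added to the `strategies` map.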
In this embodiment, displaying the preset merging strategies on the current interface can meet the user's need for different calculation modes over the frame selection data, improve the user experience, facilitate subsequent processing of the data in the frame selection area, and improve data processing efficiency.
In one embodiment, displaying the merged data in the merged data display area includes:
when the start of a data display operation is detected in the frame selection area, displaying the merged data in the merged data display area;
the method further comprises the following steps:
and hiding the merged data display area and the merged data when the end of the data display operation is detected in the frame selection area.
Specifically, when the terminal detects that the mouse cursor moves into the frame selection area from outside it, the data display operation starts: with the cursor inside the frame selection area, the terminal displays the merged data display area as a floating window on the current interface and shows the merged data in it. When the terminal detects that the cursor moves out of the frame selection area, the data display operation ends: with the cursor outside the area, the terminal hides the merged data display area and the merged data and no longer displays them on the current interface. The terminal can also show or hide the merged data according to the user's touch-screen signals; for example, when the user taps the frame selection area a first time, the terminal displays the merged data display area and the merged data, and when the user taps it again, the terminal hides them. Alternatively, the terminal keeps displaying the merged data display area and the merged data while the user long-presses the frame selection area, and hides them when the user releases the long press.
In this embodiment, showing or hiding the merged data display area and the merged data according to whether the mouse cursor is inside or outside the frame selection area allows the user to view interface data otherwise blocked by the merged data display area, makes the display of merged data flexible, and improves the user experience.
In one embodiment, the frame selection area includes an image area; the method further comprises the following steps:
performing image identification based on the image area to obtain image display data;
carrying out data merging on the image display data to obtain image merging data corresponding to the image area;
and displaying an image combination data display area in the current interface, and displaying image combination data corresponding to the image area in the image combination data display area.
The image area refers to an area displaying an image in the current interface; the image includes, but is not limited to, a table image. The image display data refers to the data information in the image, which may be table information. The image merged data is obtained by merging the data information in the image area.
Specifically, the terminal detects that the frame selection area in the current interface includes an image area, which may cover the frame selection area fully or partially. The terminal identifies the image content of the image area, for example using an image recognition algorithm, to obtain the image table information in the image area;
the terminal then obtains the position coordinates of the cells of the image table in the image area and, when the target distances of cells in the image table are consistent, determines that those cells are same-type cells. It obtains the display data in those cells and, when that display data is numerical, merges it; the merged data is displayed in an area outside the image area, in correspondence with its cells in the image area.
In this embodiment, framing an image in the current interface yields the data calculation result information of the image area, which meets the user's need to calculate image data, facilitates subsequent processing of the image data, and improves data processing efficiency.
In an embodiment, as shown in fig. 6, a flow chart of a data processing method is provided, which includes:
the user views a table in the current interface and frames the data to be summed in the table. When the terminal detects a framing event on the table, it obtains the data in the corresponding frame selection area. It then judges whether that data contains summable numerical data; if not, the flow ends directly without processing. If summable numerical data exists, the terminal sums the data in the frame selection area to obtain a summation result and displays it below the frame selection area in a floating box.
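The flow of Fig. 6 can be sketched end to end: keep only the summable numeric entries and either return the total for display or end with no processing (function and variable names are illustrative):

```javascript
// Implements the Fig. 6 decision: filter framed texts down to summable
// numeric entries; return their sum for display, or null when there is
// nothing summable and the flow ends directly.
function processFramedData(texts) {
  const numbers = texts
    .map(t => t.trim())
    .filter(t => /^\d+$/.test(t))   // the summable-number check
    .map(Number);
  if (numbers.length === 0) return null;  // no summable data: end directly
  return numbers.reduce((a, b) => a + b, 0);
}

console.log(processFramedData(["12", "7", "name"])); // 19
console.log(processFramedData(["abc", ""]));         // null
```

In the described UI, a non-null result would be rendered in the floating box below the frame selection area.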
It should be understood that, although the steps in the flowcharts of the embodiments above are displayed in the sequence indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated otherwise, the steps are not bound to a strict order and may be performed in other orders. Moreover, at least some of the steps in these flowcharts may include multiple sub-steps or stages, which need not be performed at the same time but may be performed at different times, and whose execution order need not be sequential; they may be performed in turn or alternately with other steps or with sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application further provides a data processing apparatus for implementing the above data processing method. The implementation scheme provided by the apparatus is similar to that described for the method above, so for the specific limitations in one or more embodiments of the data processing apparatus below, reference may be made to the limitations on the data processing method above, which are not repeated here.
In one embodiment, as shown in fig. 7, there is provided a data processing apparatus 700 comprising: a frame selection module 702, a merge module 704, and a display module 706, wherein:
the frame selection module is used for responding to the frame selection operation of the current interface and displaying a frame selection area framed by the frame selection operation in the current interface, the current interface display comprises interface data, and the frame selection area comprises frame selection data in the interface data;
the merging module is used for merging the frame selection data to obtain merged data;
and the display module is used for displaying the merged data display area corresponding to the frame selection area in the current interface and displaying the merged data in the merged data display area.
In one embodiment, the merging module 704 further includes:
the frame selection data merging unit is used for acquiring coordinate points in the frame selection area and determining the cells in the frame selection area based on the coordinate points in the frame selection area; and acquiring the framing data in the cells in the framing area, and performing data merging on the framing data in the cells in the framing area to obtain merged data corresponding to the framing area.
In one embodiment, the merging module 704 further includes:
the classification data merging unit is used for acquiring position information of the frame selection data in the frame selection area, dividing the frame selection data into at least one type of data according to the position information, and merging the same-type data in the at least one type of data to obtain merged data corresponding to the same-type data; displaying a merged data display area corresponding to the frame selection area in the current interface and displaying the merged data in the merged data display area includes: correspondingly displaying the merged data corresponding to the same-type data and the same-type data in the merged data display area.
In one embodiment, the data processing apparatus 700 further comprises:
a subframe selection data merging unit configured to display, in response to a subframe selection operation on the frame selection region, a subframe selection region framed by the subframe selection operation in the frame selection region, the subframe selection region including subframe selection data in the frame selection data; carrying out data merging on the sub-frame selection data to obtain sub-merging data; and displaying a sub-merged data display region corresponding to the sub-frame selection region, and displaying the sub-merged data in the sub-merged data display region.
In one embodiment, the merging module 704 further includes:
the data dividing unit is used for dividing the frame selection data into at least one type of identification data according to the data type identification corresponding to the frame selection data, and merging the same-type identification data in the at least one type of identification data to obtain merged data corresponding to the same-type identification data; displaying a merged data display area corresponding to the frame selection area in the current interface and displaying the merged data in the merged data display area includes: correspondingly displaying the merged data corresponding to the same-type identification data and its data type identification in the merged data display area.
In one embodiment, the data processing apparatus 700 further comprises:
the merging strategy unit is used for correspondingly displaying at least one preset merging strategy based on the frame selection area in the current interface; performing data merging on the frame selection data to obtain merged data includes: in response to a selection operation on the at least one preset merging strategy, determining the selected preset merging strategy as the target merging strategy; and performing data merging on the frame selection data according to the target merging strategy to obtain the merged data.
The various modules in the data processing apparatus described above may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in hardware form in, or independent of, a processor in the computer device, or stored in software form in a memory of the computer device, so that the processor can call and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, which may be a terminal, and whose internal structure diagram may be as shown in fig. 8. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input device. The processor, the memory and the input/output interface are connected by a system bus; the communication interface, the display unit and the input device are connected to the system bus through the input/output interface. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The input/output interface of the computer device is used for exchanging information between the processor and external devices. The communication interface of the computer device is used for wired or wireless communication with external terminals; the wireless communication can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program, when executed by the processor, implements a data processing method. The display unit of the computer device forms a visually perceptible picture and may be a display screen, a projection device or a virtual reality imaging device; the display screen may be a liquid crystal display screen or an electronic ink display screen. The input device of the computer device may be a touch layer covering the display screen, a key, a trackball or a touch pad arranged on the housing of the computer device, or an external keyboard, touch pad or mouse.
Those skilled in the art will appreciate that the architecture shown in fig. 8 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
responding to the frame selection operation of the current interface, and displaying a frame selection area selected by the frame selection operation in the current interface, wherein the current interface comprises interface data, and the frame selection area comprises frame selection data in the interface data; carrying out data merging on the frame selection data to obtain merged data; and displaying a merged data display area corresponding to the frame selection area in the current interface, and displaying the merged data in the merged data display area.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
performing data merging on the frame selection data to obtain merged data includes: obtaining coordinate points in the frame selection area and determining cells in the frame selection area based on those coordinate points; and acquiring the frame selection data in the cells in the frame selection area and merging it to obtain the merged data corresponding to the frame selection area.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
performing data merging on the frame selection data to obtain merged data includes: acquiring position information of the frame selection data in the frame selection area, dividing the frame selection data into at least one type of data according to the position information, and merging the same-type data in the at least one type of data to obtain merged data corresponding to the same-type data; displaying a merged data display area corresponding to the frame selection area in the current interface and displaying the merged data in the merged data display area includes: correspondingly displaying the merged data corresponding to the same-type data and the same-type data in the merged data display area.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
after the selection area selected by the selection operation is displayed in the current interface, the method further comprises the following steps: in response to a subframe selection operation on the frame selection region, displaying a subframe selection region framed by the subframe selection operation in the frame selection region, the subframe selection region including subframe selection data in the frame selection data; carrying out data merging on the sub-frame selection data to obtain sub-merging data; and displaying a sub-merged data display region corresponding to the sub-frame selection region, and displaying the sub-merged data in the sub-merged data display region.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
performing data merging on the frame selection data to obtain merged data includes: dividing the frame selection data into at least one type of identification data according to the data type identification corresponding to the frame selection data, and merging the same-type identification data in the at least one type of identification data to obtain merged data corresponding to the same-type identification data; displaying a merged data display area corresponding to the frame selection area in the current interface and displaying the merged data in the merged data display area includes: correspondingly displaying the merged data corresponding to the same-type identification data and its data type identification in the merged data display area.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
after responding to the frame selection operation on the current interface and displaying the frame selection area framed by that operation in the current interface, where the current interface displays interface data and the frame selection area includes frame selection data in the interface data, the method further includes: correspondingly displaying at least one preset merging strategy based on the frame selection area in the current interface; performing data merging on the frame selection data to obtain merged data includes: in response to a selection operation on the at least one preset merging strategy, determining the selected preset merging strategy as the target merging strategy; and performing data merging on the frame selection data according to the target merging strategy to obtain the merged data.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
displaying the merged data in the merged data display area includes: when the start of a data display operation is detected in the frame selection area, displaying the merged data in the merged data display area; the method further includes: hiding the merged data display area and the merged data when the end of the data display operation is detected in the frame selection area.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
the frame selection area comprises an image area; the method further comprises the following steps: performing image identification based on the image area to obtain image display data; carrying out data merging on the image display data to obtain image merging data corresponding to the image area; and displaying an image merging data display area corresponding to the image area in the current interface, and displaying the image merging data in the image merging data display area.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
in response to a frame selection operation on the current interface, displaying, in the current interface, the frame selection area framed by the frame selection operation, wherein the current interface comprises interface data and the frame selection area comprises frame selection data in the interface data; performing data merging on the frame selection data to obtain merged data; and displaying, in the current interface, a merged data display area corresponding to the frame selection area, and displaying the merged data in the merged data display area.
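The three claimed steps (frame selection, data merging, display) can be sketched end to end as follows. The interface is modeled as a dict of cell position to value, and summation is assumed as the merge rule; both are illustrative assumptions, since the claim specifies neither.

```python
# End-to-end sketch of the claimed flow: box part of the interface data,
# merge the boxed values, and produce the result shown in the display area.

interface_data = {(0, 0): 5, (0, 1): 7, (1, 0): 3, (1, 1): 10}

def frame_select(data, top_left, bottom_right):
    """Return the frame selection data: interface data inside the boxed area."""
    (r0, c0), (r1, c1) = top_left, bottom_right
    return {pos: v for pos, v in data.items()
            if r0 <= pos[0] <= r1 and c0 <= pos[1] <= c1}

def merge(frame_selection_data):
    """Merge the boxed values (summation assumed as the merge rule)."""
    return sum(frame_selection_data.values())

selection = frame_select(interface_data, (0, 0), (0, 1))  # box the first row
merged = merge(selection)                                 # shown in display area
```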
In one embodiment, the computer program, when executed by the processor, further performs the steps of:
the performing data merging on the frame selection data to obtain merged data comprises: obtaining coordinate points of the frame selection area, and determining cells within the frame selection area based on the coordinate points; and acquiring the frame selection data in the cells within the frame selection area, and performing data merging on the frame selection data in those cells to obtain merged data corresponding to the frame selection area.
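The coordinate-to-cell mapping described above can be sketched as follows. A fixed cell width and height are assumptions made for the example; the application only says that cells are determined from the coordinate points of the boxed area.

```python
# Map the pixel rectangle drawn by the user to the set of table cells it
# covers, then merge the data found in those cells (summation assumed).

CELL_W, CELL_H = 100, 30  # assumed pixel size of one cell

def cells_in_frame(x0, y0, x1, y1):
    """Return the (row, col) cells intersected by the boxed pixel rectangle."""
    col0, col1 = x0 // CELL_W, (x1 - 1) // CELL_W
    row0, row1 = y0 // CELL_H, (y1 - 1) // CELL_H
    return [(r, c) for r in range(row0, row1 + 1)
                   for c in range(col0, col1 + 1)]

def merge_cells(table, cells):
    """Merge the data in the covered cells, skipping empty positions."""
    return sum(table[c] for c in cells if c in table)

table = {(0, 0): 1, (0, 1): 2, (1, 0): 3, (1, 1): 4}
covered = cells_in_frame(0, 0, 200, 30)  # one row, two columns of pixels
merged = merge_cells(table, covered)
```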
In one embodiment, the computer program, when executed by the processor, further performs the steps of:
acquiring position information of the frame selection data in the frame selection area, dividing the frame selection data into at least one type of data according to the position information, and performing data merging on each same type of data to obtain merged data corresponding to that type of data; and the displaying, in the current interface, a merged data display area corresponding to the frame selection area and displaying the merged data in the merged data display area comprises: correspondingly displaying, in the merged data display area, each type of data and the merged data corresponding to that type of data.
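A sketch of this position-based grouping. Grouping by column is an assumption made for the example (position information could equally well group by row); each "same type" class then gets its own merged value, so the display area can show each class beside its result.

```python
from collections import defaultdict

# Boxed values are grouped into "same type" classes by their position
# (here: same column -> same type, an assumption), and each class is
# merged separately.

def merge_by_position(frame_selection_data):
    """frame_selection_data: {(row, col): value}. Returns {col: merged value}."""
    groups = defaultdict(list)
    for (row, col), value in frame_selection_data.items():
        groups[col].append(value)          # same column -> same type of data
    return {col: sum(vals) for col, vals in groups.items()}

selection = {(0, 0): 1, (1, 0): 2, (0, 1): 10, (1, 1): 20}
per_column = merge_by_position(selection)  # shown beside each column
```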
In one embodiment, the computer program, when executed by the processor, further performs the steps of:
after the frame selection area framed by the frame selection operation is displayed in the current interface, the method further comprises: in response to a sub-frame selection operation on the frame selection area, displaying, in the frame selection area, a sub-frame selection area framed by the sub-frame selection operation, the sub-frame selection area including sub-frame selection data in the frame selection data; performing data merging on the sub-frame selection data to obtain sub-merged data; and displaying a sub-merged data display area corresponding to the sub-frame selection area, and displaying the sub-merged data in the sub-merged data display area.
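The nested-selection embodiment can be sketched with the same boxing operation applied twice: the inner box selects a subset of the already-boxed data and gets its own merged value. The `{(row, col): value}` data model is an assumption carried over from the earlier sketches.

```python
# A second box drawn inside an existing frame selection yields sub-frame
# selection data (a subset of the boxed data), which gets its own merged
# value and its own display area.

def box(data, top_left, bottom_right):
    """Return the portion of `data` inside the boxed area."""
    (r0, c0), (r1, c1) = top_left, bottom_right
    return {p: v for p, v in data.items()
            if r0 <= p[0] <= r1 and c0 <= p[1] <= c1}

interface_data = {(r, c): r * 10 + c for r in range(3) for c in range(3)}
frame_selection = box(interface_data, (0, 0), (2, 2))  # outer box: everything
sub_selection = box(frame_selection, (1, 1), (2, 2))   # inner (sub-frame) box
sub_merged = sum(sub_selection.values())               # sub-merged data
```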
In one embodiment, the computer program, when executed by the processor, further performs the steps of:
the performing data merging on the frame selection data to obtain merged data comprises: dividing the frame selection data into at least one type of identified data according to the data type identifier corresponding to the frame selection data, and performing data merging on each same type of identified data to obtain merged data corresponding to that type of identified data; and the displaying, in the current interface, a merged data display area corresponding to the frame selection area and displaying the merged data in the merged data display area comprises: correspondingly displaying, in the merged data display area, the merged data corresponding to each type of identified data and the data type identifier corresponding to that type of identified data.
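A sketch of the type-identifier grouping. Representing the data type identifier as a string label attached to each value is an assumption for the example; the point is that values sharing an identifier are merged together and each merged value is displayed next to its identifier.

```python
from collections import defaultdict

# Each boxed value carries a data type identifier; values sharing an
# identifier are merged together, giving one merged value per identifier.

def merge_by_type(frame_selection_data):
    """frame_selection_data: list of (type_identifier, value) pairs."""
    groups = defaultdict(list)
    for type_id, value in frame_selection_data:
        groups[type_id].append(value)
    # one merged value per data type identifier
    return {type_id: sum(vals) for type_id, vals in groups.items()}

selection = [("amount", 100), ("count", 3), ("amount", 50), ("count", 2)]
merged_by_type = merge_by_type(selection)
```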
In one embodiment, the computer program, when executed by the processor, further performs the steps of:
after displaying, in response to the frame selection operation on the current interface, the frame selection area framed by the frame selection operation in the current interface, wherein the current interface displays interface data and the frame selection area includes frame selection data in the interface data, the method further comprises: correspondingly displaying, in the current interface, at least one preset merging strategy based on the frame selection area; and the performing data merging on the frame selection data to obtain merged data comprises: in response to a selection operation on the at least one preset merging strategy, determining the selected preset merging strategy as a target merging strategy; and performing data merging on the frame selection data according to the target merging strategy to obtain the merged data.
In one embodiment, the computer program, when executed by the processor, further performs the steps of:
the displaying the merged data in the merged data display area comprises: when the start of a data display operation is detected in the frame selection area, displaying the merged data in the merged data display area; and the method further comprises: when the end of the data display operation is detected in the frame selection area, hiding the merged data display area and the merged data.
In one embodiment, the computer program, when executed by the processor, further performs the steps of:
the frame selection area comprises an image area, and the method further comprises: performing image recognition on the image area to obtain image display data; performing data merging on the image display data to obtain image merged data corresponding to the image area; and displaying, in the current interface, an image merged data display area corresponding to the image area, and displaying the image merged data in that display area.
In one embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the above method embodiments.
It should be noted that the user information (including but not limited to user device information and user personal information) and data (including but not limited to data for analysis, stored data, and displayed data) referred to in the present application are information and data authorized by the user or fully authorized by the relevant parties, and the collection, use, and processing of such data must comply with the applicable laws, regulations, and standards of the relevant countries and regions.
Those skilled in the art will understand that all or part of the processes of the methods in the above embodiments can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the above method embodiments. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase-change memory (PCM), graphene memory, and the like. Volatile memory may include random access memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, and the like, without limitation.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and although their description is relatively specific and detailed, it should not be construed as limiting the scope of the application. It should be noted that those skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the scope of protection of the present application. Therefore, the scope of protection of the present application shall be subject to the appended claims.
Claims (10)
1. A method of data processing, the method comprising:
in response to a frame selection operation on a current interface, displaying, in the current interface, a frame selection area framed by the frame selection operation, wherein the current interface comprises interface data, and the frame selection area comprises frame selection data in the interface data;
performing data merging on the frame selection data to obtain merged data;
and displaying, in the current interface, a merged data display area corresponding to the frame selection area, and displaying the merged data in the merged data display area.
2. The method of claim 1, wherein the performing data merging on the frame selection data to obtain the merged data comprises:
obtaining coordinate points of the frame selection area, and determining cells within the frame selection area based on the coordinate points;
and acquiring the frame selection data in the cells within the frame selection area, and performing data merging on the frame selection data in those cells to obtain the merged data corresponding to the frame selection area.
3. The method of claim 1, wherein the performing data merging on the frame selection data to obtain the merged data comprises:
acquiring position information of the frame selection data in the frame selection area, dividing the frame selection data into at least one type of data according to the position information, and performing data merging on each same type of data to obtain merged data corresponding to that type of data;
and wherein the displaying the merged data in the merged data display area comprises:
correspondingly displaying, in the merged data display area, each type of data and the merged data corresponding to that type of data.
4. The method of claim 1, wherein after the frame selection area framed by the frame selection operation is displayed in the current interface, the method further comprises:
in response to a sub-frame selection operation on the frame selection area, displaying, in the frame selection area, a sub-frame selection area framed by the sub-frame selection operation, the sub-frame selection area including sub-frame selection data in the frame selection data;
performing data merging on the sub-frame selection data to obtain sub-merged data;
and displaying a sub-merged data display area corresponding to the sub-frame selection area, and displaying the sub-merged data in the sub-merged data display area.
5. The method of claim 1, wherein the performing data merging on the frame selection data to obtain the merged data comprises:
dividing the frame selection data into at least one type of identified data according to the data type identifier corresponding to the frame selection data, and performing data merging on each same type of identified data to obtain merged data corresponding to that type of identified data;
and wherein the displaying the merged data in the merged data display area comprises:
correspondingly displaying, in the merged data display area, the merged data corresponding to each type of identified data and the data type identifier corresponding to that type of identified data.
6. The method of claim 1, wherein after displaying, in the current interface, the frame selection area framed by the frame selection operation in response to the frame selection operation on the current interface, the method further comprises:
correspondingly displaying, in the current interface, at least one preset merging strategy based on the frame selection area;
and wherein the performing data merging on the frame selection data to obtain the merged data comprises:
in response to a selection operation on the at least one preset merging strategy, determining the selected preset merging strategy as a target merging strategy;
and performing data merging on the frame selection data according to the target merging strategy to obtain the merged data.
7. The method of claim 1, wherein the displaying the merged data in the merged data display area comprises:
when the start of a data display operation is detected in the frame selection area, displaying the merged data in the merged data display area;
and wherein the method further comprises:
when the end of the data display operation is detected in the frame selection area, hiding the merged data display area and the merged data.
8. A data processing apparatus, characterized in that the apparatus comprises:
the interface display device comprises a frame selection module, a frame selection module and a display module, wherein the frame selection module is used for responding to the frame selection operation of a current interface and displaying a frame selection area framed by the frame selection operation in the current interface, the current interface display comprises interface data, and the frame selection area comprises frame selection data in the interface data;
the merging module is used for merging the frame selection data to obtain merged data;
and the display module is used for displaying a merged data display area corresponding to the frame selection area in the current interface and displaying the merged data in the merged data display area.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210169997.8A CN114547508A (en) | 2022-02-23 | 2022-02-23 | Data processing method, data processing device, computer equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114547508A (en) | 2022-05-27
Family
ID=81677463
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210169997.8A Pending CN114547508A (en) | 2022-02-23 | 2022-02-23 | Data processing method, data processing device, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114547508A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||