CN108427589B - Data processing method and electronic equipment - Google Patents

Data processing method and electronic equipment

Info

Publication number
CN108427589B
Authority
CN
China
Prior art keywords
display area
display
editing
electronic device
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810103789.1A
Other languages
Chinese (zh)
Other versions
CN108427589A (en)
Inventor
王力军
虞新立
刘悦
郑轶民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201810103789.1A
Publication of CN108427589A
Application granted
Publication of CN108427589B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 - Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Abstract

The invention discloses a data processing method and an electronic device. The method includes: generating a second display area according to a selection operation on a first display area, wherein the second display area partially overlaps the first display area; and performing an editing operation in the second display area, and synchronously displaying the editing effect in the first display area and the second display area. The technical solution of the invention improves the interaction efficiency between interfaces and thereby improves the user experience.

Description

Data processing method and electronic equipment
Technical Field
The present invention relates to information processing technologies of electronic devices, and in particular, to a data processing method and an electronic device.
Background
With the development of technology, the demands placed on electronic devices keep growing, and interaction among multiple interfaces of an electronic device has become one of the hot topics. In the prior art, the process corresponding to one interface must be finished before the process corresponding to another interface is executed, which seriously limits the interaction efficiency between interfaces.
Disclosure of Invention
In view of this, the present invention aims to provide a data processing method and an electronic device that can improve the interaction efficiency between interfaces and thereby improve the user experience.
In order to achieve the above purpose, the technical solution of the present invention is implemented as follows:
In a first aspect, an embodiment of the present invention provides a data processing method, applied to an electronic device, where the method includes:
generating a second display area according to a selection operation on the first display area, wherein the second display area partially overlaps the first display area;
performing an editing operation in the second display area, and synchronously displaying the editing effect in the first display area and the second display area.
In the above aspect, optionally, the method further includes:
dividing a third display area in the first display area, the third display area being a portion of the first display area;
the step of synchronously displaying editing effects in the first display area and the second display area comprises the following steps:
displaying the editing effect in the third display area, and retaining the original image in the first display area.
In the above solution, optionally, the first display area and the second display area correspond to different applications respectively.
In the above solution, optionally, the first display area is located on a first display screen of the electronic device, and the second display area is located on a second display screen of the electronic device.
In the above solution, optionally, the editing operation is performed in the second display area, and the editing effect is synchronously displayed in the first display area and the second display area, including:
transmitting operation data corresponding to the editing operation to a processor;
the processor processes the first data based on the operation data to generate second data; wherein the first data is used to characterize a target object;
and controlling the first display area and the second display area to output editing effects in the corresponding dimension representation forms based on the second data.
In a second aspect, an embodiment of the present invention provides an electronic device, including:
a processor for generating a second display area according to a selection operation of a first display area, the second display area partially overlapping the first display area;
and a display controller, configured to synchronously display the editing effect in the first display area and the second display area when an editing operation is performed in the second display area.
In the foregoing aspect, optionally, the processor is further configured to:
dividing a third display area in the first display area, the third display area being a portion of the first display area;
the display controller is further configured to:
displaying the editing effect in the third display area, and retaining the original image in the first display area.
In the above solution, optionally, the first display area and the second display area correspond to different applications respectively.
In the above scheme, optionally, the electronic device at least includes two display screens, namely a first display screen and a second display screen; the first display area is located on a first display screen of the electronic device, and the second display area is located on a second display screen of the electronic device.
In the above aspect, optionally, the display controller is further configured to send operation data corresponding to the editing operation to the processor;
the processor is further used for processing the first data based on the operation data to generate second data; wherein the first data is used to characterize a target object;
the display controller is further configured to control the first display area and the second display area to output editing effects in respective corresponding dimension representations based on the second data.
In a third aspect, an embodiment of the present invention further provides a computer storage medium, where computer executable instructions are stored, where the computer executable instructions are configured to perform the data processing method according to the embodiment of the present invention.
According to the technical solution of the present invention, the second display area is generated according to the selection operation on the first display area; an editing operation is performed in the second display area, and the editing effect is synchronously displayed in the first display area and the second display area. In this way, the interaction efficiency between the first display area and the second display area is improved. When an editing operation is performed on the target object in the second display area, the corresponding editing effect is displayed in the second display area and, at the same time, is synchronously displayed in the first display area, without the user having to perform an extra operation to call up the first display area; this simplifies user operation and greatly improves the user experience.
Drawings
FIG. 1 is a schematic diagram of an implementation flow of a data processing method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a first interaction between a first display area and a second display area according to an embodiment of the present invention;
FIG. 3 is a second interaction diagram of a first display area and a second display area according to an embodiment of the present invention;
FIG. 4 is a third interaction diagram of a first display area and a second display area according to an embodiment of the present invention;
FIG. 5 is a second schematic diagram of an implementation flow of a data processing method according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the invention.
Detailed Description
To make the features and objects of the present invention easier to understand, the invention briefly summarized above is described in more detail below with reference to the appended drawings; the invention is not limited to the embodiments described.
In the following embodiments of the data processing method and the electronic device provided by the present invention, the electronic device involved includes, but is not limited to: desktop computers, notebook computers, tablet computers, cell phones, televisions, wearable devices, and the like. The electronic device in the embodiments of the present invention may be a single-display-screen or a multi-display-screen electronic device.
The technical scheme of the invention is further elaborated below with reference to the drawings and specific embodiments.
Example 1
Fig. 1 is a schematic diagram of an implementation flow of a data processing method according to an embodiment of the present invention, which is applied to an electronic device, and the data processing method mainly includes the following steps:
step 101: and generating a second display area according to the selection operation of the first display area.
Here, the selection operation can be understood as: the user operates on the target object in the first display area.
The selection operation can be input through gestures and also can be input through voices.
As an alternative embodiment, the generating the second display area according to the selecting operation of the first display area includes:
acquiring a selection operation of a user for a target object displayed in a first display area;
and determining a second display area and contents to be displayed in the second display area according to the selection operation.
Specifically, an image is displayed in the first display area; when the electronic device detects a selection operation on the image by the user, it determines a second display area suitable for editing the image, and displays, in the second display area, the part or all of the image determined based on the selection operation.
For example, the user selects the right half of the image through gesture input, the electronic device determines the right half of the image based on gesture input information, determines a second display area suitable for editing the image, and displays the right half of the image in the second display area.
For another example, the user selects the upper half of the image through voice input, the electronic device determines the upper half of the image based on voice input information, determines a second display area suitable for editing the image, and displays the upper half of the image in the second display area.
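For illustration only, the following Python sketch mimics the selection-driven flow just described; the DisplayArea and Selection classes, the placement rule, and the field names are hypothetical and do not come from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Selection:
    """A user selection on the first display area (gesture or voice)."""
    region: Tuple[int, int, int, int]   # (x, y, width, height) of the selected part
    source: str                         # "gesture" or "voice"

@dataclass
class DisplayArea:
    bounds: Tuple[int, int, int, int]   # (x, y, width, height) on screen
    content: object                     # whatever is currently shown

def generate_second_display_area(first: DisplayArea, sel: Selection) -> DisplayArea:
    """Create a second display area that shows the selected part of the first
    area's content and is placed so that it partially overlaps the first area."""
    fx, fy, fw, fh = first.bounds
    # Illustrative placement: cover the right half of the first display area.
    second_bounds = (fx + fw // 2, fy, fw, fh)
    return DisplayArea(bounds=second_bounds,
                       content={"source": first.content, "region": sel.region})

first_area = DisplayArea(bounds=(0, 0, 800, 600), content="photo.jpg")
second_area = generate_second_display_area(
    first_area, Selection(region=(400, 0, 400, 600), source="gesture"))
```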
Here, the selection operation can also be understood as an operation in which the electronic device itself determines, for the target object currently displayed in the first display area, the information to be displayed in the second display area.
As an alternative embodiment, the generating the second display area according to the selecting operation of the first display area includes:
acquiring attribute information of a target object displayed in a first display area;
determining, according to the attribute information, a second display area and the content to be displayed in the second display area.
Here, the attribute information includes natural attribute information of the target object, and/or application attribute information.
Wherein the natural attribute information of the target object includes: characteristic information for the target object itself, such as attribute information of pictures, videos, codes, texts, and the like.
Wherein the application attribute information of the target object includes: an identifier of the application that displays the target object, such as an identifier characterizing the browser application, image editing application, text editing application, video playing application, or instant messaging application currently running in the first display area.
In an exemplary embodiment, an image is displayed in the first display area. The electronic device detects that the image has been opened by the gallery application in the first display area; when it determines that the user intends to retouch the image and identifies the portion to be retouched, it generates a second display area and displays that portion of the image in the second display area.
In another exemplary embodiment, a game test screen is displayed in the first display area. The electronic device detects that the game test screen is rendered by the game application in the first display area; when it determines that the user intends to modify the game code and identifies the range to be modified, it generates a second display area and displays, in the second display area, an editable code area matching that modification range.
In yet another exemplary embodiment, a live video is displayed in the first display area. The electronic device detects that the live video is output by the instant messaging application in the first display area; when it determines that the local user intends to show a certain state of himself or herself, it generates a second display area, starts the front camera, and displays the captured image of the user's state in the second display area.
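As a sketch of the attribute-driven choice in these examples, the mapping below mirrors the three cases; the attribute values and returned view names are invented for illustration and are not taken from the patent.

```python
def choose_second_area_content(natural_attr: str, app_attr: str) -> str:
    """Pick what the second display area should show, based on the target object's
    natural attribute and the application currently presenting it in the first area."""
    if app_attr == "gallery" and natural_attr == "picture":
        return "region-to-retouch"       # show the portion of the image to be retouched
    if app_attr == "game" and natural_attr == "code":
        return "editable-code-range"     # show the code range to be modified
    if app_attr == "instant-messaging" and natural_attr == "video":
        return "front-camera-preview"    # show the local user's camera feed
    return "default-editor"              # fallback for unrecognized combinations
```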
Preferably, the second display area partially overlaps the first display area.
Here, the partial overlap can be understood as follows: the display picture of the second display area overlaps the display picture of the first display area, so that the display content of the first display area changes along with the change of the display content of the second display area. For example, the display content is an image that is updated frame by frame.
Here, the partial overlap can also be understood as: the first display area and the second display area have partial overlapping on the display interface. For example, a first display window of the first display area may partially overlap a second display window of the second display area.
Step 102: perform an editing operation in the second display area, and synchronously display the editing effect in the first display area and the second display area.
In one embodiment, performing the editing operation in the second display area and synchronously displaying the editing effect in the first display area and the second display area includes:
transmitting operation data corresponding to the editing operation to a processor; wherein the editing operation is an editing operation for a target object;
the processor processes the first data based on the operation data to generate second data; wherein the first data is used to characterize a target object;
and controlling the first display area and the second display area to output editing effects in the corresponding dimension representation forms based on the second data.
Here, the dimensional representation forms include, but are not limited to, still image forms, moving image forms, audio forms, video forms, text forms, and voice forms.
Here, the editing operation is an operation for a target object supported by the application corresponding to the second display area.
Wherein the target object is at least a portion of an object presented in the first display area, or is something that characterizes an association with at least a portion of an object presented in the first display area.
Specifically, the sending the operation data corresponding to the editing operation to the processor includes:
and sending the operation data corresponding to the editing operation to the processor through a second thread corresponding to the second display area.
Specifically, before the controlling the first display area and the second display area to output editing effects in respective corresponding dimension representations based on the second data at the same time, the method further includes:
sending the second data to a second display area through a second thread corresponding to the second display area;
and sending the second data to the first display area through the first thread corresponding to the first display area.
Specifically, the controlling, based on the second data, the first display area and the second display area to simultaneously output editing effects in their respective dimension representation forms includes:
controlling the first display area to output the editing effect in its corresponding first dimension representation form based on the second data; and, at the same time,
controlling the second display area to output the editing effect in its corresponding second dimension representation form based on the second data.
Here, the first dimension representation may be the same as or different from the second dimension representation.
For example, the editing effect is displayed in the form of an image in the first display area, and the editing effect is also displayed in the form of an image in the second display area.
For another example, the editing effect is displayed in the form of an image in the first display area, and the editing effect is displayed in the form of a code in the second display area.
For another example, the editing effect is displayed in a live video format in the first display area, and the editing effect is displayed in an image format in the second display area.
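A minimal sketch of the per-area threads described above, assuming one queue-fed worker thread per display area; the queue-based design and the printed output are illustrative choices, not the patent's implementation.

```python
import queue
import threading

def render(area_name: str, form: str, inbox: "queue.Queue[dict]") -> None:
    """Worker for one display area: receives second data from that area's own
    thread and outputs the editing effect in the area's dimension representation form."""
    while True:
        second_data = inbox.get()
        if second_data is None:          # shutdown signal
            break
        print(f"{area_name}: showing edit {second_data['edit_id']} as {form}")

def dispatch_editing_effect(second_data: dict, inboxes: dict) -> None:
    """Send the same second data to every display area; each renders it in its own form."""
    for inbox in inboxes.values():
        inbox.put(second_data)

# One queue and one thread per display area, mirroring the first/second threads in the text.
inboxes = {"first": queue.Queue(), "second": queue.Queue()}
threads = [
    threading.Thread(target=render, args=("first display area", "image", inboxes["first"])),
    threading.Thread(target=render, args=("second display area", "code", inboxes["second"])),
]
for t in threads:
    t.start()

dispatch_editing_effect({"edit_id": 1, "payload": "..."}, inboxes)

for inbox in inboxes.values():
    inbox.put(None)
for t in threads:
    t.join()
```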
In the above aspect, further preferably, the method further includes:
step 100: a third display area is divided in the first display area, the third display area being a portion of the first display area.
The execution order of step 100 is not limited: step 100 may be performed simultaneously with step 101, before step 101, or after step 101 and before step 102.
Accordingly, the step 102 may be changed to:
when an editing operation is performed in the second display area, synchronously displaying the editing effect in the third display area and the second display area, while retaining the original image in the first display area.
In this way, the images before and after editing can be compared more conveniently, making the editing effect easier to perceive.
FIG. 2 is a first interaction diagram of the first display area and the second display area. As shown in FIG. 2, a third display area is divided from the first display area; the original image is displayed in the first display area, and the second display area is an image editing interface. When the user edits the original image in the second display area, the third display area synchronously displays the image obtained by overlapping the original image with the result edited in the second display area. Because the original image remains in the first display area while the overlapped image is shown in the third display area, the user can edit the image conveniently and easily observe the before-and-after comparison.
Taking FIG. 2 as an example, the first display area and the third display area form the main screen of the electronic device, and the second display area forms the secondary screen. An image to be processed is shown in the first display area. When the device receives a user gesture covering the right-eye region of the image with a hand, the gesture is interpreted as an instruction to enlarge the eye; an enlarged eye image is displayed in the second display area, and the third display area displays the image obtained by overlapping the first display area image with the second display area image. Further, if an eye mark is removed by an image editing operation in the second display area, a thumbnail corresponding to the image in the second display area (i.e., with that eye mark removed) is presented in the third display area at the same time. That is, the image displayed in the first display area is the original, unprocessed image; the second display area displays a partial enlarged view; and the third display area displays either a thumbnail corresponding to the image currently shown in the second display area or the image obtained by overlapping the first display area image with the processed second display area image. In summary, the content displayed in the third display area changes along with the display content of the second display area.
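Purely to illustrate the overlay behavior of the third display area in FIG. 2, the following Pillow-based sketch keeps the original untouched and composes a separate preview by pasting an edited region back onto a copy; the file names, the brightness edit, and the region coordinates are hypothetical.

```python
from PIL import Image, ImageEnhance

def third_area_composite(original: Image.Image, box: tuple) -> Image.Image:
    """Simulate the third display area: the original stays unchanged, and a
    separate image shows the original overlapped with the edited region."""
    # "Edit" the selected region in the second display area (here: brighten it).
    region = original.crop(box)
    edited_region = ImageEnhance.Brightness(region).enhance(1.4)
    # The third area shows the original with the edited region pasted on top;
    # the first area keeps displaying `original` unchanged.
    composite = original.copy()
    composite.paste(edited_region, box)
    return composite

original = Image.open("photo.jpg")                 # hypothetical input file
preview = third_area_composite(original, (200, 100, 400, 250))
preview.save("third_area_preview.jpg")
```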
According to the scheme, the first display area and the second display area can respectively correspond to different applications.
In this way, since the dimensional representation forms corresponding to different applications are different, if the first display area and the second display area correspond to different applications, respectively, the first display area and the second display area can display the same thing in different dimensional representation forms.
FIG. 3 is a second interaction diagram of the first display area and the second display area. As shown in FIG. 3, the second display area is a code editing interface, and the first display area shows the result of running the code currently edited in the second display area. Obviously, the first display area and the second display area correspond to different applications. For a programmer, a code editing interface is called up in the second display area and code is edited there; when a section of code is typed and run in the second display area, the running picture of that code is displayed in the first display area. In this way the programmer can edit code and test it at the same time, which improves code-checking efficiency and saves test time.
Still taking FIG. 3 as an example, the first display area forms the main screen of the electronic device and the second display area forms the secondary screen; specifically, the black area displaying the program and the keyboard area together form the secondary screen. The programmer types code in the keyboard area, and the currently typed code is displayed in the black area. The section of program edited by the programmer is run in the first display area, i.e., the scene produced by the code just written is displayed there, and whether the program has a defect (bug) is detected by observing whether the animation in the first display area is smooth and complete.
According to the scheme, the first display area is located on the first display screen of the electronic equipment, and the second display area is located on the second display screen of the electronic equipment.
Therefore, the first display area and the second display area can be better distinguished, and meanwhile, the interaction efficiency of the interfaces on the first display screen and the second display screen of the multi-display-screen electronic equipment can be improved.
FIG. 4 is a third interaction diagram of the first display area and the second display area. As shown in FIG. 4, the first display area is the main screen and the second display area is the secondary screen. A live video picture is displayed in the first display area, showing a teacher giving a lecture, with the students' listening state shown in the lower-left corner of the first display area; the second display area displays, in real time, the notes currently being taken by a student. In this way, the teacher can see the students' expressions while they listen and also know how they are taking notes; because the teacher learns the listening and note-taking situation in time, this is more helpful for teaching.
In the embodiment of the present invention, the second display area is generated according to the selection operation on the first display area; when an editing operation is performed in the second display area, the editing effect is synchronously displayed in the first display area and the second display area. In this way, the interaction efficiency between the first display area and the second display area is improved. When the second display area displays the editing effect corresponding to the editing operation, the editing effect is also synchronously displayed in the first display area without the user having to perform an extra operation to call up the first display area, which simplifies user operation and greatly improves the user experience.
Example two
FIG. 5 is a second implementation flow chart of a data processing method according to an embodiment of the present invention, applied to an electronic device that supports simultaneous display of N interaction interfaces, where N is a positive integer greater than or equal to 2. In this embodiment, it is assumed that the N interaction interfaces are simultaneously displayed on the display screen of the electronic device. The data processing method mainly includes the following steps:
Step 501: acquire an editing operation on the Nth interaction interface.
The editing operation is an editing operation on a target object in the Nth interaction interface.
Step 502: the processor processes the first data based on the editing operation to obtain second data.
Wherein the first data is used to characterize a target object.
Here, the editing operation is an operation for a target object supported by the application corresponding to the nth interactive interface.
Wherein the target object is at least a portion of an object present in the other N-1 interaction interfaces or is a representation of something associated with at least a portion of an object present in the other N-1 interaction interfaces.
Step 503: distribute the second data to the corresponding interaction interfaces through the N threads corresponding to the N interaction interfaces, and have the N interaction interfaces simultaneously output editing effects in different dimension representation forms based on the second data.
Here, the dimensional representation forms include, but are not limited to, still image forms, moving image forms, audio forms, video forms, text forms, and voice forms.
As one embodiment, the outputting, by the N interaction interfaces, the editing effect in different dimension representations based on the second data simultaneously includes:
determining a dimension representation form and a display area for each of the N interactive interfaces; wherein the display area is an area allocated for displaying the target object, and the display area is a part or all of the area on the interactive interface;
and simultaneously outputting editing effects in corresponding dimension representation forms based on the second data in N display areas corresponding to the N interactive interfaces.
In an optional embodiment, the determining the dimension representation and the display area for each of the N interactive interfaces includes:
acquiring the size of a maximum window supported by each of the N interactive interfaces and the display position of each interactive interface relative to the electronic equipment;
allocating dimension representation forms and display areas to the N interaction interfaces according to the characteristics of the determined dimension representation forms, in combination with the size of the maximum window supported by each interaction interface and the display position of each interaction interface relative to the electronic device.
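The allocation step can be sketched as follows; the size threshold and the form/area choices are invented heuristics standing in for the characteristics of the dimension representation forms, and the class fields are assumptions rather than the patent's data model.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class InterfaceInfo:
    name: str
    max_window: tuple   # (width, height) of the largest window this interface supports
    position: str       # display position relative to the device, e.g. "main-screen"

def allocate(interfaces: List[InterfaceInfo]) -> dict:
    """Assign each interaction interface a dimension representation form and a
    display area, based on its maximum window size and its display position."""
    allocation = {}
    for itf in interfaces:
        w, h = itf.max_window
        # Illustrative heuristic: large windows get video output, small ones get text.
        form = "video" if w * h >= 800 * 600 else "text"
        # Use the whole window for video, otherwise only the lower half of it.
        area = (0, 0, w, h) if form == "video" else (0, h // 2, w, h // 2)
        allocation[itf.name] = {"form": form, "area": area, "position": itf.position}
    return allocation

plan = allocate([
    InterfaceInfo("interface-1", (1920, 1080), "main-screen"),
    InterfaceInfo("interface-2", (800, 480), "secondary-screen"),
])
```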
Further, after the editing effect is output by the N interactive interfaces simultaneously in different dimension representations based on the second data, the method further includes:
when any one of the N interaction interfaces changes the target object due to a first permission operation, the other N-1 interaction interfaces each change along with the change of the target object in that first interaction interface; wherein the first permission operation is an operation on the target object that is supported by the interaction interface receiving the first permission operation.
As an optional implementation manner, when any one of the N interaction interfaces changes the target object due to the first permission operation, the N-1 interaction interfaces respectively change along with the change of the target object in the first interaction interface, including:
detecting a first interaction interface in the N interaction interfaces to receive a first operation, wherein the first operation is supported by the first interaction interface and is used for enabling the target object to change;
transmitting operation data corresponding to the first operation to the processor through a first thread corresponding to the first interactive interface;
the processor processes the first data based on the first operation data to generate second data;
and simultaneously distributing the second data to the corresponding interaction interfaces through N threads corresponding to the N interaction interfaces, and outputting editing effects by the N interaction interfaces based on the second data in different dimension representation forms.
Here, the first interactive interface is any one of the N interactive interfaces.
Further, after the editing effect is output by the N interactive interfaces simultaneously in different dimension representation forms based on the first data, the method further includes:
when any one of the N interaction interfaces changes due to a second permission operation, the other N-1 interaction interfaces do not change along with that interface; wherein the second permission operation is an operation, supported by the interaction interface that received it, that acts on the interaction interface itself but does not affect the target object.
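To illustrate the difference between the two kinds of permission operations, the following sketch broadcasts target-object changes to every interface while keeping interface-local changes confined to their source; the Coordinator class and its method names are hypothetical.

```python
class InteractionInterface:
    def __init__(self, name: str, form: str):
        self.name = name
        self.form = form
        self.view_state = {}      # interface-local state (scroll position, zoom, ...)

    def show(self, target_state: dict) -> None:
        print(f"{self.name}: rendering target object {target_state} as {self.form}")

class Coordinator:
    """Broadcasts target-object changes to every interface, but keeps
    interface-local changes confined to the interface that made them."""
    def __init__(self, interfaces):
        self.interfaces = interfaces
        self.target_state = {"version": 0}

    def first_permission_operation(self, source: InteractionInterface, change: dict) -> None:
        # Changes the target object, so every interface is updated.
        self.target_state.update(change)
        self.target_state["version"] += 1
        for itf in self.interfaces:
            itf.show(self.target_state)

    def second_permission_operation(self, source: InteractionInterface, change: dict) -> None:
        # Only affects the source interface itself; the other N-1 interfaces do not change.
        source.view_state.update(change)

interfaces = [InteractionInterface("interface-1", "image"),
              InteractionInterface("interface-2", "code")]
coord = Coordinator(interfaces)
coord.first_permission_operation(interfaces[1], {"text": "edited"})   # all interfaces refresh
coord.second_permission_operation(interfaces[0], {"zoom": 2.0})       # only interface-1's view changes
```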
In the above scheme, the N interaction interfaces are distributed on N display screens of the electronic device, or the N interaction interfaces are distributed on N-x display screens of the electronic device, x is a positive integer greater than or equal to 1, and at least one display screen in the N-x display screens has 2 or more interaction interfaces.
In this embodiment of the invention, when an editing operation on the Nth interaction interface is detected, the N interaction interfaces simultaneously output the corresponding editing effect in different dimension representation forms based on the editing operation on the Nth interaction interface. In this way, the interaction efficiency between the interaction interfaces is improved, and the user experience is further improved.
Example III
An embodiment of the present invention provides an electronic device, as shown in fig. 6, including:
a processor 10 for generating a second display area according to a selection operation of a first display area, the second display area partially overlapping the first display area;
and a display controller 20 for synchronously displaying editing effects in the first display area and the second display area when editing operation is performed in the second display area.
As an alternative embodiment, the processor 10 is further configured to:
dividing a third display area in the first display area, the third display area being a portion of the first display area;
the display controller 20 is further configured to:
displaying the editing effect in the third display area, and retaining the original image in the first display area.
As an optional embodiment,
the display controller 20 is further configured to send operation data corresponding to the editing operation to the processor 10;
the processor 10 is further configured to process the first data based on the operation data, and generate second data; wherein the first data is used to characterize a target object;
the display controller 20 is further configured to control the first display area and the second display area to output editing effects in respective corresponding dimension representations based on the second data.
Here, the dimension representation forms include, but are not limited to, still image forms, moving image forms, audio forms, video forms, text forms, and voice forms.
Optionally, the first display area and the second display area respectively correspond to different applications.
Optionally, the first display area and the second display area correspond to the same application.
Optionally, the electronic device at least comprises a first display screen and a second display screen; the first display area may be located on a first display screen of the electronic device and the second display area may be located on a second display screen of the electronic device.
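A toy sketch of the processor / display controller split described in this embodiment; the method names, the dictionary-based data, and the printed output are assumptions used only to show the data flow.

```python
class Processor:
    """Generates the second display area and turns operation data into second data."""
    def generate_second_display_area(self, first_area_bounds: tuple, selection: dict) -> tuple:
        x, y, w, h = first_area_bounds
        # Partially overlap the first display area (illustrative placement).
        return (x + w // 2, y, w, h)

    def process(self, first_data: dict, operation_data: dict) -> dict:
        # Apply the editing operation to the data characterizing the target object.
        second_data = dict(first_data)
        second_data.update(operation_data)
        return second_data

class DisplayController:
    """Forwards operation data to the processor and drives both display areas."""
    def __init__(self, processor: Processor):
        self.processor = processor

    def on_editing_operation(self, first_data: dict, operation_data: dict) -> None:
        second_data = self.processor.process(first_data, operation_data)
        # Each area outputs the same editing effect in its own dimension representation form.
        self.output("first display area", "image", second_data)
        self.output("second display area", "code", second_data)

    def output(self, area: str, form: str, data: dict) -> None:
        print(f"{area}: editing effect {data} shown as {form}")

controller = DisplayController(Processor())
controller.on_editing_operation({"object": "target"}, {"edit": "enlarge-eye"})
```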
Those skilled in the art should understand that the functions of the modules of the electronic device according to the embodiment of the present invention can be understood with reference to the foregoing description of the data processing method. The modules of the electronic device according to the embodiment of the present invention may be implemented by analog circuits that realize the functions described in the embodiment of the present invention, or by running, on the electronic device, software that realizes the functions described in the embodiment of the present invention.
The electronic device according to the embodiment of the present invention provides specific implementation hardware for the methods according to the first to second embodiments, and can be used to implement any of the technical solutions according to the first to second embodiments.
The present invention also provides a computer storage medium storing a computer program which, when executed by a processor, implements any one or more of the data processing methods described in the foregoing embodiments and applied to the electronic device side. The computer storage medium may be any of various types of storage media, and in this embodiment is preferably a non-transitory storage medium.
In the embodiments provided by the present invention, it should be understood that the disclosed method, apparatus, and electronic device may be implemented in other manners. The device embodiments described above are only illustrative; for example, the division of the units is only a logical functional division, and there may be other divisions in practice, such as: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described as separate units may or may not be physically separate, and units displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present invention may be integrated in one processing unit, or each unit may be separately used as one unit, or two or more units may be integrated in one unit; the integrated units may be implemented in hardware or in hardware plus software functional units.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the above method embodiments may be implemented by hardware associated with program instructions, where the foregoing program may be stored in a computer readable storage medium, and when executed, the program performs steps including the above method embodiments; and the aforementioned storage medium includes: a mobile storage device, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk or an optical disk, or the like, which can store program codes.
Alternatively, the integrated units of the embodiments of the present invention may be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as separate products. Based on such understanding, the technical solutions of the embodiments of the present invention may be embodied in essence or a part contributing to the prior art in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, ROM, RAM, magnetic or optical disk, or other medium capable of storing program code.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. A data processing method applied to an electronic device, the method comprising:
generating a second display area according to a selection operation of the first display area, wherein the second display area is partially overlapped with the first display area; wherein the first display area and the second display area correspond to different applications respectively;
performing an editing operation in the second display area, and synchronously displaying the editing effect in the first display area and the second display area;
wherein the generating a second display area according to the selection operation of the first display area includes:
acquiring attribute information of a target object displayed in a first display area;
determining a second display area and contents correspondingly displayed in the second display area according to the attribute information; the dimensional representation of the content displayed in the second display area is different from the dimensional representation of the content displayed in the first display area.
2. The method according to claim 1, wherein the method further comprises:
dividing a third display area in the first display area, the third display area being a portion of the first display area;
the step of synchronously displaying editing effects in the first display area and the second display area comprises the following steps:
displaying the editing effect in the third display area, and retaining the original image in the first display area.
3. The method according to claim 1, wherein
the first display area is located on a first display screen of the electronic device, and the second display area is located on a second display screen of the electronic device.
4. The method according to claim 1, wherein the editing operation is performed in the second display area, and the editing effect is synchronously displayed in the first display area and the second display area, including:
transmitting operation data corresponding to the editing operation to a processor;
the processor processes the first data based on the operation data to generate second data; wherein the first data is used to characterize a target object;
and controlling the first display area and the second display area to output editing effects in the corresponding dimension representation forms based on the second data.
5. An electronic device, the electronic device comprising:
a processor for generating a second display area according to a selection operation of a first display area, the second display area partially overlapping the first display area; wherein the first display area and the second display area correspond to different applications respectively;
a display controller, configured to synchronously display the editing effect in the first display area and the second display area when an editing operation is performed in the second display area;
wherein the generating a second display area according to the selection operation of the first display area includes:
acquiring attribute information of a target object displayed in a first display area;
determining a second display area and contents correspondingly displayed in the second display area according to the attribute information; the dimensional representation of the content displayed in the second display area is different from the dimensional representation of the content displayed in the first display area.
6. The electronic device of claim 5, wherein the processor is further configured to:
dividing a third display area in the first display area, the third display area being a portion of the first display area;
the display controller is further configured to: displaying the editing effect in the third display area, and retaining the original image in the first display area.
7. The electronic device of claim 5, wherein the electronic device comprises at least two display screens, a first display screen and a second display screen; the first display area is located on a first display screen of the electronic device, and the second display area is located on a second display screen of the electronic device.
8. The electronic device according to claim 5, wherein
the display controller is further used for sending operation data corresponding to the editing operation to the processor;
the processor is further used for processing the first data based on the operation data to generate second data; wherein the first data is used to characterize a target object;
the display controller is further configured to control the first display area and the second display area to output editing effects in respective corresponding dimension representations based on the second data.
CN201810103789.1A 2018-02-01 2018-02-01 Data processing method and electronic equipment Active CN108427589B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810103789.1A CN108427589B (en) 2018-02-01 2018-02-01 Data processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810103789.1A CN108427589B (en) 2018-02-01 2018-02-01 Data processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN108427589A CN108427589A (en) 2018-08-21
CN108427589B (en) 2023-07-21

Family

ID=63156436

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810103789.1A Active CN108427589B (en) 2018-02-01 2018-02-01 Data processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN108427589B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109271122B (en) * 2018-09-03 2022-07-01 高新兴科技集团股份有限公司 File display method, device and equipment based on double display screens
CN109597550A (en) * 2018-11-19 2019-04-09 维沃移动通信有限公司 A kind of image processing method and mobile terminal
WO2020107173A1 (en) * 2018-11-26 2020-06-04 深圳市大疆创新科技有限公司 Interface content adjustment method, electronic device and machine-readable storage medium
CN111722771B (en) * 2019-03-20 2022-06-07 富士胶片实业发展(上海)有限公司 Image association display method and device and computer readable medium
CN112533021B (en) 2019-09-19 2023-04-11 Vidaa(荷兰)国际控股有限公司 Display method and display equipment
CN111158620B (en) * 2019-12-26 2020-11-24 成都星时代宇航科技有限公司 Picture display method and device and terminal

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103282955A (en) * 2010-10-01 2013-09-04 Flex Electronics ID Co.,Ltd. Displaying the desktop upon device open

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10303328B2 (en) * 2015-09-14 2019-05-28 Lg Electronics Inc. Mobile terminal and method for controlling the same

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103282955A (en) * 2010-10-01 2013-09-04 Flex Electronics ID Co.,Ltd. Displaying the desktop upon device open

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
百度经验: Photoshop如何新建实时预览窗口, https://jingyan.baidu.com/article/1e5468f92c5702484961b73a.html, 2013-11-25, full text *
百度经验: Lightroom如何快速对比照片修改前后及快捷键, https://jingyan.baidu.com/article/d5a880eb6b907113f147ccb2.html, 2014-01-29, pp. 1-6 *
Scott Kelby: Using Two Monitors in Lightroom, https://lightroomkillertips.com/using-two-monitors-lightroom/, 2017-09-22, pp. 1-6 *
Scott Kelby: Using Two Monitors in Lightroom, https://lightroomkillertips.com/using-two-monitors-lightroom/, 2017, pp. 1-6 *

Also Published As

Publication number Publication date
CN108427589A (en) 2018-08-21

Similar Documents

Publication Publication Date Title
CN108427589B (en) Data processing method and electronic equipment
JP6165846B2 (en) Selective enhancement of parts of the display based on eye tracking
US9049482B2 (en) System and method for combining computer-based educational content recording and video-based educational content recording
TWI478043B (en) Systems and methods for app page template generation, and storage medium thereof
CN111866423A (en) Screen recording method for electronic terminal and corresponding equipment
US10593018B2 (en) Picture processing method and apparatus, and storage medium
JP2018520450A (en) Information processing method, terminal, and computer storage medium
CN104765528A (en) Display method and device of virtual keyboard
CN112437353A (en) Video processing method, video processing apparatus, electronic device, and readable storage medium
KR20160106970A (en) Method and Apparatus for Generating Optimal Template of Digital Signage
US20230244363A1 (en) Screen capture method and apparatus, and electronic device
CN114339363B (en) Picture switching processing method and device, computer equipment and storage medium
CN108984263B (en) Video display method and device
WO2024066752A1 (en) Display control method and apparatus, head-mounted display device, and medium
CN112887794A (en) Video editing method and device
CN110971955B (en) Page processing method and device, electronic equipment and storage medium
CN107680038B (en) Picture processing method, medium and related device
CN107995538B (en) Video annotation method and system
CN115437736A (en) Method and device for recording notes
JP2020527814A (en) Systems and methods for creating and displaying interactive 3D representations of real objects
CN115086747A (en) Information processing method and device, electronic equipment and readable storage medium
CN113919997A (en) Watermark processing method and device, electronic equipment and storage medium
CN114302209A (en) Video processing method, video processing device, electronic equipment and medium
CN107197387B (en) Method and device for displaying video information on webpage in time-sharing manner
KR101116538B1 (en) Choreography production system and choreography production method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant