CN113778356A - Image processing method and system - Google Patents

Image processing method and system

Info

Publication number
CN113778356A
CN113778356A · CN202110882840.5A
Authority
CN
China
Prior art keywords
interface
display
display area
icon
acquiring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110882840.5A
Other languages
Chinese (zh)
Inventor
苏剑峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Wanxiang Electronics Technology Co Ltd
Original Assignee
Xian Wanxiang Electronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Wanxiang Electronics Technology Co Ltd filed Critical Xian Wanxiang Electronics Technology Co Ltd
Priority to CN202110882840.5A
Publication of CN113778356A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1407: General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
    • G06F 3/1423: Controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1438: Controlling a plurality of local displays using more than one graphics controller

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides an image processing method and system, relates to the field of electronic information technology, and can solve the problem of privacy exposure caused by a terminal device projecting the entire operation interface of an application program onto an external screen. The specific technical scheme is as follows: the display interface of an application program is divided into a plurality of areas; whether each display area is projected is determined according to a preset display strategy; images are captured from the areas that allow screen projection and sent to the display device corresponding to each area. The method and system are used for displaying the operation interface of a terminal device.

Description

Image processing method and system
Technical Field
The present disclosure relates to the field of electronic information technologies, and in particular, to an image processing method and system.
Background
At present, in order to improve the display effect of images, more and more screen projection products are used. For example, the screen of the terminal device is projected on a television screen or projected on a large screen for display.
However, during projection, all content displayed by the mobile phone is projected onto the screen, so the user's privacy cannot be guaranteed. Moreover, traditional screen projection devices on the market can only project the whole screen or whole picture, and cannot meet users' growing demands for fine-grained, intelligent screen projection.
Disclosure of Invention
The embodiment of the disclosure provides an image processing method and system, which can solve the problem of privacy exposure caused by a terminal device projecting the entire operation interface of an application program. The technical scheme is as follows:
according to a first aspect of embodiments of the present disclosure, there is provided an image processing method, including:
when the running of the target program is detected, acquiring an interface of the target program, and determining a display area corresponding to the interface, wherein the display area at least comprises a first display area and a second display area;
acquiring a first interface of the target program in the first display area, and sending the first interface to first display equipment matched with the first display area;
and when the use state of the target program in the second display area is "not in use", acquiring an icon of a second interface of the target program in the second display area, and displaying the icon in the first interface.
In one embodiment, the method further comprises:
acquiring a second interface of the target program in the second display area;
when the second interface changes within a preset time, determining that the use state of the second display area is "in use";
when the second interface does not change within the preset time, determining that the use state of the second display area is "not in use";
or,
when the target application is detected to receive a user operation instruction in the second display area within the preset time, determining that the use state of the second display area is "in use";
and when no user operation instruction is detected in the second display area within the preset time, determining that the use state of the second display area is "not in use".
In one embodiment, the method further comprises:
extracting a functional module of the target program in a second interface in the second display area;
generating an icon of the second interface according to the functional module in the second interface, wherein the icon is used for triggering the functional module;
alternatively,
extracting a message list of the target program in a second interface in the second display area;
and generating an icon of the second interface according to the message list in the second interface, wherein the icon is used for triggering the display of the message list.
In one embodiment, displaying the icon in the first interface includes:
acquiring an icon of the second interface and a display position of the icon, wherein the display position is used for indicating the position of the icon in the first interface;
according to the display position of the icon and a preset display rule, performing superposition processing on the icon on a first interface;
and displaying the first interface subjected to the superposition processing in a first display area.
In one embodiment, the method further comprises:
when a first operation instruction is detected in the first display area, transferring the icon of the second interface to the second display area, and acquiring a third interface of the target application after the target application responds to the first operation instruction in the second display area, wherein the first operation instruction is an instruction directed at the icon of the second interface;
and displaying the third interface in the second display area.
In one embodiment, the method further comprises:
acquiring a display interface of a target application, and dividing the display interface into N sub-display areas, wherein N is greater than 1;
and determining M display devices corresponding to the N sub-display areas according to a preset display rule, wherein M is less than or equal to N, and the display devices are used for displaying the images of the sub-display areas.
According to the image processing method provided by the embodiment of the disclosure, the display interface of an application program can be divided into a plurality of areas; whether each display area is projected is determined according to a preset display strategy; images are captured from the areas that allow screen projection and sent to the display device corresponding to each area, while only icon information is captured from areas that do not allow projection, and that icon information is displayed within a projected area. The system can thus project part of a screen or picture, or deliver the pictures and content a user wants to project to the display components in a chosen order or at chosen times. This improves the flexibility of image display, meets customers' demands for intelligent projection, and greatly facilitates use; the scheme can be widely applied to unattended meeting rooms, unattended classrooms, commercial advertising, and the like.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing system including:
the system comprises a server and terminal equipment, wherein the terminal equipment is provided with a screen projection control module;
the terminal device is used for determining, through the screen projection control module, a display area corresponding to the target program when the running of the target program is detected, wherein the display area at least comprises a first display area and a second display area;
acquiring a first interface of the target program in the first display area;
when the use state of the target program in the second display area is "not in use", acquiring an icon of a second interface of the target program in the second display area;
after the icon of the second interface is displayed in the first interface, the first interface is sent to a server;
the server is used for receiving the first interface and sending the first interface to first display equipment matched with the first display area.
In one embodiment, the system further comprises a display device, the display device comprising a receiving device;
and the display device displays the first interface sent by the server through the receiving device.
In one embodiment, the screen projection control module in the system is further configured for
Acquiring an icon of the second interface and a display position of the icon, wherein the display position is used for indicating the position of the icon in the first interface;
according to the display position of the icon and a preset display rule, performing superposition processing on the icon on a first interface;
and displaying the first interface subjected to the superposition processing in a first display area.
In one embodiment, the screen projection control module in the system is further configured for
Acquiring a display interface of a target application, and dividing the display interface into N sub-display areas, wherein N is greater than 1;
and determining M display devices corresponding to the N sub-display areas according to a preset display rule, wherein M is less than or equal to N, and the display devices are used for displaying the images of the sub-display areas.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a first flowchart of an image processing method provided by an embodiment of the present disclosure;
fig. 1a is a first schematic diagram of a display interface of an image processing method according to an embodiment of the present disclosure;
fig. 1b is a second schematic diagram of a display interface of an image processing method according to an embodiment of the present disclosure;
fig. 2 is a second flowchart of an image processing method provided by an embodiment of the present disclosure;
FIG. 3 is a block diagram of an image processing system provided by an embodiment of the present disclosure;
fig. 3a is a block diagram of a terminal device in an image processing system according to an embodiment of the disclosure;
fig. 3b is a deployment diagram of an image processing system provided by an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
An embodiment of the present disclosure provides an image processing method, as shown in fig. 1, the image processing method including the steps of:
101. when the running of the target program is detected, the interface of the target program is obtained, and the display area corresponding to the interface is determined.
The display area includes at least a first display area and a second display area.
In an alternative embodiment, the first display area may be a display area of content disclosed in the target program, and the second display area may be a display area of content hidden in the target program.
In an optional embodiment, the display area may be divided according to a submodule in the target program, or may be divided according to the content of the display interface in the target program.
For example, when the target application is the WeChat program, the upper half of the WeChat display area is set as the first display area and the lower half as the second display area; the first display area displays only the WeChat application name and the number of messages, while the second display area displays the details of specific contacts in WeChat.
In a specific implementation process, the display area may be set according to a user's requirement, for example, the display area may be divided into N display areas, and specific display content in each display area is confirmed.
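As an illustrative sketch only (the patent specifies no code; the `Rect` structure, the function name, and the horizontal-band split are all assumptions), the division of a display interface into N sub-areas described above could look like this:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: int
    y: int
    width: int
    height: int

def divide_interface(interface: Rect, n: int) -> list[Rect]:
    """Split a display interface into n horizontal sub-areas (n > 1)."""
    if n <= 1:
        raise ValueError("the interface must be divided into more than one area")
    band = interface.height // n
    regions = []
    for i in range(n):
        # The last band absorbs any rounding remainder.
        h = band if i < n - 1 else interface.height - band * i
        regions.append(Rect(interface.x, interface.y + band * i, interface.width, h))
    return regions
```

The specific display content of each sub-area would then be assigned by the user, as described above.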
The display-area division diagram shown in fig. 1a gives a specific example of the display effect described in the present disclosure:
screen 1 is the area projected to the display device, and screen 2 is the area that is not projected.
Screen 2 may display a preset privacy APP screen or an interface the user is currently browsing.
In this embodiment, the display area corresponding to screen 1 is the first area (also referred to as the projection area), and the display area corresponding to screen 2 is the second area.
Optionally, the first area and the second area are of equal size, that is, the screen is split evenly.
In another alternative embodiment, the area of the second region is larger than that of the first region.
The display-area division diagram shown in fig. 1b gives another specific example of the display effect described in the present disclosure:
in the embodiment of fig. 1b, the display of the target application is divided into three parts: a first display area, a second display area, and a third display area. The third display area (picture 3) is not projected; the first display area (picture 1) and the second display area (picture 2) are projected onto screen 1 and screen 2, respectively.
In an alternative implementation, a plurality of programs may each run on the terminal device, and the running picture of each program is shown in different display areas: for example, application A is divided into two display areas and application B into four display areas.
Specifically, the display area division schematic diagrams for different applications can refer to fig. 1: the terminal equipment can be divided into 4 screens, 8 screens, 16 screens and the like. The specific screen projection mapping relationship can be set by a user. For example, the screen projection pictures may correspond to the screens one to one, or a part of the pictures may be projected to a plurality of screens, and the other part of the pictures may be projected to one screen.
Specifically, the device on which the target program is installed may be a mobile phone, a computer, a tablet, or a split-screen display.
102. And acquiring a first interface of the target program in the first display area, and sending the first interface to first display equipment matched with the first display area.
The method provided by the present disclosure may be implemented by sending the first interface to a first display device that matches the first display region.
Specifically, the intelligent screen projection control module can control the acquisition device to capture the first interface and then generate a first display instruction, wherein the first display instruction is used for instructing the first display device to display the first interface;
the first display instruction is sent to the server;
and the server locates the first display device according to the first display instruction and sends the first interface to the first display device.
In the method provided by the present disclosure, a mapping relationship between the display area and the display device is also established, specifically:
acquiring a display interface of a target application, and dividing the display interface into N sub-display areas, wherein N is greater than 1;
and determining M display devices corresponding to the N sub-display areas according to a preset display rule, wherein M is less than or equal to N, and the display devices are used for displaying the images of the sub-display areas.
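The N-sub-areas-to-M-devices mapping (M ≤ N) can be sketched as follows. The rule used here — one-to-one assignment while devices remain, with surplus areas sharing the last device — is an assumption; the patent only requires that some preset display rule produce the mapping:

```python
def map_areas_to_devices(areas: list[str], devices: list[str]) -> dict[str, str]:
    """Assign each of N sub-display areas to one of M display devices, M <= N."""
    if not devices or len(devices) > len(areas):
        raise ValueError("need 1 <= M <= N display devices")
    mapping = {}
    for i, area in enumerate(areas):
        # One-to-one where possible; any surplus areas share the last device.
        mapping[area] = devices[min(i, len(devices) - 1)]
    return mapping
```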
103. And when the use state of the target program in the second display area is "not in use", acquiring the icon of the second interface of the target program in the second display area and displaying the icon in the first interface.
And after the icon of the second interface is displayed in the first interface, the first interface is sent to the first display equipment.
In the method provided by the present disclosure, the icon of the second interface is an operable control, such as a button, which can respond to touch operations such as a user's click, press, or slide, so as to acquire data or activate a function.
The method provided by the present disclosure further includes detecting a use state of the target program in the second display area, specifically including:
acquiring a second interface of the target program in the second display area;
when the second interface changes within the preset time, determining that the use state of the second display area is "in use";
when the second interface does not change within the preset time, determining that the use state of the second display area is "not in use";
or,
when the target application is detected to receive a user operation instruction in the second display area within the preset time, determining that the use state of the second display area is "in use";
and when no user operation instruction is detected in the second display area within the preset time, determining that the use state of the second display area is "not in use".
The method provided by the present disclosure further includes generating an icon of the second interface according to the function module in the second interface, specifically including:
extracting a functional module of the target program in a second interface in the second display area;
generating an icon of the second interface according to the functional module in the second interface, wherein the icon is used for triggering the functional module;
and acquiring the user's operation on the icon, and triggering the corresponding function according to that operation.
For example, the second interface includes a music function, the second interface may display an interface for playing music, such as lyric information, playing information, personal information, and the like, and the icon of the second interface may be an icon for activating the music function or an icon for controlling the music function.
The method provided by the present disclosure further includes generating an icon of the second interface according to the message list in the second interface, and specifically includes:
extracting a message list of the target program in a second interface in the second display area;
and generating an icon of the second interface according to the message list in the second interface, wherein the icon is used for triggering the display of the message list.
For example, the second interface includes a communication function; a message list of the communication function may be displayed in the second interface, the message list including at least the communication time, a communication session window, and identification information of the communication party. The icon of the second interface may be summary information of the communication function: the summary information includes the number and status of communications and does not display specific communication content.
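A minimal sketch of generating such a summary icon, assuming illustrative field names (`read`, `body`): only counts survive into the icon, so no private content can reach the projected interface.

```python
def make_summary_icon(messages: list[dict]) -> dict:
    """Build a summary icon from a message list: count and unread status only."""
    unread = sum(1 for m in messages if not m.get("read", False))
    return {
        "kind": "message_summary",
        "total": len(messages),
        "unread": unread,
        # Deliberately no sender, timestamp, or body fields.
    }
```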
Displaying the icon in the first interface includes the following steps:
acquiring an icon of the second interface and a display position of the icon, wherein the display position is used for indicating the position of the icon in the first interface;
according to the display position of the icon and a preset display rule, superimposing the icon on the first interface;
and displaying the first interface subjected to the superposition processing in a first display area.
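The superposition step might be sketched as below; clamping the icon's display position inside the first interface stands in for the unspecified "preset display rule", and all dictionary keys are assumptions:

```python
def overlay_icon(first_interface: dict, icon: dict, x: int, y: int) -> dict:
    """Superimpose an icon onto the first interface at a clamped position."""
    # Keep the icon fully inside the first interface (our stand-in display rule).
    x = max(0, min(x, first_interface["width"] - icon["width"]))
    y = max(0, min(y, first_interface["height"] - icon["height"]))
    overlays = first_interface.get("overlays", []) + [{"icon": icon, "x": x, "y": y}]
    return {**first_interface, "overlays": overlays}
```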
The method provided by the present disclosure further includes detecting whether the icon of the second interface in the first interface is triggered, and specifically includes:
when a first operation instruction is detected in the first display area, transferring the icon of the second interface to the second display area, and acquiring a third interface of the target application after the target application responds to the first operation instruction in the second display area, wherein the first operation instruction is an instruction of a pointer to the icon of the second interface;
and displaying the third interface in the second display area.
For example, the icon of the second interface is an incoming-call reminder icon; when a user operation is detected in the first interface and the icon is determined to be triggered, the incoming-call reminder is transferred to the second screen and displayed in full-screen mode.
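A hypothetical sketch of this trigger flow, using an illustrative state dictionary (none of these names come from the patent): the icon leaves the projected area, and the program's response (the "third interface") is shown only in the second, private area.

```python
def handle_icon_trigger(state: dict) -> dict:
    """On a first operation instruction, move the icon off the projected area
    and show the target application's third interface in the second area."""
    entry = state["first_area"]["overlays"].pop()          # icon leaves the projected interface
    third_interface = {"kind": "detail", "source_icon": entry["icon"]["kind"]}
    state["second_area"]["interface"] = third_interface    # full detail shown privately
    return state
```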
As above, the projected picture may also be subject to a preset rule limiting the maximum number of segments: for example, when capture exceeds 16 segmented pictures, the projection exceeds the range the system supports. Alternatively, no limit is imposed, in which case any region of the desktop may be freely selected for capture and projection.
According to the image processing method provided by the embodiment of the disclosure, the display interface of an application program can be divided into a plurality of areas; whether each display area is projected is determined according to a preset display strategy; images are captured from the areas that allow screen projection and sent to the display device corresponding to each area, while only icon information is captured from areas that do not allow projection, and that icon information is displayed within a projected area. The system can thus project part of a screen or picture, or deliver the pictures and content a user wants to project to the display components in a chosen order or at chosen times. This improves the flexibility of image display, meets customers' demands for intelligent projection, and greatly facilitates use; the scheme can be widely applied to unattended meeting rooms, unattended classrooms, commercial advertising, and the like.
Based on the image processing method provided by the embodiment corresponding to fig. 1, another embodiment of the present disclosure provides an image processing method, which may be applied to a terminal device. Referring to fig. 2, the image processing method provided in this embodiment includes the following steps:
201. and detecting the running target application, and determining the display position of the corresponding picture based on the identification information of the target application.
The identification information of the target application may include information such as a name, weight information, authority information, etc. of the target application.
The display position of the screen refers to a display position corresponding to the display content in the target application.
The method provided by the present disclosure may divide the display content of the target application into two parts for display: for example, only the upper half of the screen is projected onto the display device, while applications or information related to the user's privacy are shown only in the lower half and are not projected.
Specifically, what counts as privacy or private information in the target application may be set in advance by the user, such as WeChat, notes, conversations, microblogs, or other content.
In the method provided by the disclosure, the upper half of the display content in the target application is designated as the first interface (the first area), and the lower half as the second area.
202. And capturing the upper half picture of the target application, and sending the upper half picture to display equipment for displaying.
The display device in the method provided by the present disclosure includes a receiver, which may be a wireless communication module.
In the case of one-to-one projection, no additional transmitter or receiver needs to be connected.
One-to-one projection refers to projecting one picture onto one screen.
In the case of one-to-many projection, one picture is projected onto a plurality of screens.
Many-to-many projection projects a plurality of pictures onto a plurality of screens; in this case, if the terminal device has a plurality of split screens, additional transmitters need to be connected. The terminal device contains one transmitter by default. Optionally, all projection screens may use external transmitters.
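Under the stated assumption of one transmitter per projected picture and one built-in transmitter, the count of additional transmitters is simple arithmetic (function name and the per-picture assumption are ours, not the patent's):

```python
def extra_transmitters_needed(pictures: int, built_in: int = 1) -> int:
    """Transmitters to connect beyond the terminal's built-in one, assuming
    one transmitter per projected picture."""
    return max(0, pictures - built_in)
```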
The display effect is as shown in fig. 1a: screen 1 is the projected area, and screen 2 is the area that is not projected. Screen 2 may display a preset privacy APP screen or an interface the user is currently browsing.
In the method provided by the present disclosure, a display area corresponding to an upper half area in the target application is a first area (also referred to as a projection area), and a display area corresponding to a lower half area is a second area.
203. A second zone usage status in the target application is detected.
And if the display picture of the second area does not change within the preset time, the use state is determined to be "not in use".
If the display picture of the second area changes, or a touch operation by the user is detected, the state of the second area is determined to be "in use".
204. And detecting the use state of the incoming call module in the target application.
If the incoming call module is in the use state, 205 is executed;
if the incoming call module is not in use, then 206 is performed.
205. And displaying the first incoming call reminder across the whole of the second interface.
The first incoming call reminder displays information of the caller, such as a telephone number.
And if the caller is in the address book, the remark information in the address book is displayed, such as Zhang Sanyuan, 1581234567890, and the like.
206. And displaying the second incoming call reminder on the first interface.
The second incoming call reminder does not display the caller's telephone number or remark information; it only indicates that there is an incoming call.
The second incoming call reminder is displayed on the first interface, one layer above the picture previously shown in the first interface.
207. And after a user operation is detected, transferring the second incoming call reminder to the second interface and displaying it in full-screen mode.
The user operation may be a shortcut key set in advance or an operation of clicking the first screen region.
In steps 204 to 207, when there is no incoming call information, the scheme can be further extended to other applications; the running applications include instant chat tools such as WeChat, short messages, and the like, and the display mode is determined based on the use state. The specific application list can be customized by the user.
Example two
Based on the image processing method described in the embodiment corresponding to fig. 1 and fig. 2, the following is an embodiment of the system of the present disclosure, which can be used to execute an embodiment of the method of the present disclosure.
An embodiment of the present disclosure provides an image processing system, as shown in fig. 3, the image processing system 30 includes: a server 302 and a terminal device 301, as shown in fig. 3a, the terminal device 301 is configured with a screen-projection control module 3011;
the terminal device 301 is configured to determine, by using the screen projection control module, a display area corresponding to the target program when the running of the target program is detected, where the display area at least includes a first display area and a second display area;
acquiring a first interface of the target program in the first display area;
when the use state of a second interface of the target program in the second display area is not used, acquiring an icon of the second interface of the target program in the second display area;
after the icon of the second interface is displayed in the first interface, the first interface is sent to a server;
the server 302 is configured to receive the first interface and send the first interface to a first display device matched with the first display area.
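The terminal-side flow just described — determine the display areas, capture the first interface, substitute an icon for an unused second interface, and hand the result to the server — can be summarized in a minimal sketch. Every class and function name below is an illustrative assumption, not an API from the disclosure.

```python
class Program:
    """Minimal stand-in for a target program with two display areas."""
    def __init__(self, in_use_second: bool):
        self.in_use_second = in_use_second

    def capture(self, area: str) -> dict:
        # a real implementation would grab pixels; here a tagged frame suffices
        return {"area": area, "overlays": []}

def terminal_flow(program: Program) -> dict:
    """Steps performed by terminal device 301 via its control module."""
    first = program.capture("first")              # image of the first area
    if not program.in_use_second:                 # second area unused:
        first["overlays"].append("second-icon")   # show only its icon
    return first                                  # sent to server 302, which
                                                  # forwards it to the matched
                                                  # first display device

frame = terminal_flow(Program(in_use_second=False))
```

When the second area is in use, no icon is overlaid and the first interface is sent as captured.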
In one embodiment, the screen projection control module in the system 30 is further used for:
acquiring an icon of the second interface and a display position of the icon, wherein the display position is used for indicating the position of the icon in the first interface;
according to the display position of the icon and a preset display rule, performing superposition processing on the icon on a first interface;
and displaying the first interface subjected to the superposition processing in a first display area.
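As a concrete illustration of the superposition step, the sketch below pastes an icon onto a first interface modeled as a 2-D pixel grid; the "preset display rule" of the disclosure is unspecified, so it is reduced here to the assumption that the icon is clamped inside the interface bounds. All names are illustrative.

```python
def overlay_icon(interface, icon, pos):
    """Paste `icon` (list of rows) onto `interface` at (row, col) = pos."""
    rows, cols = len(interface), len(interface[0])
    r0 = min(max(pos[0], 0), rows - len(icon))      # clamp icon inside the
    c0 = min(max(pos[1], 0), cols - len(icon[0]))   # interface (assumed rule)
    for dr, icon_row in enumerate(icon):
        for dc, px in enumerate(icon_row):
            interface[r0 + dr][c0 + dc] = px
    return interface                                 # superimposed interface,
                                                     # shown in the first area

canvas = [[0] * 4 for _ in range(4)]
out = overlay_icon(canvas, [[1, 1], [1, 1]], (1, 1))
```

A production implementation would operate on real image buffers with alpha blending, but the clamp-then-paste structure would be the same.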
In one embodiment, the screen projection control module in the system 30 is further used for:
acquiring a display interface of a target application, and dividing the display interface into N sub-display areas, wherein N is greater than 1;
and determining M display devices corresponding to the N sub-display areas according to a preset display rule, wherein M is less than or equal to N, and the display devices are used for displaying the images of the sub-display areas.
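A sketch of this partitioning, dividing a display interface of a given height into N horizontal sub-areas and mapping them onto M ≤ N display devices: since the disclosure does not specify the preset display rule, the round-robin assignment below is an assumption for illustration.

```python
def split_and_assign(height: int, n: int, m: int):
    """Split [0, height) into n bands and map each band to one of m devices."""
    if not (n > 1 and 1 <= m <= n):
        raise ValueError("requires N > 1 and M <= N")
    # integer band boundaries: band i covers rows [i*h//n, (i+1)*h//n)
    bounds = [(i * height // n, (i + 1) * height // n) for i in range(n)]
    # assumed rule: sub-area i goes to device i mod m (round robin)
    assignment = {i: i % m for i in range(n)}
    return bounds, assignment

# e.g. the fig. 3b case: desktop split into an upper and a lower half
bounds, assignment = split_and_assign(height=1080, n=2, m=2)
```

With M < N, several sub-areas share one device, which matches the claim's "M is less than or equal to N".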
In one embodiment, the system 30 further comprises a display device comprising a receiving device;
and the display device displays the first interface sent by the server through the receiving device.
In one embodiment, the system 30 further comprises a collecting device for collecting the first interface of the target program in the first display area;
and sending the first interface to the server.
As shown in fig. 3b, in a practical application of the system provided by the present disclosure, when a user chooses to project the upper half of the display interface of the intelligent terminal, the screen-projection control module divides the desktop displayed by the terminal into an upper part and a lower part. To capture the upper half of the picture, the screen-projection control module instructs the control server, the control server notifies the intelligent acquisition end to acquire the picture, and the intelligent acquisition end responds after receiving the instruction sent by the server. The intelligent acquisition end transmits the acquired interface content to the transmitter through a high-definition interface. The transmitter receives the acquired interface content, encodes it, and then notifies the control server. After receiving the transmitter's notification, the control server sends a notification to the intelligent receiver so that the encoded interface content can be delivered to it. The receiver responds to the transmitter through the control server according to the notification sent by the server. On receiving the receiver's response, the transmitter sends the processed encoded information to the intelligent receiver over a wireless network protocol; the intelligent receiver decodes the received encoded information, transmits the decoded picture content to the intelligent display component through a high-definition interface, and the intelligent display component displays the picture content received from the intelligent receiver through its display element, thereby completing the delivery of the picture content.
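The data path in fig. 3b reduces to capture, encode, transmit, decode, display, with the control server carrying only notifications. The sketch below stubs the encode/decode pair with `zlib` purely to make the round trip concrete; the class names and the use of `zlib` are assumptions, not the actual video codec or protocol of the disclosure.

```python
import zlib

class Transmitter:
    """Stand-in for the transmitter that encodes captured interface content."""
    def encode(self, frame: bytes) -> bytes:
        return zlib.compress(frame)      # placeholder for real video encoding

class Receiver:
    """Stand-in for the intelligent receiver that decodes for the display."""
    def decode(self, payload: bytes) -> bytes:
        return zlib.decompress(payload)  # placeholder for real decoding

def project(frame: bytes) -> bytes:
    payload = Transmitter().encode(frame)   # transmitter encodes and notifies
                                            # the control server (omitted here)
    return Receiver().decode(payload)       # receiver decodes and hands the
                                            # picture to the display component

shown = project(b"upper-half-of-desktop")
```

The essential property is that the displayed frame equals the captured one; the wireless transport and server-mediated handshake sit between the two calls.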
According to the image processing system provided by the embodiment of the disclosure, the terminal device can divide the display interface of an application program into a plurality of areas, determine according to a preset display strategy whether each display area is to be shown by screen projection, acquire images of the areas where projection display is allowed, and send them to the display devices corresponding to those areas; from an area where projection display is not allowed, only icon information is acquired, and that icon is displayed in an area where projection is allowed. The system can thus deliver part of a screen or picture, and can also deliver the pictures and content the user wants to project to the display components in a chosen order or at chosen times. This improves the flexibility of image display, meets customers' demands for intelligent use, greatly facilitates users, and can be widely applied to unmanned meeting rooms, unmanned classrooms, commercial advertising, and the like.
Based on the image processing method described in the embodiment corresponding to fig. 1 and fig. 2, an embodiment of the present disclosure further provides a computer-readable storage medium, for example, the non-transitory computer-readable storage medium may be a Read Only Memory (ROM), a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like. The storage medium stores computer instructions for executing the image processing method described in the embodiment corresponding to fig. 1 and fig. 2, which is not described herein again.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. An image processing method, characterized in that the method comprises:
when the running of a target program is detected, acquiring an interface of the target program, and determining a display area corresponding to the interface, wherein the display area at least comprises a first display area and a second display area;
acquiring a first interface of the target program in the first display area, and sending the first interface to first display equipment matched with the first display area;
and when the use state of the target program in the second display area is the unused state, acquiring an icon of a second interface of the target program in the second display area, and displaying the icon in the first interface.
2. The method of claim 1, further comprising:
acquiring a second interface of the target program in the second display area;
when the second interface changes within a preset time, determining that the use state of the second display area is the in-use state;
when the second interface does not change within the preset time, determining that the use state of the second display area is the unused state;
or,
when it is detected within the preset time that the target application receives an operation instruction of a user in the second display area, determining that the use state of the second display area is the in-use state;
and when it is not detected within the preset time that the target application receives an operation instruction of a user in the second display area, determining that the use state of the second display area is the unused state.
3. The method of claim 1, further comprising:
extracting a functional module of the target program in a second interface in the second display area;
generating an icon of the second interface according to a function module in the second interface, wherein the icon is used for triggering the function module;
alternatively,
extracting a message list of the target program in a second interface in the second display area;
and generating an icon of the second interface according to the message list in the second interface, wherein the icon is used for triggering and displaying the message list.
4. The method of claim 2, wherein displaying the icon in the first interface comprises:
acquiring an icon of the second interface and a display position of the icon, wherein the display position is used for indicating the position of the icon in the first interface;
according to the display position of the icon and a preset display rule, performing superposition processing on the icon on a first interface;
and displaying the overlapped first interface in a first display area.
5. The method of claim 1, further comprising:
when a first operation instruction is detected in the first display area, transferring an icon of the second interface to the second display area, and acquiring a third interface of the target application after the target application responds to the first operation instruction in the second display area, wherein the first operation instruction is an instruction directed at the icon of the second interface;
displaying the third interface in the second display area.
6. The method of claim 1, further comprising:
acquiring a display interface of a target application, and dividing the display interface into N sub-display areas, wherein N is greater than 1;
and determining M display devices corresponding to the N sub-display areas according to a preset display rule, wherein M is less than or equal to N, and the display devices are used for displaying the images of the sub-display areas.
7. An image processing system, comprising: the system comprises a server and terminal equipment, wherein the terminal equipment is provided with a screen projection control module;
the terminal device is used for determining a display area corresponding to the target program through the screen projection control module when the target operation is detected, wherein the display area at least comprises a first display area and a second display area;
acquiring a first interface of the target program in the first display area;
when the use state of the target program in the second display area is the unused state, acquiring an icon of a second interface of the target program in the second display area;
after the icon of the second interface is displayed in the first interface, the first interface is sent to a server;
and the server is used for receiving the first interface and sending the first interface to first display equipment matched with the first display area.
8. The system of claim 7, further comprising a display device, the display device comprising a receiving device;
and the display equipment displays the first interface sent by the server through the receiving equipment.
9. The system of claim 7, wherein the screen projection control module is further configured for:
acquiring an icon of the second interface and a display position of the icon, wherein the display position is used for indicating the position of the icon in the first interface;
according to the display position of the icon and a preset display rule, performing superposition processing on the icon on a first interface;
and displaying the overlapped first interface in a first display area.
10. The system of claim 7, wherein the screen projection control module is further configured for:
acquiring a display interface of a target application, and dividing the display interface into N sub-display areas, wherein N is greater than 1;
and determining M display devices corresponding to the N sub-display areas according to a preset display rule, wherein M is less than or equal to N, and the display devices are used for displaying the images of the sub-display areas.
CN202110882840.5A 2021-08-02 2021-08-02 Image processing method and system Pending CN113778356A (en)


Publications (1)

Publication Number Publication Date
CN113778356A true CN113778356A (en) 2021-12-10

Family

ID=78836552



Legal Events

Date Code Title Description
PB01 Publication