CN111064894A - Image processing method, terminal device and computer-readable storage medium - Google Patents

Image processing method, terminal device and computer-readable storage medium

Info

Publication number
CN111064894A
Authority
CN
China
Prior art keywords
terminal
image
information
image processing
image information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911385885.0A
Other languages
Chinese (zh)
Inventor
冯启哲
周凡贻
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Chuanying Information Technology Co Ltd
Original Assignee
Shanghai Chuanying Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Chuanying Information Technology Co Ltd
Priority to CN201911385885.0A
Publication of CN111064894A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

The invention provides an image processing method, a terminal device and a computer-readable storage medium. The image processing method comprises the following steps: a first terminal collects first image information and receives second image information; a preview image is synthesized according to the first image information and the second image information; and the preview image is output. While collecting its own image information, the terminal device can receive image information sent by other terminal devices and integrate all of it into a group photo, thereby sharing the image resources of multiple devices, solving the problem that the image processing mode of current mobile phones is limited, and making image processing on the mobile phone more engaging.

Description

Image processing method, terminal device and computer-readable storage medium
Technical Field
The present invention relates to the field of mobile communications, and in particular, to an image processing method, a terminal device, and a computer-readable storage medium.
Background
Augmented Reality (AR) is a technology that calculates the position and angle of a camera image in real time at the actual scene and overlays corresponding images, videos and 3D models on it, with the aim of fitting the virtual world onto the real world on a screen and interacting with it. Current mobile phones can apply this technology in the camera application to add special effects to human faces, and the photos with the special effects can be saved when the user needs them. However, the processed picture only reflects the image shot by the current device, so the image processing mode is limited.
Disclosure of Invention
The main object of the invention is to provide an image processing method, a terminal device and a computer-readable storage medium, so as to solve the problem that the image processing mode of current mobile phones is limited.
In order to achieve the above object, the present invention provides an image processing method, including:
the method comprises the steps that a first terminal collects first image information and receives second image information;
synthesizing a preview image according to the first image information and the second image information;
and outputting the preview image.
Optionally, the step of the first terminal acquiring the first image information includes:
the method comprises the steps that a first terminal shoots a first image and extracts first feature information of the first image;
acquiring a first image processing resource matched with the first image;
and taking the first feature information and the first image processing resource as the first image information.
Optionally, the step of outputting the preview image includes:
and displaying the preview image and/or sending the preview image to a second terminal.
Optionally, before the step of acquiring the first image information and receiving the second image information by the first terminal, the method further includes:
the first terminal establishes connection with the second terminal;
and if the first terminal meets the preset condition, executing the steps of acquiring first image information and receiving second image information by the first terminal.
Optionally, the preset condition includes at least one of the following conditions:
the memory of the first terminal is larger than that of the second terminal;
the processor load of the first terminal is greater than the processor load of the second terminal.
In order to achieve the above object, the present invention further provides an image processing method, including:
the second terminal sends the collected second image information to the first terminal, so that the first terminal synthesizes the collected first image information and the received second image information into a preview image and sends the preview image to the second terminal;
receiving the preview image sent by the first terminal;
and displaying the second image information and/or the preview image.
Optionally, the step of sending, by the second terminal, the acquired second image information to the first terminal includes:
the second terminal shoots a second image and extracts second feature information of the second image;
acquiring a second image processing resource matched with the second image;
taking the second feature information and the second image processing resource as the second image information;
and sending the second image information to the first terminal.
Optionally, before the step of sending, by the second terminal, the acquired second image information to the first terminal, so that the first terminal synthesizes the preview image, the method further includes:
the second terminal establishes connection with the first terminal;
and if the second terminal does not meet the preset condition, executing the step that the second terminal sends the collected second image information to the first terminal.
Optionally, the preset condition includes at least one of the following conditions:
the memory of the second terminal is smaller than that of the first terminal;
the processor load of the second terminal is less than the processor load of the first terminal.
In order to achieve the above object, the present invention further proposes a terminal device, which includes a memory, a processor, and a control program of an image processing method that is stored on the memory and executable on the processor, wherein the control program, when executed by the processor, implements the steps of the image processing method described above.
To achieve the above object, the present invention also proposes a computer-readable storage medium having stored thereon a control program of an image processing method which, when executed by a processor, implements the steps of the image processing method described above.
The technical solution of the invention is that first image information is collected by a first terminal, and second image information sent by a second terminal is received; a preview image is synthesized according to the first image information and the second image information; and the preview image is output. While collecting its own image information, the terminal device can receive image information sent by other terminal devices and integrate all of it into a group photo, so that the image resources of multiple devices are shared, the problem that the image processing mode of current mobile phones is limited is solved, and image processing on the mobile phone becomes more engaging.
Drawings
Fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a first embodiment of an image processing method according to the present invention;
FIG. 3 is a flowchart illustrating a second embodiment of an image processing method according to the present invention;
FIG. 4 is a flowchart illustrating a third embodiment of an image processing method according to the present invention;
FIG. 5 is a flowchart illustrating a fourth embodiment of an image processing method according to the present invention;
FIG. 6 is a flowchart illustrating an image processing method according to a fifth embodiment of the present invention;
FIG. 7 is a flowchart illustrating an image processing method according to a sixth embodiment of the present invention;
fig. 8 is a flowchart illustrating an image processing method according to a seventh embodiment of the invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, if directional indications (such as up, down, left, right, front, and back) are involved in the embodiment of the present invention, the directional indications are only used for explaining the relative positional relationship, the motion situation, and the like between the components in a certain posture, and if the certain posture is changed, the directional indications are changed accordingly.
In addition, the technical solutions of the various embodiments may be combined with each other, but only on the basis that a person skilled in the art can realize the combination; when technical solutions are contradictory or cannot be realized, the combination should be considered not to exist and does not fall within the protection scope of the present invention.
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The main solution of the embodiment of the invention is as follows: the method comprises the steps that a first terminal collects first image information and receives second image information sent by a second terminal; synthesizing a preview image according to the first image information and the second image information; and outputting the preview image.
In the prior art, the image processing mode of a mobile phone is limited.
The invention provides an image processing method, which comprises the following steps: a first terminal collects first image information and receives second image information sent by a second terminal; a preview image is synthesized according to the first image information and the second image information; and the preview image is output. This solves the technical problem that the image processing mode of existing mobile phones is limited.
As shown in fig. 1, fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention.
The terminal of the embodiment of the invention can be a mobile phone, and can also be a mobile intelligent terminal with a shooting function, such as a tablet and the like.
As shown in fig. 1, the terminal may include: a processor 1001, such as a CPU, a network interface 1004, a camera 1003, a memory 1005, a communication bus 1002. Wherein a communication bus 1002 is used to enable connective communication between these components. The memory 1005 may be a high-speed RAM memory, or may be an NVM (non-volatile memory), such as a disk memory. The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer-readable storage medium, may include therein a control program of an operating system, a network communication module, and an image processing method.
In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a backend server, and performing data communication with the backend server, or directly connecting to a client (user end) and performing data communication; the user interface 1003 is mainly used for connecting a client (user side) and performing data communication with the client; and the processor 1001 may be configured to call up a control program of the image processing method stored in the memory 1005, and perform the following operations:
the method comprises the steps that a first terminal collects first image information and receives second image information;
synthesizing a preview image according to the first image information and the second image information;
and outputting the preview image.
Further, the processor 1001 may call a control program of the image processing method stored in the memory 1005, and also perform the following operations:
the method comprises the steps that a first terminal shoots a first image and extracts first feature information of the first image;
acquiring a first image processing resource matched with the first image;
and taking the first feature information and the first image processing resource as the first image information.
Further, the processor 1001 may call a control program of the image processing method stored in the memory 1005, and also perform the following operations:
and displaying the preview image and/or sending the preview image to a second terminal.
Further, the processor 1001 may call a control program of the image processing method stored in the memory 1005, and also perform the following operations:
the first terminal establishes connection with the second terminal;
and if the first terminal meets the preset condition, executing the steps of acquiring first image information and receiving second image information by the first terminal.
The processor 1001 may also be configured to call a control program of an image processing method stored in the memory 1005, and perform the following operations:
the second terminal sends the collected second image information to the first terminal, so that the first terminal synthesizes the collected first image information and the received second image information into a preview image and sends the preview image to the second terminal;
receiving the preview image sent by the first terminal;
and displaying the second image information and/or the preview image.
Further, the processor 1001 may call a control program of the image processing method stored in the memory 1005, and also perform the following operations:
the second terminal shoots a second image and extracts second feature information of the second image;
acquiring a second image processing resource matched with the second image;
taking the second feature information and the second image processing resource as the second image information;
and sending the second image information to the first terminal.
Further, the processor 1001 may call a control program of the image processing method stored in the memory 1005, and also perform the following operations:
the second terminal establishes connection with the first terminal;
and if the second terminal does not meet the preset condition, executing the step that the second terminal sends the collected second image information to the first terminal.
Based on the hardware architecture, the embodiment of the image processing method is provided.
Referring to fig. 2, fig. 2 is a first embodiment of the image processing method of the present invention, which includes the steps of:
step S10, the first terminal collects the first image information and receives the second image information;
In this embodiment, there is one first terminal, which performs centralized processing on the image information collected by the local terminal and the image information received from the remaining terminals. The first terminal may be a mobile intelligent device with a photographing function, such as a smart phone or a tablet. When the user opens the camera function of the local device and switches to the image processing photographing mode, the first terminal collects image information; the collected image information includes at least one of an image shot through the camera and an image processing resource, such as an AR special effect, matched by the system and available for the user to select. The image shot by the camera contains information of one or more persons.
In this embodiment, the second image information is image information acquired by a plurality of terminals other than the first terminal, and the plurality of terminals other than the first terminal and the first terminal may be terminal devices having the same functions of taking pictures and the like. In the case that there are a plurality of other terminals except the first terminal, that is, a second terminal, a third terminal, a fourth terminal, and so on, the third terminal corresponds to the third image information, the fourth terminal corresponds to the fourth image information, and so on, so the first terminal can receive not only the second image information sent by the second terminal, but also the third image information sent by the third terminal, the fourth image information sent by the fourth terminal, and corresponding image information sent by more terminals. For convenience of presentation, the remaining plurality of terminal devices are collectively referred to as a second terminal, and the number of the second terminals is not limited to one, so that the second image information also includes image information collected by the plurality of terminals and sent to the first terminal.
Step S20, synthesizing a preview image according to the first image information and the second image information;
In this embodiment, the first terminal performs centralized processing on the first image information acquired by the local terminal and the received second image information, and synthesizes a preview image that contains all of the image information. The first terminal may process the image information as follows. If the image information only contains images shot through the cameras, the first terminal integrates the person information from its own image and from the received images into one picture, adjusts the sizes of the persons and rearranges their positions so that all persons in the synthesized image are in proportion, and finally forms a preview picture. If the image information contains both images shot by the cameras and image processing resources selected by the users, the first terminal superimposes the local and received person information with the corresponding image processing resources to form person images with special effects, integrates them into one picture, adjusts the sizes of the persons and rearranges their positions so that the persons in the synthesized image are in proportion, and finally forms a preview picture. It is to be understood that the above modes are only examples of synthesizing the preview image, and the manner of combining the first image and the second image into the preview image is not limited to them.
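As an illustration of step S20 only, the sketch below composites person regions from two captured frames into a single preview, resizing them to the same height and placing them side by side. It uses the Pillow library and assumes the person regions have already been located; the crop boxes and file names are placeholders, not part of the application.

```python
from PIL import Image

def compose_preview(first_frame, second_frame, first_box, second_box, canvas_size=(1280, 720)):
    """Paste the person regions from two captured frames side by side into one preview image."""
    canvas = Image.new("RGB", canvas_size, color="white")
    target_h = canvas_size[1]

    crops = []
    for frame, box in ((first_frame, first_box), (second_frame, second_box)):
        crop = frame.crop(box)                              # cut out the person region
        scale = target_h / crop.height                      # harmonize sizes: same height for every person
        crops.append(crop.resize((int(crop.width * scale), target_h)))

    # Rearrange positions: place the persons next to each other, centered horizontally.
    total_w = sum(c.width for c in crops)
    x = max((canvas_size[0] - total_w) // 2, 0)
    for crop in crops:
        canvas.paste(crop, (x, 0))
        x += crop.width
    return canvas

# Example usage with placeholder files and boxes:
# preview = compose_preview(Image.open("first.jpg"), Image.open("second.jpg"),
#                           (100, 50, 500, 650), (80, 40, 480, 640))
# preview.save("preview.jpg")
```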
Step S30, the preview image is output.
In this embodiment, after the first terminal synthesizes the preview image, the synthesized preview image is output to be displayed by the first terminal and/or the second terminal.
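The application only states that the preview is displayed and/or sent to the second terminal. As a minimal sketch, assuming a plain TCP connection between the terminals (the host address and port are placeholders), the composed preview could be pushed to a peer as JPEG bytes like this:

```python
import io
import socket
import struct

def send_preview(preview_image, host, port):
    """Encode the preview as JPEG and push it to a peer terminal over a plain TCP connection."""
    buf = io.BytesIO()
    preview_image.save(buf, format="JPEG")              # Pillow image -> JPEG bytes
    payload = buf.getvalue()

    with socket.create_connection((host, port)) as conn:
        conn.sendall(struct.pack("!I", len(payload)))    # 4-byte length prefix so the peer knows how much to read
        conn.sendall(payload)

# send_preview(preview, "192.168.43.2", 5000)   # placeholder address of the second terminal
```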
In this embodiment, one terminal device performs centralized processing on the image information acquired by the local terminal and the image information sent by the remaining terminals. While collecting its own image information, the terminal device can receive image information sent by other terminal devices and integrate all of it into a group photo, which shares the image resources of multiple devices and solves the problem that the image processing mode of current mobile phones is limited.
Referring to fig. 3, fig. 3 is a second embodiment of the image processing method according to the present invention, and based on the first embodiment, in step S10, the first terminal acquires first image information, including:
Step S11, the first terminal shoots a first image and extracts first feature information of the first image;
In this embodiment, after the first terminal captures an image through the camera, the system automatically calls built-in image processing software to perform initial processing on the image, such as cropping and segmentation, and then analyzes the feature information of the processed image; specifically, the face information and body feature information in the image are identified, extracted and used as the first feature information.
In this embodiment, since the process of capturing the image by the first terminal is performed in real time, the first feature information is also updated in real time according to the captured image.
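A minimal sketch of this feature-extraction step is shown below. It uses OpenCV's bundled Haar cascade as a stand-in for whatever built-in image processing software the terminal actually calls; the choice of detector and the returned field names are assumptions, and body features would be handled analogously.

```python
import cv2

def extract_first_feature_info(frame_bgr):
    """Detect faces in a captured frame and return their bounding boxes as the 'feature information'."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # One feature entry per detected person; an empty list means no person was found in the frame.
    return [{"person_id": i, "face_box": tuple(map(int, box))} for i, box in enumerate(faces)]
```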
Step S12, acquiring a first image processing resource matched with the first image;
in this embodiment, after the first terminal captures an image, or during the process of capturing the image, a user may select an image processing resource from a menu bar disposed at any position of a current screen as a first image processing resource of the first terminal; or, in a default state, the system may also automatically match the corresponding image processing resource as the first image processing resource of the first terminal.
In this embodiment, any terminal device is configured with corresponding image processing resources, and for convenience of distinction, the image processing resources are numbered to correspond to the terminal device to which the image processing resources belong, that is, the first image processing resource corresponds to the first image acquired by the first terminal, and similarly, the second image processing resource corresponds to the second image acquired by the second terminal, and so on. The image processing resources include character special effects resources, such as AR special effects. The image processing resources are selected from an image processing resource library preset by the system, and the image processing resources in the image processing resource library can be downloaded and updated through network connection so as to expand the types and the number of the image processing resources.
In this embodiment, since the process of capturing an image by the first terminal is performed in real time, the first feature information is updated in real time according to the difference of the captured image, and in this case, the first image processing resource is also updated in such a manner that, when the system is in the user selection mode, the user can select different image processing resources in the menu bar at any time, and the system updates the first image resource in time according to the selection condition of the user; when the system is in a default state, after the system matches corresponding image processing resources for the initial shot image, if the figure of the shot image does not change, the initially matched image processing resources do not change, if the figure of the shot image changes, the initially matched image processing resources change, in the process, if the figure is increased, the system matches a new image processing resource for each newly added figure, and if the figure is decreased, the image processing resources of the existing figures in the image do not change.
Step S13, regarding the first feature information and the first image processing resource as the first image information.
In this embodiment, the first image information of the first terminal includes first feature information acquired by the first terminal and a first image processing resource matched with the first feature information. When the image shot by the first terminal contains a plurality of persons, each person has corresponding first characteristic information, for convenience of presentation, the first characteristic information corresponding to the person a is referred to as first characteristic information a, so that the first characteristic information corresponding to the person B is first characteristic information B, the first characteristic information corresponding to the person C is first characteristic information C, and so on, the first characteristic information a, the first characteristic information B, the first characteristic information C and so on are collectively referred to as first characteristic information. Then, the first image processing resource corresponding to the first feature information a is a first image processing resource a, the first image processing resource corresponding to the first feature information B is a first image processing resource B, the first image processing resource corresponding to the first feature information C is a first image processing resource C, and so on, and in the same way, the first image processing resource a, the first image processing resource B, the first image processing resource C, and so on are collectively referred to as a first image processing resource.
In the embodiment, the first image resources acquired by the first terminal are matched with the corresponding first image processing resources, so that the real image information is changed into images with different image expression effects with the assistance of the image processing resources, and the interestingness and flexibility of shooting are increased.
Referring to fig. 4, fig. 4 is a third embodiment of the image processing method according to the present invention, and based on the first or second embodiment, step S30 includes:
and step S31, displaying the preview image and/or sending the preview image to a second terminal.
In this embodiment, after the first terminal synthesizes the preview image, the synthesized preview image may be displayed on a local display screen and/or sent to the second terminal, so that a user of the second terminal may browse the preview image.
In this embodiment, the display screens of the first terminal and the second terminal may display only the preview image, or may display the preview image and the image information acquired by the local terminal in a split-screen manner. When the preview image is displayed in split screen, the user may zoom or drag on the display screen to adjust the size, scale and position of each display window.
Referring to fig. 5, fig. 5 is a diagram of a fourth embodiment of the image processing method according to the present invention, and based on any one of the first to third embodiments, the image processing method in this embodiment further includes:
step S40, the first terminal establishes connection with the second terminal;
in this embodiment, the connection between the first terminal and the second terminal may be established through any one of a bluetooth connection, a network data connection, and a wireless local area network connection, so that users of different terminals can transmit image data at a long distance or a short distance.
Step S50, if the first terminal meets a preset condition, the first terminal collects the first image information and receives the second image information.
In this embodiment, after the first terminal establishes a connection with the second terminal, it is determined whether the first terminal meets a preset condition. The preset condition comprises at least one of the following conditions: the memory of the first terminal is larger than that of the second terminal, and the processor load of the first terminal is larger than that of the second terminal.
In this embodiment, for convenience of description, the number of terminals is set to four, namely terminal 1, terminal 2, terminal 3 and terminal 4, although the actual number of terminals is not limited to four. When the terminals are connected through Bluetooth and terminal 1, terminal 2, terminal 3 and terminal 4 open Bluetooth at the same time and access the same shared network, each terminal system automatically acquires the memory information and/or processor load information of its local device and compares it with the memory information and processor load information of the other terminal devices. If the memory or processor load of terminal 1 is the largest, terminal 1 is determined as the main device, i.e. the first terminal; similarly, if the memory or processor load of terminal 2 is the largest, terminal 2 is determined as the main device, i.e. the first terminal. When the terminals are connected through Bluetooth but terminal 1, terminal 2, terminal 3 and terminal 4 open Bluetooth and access the same shared network at different times, each terminal system automatically acquires its own memory information and/or processor load information and, on entering the shared network, compares it with that of the terminal devices already in the shared network, so that the device with the largest memory and/or processor load is taken as the main device, i.e. the first terminal. For example, when terminal 1 and terminal 2 open Bluetooth at the same time and access the same shared network, if the memory or processor load of terminal 1 is greater than that of terminal 2, terminal 1 is determined as the main device, i.e. the first terminal. After terminal 3 enters the same shared network through a Bluetooth connection, the memory information and/or processor load information of terminal 3 and terminal 1 are compared: if the memory or processor load of terminal 3 is greater than that of terminal 1, terminal 3 is determined as the new main device; if it is less than that of terminal 1, terminal 1 continues to serve as the main device, i.e. the first terminal.
In this embodiment, when the terminals are connected through a wireless local area network (hotspot connection), one terminal device first acts as the hotspot sending end. When the other terminal devices join the same shared network as hotspot connection ends, the first terminal device that connects to the hotspot is compared with the hotspot sending end in terms of memory information and/or processor load information to determine the first terminal. Each terminal device that subsequently connects to the hotspot is then compared with the hotspot sending end: if its memory information and/or processor load information is greater than that of the hotspot sending end, it is further compared with the previously determined first terminal, and if its memory and/or processor load is also greater than that of the previously determined first terminal, the main device is replaced, i.e. the subsequently connected terminal device becomes the new first terminal. For example, terminal 1 acts as the hotspot sending end, and terminal 2 is the first to share the hotspot sent by terminal 1 and connect to the same network as terminal 1; the memory information and/or processor load information of terminal 2 and terminal 1 are compared, and if the memory or processor load of terminal 2 is greater than that of terminal 1, terminal 2 is determined as the main device, i.e. the first terminal. Then terminal 3 shares the hotspot sent by terminal 1 and connects to the same network; the memory information and/or processor load information of terminal 3 and terminal 1 are compared, and if the memory or processor load of terminal 3 is greater than that of terminal 1, terminal 3 is further compared with terminal 2, and if the memory or processor load of terminal 3 is greater than that of terminal 2, the main device is replaced, i.e. terminal 3 becomes the new first terminal.
In this embodiment, when the terminals are connected through network data, the first terminal is determined in the same way as in the Bluetooth connection mode.
In this embodiment, the memory information and/or processor load information of each device in the same local area network is compared to find the device with the largest memory and the best current performance as the main device, so that the device resources are fully utilized and the information processing efficiency is improved.
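One way to read the election rule described above: whichever connected device reports the largest memory (and/or processor figure) becomes the first terminal, and a later arrival replaces it only if it reports more. The sketch below is a plain illustration with made-up device metrics; it is not tied to any real Bluetooth or Wi-Fi API, and the metric names are assumptions.

```python
def elect_first_terminal(devices):
    """Pick the main device: the one with the largest (memory, processor) figures wins."""
    # devices: {"terminal1": {"memory": 8, "processor": 60}, ...}  -- placeholder metrics
    return max(devices, key=lambda d: (devices[d]["memory"], devices[d]["processor"]))

def on_device_joined(current_first, current_metrics, new_device, new_metrics):
    """A terminal joining the shared network later replaces the main device only if its figures are larger."""
    if (new_metrics["memory"], new_metrics["processor"]) > (current_metrics["memory"], current_metrics["processor"]):
        return new_device, new_metrics
    return current_first, current_metrics
```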
Referring to fig. 6, fig. 6 is a fifth embodiment of the image processing method of the present invention, which includes the steps of:
step S60, the second terminal sends the collected second image information to the first terminal, so that the first terminal can synthesize the collected first image information and the received second image information into a preview image and send the preview image to the second terminal;
In this embodiment, the user opens the camera function of the local device and switches to the image processing photographing mode described in this application; the image information acquired by the second terminal includes at least one of an image photographed by the camera and an image processing resource, such as an AR special effect, matched by the system and available for the user to select. The image shot by the camera contains information of one or more persons.
In this embodiment, the number of second terminals is not limited to one and may include several terminals. When there are several terminals, that is, a second terminal, a third terminal, a fourth terminal and so on, the third terminal corresponds to third image information, the fourth terminal corresponds to fourth image information, and so forth, so that the second terminal, the third terminal, the fourth terminal and so on can each send the correspondingly acquired second image information, third image information, fourth image information and so on to the first terminal.
Step S70, receiving the preview image sent by the first terminal;
step S80, displaying the second image information and/or the preview image.
In this embodiment, after the second terminal receives the preview image synthesized by the first terminal, the display screen of the second terminal may display only the preview image, or may display the preview image and the image information acquired by the second terminal in a split-screen manner. When the preview image is displayed in split screen, the user may zoom or drag on the display screen to adjust the size, scale and position of each display window.
Referring to fig. 7, fig. 7 shows a sixth embodiment of the image processing method according to the present invention, and based on the fifth embodiment, step S60 includes:
Step S61, the second terminal shoots a second image and extracts second feature information of the second image;
In this embodiment, after the second terminal captures an image through the camera, the system automatically calls built-in image processing software to perform initial processing on the image, such as cropping and segmentation, and then analyzes the feature information of the processed image; specifically, the face information and body feature information in the image are identified, extracted and used as the second feature information.
In this embodiment, since the process of capturing the image by the second terminal is performed in real time, the second feature information is also updated in real time according to the captured image.
Step S62, acquiring a second image processing resource matched with the second image;
In this embodiment, after the second terminal captures an image, or during the process of capturing it, the user may select an image processing resource from a menu bar disposed at any position of the current screen as the second image processing resource of the second terminal; alternatively, in the default state, the system may automatically match a corresponding image processing resource as the second image processing resource of the second terminal. The image processing resources include person special-effect resources, such as AR special effects. The image processing resources are selected from an image processing resource library preset by the system, and the resources in this library can be downloaded and updated over a network connection to expand their variety and number.
In this embodiment, since the process of capturing the image by the second terminal is performed in real time, the second feature information is updated in real time according to the difference of the captured image, and in this case, the second image processing resource is also updated in such a manner that, when the system is in the user selection mode, the user can select different image processing resources in the menu bar at any time, and the system updates the second image resource in time according to the selection condition of the user; when the system is in a default state, after the system matches corresponding image processing resources for the initial shot image, if the figure of the shot image does not change, the initially matched image processing resources do not change, if the figure of the shot image changes, the initially matched image processing resources change, in the process, if the figure is increased, the system matches a new image processing resource for each newly added figure, and if the figure is decreased, the image processing resources of the existing figures in the image do not change.
Step S63, taking the second feature information and the second image processing resource as the second image information;
In this embodiment, the second image information of the second terminal includes the second feature information acquired by the second terminal and the second image processing resource matched with the second feature information. When the image shot by the second terminal contains several persons, each person has corresponding second feature information. For convenience of presentation, the second feature information corresponding to person A is referred to as second feature information A, the second feature information corresponding to person B as second feature information B, the second feature information corresponding to person C as second feature information C, and so on; second feature information A, second feature information B, second feature information C and so on are collectively referred to as the second feature information. Correspondingly, the second image processing resource corresponding to second feature information A is second image processing resource A, that corresponding to second feature information B is second image processing resource B, that corresponding to second feature information C is second image processing resource C, and so on; in the same way, second image processing resource A, second image processing resource B, second image processing resource C and so on are collectively referred to as the second image processing resource.
Step S64, sending the second image information to the first terminal.
In this embodiment, the second terminal sends the collected second image information to the first terminal for centralized processing, so that the first terminal synthesizes a preview image by using the second image information.
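A sketch of this sending step, under the same assumptions as the earlier message-format example: the second terminal serialises its feature information and chosen effect identifiers as JSON and pushes them to the first terminal over a plain TCP connection. The field names, terminal identifier and transport are illustrative, not mandated by the application.

```python
import json
import socket
import struct

def send_second_image_info(features, effect_ids, host, port):
    """Serialise the second terminal's feature info and effect choices and send them to the first terminal."""
    message = json.dumps({
        "terminal_id": "terminal2",      # placeholder identifier
        "features": features,            # e.g. the list returned by the face-detection sketch above
        "effect_ids": effect_ids,        # person_id -> selected image processing resource
    }).encode("utf-8")

    with socket.create_connection((host, port)) as conn:
        conn.sendall(struct.pack("!I", len(message)))   # length prefix, matching the preview-sending sketch
        conn.sendall(message)
```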
In this embodiment, the second image acquired by the second terminal is matched with the corresponding second image processing resource, so that with the assistance of the image processing resource the real image information is turned into images with different expressive effects, which makes shooting more interesting and flexible.
Referring to fig. 8, fig. 8 is a seventh embodiment of the image processing method according to the present invention, and based on the fifth or sixth embodiment, the image processing method further includes:
step S90, the second terminal establishes connection with the first terminal;
in this embodiment, the connection between the second terminal and the first terminal may be established through any one of bluetooth connection, network data connection, and wireless lan connection, so that users of different terminals can transmit image data at a long distance or a short distance.
Step S100, if the second terminal does not meet the preset condition, the second terminal sends the acquired second image information to the first terminal.
In this embodiment, after the second terminal establishes a connection with the first terminal, it is determined whether the second terminal meets a preset condition. The preset condition comprises at least one of the following conditions: the memory of the second terminal is smaller than that of the first terminal; the processor of the second terminal is less loaded than the first terminal.
In this embodiment, for convenience of description, the number of terminals is set to four, namely terminal 1, terminal 2, terminal 3 and terminal 4, although the actual number of terminals is not limited to four. When the terminals are connected through Bluetooth and terminal 1, terminal 2, terminal 3 and terminal 4 open Bluetooth at the same time and access the same shared network, each terminal system automatically acquires the memory information and/or processor load information of its local device and compares it with the memory information and processor load information of the other terminal devices. If the memory or processor load of terminal 1 is the largest, terminal 1 is determined as the main device, i.e. the first terminal, and the other terminal devices (terminal 2, terminal 3 and terminal 4) serve as the second terminal; similarly, if the memory or processor load of terminal 2 is the largest, terminal 2 is determined as the main device, i.e. the first terminal, and the other terminal devices (terminal 1, terminal 3 and terminal 4) serve as the second terminal. When the terminals are connected through Bluetooth but open Bluetooth and access the same shared network at different times, each terminal system automatically acquires its own memory information and/or processor load information and, on entering the shared network, compares it with that of the terminal devices already in the shared network, so that the device with the largest memory and/or processor load is taken as the main device, i.e. the first terminal, while the other terminal devices serve as the second terminal. For example, when terminal 1 and terminal 2 open Bluetooth at the same time and access the same shared network, if the memory or processor load of terminal 2 is smaller than that of terminal 1, terminal 1 is determined as the main device, i.e. the first terminal, and terminal 2 serves as the second terminal. After terminal 3 enters the same shared network through a Bluetooth connection, the memory information and/or processor load information of terminal 3 and terminal 1 are compared: if the memory or processor load of terminal 1 is smaller than that of terminal 3, terminal 3 is determined as the new main device and terminal 1 becomes the second terminal; if the memory or processor load of terminal 3 is smaller than that of terminal 1, terminal 1 continues to serve as the main device, i.e. the first terminal, and terminal 3 serves as the second terminal.
In this embodiment, when the terminals are connected through a wireless local area network (hotspot connection), one terminal device first acts as the hotspot sending end. When the other terminal devices join the same shared network as hotspot connection ends, the first terminal device that connects to the hotspot is compared with the hotspot sending end in terms of memory information and/or processor load information to determine the first terminal. Each terminal device that subsequently connects to the hotspot is then compared with the hotspot sending end: if its memory information and/or processor load information is smaller than that of the hotspot sending end, the subsequently connected terminal device serves as the second terminal; if it is greater than that of the hotspot sending end, it is further compared with the previously determined first terminal, and if its memory and/or processor load is also greater than that of the previously determined first terminal, the main device is replaced, i.e. the subsequently connected terminal device becomes the new first terminal. For example, terminal 1 acts as the hotspot sending end, and terminal 2 is the first to share the hotspot sent by terminal 1 and connect to the same network as terminal 1; the memory information and/or processor load information of terminal 2 and terminal 1 are compared, and if the memory or processor load of terminal 1 is smaller than that of terminal 2, terminal 2 is determined as the main device, i.e. the first terminal, and terminal 1 serves as the second terminal. Then terminal 3 shares the hotspot sent by terminal 1 and connects to the same network; the memory information and/or processor load information of terminal 3 and terminal 1 are compared, and if the memory or processor load of terminal 1 is smaller than that of terminal 3, terminal 3 is further compared with terminal 2, and if the memory or processor load of terminal 2 is smaller than that of terminal 3, the main device is replaced, i.e. terminal 3 becomes the new first terminal and terminal 2 serves as the second terminal.
In this embodiment, when the terminals are connected through network data, the first terminal is determined in the same way as in the Bluetooth connection mode.
In this embodiment, the memory information and/or processor load information of each device in the same local area network is compared to find the device with the largest memory and the best current performance as the main device, so that the device resources are fully utilized and the information processing efficiency is improved.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention, and all modifications and equivalents of the present invention, which are made by the contents of the present specification and the accompanying drawings, or directly/indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (11)

1. An image processing method, characterized by comprising the steps of:
the method comprises the steps that a first terminal collects first image information and receives second image information;
synthesizing a preview image according to the first image information and the second image information;
and outputting the preview image.
2. The image processing method of claim 1, wherein the step of the first terminal acquiring the first image information comprises:
the method comprises the steps that a first terminal shoots a first image and extracts first feature information of the first image;
acquiring a first image processing resource matched with the first image;
and taking the first feature information and the first image processing resource as the first image information.
3. The image processing method according to claim 1, wherein the step of outputting the preview image comprises:
and displaying the preview image and/or sending the preview image to a second terminal.
4. The image processing method of claim 3, wherein before the step of the first terminal acquiring the first image information and receiving the second image information, further comprising:
the first terminal establishes connection with the second terminal;
and if the first terminal meets the preset condition, executing the steps of acquiring first image information and receiving second image information by the first terminal.
5. The image processing method according to claim 4, wherein the preset condition includes at least one of the following conditions:
the memory of the first terminal is larger than that of the second terminal;
the processor load of the first terminal is greater than the processor load of the second terminal.
6. An image processing method, characterized by comprising the steps of:
the second terminal sends the collected second image information to the first terminal, so that the first terminal synthesizes the collected first image information and the received second image information into a preview image and sends the preview image to the second terminal;
receiving the preview image sent by the first terminal;
and displaying the second image information and/or the preview image.
7. The image processing method according to claim 6, wherein the step of the second terminal sending the acquired second image information to the first terminal comprises:
the second terminal shoots a second image and extracts second feature information of the second image;
acquiring a second image processing resource matched with the second image;
taking the second feature information and the second image processing resource as the second image information;
and sending the second image information to the first terminal.
8. The image processing method according to claim 6, wherein before the step of the second terminal sending the acquired second image information to the first terminal for the first terminal to compose the preview image, the method further comprises:
the second terminal establishes connection with the first terminal;
and if the second terminal does not meet the preset condition, executing the step that the second terminal sends the collected second image information to the first terminal.
9. The image processing method according to claim 6, wherein the preset condition includes at least one of the following conditions:
the memory of the second terminal is smaller than that of the first terminal;
the processor load of the second terminal is less than the processor load of the first terminal.
10. A terminal device, characterized by comprising a memory, a processor, and a control program of an image processing method stored on the memory and executable on the processor, the control program of the image processing method implementing the steps of the image processing method according to any one of claims 1 to 9 when executed by the processor.
11. A computer-readable storage medium, characterized in that a control program of an image processing method is stored on the computer-readable storage medium, which when executed by a processor implements the steps of the image processing method according to any one of claims 1 to 9.
CN201911385885.0A 2019-12-26 2019-12-26 Image processing method, terminal device and computer-readable storage medium Pending CN111064894A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911385885.0A CN111064894A (en) 2019-12-26 2019-12-26 Image processing method, terminal device and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911385885.0A CN111064894A (en) 2019-12-26 2019-12-26 Image processing method, terminal device and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN111064894A true CN111064894A (en) 2020-04-24

Family

ID=70304464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911385885.0A Pending CN111064894A (en) 2019-12-26 2019-12-26 Image processing method, terminal device and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN111064894A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104427228A (en) * 2013-08-22 2015-03-18 展讯通信(上海)有限公司 Collaborative shooting system and shooting method thereof
CN105323252A (en) * 2015-11-16 2016-02-10 上海璟世数字科技有限公司 Method and system for realizing interaction based on augmented reality technology and terminal
CN106412432A (en) * 2016-09-30 2017-02-15 维沃移动通信有限公司 Photographing method and mobile terminal
CN106572306A (en) * 2016-10-28 2017-04-19 北京小米移动软件有限公司 Image shooting method and electronic equipment
CN107197144A (en) * 2017-05-24 2017-09-22 珠海市魅族科技有限公司 Filming control method and device, computer installation and readable storage medium storing program for executing
CN108021896A (en) * 2017-12-08 2018-05-11 北京百度网讯科技有限公司 Image pickup method, device, equipment and computer-readable medium based on augmented reality
CN110020981A (en) * 2019-03-27 2019-07-16 阿里巴巴集团控股有限公司 A kind of image information processing method, device and electronic equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114095660A (en) * 2021-11-29 2022-02-25 Oppo广东移动通信有限公司 Image display method, image display device, storage medium and electronic equipment
CN114095660B (en) * 2021-11-29 2024-01-30 Oppo广东移动通信有限公司 Image display method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
US10009543B2 (en) Method and apparatus for displaying self-taken images
WO2021175055A1 (en) Video processing method and related device
JP2020507136A (en) VR object synthesizing method, apparatus, program, and recording medium
EP2420978A2 (en) Apparatus and method for providing object information
CN105554372B (en) Shooting method and device
EP1981254A2 (en) Communication control device and communication terminal
CN107197144A (en) Filming control method and device, computer installation and readable storage medium storing program for executing
EP2905945A1 (en) Inter-terminal image sharing method, terminal device and communication system
US10922042B2 (en) System for sharing virtual content and method for displaying virtual content
WO2023093438A1 (en) Image display method and apparatus, and electronic device and computer-readable storage medium
US9052866B2 (en) Method, apparatus and computer-readable medium for image registration and display
CN112333386A (en) Shooting method and device and electronic equipment
CN111083371A (en) Shooting method and electronic equipment
CN109788359B (en) Video data processing method and related device
CN113330453A (en) System and method for providing personalized video for multiple persons
CN106445440B (en) Screen sharing method and its terminal
CN109963106B (en) Video image processing method and device, storage medium and terminal
CN108961424B (en) Virtual information processing method, device and storage medium
CN111064894A (en) Image processing method, terminal device and computer-readable storage medium
EP1983751A2 (en) Control apparatus, mobile communications system, and communications terminal
US20150186095A1 (en) Inter-terminal image sharing method, terminal device, and communications system
CN111567034A (en) Exposure compensation method, device and computer readable storage medium
CN110545385A (en) image processing method and terminal equipment
CN109308740B (en) 3D scene data processing method and device and electronic equipment
JP6673459B2 (en) Image processing apparatus, image processing system, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination