CN112445398A - Method, electronic device and computer readable medium for editing pictures - Google Patents

Method, electronic device and computer readable medium for editing pictures

Info

Publication number
CN112445398A
CN112445398A (application CN201910831041.8A)
Authority
CN
China
Prior art keywords
picture
edited
area
user
mask layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910831041.8A
Other languages
Chinese (zh)
Inventor
张译方
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Zhangmen Science and Technology Co Ltd
Original Assignee
Shanghai Zhangmen Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Zhangmen Science and Technology Co Ltd filed Critical Shanghai Zhangmen Science and Technology Co Ltd
Priority to CN201910831041.8A priority Critical patent/CN112445398A/en
Publication of CN112445398A publication Critical patent/CN112445398A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/80 Creating or modifying a manually drawn or painted image using a manual input device, e.g. mouse, light pen, direction keys on keyboard

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the application disclose a method, an electronic device and a computer readable medium for editing pictures. One embodiment of the method for editing pictures includes: acquiring and presenting an original picture according to a picture acquisition operation performed by a user while a session interface of instant messaging is displayed; superimposing a semi-transparent mask layer on the original picture, the mask layer covering the original picture; determining an area to be edited based on the coverage area of a sliding operation performed by the user on the mask layer; and editing the original picture based on the area to be edited to generate an edited picture. This embodiment enables the user to select the shape and area to be edited more conveniently.

Description

Method, electronic device and computer readable medium for editing pictures
Technical Field
The present application relates to the field of computer technology, and in particular, to a method, an electronic device, and a computer-readable medium for editing a picture.
Background
Instant Messaging (IM) is a real-time communication system that allows two or more people to communicate over a network, transmitting text messages, files, voice, video, and the like in real time.
In the prior art, the options for sending a picture in a session of an IM tool differ by platform. If the IM tool is installed on a PC, the picture can be processed further (for example, cropped, or a key region selected) by dragging a rectangular frame that can be enlarged or reduced with the mouse. If the IM tool is installed on a mobile terminal and the user wants to edit the original picture before sending it, the user usually has to leave the IM tool, enter a separate image editing tool, adjust a slidable and resizable rectangular frame over the original picture, crop out the part of the original picture that overlaps the final rectangular frame, save the edited picture, return to the IM tool, and then select and send the edited picture. This approach is relatively cumbersome, and the available editing operations are limited.
Disclosure of Invention
The embodiment of the application provides a method, electronic equipment and a computer readable medium for editing pictures.
In a first aspect, some embodiments of the present application provide a method for editing a picture, the method comprising: acquiring and presenting an original picture according to a picture acquisition operation performed by a user while a session interface of instant messaging is displayed; superimposing a semi-transparent mask layer on the original picture, wherein the mask layer covers the original picture; determining an area to be edited based on the coverage area of a sliding operation of the user on the mask layer; and editing the original picture based on the area to be edited to generate an edited picture.
In a second aspect, some embodiments of the present application provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon which, when executed by one or more processors, cause the one or more processors to implement the method as described in the first aspect.
In a third aspect, some embodiments of the present application provide a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the method as described in the first aspect.
According to the method, the electronic device and the computer readable medium for editing a picture provided by the embodiments of the application, an original picture is acquired and presented during an IM session, a semi-transparent mask layer is superimposed on the original picture, an area to be edited is determined based on the coverage area of the user's sliding operation on the mask layer, and the original picture is then edited based on the area to be edited. This provides a more flexible and convenient picture editing scheme embedded in the IM session scenario, allowing the user to select the shape and area to be edited more conveniently.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram to which some embodiments of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for editing pictures according to the present application;
FIGS. 3A to 3C are schematic diagrams of an application scenario of a method for editing pictures according to the present application;
FIG. 4 is a schematic block diagram of an electronic device suitable for use in implementing some embodiments of the present application.
Detailed Description
The present application will be described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely intended to explain the relevant invention and not to limit it. It should also be noted that, for convenience of description, only the portions related to the relevant invention are shown in the drawings.
It should be noted that the embodiments in the present application and the features in the embodiments may be combined with each other provided there is no conflict. The present application will be described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the method for editing pictures of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminals 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminals 101, 102, 103 and the server 105. The network 104 may include various connection types, such as a wireless local area network, a mobile network, and so on.
The user may use the terminals 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages and the like. Various communication client applications, such as instant messaging applications, social applications, browser applications, shopping applications, search applications, and email clients, may be installed on the terminals 101, 102, 103.
The terminals 101, 102, 103 may be hardware or software. When the terminals 101, 102, 103 are hardware, they may be various electronic devices having a display screen and supporting instant messaging, including but not limited to smart phones, tablet computers, Personal Digital Assistants (PDAs), laptop computers, desktop computers, and the like. When the terminals 101, 102, 103 are software, they can be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module, which is not specifically limited herein.
The server 105 may be a server providing various services, such as a background server providing support for instant messaging applications running on the terminals 101, 102, 103. The background server may send the received session message to the designated terminal.
The server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed server cluster composed of a plurality of servers, or as a single server. When the server 105 is software, it may be implemented as multiple pieces of software or software modules (e.g., software or software modules used to provide distributed services) or as a single piece of software or software module, which is not specifically limited herein.
It should be noted that the method for editing pictures provided in the embodiments of the present application is generally executed by the terminals 101, 102, and 103.
It should be understood that the numbers of terminals, networks, and servers in fig. 1 are merely illustrative. There may be any suitable number of terminals, networks, and servers, as required by the implementation.
With continued reference to fig. 2, a flow 200 of one embodiment of a method for editing pictures in accordance with the present application is shown. The method for editing the picture can be applied to a terminal, and can comprise the following steps 201-204:
In step 201, an original picture is acquired and presented according to a picture acquisition operation performed by the user while a session interface of instant messaging is displayed.
In this embodiment, an IM application may be installed on the terminal, and the user may have a session with a social contact in a session interface of the IM application.
The executing body of the method for editing pictures (for example, the terminals 101, 102, 103 shown in fig. 1) may acquire an original picture according to a picture acquisition operation performed by the user while the above-mentioned session interface is displayed (i.e., while an IM session is in progress), and present the acquired original picture on the screen of the terminal. Here, the picture acquisition operation may refer to an operation by which the user obtains an original picture, for example a screenshot.
In some optional implementations of this embodiment, the picture acquisition operation may be an operation in which the user selects a picture from the session record as the original picture. For example, when a user A and a user B are having an IM session, user A selecting a picture from the historical session with user B as the original picture is such a picture acquisition operation.
In some optional implementations of this embodiment, the picture acquisition operation may be an operation in which the user selects a picture from a gallery (which may also be referred to as an album) as the original picture. Here, the gallery may include pictures stored on the terminal.
In some optional implementations of this embodiment, the picture acquisition operation may be an operation in which the user captures the interface currently displayed by the terminal and uses the resulting screenshot as the original picture.
Although the above implementations describe selecting an original picture from a session record, selecting an original picture from a gallery, and capturing an original picture through a screenshot, the application is not limited thereto. The original picture may also be acquired in other suitable ways, as will be appreciated by those skilled in the art.
In step 202, a semi-transparent mask layer is superimposed on the original picture, and the semi-transparent mask layer covers the original picture.
In this embodiment, the executing body of the method for editing pictures (e.g., the terminals 101, 102, 103 shown in fig. 1) may superimpose a semi-transparent mask layer on the original picture presented in step 201. The mask layer may completely cover the original picture, for example covering the entire display area of the screen. Since the mask layer is translucent, the user can still clearly observe the image content of the original picture through the mask layer.
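The following is a minimal sketch of this mask-overlay step, written in Python with Pillow purely for illustration; the patent does not prescribe any particular library, and the file name, mask colour and 50% opacity used here are assumptions.

```python
# Illustrative sketch of step 202 (assumed file name and mask colour).
from PIL import Image

original = Image.open("original.jpg").convert("RGBA")

# Mask layer the same size as the picture: solid colour at ~50% alpha,
# so the image content stays visible underneath.
mask_layer = Image.new("RGBA", original.size, (0, 0, 0, 128))

# Composite the translucent mask over the original for the on-screen preview.
preview = Image.alpha_composite(original, mask_layer)
preview.show()
```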
In step 203, the area to be edited is determined based on the coverage area of the sliding operation of the user on the mask layer.
In this embodiment, the user may perform a sliding operation on the mask layer superimposed in step 202, for example by sliding (or smearing) continuously over the mask layer with a finger, a touch pen, or the like.
The executing body of the method for editing pictures (e.g., the terminals 101, 102, 103 shown in fig. 1) may determine the area to be edited based on the coverage area of the above-described sliding operation. Here, the coverage area of the sliding operation may include the area touched (or contacted) by the user's finger, the touch pen, or the like from the moment the executing body detects a touch event (e.g., detects that the user's finger is pressed on the screen) until the touch event ends (e.g., the user's finger leaves the screen), i.e., during one touch.
In some optional implementations of this embodiment, step 203 may specifically include:
first, acquiring the sliding path and the path width of the user on the mask layer to determine the coverage area of the sliding operation. Here, the sliding path may be the path along which the user's finger, the touch pen, or the like slides, for example the series of consecutive points over which the finger passes. The path width may be a fixed value (e.g., a width value selected by the user) or a variable value (e.g., a value that varies with pressure, contact area, etc.);
and then, determining the coverage area as the area to be edited.
In other optional implementations of this embodiment, step 203 may specifically include:
first, a sliding path and a path width of a user on the mask layer are obtained to determine a coverage area of the sliding operation.
And then, determining the area outside the coverage area as the area to be edited.
Compared with selecting the area to be edited by merely enlarging and reducing a rectangular frame, these implementations determine the area to be edited based on the coverage area of the sliding operation, which helps the user determine the area to be edited more quickly; moreover, the determined area is not limited to a rectangle and can take any shape, so the area to be edited can be determined more accurately.
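As an illustration of how a sliding path plus a path width can be turned into a coverage area, the Pillow-based sketch below rasterises the path into an 8-bit mask. It assumes the touch layer reports the path as a list of (x, y) points; the canvas size, point coordinates and width in the usage lines are made up.

```python
# Illustrative sketch of step 203: rasterise the slide path into a coverage mask.
from PIL import Image, ImageDraw, ImageOps

def coverage_mask(size, path_points, path_width):
    """Return an 8-bit mask: 255 inside the coverage area, 0 outside."""
    mask = Image.new("L", size, 0)
    draw = ImageDraw.Draw(mask)
    # Draw the stroke along the path; joint="curve" rounds the corners.
    draw.line(path_points, fill=255, width=path_width, joint="curve")
    # Round the two stroke ends so the area matches a finger-painted stroke.
    r = path_width // 2
    for x, y in (path_points[0], path_points[-1]):
        draw.ellipse((x - r, y - r, x + r, y + r), fill=255)
    return mask

# The covered area itself as the area to be edited (first implementation)...
region = coverage_mask((800, 600), [(100, 100), (300, 150), (400, 400)], 40)
# ...or everything outside the covered area (second implementation).
inverse_region = ImageOps.invert(region)
```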
In some optional implementations of this embodiment, the original picture displayed in the session interface is fixed in size and is not movable.
In some optional implementations of this embodiment, after step 202, the executing body may obtain the sliding path and the path width of the user on the mask layer as follows: in response to detecting a touch event, recording the position information of each path point traversed during the touch, together with the path width at each point.
It should be appreciated that when the path width is a fixed value, only the position information of the various path points traversed during the touch may be recorded.
In some optional implementations of this embodiment, the executing body may start recording the position information of the path points traversed during the touch, and the path width at each point, when it detects that the pressing pressure of the user on the screen is greater than or equal to a preset pressure value or that the pressing duration is greater than or equal to a preset duration. In this way, when the pressing pressure is less than the preset pressure value and the pressing duration is less than the preset duration, the executing body can perform other operations based on the user's touch operation.
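A possible shape for this recording logic is sketched below in plain Python; the sample format (x, y, pressure, timestamp), the threshold values and the pressure-to-width mapping are all assumptions rather than values given by the application.

```python
# Illustrative sketch of the optional recording logic; thresholds are assumed.
MIN_PRESSURE = 0.4       # assumed normalised pressure threshold
MIN_DURATION_MS = 150    # assumed press-duration threshold
BASE_WIDTH = 30          # assumed stroke width at full pressure, in pixels

def record_stroke(samples):
    """samples: [(x, y, pressure, timestamp_ms)]; returns [(x, y, width)]."""
    if not samples:
        return []
    start_time = samples[0][3]
    recorded = []
    for x, y, pressure, t in samples:
        # Do not start recording until the press exceeds either threshold;
        # below both thresholds the touch is left to other operations.
        if not recorded and pressure < MIN_PRESSURE and t - start_time < MIN_DURATION_MS:
            continue
        # Path width varying with pressure (a fixed width would also be valid).
        recorded.append((x, y, max(1, int(BASE_WIDTH * pressure))))
    return recorded
```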
Optionally, the method for editing pictures may further include: in response to detecting that at least two fingers slide in the same direction, causing the original picture to follow the sliding. For example, if two fingers are detected sliding to the right at the same time, the original picture can be slid to the right at the same speed as the fingers.
Optionally, the method for editing pictures may further include: in response to detecting that at least two fingers slide in opposite directions, scaling the original picture. For example, if the two fingers are detected sliding toward each other, the original picture can be reduced; if the two fingers are detected sliding away from each other, the original picture can be enlarged.
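One simple way to distinguish these two optional gestures is to compare the displacement vectors of the two fingers, for example with a dot product, as in the sketch below; the patent does not specify how "same direction" and "opposite directions" are detected, so this test is only an assumption.

```python
# Illustrative sketch: classify a two-finger gesture from per-finger displacements.
def classify_two_finger_gesture(delta_a, delta_b):
    """delta_a, delta_b: (dx, dy) movement of each finger since the last frame."""
    dot = delta_a[0] * delta_b[0] + delta_a[1] * delta_b[1]
    if dot > 0:
        return "pan"   # same direction: make the original picture follow the fingers
    if dot < 0:
        return "zoom"  # opposite directions: pinch to reduce, spread to enlarge
    return "none"

print(classify_two_finger_gesture((10, 0), (12, 1)))    # pan
print(classify_two_finger_gesture((10, 0), (-8, -1)))   # zoom
```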
In some optional implementations of this embodiment, the method for editing a picture may further include: in response to detecting the sliding operation of the user on the mask layer, displaying the track of the sliding operation on the mask layer in real time, so that the user can conveniently continue the subsequent sliding (or smearing).
In some optional implementations of this embodiment, the method for editing a picture may further include: during the sliding operation performed by the user, removing the mask layer in the area over which the user has slid (or adjusting the colour of that portion of the mask layer from translucent to fully transparent). This allows the user to view, in real time, the area to be edited as it is being determined, making timely adjustment convenient.
In some optional implementations of this embodiment, the method for editing a picture may further include: removing the mask layer in the area to be edited, or adjusting the colour of the mask layer in the area to be edited from translucent to fully transparent. With the mask layer removed, the user can clearly observe the image area to be edited and decide on the subsequent operation (e.g., performing step 204, or performing step 203 again).
In step 204, the original picture is edited based on the area to be edited to generate an edited picture.
In this embodiment, the executing body of the method for editing pictures (for example, the terminals 101, 102, 103 shown in fig. 1) may edit the original picture presented in step 201 based on the area to be edited determined in step 203, thereby generating an edited picture.
Compared with cropping the picture with a rectangular frame that can only be enlarged and reduced, in this embodiment the user only needs to slide on the mask layer; the operation is simple, and an area to be edited of any shape can be painted, so the original picture can be edited more flexibly.
In some optional implementations of this embodiment, step 204 may specifically include:
first, cutting the original picture along the edge of the area to be edited to obtain the image information corresponding to the area to be edited, that is, the image information in the area of the original picture that overlaps the area to be edited;
then, generating the edited picture based on the image information, for example by saving the image information into a new transparent canvas to obtain a new picture.
Alternatively, the canvas holding the image information may be saved as a picture in a picture format supporting a transparent background (for example, the PNG (Portable Network Graphics) format), so that a picture with an irregular shape (for example, the same shape as the sliding track) can be obtained.
Alternatively, the canvas holding the image information may be saved as a picture in a picture format that does not support a transparent background (for example, JPG (Joint Photographic Experts Group)), so that a regularly shaped picture (for example, a rectangle) with a higher compression ratio can be obtained.
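Continuing the Pillow sketch, the cropping variant of step 204 can be illustrated as follows, reusing the original RGBA picture and the coverage mask from the earlier sketches (assumed to be the same size); the output file names, background colour and JPEG quality are illustrative.

```python
# Illustrative sketch of the cropping variant of step 204.
from PIL import Image

def cut_out_region(original_rgba, region_mask):
    """Keep only the pixels inside region_mask and trim to its bounding box."""
    cut = original_rgba.copy()
    cut.putalpha(region_mask)         # outside the area to be edited -> transparent
    bbox = region_mask.getbbox()      # smallest rectangle containing the area
    return cut.crop(bbox) if bbox else cut

edited = cut_out_region(original, region)
edited.save("edited.png")             # PNG keeps the irregular, transparent shape

# For a format without a transparent background, flatten onto a background
# (white is assumed here) to obtain a regular, rectangular picture.
flat = Image.new("RGB", edited.size, (255, 255, 255))
flat.paste(edited, mask=edited.split()[3])   # alpha channel as the paste mask
flat.save("edited.jpg", quality=85)
```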
In some optional implementations of this embodiment, step 204 may specifically include: replacing the image of the original picture corresponding to the area to be edited with a preset image, and storing the replaced image as the edited picture. As an example, a solid-colour pattern such as black or white may be used to replace the image in the area of the original picture corresponding to the area to be edited, thereby achieving an effect similar to erasing.
In some optional implementations of this embodiment, step 204 may specifically include: superimposing a preset image onto the image in the area of the original picture corresponding to the area to be edited, and storing the superimposed image as the edited picture. As an example, a preset patch pattern may be superimposed on the image in the area of the original picture corresponding to the area to be edited, thereby achieving a mosaic-like effect.
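The replacement and overlay variants can be sketched in the same way; the white fill colour and the 16-pixel mosaic block size below are arbitrary choices used only to illustrate the erase-like and mosaic-like effects.

```python
# Illustrative sketch of the replacement (erase-like) and overlay (mosaic-like)
# variants of step 204; works on an RGB copy of the original picture.
from PIL import Image

def erase_region(original_rgb, region_mask, colour=(255, 255, 255)):
    """Replace the area to be edited with a solid colour."""
    out = original_rgb.copy()
    out.paste(Image.new("RGB", out.size, colour), mask=region_mask)
    return out

def mosaic_region(original_rgb, region_mask, block=16):
    """Overlay a pixelated copy of the picture onto the area to be edited."""
    small = original_rgb.resize(
        (max(1, original_rgb.width // block), max(1, original_rgb.height // block)),
        Image.NEAREST,
    )
    pixelated = small.resize(original_rgb.size, Image.NEAREST)
    out = original_rgb.copy()
    out.paste(pixelated, mask=region_mask)   # only the selected area is replaced
    return out
```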
In some optional implementations of this embodiment, the method for editing a picture may further include: compressing the edited picture, and then sending the compressed picture to the opposite end of the session corresponding to the session interface, or storing the compressed picture. As an example, if the user A has a session with the user B in the session interface, the executing body may compress the edited picture and transmit the compressed picture to the terminal held by the user B. Alternatively, after the edited picture is obtained, it may be sent directly to the opposite end of the session corresponding to the session interface without performing the compression operation, or stored.
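A hypothetical compression helper is sketched below; the maximum side length, JPEG quality and the choice of JPEG itself are assumptions, and the resulting bytes would be handed to whatever transport the IM tool uses.

```python
# Illustrative sketch of the optional compression step before sending/storing.
from io import BytesIO
from PIL import Image

def compress_for_sending(picture, max_side=1280, quality=80):
    """Downscale the longer side to max_side and re-encode as JPEG bytes."""
    scale = max_side / max(picture.size)
    if scale < 1:
        picture = picture.resize(
            (int(picture.width * scale), int(picture.height * scale)))
    buf = BytesIO()
    # convert("RGB") drops any alpha channel; JPEG has no transparency.
    picture.convert("RGB").save(buf, format="JPEG", quality=quality)
    return buf.getvalue()   # bytes to hand to the IM transport layer
```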
In some optional implementations of this embodiment, before step 204, the method for editing a picture may further include: in response to detecting an operation indicating that the area to be edited should be re-determined, re-acquiring the coverage area of the sliding operation of the user on the mask layer, and then determining a new area to be edited based on the re-acquired coverage area.
Here, the operation indicating that the area to be edited should be re-determined may include, but is not limited to, the user clicking a "redo" button. As an example, if the user is not satisfied with the area to be edited that has been painted and wants to paint it again, the user may click the "redo" button and then perform the sliding operation on the mask layer again. Re-determining the area to be edited makes it easy for the user to obtain a satisfactory area to be edited.
With continued reference to fig. 3A to 3C, an application scenario of the method for editing pictures according to the present application is shown. In this application scenario, first, as shown in fig. 3A, a user 302 is having a session with a user 301 in an instant messaging session interface. The user 302 clicks a control 303 in the session interface to select an original picture 307 from a gallery 304. After detecting the above operation of the user 302, the mobile phone 31 acquires the original picture 307 selected by the user 302 from the gallery 304 and presents the original picture 307 on the screen, as shown in fig. 3B. The original picture 307 is overlaid with a semi-transparent mask layer. When the finger 305 slides over the mask layer, the mask layer in the area over which the finger slides changes from translucent to transparent. When the sliding is finished, the area to be edited 306 is obtained. Finally, the part of the original picture 307 outside the region overlapping the area to be edited 306 is cropped away, resulting in an edited picture 309 as shown in fig. 3C. After the user 302 clicks the button 308, the picture 309 may be compressed and saved to the gallery 304.
According to the method for editing a picture, an original picture is acquired and presented during an IM session, a semi-transparent mask layer is then superimposed on the original picture, an area to be edited is determined based on the coverage area of the user's sliding operation on the mask layer, and the original picture is finally edited based on the area to be edited, so that the user can select the shape and area to be edited more conveniently.
Referring now to fig. 4, shown is a schematic diagram of an electronic device (e.g., terminals 101, 102, 103 shown in fig. 1) 400 suitable for use in implementing embodiments of the present application. The electronic device 400 shown in fig. 4 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 4, the electronic device 400 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 401 that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage device 408 into a Random Access Memory (RAM) 403. Various programs and data necessary for the operation of the electronic device 400 are also stored in the RAM 403. The processing device 401, the ROM 402, and the RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
In general, the following devices may be connected to the I/O interface 405: an input device 406 including, for example, a touch screen, keys, and the like; an output device 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; a storage device 408 including, for example, a memory card and the like; and a communication device 409. The communication device 409 may allow the electronic device 400 to communicate wirelessly or by wire with other devices to exchange data. While fig. 4 illustrates an electronic device 400 having various devices, it should be understood that not all of the illustrated devices are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided. Each block shown in fig. 4 may represent one device or multiple devices as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication device 409, or from the storage device 408, or from the ROM 402. The computer program, when executed by the processing apparatus 401, performs the above-described functions defined in the methods of embodiments of the present disclosure. It should be noted that the computer readable medium in the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In embodiments of the present disclosure, however, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the terminal; or may exist separately and not be assembled into the terminal. The computer readable medium carries one or more programs which, when executed by the terminal, cause the terminal to: acquiring and presenting an original picture according to picture acquisition operation executed by a user when a session interface of instant messaging is displayed; superposing a semitransparent shielding layer on the original picture, wherein the semitransparent shielding layer covers the original picture; determining an area to be edited based on a coverage area of a sliding operation of a user on the mask layer; and editing the original picture based on the area to be edited to generate an edited picture.
Computer program code for carrying out operations of embodiments of the present disclosure may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description is only a description of the preferred embodiments of the present disclosure and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the present disclosure is not limited to technical solutions formed by the specific combination of the above-mentioned features, and also encompasses other technical solutions formed by any combination of the above-mentioned features or their equivalents without departing from the inventive concept, for example, technical solutions formed by interchanging the above features with (but not limited to) features with similar functions disclosed in this disclosure.

Claims (15)

1. A method for editing a picture, the method comprising:
acquiring and presenting an original picture according to a picture acquisition operation performed by a user when a session interface of instant messaging is displayed;
overlaying a semi-transparent mask layer on the original picture, wherein the mask layer covers the original picture;
determining an area to be edited based on a coverage area of a sliding operation of a user on the mask layer;
and editing the original picture based on the area to be edited to generate an edited picture.
2. The method of claim 1, wherein the picture acquisition operation comprises:
an operation of selecting a picture from a session record as the original picture; or
an operation of selecting a picture from a gallery as the original picture; or
an operation of performing a screen capture to take the picture obtained by the screen capture as the original picture.
3. The method of claim 1, wherein determining the area to be edited based on the coverage area of the sliding operation of the user on the mask layer comprises:
acquiring a sliding path and a path width of a user on the mask layer, and determining a coverage area of the sliding operation;
and determining the coverage area or the area outside the coverage area as an area to be edited.
4. The method of claim 3, wherein acquiring a sliding path and a path width of a user on the mask layer comprises:
in response to detecting that a pressing pressure and/or a pressing duration of the user on the screen is greater than a corresponding preset value, recording the sliding path and the path width of the user on the mask layer.
5. The method of claim 1, further comprising:
in response to detecting a sliding operation of a user on the mask layer, displaying a track of the sliding operation on the mask layer in real time.
6. The method of claim 1, further comprising:
and removing the mask layer in the area to be edited or adjusting the mask layer from translucency to transparency.
7. The method according to claim 1, wherein the editing the original picture based on the region to be edited to generate an edited picture comprises:
cutting the original picture along the edge of the area to be edited to obtain image information corresponding to the area to be edited;
and generating the edited picture based on the image information.
8. The method according to claim 7, wherein the format of the edited picture is a picture format supporting a transparent background.
9. The method of claim 1, further comprising:
compressing the edited picture;
and sending the compressed picture to the opposite end of the session corresponding to the session interface, or storing the compressed picture.
10. The method according to claim 1, wherein the editing the original picture based on the region to be edited to generate an edited picture comprises:
replacing an image of the original picture corresponding to the area to be edited with a preset image, or overlaying the preset image onto the image of the original picture corresponding to the area to be edited;
and storing the replaced or superposed image as the edited picture.
11. The method of claim 1, wherein the original picture displayed in the session interface is fixed in size and not movable.
12. The method of claim 1, further comprising operating on the original picture by at least one of:
in response to detecting that at least two fingers slide in the same direction, enabling the original picture to follow the sliding;
in response to detecting that at least two fingers slide in opposite directions, scaling the original picture.
13. The method according to claim 1, wherein before editing the original picture based on the region to be edited, the method further comprises:
in response to detecting an operation indicating to redetermine an area to be edited, reacquiring a coverage area of a sliding operation of a user on the mask layer;
and determining a new area to be edited based on the retrieved coverage area.
14. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-13.
15. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-13.
CN201910831041.8A 2019-09-04 2019-09-04 Method, electronic device and computer readable medium for editing pictures Pending CN112445398A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910831041.8A CN112445398A (en) 2019-09-04 2019-09-04 Method, electronic device and computer readable medium for editing pictures

Publications (1)

Publication Number Publication Date
CN112445398A true CN112445398A (en) 2021-03-05

Family

ID=74734828

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910831041.8A Pending CN112445398A (en) 2019-09-04 2019-09-04 Method, electronic device and computer readable medium for editing pictures

Country Status (1)

Country Link
CN (1) CN112445398A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101046722A (en) * 2006-03-31 2007-10-03 腾讯科技(深圳)有限公司 Method for editing picture in customer end contents transmission window and customer end
US20130055112A1 (en) * 2011-08-28 2013-02-28 Hoozin Ltd. Computerized System And Method Supporting Message-Based Group Communication Sessions
CN103150745A (en) * 2011-12-06 2013-06-12 腾讯科技(深圳)有限公司 Method and device capable of editing a picture on line
CN103164123A (en) * 2011-12-12 2013-06-19 北京神州泰岳软件股份有限公司 Method and device for editing screen capture
CN103577083A (en) * 2012-07-30 2014-02-12 腾讯科技(深圳)有限公司 Image operation method and mobile terminal
CN104598119A (en) * 2013-10-17 2015-05-06 深圳天科智慧科技有限公司 Screen capture method and device
CN105278825A (en) * 2014-08-19 2016-01-27 东莞市步步高通信软件有限公司 Screen capturing method and mobile terminal
CN105892839A (en) * 2015-01-26 2016-08-24 腾讯科技(深圳)有限公司 Screenshot processing method and device based on instant communication tool
CN104932827A (en) * 2015-06-29 2015-09-23 北京金山安全软件有限公司 Picture clipping method and device and terminal
CN106331482A (en) * 2016-08-23 2017-01-11 努比亚技术有限公司 Photo processing device and method
CN106527929A (en) * 2016-10-31 2017-03-22 宇龙计算机通信科技(深圳)有限公司 Picture information hiding method and apparatus

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113835812A (en) * 2021-09-24 2021-12-24 深圳集智数字科技有限公司 Chat interface display method and device, electronic equipment and storage medium
CN113835812B (en) * 2021-09-24 2024-04-30 深圳集智数字科技有限公司 Chat interface display method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination