CN115185418A - Image processing method, image processing device, computer storage medium, equipment and system

Info

Publication number: CN115185418A
Authority: CN (China)
Prior art keywords: target, image, expression, session, ejection
Legal status: Pending
Application number: CN202110374597.6A
Other languages: Chinese (zh)
Inventors: 沙莎, 蔡忆宁, 刘伟, 刘立强, 何丹
Current Assignee: Tencent Technology Shenzhen Co Ltd
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to: CN202110374597.6A
Publication of: CN115185418A

Classifications

    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/04845: Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science
  • General Engineering & Computer Science
  • Theoretical Computer Science
  • Human Computer Interaction
  • Physics & Mathematics
  • General Physics & Mathematics
  • User Interface Of Digital Computer

Abstract

Embodiments of the present application disclose an image processing method, an image processing apparatus, a computer storage medium, a device, and a system. In the embodiments, an expression input panel is displayed on a first session interface; in response to a touch operation selecting a target expression image from a plurality of expression images and a drag operation dragging the target expression image to an expression ejection area, a target expression ejection animation corresponding to the target expression image is played; and the target expression image and the target expression ejection animation are sent to a receiving terminal, so that the receiving terminal displays the target expression image in a session message frame of a second session interface and plays the target expression ejection animation in the second session interface. In this way, the touch and drag operations on the target expression image in the first session interface quickly trigger display of the target expression ejection animation on both sides, which greatly improves the diversity of image processing.

Description

Image processing method, image processing device, computer storage medium, equipment and system
Technical Field
The present application relates to the field of communications technologies, and in particular, to an image processing method, an image processing apparatus, a computer storage medium, a device, and a system.
Background
With the development of the Internet, instant messaging has become an indispensable mode of network communication in daily life. In recent years, mobile instant messaging technology has emerged, which provides instant messaging on various mobile communication devices (such as mobile phones).
In the prior art, a mobile user can send expressions while chatting, which makes the chat more engaging and enriches the user experience. Moreover, to broaden the expressive range of an expression, animation effects of different forms can be triggered according to the expression style after the expression is sent.
In the research and practice of the prior art, the inventors of the present application found that the expression sending modes in the prior art are relatively fixed and rigid, give the user little ability to express emotion, and offer poor diversity of image processing.
Disclosure of Invention
Embodiments of the present application provide an image processing method, an image processing apparatus, a computer storage medium, a device, and a system, which can improve diversity of image processing.
In order to solve the above technical problem, an embodiment of the present application provides the following technical solutions:
An image processing method, comprising:
displaying an expression input panel on a first session interface, wherein the expression input panel comprises a plurality of expression images;
responding to a touch operation of selecting a target expression image from the expression images and responding to a dragging operation of dragging the target expression image to an expression ejection area, and playing a target expression ejection animation corresponding to the target expression image;
and sending the target expression image and the target expression ejection animation to a receiving terminal, so that the receiving terminal displays the target expression image in a conversation message frame of a second conversation interface and plays the target expression ejection animation in the second conversation interface.
An image processing method, comprising:
displaying an image input panel in a first session interface, wherein the image input panel comprises a plurality of multimedia objects;
responding to a touch operation of selecting a target multimedia object from the plurality of multimedia objects and responding to a dragging operation of dragging the target multimedia object to a multimedia continuous sending area, and playing a target multimedia continuous sending animation corresponding to the target multimedia object;
and sending the target multimedia object to a receiving terminal, so that the receiving terminal displays the target multimedia object in a session message frame of a second session interface and displays the target multimedia object in a session background of the second session interface.
An image processing method, comprising:
receiving a target expression image and a target expression ejection animation sent by a sending terminal;
playing the target expression ejection animation on a second session interface; and
and displaying the target expression image in a conversation message frame of the second conversation interface.
An image processing method, comprising:
receiving a target multimedia object sent by a sending terminal;
displaying the target multimedia object in a session message frame of a second session interface; and
and displaying the target multimedia object in the session background of the second session interface.
An image processing apparatus comprising:
the first display unit is used for displaying an expression input panel on a first conversation interface, and the expression input panel comprises a plurality of expression images;
the first playing unit is used for responding to touch operation of selecting a target expression image from the expression images and responding to dragging operation of dragging the target expression image to an expression ejection area, and playing a target expression ejection animation corresponding to the target expression image;
and the first sending unit is used for sending the target expression image and the target expression ejection animation to a receiving terminal so that the receiving terminal can display the target expression image in a conversation message frame of a second conversation interface and play the target expression ejection animation in the second conversation interface.
In some embodiments, the image processing apparatus further includes:
and the second display unit is used for displaying the expression ejection area in the currently displayed conversation area of the first conversation interface.
In some embodiments, the second display unit is configured to:
and displaying an expression ejection area formed by combining the mask component and the prompt component in a currently displayed conversation area of the first conversation interface.
In some embodiments, the first playback unit is configured to:
responding to a touch operation of selecting a target expression image from the expression images and responding to a dragging operation of dragging the target expression image to an expression ejection area, and triggering and previewing an expression ejection animation corresponding to the target expression image;
detecting first touch control time of touch control operation in the expression ejection area;
and continuously adjusting the animation style of the expression ejection animation according to the change of the first touch time until the touch operation in the expression ejection area is finished, and generating a target expression ejection animation.
An image processing apparatus comprising:
a first display unit for displaying an image input panel including a plurality of multimedia objects in a first session interface;
the first playing unit is used for responding to touch operation of selecting a target multimedia object from the plurality of multimedia objects and responding to dragging operation of dragging the target multimedia object to a multimedia continuous sending area, and playing a target multimedia continuous sending animation corresponding to the target multimedia object;
and the first sending unit is used for sending the target multimedia object to a receiving terminal so that the receiving terminal displays the target multimedia object in a session message frame of a second session interface and displays the target multimedia object in a session background of the second session interface.
In some embodiments, the image processing apparatus further includes:
and the second display unit is used for displaying a multimedia continuous sending area in the currently displayed session area of the first session interface.
In some embodiments, the first playback unit is configured to:
the multimedia object display device comprises a touch operation module, a display module and a display module, wherein the touch operation module is used for responding to a touch operation of selecting a target multimedia object from a plurality of multimedia objects and responding to a dragging operation of dragging the target multimedia object to a multimedia burst area, and triggering and playing a target multimedia burst animation corresponding to the target multimedia object;
detecting second touch time of touch operation in the multimedia continuous transmission area;
and continuously playing the target multimedia continuous sending animation corresponding to the target multimedia object within the second touch time until the touch operation in the multimedia continuous sending area is finished, and stopping playing the target multimedia continuous sending animation.
In some embodiments, the multimedia object is an image, and the image processing apparatus further includes:
the detection unit is used for detecting whether the second touch time is smaller than a preset time threshold value;
the first saving unit is used for saving the second touch time when the second touch time is detected to be smaller than a preset time threshold;
the first sending unit is further configured to send the second touch time and the target image to a receiving terminal, so that the receiving terminal displays the target image in a session message frame of a second session interface and displays the target image in a session background of the second session interface based on the second touch time;
the second saving unit is used for saving the preset time threshold when the second touch time is not smaller than the preset time threshold;
the first sending unit is further configured to send the preset time threshold and the target image to a receiving terminal, so that the receiving terminal displays the target image in a session message frame of a second session interface and displays the target image in a session background of the second session interface based on the preset time threshold.
In some embodiments, the multimedia object is a video, and the first sending unit is further configured to:
and sending the target video to a receiving terminal, so that the receiving terminal displays the target video in a session message frame of a second session interface and plays the target video in a session background of the second session interface.
An image processing apparatus comprising:
the first receiving unit is used for receiving the target expression image and the target expression ejection animation sent by the sending terminal;
the first playing unit is used for playing the target expression ejection animation on a second session interface;
and the first display unit is used for displaying the target expression image in a conversation message frame of the second conversation interface.
In some embodiments, the first display unit is configured to:
acquiring the ejection times corresponding to the target expression image indicated by the target expression ejection animation;
adjusting the size information of the target expression image according to the ejection times;
and displaying the target expression image with the adjusted size information on a conversation message frame of a second conversation interface by combining the ejection times.
An image processing apparatus comprising:
a first receiving unit, configured to receive a target multimedia object sent by a sending terminal;
the first playing unit is used for displaying the target multimedia object in a session message frame of a second session interface;
and the first display unit is used for displaying the target multimedia object in the session background of the second session interface.
In some embodiments, the multimedia object is an image, and the first display unit is configured to:
receiving second touch time sent by the sending terminal;
and displaying the target image in a session background of the second session interface based on the second touch time.
In some embodiments, the multimedia object is an image, and the first display unit is further configured to:
receiving a preset time threshold sent by a sending terminal;
and displaying the target image in a session background of the second session interface based on the preset time threshold.
In some embodiments, the multimedia object is a video, and the first display unit is further configured to:
and playing the target video in the session background of the second session interface.
A computer storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the image processing method.
A computer device comprising a memory, a processor and a computer program stored in said memory and executable on said processor, said processor implementing the steps of the image processing method provided above when executing said computer program.
An image processing system, the system comprising: a transmitting terminal, a server and a receiving terminal;
the transmitting terminal comprises the image processing device;
the server is used for receiving the target expression image and the target expression ejection animation sent by the sending terminal and forwarding the target expression image and the target expression ejection animation to the receiving terminal;
the receiving terminal comprises the image processing device.
A computer program product or computer program comprising computer instructions stored in a storage medium. The processor of the computer device reads the computer instructions from the storage medium, and the processor executes the computer instructions to cause the computer device to execute the steps in the image processing method provided above.
In the embodiments of the present application, an expression input panel is displayed on the first session interface; in response to a touch operation selecting a target expression image from a plurality of expression images and a drag operation dragging the target expression image to an expression ejection area, a target expression ejection animation corresponding to the target expression image is played; and the target expression image and the target expression ejection animation are sent to a receiving terminal, so that the receiving terminal displays the target expression image in a session message frame of a second session interface and plays the target expression ejection animation in the second session interface. In this way, the touch and drag operations on the target expression image in the first session interface quickly trigger display of the target expression ejection animation on both sides, which greatly improves the diversity of image processing compared with a fixed and rigid expression sending scheme.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without creative effort.
FIG. 1 is a schematic view of a scene of an image processing system provided in an embodiment of the present application;
FIG. 2a is a schematic flowchart of an image processing method provided in an embodiment of the present application;
FIG. 2b is a schematic diagram of an image processing interface provided in an embodiment of the present application;
FIG. 2c is another schematic diagram of an image processing interface provided in an embodiment of the present application;
FIG. 2d is a schematic flowchart of an image processing method provided in an embodiment of the present application;
FIG. 2e is a schematic diagram of an image processing interface provided in an embodiment of the present application;
FIG. 2f is another schematic diagram of an image processing interface provided in an embodiment of the present application;
FIG. 2g is another schematic diagram of an image processing interface provided in an embodiment of the present application;
FIG. 2h is another schematic diagram of an image processing interface provided in an embodiment of the present application;
FIG. 3a is a schematic flowchart of an image processing method provided in an embodiment of the present application;
FIG. 3b is a schematic diagram of an image processing interface provided in an embodiment of the present application;
FIG. 3c is a schematic flowchart of an image processing method provided in an embodiment of the present application;
FIG. 3d is a schematic diagram of an image processing interface provided in an embodiment of the present application;
FIG. 3e is another schematic diagram of an image processing interface provided in an embodiment of the present application;
FIG. 4 is a timing diagram illustrating an image processing method provided in an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an image processing apparatus provided in an embodiment of the present application;
FIG. 6 is a schematic structural diagram of an image processing apparatus provided in an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an image processing apparatus provided in an embodiment of the present application;
FIG. 8 is a schematic structural diagram of an image processing apparatus provided in an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a terminal provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides an image processing method, an image processing device, a computer storage medium, equipment and a system.
Referring to fig. 1, fig. 1 is a schematic view of a scene of an image processing system according to an embodiment of the present application, which includes a sending terminal 11, a receiving terminal 12, and a server 20. These may be connected through a communication network that includes wireless and wired networks, where the wireless network includes one or more of a wireless wide area network, a wireless local area network, a wireless metropolitan area network, and a wireless personal area network. The network includes network entities such as routers and gateways, which are not shown in the figure. The sending terminal 11 and the receiving terminal 12 may exchange information with the server 20 through the communication network.
The image processing system may include an image processing apparatus, which may be integrated in a terminal that has a storage unit, is equipped with a microprocessor, and has computing capability, such as a tablet computer, a mobile phone, a notebook computer, a desktop computer, a smart home device, a VR/AR device, or a vehicle-mounted computer. In fig. 1, these terminals are the sending terminal 11 and the receiving terminal 12, and various applications required by the user, such as an instant messaging application, may be installed in the sending terminal 11. The sending terminal 11 may be configured to display an expression input panel on the first session interface, where the expression input panel includes a plurality of expression images; respond to a touch operation of selecting a target expression image from the expression images and a dragging operation of dragging the target expression image to an expression ejection area by playing a target expression ejection animation corresponding to the target expression image; and send the target expression image and the target expression ejection animation to a receiving terminal, so that the receiving terminal displays the target expression image in a session message frame of a second session interface and plays the target expression ejection animation in the second session interface. Note that the sending terminal 11 may include a plurality of terminals.
The receiving terminal 12 may have various applications required by the user, such as an instant messaging application, installed therein. The receiving terminal 12 may be configured to receive the target expression image and the target expression ejection animation sent by the sending terminal 11; playing the target expression ejection animation on a second session interface; and displaying the target expression image in a session message frame of the second session interface. Note that the receiving terminal 12 may include a plurality of terminals.
The image processing system may further include a server 20, which may be configured to receive the target expression image and the target expression ejection animation transmitted by the transmitting terminal 11; and forwarding the target expression image and the target expression ejection animation to the receiving terminal 12.
It should be noted that the scene schematic diagram of the image processing system shown in fig. 1 is merely an example. The image processing system and the scene described in the embodiment of the present application are intended to illustrate the technical solution of the embodiment more clearly and do not constitute a limitation on it; as a person of ordinary skill in the art knows, with the evolution of image processing systems and the emergence of new business scenes, the technical solution provided in the embodiment of the present application is equally applicable to similar technical problems.
The following are detailed below.
In this embodiment, description will be made from the perspective of an image processing apparatus, which may be specifically integrated in a sending terminal with computing capability, such as a tablet computer or a mobile phone, that has a storage unit and a microprocessor. The sending terminal is the terminal corresponding to the message sending side in an instant messaging application.
Referring to fig. 2a, fig. 2a is a schematic flowchart of an image processing method according to an embodiment of the present disclosure. The image processing method comprises the following steps:
In step 101, an expression input panel is displayed on a first session interface.
The first session interface may be a chat interface of an instant messaging application installed on the sending terminal and may be formed by a common chat window component, through which the user can implement session functions with another user, such as sending text, voice, images, and expressions.
For example, referring to fig. 2b together, the expression input panel is a functional component for inputting expressions for instant messaging. The expression input panel may include a plurality of expression images; the expression images may be emoji expression images or user-defined expression images, which are not specifically limited herein. A terminal user may open and display the expression input panel by clicking the smiling-face icon 11 on the first session interface, and the expression input panel may display a plurality of expressions.
In step 102, in response to a touch operation of selecting a target expression image from the plurality of expression images and a drag operation of dragging the target expression image to an expression ejection area, a target expression ejection animation corresponding to the target expression image is played.
It should be noted that, in the embodiments of the present application, a response is used to indicate a condition or state on which an executed operation depends. When the condition or state is satisfied, the one or more operations performed may be executed in real time or with a set delay; unless otherwise specified, there is no restriction on the order in which the operations are performed.
In this embodiment, a long-press touch operation is taken as an example. Referring to fig. 2b, the user may perform a long-press touch operation on the target expression image 12 with a finger, and the sending terminal 10 may correspondingly display an expression ejection area in response to the touch operation of the user selecting the target expression image 12 from the expression images. In an embodiment, a preview of the target expression image 12 may be generated for the terminal user to confirm further.
In some embodiments, after responding to a touch operation of selecting a target expression image from a plurality of expression images, an expression ejection area may be displayed in a conversation area currently displayed in the first conversation interface.
Specifically, an expression ejection area formed by combining a mask component and a prompt component may be displayed in the session area currently displayed in the first session interface. For example, referring to fig. 2b together, the expression ejection area may be formed by combining a mask component 13 and a prompt component 14. The mask component 13 may be a canvas for blocking the content of the session area to prevent misoperation, and the prompt component 14 may be used to prompt the user to drag the target expression image 12 to the expression ejection area so as to send the expression repeatedly.
It is understood that the expression ejection area can also be another area, such as a specific area on the left side, the right side, or the top of the screen, or the user's avatar area.
In some embodiments, the step of displaying an emoji ejection area formed by combining the mask component and the prompt component in the conversation area currently displayed in the first conversation interface may include:
(1.1) acquiring target size information of a conversation area currently displayed by the first conversation interface;
(1.2) generating a mask component with the same size as the target size information through Gaussian blur processing;
and (1.3) displaying the mask component in the currently displayed session area of the first session interface, and displaying a prompt component on the displayed mask component to form an expression ejection area.
As shown in fig. 2b, in order to calculate the size of the expression ejection area, target size information of the session area currently displayed by the first session interface, for example 60 by 80 pixels, may be obtained. Gaussian blur processing, also called Gaussian smoothing, is a processing effect widely used in image processing software such as Adobe Photoshop, GIMP, and Paint. An image produced by this blurring technique has the visual effect of being viewed through frosted glass, so a mask component with the same size as the target size information can be generated through Gaussian blur; the mask component can be understood as a blurred canvas component.
Furthermore, the mask component can be displayed in the session area currently displayed on the first session interface to shield the content of the session area and prevent misoperation, and a prompt component showing the text "drag to this area to send the expression continuously" can be displayed on the mask component, thereby forming the expression ejection area.
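By way of illustration, the following TypeScript sketch shows one way a web-based client might form such an expression ejection area from a blurred mask component and a prompt component. The function and element names, the use of a CSS backdrop blur in place of an explicit Gaussian-blur pass, the 8-pixel radius, and the prompt wording are assumptions of this sketch, not details from the embodiment.

```typescript
// Minimal sketch of forming the expression ejection area in a web client.
// buildEjectionArea and its DOM layout are illustrative assumptions.
interface EjectionArea {
  mask: HTMLDivElement;    // blurred mask shielding the session area
  prompt: HTMLDivElement;  // text hint displayed on top of the mask
}

function buildEjectionArea(sessionArea: HTMLElement): EjectionArea {
  // (1.1) obtain target size information of the currently displayed session area
  const { width, height } = sessionArea.getBoundingClientRect();

  // (1.2) mask component with the same size; a backdrop blur stands in for
  // Gaussian blur to give the frosted-glass effect described above
  const mask = document.createElement("div");
  mask.style.cssText =
    `position:absolute;top:0;left:0;width:${width}px;height:${height}px;` +
    "backdrop-filter:blur(8px);";

  // (1.3) prompt component displayed on the mask to form the ejection area
  const prompt = document.createElement("div");
  prompt.textContent = "Drag here to send the expression continuously";
  prompt.style.cssText =
    "position:absolute;top:40%;width:100%;text-align:center;pointer-events:none;";

  sessionArea.append(mask, prompt);
  return { mask, prompt };
}
```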
Further, in response to the dragging operation of dragging the target expression image to the expression ejection area, the target expression ejection animation corresponding to the target expression image is played.
The dragging operation may be performed on the target expression image. Correspondingly, referring further to fig. 2b, the terminal user may drag the target expression image 12 to the expression ejection area with a finger, and the sending terminal may start playing the target expression ejection animation corresponding to the target expression image in response to the dragging operation. The target expression ejection animation may be an animation in which the target expression image is ejected from the top of the first session interface at random angles within 360 degrees. The number of ejections may be preset by the system or determined according to the user's touch time: the larger the number of ejections, the stronger the effect of the target expression ejection animation; the smaller the number of ejections, the weaker the effect.
In some embodiments, the step of playing the target expression ejection animation corresponding to the target expression image may include:
(1) Triggering and previewing an expression ejection animation corresponding to the target expression image;
(2) Detecting first touch control time of touch control operation in the expression ejection area;
(3) And continuously adjusting the animation style of the expression ejection animation according to the change of the first touch time until the touch operation in the expression ejection area is finished, and generating the target expression ejection animation.
As shown in fig. 2c, when the terminal user drags the target expression image 12 to the expression ejection area with a finger, the sending terminal triggers and previews the expression ejection animation corresponding to the target expression image in response to the drag operation; the target expression image may be ejected from the top to the bottom of the first session interface.
Furthermore, in order to enrich the emotional expression of the terminal user, the first touch time of the touch operation in the expression ejection area can be continuously detected. The touch operation in the expression ejection area is the long-press touch operation of the terminal user's finger after dragging the target expression image to the expression ejection area. The first touch time can change the animation style of the expression ejection animation, where the animation style may be the number of ejections of the target expression image and the size information of the target expression image when ejected.
In an embodiment, the longer the first touch time, the more times the target expression image is ejected and the larger its size information when ejected; the shorter the first touch time, the fewer the ejections and the smaller the size information. As shown in fig. 2c, when the expression ejection animation corresponding to the target expression image is triggered and previewed, the number of ejections may be 1, that is, one target expression image is ejected from the top to the bottom of the first session interface, and the size information of the target expression image may be the original size information, that is, the target expression image is in its initial state. As the first touch time increases, the animation style of the expression ejection animation may be continuously adjusted; for example, for every additional second, the number of ejections of the target expression image increases by 1 and the size information of the target expression image increases by one percent.
Therefore, as shown in fig. 2c, the first touch time may be displayed at the touch point pressed by the terminal user's finger. For example, if the first touch time is 10 seconds, the number of ejections of the target expression image is 10 and the size information of the target expression image when ejected is increased by 10 percent, so the terminal user can control the sending number and size of the target expression image according to emotion. When the touch operation of the user's finger in the expression ejection area ends, a target expression ejection animation is generated and stored according to the number of ejections and the size information of the target expression image reached at the first touch time.
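As a concrete reading of the per-second rule above, the following TypeScript sketch maps the first touch time to the number of ejections and the image scale. The function name and the clamping to whole seconds are assumptions; the "+1 ejection and +1 percent size per second" figures follow the example in the text.

```typescript
// Illustrative mapping from the first touch time (seconds) to the animation
// style: ejection count and size multiplier of the target expression image.
interface EjectionStyle {
  ejectionCount: number; // how many times the image is ejected from the top
  scale: number;         // size multiplier relative to the original image
}

function styleForTouchTime(firstTouchSeconds: number): EjectionStyle {
  const s = Math.floor(firstTouchSeconds);
  return {
    ejectionCount: Math.max(1, s), // preview starts at 1, then +1 per second
    scale: 1 + s * 0.01,           // original size, then +1% per second
  };
}

// styleForTouchTime(10) -> { ejectionCount: 10, scale: 1.1 },
// matching the 10-second example above.
```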
In step 103, the target expression image and the target expression ejection animation are sent to the receiving terminal.
The receiving terminal is a terminal that carries out instant messaging with the sending terminal. In order to convey the emotion of the sending terminal's user and enrich the chat atmosphere, the target expression image and the target expression ejection animation can be sent to the receiving terminal.
In the related art, after receiving the target expression, the receiving terminal only displays the target expression image in the session message frame. In the embodiment of the present application, by contrast, the target expression image can be displayed in the session message frame of the second session interface of the receiving terminal and the target expression ejection animation can be played in the second session interface, so that the emotion of the sending terminal's user is better conveyed on the receiving terminal, a better chat atmosphere is created, and the chat becomes more interesting.
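The payload that travels to the receiving terminal can be pictured as below. This TypeScript sketch is an assumption about one possible wire format; the patent does not specify field names, and the WebSocket transport merely stands in for the server relay of fig. 1.

```typescript
// Hypothetical message carrying the target expression image together with the
// parameters needed to replay the target expression ejection animation.
interface ExpressionEjectionMessage {
  type: "expression-ejection";
  expressionId: string;   // identifies the target expression image
  ejectionCount: number;  // number of ejections encoded in the animation
  scale: number;          // size multiplier reached at the first touch time
}

function sendToReceivingTerminal(
  socket: WebSocket,
  msg: ExpressionEjectionMessage,
): void {
  socket.send(JSON.stringify(msg)); // forwarded by the server to the receiver
}
```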
As can be seen from the above, in the embodiment of the present application, the expression input panel is displayed on the first session interface; in response to a touch operation selecting a target expression image from the plurality of expression images and a drag operation dragging the target expression image to the expression ejection area, the target expression ejection animation corresponding to the target expression image is played; and the target expression image and the target expression ejection animation are sent to the receiving terminal, so that the receiving terminal displays the target expression image in the session message frame of the second session interface and plays the target expression ejection animation in the second session interface. In this way, the touch and drag operations on the target expression image in the first session interface quickly trigger display of the target expression ejection animation on both sides, which greatly improves the diversity of image processing compared with a fixed and rigid expression sending scheme.
Referring to fig. 2d, fig. 2d is a schematic flowchart of an image processing method according to an embodiment of the present disclosure. The image processing method comprises the following steps:
in step 104, an image input panel is displayed at the first session interface.
As shown in fig. 2e, the image input panel is a functional component for inputting an image or a video for instant messaging, and the image input panel includes a plurality of multimedia objects, which may be images, videos, or other display objects. The terminal user may open and display the image input panel, which includes and displays the plurality of multimedia objects, by clicking an image icon displayed on the first session interface.
In step 105, in response to a touch operation of selecting a target multimedia object from the plurality of multimedia objects and in response to a drag operation of dragging the target multimedia object to the multimedia continuous sending area, a target multimedia continuous sending animation corresponding to the target multimedia object is played.
Further, the touch operation may be a long-press touch operation, a double-tap touch operation, or even an air-gesture control. In this embodiment, the long-press touch operation is taken as an example, and the user may perform a long-press touch operation on the target multimedia object with a finger.
In an embodiment, after the step of responding to the touch operation of selecting the target multimedia object from the plurality of multimedia objects, the method further includes: and displaying a multimedia continuous sending area in the currently displayed session area of the first session interface.
After responding to the touch operation of the user selecting the target multimedia object from the plurality of multimedia objects, the sending terminal 10 may correspondingly display a multimedia continuous sending area. The multimedia continuous sending area may be composed of a mask component and a prompt component, and the prompt component may display the text "drag to this area to send the multimedia object continuously".
For example, referring to fig. 2e together, when the multimedia object is an image, the user may perform a long-press touch operation on the target image with a finger. In response to the touch operation of the user selecting the target image from the plurality of images, the sending terminal 10 may correspondingly display an image continuous sending area, which may be composed of a mask component and a prompt component; the prompt component may display the text "drag to this area to send the image continuously".
Alternatively, referring to fig. 2f together, when the multimedia object is a video, the user may select a target video from the plurality of videos by a long-press touch operation with a finger. In response to the touch operation of the user selecting the target video from the plurality of videos, the sending terminal 10 may correspondingly display an image continuous sending area composed of a mask component and a prompt component; at this time, the prompt component may display the text "drag to this area to send the image continuously".
The dragging operation may be performed on the target multimedia object: the terminal user may drag the target multimedia object into the multimedia continuous sending area with a finger, and the sending terminal may start playing the target multimedia continuous sending animation corresponding to the target multimedia object in response to the dragging operation of dragging the target multimedia object to the multimedia continuous sending area.
For example, referring to fig. 2e, when the multimedia object is an image, the terminal user may drag the target image into the image continuous sending area with a finger, and the sending terminal 10 may start playing the target image continuous sending animation corresponding to the target image in response to the drag operation. As shown in fig. 2f, the target image continuous sending animation may be an animation in which the target image floats down from the top of the first session interface.
For another example, referring to fig. 2g, the terminal user may drag the target video into the image continuous sending area with a finger, and the sending terminal 10 may start playing the target image continuous sending animation corresponding to a target image frame of the target video in response to the drag operation of dragging the target video to the image continuous sending area; the target image frame may be the cover image frame of the target video. Referring to fig. 2h together, the target image continuous sending animation may be an animation in which the target image frame floats down from the top of the first session interface.
In some embodiments, the step of playing the target multimedia continuous sending animation corresponding to the target multimedia object includes:
(1.1) triggering and playing the target multimedia continuous sending animation corresponding to the target multimedia object;
(1.2) detecting second touch time of touch operation in the multimedia continuous transmission area;
and (1.3) continuously playing the target multimedia continuous sending animation corresponding to the target multimedia object within the second touch time until the touch operation in the multimedia continuous sending area is finished, and stopping playing the target multimedia continuous sending animation.
When the target multimedia continuous sending animation corresponding to the target multimedia object is triggered and played, the target multimedia continuous sending animation may be an animation in which the target multimedia object continuously floats from the top to the bottom of the first session interface.
Further, the second touch time of the touch operation in the multimedia continuous sending area can be continuously detected; the touch operation in the multimedia continuous sending area is the long-press touch operation of the terminal user's finger after dragging the target multimedia object to the multimedia continuous sending area. The target multimedia continuous sending animation can be played continuously within the second touch time, and playing stops when the touch operation in the multimedia continuous sending area ends.
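The "play until the touch ends" behaviour can be sketched as a simple loop, as below. The touchend event and the promise-based playOnce callback are assumptions of this TypeScript sketch.

```typescript
// Keep replaying the continuous sending animation while the finger remains in
// the multimedia continuous sending area; stop when the touch ends.
function playWhileTouched(
  area: HTMLElement,
  playOnce: () => Promise<void>, // one pass of the object floating top to bottom
): void {
  let touching = true;
  area.addEventListener("touchend", () => { touching = false; }, { once: true });
  void (async () => {
    while (touching) {
      await playOnce();
    }
  })();
}
```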
In step 106, the target multimedia object is transmitted to the receiving terminal.
Finally, to achieve rich diversity of chat atmospheres, the target multimedia object may be transmitted to a receiving terminal.
In the related art, after receiving the target multimedia object, the receiving terminal only displays it in the session message frame. In the embodiment of the present application, by contrast, the target multimedia object can be displayed both in the session message frame of the second session interface of the receiving terminal and in the session background of the second session interface, which better highlights the chat atmosphere and makes the chat more interesting.
In some embodiments, the step of stopping playing the target multimedia repeating animation until the touch operation in the multimedia repeating area is finished further includes:
(2.1) detecting whether the second touch time is smaller than a preset time threshold value;
(2.2) when the second touch time is detected to be smaller than a preset time threshold, saving the second touch time;
(2.3) the step of sending the target multimedia object to the receiving terminal specifically comprises: sending the second touch time and the target image to a receiving terminal, so that the receiving terminal displays the target image in a session message frame of a second session interface and displays the target image in a session background of the second session interface based on the second touch time;
(2.4) when the second touch time is detected to be not less than a preset time threshold, saving the preset time threshold;
(2.5) the step of sending the target multimedia object to the receiving terminal specifically includes: and sending the preset time threshold and the target image to a receiving terminal, so that the receiving terminal displays the target image in a session message frame of a second session interface and displays the target image in a session background of the second session interface based on the preset time threshold.
In order to enrich the emotional expression of the terminal user, it may be detected whether the second touch time is less than a preset time threshold, where the preset time threshold is the longest time for displaying the target image in the session background of the second session interface, for example, 3 seconds.
When it is detected that the second touch time is less than the preset time threshold, for example 2 seconds, the second touch time of 2 seconds may be saved and used as the duration for which the target image is displayed in the session background of the second session interface of the receiving terminal. On this basis, the second touch time and the target image can be sent together to the receiving terminal, so that the receiving terminal not only displays the target image in the session message frame of the second session interface but also displays the target image in the session background of the second session interface based on the second touch time.
When it is detected that the second touch time is not less than the preset time threshold, for example 4 seconds, then, in order to prevent the target image from being maliciously displayed on the second session interface of the receiving terminal for a long time, the preset time threshold of 3 seconds may be saved instead and used as the duration for which the target image is displayed in the session background of the second session interface of the receiving terminal. On this basis, the preset time threshold and the target image can be sent together to the receiving terminal, so that the receiving terminal not only displays the target image in the session message frame of the second session interface but also displays the target image in the session background of the second session interface based on the preset time threshold. This realizes a personalized presentation of the target image and further improves the diversity of image processing.
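The threshold logic above amounts to clamping the duration that is sent, as in this TypeScript sketch; the 3-second value is the example threshold from the text, and the function name is an assumption.

```typescript
// Duration sent to the receiving terminal: the second touch time, capped by
// the preset time threshold so the background display cannot be prolonged.
const PRESET_TIME_THRESHOLD_S = 3; // example value from the text

function backgroundDisplaySeconds(secondTouchTimeS: number): number {
  return Math.min(secondTouchTimeS, PRESET_TIME_THRESHOLD_S);
}

// backgroundDisplaySeconds(2) === 2  -> the second touch time is saved and sent
// backgroundDisplaySeconds(4) === 3  -> the preset threshold is saved and sent
```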
It is understood that, in addition to displaying the target image according to time, whether to stop displaying the target image may also be determined according to whether the second session interface of the receiving terminal receives operation information from the user: when an operation by the user on the second session interface is detected, display of the target image is stopped; otherwise, the target image continues to be displayed.
In some embodiments, the multimedia object is a video, and the step of transmitting the target multimedia object to the receiving terminal includes: and sending the target video to a receiving terminal, so that the receiving terminal displays the target video in a session message box of a second session interface and plays the target video in a session background of the second session interface.
The second session interface may include operation controls for the background video, such as pause, close, and play, so that the user can operate the corresponding controls to pause, close, or play the background video.
Wherein, in order to realize rich diversity of chat atmosphere, the target video can be transmitted to the receiving terminal.
In the related art, after receiving the target video, the receiving terminal only displays the target video in the session message frame. In the embodiment of the present application, by contrast, the target video can be displayed in the session message frame of the second session interface and also played in the session background of the second session interface, which creates a better chat atmosphere and makes the chat more interesting.
In the present embodiment, description will be made from the viewpoint of an image processing apparatus which can be specifically integrated in a receiving terminal.
Referring to fig. 3a, fig. 3a is another schematic flow chart of an image processing method according to an embodiment of the present disclosure. The method flow can comprise the following steps:
in step 201, the target expression image and the target expression ejection animation sent by the sending terminal are received.
The receiving terminal can receive the target expression image and the target expression ejection animation sent by the sending terminal.
In step 202, the target emoticon ejection animation is played on the second session interface.
As shown in fig. 3b, in the embodiment of the present application, the target expression ejection animation can be played on the second session interface, so that the emotion of the sending terminal's user is better conveyed on the receiving terminal, thereby creating a better chat atmosphere and making the chat more interesting.
In step 203, the target expression image is displayed in a conversation message box of the second conversation interface.
As shown in fig. 3b, the receiving terminal 10 may further display the target expression image in the session message frame of the second session interface, so that the target expression ejection animation and the target expression image are displayed at the same time, which creates a better chat atmosphere and makes the chat more interesting.
In some embodiments, the step of displaying the target emoticon in a conversation message box of the second conversation interface includes:
(1) Acquiring the ejection times corresponding to the target expression image indicated by the target expression ejection animation;
(2) Adjusting the size information of the target expression image according to the ejection times;
(3) And displaying the target expression image with the adjusted size information in a conversation message frame of a second conversation interface by combining the ejection times.
In order to better enrich the display of the target expression image, the number of ejections corresponding to the target expression image indicated by the target expression ejection animation can be acquired. As described in the embodiment on the sending-terminal side, the larger the number of ejections, the larger the size information of the target expression image; therefore, the size information of the target expression image can be adjusted according to the number of ejections.
Further, referring again to fig. 3b, in combination with the number of ejections, the target expression image with the adjusted size information may be displayed in the session message frame 21 of the second session interface, and the session message frame 21 may display the size-adjusted target expression image together with a multiplier indicating the number of ejections.
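On the receiving side, the size adjustment and the multiplier display might look like the following TypeScript sketch. The DOM structure, the 1-percent-per-ejection rule mirrored from the sending side, and the "× count" rendering are assumptions.

```typescript
// Render the target expression image in the session message frame, scaled by
// the ejection count and followed by a multiplier such as "× 10".
function renderInMessageFrame(
  frame: HTMLElement,
  imageUrl: string,
  ejectionCount: number,
): void {
  const img = document.createElement("img");
  img.src = imageUrl;
  img.style.transform = `scale(${1 + ejectionCount * 0.01})`;

  const times = document.createElement("span");
  times.textContent = ` × ${ejectionCount}`;

  frame.append(img, times);
}
```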
Referring to fig. 3c, fig. 3c is a schematic flowchart of an image processing method according to an embodiment of the present disclosure. The image processing method comprises the following steps:
in step 204, the target multimedia object transmitted by the transmitting terminal is received.
In step 205, the target multimedia object is displayed in a session message box of the second session interface.
In step 206, the target multimedia object is displayed in the session context of the second session interface.
The receiving terminal can receive the target multimedia object sent by the sending terminal, display the target multimedia object in a session message frame of a second session interface, and display the target multimedia object in a session background of the second session interface.
In some embodiments, the multimedia object is an image, and the step of displaying the target multimedia object in the session background of the second session interface comprises:
(1) Receiving second touch time sent by the sending terminal;
(2) And displaying the target image in the session background of the second session interface based on the second touch time.
As shown in fig. 3d, the receiving terminal 10 may receive the target image sent by the sending terminal, display the target image in the session message frame 22 of the second session interface, and display the target image in the session background of the second session interface. The receiving terminal also receives the second touch time sent by the sending terminal, for example 2 seconds, and accordingly displays the target image in the session background of the second session interface for 2 seconds.
In some embodiments, the multimedia object is an image, and the step of displaying the target multimedia object in the session background of the second session interface includes:
(2.1) receiving a preset time threshold value sent by a sending terminal;
(2.2) displaying the target image in the session background of the second session interface based on the preset time threshold.
Similarly, a preset time threshold sent by the sending terminal, for example 4 seconds, may be received, and the target image is then displayed in the session background of the second session interface for 4 seconds.
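By way of illustration, the following Kotlin sketch shows how the background display duration might be derived, assuming (as in the sending-terminal embodiment) that the sender caps the second touch time at the preset threshold before sending, so the receiver can use the received value directly. All names and the 4-second default are illustrative assumptions.

```kotlin
// Hypothetical payload type; the field names are illustrative assumptions.
data class ImagePayload(val imageId: String, val displaySeconds: Int)

// The sender saves whichever of the second touch time and the preset
// threshold is smaller and sends that value with the target image.
fun senderDuration(secondTouchSeconds: Int, presetThresholdSeconds: Int = 4): Int =
    if (secondTouchSeconds < presetThresholdSeconds) secondTouchSeconds
    else presetThresholdSeconds

fun main() {
    val quickTap = ImagePayload("photo_07", senderDuration(secondTouchSeconds = 2))
    val longHold = ImagePayload("photo_08", senderDuration(secondTouchSeconds = 9))
    println("show ${quickTap.imageId} in the background for ${quickTap.displaySeconds} s") // 2 s: touch time used
    println("show ${longHold.imageId} in the background for ${longHold.displaySeconds} s") // capped at 4 s
}
```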
In some embodiments, the multimedia object is a video, and the step of displaying the target multimedia object in the session background of the second session interface comprises:
and playing the target video in the session background of the second session interface.
As shown in fig. 3e, the receiving terminal 10 may receive the target video sent by the sending terminal, display the target video in a session message frame 23 of a second session interface, and play the target video in a session background of the second session interface.
As can be seen from the above, in the embodiment of the present application, a target multimedia object sent by a sending terminal is received; the target multimedia object is displayed in a session message frame of a second session interface; and the target multimedia object is displayed in the session background of the second session interface. In this way, the target multimedia object is presented both in the session message frame and in the session background, which makes instant messaging more engaging and greatly enriches the diversity of image processing.
Referring to fig. 4, fig. 4 is a timing diagram of an image processing method according to an embodiment of the present disclosure. The method flow can comprise the following steps:
in step S1, the sending terminal sends the target expression image and the target expression ejection animation to the server.
The sending terminal can determine the target expression image and the target expression ejection animation through touch operation and dragging operation, and send the target expression image and the target expression ejection animation to the server.
In step S2, the server sends the target expression image and the target expression ejection animation to the receiving terminal.
And the server sends the target expression image and the target expression ejection animation to a receiving terminal which is in the same instant messaging environment with the sending terminal.
In step S3, the receiving terminal plays the target expression ejection animation on the second session interface, and displays the target expression image on the session message frame of the second session interface.
The receiving terminal plays the target expression ejection animation on the second session interface and simultaneously displays the target expression image in the session message frame of the second session interface, so that the emotion of the user of the sending terminal is better expressed, making instant messaging more engaging and diverse.
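By way of illustration, the following Kotlin sketch models the relay of steps S1 to S3 with a simple in-memory server. The type names and the callback-based delivery are illustrative assumptions, not details specified by this embodiment.

```kotlin
// Hypothetical packet and server types; all names are illustrative assumptions.
data class ExpressionPacket(val imageId: String, val animationId: String)

class RelayServer {
    private val receivers = mutableListOf<(ExpressionPacket) -> Unit>()

    // Receiving terminals in the same instant messaging session register a callback.
    fun register(onReceive: (ExpressionPacket) -> Unit) { receivers += onReceive }

    // Step S2: forward the packet to every registered receiving terminal.
    fun relay(packet: ExpressionPacket) = receivers.forEach { it(packet) }
}

fun main() {
    val server = RelayServer()
    // Step S3: the receiving terminal plays the animation and shows the image.
    server.register { p ->
        println("play ${p.animationId} on the second session interface; show ${p.imageId} in the message frame")
    }
    // Step S1: the sending terminal submits the image and the generated animation.
    server.relay(ExpressionPacket(imageId = "smile_01", animationId = "burst_x5"))
}
```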
In order to better implement the image processing method provided by the embodiments of the present application, an embodiment of the present application further provides an apparatus based on the image processing method. The meanings of the terms are the same as those in the image processing method above, and for implementation details, reference may be made to the description in the method embodiments.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure, where the image processing apparatus is applied to a sending terminal, and the image processing apparatus may include a first display unit 301, a first playing unit 302, a first sending unit 303, and the like.
The first display unit 301 is configured to display an expression input panel on the first session interface, where the expression input panel includes a plurality of expression images.
The first playing unit 302 is configured to play a target expression ejection animation corresponding to a target expression image in response to a touch operation of selecting the target expression image from the plurality of expression images and in response to a drag operation of dragging the target expression image to an expression ejection area.
In some embodiments, the first playing unit 302 is configured to:
responding to a touch operation of selecting a target expression image from the expression images and responding to a dragging operation of dragging the target expression image to an expression ejection area, and triggering and previewing an expression ejection animation corresponding to the target expression image;
detecting first touch time of touch operation in the expression ejection area;
and continuously adjusting the animation style of the expression ejection animation according to the change of the first touch time until the touch operation in the expression ejection area is finished, and generating the target expression ejection animation.
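By way of illustration, the following Kotlin sketch models how a longer first touch time might continuously translate into a richer animation style, here simplified to a growing ejection count. The function name, the 400 ms rate, and the use of simulated hold durations instead of real touch events are illustrative assumptions.

```kotlin
// Illustrative only: maps a press-and-hold duration (the first touch time)
// to an ejection count; the 400 ms rate is an arbitrary assumption.
fun ejectionCountFor(holdMillis: Long, millisPerEjection: Long = 400): Int =
    (holdMillis / millisPerEjection).toInt() + 1

fun main() {
    // Simulated hold durations standing in for the detected first touch time.
    for (holdMillis in listOf(100L, 900L, 2000L)) {
        println("hold $holdMillis ms -> preview animation with x${ejectionCountFor(holdMillis)} ejections")
    }
}
```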
A first sending unit 303, configured to send the target expression image and the target expression ejection animation to a receiving terminal, so that the receiving terminal displays the target expression image in a session message frame of a second session interface and plays the target expression ejection animation in the second session interface.
In some embodiments, the image processing apparatus further includes:
and the second display unit is used for displaying the expression ejection area in the currently displayed conversation area of the first conversation interface.
In some embodiments, the second display unit is configured to:
and displaying an expression ejection area formed by combining the mask component and the prompt component in a currently displayed conversation area of the first conversation interface.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure. The image processing apparatus is applied to a sending terminal and may include a first display unit 401, a first playing unit 402, a first sending unit 403, and the like.
A first display unit 401 for displaying an image input panel including a plurality of multimedia objects in a first session interface.
The first playing unit 402 is configured to play a target multimedia continuous sending animation corresponding to a target multimedia object in response to a touch operation of selecting the target multimedia object from the plurality of multimedia objects and in response to a drag operation of dragging the target multimedia object to a multimedia continuous sending area.
In some embodiments, the first playback unit 402 is configured to:
the multimedia object display device comprises a touch operation module, a multimedia object display module and a multimedia object display module, wherein the touch operation module is used for responding to a touch operation of selecting a target multimedia object from the multimedia objects and responding to a dragging operation of dragging the target multimedia object to a multimedia continuous sending area, and a target multimedia continuous sending animation corresponding to the target multimedia object is triggered and played;
detecting second touch time of touch operation in the multimedia continuous transmission area;
and continuously playing the target multimedia continuous sending animation corresponding to the target multimedia object within the second touch time until the touch operation in the multimedia continuous sending area is finished, and stopping playing the target multimedia continuous sending animation.
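By way of illustration, the following Kotlin sketch models the continuous playback loop: the animation repeats while the (simulated) touch is held during the second touch time and stops when the touch ends. The frame interval and all names are illustrative assumptions.

```kotlin
// Illustrative only: repeats the continuous sending animation while the
// (simulated) touch is held and stops it when the touch ends; the frame
// interval is an arbitrary assumption.
fun playBurst(secondTouchMillis: Long, frameMillis: Long = 500) {
    var elapsed = 0L
    while (elapsed < secondTouchMillis) { // touch still held
        println("play continuous sending animation frame at $elapsed ms")
        elapsed += frameMillis
    }
    println("touch ended after $secondTouchMillis ms: stop the animation")
}

fun main() = playBurst(secondTouchMillis = 1600)
```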
A first sending unit 403, configured to send the target multimedia object to a receiving terminal, so that the receiving terminal displays the target multimedia object in a session message box of a second session interface and displays the target multimedia object in a session background of the second session interface.
In some embodiments, the multimedia object is an image, and the image processing apparatus further includes:
the detection unit is used for detecting whether the second touch time is smaller than a preset time threshold value;
the first storage unit is used for storing the second touch time when the second touch time is detected to be smaller than a preset time threshold value;
the first sending unit 403 is further configured to send the second touch time and the target image to a receiving terminal, so that the receiving terminal displays the target image in a session message frame of a second session interface and displays the target image in a session background of the second session interface based on the second touch time;
the second storage unit is used for storing the preset time threshold when the second touch time is not less than the preset time threshold;
the first sending unit 403 is further configured to send the preset time threshold and the target image to a receiving terminal, so that the receiving terminal displays the target image in a session message frame of a second session interface and displays the target image in a session background of the second session interface based on the preset time threshold.
In some embodiments, the multimedia object is a video, and the first sending unit 403 is further configured to:
and sending the target video to a receiving terminal so that the receiving terminal displays the target video in a session message box of a second session interface and plays the target video in a session background of the second session interface.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present disclosure, where the image processing apparatus is applied to a receiving terminal, and the image processing apparatus may include a first receiving unit 501, a first playing unit 502, a first display unit 503, and the like.
A first receiving unit 501, configured to receive a target expression image and a target expression ejection animation sent by a sending terminal;
the first playing unit 502 is used for playing the target expression ejection animation on the second session interface;
a first display unit 503, configured to display the target emotion image in a session message box of the second session interface.
In some embodiments, the first display unit 503 is configured to:
acquiring the ejection times corresponding to the target expression image indicated by the target expression ejection animation;
adjusting the size information of the target expression image according to the ejection times;
and displaying the target expression image with the adjusted size information in a conversation message frame of a second conversation interface in combination with the ejection times.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an image processing apparatus applied to a receiving terminal according to an embodiment of the present disclosure, where the image processing apparatus may include a first receiving unit 601, a first playing unit 602, a first display unit 603, and the like.
A first receiving unit 601, configured to receive a target multimedia object sent by a sending terminal.
A first playing unit 602, configured to display the target multimedia object in a session message box of the second session interface.
A first display unit 603, configured to display the target multimedia object in a session background of the second session interface.
In some embodiments, the multimedia object is an image, and the first display unit 603 is configured to:
receiving second touch time sent by the sending terminal;
and displaying the target image in the session background of the second session interface based on the second touch time.
In some embodiments, the multimedia object is an image, and the first display unit 603 is further configured to:
receiving a preset time threshold sent by a sending terminal;
and displaying the target image in the session background of the second session interface based on the preset time threshold.
In some embodiments, the multimedia object is a video, and the first display unit 603 is further configured to:
and playing the target video in the session background of the second session interface.
Embodiments of the present application also provide a terminal, as shown in fig. 9, which may include Radio Frequency (RF) circuitry 701, a memory 702 including one or more computer-readable storage media, an input unit 703, a display unit 704, a sensor 705, an audio circuit 706, a Wireless Fidelity (WiFi) module 707, a processor 708 including one or more processing cores, and a power supply 709. Those skilled in the art will appreciate that the terminal structure shown in fig. 9 does not constitute a limitation of the terminal, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 701 may be used for receiving and transmitting signals during a message transmission or communication process, and in particular, for receiving downlink information of a base station and then sending the received downlink information to the one or more processors 708 for processing; in addition, data relating to uplink is transmitted to the base station. In general, RF circuitry 701 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, RF circuit 701 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), general Packet Radio Service (GPRS), code Division Multiple Access (CDMA), wideband Code Division Multiple Access (WCDMA), long Term Evolution (LTE), email, short Messaging Service (SMS), etc.
The memory 702 may be used to store software programs and modules, and the processor 708 performs various functional applications and image processing by running the software programs and modules stored in the memory 702. The memory 702 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal, and the like. Further, the memory 702 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 702 may also include a memory controller to provide the processor 708 and the input unit 703 with access to the memory 702.
The input unit 703 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. In particular, in one embodiment, the input unit 703 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near it (such as operations performed by the user on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) and drive the corresponding connection device according to a predetermined program. Alternatively, the touch-sensitive surface may comprise two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 708; it can also receive and execute commands sent from the processor 708. In addition, the touch-sensitive surface may be implemented using various types, such as resistive, capacitive, infrared, and surface acoustic wave types. The input unit 703 may include other input devices in addition to the touch-sensitive surface. In particular, the other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 704 may be used to display information input by or provided to the user as well as the various graphical user interfaces of the terminal, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 704 may include a display panel, and optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch-sensitive surface may overlay the display panel; when a touch operation is detected on or near the touch-sensitive surface, it is communicated to the processor 708 to determine the type of the touch event, and the processor 708 then provides a corresponding visual output on the display panel based on the type of the touch event. Although in fig. 9 the touch-sensitive surface and the display panel are implemented as two separate components for input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement the input and output functions.
The terminal may also include at least one sensor 705, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor, which can adjust the brightness of the display panel according to the brightness of ambient light, and a proximity sensor, which can turn off the display panel and/or the backlight when the terminal moves to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when the terminal is stationary, and can be used for applications that recognize the posture of the terminal (such as switching between landscape and portrait modes, related games, and magnetometer posture calibration), vibration-recognition-related functions (such as a pedometer and tapping), and the like; other sensors that can be configured in the terminal, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described in detail here.
The audio circuit 706, a speaker, and a microphone may provide an audio interface between the user and the terminal. The audio circuit 706 can transmit the electrical signal converted from the received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts the collected sound signal into an electrical signal, which is received by the audio circuit 706 and converted into audio data; the audio data is then output to the processor 708 for processing and subsequently transmitted, for example, to another terminal via the RF circuit 701, or output to the memory 702 for further processing. The audio circuit 706 may also include an earbud jack to provide communication between peripheral headphones and the terminal.
WiFi is a short-range wireless transmission technology. Through the WiFi module 707, the terminal can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 9 shows the WiFi module 707, it is understood that it is not an essential component of the terminal and may be omitted as needed within a scope that does not change the essence of the invention.
The processor 708 is the control center of the terminal; it connects the various parts of the entire terminal using various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing software programs and/or modules stored in the memory 702 and calling data stored in the memory 702, thereby monitoring the terminal as a whole. Optionally, the processor 708 may include one or more processing cores; preferably, the processor 708 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 708.
The terminal also includes a power supply 709 (e.g., a battery) for powering the various components. Preferably, the power supply may be logically coupled to the processor 708 via a power management system, so that charging, discharging, and power consumption are managed through the power management system. The power supply 709 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and any other such components.
Although not shown, the terminal may further include a camera, a bluetooth module, and the like, which will not be described herein. Specifically, in this embodiment, the processor 708 in the terminal loads the executable file corresponding to the process of one or more application programs into the memory 702 according to the following instructions, and the processor 708 runs the application programs stored in the memory 702, thereby implementing various functions:
displaying an expression input panel on a first conversation interface, wherein the expression input panel comprises a plurality of expression images; responding to a touch operation of selecting a target expression image from the expression images and responding to a dragging operation of dragging the target expression image to an expression ejection area, and playing a target expression ejection animation corresponding to the target expression image; and sending the target expression image and the target expression ejection animation to a receiving terminal, so that the receiving terminal displays the target expression image in a session message frame of a second session interface and plays the target expression ejection animation in the second session interface.
The foregoing embodiments each have their own emphasis; for parts not described in detail in one embodiment, reference may be made to the detailed description of the image processing method above, which is not repeated here.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, the present application provides a computer storage medium, in which a plurality of instructions are stored, and the instructions can be loaded by a processor to execute the steps in any one of the image processing methods provided by the embodiments of the present application. For example, the instructions may perform the steps of:
displaying an expression input panel on a first session interface, wherein the expression input panel comprises a plurality of expression images; responding to a touch operation of selecting a target expression image from the expression images and responding to a dragging operation of dragging the target expression image to an expression ejection area, and playing a target expression ejection animation corresponding to the target expression image; and sending the target expression image and the target expression ejection animation to a receiving terminal, so that the receiving terminal displays the target expression image in a session message frame of a second session interface and plays the target expression ejection animation in the second session interface.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided in the various alternative implementations provided by the embodiments described above.
For the specific implementation of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
Wherein the computer storage medium may include: Read-Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the computer storage medium can execute the steps in any image processing method provided in the embodiments of the present application, the beneficial effects that can be achieved by any image processing method provided in the embodiments of the present application can be achieved, which are detailed in the foregoing embodiments and will not be described again here.
The foregoing detailed description has provided the image processing method, apparatus, computer storage medium, device, and system according to the embodiments of the present application. Specific examples are used herein to explain the principles and implementations of the present application, and the descriptions of the foregoing embodiments are intended only to help understand the method and the core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (20)

1. An image processing method, characterized in that the method comprises:
displaying an expression input panel on a first session interface, wherein the expression input panel comprises a plurality of expression images;
responding to a touch operation of selecting a target expression image from the expression images and responding to a dragging operation of dragging the target expression image to an expression ejection area, and playing a target expression ejection animation corresponding to the target expression image;
and sending the target expression image and the target expression ejection animation to a receiving terminal, so that the receiving terminal displays the target expression image in a session message frame of a second session interface and plays the target expression ejection animation in the second session interface.
2. The image processing method according to claim 1, wherein after the step of responding to the touch operation of selecting the target expression image from the plurality of expression images, the method further comprises:
and displaying an expression ejection area in the currently displayed conversation area of the first conversation interface.
3. The image processing method according to claim 2, wherein the step of displaying an expression ejection area in the currently displayed session area of the first session interface comprises:
and displaying an expression ejection area formed by combining the mask component and the prompt component in a currently displayed conversation area of the first conversation interface.
4. The image processing method according to any one of claims 1 to 3, wherein the step of playing the target expression ejection animation corresponding to the target expression image includes:
triggering and previewing an expression ejection animation corresponding to the target expression image;
detecting first touch time of touch operation in the expression ejection area;
and continuously adjusting the animation style of the expression ejection animation according to the change of the first touch time until the touch operation in the expression ejection area is finished, and generating a target expression ejection animation.
5. An image processing method, characterized in that the method comprises:
displaying an image input panel in a first session interface, wherein the image input panel comprises a plurality of multimedia objects;
responding to a touch operation of selecting a target multimedia object from the plurality of multimedia objects and responding to a dragging operation of dragging the target multimedia object to a multimedia continuous sending area, and playing a target multimedia continuous sending animation corresponding to the target multimedia object;
and sending the target multimedia object to a receiving terminal, so that the receiving terminal displays the target multimedia object in a session message frame of a second session interface and displays the target multimedia object in a session background of the second session interface.
6. The image processing method according to claim 5, wherein after the step of responding to the touch operation of selecting the target multimedia object from the plurality of multimedia objects, the method further comprises:
and displaying a multimedia continuous sending area in the currently displayed session area of the first session interface.
7. The image processing method according to claim 5, wherein the step of playing the target multimedia continuous animation corresponding to the target multimedia object comprises:
triggering and playing a target multimedia continuous-sending animation corresponding to the target multimedia object;
detecting second touch time of touch operation in the multimedia continuous transmission area;
and continuously playing the target multimedia continuous sending animation corresponding to the target multimedia object within the second touch time until the touch operation in the multimedia continuous sending area is finished, and stopping playing the target multimedia continuous sending animation.
8. The image processing method according to claim 6, wherein the multimedia object is an image, and wherein, after the step of stopping playing the target multimedia continuous sending animation when the touch operation in the multimedia continuous sending area is finished, the method further comprises:
detecting whether the second touch time is smaller than a preset time threshold value;
when the second touch time is detected to be smaller than a preset time threshold value, saving the second touch time;
the step of sending the target multimedia object to a receiving terminal specifically includes:
sending the second touch time and the target image to a receiving terminal, so that the receiving terminal displays the target image in a session message frame of a second session interface and displays the target image in a session background of the second session interface based on the second touch time;
when the second touch time is detected to be not less than a preset time threshold, saving the preset time threshold;
the step of sending the target multimedia object to a receiving terminal specifically includes:
and sending the preset time threshold and the target image to a receiving terminal, so that the receiving terminal displays the target image on a conversation message frame of a second conversation interface and displays the target image on a conversation background of the second conversation interface based on the preset time threshold.
9. The image processing method according to claim 5, wherein the multimedia object is a video, and the step of transmitting the target multimedia object to a receiving terminal comprises:
and sending the target video to a receiving terminal, so that the receiving terminal displays the target video in a session message frame of a second session interface and plays the target video in a session background of the second session interface.
10. An image processing method, characterized in that the method comprises:
receiving a target expression image and a target expression ejection animation sent by a sending terminal;
playing the target expression ejection animation on a second session interface; and
and displaying the target expression image in a session message frame of the second session interface.
11. The image processing method according to claim 10, wherein the step of displaying the target expression image in a conversation message box of the second conversation interface includes:
acquiring the ejection times corresponding to the target expression image indicated by the target expression ejection animation;
adjusting the size information of the target expression image according to the ejection times;
and displaying the target expression image with the adjusted size information on a conversation message frame of a second conversation interface by combining the ejection times.
12. An image processing method, characterized by comprising:
receiving a target multimedia object sent by a sending terminal;
displaying the target multimedia object in a session message frame of a second session interface; and
and displaying the target multimedia object in the session background of the second session interface.
13. The image processing method according to claim 12, wherein the multimedia object is an image, and the step of displaying the target multimedia object in the session background of the second session interface comprises:
receiving second touch time sent by the sending terminal;
and displaying the target image in a session background of the second session interface based on the second touch time.
14. The image processing method according to claim 12, wherein the multimedia object is an image, and the step of displaying the target multimedia object in the session background of the second session interface comprises:
receiving a preset time threshold sent by a sending terminal;
and displaying the target image in a session background of the second session interface based on the preset time threshold.
15. The image processing method according to claim 12, wherein the multimedia object is a video, and the step of displaying the target multimedia object in the session background of the second session interface comprises:
and playing the target video in the session background of the second session interface.
16. An image processing apparatus characterized by comprising:
the first display unit is used for displaying an expression input panel on a first session interface, and the expression input panel comprises a plurality of expression images;
the first playing unit is used for responding to touch operation of selecting a target expression image from the expression images and responding to dragging operation of dragging the target expression image to an expression ejection area, and playing a target expression ejection animation corresponding to the target expression image;
and the first sending unit is used for sending the target expression image and the target expression ejection animation to a receiving terminal so that the receiving terminal can display the target expression image in a session message frame of a second session interface and play the target expression ejection animation in the second session interface.
17. An image processing apparatus characterized by comprising:
the first receiving unit is used for receiving the target expression image and the target expression ejection animation sent by the sending terminal;
the first playing unit is used for playing the target expression ejection animation on the second session interface;
and the first display unit is used for displaying the target expression image in a conversation message frame of the second conversation interface.
18. A computer storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the image processing method according to any one of claims 1 to 15.
19. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps in the image processing method of any one of claims 1 to 15 when executing the computer program.
20. An image processing system, characterized in that the system comprises: a transmitting terminal, a server and a receiving terminal;
the transmitting terminal includes the image processing apparatus according to claim 16;
the server is used for receiving the target expression image and the target expression ejection animation sent by the sending terminal and forwarding the target expression image and the target expression ejection animation to the receiving terminal;
the receiving terminal comprises the image processing apparatus according to claim 17.
CN202110374597.6A 2021-04-07 2021-04-07 Image processing method, image processing device, computer storage medium, equipment and system Pending CN115185418A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110374597.6A CN115185418A (en) 2021-04-07 2021-04-07 Image processing method, image processing device, computer storage medium, equipment and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110374597.6A CN115185418A (en) 2021-04-07 2021-04-07 Image processing method, image processing device, computer storage medium, equipment and system

Publications (1)

Publication Number Publication Date
CN115185418A true CN115185418A (en) 2022-10-14

Family

ID=83512529

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110374597.6A Pending CN115185418A (en) 2021-04-07 2021-04-07 Image processing method, image processing device, computer storage medium, equipment and system

Country Status (1)

Country Link
CN (1) CN115185418A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023098318A1 (en) * 2021-11-30 2023-06-08 腾讯科技(深圳)有限公司 Session-based information display method and apparatus, and device, storage medium and program product


Similar Documents

Publication Publication Date Title
TWI674555B (en) Emoticon display method, apparatus, computer-readable storage medium and terminal
CN106973330B (en) Screen live broadcasting method, device and system
CN109491738B (en) Terminal device control method and terminal device
CN111666009B (en) Interface display method and electronic equipment
CN108958606B (en) Split screen display method and device, storage medium and electronic equipment
CN105094501B (en) Method, device and system for displaying messages in mobile terminal
CN110673770B (en) Message display method and terminal equipment
CN106775194B (en) Application interface switching method and device
CN112312144B (en) Live broadcast method, device, equipment and storage medium
CN108712577A (en) A kind of call mode switching method and terminal device
CN108600079B (en) Chat record display method and mobile terminal
CN108052258B (en) Terminal task processing method, task processing device and mobile terminal
CN110221765B (en) Video file playing method and device, storage medium and terminal
CN108744495A (en) A kind of control method of virtual key, terminal and computer storage media
CN109660445B (en) Message processing method, device and storage medium
CN109166164B (en) Expression picture generation method and terminal
CN110688051B (en) Screen recording operation method and device, computer readable storage medium and terminal
CN110908757B (en) Method and related device for displaying media content
CN115185418A (en) Image processing method, image processing device, computer storage medium, equipment and system
CN111372003A (en) Camera switching method and device and terminal
CN107807876B (en) Split screen display method, mobile terminal and storage medium
CN110796438A (en) Message sending method and mobile terminal
CN106850413B (en) Instant messaging information processing method and device
CN111522674B (en) Cross-application processing method of multimedia content and electronic equipment
CN106803916B (en) Information display method and device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40075272

Country of ref document: HK

SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination