WO2019024700A1 - Expression display method, apparatus and computer readable storage medium - Google Patents

Expression display method, apparatus and computer readable storage medium

Info

Publication number
WO2019024700A1
WO2019024700A1, PCT/CN2018/096609, CN2018096609W
Authority
WO
WIPO (PCT)
Prior art keywords
expression
target
terminal
display
target expression
Prior art date
Application number
PCT/CN2018/096609
Other languages
English (en)
French (fr)
Inventor
栗绍峰
吴昊
朱明浩
杨晓明
陈煜聪
叶振鹏
玉绍祖
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Publication of WO2019024700A1 publication Critical patent/WO2019024700A1/zh
Priority to US16/569,515 priority Critical patent/US11204684B2/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment using icons
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0486 Drag-and-drop
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/02 Details
    • H04L12/16 Arrangements for providing special services to substations
    • H04L12/18 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07 User-to-user messaging in packet-switching networks characterised by the inclusion of specific contents
    • H04L51/10 Multimedia information
    • H04L51/52 User-to-user messaging in packet-switching networks for supporting social networking services

Definitions

  • the embodiments of the present application relate to the field of Internet technologies, and in particular, to an expression display method, apparatus, and computer readable storage medium.
  • the related technology is generally implemented in the following manner: after detecting that the interactive party clicks on an expression in the expression selection window, the terminal displays the selected expression sequentially on the message display interface. For example, if no message is currently displayed on the message display interface, the terminal displays the expression at the first fixed message display location on the interface; if a message is currently displayed, the terminal displays the expression at the first display location after that message.
  • because the selected expression can only be displayed on the message display interface in sequence, the expression display method lacks vividness, the method is too limited, and the display effect is poor.
  • an embodiment of the present application provides an expression display method, apparatus, and computer readable storage medium.
  • the technical solution is as follows:
  • in a first aspect, an expression display method is provided, which is applied to a first terminal, and the method includes:
  • after the first terminal obtains a drag command for the target expression selected in the expression selection window, the first terminal moves the target expression according to the obtained drag track;
  • the first terminal displays a placement prompt message in the process of moving the target expression
  • the first terminal obtains a stop drag command for the target expression on the message display interface and, in response to the stop drag command, after receiving a placement confirmation command triggered by the placement prompt message, displays the target expression at the first target position where the drag stopped.
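The drag, stop, and confirm/cancel flow described in the first aspect can be sketched as a small state machine (an illustrative Python sketch; the class, method, and state names are assumptions, not part of the patent):

```python
class ExpressionDrag:
    """Sketch of the drag lifecycle: drag command -> move along the
    drag track -> stop drag command -> placement confirmation (or
    cancellation). State names are illustrative only."""

    def __init__(self, expression):
        self.expression = expression
        self.state = "idle"
        self.position = None

    def start_drag(self):
        # Corresponds to obtaining the drag command for the target
        # expression selected in the expression selection window.
        self.state = "dragging"

    def move(self, x, y):
        # The target expression follows the obtained drag track.
        if self.state == "dragging":
            self.position = (x, y)

    def stop_drag(self):
        # The stop drag command fixes the expression in place, but the
        # placement prompt remains until the user confirms or cancels.
        if self.state == "dragging":
            self.state = "awaiting_confirmation"

    def confirm(self):
        # The placement confirmation command: the current position
        # becomes the first target position.
        if self.state == "awaiting_confirmation":
            self.state = "placed"
            return self.position

    def cancel(self):
        # The cancel placement instruction removes the expression.
        if self.state == "awaiting_confirmation":
            self.state = "cancelled"
            self.position = None
```

The key property sketched here is that stopping the drag does not itself place the expression; a separate confirmation step fixes the first target position.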
  • in a second aspect, an expression display method is provided, which is applied to a second terminal, and the method includes:
  • the second terminal receives the expression display data sent by the first terminal, where the expression display data includes at least first coordinate position information, first screen size information, and identification information of at least one display element;
  • the second terminal determines a second target position according to the first coordinate position information, the first screen size information, and its own second screen size information; and
  • at the second target position, the second terminal superimposes the target expression on the at least one display element for display, in a display manner placed on the top layer.
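The way a second target position can be derived from the first coordinate position information and the two screen sizes can be sketched as a proportional mapping (an illustrative Python sketch; the function and parameter names are assumptions, not part of the patent):

```python
def map_to_second_target(first_pos, first_size, second_size):
    """Scale a coordinate from the first terminal's screen to the
    second terminal's screen so that the target expression keeps the
    same relative position on both message display interfaces.

    first_pos:   (x, y) first coordinate position on terminal 1
    first_size:  (width, height) of terminal 1's screen
    second_size: (width, height) of terminal 2's screen
    """
    x, y = first_pos
    w1, h1 = first_size
    w2, h2 = second_size
    # Preserve the relative position: x/w1 == x'/w2 and y/h1 == y'/h2.
    return (x * w2 / w1, y * h2 / h1)
```

For example, a position at the center of a 1080x1920 screen maps to the center of a 720x1280 screen, so the expression appears at the same relative location on both terminals.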
  • in a third aspect, an expression display apparatus is provided for use in a first terminal, the apparatus including one or more processors and one or more memories storing program units, where the program units are executed by the processors, and the program units include:
  • a processing module configured to, after a drag command for the target expression selected in the expression selection window is obtained, move the target expression according to the obtained drag track;
  • a first display module configured to display a placement prompt message during the process of moving the target expression
  • a second display module configured to obtain a stop drag command for the target expression on the message display interface and, in response to the stop drag command, after a placement confirmation command triggered by the placement prompt message is received, display the target expression at the first target position where the drag stopped.
  • in a fourth aspect, an expression display apparatus is provided for use in a second terminal, the apparatus including one or more processors and one or more memories storing program units, where the program units are executed by the processors, and the program units include:
  • a receiving module configured to receive the expression display data sent by the first terminal, where the expression display data includes at least first coordinate position information, first screen size information, and identification information of at least one display element;
  • a determining module configured to determine, according to the identification information of the at least one display element, a display range area of the target expression on the message display interface;
  • an obtaining module configured to acquire second screen size information of the second terminal;
  • the determining module is further configured to determine a second target location in the display range area according to the first screen size information, the second screen size information, and the first coordinate position information;
  • a display module configured to, at the second target position, superimpose the target expression on the at least one display element for display, in a display manner placed on the top layer.
  • in a fifth aspect, a computer readable storage medium is provided, where the storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the expression display method of the first aspect, or is loaded and executed by the processor to implement the expression display method of the second aspect.
  • in a sixth aspect, a terminal is provided, including a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the expression display method of the first aspect or the expression display method of the second aspect.
  • because a drag-and-drop operation on the selected expression is supported from the expression selection window, and during the drag the expression is allowed to be placed anywhere on the message display interface, the expression display mode is more vivid, the interaction methods are more diversified, and the display effect is better.
  • FIG. 1 is a schematic diagram of a message display interface provided by an embodiment of the present application.
  • FIG. 2 is a system architecture diagram of an expression display method according to an embodiment of the present application.
  • FIG. 3 is a flowchart of an expression display method provided by an embodiment of the present application.
  • FIG. 4A is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4B is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4C is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4D is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4E is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4F is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4G is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4H is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4I is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4J is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4K is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4L is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4M is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4N is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4O is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4P is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4Q is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4R is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4S is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 4T is a schematic diagram of another message display interface provided by an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an expression display device according to an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of another expression display device according to an embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of a terminal according to an embodiment of the present application.
  • Emoticons (expressions) are a form of popular culture that became widespread as social applications grew active; they are used to express specific emotions, mainly the thoughts and feelings conveyed by a face or a posture.
  • expressions can be generally divided into symbol expressions, static picture expressions, dynamic picture expressions, and the like.
  • an expression can be based on a face that expresses various human emotions, or on currently popular stars, quotations, anime, or film and television screenshots, together with a series of matching texts.
  • All In One refers to the message display interface provided in the social application, such as the friend chat interface or the group chat interface, which is used to display expressions.
  • an expression supports being dragged onto the message display interface, and can be affixed anywhere on it, even if a text message or another expression is already displayed at that location.
  • the expression display method provided by the embodiment of the present application is mainly used for a friend interaction scene or a group interaction scene.
  • in the interaction scene, if the interactive party selects an expression by clicking in the expression selection window, the selected expression is displayed in sequence on the message display interfaces of both interactive parties.
  • the expression A is displayed before the expression B.
  • the sent expression A is displayed with the left side of the interface as a reference point, extending toward the longitudinal central axis, and the sent expression B is displayed with the right side of the interface as a reference point, extending toward the longitudinal central axis.
  • the embodiment of the present application proposes that an expression can be dragged to the message display interface, placed anywhere on it, and also hidden and called up again.
  • this interactive method enables the user to better express personal feelings during interaction with friends in the relationship chain, enhances emotional expression in non-face-to-face online interaction, increases the activity of the friend relationship chain, and adds interactive pleasure and a better user experience.
  • FIG. 2 is a system architecture diagram of an expression display method according to an embodiment of the present application.
  • the system architecture includes a first terminal, a server, and a second terminal.
  • the first terminal and the second terminal may be smart phones, tablet computers, and the like, which is not specifically limited in this embodiment. For a one-to-one interaction scenario, the second terminal includes one terminal; for a group interaction scenario, the second terminal includes multiple terminals.
  • the same social application is installed on the first terminal and the second terminal, and the first user of the first terminal and the second user of the second terminal interact based on the social application, which also maintains a friend relationship chain for the first user and the second user: the first user is in the friend relationship chain of the second user, and the second user is in the friend relationship chain of the first user.
  • the expression display process may be briefly described as follows: after the first terminal obtains a drag command for the target expression selected in the expression selection window, it moves the target expression according to the obtained drag track and displays a placement prompt message for the target expression; afterwards, if the first terminal obtains a stop drag command for the target expression on the message display interface, it responds to the stop drag command and, after receiving a placement confirmation command triggered by the placement prompt message, displays the target expression at the first target position where the drag stopped.
  • the first target location may be any location on the message display interface, even if a text message or other expression is displayed at the location.
  • the first terminal further sends the related expression display data to the second terminal through transparent transmission by the server, so that the second terminal achieves the same expression display effect as the first terminal.
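The expression display data relayed through the server can be sketched as a simple payload (an illustrative Python sketch; the field names are assumptions, while the required contents follow the description above):

```python
def build_expression_display_data(expression_id, first_pos, first_size, element_ids):
    """Assemble the expression display data that the first terminal
    relays to the second terminal via the server. Field names here
    are hypothetical; the contents mirror the patent's description:
    first coordinate position information, first screen size
    information, and identification of the display elements."""
    return {
        "expression_id": expression_id,          # which expression to show
        "first_coordinate_position": first_pos,  # where it was placed
        "first_screen_size": first_size,         # sender screen dimensions
        "display_element_ids": element_ids,      # display elements it overlaps
    }
```

With this data, the second terminal has everything it needs to reproduce the placement: the overlapped elements fix the display range area, and the two screen sizes plus the first coordinate position fix the second target position.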
  • the embodiment of the present application provides a new expression interaction method by sensing the user's gesture operation. See the following examples for detailed implementations of triggering drag and drop operations, arbitrarily placing, hiding emoticons, and evoking expressions.
  • FIG. 3 is a flowchart of an expression display method provided by an embodiment of the present application.
  • the method process provided by the embodiment of the present application includes:
  • the first terminal moves the target expression according to the obtained drag track, and displays a placement prompt message during the process of moving the target expression.
  • the expression selection window is used to display a plurality of different expressions for the user to input the expression.
  • the first terminal determines the expression F as the target expression.
  • the first terminal determines that a drag command for the expression F has been obtained, acquires in real time the drag track formed by the sliding operation, and then moves the expression F according to the drag track.
  • the duration of the long press operation may be 1 second or 2 seconds, etc., which is not specifically limited in this embodiment of the present application.
  • a long press operation may be distinguished from a single click operation by its duration.
  • the acquisition of the drag track can be implemented by the touch sensitive component on the first terminal, which is not specifically limited in this embodiment of the present application.
  • the first terminal may display, in the peripheral area of the expression F, an enlarged version of the expression F in a display manner placed on the top layer.
  • a placement prompt message for the expression F is also displayed during the movement of the expression F.
  • the placement prompt message is essentially a placement prompt icon, and the placement prompt icon is attached to the expression F.
  • the placement prompt message moves as the expression F moves.
  • the placement prompt message may also be placed in the upper left corner and the lower left corner of the expression F, or the upper left corner and the upper right corner, or the lower left corner and the lower right corner, which are not specifically limited in this embodiment of the present application.
  • since the expression F is used for interaction with others, it is usually placed on the message display interface, so the placement prompt message may be displayed only after the expression F has been moved onto the message display interface; of course, it may also be displayed as soon as the expression F starts to move.
  • during the continuous sliding operation, the expression F is displayed in real time at the contact position between the first user's finger and the terminal screen; that is, wherever the first user's finger slides, the expression F is displayed there.
  • the sliding operation can be performed on the message display interface. Whether the expression passes over a blank area of the message display interface during the movement, or over a non-blank area where a text message or another expression is displayed, the expression F is always displayed on the message display interface in a display manner placed on the top layer.
  • the first terminal obtains a stop drag command for the target expression on the message display interface and, in response to the stop drag command, after receiving a placement confirmation command triggered by the placement prompt message, displays the target expression at the first target position.
  • the first terminal obtains a stop drag command for the target expression and responds to the stop drag command.
  • the target expression does not continue to move, but is displayed at a fixed position as shown in any of FIG. 4C to FIG. 4E. Since the first terminal cannot yet determine whether the first user actually wants to place the target expression at this fixed position, the placement prompt message remains attached to the target expression.
  • the first terminal receives the placement confirmation command and determines that the first user actually wants to place the target expression at the fixed position, so the fixed position where the target expression is currently located is determined as the first target position, and the target expression is displayed at the first target position without the placement prompt message. That is, the presentation of the target expression after successful placement may be as shown in FIG. 4G. At this point, the target expression has been sent successfully.
  • the current expression selection window is restored to its previous operable state, and the target expression area is restored to its original display style; at this time, similar expression drag-and-drop and placement processing can continue.
  • the first terminal receives the cancel placement instruction and determines that the first user cancels placing the target expression at the fixed position, so the display of the target expression is cancelled on the message display interface, that is, the target expression disappears from the message display interface.
  • the current expression selection window is restored to its previous operable state, and the original expression display area is restored to its original display style; at this time, similar expression drag-and-drop and placement processing can continue, which is not specifically limited in this embodiment of the present application.
  • the embodiment of the present application can implement the above-described expression display method because the visual rendering layer of the message display interface is redrawn.
  • the message presentation interface includes a visual rendering layer, a message container placed on the visual rendering layer, and an input control placed on the message container.
  • the message container is a layer that accommodates the message generated by the first user and the second user during the interaction process.
  • the message container may be rectangular, circular, or irregular in shape, which is not specifically limited in this embodiment of the present application. Additionally, the area in the message container used for presenting messages may be opaque, and the other areas of the message container may be transparent or translucent.
  • Input controls are controls for input and can include input boxes.
  • a user interface (UI) component is used to draw content on the visual rendering layer by using the drawing method of the rendering object, so that the user can, through a long press or a similar trigger, drag an expression onto the message display interface for free placement, hide the expression, and call up the expression again.
  • in the embodiment of the present application, a message generated by input through the input box is displayed in the message container, while the display of the target expression is implemented by the visual rendering layer; this separates the display of regular messages from this special expression display, achieving a better display effect.
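The separation between regular messages (message container) and freely placed expressions (visual rendering layer) can be sketched as follows (an illustrative Python sketch; the class and attribute names are assumptions, not part of the patent):

```python
class MessagePresentationInterface:
    """Sketch of the structure described above: regular messages live
    in the message container, while freely placed expressions are
    drawn on the visual rendering layer, keeping the two separate."""

    def __init__(self):
        self.message_container = []       # regular messages
        self.visual_rendering_layer = []  # freely placed expressions

    def add_message(self, text):
        # Messages generated through the input box go into the container.
        self.message_container.append(text)

    def place_expression(self, expression, position):
        # Freely placed expressions are drawn on the visual rendering
        # layer, so they do not disturb the regular message flow.
        self.visual_rendering_layer.append((expression, position))
```

Because the two collections never mix, scrolling or re-laying-out the regular message flow does not require redrawing the placed expressions, and vice versa.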
• After the visual rendering layer is available, the first terminal can display the target expression in the following manners.
• In the first manner, the target expression displayed at the first target position is drawn on the visual rendering layer in a display manner placed on the top layer, so that the target expression is superimposed on the message frame for display.
  • the first mode corresponds to the expression display mode shown in FIG. 4G.
  • the first target position may correspond to a geometric center point of the target expression.
  • the position of an expression or a message frame can be referred to by the position of its geometric center point.
• The embodiment of the present application can be configured to only superimpose such randomly placed expressions on an original message frame for display. That is, if the first user attempts to place the target expression in a purely blank area on the message presentation interface, the placement will not succeed.
• Specifically, the first terminal may determine whether a message frame is displayed at the first target position; if a message frame is displayed at the first target position, the step of displaying the target expression at the first target position is executed; if no message frame exists at the first target position, a relocation prompt message is displayed.
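The hit test described above can be sketched as follows. This is a minimal illustration under assumed data shapes (message frames as axis-aligned `(x, y, width, height)` rectangles, positions as geometric center points, per the convention noted below), not the actual implementation; all names are hypothetical.

```python
def find_frame_at(position, message_frames):
    """Return the first message frame containing `position`, or None.

    `position` is the (x, y) geometric center of the dragged expression;
    each frame is an axis-aligned (x, y, width, height) rectangle.
    A None result means the drop point is blank, so the terminal would
    show the relocation prompt instead of drawing the expression.
    """
    px, py = position
    for frame in message_frames:
        x, y, w, h = frame
        if x <= px <= x + w and y <= py <= y + h:
            return frame
    return None
```

A drop whose center lands inside a frame returns that frame and the expression is drawn; otherwise the caller falls back to the relocation prompt.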
  • the relocation prompt message may be displayed in a strip display manner on an edge area of the message display interface to minimize excessive coverage of the message display interface.
• The message frame displays a message generated through conventional input-box input.
  • the generated message may be a plain text message, or may be only an expression, or may be a combination of a text message and an expression. This embodiment of the present application does not specifically limit this.
• In the second manner, when other expressions are already superimposed on the message frame, the target expression displayed at the first target position is drawn on the visual rendering layer in a display manner placed on the top layer, so that the target expression is superimposed on the other expressions.
  • the expression E is superimposed on the expression F.
• When the message frame, the target expression, and the other expressions have overlapping parts, it can be determined that the message frame, the target expression, and the other expressions are simultaneously displayed at the first target position.
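The simultaneous-display determination above can be sketched as a rectangle-intersection test. This is a minimal illustration with hypothetical `(x, y, width, height)` bounds and element names, not the actual implementation:

```python
def rects_overlap(a, b):
    """True if two (x, y, w, h) rectangles share any area."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah


def elements_at_position(target_rect, elements):
    """Names of the display elements (message frame and/or other
    expressions) whose bounds overlap the target expression's bounds."""
    return [name for name, rect in elements if rects_overlap(target_rect, rect)]
```

If the returned list contains both the message frame and another expression, all three are considered displayed at the first target position.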
• In the third manner, if the first target position is blank, the target expression displayed at the first target position is drawn directly on the visual rendering layer.
  • the embodiment of the present application can also be configured to support displaying the randomly placed expression in a blank space. That is, if the first user wants to place the target expression in a purely blank area on the message display interface, it will also be placed successfully without the support of the message frame.
• After the dragging and arbitrary placement of the target expression is completed on the first terminal, the target expression on the message display interface of the second terminal should have the same display effect, instead of the target expression being displayed in a conventional manner.
  • the embodiment of the present application further includes the following steps 303 to 307.
  • the first terminal sends the expression display data to the server, where the expression display data includes at least first coordinate position information, first screen size information, and identification information of at least one display element.
• The expression display data is generated by the first terminal in the following manner: the first terminal acquires the first screen size information of the first terminal, determines at least one display element associated with the target expression, and calculates first coordinate position information of the target expression relative to the at least one display element.
• The at least one display element is a message frame and/or an expression displayed at the first target position. That is, when only a message frame is displayed at the first target position, the at least one display element is only the message frame, as in the case shown in FIG. 4E. When other expressions are superimposed on the message frame displayed at the first target position, the at least one display element includes the message frame and the other expressions, as shown in FIG. 4J. Of course, if placing an expression in a blank space is supported, the at least one display element may also include only other expressions.
• The first screen size information is also acquired because terminals come in various sizes. To synchronize terminals of different sizes so that terminals of various sizes have a consistent display effect when displaying the target expression, it is necessary to perform relative position conversion between the target expression and the at least one display element based on the difference in screen size information between the terminals.
• For example, if the overlapping portion between the target expression and the message frame is 30%, then the overlapping portion between the target expression and the message frame on the message display interface of the second terminal must also be ensured to be 30%.
• Alternatively, the first coordinate position information may be absolute position information. That is, in addition to performing the conversion of the coordinate position on the message display interface of the second terminal according to relative first coordinate position information and the first screen size information, the corresponding conversion may also be performed according to absolute first coordinate position information and the first screen size information; this is not specifically limited in the embodiment of the present application.
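One way the first terminal might assemble the expression display data, normalizing the target expression's offset from its anchor element by the first screen size so that a differently sized second terminal can reproduce the same relative layout. The field and function names here are illustrative, not the actual on-wire format:

```python
def build_display_data(target_center, element_rect, first_screen, element_ids):
    """First terminal: express the target expression's position relative
    to the anchor display element, normalized by this terminal's screen
    size, alongside the screen size and the element identifiers."""
    sw, sh = first_screen
    ex, ey, _, _ = element_rect        # anchor element (x, y, w, h)
    cx, cy = target_center             # geometric center of the expression
    return {
        "relative_pos": ((cx - ex) / sw, (cy - ey) / sh),  # first coordinate position information
        "screen_size": first_screen,                       # first screen size information
        "element_ids": element_ids,                        # identification information
    }
```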
• The identification information of the at least one display element is used to help the second terminal quickly locate the target expression within the entire content displayed on the message display interface.
• When sending the expression display data, the first terminal may multiplex the original message channel used for transmitting message data, which is not specifically limited in this embodiment of the present application.
• After receiving the expression display data, the server sends the expression display data to the second terminal.
  • the server can also multiplex the original message channel used for transmitting the message data, and send the expression display data to the second terminal.
• After receiving the expression display data sent by the first terminal, the second terminal determines, according to the identification information of the at least one display element, the display range area of the target expression on the message display interface.
• The display area of the at least one display element, together with its peripheral area, determines the display range area of the target expression.
  • the second terminal acquires its second screen size information, and determines a second target location in the display range area according to the first screen size information, the second screen size information, and the first coordinate position information.
• The second terminal may first calculate, according to the first screen size information, the second screen size information, and the first coordinate position information, second coordinate position information of the target expression relative to the at least one display element on the message display interface of the second terminal, and then calculate the second target position based on the second coordinate position information and the position information of the at least one display element.
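The conversion on the second terminal can be sketched correspondingly: scale the offset measured on the first terminal by the ratio of the two screen sizes, then anchor it to the element's local position. This is a sketch under assumed data shapes (a pixel offset plus both screen sizes); the names are hypothetical:

```python
def second_target_position(first_offset, first_screen, second_screen, element_pos):
    """Second terminal: scale the (dx, dy) offset from the anchor element,
    measured on the first terminal, by the ratio of the two screen sizes,
    then add the anchor element's position on this terminal. This preserves
    the relative layout (e.g. the same 30% overlap with the message frame)."""
    dx, dy = first_offset
    sw1, sh1 = first_screen
    sw2, sh2 = second_screen
    ex, ey = element_pos
    return (ex + dx * sw2 / sw1, ey + dy * sh2 / sh1)
```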
• At the second target position, the second terminal superimposes the target expression on the at least one display element for display, in a display manner placed on the top layer.
• When displaying the target expression, the second terminal also performs the drawing on the visual rendering layer, in a manner similar to the display manner on the message display interface of the first terminal described above, so that the presentation manner of the target expression on the message display interface of the second terminal is the same as that on the first terminal.
  • the dragging and placing of the expression E may be continued as shown in FIG. 4I. That is, the embodiment of the present application supports dragging and placing an expression multiple times. In addition, the embodiment of the present application further supports the operations of enlarging, reducing, and rotating the placed expression after placing the expression on the message display interface. For details, refer to the following description:
• After acquiring a zoom instruction for the placed expression, the first terminal acquires a target zoom ratio that matches the zoom instruction, and then scales the expression according to the target zoom ratio.
  • the scaling process may be either an enlargement process or a reduction process.
• When the placed expression E is scaled, the scaling can be realized by a two-finger operation.
• When performing the enlargement process, the two fingers can be placed on the figure and slid in directions gradually moving away from each other; the larger the sliding distance, the larger the enlargement ratio of the expression E.
  • the first terminal may preset the correspondence between the sliding distance and the magnification ratio, and then calculate how much the expression E should be enlarged according to the obtained sliding distance.
• In practice, the first terminal only needs to detect that there are two contact positions on the expression E, regardless of whether the two contact positions are produced by two hands or one hand; as the two contact positions gradually move apart, it is determined that the first user is zooming in on the expression E.
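The two-contact zoom detection can be sketched as follows. The linear distance-to-ratio rule is a hypothetical stand-in for the terminal's preset correspondence between sliding distance and zoom ratio:

```python
import math


def pinch_zoom_ratio(start_contacts, end_contacts, ratio_per_px=0.005, min_ratio=0.1):
    """Two contact positions moving apart enlarge the expression; contact
    positions moving closer together shrink it. The larger the sliding
    distance, the larger the change, per a preset (here: linear) rule."""
    d_start = math.dist(*start_contacts)
    d_end = math.dist(*end_contacts)
    return max(min_ratio, 1.0 + (d_end - d_start) * ratio_per_px)
```

Contacts spreading from 100 px apart to 200 px apart would yield a ratio above 1 (enlargement); contacts approaching yield a ratio below 1 (reduction).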
• When performing the reduction process, the two fingers can be placed on the figure and slid in directions gradually approaching each other; the larger the sliding distance, the larger the reduction ratio of the expression E.
• Similarly, the first terminal only needs to detect that there are two contact positions on the expression E and that the two contact positions gradually approach each other, and it can be determined that the first user is performing the reduction process on the expression E.
• The placement prompt information described above may be displayed again to remind the first user about placement.
• If the first terminal detects the placement confirmation operation of the first user, the scaled expression E is displayed; if the first terminal detects the cancellation confirmation operation of the first user, the initial expression E before scaling is displayed.
• The zoom data is also sent to the second terminal through the server, so that the second terminal synchronously displays the expression E on the message display interface according to the zoom data. The zoom data includes at least the expression E and the target zoom ratio of the expression E.
  • FIG. 4K to 4M are schematic views of the rotation of the expression E.
• FIG. 4K shows a relatively common manner: when the first terminal detects that there is a contact position on the expression E, and the contact position slides in a counterclockwise or clockwise direction, it is determined that the first user is rotating the expression E.
  • the first terminal may preset the correspondence between the sliding amplitude and the rotation angle, and further calculate how many degrees the expression E should be rotated according to the obtained sliding amplitude.
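A sketch of such a preset correspondence between sliding amplitude and rotation angle; the 0.5-degree-per-pixel rate and the function name are hypothetical example values, not taken from the source:

```python
def rotation_angle(slide_px, direction, degrees_per_px=0.5):
    """Map the sliding amplitude of the contact position to a rotation
    angle via a preset linear correspondence; clockwise slides yield
    positive angles, counterclockwise slides yield negative angles."""
    angle = slide_px * degrees_per_px
    return angle if direction == "clockwise" else -angle
```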
• The placement prompt information described above may be displayed again, as shown in FIG. 4K to FIG. 4M, to remind the first user about placement.
• After the first user rotates the expression E, as shown in FIG. 4L, if the first terminal detects the placement confirmation operation of the first user, the rotated expression E is displayed; if the first terminal detects the cancellation confirmation operation of the first user, the initial expression E before rotation is displayed.
• On the second terminal, the rotated expression E is also displayed synchronously.
  • the rotation data of the expression E is sent to the second terminal through the server, so that the second terminal synchronously displays the expression E on the message display interface according to the rotation data.
• The rotation data includes at least the expression E, the target rotation direction of the expression E, and the target rotation angle.
• The embodiment of the present application further supports hiding and recalling a placed expression through gesture operations. For example, if there are too many expressions placed on the current message display interface, which affects the first user's viewing of messages, the placed expressions can be hidden. Further, after hiding a placed expression, a recall operation can be performed on the hidden expression, so that the previously hidden expression is displayed again on the message display interface.
• After acquiring the hidden display instruction, the first terminal controls the placed expression to move from the first target position to the preset end position according to the preset movement trajectory; at the same time, the transparency and size of the placed expression are adjusted during the movement until the display of the placed expression is cancelled.
  • the current message display interface includes a plurality of placed expressions, which seriously affects the first user's viewing of the message, and the first user can perform a hidden operation on the placed expression.
  • a hidden operation is supported to hide all the expressions placed on the current message display interface at one time.
  • the method for obtaining the hidden display instruction may be: taking the center of the screen as a demarcation point, and the two fingers gradually sliding to the sides from the middle of the screen. That is, when the first terminal detects that the two contact positions are gradually moving away from the left and right sides of the screen, it is determined that the first user performs a hidden operation on the placed expression.
  • the preset end position is the left and right sides of the screen.
• The preset end position can be randomly set to the left side or the right side of the screen, or set according to the distance from the two sides of the screen: the preset end position of an expression closer to the left side of the screen is the left side of the screen, and the preset end position of an expression closer to the right side of the screen is the right side of the screen.
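The distance-based choice of end position can be sketched as a nearest-edge rule; this is a minimal illustration (the randomized variant would simply pick a side at random), with hypothetical names:

```python
def hide_end_side(expression_x, screen_width):
    """Expressions on the left half of the screen slide out toward the
    left edge; expressions on the right half slide out toward the right."""
    return "left" if expression_x <= screen_width / 2 else "right"
```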
• During the hiding process, the placed expressions are also moved; a placed expression usually moves according to the preset movement trajectory during the movement process.
  • the preset movement trajectory may be a linear trajectory, a wavy trajectory, a curved trajectory, etc., which is not specifically limited in this embodiment of the present application.
• The transparency and size of each expression may also be adjusted during the movement of the placed expressions. For example, as shown in FIGS. 4O to 4Q, the closer a placed expression gets to the preset end position, the smaller it becomes and the higher its transparency. When the user's finger slides to the edge of the screen, the placed expression disappears completely, its transparency becoming fully transparent. The faster the user's finger slides, the faster the placed expression disappears; that is, the transparency of each expression increases faster and the size decreases faster. After hiding the placed expressions, the message display interface is as shown in FIG. 4R.
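The size and transparency adjustment during the hide animation can be sketched as a linear interpolation over the movement progress. The linear easing is a hypothetical choice; a real implementation might also advance `progress` faster when the finger slides faster, as described above:

```python
def hide_animation_frame(progress, base_size, base_opacity=1.0):
    """Interpolate the placed expression's size and opacity along the hide
    trajectory. `progress` runs from 0.0 (first target position) to 1.0
    (preset end position); at 1.0 the expression has fully vanished."""
    progress = min(max(progress, 0.0), 1.0)
    return (base_size * (1.0 - progress),     # smaller as it nears the edge
            base_opacity * (1.0 - progress))  # 0.0 == fully transparent
```

The recall animation described later would run the same interpolation in reverse, restoring the original size and transparency as the expression returns.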
• The preset end position may also be the upper and lower sides of the screen, the upper left corner and the lower right corner of the screen, or the lower left corner and the upper right corner of the screen, which is not specifically limited in this embodiment of the present application.
• After acquiring the recall instruction, the first terminal controls the placed expression to move from the preset end position back to the first target position according to the preset movement trajectory, and adjusts the transparency and size of the target expression during the movement until the target expression returns to its original size and original transparency. The closer a placed expression gets to its original position, the larger it becomes and the lower its transparency.
  • the method provided by the embodiment of the present application supports the selected expression drag and drop operation in the expression selection window when performing the expression display, and supports the expression to be randomly placed on the message display interface during the process of dragging and dropping the expression.
  • the expression display method is more vivid, the interaction mode is more diverse, and the display effect is better.
  • the embodiment of the present application also supports operations such as zooming in, zooming out, rotating, hiding, and recalling the placed expressions, thereby enriching the display style of the expressions, and making the interaction more diverse.
  • FIG. 5 is a schematic structural diagram of an expression display device according to an embodiment of the present application.
  • the apparatus includes one or more processors, and one or more memories storing program units, wherein the program units are executed by the processor, the program units including:
  • the processing module 501 is configured to: after the drag selection instruction for the selected target expression is acquired in the expression selection window, move the target expression according to the acquired drag track;
  • the first display module 502 is configured to display a placement prompt message during the process of moving the target expression
• The second display module 503 is configured to acquire a drag stop instruction for the target expression on the message display interface and, in response to the drag stop instruction, after receiving the placement confirmation instruction triggered based on the placement prompt message, display the target expression at the first target position where the drag stopped.
• The program units further include:
  • An obtaining module configured to acquire a target scaling ratio that matches the scaling instruction after acquiring a scaling instruction for the target expression
  • the processing module is further configured to perform scaling processing on the target expression according to the target scaling ratio to obtain a scaled target expression
  • the display module is configured to display the scaled target expression at the first target location.
• The program units further include:
  • An acquiring module configured to acquire a target rotation direction and a target rotation angle that match the rotation instruction after acquiring a rotation instruction to the target expression
  • the processing module is further configured to rotate the target expression according to the target rotation direction and the target rotation angle to obtain a rotated target expression
  • the display module is configured to display the rotated target expression at the first target position.
  • the message presentation interface includes a visual rendering layer
  • the second display module 503 is configured to: if the message frame is displayed at the first target location, display the display at the first target location on the visual rendering layer in a presentation manner placed on the top layer The target expression at the location, so that the target expression is superimposed on the message frame; or if a message frame is displayed at the first target location, and the other expressions are superimposed on the message frame And displaying the target expression displayed at the first target position on the visual rendering layer in a display manner placed on the top layer, so that the target expression overlay is displayed on the other expression; or If the first target position is blank, the target expression displayed at the first target position is drawn on the visual rendering layer.
• The second display module 503 is further configured to, after acquiring the hidden display instruction for the target expression, control the target expression to move from the first target position to a preset end position according to a preset movement trajectory, and adjust the transparency and size of the target expression during the movement until the display of the target expression is cancelled.
• The program units further include:
  • An acquiring module configured to acquire first screen size information of the first terminal
  • a determining module configured to determine at least one display element associated with the target expression, the presentation element being a message frame and/or an expression displayed at the first target location;
  • a calculation module configured to calculate first coordinate position information of the target expression relative to the at least one display element
  • the sending module is configured to send the emoticon display data to the second terminal, so that the second terminal displays the target emoticon according to the emoticon display data, where the emoticon display data includes at least the first coordinate position Information, the first screen size information, and identification information of the at least one display element;
  • the second user of the second terminal is located in a friend relationship chain of the first user of the first terminal.
  • the device provided by the embodiment of the present application supports the selected expression drag operation in the expression selection window when the expression is displayed, and supports the expression to be randomly placed on the message display interface during the process of dragging and dropping the expression.
  • the expression display method is more vivid, the interaction mode is more diverse, and the display effect is better.
  • the embodiment of the present application also supports operations such as zooming in, zooming out, rotating, hiding, and recalling the placed expressions, thereby enriching the display style of the expressions, and making the interaction more diverse.
  • FIG. 6 is a schematic structural diagram of an expression display device according to an embodiment of the present application.
  • the apparatus includes one or more processors, and one or more memories storing program units, wherein the program units are executed by the processor, the program units comprising:
  • the receiving module 601 is configured to receive the emoticon display data sent by the first terminal, where the emoticon display data includes at least the first coordinate position information, the first screen size information, and the identification information of the at least one display element. ;
  • the determining module 602 is configured to determine, according to the identification information of the at least one display element, a display range area of the target expression on the message display interface;
  • the obtaining module 603 is configured to acquire second screen size information of the second terminal
  • the determining module is further configured to determine a second target location in the display range area according to the first screen size information, the second screen size information, and the first coordinate position information;
• The display module 604 is configured to superimpose the target expression on the at least one display element for display at the second target position, in a display manner placed on the top layer.
  • the device provided by the embodiment of the present application supports the selected expression drag operation in the expression selection window when the expression is displayed, and supports the expression to be randomly placed on the message display interface during the process of dragging and dropping the expression.
  • the expression display method is more vivid, the interaction mode is more diverse, and the display effect is better.
  • the embodiment of the present application also supports operations such as zooming in, zooming out, rotating, hiding, and recalling the placed expressions, thereby enriching the display style of the expressions, and making the interaction more diverse.
• The expression display device provided in the foregoing embodiment is only illustrated by the division of the above functional modules when performing expression display. In actual applications, the functions may be distributed to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the expression display device provided in the foregoing embodiment belongs to the same concept as the embodiments of the expression display method; for the specific implementation process, refer to the method embodiments, which are not described here again.
  • FIG. 7 is a schematic structural diagram of an electronic terminal according to an embodiment of the present disclosure, where the electronic terminal can be used to execute the expression display method provided in the foregoing embodiment.
  • the terminal 700 includes:
• a radio frequency (RF) circuit 110, a memory 120 including one or more computer readable storage media, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a Wireless Fidelity (WiFi) module 170, a processor 180 having one or more processing cores, a power supply 190, and the like. It will be understood by those skilled in the art that the terminal structure shown in FIG. 7 does not constitute a limitation on the terminal, and may include more or fewer components than those illustrated, combine certain components, or use different component arrangements. Among them:
• The RF circuit 110 can be configured to receive and send signals during information transmission or a call; in particular, after receiving downlink information from a base station, the RF circuit 110 delivers it to one or more processors 180 for processing; in addition, it sends uplink data to the base station.
• The RF circuit 110 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like.
  • RF circuitry 110 can also communicate with the network and other devices via wireless communication.
• Wireless communication can use any communication standard or protocol, including but not limited to Global System of Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
  • the memory 120 can be configured to store software programs and modules, and the processor 180 executes various functional applications and data processing by running software programs and modules stored in the memory 120.
• The memory 120 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application required for at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the terminal 700 (such as audio data and a phone book) and the like.
• The memory 120 can include a high speed random access memory, and can also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid state storage device. Accordingly, the memory 120 may also include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
  • the input unit 130 can be configured to receive input numeric or character information and to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function controls.
  • input unit 130 can include touch-sensitive surface 131 as well as other input devices 132.
• The touch-sensitive surface 131, also referred to as a touch screen or touch pad, can collect touch operations of the user on or near it (such as operations performed by the user with a finger, a stylus, or any other suitable object or accessory on or near the touch-sensitive surface 131), and drive the corresponding connecting device according to a preset program.
  • the touch-sensitive surface 131 can include both a touch detection device and a touch controller.
• The touch detection device detects the touch orientation of the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends the coordinates to the processor 180; it can also receive commands from the processor 180 and execute them.
• In addition, the touch-sensitive surface 131 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave types.
  • the input unit 130 can also include other input devices 132.
  • other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackballs, mice, joysticks, and the like.
  • Display unit 140 can be configured to display information entered by the user or information provided to the user and various graphical user interfaces of terminal 700, which can be comprised of graphics, text, icons, video, and any combination thereof.
  • the display unit 140 may include a display panel 141.
  • the display panel 141 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
• Further, the touch-sensitive surface 131 may cover the display panel 141. When the touch-sensitive surface 131 detects a touch operation on or near it, the operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event.
• Although in FIG. 7 the touch-sensitive surface 131 and the display panel 141 are implemented as two separate components to implement the input and output functions, in some embodiments the touch-sensitive surface 131 can be integrated with the display panel 141 to implement the input and output functions.
  • Terminal 700 can also include at least one type of sensor 150, such as a light sensor, motion sensor, and other sensors.
• The light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor may adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the terminal 700 moves close to the ear.
• As one type of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in all directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used for applications that recognize the attitude of the mobile phone (such as landscape/portrait switching, related games, and magnetometer attitude calibration), vibration-recognition-related functions (such as a pedometer and tapping), and the like. The terminal 700 can also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described here again.
  • the audio circuit 160, the speaker 161, and the microphone 162 can provide an audio interface between the user and the terminal 700.
  • the audio circuit 160 can transmit the electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output; on the other hand, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data. After the audio data is processed by the processor 180, it is sent, for example via the RF circuit 110, to another terminal, or output to the memory 120 for further processing.
  • the audio circuit 160 may also include an earbud jack to provide communication of the peripheral earphones with the terminal 700.
  • WiFi is a short-range wireless transmission technology.
  • through the WiFi module 170, the terminal 700 can help users send and receive e-mails, browse web pages, and access streaming media; it provides users with wireless broadband Internet access.
  • the processor 180 is the control center of the terminal 700, connecting the various parts of the entire handset through various interfaces and lines; by running or executing the software programs and/or modules stored in the memory 120 and invoking the data stored in the memory 120, it performs the various functions of the terminal 700 and processes data, thereby monitoring the mobile phone as a whole.
  • the processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor and a modem processor, where the application processor mainly processes an operating system, a user interface, an application, and the like.
  • the modem processor primarily handles wireless communications. It can be understood that the above modem processor may not be integrated into the processor 180.
  • the terminal 700 also includes a power source 190 (such as a battery) for powering the various components.
  • the power source can be logically connected to the processor 180 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
  • Power supply 190 may also include any one or more of a DC or AC power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
  • the terminal 700 may further include a camera, a Bluetooth module, and the like, and details are not described herein again.
  • the display unit of the terminal is a touchscreen display, and the terminal further includes a memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the sticker presentation method described in the above embodiments.
  • a person skilled in the art can understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing related hardware; the program may be stored in a computer-readable storage medium.
  • the storage medium mentioned may be a read-only memory, a magnetic disk, an optical disc, or the like.

Abstract

Embodiments of this application disclose a sticker presentation method and apparatus and a computer-readable storage medium, which belong to the field of Internet technologies. The method includes: after obtaining, in a sticker selection window, a drag instruction for a selected target sticker, a first terminal moves the target sticker along the obtained drag track; the first terminal displays a placement prompt message while moving the target sticker; the first terminal obtains, on a message display interface, a drag-stop instruction for the target sticker and, in response to the drag-stop instruction and after receiving a placement confirmation instruction triggered through the placement prompt message, displays the target sticker at a first target position where the drag stopped. During sticker presentation, the embodiments of this application support dragging a sticker selected in the sticker selection window and, while the sticker is being dragged, placing it anywhere on the message display interface; this sticker presentation manner is more vivid, offers more diverse interaction, and has a better display effect.

Description

Sticker presentation method and apparatus, and computer-readable storage medium
This application claims priority to Chinese Patent Application No. 201710639778.0, filed with the Chinese Patent Office on July 31, 2017 and entitled "Sticker presentation method and apparatus, and computer-readable storage medium", which is incorporated herein by reference in its entirety.
Technical Field
Embodiments of this application relate to the field of Internet technologies, and in particular to a sticker presentation method and apparatus and a computer-readable storage medium.
Background
In the mobile Internet era, driven by the continuous development of social networking, the way people communicate with one another has changed accordingly: from the earliest plain-text communication, to the gradual use of simple symbols and emoticons, to an increasingly diverse sticker culture. In other words, stickers are a popular culture that formed after social applications became active. For example, to give both parties a good communication experience when a user interacts with friends, such social applications also support a sticker presentation function; that is, either party in an interaction can present stickers to the other party on the message display interface.
In the related art, sticker presentation is usually implemented as follows: after detecting that one party taps a sticker in the sticker selection window, the terminal displays the selected sticker in sequence on the message display interface. For example, if no message is currently displayed on the message display interface, the terminal displays the sticker at the fixed first message display position of the interface; if messages are currently displayed, the terminal displays the sticker at the first message display position after those messages.
In the process of implementing the embodiments of this application, the inventors found that the related art has at least the following problem:
During sticker presentation, a selected sticker can only be displayed in sequence on the message display interface, so this presentation manner lacks vividness, is too monotonous, and has a poor display effect.
Summary
To resolve the problem in the related art, embodiments of this application provide a sticker presentation method and apparatus and a computer-readable storage medium. The technical solutions are as follows:
According to a first aspect, a sticker presentation method is provided, applied to a first terminal, the method including:
after obtaining, in a sticker selection window, a drag instruction for a selected target sticker, moving, by the first terminal, the target sticker along the obtained drag track;
displaying, by the first terminal, a placement prompt message while moving the target sticker; and
obtaining, by the first terminal on a message display interface, a drag-stop instruction for the target sticker, and in response to the drag-stop instruction and after receiving a placement confirmation instruction triggered through the placement prompt message, displaying the target sticker at a first target position where the drag stopped.
According to a second aspect, a sticker presentation method is provided, applied to a second terminal, the method including:
receiving, by the second terminal, sticker display data sent by a first terminal, the sticker display data including at least first coordinate position information, first screen size information, and identification information of at least one display element;
determining, by the second terminal on the message display interface according to the identification information of the at least one display element, a display range area of the target sticker;
obtaining, by the second terminal, second screen size information of the second terminal;
determining, by the second terminal, a second target position within the display range area according to the first screen size information, the second screen size information, and the first coordinate position information; and
displaying, by the second terminal at the second target position, the target sticker superimposed on the at least one display element in a top-layer display manner.
According to a third aspect, a sticker presentation apparatus is provided, applied to a first terminal, the apparatus including one or more processors and one or more memories storing program units, where the program units are executed by the processors and include:
a processing module, configured to move, after a drag instruction for a selected target sticker is obtained in a sticker selection window, the target sticker along the obtained drag track;
a first display module, configured to display a placement prompt message while the target sticker is being moved; and
a second display module, configured to obtain, on a message display interface, a drag-stop instruction for the target sticker, respond to the drag-stop instruction, and, after a placement confirmation instruction triggered through the placement prompt message is received, display the target sticker at a first target position where the drag stopped.
According to a fourth aspect, a sticker presentation apparatus is provided, applied to a second terminal, the apparatus including one or more processors and one or more memories storing program units, where the program units are executed by the processors and include:
a receiving module, configured to receive sticker display data sent by a first terminal, the sticker display data including at least first coordinate position information, first screen size information, and identification information of at least one display element;
a determining module, configured to determine, on the message display interface according to the identification information of the at least one display element, a display range area of the target sticker;
an obtaining module, configured to obtain second screen size information of the second terminal;
the determining module being further configured to determine a second target position within the display range area according to the first screen size information, the second screen size information, and the first coordinate position information; and
a display module, configured to display, at the second target position, the target sticker superimposed on the at least one display element in a top-layer display manner.
According to a fifth aspect, a computer-readable storage medium is provided, storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the sticker presentation method according to the first aspect, or to implement the sticker presentation method according to the second aspect.
According to a sixth aspect, a terminal is provided, including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the sticker presentation method according to the first aspect, or to implement the sticker presentation method according to the second aspect.
The beneficial effects brought by the technical solutions provided in the embodiments of this application are as follows:
During sticker presentation, dragging a sticker selected in the sticker selection window is supported, and while the sticker is being dragged, it can be placed anywhere on the message display interface; this sticker presentation manner is therefore more vivid, offers more diverse interaction, and has a better display effect.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description show merely some embodiments of this application, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
FIG. 1 is a schematic diagram of a message display interface according to an embodiment of this application;
FIG. 2 is a diagram of a system architecture involved in a sticker presentation method according to an embodiment of this application;
FIG. 3 is a flowchart of a sticker presentation method according to an embodiment of this application;
FIG. 4A to FIG. 4T are schematic diagrams of other message display interfaces according to embodiments of this application;
FIG. 5 is a schematic structural diagram of a sticker presentation apparatus according to an embodiment of this application;
FIG. 6 is a schematic structural diagram of another sticker presentation apparatus according to an embodiment of this application;
FIG. 7 is a schematic structural diagram of a terminal according to an embodiment of this application.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of this application clearer, the following further describes the implementations of this application in detail with reference to the accompanying drawings.
Sticker: a sticker is a popular culture that formed after social applications became active; it is used to express a specific emotion, mainly thoughts and feelings shown on a face or in a posture. Stickers can generally be classified into symbol stickers, static picture stickers, animated picture stickers, and the like. For example, a sticker may use human faces expressing various emotions as material, or use currently popular celebrities, quotations, animations, or film and television screenshots as material, matched with a series of corresponding text.
Message display interface (All In One, AIO): the message display interface provided in a social application, such as a one-to-one chat interface or a group chat interface, used for displaying stickers.
Free placement: a sticker can be dragged to the message display interface and attached anywhere on it, even if a text message or another sticker is already displayed at the current position.
Next, the implementation scenarios and the system architecture involved in the sticker presentation method provided in the embodiments of this application are briefly described.
The sticker presentation method provided in the embodiments of this application is mainly used in a friend interaction scenario or a group interaction scenario. In such an interaction scenario nowadays, assuming that one party selects a sticker in the sticker selection window through a tap operation, the selected sticker is displayed in sequence on the message display interfaces of all interacting parties.
For example, in FIG. 1, because sticker A is sent later than sticker B, sticker A is displayed before sticker B. In addition, because sticker A and sticker B come from different users, for ease of distinction, the sent sticker A is displayed extending from the left side of the interface toward the vertical center axis, and the sent sticker B is displayed extending from the right side of the interface toward the vertical center axis.
Because this sticker presentation manner is too monotonous and lacks vividness, the embodiments of this application propose an interaction manner in which a sticker can be dragged to the message display interface, placed anywhere on it, and further hidden and called out again. This enables a user to better express personal emotions when interacting with friends in the friend relationship chain, enhances emotional expression in online, non-face-to-face interaction, improves the activity of the friend relationship chain, adds enjoyment to interaction, and provides a better user experience.
FIG. 2 is a diagram of a system architecture involved in a sticker presentation method according to an embodiment of this application. Referring to FIG. 2, the system architecture includes a first terminal, a server, and a second terminal.
The first terminal and the second terminal may be smartphones, tablet computers, or the like, which is not specifically limited in the embodiments of this application. In a one-to-one interaction scenario, the second terminal includes one terminal; in a one-to-many interaction scenario, the second terminal includes multiple terminals. In addition, the same social application is installed on the first terminal and the second terminal; a first user of the first terminal and a second user of the second terminal interact based on the social application, and the social application maintains a friend relationship chain for each of the first user and the second user, where the first user is in the friend relationship chain of the second user, and the second user is in the friend relationship chain of the first user.
In the embodiments of this application, the sticker presentation process can be briefly described as follows: after obtaining, in the sticker selection window, a drag instruction for a selected target sticker, the first terminal moves the target sticker along the obtained drag track and displays a placement prompt message for the target sticker; then, if the first terminal obtains a drag-stop instruction for the target sticker on the message display interface, the first terminal responds to the drag-stop instruction and, after receiving a placement confirmation instruction triggered through the placement prompt message, displays the target sticker at the first target position where the drag stopped.
The first target position may be any position on the message display interface, even one at which a text message or another sticker is displayed. The first terminal also sends the related sticker display data to the second terminal through transparent transmission by the server, so that the second terminal has the same sticker display effect as the first terminal.
In this way, an interaction manner is implemented in which a sticker is dragged to the message display interface and placed anywhere through a trigger operation such as a long press; in other words, the embodiments of this application provide a new sticker interaction manner by sensing the user's gesture operations. For detailed implementations of triggering the drag operation, free placement, hiding a sticker, and calling out a sticker again, refer to the following embodiments.
FIG. 3 is a flowchart of a sticker presentation method according to an embodiment of this application. Referring to FIG. 3, the procedure provided in this embodiment of this application includes the following steps:
301. After obtaining, in the sticker selection window, a drag instruction for a selected target sticker, the first terminal moves the target sticker along the obtained drag track and displays a placement prompt message while moving the target sticker.
The sticker selection window is used for displaying multiple different stickers for the user to input. In this embodiment of this application, referring to FIG. 4A, if the first terminal detects that the first user triggers a long-press operation on sticker F in the sticker selection window, the first terminal determines sticker F as the target sticker. Then, if the first terminal detects that the first user performs a slide operation starting from this position, the first terminal determines that a drag instruction for sticker F is obtained, obtains in real time the drag track formed by the slide operation, and moves sticker F along the drag track.
The duration of the long-press operation may be 1 second, 2 seconds, or the like, which is not specifically limited in this embodiment of this application; the duration only needs to be long enough to distinguish the long press from a single tap. The drag track may be obtained by a touch-sensitive element on the first terminal, which is likewise not specifically limited in this embodiment of this application. In addition, after detecting the first user's long press on sticker F, to facilitate the subsequent drag operation on sticker F, the first terminal may display an enlarged version of sticker F in the area around sticker F in a top-layer display manner, as shown in FIG. 4B. When the first user performs a drag operation on this enlarged sticker F, sticker F is dragged out of the sticker selection window.
In another embodiment, a placement prompt message for sticker F is also displayed while sticker F is being moved. As shown in FIG. 4C to FIG. 4F, the placement prompt message is essentially a pair of placement prompt icons attached to sticker F. For example, the cancel-placement icon (Figure PCTCN2018096609-appb-000001) in the placement prompt message is placed at the upper right corner of sticker F, and the confirm-placement icon (Figure PCTCN2018096609-appb-000002) is placed at the lower right corner of sticker F. The placement prompt message moves along with sticker F. In addition, the placement prompt message may also be placed at the upper left and lower left corners, the upper left and upper right corners, or the lower left and lower right corners of sticker F, which is not specifically limited in this embodiment of this application.
It should be noted that, because sticker F is used for communicating and interacting with others and is therefore usually placed on the message display interface, the placement prompt message may be displayed only after sticker F has moved onto the message display interface. Of course, it may also be displayed as soon as sticker F starts to move.
In addition, as shown in FIG. 4C to FIG. 4E, to reflect the drag effect on sticker F in this embodiment of this application, while the slide operation continues, sticker F follows in real time and is displayed at the contact point between the first user's finger and the terminal screen. That is, sticker F is displayed wherever the first user's finger slides.
In another embodiment, referring to FIG. 4F, after sticker F starts to be dragged, the other stickers in the sticker selection window are all set to an inoperable state, and a shadow left by the drag is displayed in the area where sticker F was located. Alternatively, the entire sticker selection window is set to an inoperable state after sticker F starts to be dragged.
In another embodiment, while sticker F is being moved, the slide operation can be performed anywhere on the message display interface. Whether it passes through a blank area of the message display interface or a non-blank area where a text message or another sticker is displayed, sticker F is always displayed on the message display interface in a top-layer display manner.
302. The first terminal obtains a drag-stop instruction for the target sticker on the message display interface, responds to the drag-stop instruction, and, after receiving a placement confirmation instruction triggered through the placement prompt message, displays the target sticker at the first target position where the drag stopped.
In this embodiment of this application, after the first user stops dragging the target sticker, that is, after performing the slide operation, if the first terminal detects that the dwell time at a certain position exceeds a specified duration, the first terminal obtains a drag-stop instruction for the target sticker and responds to it. At this point the target sticker no longer moves but is displayed at a fixed position, as shown in any one of FIG. 4C to FIG. 4E. Because the first terminal cannot yet determine whether the first user really intends to place the target sticker at this fixed position, the placement prompt message remains attached to the target sticker.
As shown in FIG. 4E, if the first user taps the confirm-placement icon (Figure PCTCN2018096609-appb-000003) at this point, the first terminal receives a placement confirmation instruction, determines that the first user indeed intends to place the target sticker at this fixed position, determines the fixed position where the target sticker is currently located as the first target position, and displays the target sticker at the first target position without the attached placement prompt message. That is, after successful placement the target sticker is displayed as shown in FIG. 4G, and the target sticker has been sent successfully. Because the drag, placement, and display of the target sticker are complete, the sticker selection window returns to its previous operable state and the area where the target sticker was located returns to its original display style, so similar sticker drag and placement operations can continue.
In another embodiment, if the first user taps the cancel-placement icon (Figure PCTCN2018096609-appb-000004) at the above fixed position, the first terminal receives a cancel-placement instruction, determines that the first user cancels placing the target sticker at this fixed position, and therefore cancels the display of the target sticker on the message display interface; that is, the target sticker disappears from the message display interface. In this case, after the first user taps the cancel-placement icon (Figure PCTCN2018096609-appb-000005), the sticker selection window also returns to its previous operable state and the area where the target sticker was located returns to its original display style, so similar sticker drag and placement operations can likewise continue; this is not specifically limited in this embodiment of this application.
It should be noted that the above sticker presentation method can be implemented in this embodiment of this application because the visual rendering layer of the message display interface is redrawn. Referring to FIG. 4H, the message display interface includes a visual rendering layer, a message container placed on the visual rendering layer, and an input control placed on the message container.
The message container is a layer that holds the messages generated during the interaction between the first user and the second user; it may be rectangular, circular, or irregular in shape, which is not specifically limited in this embodiment of this application. In addition, the area of the message container used for displaying messages may be opaque, while the rest of the message container may be transparent or semi-transparent. The input control is a control used for input and may include an input box. In this embodiment of this application, by redrawing the visual rendering layer, that is, by calling a user interface (User Interface, UI) component to draw content on the visual rendering layer using the drawing method of a render object, the user can trigger, through operations such as a long press, dragging a sticker to the message display interface for free placement, as well as hiding a sticker and calling it out again.
In another embodiment, messages generated through conventional input-box input are displayed in the message container, whereas the target sticker is displayed through the visual rendering layer, so that the display of conventional messages and this special sticker display are separated from each other, achieving a better display effect. With the visual rendering layer, the first terminal may display the target sticker in the following manners.
First manner: if a message bubble is displayed at the first target position, the target sticker displayed at the first target position is drawn on the visual rendering layer in a top-layer display manner, so that the target sticker is superimposed on the message bubble.
In this embodiment of this application, as long as the message bubble and the target sticker overlap, it is determined that a message bubble is displayed at the first target position; this first manner corresponds to the display manner shown in FIG. 4G. The first target position may correspond to the geometric center point of the target sticker; in other words, the position of a sticker or a message bubble can be referred to by the position of its geometric center point.
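The overlap rule described above — the drop position counts as "on a message bubble" whenever the sticker's bounds intersect the bubble's bounds, with each item referred to by its geometric center — can be sketched as follows. This is an illustrative sketch only, not the claimed implementation; the axis-aligned rectangle representation and all names are assumptions:

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test: each rect is (left, top, width, height)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def sticker_rect(center, size):
    """Bounds of a sticker referred to by its geometric center point."""
    (cx, cy), (w, h) = center, size
    return (cx - w / 2, cy - h / 2, w, h)

# A sticker dropped so that any part of it touches the bubble is treated
# as placed "on" that bubble and is drawn on top of it.
bubble = (100, 200, 220, 60)                  # message bubble bounds
sticker = sticker_rect((110, 205), (64, 64))  # drop position = center point
assert rects_overlap(sticker, bubble)
```

Under the configuration that only supports placement on an existing bubble, a drop whose rectangle overlaps no bubble would trigger the re-placement prompt instead.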
It should be noted that this embodiment of this application may be configured to support superimposing such a freely placed sticker only on an existing message bubble; that is, if the first user tries to place the target sticker in a purely blank area of the message display interface, the placement does not succeed.
For this configuration, after receiving the placement confirmation instruction, the first terminal may determine whether a message bubble is displayed at the first target position. If a message bubble is displayed there, the step of displaying the target sticker at the first target position is performed; if no message bubble exists at the first target position, a re-placement prompt message is displayed. The re-placement prompt message may be displayed as a bar in an edge area of the message display interface, so as to cover the message display interface as little as possible.
A message bubble displays a message generated through conventional input-box input. Such a generated message may be plain text, only a sticker, or a combination of a text message and a sticker, which is not specifically limited in this embodiment of this application.
Second manner: if a message bubble is displayed at the first target position and another sticker is superimposed on the message bubble, the target sticker displayed at the first target position is drawn on the visual rendering layer in a top-layer display manner, so that the target sticker is superimposed on the other sticker.
The other sticker here refers to a sticker previously placed freely on the message bubble through a drag operation.
The second manner mainly refers to stacking stickers on stickers. As shown in FIG. 4I and FIG. 4J, sticker E is superimposed on sticker F. As long as the message bubble, the target sticker, and the other sticker overlap one another, it is determined that the message bubble, the target sticker, and the other sticker are all displayed at the first target position.
Third manner: if the first target position is blank, the target sticker displayed at the first target position is drawn on the visual rendering layer.
For the third manner, this embodiment of this application may also be configured to support displaying such a freely placed sticker in a blank area; that is, if the first user wants to place the target sticker in a purely blank area of the message display interface, the placement also succeeds, without depending on a message bubble.
In summary, the above completes the dragging and free placement of the target sticker on the first terminal. To make the target sticker have the same display effect on the message display interface of the second terminal, instead of displaying the target sticker in sequence in the conventional manner, the embodiments of this application further include the following steps 303 to 307.
303. The first terminal sends sticker display data to the server, where the sticker display data includes at least first coordinate position information, first screen size information, and identification information of at least one display element.
In this embodiment of this application, the sticker display data is generated by the first terminal as follows: the first terminal obtains its own first screen size information, determines at least one display element associated with the target sticker, and calculates first coordinate position information of the target sticker relative to the at least one display element; it then generates the sticker display data from at least the first coordinate position information, the first screen size information, and the identification information of the at least one display element.
The at least one display element is the message bubble and/or sticker displayed at the first target position. That is, when only a message bubble is displayed at the first target position, the at least one display element is only the message bubble, as in the situation shown in FIG. 4E. When another sticker is superimposed on the message bubble displayed at the first target position, the at least one display element includes the message bubble and the other sticker, as in the situation shown in FIG. 4J. Of course, if placing stickers in blank areas is supported, the at least one display element may also include only another sticker.
In addition, the first screen size information is obtained in this embodiment of this application because terminals nowadays come in various sizes. To synchronize terminals of different sizes so that they all show a consistent display effect when displaying the target sticker, the relative position between the target sticker and the at least one display element needs to be converted based on the difference in size information between the terminals. For example, if the overlap between the target sticker and the message bubble is 30% on the message display interface of the first terminal, the overlap between the target sticker and the message bubble on the message display interface of the second terminal also needs to be 30%.
It should be noted that if the target sticker is displayed at a blank position on the message display interface, the first coordinate position information is absolute position information. That is, besides converting the coordinate position on the message display interface of the second terminal based on the relative first coordinate position information and the first screen size information, the conversion may also be performed based on the absolute first coordinate position information and the first screen size information, which is not specifically limited in this embodiment of this application. The identification information of the at least one display element enables the second terminal to quickly locate the approximate position of the target sticker among all the content displayed on the message display interface.
When sending the sticker display data to the server, the first terminal may reuse the existing message channel used for transmitting message data, which is not specifically limited in this embodiment of this application.
304. After receiving the sticker display data, the server sends the sticker display data to the second terminal.
The server may likewise reuse the existing message channel used for transmitting message data to send the sticker display data to the second terminal.
305. After receiving the sticker display data sent by the first terminal, the second terminal determines, on the message display interface according to the identification information of the at least one display element, a display range area of the target sticker.
Both the display area of the at least one display element and the area around the at least one display element may be determined as the display range area of the target sticker.
306. The second terminal obtains its own second screen size information and determines a second target position within the display range area according to the first screen size information, the second screen size information, and the first coordinate position information.
In this embodiment of this application, the second terminal may first calculate, according to the first screen size information, the second screen size information, and the first coordinate position information, second coordinate position information of the target sticker relative to the at least one display element on its own message display interface, and then calculate the second target position according to the second coordinate position information and the position information of the at least one display element.
307. The second terminal displays the target sticker at the second target position, superimposed on the at least one display element in a top-layer display manner.
When displaying the target sticker, the second terminal likewise draws on the visual rendering layer in a manner similar to the display on the message display interface of the first terminal described above, so that the display of the target sticker on the message display interface of the second terminal is consistent with that on the first terminal.
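Steps 303 to 307 amount to expressing the sticker's offset from its anchor element in screen-size-independent terms on the sender, and re-applying that offset on the receiver's own screen. A minimal sketch of that conversion follows; the normalization scheme and all names are assumptions for illustration, not the claimed implementation:

```python
def to_relative(sticker_xy, element_xy, screen_wh):
    """First terminal: offset from the anchor element, normalized by screen size."""
    (sx, sy), (ex, ey), (w, h) = sticker_xy, element_xy, screen_wh
    return ((sx - ex) / w, (sy - ey) / h)

def to_absolute(rel_xy, element_xy, screen_wh):
    """Second terminal: re-apply the normalized offset on its own screen."""
    (rx, ry), (ex, ey), (w, h) = rel_xy, element_xy, screen_wh
    return (ex + rx * w, ey + ry * h)

# Sender: 1080x1920 screen; sticker sits 54 px right of and 96 px below the bubble.
rel = to_relative((554, 896), (500, 800), (1080, 1920))
# Receiver: 720x1280 screen; the same bubble is drawn at (330, 520).
second_pos = to_absolute(rel, (330, 520), (720, 1280))
assert second_pos == (366.0, 584.0)  # the offset shrinks with the smaller screen
```

Because the offset is stored relative to the identified display element rather than to the screen origin, the sticker keeps the same overlap with its bubble (e.g. the 30% overlap in the example above) even though the bubble's absolute position differs between the two terminals.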
In this embodiment of this application, after the drag and placement of sticker F shown in FIG. 4A to FIG. 4G are completed, the drag and placement of sticker E may continue as shown in FIG. 4I. That is, this embodiment of this application supports dragging and placing stickers multiple times. In addition, after a sticker is placed on the message display interface, operations such as enlarging, shrinking, and rotating the placed sticker are also supported. The detailed processes are described below.
First, scaling a placed sticker. That is, after obtaining a scaling instruction for the placed sticker, the first terminal obtains a target scaling ratio matching the scaling instruction, and scales the sticker according to the target scaling ratio.
The scaling may be either enlargement or reduction, and the scaling instruction may be obtained in multiple ways. As shown in FIG. 4J, the placed sticker E can be scaled with a two-finger operation. For example, during enlargement, two index fingers may be placed at the positions shown in the figure and slid in the illustrated directions so that they gradually move apart; the larger the slide distance, the larger the enlargement ratio of sticker E. The first terminal may preconfigure a correspondence between slide distance and enlargement ratio, and then calculate from the obtained slide distance how much sticker E should be enlarged.
Of course, besides the enlargement manner shown in FIG. 4J, as long as the first terminal detects two contact points on sticker E that gradually move apart, whether produced with two hands or one, it determines that the first user is enlarging sticker E.
Correspondingly, still taking FIG. 4J as an example, for reduction the two index fingers may likewise be placed at the positions shown in the figure and slid so that they gradually approach each other; the larger the slide distance, the larger the reduction ratio of sticker E. That is, the closer the two fingers get, the larger the reduction ratio. Besides this reduction manner, as long as the first terminal detects two contact points on sticker E that gradually approach each other, it determines that the first user is shrinking sticker E.
In another embodiment, after determining that the first user is scaling sticker E, the first terminal may display the placement prompt message described above again to remind the first user about placement. After the first user finishes scaling sticker E, if the first terminal detects the first user's placement confirmation operation, the first terminal displays the scaled sticker E; if the first terminal detects the first user's cancel operation, it displays the original sticker E before scaling.
In another embodiment, after the first user scales sticker E, to make sticker E on the message display interface of the second terminal have the same display effect, that is, synchronously enlarged or shrunk, this embodiment of this application further sends the scaling data of sticker E to the second terminal through the server, so that the second terminal synchronously displays sticker E on the message display interface according to the scaling data. The scaling data includes at least sticker E and the target scaling ratio of sticker E.
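The correspondence between slide distance and scaling ratio can be sketched as a simple ratio of the two contact points' separation before and after the slide. This is an illustrative assumption — the patent only requires some preconfigured correspondence — and the clamping bounds are invented for the sketch:

```python
import math

def pinch_scale(p1, p2, q1, q2, lo=0.5, hi=3.0):
    """Scale factor for two contact points moving from (p1, p2) to (q1, q2).

    Fingers moving apart give a factor > 1 (enlarge); fingers moving
    together give a factor < 1 (shrink). Clamped to [lo, hi].
    """
    before = math.dist(p1, p2)
    after = math.dist(q1, q2)
    return max(lo, min(hi, after / before))

# Two touches spread from 100 px apart to 150 px apart: enlarge by 1.5x.
assert pinch_scale((0, 0), (100, 0), (0, 0), (150, 0)) == 1.5
# Touches converge from 100 px apart to 50 px apart: shrink to half size.
assert pinch_scale((0, 0), (100, 0), (25, 0), (75, 0)) == 0.5
```

The resulting factor is what would be carried in the scaling data as the target scaling ratio, so the second terminal can apply the identical transform.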
Second, rotating a placed sticker. That is, after obtaining a rotation instruction for the placed sticker, the first terminal obtains a target rotation direction and a target rotation angle matching the rotation instruction, and rotates the target sticker according to the target rotation direction and the target rotation angle.
FIG. 4K to FIG. 4M are schematic diagrams of rotating sticker E. The rotation instruction may also be obtained in multiple forms; FIG. 4K shows a common one: when the first terminal detects one contact point on sticker E and the contact point slides counterclockwise or clockwise, it determines that the first user is rotating sticker E. The larger the magnitude of the slide operation, the larger the angle by which sticker E is rotated. Similarly, the first terminal may preconfigure a correspondence between slide magnitude and rotation angle, and then calculate from the obtained slide magnitude how many degrees sticker E should be rotated.
In another embodiment, after determining that the first user is rotating sticker E, the first terminal may display the placement prompt message described above again, as shown in FIG. 4K to FIG. 4M, to remind the first user about placement. After the first user finishes rotating sticker E, as shown in FIG. 4L, if the first terminal detects the first user's placement confirmation operation, it displays the rotated sticker E; if the first terminal detects the first user's cancel operation, it displays the original sticker E before rotation.
In another embodiment, after the first user rotates sticker E, to make sticker E on the message display interface of the second terminal have the same display effect, that is, synchronously display the rotated sticker E, this embodiment of this application further sends the rotation data of sticker E to the second terminal through the server, so that the second terminal synchronously displays sticker E on the message display interface according to the rotation data. The rotation data includes at least sticker E, the target rotation direction of sticker E, and the target rotation angle. After the second terminal rotates sticker E, the display effect is likewise as shown in FIG. 4M.
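One natural way to map the single contact point's slide to a rotation direction and angle is to measure the angle swept around the sticker's center. This is a sketch under that assumption, not the claimed correspondence; the sign convention assumes a y-up coordinate system:

```python
import math

def rotation_from_drag(center, start, end):
    """Signed rotation in degrees for a contact point sliding around `center`.

    Positive = counterclockwise, negative = clockwise (y-up coordinates);
    the magnitude grows with the sweep of the slide, matching "the larger
    the slide, the larger the rotation angle".
    """
    a0 = math.atan2(start[1] - center[1], start[0] - center[0])
    a1 = math.atan2(end[1] - center[1], end[0] - center[0])
    deg = math.degrees(a1 - a0)
    return (deg + 180) % 360 - 180  # normalize into [-180, 180)

# A finger slides a quarter turn around a sticker centered at (50, 50).
assert rotation_from_drag((50, 50), (100, 50), (50, 100)) == 90.0
```

The sign of the result gives the target rotation direction and its absolute value the target rotation angle carried in the rotation data.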
It should be noted that, besides scaling and rotating a placed sticker, this embodiment of this application also supports hiding a placed sticker and calling it out again through gesture operations. For example, when too many stickers are placed on the current message display interface and affect the first user's viewing of messages, the placed stickers can be hidden. Further, after the placed stickers are hidden, a call-out operation can be performed on the hidden stickers so that the previously hidden stickers are displayed on the message display interface again.
First, hiding a placed sticker.
That is, as shown in FIG. 4N to FIG. 4R, after obtaining a hide instruction for the placed sticker, the first terminal controls the placed sticker to move along a preset movement track from the first target position toward a preset end position, and during the movement adjusts the transparency and size of the placed sticker until its display is canceled.
As shown in FIG. 4N, the current message display interface includes multiple placed stickers, which seriously affect the first user's viewing of messages; the first user can then perform a hide operation on the placed stickers. This embodiment of this application supports hiding all the placed stickers on the current message display interface at once with a single hide operation.
As shown in FIG. 4O to FIG. 4Q, in this embodiment of this application the hide instruction may be obtained as follows: with the center of the screen as the dividing point, two fingers gradually slide from the middle of the screen toward the two sides. That is, when the first terminal detects two contact points gradually moving apart toward the left and right sides of the screen, it determines that the first user is performing a hide operation on the placed stickers.
In other words, the preset end positions are the left and right sides of the screen. For each placed sticker, its preset end position may be set randomly to the left or right side of the screen, or set according to its distance from the two sides of the screen; for example, a sticker closer to the left side of the screen has the left side as its preset end position, and a sticker closer to the right side has the right side as its preset end position.
As shown in FIG. 4O to FIG. 4Q, as the first user's fingers gradually move apart, the placed stickers move accordingly. A placed sticker usually moves along a preset movement track, which may be a straight track, a wavy track, a curved track, or the like; this is not specifically limited in this embodiment of this application.
In another embodiment, to enhance the user experience, the transparency and size of each sticker may also be adjusted while the placed stickers are moving. For example, as shown in FIG. 4O to FIG. 4Q, the closer a placed sticker gets to its preset end position, the smaller it becomes and the higher its transparency. When the user's fingers slide to the edge of the screen, the placed stickers disappear completely, with full transparency. The faster the user's fingers slide, the faster the placed stickers disappear; that is, the faster the transparency of each sticker increases and the faster its size decreases. After the placed stickers are hidden, the message display interface is as shown in FIG. 4R.
It should be noted that besides the left and right sides of the screen, the preset end positions may also be the top and bottom sides of the screen, the upper left and lower right corners of the screen, or the lower left and upper right corners of the screen, which is not specifically limited in this embodiment of this application.
Second, calling out previously hidden stickers again. As shown in FIG. 4S and FIG. 4T, after obtaining an unhide instruction for the placed sticker, the first terminal controls the placed sticker to move along the preset movement track from the preset end position toward the first target position, and during the movement adjusts the transparency and size of the target sticker until the target sticker returns to its original size and original transparency. The closer a placed sticker gets to its original position, the larger it becomes and the lower its transparency.
The call-out operation is the inverse of the hide operation above. That is, as shown in FIG. 4S and FIG. 4T, two fingers may gradually slide from the two sides of the screen toward the center of the screen; the previously hidden stickers then appear from the left and right sides of the screen and gradually move toward the center of the screen along the preset tracks, becoming less transparent and larger as they move. When each sticker reaches its original position, it becomes fully opaque and returns to its original size, that is, the state shown in FIG. 4N is restored.
The method provided in the embodiments of this application supports, during sticker presentation, dragging a sticker selected in the sticker selection window and, while the sticker is being dragged, placing it anywhere on the message display interface; this sticker presentation manner is therefore more vivid, offers more diverse interaction, and has a better display effect.
In addition, the embodiments of this application also support operations such as enlarging, shrinking, rotating, hiding, and calling out the placed sticker again, which further enriches the sticker display styles and makes interaction more diverse.
FIG. 5 is a schematic structural diagram of a sticker presentation apparatus according to an embodiment of this application. Referring to FIG. 5, the apparatus includes one or more processors and one or more memories storing program units, where the program units are executed by the processors and include:
a processing module 501, configured to move, after a drag instruction for a selected target sticker is obtained in a sticker selection window, the target sticker along the obtained drag track;
a first display module 502, configured to display a placement prompt message while the target sticker is being moved; and
a second display module 503, configured to obtain, on a message display interface, a drag-stop instruction for the target sticker, respond to the drag-stop instruction, and, after a placement confirmation instruction triggered through the placement prompt message is received, display the target sticker at a first target position where the drag stopped.
In another embodiment, the program units further include:
an obtaining module, configured to obtain, after a scaling instruction for the target sticker is obtained, a target scaling ratio matching the scaling instruction;
the processing module is further configured to scale the target sticker according to the target scaling ratio to obtain a scaled target sticker; and
the display module is configured to display the scaled target sticker at the first target position.
In another embodiment, the program units further include:
an obtaining module, configured to obtain, after a rotation instruction for the target sticker is obtained, a target rotation direction and a target rotation angle matching the rotation instruction;
the processing module is further configured to rotate the target sticker according to the target rotation direction and the target rotation angle to obtain a rotated target sticker; and
the display module is configured to display the rotated target sticker at the first target position.
In another embodiment, the message display interface includes a visual rendering layer;
the second display module 503 is configured to: if a message bubble is displayed at the first target position, draw, on the visual rendering layer in a top-layer display manner, the target sticker displayed at the first target position, so that the target sticker is superimposed on the message bubble; or, if a message bubble is displayed at the first target position and another sticker is superimposed on the message bubble, draw, on the visual rendering layer in a top-layer display manner, the target sticker displayed at the first target position, so that the target sticker is superimposed on the other sticker; or, if the first target position is blank, draw, on the visual rendering layer, the target sticker displayed at the first target position.
In another embodiment, the second display module 503 is further configured to: after a hide instruction for the target sticker is obtained, control the target sticker to move along a preset movement track from the first target position toward a preset end position, and adjust the transparency and size of the target sticker during the movement until the display of the target sticker is canceled;
where the closer the target sticker gets to the preset end position, the smaller the target sticker becomes and the higher its transparency.
In another embodiment, the program units further include:
an obtaining module, configured to obtain first screen size information of the first terminal;
a determining module, configured to determine at least one display element associated with the target sticker, the display element being the message bubble and/or sticker displayed at the first target position;
a calculation module, configured to calculate first coordinate position information of the target sticker relative to the at least one display element; and
a sending module, configured to send sticker display data to a second terminal, so that the second terminal displays the target sticker according to the sticker display data, the sticker display data including at least the first coordinate position information, the first screen size information, and the identification information of the at least one display element;
where a second user of the second terminal is in the friend relationship chain of a first user of the first terminal.
The apparatus provided in the embodiments of this application supports, during sticker presentation, dragging a sticker selected in the sticker selection window and, while the sticker is being dragged, placing it anywhere on the message display interface; this sticker presentation manner is therefore more vivid, offers more diverse interaction, and has a better display effect.
In addition, the embodiments of this application also support operations such as enlarging, shrinking, rotating, hiding, and calling out the placed sticker again, which further enriches the sticker display styles and makes interaction more diverse.
FIG. 6 is a schematic structural diagram of another sticker presentation apparatus according to an embodiment of this application. Referring to FIG. 6, the apparatus includes one or more processors and one or more memories storing program units, where the program units are executed by the processors and include:
a receiving module 601, configured to receive sticker display data sent by a first terminal, the sticker display data including at least first coordinate position information, first screen size information, and identification information of at least one display element;
a determining module 602, configured to determine, on the message display interface according to the identification information of the at least one display element, a display range area of the target sticker;
an obtaining module 603, configured to obtain second screen size information of the second terminal;
the determining module being further configured to determine a second target position within the display range area according to the first screen size information, the second screen size information, and the first coordinate position information; and
a display module 604, configured to display, at the second target position, the target sticker superimposed on the at least one display element in a top-layer display manner.
The apparatus provided in the embodiments of this application supports, during sticker presentation, dragging a sticker selected in the sticker selection window and, while the sticker is being dragged, placing it anywhere on the message display interface; this sticker presentation manner is therefore more vivid, offers more diverse interaction, and has a better display effect.
In addition, the embodiments of this application also support operations such as enlarging, shrinking, rotating, hiding, and calling out the placed sticker again, which further enriches the sticker display styles and makes interaction more diverse.
It should be noted that when the sticker presentation apparatus provided in the above embodiments presents stickers, the division into the above functional modules is used only as an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the sticker presentation apparatus provided in the above embodiments belongs to the same concept as the sticker presentation method embodiments; for its specific implementation process, refer to the method embodiments, and details are not described here again.
FIG. 7 is a schematic structural diagram of an electronic terminal according to an embodiment of this application. The electronic terminal may be used to perform the sticker presentation method provided in the above embodiments. Referring to FIG. 7, the terminal 700 includes:
components such as a radio frequency (Radio Frequency, RF) circuit 110, a memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a wireless fidelity (Wireless Fidelity, WiFi) module 170, a processor 180 including one or more processing cores, and a power supply 190. A person skilled in the art can understand that the terminal structure shown in FIG. 7 does not constitute a limitation on the terminal; the terminal may include more or fewer components than shown, combine certain components, or use a different component arrangement, where:
the RF circuit 110 may be configured to receive and send signals during information transmission and reception or during a call; in particular, after receiving downlink information from a base station, it delivers the information to one or more processors 180 for processing, and it sends uplink data to the base station. Generally, the RF circuit 110 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a subscriber identity module (SIM) card, a transceiver, a coupler, a low noise amplifier (Low Noise Amplifier, LNA), a duplexer, and the like. In addition, the RF circuit 110 may also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications (Global System of Mobile communication, GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The memory 120 may be configured to store software programs and modules; the processor 180 runs the software programs and modules stored in the memory 120 to perform various functional applications and data processing. The memory 120 may mainly include a program storage area and a data storage area, where the program storage area may store the operating system, applications required by at least one function (such as a sound playback function and an image playback function), and the like, and the data storage area may store data created according to the use of the terminal 700 (such as audio data and a phone book). In addition, the memory 120 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device. Accordingly, the memory 120 may further include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
The input unit 130 may be configured to receive input digit or character information and generate keyboard, mouse, joystick, optical, or trackball signal input related to user settings and function control. Specifically, the input unit 130 may include a touch-sensitive surface 131 and other input devices 132. The touch-sensitive surface 131, also called a touch display screen or a touchpad, can collect the user's touch operations on or near it (such as operations performed by the user on or near the touch-sensitive surface 131 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection apparatus according to a preconfigured program. Optionally, the touch-sensitive surface 131 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the user's touch orientation, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 180, and can receive and execute commands sent by the processor 180. In addition, the touch-sensitive surface 131 may be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch-sensitive surface 131, the input unit 130 may also include other input devices 132, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and a power key), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be configured to display information entered by the user or provided to the user, as well as various graphical user interfaces of the terminal 700, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 140 may include a display panel 141, which may optionally be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like. Further, the touch-sensitive surface 131 may cover the display panel 141; after detecting a touch operation on or near it, the touch-sensitive surface 131 transmits the operation to the processor 180 to determine the type of the touch event, and the processor 180 then provides corresponding visual output on the display panel 141 according to the type of the touch event. Although in FIG. 7 the touch-sensitive surface 131 and the display panel 141 implement the input and output functions as two separate components, in some embodiments the touch-sensitive surface 131 and the display panel 141 may be integrated to implement the input and output functions.
The terminal 700 may further include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, where the ambient light sensor can adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 141 and/or the backlight when the terminal 700 is moved to the ear. As one type of motion sensor, the gravity acceleration sensor can detect the magnitude of acceleration in all directions (usually three axes) and can detect the magnitude and direction of gravity when stationary; it can be used in applications that recognize the posture of the mobile phone (such as switching between landscape and portrait modes, related games, and magnetometer attitude calibration) and in vibration-recognition functions (such as a pedometer or tap detection). As for other sensors that may also be configured on the terminal 700, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, details are not described here again.
The audio circuit 160, a speaker 161, and a microphone 162 can provide an audio interface between the user and the terminal 700. The audio circuit 160 can transmit the electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output; on the other hand, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data. After the audio data is processed by the processor 180, it is sent, for example via the RF circuit 110, to another terminal, or output to the memory 120 for further processing. The audio circuit 160 may also include an earphone jack to provide communication between a peripheral earphone and the terminal 700.
WiFi is a short-range wireless transmission technology. Through the WiFi module 170, the terminal 700 can help the user send and receive e-mails, browse web pages, access streaming media, and so on; it provides the user with wireless broadband Internet access.
The processor 180 is the control center of the terminal 700, connecting the various parts of the entire handset through various interfaces and lines; by running or executing the software programs and/or modules stored in the memory 120 and invoking the data stored in the memory 120, it performs the various functions of the terminal 700 and processes data, thereby monitoring the mobile phone as a whole. Optionally, the processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 180.
The terminal 700 further includes a power supply 190 (such as a battery) that supplies power to the various components. Preferably, the power supply may be logically connected to the processor 180 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system. The power supply 190 may also include any component such as one or more direct-current or alternating-current power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, and a power status indicator.
Although not shown, the terminal 700 may further include a camera, a Bluetooth module, and the like, and details are not described here again. Specifically, in this embodiment the display unit of the terminal is a touchscreen display, and the terminal further includes a memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the sticker presentation method described in the above embodiments.
A person of ordinary skill in the art can understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing related hardware; the program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above descriptions are merely preferred embodiments of this application and are not intended to limit this application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the embodiments of this application shall fall within the protection scope of the embodiments of this application.

Claims (15)

  1. A sticker presentation method, applied to a first terminal, the method comprising:
    after obtaining, in a sticker selection window, a drag instruction for a selected target sticker, moving, by the first terminal, the target sticker along the obtained drag track;
    displaying, by the first terminal, a placement prompt message while moving the target sticker; and
    obtaining, by the first terminal on a message display interface, a drag-stop instruction for the target sticker, and in response to the drag-stop instruction and after receiving a placement confirmation instruction triggered through the placement prompt message, displaying the target sticker at a first target position where the drag stopped.
  2. The method according to claim 1, wherein the method further comprises:
    after obtaining a scaling instruction for the target sticker, obtaining, by the first terminal, a target scaling ratio matching the scaling instruction; and
    scaling, by the first terminal, the target sticker according to the target scaling ratio to obtain a scaled target sticker;
    wherein the displaying the target sticker at a first target position where the drag stopped comprises:
    displaying the scaled target sticker at the first target position.
  3. The method according to claim 1, wherein the method further comprises:
    after obtaining a rotation instruction for the target sticker, obtaining, by the first terminal, a target rotation direction and a target rotation angle matching the rotation instruction; and
    rotating, by the first terminal, the target sticker according to the target rotation direction and the target rotation angle to obtain a rotated target sticker;
    wherein the displaying the target sticker at a first target position where the drag stopped comprises:
    displaying the rotated target sticker at the first target position.
  4. The method according to claim 1, wherein the message display interface comprises a visual rendering layer; and
    the displaying the target sticker at a first target position where the drag stopped comprises:
    if a message bubble is displayed at the first target position, drawing, on the visual rendering layer in a top-layer display manner, the target sticker displayed at the first target position, so that the target sticker is superimposed on the message bubble; or,
    if a message bubble is displayed at the first target position and another sticker is superimposed on the message bubble, drawing, on the visual rendering layer in a top-layer display manner, the target sticker displayed at the first target position, so that the target sticker is superimposed on the other sticker; or,
    if the first target position is blank, drawing, on the visual rendering layer, the target sticker displayed at the first target position.
  5. The method according to any one of claims 1 to 4, wherein the method further comprises:
    after obtaining a hide instruction for the target sticker, controlling, by the first terminal, the target sticker to move along a preset movement track from the first target position toward a preset end position; and
    adjusting, by the first terminal, the transparency and size of the target sticker during the movement until the display of the target sticker is canceled;
    wherein the closer the target sticker gets to the preset end position, the smaller the target sticker becomes and the higher its transparency.
  6. The method according to claim 5, wherein the method further comprises:
    after obtaining an unhide instruction for the target sticker, controlling, by the first terminal, the target sticker to move along the preset movement track from the preset end position toward the first target position; and
    adjusting, by the first terminal, the transparency and size of the target sticker during the movement until the target sticker returns to its original size and original transparency;
    wherein the closer the target sticker gets to the first target position, the larger the target sticker becomes and the lower its transparency.
  7. The method according to any one of claims 1 to 4, wherein the method further comprises:
    obtaining, by the first terminal, first screen size information of the first terminal;
    determining, by the first terminal, at least one display element associated with the target sticker, the display element being the message bubble and/or sticker displayed at the first target position;
    calculating, by the first terminal, first coordinate position information of the target sticker relative to the at least one display element; and
    sending, by the first terminal, sticker display data to a second terminal, so that the second terminal displays the target sticker according to the sticker display data, the sticker display data comprising at least the first coordinate position information, the first screen size information, and identification information of the at least one display element;
    wherein a second user of the second terminal is in a friend relationship chain of a first user of the first terminal.
  8. A sticker presentation method, applied to a second terminal, the method comprising:
    receiving, by the second terminal, sticker display data sent by a first terminal, the sticker display data comprising at least first coordinate position information, first screen size information, and identification information of at least one display element;
    determining, by the second terminal on a message display interface according to the identification information of the at least one display element, a display range area of a target sticker;
    obtaining, by the second terminal, second screen size information of the second terminal;
    determining, by the second terminal, a second target position within the display range area according to the first screen size information, the second screen size information, and the first coordinate position information; and
    displaying, by the second terminal at the second target position, the target sticker superimposed on the at least one display element in a top-layer display manner.
  9. A sticker presentation apparatus, applied to a first terminal, the apparatus comprising one or more processors and one or more memories storing program units, wherein the program units are executed by the processors and comprise:
    a processing module, configured to move, after a drag instruction for a selected target sticker is obtained in a sticker selection window, the target sticker along the obtained drag track;
    a first display module, configured to display a placement prompt message while the target sticker is being moved; and
    a second display module, configured to obtain, on a message display interface, a drag-stop instruction for the target sticker, respond to the drag-stop instruction, and, after a placement confirmation instruction triggered through the placement prompt message is received, display the target sticker at a first target position where the drag stopped.
  10. The apparatus according to claim 9, wherein the message display interface comprises a visual rendering layer; and
    the second display module is configured to: if a message bubble is displayed at the first target position, draw, on the visual rendering layer in a top-layer display manner, the target sticker displayed at the first target position, so that the target sticker is superimposed on the message bubble; or, if a message bubble is displayed at the first target position and another sticker is superimposed on the message bubble, draw, on the visual rendering layer in a top-layer display manner, the target sticker displayed at the first target position, so that the target sticker is superimposed on the other sticker; or, if the first target position is blank, draw, on the visual rendering layer, the target sticker displayed at the first target position.
  11. The apparatus according to claim 9, wherein the second display module is further configured to: after a hide instruction for the target sticker is obtained, control the target sticker to move along a preset movement track from the first target position toward a preset end position, and adjust the transparency and size of the target sticker during the movement until the display of the target sticker is canceled;
    wherein the closer the target sticker gets to the preset end position, the smaller the target sticker becomes and the higher its transparency.
  12. The apparatus according to claim 9, wherein the program units further comprise:
    an obtaining module, configured to obtain first screen size information of the first terminal;
    a determining module, configured to determine at least one display element associated with the target sticker, the display element being the message bubble and/or sticker displayed at the first target position;
    a calculation module, configured to calculate first coordinate position information of the target sticker relative to the at least one display element; and
    a sending module, configured to send sticker display data to a second terminal, so that the second terminal displays the target sticker according to the sticker display data, the sticker display data comprising at least the first coordinate position information, the first screen size information, and identification information of the at least one display element;
    wherein a second user of the second terminal is in a friend relationship chain of a first user of the first terminal.
  13. A sticker presentation apparatus, applied to a second terminal, the apparatus comprising one or more processors and one or more memories storing program units, wherein the program units are executed by the processors and comprise:
    a receiving module, configured to receive sticker display data sent by a first terminal, the sticker display data comprising at least first coordinate position information, first screen size information, and identification information of at least one display element;
    a determining module, configured to determine, on a message display interface according to the identification information of the at least one display element, a display range area of a target sticker;
    an obtaining module, configured to obtain second screen size information of the second terminal;
    the determining module being further configured to determine a second target position within the display range area according to the first screen size information, the second screen size information, and the first coordinate position information; and
    a display module, configured to display, at the second target position, the target sticker superimposed on the at least one display element in a top-layer display manner.
  14. A computer-readable storage medium, storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the sticker presentation method according to any one of claims 1 to 7, or being loaded and executed by the processor to implement the sticker presentation method according to claim 8.
  15. A terminal, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the sticker presentation method according to any one of claims 1 to 7, or being loaded and executed by the processor to implement the sticker presentation method according to claim 8.
PCT/CN2018/096609 2017-07-31 2018-07-23 Sticker presentation method and apparatus, and computer-readable storage medium WO2019024700A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/569,515 US11204684B2 (en) 2017-07-31 2019-09-12 Sticker presentation method and apparatus and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710639778.0A CN107479784B (zh) 2017-07-31 2017-07-31 表情展示方法、装置及计算机可读存储介质
CN201710639778.0 2017-07-31

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/569,515 Continuation US11204684B2 (en) 2017-07-31 2019-09-12 Sticker presentation method and apparatus and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2019024700A1 true WO2019024700A1 (zh) 2019-02-07

Family

ID=60597967

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/096609 WO2019024700A1 (zh) 2017-07-31 2018-07-23 表情展示方法、装置及计算机可读存储介质

Country Status (4)

Country Link
US (1) US11204684B2 (zh)
CN (1) CN107479784B (zh)
TW (1) TWI672629B (zh)
WO (1) WO2019024700A1 (zh)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107479784B (zh) 2017-07-31 2022-01-25 腾讯科技(深圳)有限公司 Sticker presentation method and apparatus, and computer-readable storage medium
CN109388297B (zh) * 2017-08-10 2021-10-22 腾讯科技(深圳)有限公司 Sticker presentation method and apparatus, computer-readable storage medium, and terminal
CN108322383B (zh) * 2017-12-27 2022-02-25 广州市百果园信息技术有限公司 Expression interaction display method, computer-readable storage medium, and terminal
CN108337548A (zh) * 2018-01-24 2018-07-27 优酷网络技术(北京)有限公司 Display control method and apparatus for bullet-screen expressions based on panoramic video
CN108320316B (zh) * 2018-02-11 2022-03-04 秦皇岛中科鸿合信息科技有限公司 Personalized sticker pack production system and method
CN108846881B (zh) * 2018-05-29 2023-05-12 珠海格力电器股份有限公司 Expression image generation method and apparatus
CN109412935B (zh) * 2018-10-12 2021-12-07 北京达佳互联信息技术有限公司 Instant messaging sending method and receiving method, sending apparatus and receiving apparatus
CN109787890B (zh) * 2019-03-01 2021-02-12 北京达佳互联信息技术有限公司 Instant messaging method and apparatus, and storage medium
US11321388B2 (en) * 2019-05-10 2022-05-03 Meta Platforms, Inc. Systems and methods for generating and sharing content
CN111162993B (zh) * 2019-12-26 2022-04-26 上海连尚网络科技有限公司 Information fusion method and device
CN114756151A (zh) * 2020-12-25 2022-07-15 华为技术有限公司 Interface element display method and device
KR20210135683A (ko) * 2020-05-06 2021-11-16 라인플러스 주식회사 Method, system, and computer program for displaying reactions during an Internet-telephony-based call
CN111813469A (zh) * 2020-05-21 2020-10-23 摩拜(北京)信息技术有限公司 Information display method and terminal device
CN112463089A (zh) * 2020-10-21 2021-03-09 贝壳技术有限公司 Cross-terminal synchronous picture scaling method and apparatus, electronic medium, and storage medium
CN114546228B (zh) * 2020-11-12 2023-08-25 腾讯科技(深圳)有限公司 Expression image sending method, apparatus, device, and medium
CN113144601B (zh) * 2021-05-26 2023-04-07 腾讯科技(深圳)有限公司 Expression display method, apparatus, device, and medium in a virtual scene
CN113920224A (zh) * 2021-09-29 2022-01-11 北京达佳互联信息技术有限公司 Material display method and apparatus, electronic device, and storage medium
CN114840117A (zh) * 2022-05-10 2022-08-02 北京字跳网络技术有限公司 Element control method, apparatus, device, and medium for an information input page

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100125785A1 (en) * 2008-11-19 2010-05-20 Bradford Allen Moore Portable Touch Screen Device, Method, and Graphical User Interface for Using Emoji Characters While in a Locked Mode
CN104932853A (zh) * 2015-05-25 2015-09-23 深圳市明日空间信息技术有限公司 动态表情播放方法及装置
US20150268780A1 (en) * 2014-03-24 2015-09-24 Hideep Inc. Method for transmitting emotion and terminal for the same
CN105389114A (zh) * 2015-11-10 2016-03-09 北京新美互通科技有限公司 内容输入方法及装置
CN105487770A (zh) * 2015-11-24 2016-04-13 腾讯科技(深圳)有限公司 图片发送方法及装置
CN106888153A (zh) * 2016-06-12 2017-06-23 阿里巴巴集团控股有限公司 展示要素生成方法、展示要素生成装置、展示要素和通讯软件
CN107479784A (zh) * 2017-07-31 2017-12-15 腾讯科技(深圳)有限公司 表情展示方法、装置及计算机可读存储介质

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050231512A1 (en) * 2004-04-16 2005-10-20 Niles Gregory E Animation of an object using behaviors
TWI454955B (zh) * 2006-12-29 2014-10-01 Nuance Communications Inc 使用模型檔產生動畫的方法及電腦可讀取的訊號承載媒體
KR101485787B1 (ko) * 2007-09-06 2015-01-23 삼성전자주식회사 Terminal and method for storing and executing content thereof
CN101252549B (zh) * 2008-03-27 2012-04-11 腾讯科技(深圳)有限公司 Position adjustment method for expression picture thumbnails and instant messaging system
US8898630B2 (en) * 2011-04-06 2014-11-25 Media Direct, Inc. Systems and methods for a voice- and gesture-controlled mobile application development and deployment platform
US9207755B2 (en) * 2011-12-20 2015-12-08 Iconicast, LLC Method and system for emotion tracking, tagging, and rating and communication
US9110587B2 (en) * 2012-07-13 2015-08-18 Samsung Electronics Co., Ltd. Method for transmitting and receiving data between memo layer and application and electronic device using the same
WO2014134817A1 (zh) * 2013-03-07 2014-09-12 东莞宇龙通信科技有限公司 Terminal and terminal control method
WO2015148733A2 (en) * 2014-03-25 2015-10-01 ScStan, LLC Systems and methods for the real-time modification of videos and images within a social network format
US10338793B2 (en) * 2014-04-25 2019-07-02 Timothy Isaac FISHER Messaging with drawn graphic input
TWI553542B (zh) * 2014-12-08 2016-10-11 英業達股份有限公司 表情圖像推薦系統及其方法
KR101620050B1 (ko) * 2015-03-03 2016-05-12 주식회사 카카오 Method for displaying scenario emoticons through an instant messaging service and user terminal therefor
US9632664B2 (en) * 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
KR102427833B1 (ko) * 2015-11-30 2022-08-02 삼성전자주식회사 사용자 단말장치 및 디스플레이 방법
CN105930828B (zh) * 2016-04-15 2021-05-14 腾讯科技(深圳)有限公司 Control method and apparatus for expression category identifiers
US10194288B2 (en) * 2016-06-12 2019-01-29 Apple Inc. Sticker distribution system for messaging apps
CN106803916B (zh) * 2017-02-27 2021-03-19 腾讯科技(深圳)有限公司 Information display method and apparatus

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100125785A1 (en) * 2008-11-19 2010-05-20 Bradford Allen Moore Portable Touch Screen Device, Method, and Graphical User Interface for Using Emoji Characters While in a Locked Mode
US20150268780A1 (en) * 2014-03-24 2015-09-24 Hideep Inc. Method for transmitting emotion and terminal for the same
CN104932853A (zh) * 2015-05-25 2015-09-23 深圳市明日空间信息技术有限公司 动态表情播放方法及装置
CN105389114A (zh) * 2015-11-10 2016-03-09 北京新美互通科技有限公司 内容输入方法及装置
CN105487770A (zh) * 2015-11-24 2016-04-13 腾讯科技(深圳)有限公司 图片发送方法及装置
CN106888153A (zh) * 2016-06-12 2017-06-23 阿里巴巴集团控股有限公司 展示要素生成方法、展示要素生成装置、展示要素和通讯软件
CN107479784A (zh) * 2017-07-31 2017-12-15 腾讯科技(深圳)有限公司 表情展示方法、装置及计算机可读存储介质

Also Published As

Publication number Publication date
TW201911023A (zh) 2019-03-16
US11204684B2 (en) 2021-12-21
CN107479784B (zh) 2022-01-25
TWI672629B (zh) 2019-09-21
US20200004394A1 (en) 2020-01-02
CN107479784A (zh) 2017-12-15

Similar Documents

Publication Publication Date Title
TWI672629B (zh) Expression display method, apparatus, and computer-readable storage medium
TWI674555B (zh) Expression display method, apparatus, computer-readable storage medium, and terminal
CN108701001B (zh) Method for displaying a graphical user interface, and electronic device
CN109905754B (zh) Virtual gift receiving method, apparatus, and storage device
JP6130926B2 (ja) Gesture-based conversation processing method, apparatus, terminal device, program, and recording medium
US10630629B2 (en) Screen display method, apparatus, terminal, and storage medium
CN116055610B (zh) Method for displaying a graphical user interface, and mobile terminal
US11604535B2 (en) Device and method for processing user input
WO2017125027A1 (zh) Information display method and apparatus, and computer storage medium
WO2017129031A1 (zh) Information acquisition method and apparatus
CN107728886B (zh) One-handed operation method and apparatus
CN108513671B (zh) Method and terminal for displaying a 2D application on a VR device
US20210352040A1 (en) Message sending method and terminal device
CN108920069B (zh) Touch operation method, apparatus, mobile terminal, and storage medium
CN110673770B (zh) Message display method and terminal device
WO2020001604A1 (zh) Display method and terminal device
CN108900407B (zh) Session record management method, apparatus, and storage medium
CN108052258B (zh) Terminal task processing method, task processing apparatus, and mobile terminal
CN111045628A (zh) Information transmission method and electronic device
CA2873555A1 (en) Device and method for processing user input
EP2660695B1 (en) Device and method for processing user input
CN108008875B (zh) Method for controlling cursor movement, and terminal device
CN111026562A (zh) Message sending method and electronic device
CN111061574B (zh) Object sharing method and electronic device
CN113608655A (zh) Information processing method, apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 18840194

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry into the European phase

Ref document number: 18840194

Country of ref document: EP

Kind code of ref document: A1