WO2019047809A1 - Method, apparatus, terminal device and storage medium for processing images in an application - Google Patents

Method, apparatus, terminal device and storage medium for processing images in an application

Info

Publication number
WO2019047809A1
WO2019047809A1 (PCT/CN2018/103874)
Authority
WO
WIPO (PCT)
Prior art keywords
image
application
expression
file
emoticon
Prior art date
Application number
PCT/CN2018/103874
Other languages
English (en)
French (fr)
Inventor
杨晓明
栗绍峰
朱明浩
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Priority to JP2020513636A (JP7253535B2)
Priority to KR1020237022225A (KR20230104999A)
Priority to KR1020207007499A (KR20200036937A)
Priority to KR1020227006348A (KR20220028184A)
Publication of WO2019047809A1
Priority to US16/794,001 (US20200186484A1)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07 User-to-user messaging characterised by the inclusion of specific contents
    • H04L51/10 Multimedia information
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01 Social networking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 Facial expression recognition
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04 Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04 Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H04L51/046 Interoperability with other network applications or services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07 User-to-user messaging characterised by the inclusion of specific contents
    • H04L51/08 Annexed information, e.g. attachments
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/21 Monitoring or handling of messages
    • H04L51/216 Handling conversation history, e.g. grouping of messages in sessions or threads
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/42 Mailbox-related aspects, e.g. synchronisation of mailboxes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/52 User-to-user messaging for supporting social networking services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/06 Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/14 Session management

Definitions

  • The present disclosure relates to the field of computer application technologies, and in particular to a method, an apparatus, a terminal device, and a storage medium for processing an image in an application.
  • Emoticon images occupy a very important position in Internet life; unlike plain text, they convey information in a wide variety of scenarios. For example, in a social application, information is transmitted with an emoticon image as its main content; in a web application, information is published with an emoticon image as a component of its content; and so on.
  • An emoticon image exists in the form of an emoticon file, and the display content of the emoticon file is the emoticon image.
  • Conventionally, the emoticon file is obtained by using a dedicated emoticon-pack creation application to edit an image into an emoticon image and save it.
  • After obtaining the emoticon file, the user can only transfer the corresponding emoticon image by jumping between applications. For example, after the created emoticon image is saved as an emoticon file, the user jumps to an application that needs the emoticon image, such as a social application or a web application, and then sends the emoticon image corresponding to the emoticon file through the operation flow for adding an emoticon pack.
  • Moreover, when the image to be edited in the emoticon-pack creation application comes from the application that needs the emoticon image, the image must first be exported from that application and then loaded into the emoticon-pack creation application.
  • The present disclosure provides a method, apparatus, terminal device, and storage medium for processing an image in an application.
  • A method of processing an image in an application, the method being performed by a terminal device, the method comprising:
  • receiving an emoticon file generation instruction corresponding to an image displayed in an application run by the terminal device;
  • invoking an emoticon editing tool built into the application according to the instruction by which the image is triggered to generate an emoticon file;
  • obtaining, through the emoticon editing tool and an emoticon editing operation triggered on the image, a self-made emoticon image corresponding to the image;
  • generating, from the self-made emoticon image, a self-made emoticon file configured in the application, where the self-made emoticon file is invoked by the application to implement a specified function of the application itself, the specified function being different from the emoticon creation function.
  • An apparatus for processing an image in an application, the apparatus being configured in a terminal device, the apparatus comprising:
  • an instruction receiver, configured to receive an emoticon file generation instruction corresponding to an image displayed in the application;
  • an emoticon editing invoker, configured to invoke the emoticon editing tool built into the application according to the instruction by which the image is triggered to generate an emoticon file;
  • a self-made emoticon obtainer, configured to obtain, through the emoticon editing tool and an emoticon editing operation triggered on the image, a self-made emoticon image corresponding to the image;
  • a configurator, configured to generate, from the self-made emoticon image, the self-made emoticon file configured in the application, where the self-made emoticon file is invoked by the application to implement a specified function of the application itself, and the specified function is different from the emoticon creation function.
  • A terminal device, comprising: a processor; and a memory having stored thereon computer readable instructions which, when executed by the processor, implement the method of processing an image in an application as described above.
  • A computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of processing an image in an application as described above.
  • FIG. 1 is a schematic diagram of an implementation environment, according to an exemplary embodiment
  • FIG. 2 is a block diagram of an apparatus, according to an exemplary embodiment
  • FIG. 3 is a flowchart illustrating a method of processing an image in an application according to an exemplary embodiment
  • FIG. 4 is a flow chart depicting the details of step S350, shown in accordance with the corresponding embodiment of Figure 3;
  • FIG. 5 is a flow chart for describing the details of step S370, according to the corresponding embodiment of Figure 3;
  • FIG. 6 is a flowchart illustrating a method of processing an image in an application, according to another exemplary embodiment
  • FIG. 7 is a schematic diagram of an interface of a session window between a user and a friend in an instant communication tool according to an exemplary embodiment
  • FIG. 8 is a schematic diagram of an interface of a session window of an outgoing call operation item according to the corresponding embodiment of FIG. 7;
  • FIG. 9 is a schematic diagram of an expression editing interface for cropping an image according to a corresponding embodiment of FIG. 8;
  • FIG. 10 is a schematic diagram of an interface for triggering text input in a completed cropped image according to a corresponding embodiment of FIG. 9;
  • FIG. 11 is a schematic diagram of an interface for completing text input according to the corresponding embodiment of FIG. 10;
  • FIG. 12 is a schematic diagram of an interface for completing input text rendering according to a corresponding embodiment of FIG. 11;
  • FIG. 13 is a schematic diagram of a session window for transmitting a homemade emoticon image shown in the embodiment corresponding to FIG. 12;
  • FIG. 14 is a schematic diagram showing a thumbnail display of a self-made expression file according to a corresponding embodiment of FIG. 13 in a session window;
  • FIG. 15 is a schematic flowchart of an implementation process of a self-made expression image in an instant communication tool according to an exemplary embodiment
  • FIG. 16 is a block diagram of an apparatus for processing an image in an application, according to an exemplary embodiment;
  • FIG. 17 is a block diagram showing the details of the self-made emoticon obtainer, according to the embodiment corresponding to FIG. 16;
  • FIG. 18 is a block diagram showing the details of the configurator, according to the embodiment corresponding to FIG. 16;
  • FIG. 19 is a block diagram of an apparatus for processing an image in an application, according to another exemplary embodiment.
  • the implementation environment involved in the present disclosure includes at least a terminal device 110 used by a user and an application server 130 that cooperates with an application in the terminal device 110.
  • the terminal device 110 can be a desktop computer, a notebook computer, a smart phone, a tablet computer, or the like.
  • The application run by the terminal device 110 cooperates with the corresponding application server to implement the deployed specified function. Before this specified function is implemented, the emoticon file creation process for a displayed image is performed in this application, and the emoticon file self-made by the user is then configured for the application.
  • FIG. 2 is a block diagram of an apparatus 200, according to an exemplary embodiment.
  • For example, the apparatus 200 can be the terminal device in the implementation environment shown in FIG. 1.
  • apparatus 200 can include one or more of the following components: processing component 202, memory 204, power component 206, multimedia component 208, audio component 210, sensor component 214, and communication component 216.
  • Processing component 202 typically controls the overall operation of device 200, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations, and the like.
  • Processing component 202 can include one or more processors 218 to execute instructions to perform all or part of the steps of the methods described below.
  • processing component 202 can include one or more modules to facilitate interaction between component 202 and other components.
  • processing component 202 can include a multimedia module to facilitate interaction between multimedia component 208 and processing component 202.
  • Memory 204 is configured to store various types of data to support operation at device 200. Examples of such data include instructions of any application or method configured to operate on device 200.
  • The memory 204 can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disc. The memory 204 also stores one or more modules, which are configured to be executed by the one or more processors 218 to perform all or part of the steps of any of the methods shown in FIG. 3, FIG. 4, FIG. 5, and FIG. 6 below.
  • Power component 206 provides power to various components of device 200.
  • Power component 206 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 200.
  • the multimedia component 208 includes a screen between the device 200 and the user that provides an output interface.
  • the screen may include a liquid crystal display (LCD) and a touch panel. If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
  • the screen may also include an Organic Light Emitting Display (OLED).
  • the audio component 210 is configured to output and/or input an audio signal.
  • the audio component 210 includes a microphone (Microphone, MIC for short) that is configured to receive an external audio signal when the device 200 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
  • the received audio signal may be further stored in memory 204 or transmitted via communication component 216.
  • audio component 210 also includes a speaker for outputting an audio signal.
  • Sensor assembly 214 includes one or more sensors for providing status assessment of various aspects to device 200.
  • sensor assembly 214 can detect an open/closed state of device 200, relative positioning of components, and sensor assembly 214 can also detect changes in position of one component of device 200 or device 200 and temperature changes of device 200.
  • the sensor assembly 214 can also include a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 216 is configured to facilitate wired or wireless communication between device 200 and other devices.
  • the device 200 can access a wireless network based on a communication standard such as WiFi (WIreless-Fidelity).
  • communication component 216 receives broadcast signals or broadcast associated information from an external broadcast management system via a broadcast channel.
  • the communication component 216 also includes a Near Field Communication (NFC) module to facilitate short range communication.
  • The NFC module can be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth technology, and other technologies.
  • In an exemplary embodiment, the apparatus 200 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital signal processors, digital signal processing devices, programmable logic devices, field programmable gate arrays, controllers, microcontrollers, microprocessors, or other electronic components, for performing the methods described below.
  • FIG. 3 is a flow chart showing a method of processing an image in an application, according to an exemplary embodiment.
  • the method of processing an image in an application is applicable to a terminal device of the aforementioned implementation environment, which in an exemplary embodiment may be the device shown in FIG. 2.
  • the method for processing an image in an application may be performed by a terminal device, and may include the following steps.
  • the image displayed in the application refers to any image displayed on the application interface.
  • This image includes various moving images and still images.
  • the application is any application that performs image display and uses an emoticon file to implement a specified function.
  • This application includes social applications and various web applications for displaying web information.
  • the application can be an instant messaging tool, a microblog, or the like.
  • For an instant messaging tool, the specified function is the session function, while for a microblog the specified function can be publishing a message containing an emoticon image.
  • the emoticon generation instruction corresponding to the image displayed in the application is generated for the image triggered by the application interface.
  • an image is displayed, and an emoticon generation instruction can be triggered on the image.
  • the displayed image is triggered as the image display proceeds.
  • For example, the user triggers an operation such as a click or a long press on the image, whereupon the associated operation items of the image are activated; the associated operation items include an emoticon creation operation item.
  • By selecting the emoticon creation operation item, an instruction for the displayed image to be triggered to generate an emoticon file can be produced. Correspondingly, the application itself monitors, during image display, whether a click or long press is triggered; if a click or long press is triggered on the application interface and is positioned on a displayed image, an interface function is called to respond to the click or long press triggered on the displayed image.
  • In this way the emoticon file generation instruction is generated, and, for the implementation of the present disclosure, the emoticon file generation instruction output by the interface function is received at this time. A minimal sketch of this flow is given below.
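  • As a rough illustration only (not part of the disclosure), the following plain-Kotlin sketch models how a long press on a displayed image could activate the associated operation items and emit the emoticon file generation instruction when the emoticon creation item is selected; all type and function names here are hypothetical.

```kotlin
// Hypothetical domain types; names are illustrative, not taken from the disclosure.
data class DisplayedImage(val id: String, val bytes: ByteArray, val animated: Boolean)

sealed class OperationItem(val label: String) {
    object CreateEmoticon : OperationItem("Make emoticon")
    object SaveImage : OperationItem("Save image")
    object Forward : OperationItem("Forward")
}

/** The instruction received in step S310: which displayed image should become an emoticon file. */
data class EmoticonFileGenerationInstruction(val image: DisplayedImage)

class ImageInteractionMonitor(
    private val onInstruction: (EmoticonFileGenerationInstruction) -> Unit
) {
    /** Called by the application's interface function when a click or long press lands on an image;
     *  returns the associated operation items to be shown in the activated state. */
    fun onImageLongPressed(image: DisplayedImage): List<OperationItem> =
        listOf(OperationItem.CreateEmoticon, OperationItem.SaveImage, OperationItem.Forward)

    /** Called when the user selects one of the activated operation items. */
    fun onOperationSelected(image: DisplayedImage, item: OperationItem) {
        if (item is OperationItem.CreateEmoticon) {
            // Step S310: the emoticon file generation instruction is produced and handed on.
            onInstruction(EmoticonFileGenerationInstruction(image))
        }
    }
}
```

  • A caller would construct the monitor with a callback that forwards the instruction to the step that invokes the built-in emoticon editing tool (step S330).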
  • It should be noted that the image displayed in the application may be a dynamic image; therefore, according to the user's need for a self-made emoticon image, the displayed dynamic image can also be intercepted within the application to obtain one of its frames, and the emoticon file generation instruction is generated for that frame image.
  • In addition, the number of images displayed in the application may be one or more. Correspondingly, the object for which the emoticon file generation instruction is received, and from which the self-made emoticon image is then obtained, may be a single displayed image, or batch emoticon creation may be performed for two or more images, which is not limited herein.
  • the application is a social application
  • the image is an image received or sent in a session of the social application.
  • the method specifically includes:
  • displaying the image as it is received and/or sent at the session interface of the social application, and receiving the emoticon file generation instruction corresponding to the image displayed in the social application, where the emoticon file generated as triggered for the displayed image is invoked by the social application to implement the function of transmitting emoticon images in a session.
  • the social application is used to implement a session between the logged-in user and other users, and the session includes a one-to-one session between the users, for example, a session between the user and his friend, and a group session between the users.
  • the session interface is implemented in the corresponding session window. As the session progresses, the message is sent and received. In the sent or received message, the carried image will be displayed along with the text on the session interface. So far, the image displayed in the application is obtained, and the process of obtaining the homemade emoticon image corresponding to the image can be initiated by triggering the image, thereby generating a homemade emoticon file.
  • the specified function configured is a function of delivering an emoticon image in a session.
  • the referred application may also be a web application or the like, for example, an emoticon mall application loaded through a browser, etc., and the specified functions configured are different depending on the application.
  • For example, for an emoticon mall application, the specified function is publishing the emoticon image corresponding to the emoticon file, with the emoticon file authorized for use by paying users.
  • Thus, for the application, any displayed image can directly trigger the generation of the corresponding self-made emoticon image and of the corresponding self-made emoticon file, which is ultimately applied in the very application that displays the image; on the one hand this enhances immediacy.
  • On the other hand, the emoticon files configured in the application are also enriched: any image obtained by the user through the application can have a corresponding emoticon file configured in the application.
  • Further, step S310 specifically includes: in the application or the social application, activating the associated operation items of the displayed image, and obtaining the emoticon file generation instruction corresponding to the displayed image through a triggered selection of the emoticon creation operation item among the associated operation items.
  • the application is capable of image display and needs to use an emoticon image to realize its function
  • social applications are one of them, but there are other applications exemplified above.
  • the triggered execution of the homemade emoticon process of the specified image can be implemented by the associated action item of the image.
  • the associated operation item of the image includes an expression creation operation item and other operation items, and after the associated operation item of the displayed image is activated, the expression file generation instruction may be obtained according to the selected operation of the expression creation operation item in the associated operation item.
  • It should be understood that the configuration and activation of the associated operation items are implemented through built-in resource files and interface functions; that is, the display style of the associated operation items and the various operation items they contain are stored as resource files, and once the corresponding operation is monitored they can be called up by the interface function and displayed, in the activated state, on the application interface.
  • In step S330, the emoticon editing tool built into the application is invoked according to the instruction by which the image is triggered to generate an emoticon file.
  • the expression editing tool is used to realize the instant processing of the image in the application, and obtain the corresponding expression image.
  • the expression editing tool is implemented by the image channel in the application, and realizes the rendering and drawing of the image and the file under the control of the expression editing operation triggered by the user.
  • After the image is triggered to generate the emoticon file in step S310, the desired emoticon image needs to be obtained from the image, for example by adding specific text to the image or by transforming the rendering effect of the image.
  • For this purpose, the built-in emoticon editing tool is invoked according to the instruction by which the image is triggered to generate the emoticon file.
  • In step S350, the self-made emoticon image corresponding to the image is obtained through the emoticon editing tool and the emoticon editing operation triggered on the image.
  • the image that is triggered to generate the expression file will enter the expression editing state under the action of the built-in expression editing tool, and the image processing process may be performed along with the expression editing operation triggered by the user. For example, by triggering the operation of inputting text, you can add text to the image. In addition, you can also trigger the operation of dragging and adjusting the position of the newly added text to adjust the position of the newly added text in the image.
  • the input text is rendered into a specified font, etc., and the desired emoticon image corresponding to the image is obtained after the desired emoticon editing operation is completed.
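  • Purely as an illustration of such an editing state (the disclosure does not specify an implementation), the sketch below applies a sequence of hypothetical editing operations, cropping, adding text, and dragging text, to produce the self-made emoticon image; the raster operations are stubbed out.

```kotlin
// All types are hypothetical; a real tool would operate on decoded bitmaps.
data class Point(val x: Int, val y: Int)
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

sealed interface EmoticonEditOperation
data class Crop(val region: Rect) : EmoticonEditOperation
data class AddText(val text: String, val position: Point, val font: String = "default") : EmoticonEditOperation
data class DragText(val delta: Point) : EmoticonEditOperation

data class EmoticonImage(val pixels: ByteArray, val overlays: List<AddText> = emptyList())

class EmoticonEditingTool(source: ByteArray) {
    private var working = EmoticonImage(source)

    /** Applies one user-triggered emoticon editing operation to the image in the editing state (step S353). */
    fun apply(op: EmoticonEditOperation) {
        working = when (op) {
            is Crop -> working.copy(pixels = cropPixels(working.pixels, op.region))
            is AddText -> working.copy(overlays = working.overlays + op)
            is DragText -> working.copy(overlays = working.overlays.map {
                it.copy(position = Point(it.position.x + op.delta.x, it.position.y + op.delta.y))
            })
        }
    }

    /** Finishes editing and returns the self-made emoticon image (step S350). */
    fun finish(): EmoticonImage = working

    // Placeholder: real cropping would re-rasterize the decoded image to the given region.
    private fun cropPixels(pixels: ByteArray, region: Rect): ByteArray = pixels
}
```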
  • The self-made emoticon image is similar to an ordinary emoticon image in that it provides a content description or content supplement for the transmitted information.
  • An ordinary emoticon image is downloaded from the application server and added to the application, or is added and configured in the application after being received.
  • A self-made emoticon image is different: it is custom created by the user from an image displayed in the application.
  • image-triggered expression editing operation may be triggered once or multiple times, and will not be limited herein.
  • the type of image editing operation triggered may be one or more types.
  • the triggered image editing operation is an operation of determining the image as a self-made emoticon image.
  • the self-made expression image corresponding to the obtained image refers to an expression image obtained based on the image displayed in the application, and the image obtained by the image is edited by the user.
  • step S370 the self-made expression image is generated correspondingly to the self-made expression file configured in the application, and the self-made expression file is called by the application to implement the specified function of the application itself, and the designated function is different from the expression creation function.
  • the self-made expression image finally obtained through the application of the built-in expression editing tool and the image editing operation will be saved in the form of an expression file, that is, a homemade expression file is generated and configured in the application, and the application can be called at any time.
  • For example, the generated self-made emoticon file may be in the EIF format; the obtained self-made emoticon image need only be converted into the EIF format and stored to obtain the corresponding self-made emoticon file.
  • Generating a self-made emoticon file configured in the application is also advantageous for preserving display quality, compared with keeping the self-made emoticon image in plain image form.
  • If an emoticon image is stored and transmitted as an image, it is compressed on every storage and transmission, so that its quality degrades further and further.
  • In contrast, when the emoticon image exists in the form of an emoticon file, that is, the self-made emoticon file referred to above, the emoticon image serves only as its display content; since compression has already been applied during image editing, no secondary compression occurs in subsequent transmissions, secondary storage, or the corresponding calling and called processes.
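  • The EIF layout itself is not described in the disclosure; the sketch below therefore uses a generic, hypothetical container simply to show the point being made: the already-compressed image bytes are carried verbatim, so storing, calling, and re-sending the emoticon file never re-encodes the emoticon image.

```kotlin
import java.io.ByteArrayOutputStream

// Illustrative container only; not the actual EIF specification.
data class SelfMadeEmoticonFile(
    val formatTag: String = "EIF",
    val animated: Boolean,
    val imageBytes: ByteArray   // the display content, compressed once during editing
)

/** Serializes the emoticon file; the image payload is copied verbatim, so there is no second compression. */
fun serialize(file: SelfMadeEmoticonFile): ByteArray {
    val out = ByteArrayOutputStream()
    out.write(file.formatTag.toByteArray(Charsets.US_ASCII))
    out.write(if (file.animated) 1 else 0)
    out.write(file.imageBytes)
    return out.toByteArray()
}
```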
  • the self-made expression image is obtained by the image displayed by the application, and the homemade expression file generated by the self-made expression image and configured in the application can effectively improve the immediacy of the expression file creation in the application implementation scene. And customization, to ensure that users are immersed in the application, not interrupted or disturbed by other applications, which helps to improve the user's activity and dependence in the application.
  • After the self-made emoticon file is configured in the application, it can be called immediately to implement the specified function, or the call can be postponed.
  • In other words, the specified function implemented by calling the self-made emoticon file may be immediate, for example re-propagating the self-made emoticon file right after editing an image generated during the session, or it may serve a non-immediate, pure production need, for example creating emoticons through the extension toolbar configured by the social application.
  • An exemplary embodiment of the present disclosure is implemented in a social application, so that a user can make a self-made emoticon file from a picture or video sent or received in a session and send it in the current session interface, without needing to jump from a dedicated emoticon-pack creation application to the social application and then carry out cumbersome operations such as importing and saving.
  • Moreover, the material of the self-made emoticon file is derived from an image the user is currently transmitting, and the production process takes place entirely within the session, ensuring the user's immersion in the social application.
  • Through the above process, for an application that displays images and that implements its own configured specified function through emoticon files, emoticon creation can be triggered directly on a displayed image: the emoticon file generation instruction corresponding to the displayed image is received first, the built-in emoticon editing tool is then invoked according to the instruction by which the image is triggered to generate an emoticon file, the self-made emoticon image corresponding to the image is obtained through the emoticon editing tool and the emoticon editing operation triggered on the image, and finally the self-made emoticon image is used to generate the self-made emoticon file configured in the application, which is invoked by the application to implement the specified function of the application itself.
  • The specified function is different from the emoticon creation function, so the application both creates and configures the emoticon file for the displayed image.
  • In the whole process, only the triggering of emoticon file generation on the displayed image and the performance of the emoticon editing on the image require user intervention; cumbersome operations are avoided, no additional emoticon-pack creation application is needed, and configuring an emoticon file for an image displayed by the application becomes extremely simple.
  • FIG. 4 is a flow chart depicting the details of step S350, according to the embodiment corresponding to FIG. 3.
  • As shown in FIG. 4, step S350 may include the following steps.
  • In step S351, the emoticon editing tool is used to jump into the emoticon editing state corresponding to the displayed image.
  • the expression editing tool is built into the application and is implemented by the image channel deployed by the application, and provides image processing logic for the execution of the expression editing process.
  • the displayed image enters the expression editing state as the expression editing tool in the application is called. After the image enters the expression editing state, various expression editing operations can be arbitrarily triggered on the image.
  • Jumping into the emoticon editing state corresponding to the displayed image through the emoticon editing tool may mean that the image displayed in the application is placed directly into an editable state, operation items capable of triggering various emoticon editing operations are called up for it, and the user can trigger any emoticon editing operation by selecting the corresponding operation item.
  • In step S353, an emoticon editing operation triggered on the image is received, image processing corresponding to the emoticon editing operation is performed on the image in the emoticon editing state by the emoticon editing tool, and the self-made emoticon image corresponding to the image is obtained.
  • the execution of the image processing process in the expression editing tool is implemented by the deployed image processing logic.
  • Corresponding image processing logic is executed in accordance with the triggered emoticon editing operation.
  • FIG. 5 is a flow chart depicting the details of step S370, shown in accordance with the corresponding embodiment of Figure 3.
  • As shown in FIG. 5, step S370 may include the following steps.
  • In step S371, a self-made emoticon file corresponding to the self-made emoticon image is generated.
  • That is, once the self-made emoticon image has been obtained, the self-made emoticon file can be generated from it.
  • In step S373, the user identifier logged in to the application is obtained, and the self-made emoticon file is stored in the application server and/or locally on the terminal device with the user identifier as an index.
  • the user implements the login in the application, and uniquely marks the user identity by the user identifier in the application.
  • the self-made expression file is customized by the user in the application. Therefore, the user identifier registered by the application is inevitably acquired, and the user identification is used as an index to store the homemade expression file.
  • the storage of the self-made expression file may be performed locally on the terminal device or in the application server, thereby implementing local storage or remote storage of the homemade expression file.
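  • A minimal sketch of such user-indexed storage (step S373) is given below; the in-memory map stands in for the local terminal storage and/or the application server, and the SelfMadeEmoticonFile type is the hypothetical one from the container sketch above.

```kotlin
// Hypothetical store: self-made emoticon files indexed by the logged-in user identifier.
class EmoticonFileStore {
    private val filesByUser = mutableMapOf<String, MutableList<SelfMadeEmoticonFile>>()

    /** Stores the file under the user identifier; a real application would also persist it
     *  locally on the terminal device and/or upload it to the application server. */
    fun store(userId: String, file: SelfMadeEmoticonFile) {
        filesByUser.getOrPut(userId) { mutableListOf() }.add(file)
    }

    /** Returns all self-made emoticon files configured for this user. */
    fun filesFor(userId: String): List<SelfMadeEmoticonFile> = filesByUser[userId].orEmpty()
}
```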
  • the method of processing an image in an application further includes the following steps after step S370.
  • A thumbnail display is performed for the self-made emoticon file in the emoticon panel of the application.
  • the thumbnail display of the self-made expression file in the expression panel refers to the process of displaying the thumbnail of the homemade expression file on the expression panel of the application, and is used to trigger the selection call of the homemade expression file.
  • the emoticons corresponding to the self-made emoticons include dynamic images and still images.
  • the thumbnail display of the self-made expression file in the expression panel is the display of the thumbnail of the static image in the expression panel; for the dynamic image, the dynamic image is realized by the gif (Graphics Interchange Format) sequence frame.
  • the image frame is composed of a set of consecutive frames, and a frame of the picture is extracted as a thumbnail of the moving image, which is displayed on the expression panel of the application, and the thumbnail display of the dynamic image in the expression panel is realized.
  • Thus, the invocation of the self-made emoticon file can be triggered through the thumbnail displayed in the emoticon panel, so that both the generation of the emoticon file and the implementation of the specified function are completed directly within the application.
  • the self-made expression file is displayed in a thumbnail display on the expression panel, so that the self-made expression file is similar to other ordinary expression files, and is consistent with the current expression thumbnail logic, and has very high versatility.
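  • As a sketch of the thumbnail rule described above (static image: the image itself; animated GIF-style image: the first frame of the sequence), with decoding stubbed out and all names hypothetical:

```kotlin
// A decoded frame of the self-made emoticon image; decoding itself is out of scope here.
data class Frame(val pixels: ByteArray)

/** Picks the preview shown in the emoticon panel: for a static emoticon the single frame is used,
 *  for an animated emoticon the first frame of the sequence is extracted (cf. step 750 in FIG. 15). */
fun panelThumbnail(frames: List<Frame>): Frame? = frames.firstOrNull()

/** Placeholder for scaling the chosen frame down to the panel's thumbnail size. */
fun scaleToThumbnail(frame: Frame, width: Int, height: Int): Frame = frame
```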
  • FIG. 6 is a flowchart illustrating a method of processing an image in an application, according to another exemplary embodiment.
  • The thumbnail displayed for the self-made emoticon file is associated with that emoticon file.
  • the method for processing the image in the application is as shown in FIG. 6, and may include the following steps.
  • In step S410, through a selection operation triggered on the thumbnail displayed for the self-made emoticon file in the emoticon panel, the self-made emoticon file indexed by the user identifier logged in to the application is called from the terminal device locally or from the application server.
  • the display of the expression panel is performed in the application, and each expression file exists in the expression panel, that is, a thumbnail of the ordinary expression file and the homemade expression file. That is to say, the emoticon panel arranges the thumbnail corresponding to each emoticon file, and the user indexes from the terminal device local or the application server to the corresponding emoticon file by triggering the selected operation on a certain thumbnail.
  • The thumbnail is associated with the emoticon file; specifically, the thumbnail corresponding to the emoticon file is stored, like the emoticon file itself, on the terminal device or the application server and indexed by the user identifier.
  • Therefore, the call of the corresponding emoticon file can be implemented according to the selected thumbnail and the user identifier logged in to the application.
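  • The retrieval-and-send path (steps S410 and S430) could then be wired together roughly as below, reusing the hypothetical EmoticonFileStore and SelfMadeEmoticonFile types from the earlier sketches; how the application server actually delivers the file to the peer is not specified here.

```kotlin
// Illustrative wiring: thumbnail selection -> user-ID-indexed lookup -> transmission in the session.
class SessionEmoticonSender(
    private val store: EmoticonFileStore,
    private val send: (peerId: String, file: SelfMadeEmoticonFile) -> Unit  // e.g. via the application server
) {
    /** Called when the user taps a thumbnail in the emoticon panel while chatting with peerId. */
    fun onThumbnailSelected(userId: String, thumbnailIndex: Int, peerId: String) {
        // The thumbnail and the emoticon file share the same user-identifier index (step S410).
        val file = store.filesFor(userId).getOrNull(thumbnailIndex) ?: return
        // Step S430: the self-made emoticon file is transmitted and then displayed in the session window.
        send(peerId, file)
    }
}
```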
  • In step S430, the transmission of the self-made emoticon file is performed in the application.
  • the transmission of the homemade emoticon file in the application is used to implement the specified function of the application self-made configuration.
  • the implementation of self-made emotic file transfers in the application is also different.
  • the application performs the transmission of the self-made expression file, and the application may publish the expression file to the specified page, or may transmit the expression file to the application registered by another user, or may be other implementation manners. It is not limited here.
  • the homemade emoticon file transmission is to publish the homemade emoticon image corresponding to the self-made emoticon file to the specified page of the emoticon mall application, so that other users can browse and view the homemade emoticon image published in the specified page. .
  • For a social application, the self-made emoticon file transmission performed for the user includes, on the one hand, sending the emoticon file to a friend or a group participating in the session and, on the other hand, displaying the emoticon image corresponding to the emoticon file in the session window of this session in the application.
  • It can be understood that the emoticon file is sent to the social application in which the friend is logged in, or to the social applications in which the members of the group are logged in, and the corresponding emoticon image is displayed in the session window of the social application that receives the emoticon file.
  • the sending of the expression file is implemented by the application server, that is, the social application server corresponding to the social application.
  • In this way, obtaining the self-made emoticon file does not require an emoticon-pack creation application specially downloaded and deployed on the terminal device, which improves the simplicity and convenience of custom emoticon file creation; users can quickly and immersively create emoticon images in the application and then configure and propagate them.
  • An exemplary embodiment of the present disclosure fully integrates the custom production of an emoticon file with the session, based on the user's instant-communication interaction in the application; for a terminal device with a limited screen size, for example a smartphone, this greatly reduces the required operations.
  • The material of the self-made emoticon file comes from the image the user is propagating, that is, an image sent or received by the user.
  • The production of the self-made emoticon file is likewise realized entirely within the session, meeting the needs of instant communication and customization, and achieving the most timely and rapid delivery.
  • In addition, the self-made emoticon file obtained in the exemplary embodiments of the present disclosure can also be propagated to other scenes by saving and forwarding, thereby enabling reuse of the self-made emoticon file.
  • FIG. 7 is a schematic diagram of the interface of a session window between a user and a friend in an instant communication tool, according to an exemplary embodiment.
  • the image 530 sent by the friend is displayed in the session window 510 as the message sent by the friend is received.
  • FIG. 8 is a schematic diagram of an interface of a session window of an outgoing call operation item according to the corresponding embodiment of FIG. 7.
  • FIG. 9 is a schematic diagram of an emoticon editing interface for cropping an image according to the corresponding embodiment of FIG. 8.
  • In the text input, the user can select the default copy or make a custom edited input, and the input text can be dragged to adjust its position; after the text is input it is rendered in the default font, and a logo is automatically marked in the corner, so that an initial emoticon image is obtained.
  • FIG. 10 is a schematic diagram of an interface for triggering text input in a completed cropped image according to the corresponding embodiment of FIG. 9.
  • FIG. 11 is a schematic diagram of the interface after text input is completed, according to the embodiment corresponding to FIG. 10.
  • FIG. 12 is a schematic diagram of the interface after rendering of the input text is completed, according to the embodiment corresponding to FIG. 11. Thus, through FIG. 10 to FIG. 12, the self-made emoticon image is finally obtained.
  • the self-made expression file is generated by the self-made expression image.
  • the user can call the self-made expression file in the conversation window 510 to send the self-made expression file to the friend.
  • FIG. 13 is a schematic diagram of the session window in which the self-made emoticon image is sent, according to the embodiment corresponding to FIG. 12.
  • FIG. 14 is a schematic diagram showing a thumbnail display of a home-made emoticon file according to a corresponding embodiment of FIG. 13 in a session window.
  • FIG. 15 is a schematic flowchart of an implementation process of a self-made expression image in an instant communication tool according to an exemplary embodiment.
  • In step 610, with the triggering of emoticon creation, the instant messaging tool enters the image editing channel to open the emoticon editing tool; under the action of the emoticon editing tool, the flow shown in steps 620 to 640 can be executed to obtain the self-made emoticon image.
  • In step 650, the user can send the image or choose to continue editing.
  • Correspondingly, the processes shown in steps 660 to 670 are performed.
  • Step 710 can then be performed to store the self-made emoticon image to the background emoticon channel, so as to realize the storage of the self-made emoticon image in the instant communication tool and complete its configuration in the instant communication tool.
  • For the self-made emoticon image configured in the instant messaging tool, step 730 may be executed to send the self-made emoticon image to the conversation window.
  • In addition, step 750 is performed to capture the first frame of the dynamically displayed self-made emoticon image as a preview image, so as to achieve the thumbnail display of the self-made emoticon in the emoticon panel.
  • the self-made expression image will be implemented based on the instant communication relationship chain, and can be propagated to other scenes while being called in the session.
  • FIG. 16 is a block diagram of an apparatus for processing an image in an application, according to an exemplary embodiment.
  • The apparatus for processing an image in an application may include, but is not limited to, an instruction receiver 810, an emoticon editing invoker 830, a self-made emoticon obtainer 850, and a configurator 870.
  • the instruction receiver 810 is configured to receive an emoticon generation instruction corresponding to the image displayed in the application.
  • The emoticon editing invoker 830 is configured to invoke the emoticon editing tool built into the application according to the instruction by which the image is triggered to generate an emoticon file.
  • the self-made expression obtainer 850 is configured to obtain a self-made expression image corresponding to the image through an expression editing tool and an expression editing operation triggered by the image.
  • The configurator 870 is configured to generate, from the self-made emoticon image, the self-made emoticon file configured in the application, where the self-made emoticon file is invoked by the application to implement the specified function of the application itself.
  • the application is a social application
  • the image is an image received or transmitted in a session of the social application
  • The instruction receiver 810 is further configured to display the image as it is received and/or sent at the session interface of the social application, and to receive the emoticon file generation instruction corresponding to the image displayed in the social application, where the emoticon file generated as triggered for the displayed image is invoked by the social application to implement the function of transmitting emoticon images in a session.
  • The instruction receiver 810 is further configured to activate the associated operation items of the displayed image in the application or the social application, and to obtain the emoticon file generation instruction corresponding to the displayed image through a triggered selection of the emoticon creation operation item among the associated operation items.
  • Figure 17 is a block diagram depicting the details of a home-made expression obtainer, according to the corresponding embodiment of Figure 16 .
  • the self-made expression obtainer 850 may include, but is not limited to, a state jumper 851 and an image processing executor 853.
  • the state jumper 851 is configured to jump into the expression editing state corresponding to the displayed image by the expression editing tool.
  • the image processing executor 853 is configured to receive an expression editing operation triggered by the image in the expression editing state, and perform image processing corresponding to the expression editing operation on the image in the expression editing state by using the expression editing tool to obtain the homemade expression image corresponding to the image. .
  • Figure 18 is a block diagram depicting the details of a configurator, shown in accordance with the corresponding embodiment of Figure 16.
  • the configurator 870 may include, but is not limited to, an emoticon file generator 871 and an index memory 873.
  • the emoticon file generator 871 is configured to generate a self-made emoticon file corresponding to the homemade emoticon image.
  • the index memory 873 is configured to obtain a user identifier that is logged in by the application, and store the self-made expression file in the application server and/or the local terminal device by using the user identifier as an index.
  • the apparatus for processing an image in an application may further include, but is not limited to, a panel display controller.
  • The panel display controller is configured to perform the thumbnail display of the self-made emoticon file in the emoticon panel of the application.
  • FIG. 19 is a block diagram of an apparatus for processing an image in an application, according to another exemplary embodiment.
  • the apparatus for processing an image in an application, as shown in FIG. 19, may also include, but is not limited to, an expression file invoker 910 and a transmitter 930.
  • The emoticon file invoker 910 is configured to call, through a selection operation triggered on the thumbnail displayed for the self-made emoticon file in the emoticon panel, the self-made emoticon file indexed by the user identifier logged in to the application from the terminal device locally or from the application server.
  • The transmitter 930 is configured to perform the transmission of the self-made emoticon file in the application.
  • Optionally, the present disclosure further provides a terminal device that performs all or part of the steps of the method of processing an image in an application shown in any of FIG. 3, FIG. 4, FIG. 5, and FIG. 6.
  • the device includes:
  • a processor; and a memory having stored thereon computer readable instructions which, when executed by the processor, implement the method of processing an image in an application as described above.
  • the present disclosure further provides a computer readable storage medium having stored thereon a computer program that, when executed by a processor, implements a method of processing an image in an application as described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Tourism & Hospitality (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Primary Health Care (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Transfer Between Computers (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method for processing an image in an application, performed by a terminal device, including: receiving an emoticon file generation instruction corresponding to an image displayed in the application (S310); invoking an emoticon editing tool built into the application according to the instruction triggered on the image to generate an emoticon file (S330); obtaining a self-made emoticon image corresponding to the image through the emoticon editing tool and an emoticon editing operation triggered on the image (S350); and generating, from the self-made emoticon image, a self-made emoticon file configured in the application, the self-made emoticon file being invoked by the application to implement a designated function configured by the application itself, the designated function being distinct from the emoticon making function (S370).

Description

Method and apparatus for processing an image in an application, terminal device, and storage medium
This application claims priority to the Chinese invention patent application filed on July 7, 2017 with application number 2017108113891 and entitled "Method and apparatus for processing an image in an application, terminal device, and storage medium".
Technical Field
The present disclosure relates to the field of computer application technologies, and in particular, to a method and apparatus for processing an image in an application, a terminal device, and a storage medium.
Background
Emoticon images occupy a very important position in Internet life; unlike plain text, they convey information in a wide range of scenarios. For example, in social applications, information is transmitted with an emoticon image as its main content; in web applications, information is published with an emoticon image as part of its content; and so on.
An emoticon image exists in the form of an emoticon file, and the display content of the emoticon file is the emoticon image. An emoticon file is obtained by editing an image with a dedicated emoticon-pack making application to obtain an emoticon image and then saving it. After the emoticon file is obtained, the user can only switch applications to transmit the emoticon image corresponding to the emoticon file. For example, after the made emoticon image is saved as an emoticon file, the user switches to an application that needs to use the emoticon image, such as a social application or a web application, and then the emoticon image corresponding to the emoticon file is sent through the operation of adding an emoticon pack.
Moreover, since the image edited by the emoticon-pack making application is obtained by the application that needs to use the emoticon image, the image also has to be exported from that application and then loaded into the emoticon-pack making application.
The whole process is rather cumbersome and additionally relies on a specific emoticon-pack making application. For the application that needs to use the emoticon image, there are many tedious operations, and the corresponding emoticon file cannot be configured promptly and quickly.
Summary
The present disclosure provides a method and apparatus for processing an image in an application, a terminal device, and a storage medium.
A method for processing an image in an application, the method being performed by a terminal device and including:
receiving an emoticon file generation instruction corresponding to an image displayed in an application running on the terminal device;
invoking an emoticon editing tool built into the application according to the instruction triggered on the image to generate an emoticon file;
obtaining a self-made emoticon image corresponding to the image through the emoticon editing tool and an emoticon editing operation triggered on the image; and
generating, from the self-made emoticon image, a self-made emoticon file configured in the application, the self-made emoticon file being invoked by the application to implement a designated function configured by the application itself, the designated function being distinct from the emoticon making function.
An apparatus for processing an image in an application, the apparatus being configured on a terminal device and including:
an instruction receiver configured to receive an emoticon file generation instruction corresponding to an image displayed in the application;
an emoticon editing invoker configured to invoke an emoticon editing tool built into the application according to the instruction triggered on the image to generate an emoticon file;
a self-made emoticon obtainer configured to obtain a self-made emoticon image corresponding to the image through the emoticon editing tool and an emoticon editing operation triggered on the image; and
a configurator configured to generate, from the self-made emoticon image, a self-made emoticon file configured in the application, the self-made emoticon file being invoked by the application to implement a designated function configured by the application itself, the designated function being distinct from the emoticon making function.
A terminal device, including:
a processor; and
a memory having computer readable instructions stored thereon which, when executed by the processor, implement the method for processing an image in an application as described above.
A computer readable storage medium having a computer program stored thereon which, when executed by a processor, implements the method for processing an image in an application as described above.
It should be understood that the foregoing general description and the following detailed description are merely exemplary and do not limit the present disclosure.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description show merely some embodiments of the present disclosure, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
FIG. 1 is a schematic diagram of an implementation environment according to an exemplary embodiment;
FIG. 2 is a block diagram of an apparatus according to an exemplary embodiment;
FIG. 3 is a flowchart of a method for processing an image in an application according to an exemplary embodiment;
FIG. 4 is a flowchart describing details of step S350 according to the embodiment corresponding to FIG. 3;
FIG. 5 is a flowchart describing details of step S370 according to the embodiment corresponding to FIG. 3;
FIG. 6 is a flowchart of a method for processing an image in an application according to another exemplary embodiment;
FIG. 7 is a schematic diagram of an interface of a conversation window between a user and a friend in an instant messaging tool according to an exemplary embodiment;
FIG. 8 is a schematic diagram of an interface of the conversation window with called-out operation items according to the embodiment corresponding to FIG. 7;
FIG. 9 is a schematic diagram of an emoticon editing interface for cropping the image according to the embodiment corresponding to FIG. 8;
FIG. 10 is a schematic diagram of an interface for triggering text input on the cropped image according to the embodiment corresponding to FIG. 9;
FIG. 11 is a schematic diagram of an interface after the text input is completed according to the embodiment corresponding to FIG. 10;
FIG. 12 is a schematic diagram of an interface after rendering of the input text is completed according to the embodiment corresponding to FIG. 11;
FIG. 13 is a schematic diagram of the conversation window in which the self-made emoticon image is sent according to the embodiment corresponding to FIG. 12;
FIG. 14 is a schematic diagram of thumbnail display of the self-made emoticon file in the conversation window according to the embodiment corresponding to FIG. 13;
FIG. 15 is a schematic flowchart of implementing a self-made emoticon image in an instant messaging tool according to an exemplary embodiment;
FIG. 16 is a block diagram of an apparatus for processing an image in an application according to an exemplary embodiment;
FIG. 17 is a block diagram describing details of the self-made emoticon obtainer according to the embodiment corresponding to FIG. 16;
FIG. 18 is a block diagram describing details of the configurator according to the embodiment corresponding to FIG. 16;
FIG. 19 is a block diagram of an apparatus for processing an image in an application according to another exemplary embodiment.
Detailed Description
Exemplary embodiments are described in detail here, and examples thereof are shown in the accompanying drawings. When the following description refers to the accompanying drawings, unless otherwise indicated, the same numerals in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. On the contrary, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
In an exemplary embodiment, the implementation environment involved in the present disclosure, as shown in FIG. 1, includes at least a terminal device 110 used by a user and an application server 130 cooperating with an application on the terminal device 110.
The terminal device 110 may be a desktop computer, a laptop computer, a smartphone, a tablet computer, or the like. The application running on the terminal device 110 cooperates with the corresponding application server to implement a deployed designated function. Before the designated function is implemented, an emoticon file making process for a displayed image is performed in the application, so that an emoticon file self-made by the user is configured for the application.
FIG. 2 is a block diagram of an apparatus according to an exemplary embodiment. For example, the apparatus 200 may be the terminal device in the implementation environment shown in FIG. 1.
Referring to FIG. 2, the apparatus 200 may include one or more of the following components: a processing component 202, a memory 204, a power supply component 206, a multimedia component 208, an audio component 210, a sensor component 214, and a communication component 216.
The processing component 202 generally controls overall operations of the apparatus 200, such as operations associated with display, telephone calls, data communication, camera operations, and recording operations. The processing component 202 may include one or more processors 218 to execute instructions to complete all or some of the steps of the methods described below. In addition, the processing component 202 may include one or more modules to facilitate interaction between the processing component 202 and other components. For example, the processing component 202 may include a multimedia module to facilitate interaction between the multimedia component 208 and the processing component 202.
The memory 204 is configured to store various types of data to support operations on the apparatus 200. Examples of such data include instructions of any application or method configured to operate on the apparatus 200. The memory 204 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disc. The memory 204 further stores one or more modules, and the one or more modules are configured to be executed by the one or more processors 218 to complete all or some of the steps of any of the methods shown in FIG. 3, FIG. 4, FIG. 5, and FIG. 6 below.
The power supply component 206 provides power to the various components of the apparatus 200. The power supply component 206 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 200.
The multimedia component 208 includes a screen providing an output interface between the apparatus 200 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel. If the screen includes a touch panel, the screen may be implemented as a touchscreen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may not only sense a boundary of a touch or swipe action, but also detect duration and pressure related to the touch or swipe operation. The screen may further include an organic light emitting display (OLED).
The audio component 210 is configured to output and/or input audio signals. For example, the audio component 210 includes a microphone (MIC). When the apparatus 200 is in an operation mode, such as a call mode, a recording mode, or a speech recognition mode, the microphone is configured to receive external audio signals. The received audio signals may be further stored in the memory 204 or sent via the communication component 216. In some embodiments, the audio component 210 further includes a speaker configured to output audio signals.
The sensor component 214 includes one or more sensors configured to provide state evaluations of various aspects for the apparatus 200. For example, the sensor component 214 may detect an on/off state of the apparatus 200 and the relative positioning of components. The sensor component 214 may further detect a position change of the apparatus 200 or a component of the apparatus 200 and a temperature change of the apparatus 200. In some embodiments, the sensor component 214 may further include a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 216 is configured to facilitate wired or wireless communication between the apparatus 200 and other devices. The apparatus 200 may access a wireless network based on a communication standard, such as WiFi (Wireless Fidelity). In an exemplary embodiment, the communication component 216 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 216 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra wideband (UWB) technology, Bluetooth technology, and other technologies.
In an exemplary embodiment, the apparatus 200 may be implemented by one or more application specific integrated circuits (ASIC), digital signal processors, digital signal processing devices, programmable logic devices, field programmable gate arrays, controllers, microcontrollers, microprocessors, or other electronic elements for performing the methods described below.
FIG. 3 is a flowchart of a method for processing an image in an application according to an exemplary embodiment. The method for processing an image in an application is applicable to the terminal device of the foregoing implementation environment; in an exemplary embodiment, the terminal device may be the apparatus shown in FIG. 2. As shown in FIG. 3, the method for processing an image in an application may be performed by the terminal device and may include the following steps.
In step S310, an emoticon file generation instruction corresponding to an image displayed in the application is received.
The image displayed in the application refers to any image displayed on the application interface, including dynamic images and static images. It should be noted that the application is any application that displays images and uses emoticon files to implement a designated function. Such applications include social applications and various web applications for displaying network information, and so on. For example, the application may be an instant messaging tool, a microblog, or the like. For an instant messaging tool, the designated function is the conversation function; for a microblog, the designated function may be publishing a message containing an emoticon image.
The emoticon file generation instruction corresponding to the image displayed in the application is triggered and generated for an image displayed on the application interface. While an image is being displayed on the application interface presented by the application, the emoticon file generation instruction may be triggered on this image.
Specifically, on the application interface, as the image display proceeds, the displayed image is triggered. For example, in a specific implementation of an exemplary embodiment, from the user's perspective, the user triggers an operation such as a tap or a long press on the image, which activates the associated operation items of the image; the associated operation items include an emoticon making operation item and other operation items, and after the user selects the emoticon making operation item, an instruction that the displayed image is triggered to generate an emoticon file is generated. Correspondingly, from the perspective of the application itself, during image display the application listens for whether a tap or long-press operation is triggered; if a tap or long-press operation is triggered on the application interface and the triggered operation is located on the displayed image, an interface function is invoked to respond to the tap or long-press operation triggered on the displayed image and generate the emoticon file generation instruction. For the implementation of the present disclosure, the emoticon file generation instruction passed in by the interface function is received at this point.
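As an illustration of this trigger path only, the following is a minimal Android-style sketch in Kotlin. The `EmoticonFileGenerationInstruction` class and the set of menu labels are assumptions made for the example and are not names defined by this disclosure; the sketch only shows a long press calling out associated operation items and the "Make emoticon" item emitting the instruction.

```kotlin
import android.widget.ImageView
import android.widget.PopupMenu

// Hypothetical instruction object handed on to the emoticon editing flow (name assumed).
data class EmoticonFileGenerationInstruction(val imageUri: String)

// Attach a long-press listener to an image displayed in the conversation window and,
// when the "Make emoticon" operation item is chosen, emit the generation instruction.
fun attachEmoticonTrigger(
    imageView: ImageView,
    imageUri: String,
    onInstruction: (EmoticonFileGenerationInstruction) -> Unit
) {
    imageView.setOnLongClickListener { view ->
        val menu = PopupMenu(view.context, view)          // associated operation items
        val makeEmoticon = menu.menu.add("Make emoticon")
        menu.menu.add("Save")
        menu.menu.add("Forward")
        menu.setOnMenuItemClickListener { item ->
            if (item == makeEmoticon) {
                onInstruction(EmoticonFileGenerationInstruction(imageUri))
            }
            true
        }
        menu.show()
        true // consume the long press
    }
}
```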
In another exemplary embodiment, the image displayed by the application may be a dynamic image. Therefore, according to the user's need to self-make an emoticon image, the application may also capture the displayed dynamic image to obtain one of its frames, and generate an emoticon file generation instruction for this frame.
It should be noted here that the number of images displayed in the application may be one, or two or more. Correspondingly, the object acted on in the process of receiving the corresponding emoticon file generation instruction and then obtaining the self-made emoticon image may be one of the displayed images, or emoticons may be made in batches for two or more images; this is not limited here.
In an exemplary embodiment, the application is a social application and the image is an image received or sent in a conversation of the social application. In this case, step S310 specifically includes:
while received and/or sent images are being displayed on the conversation interface of the social application, receiving an emoticon file generation instruction corresponding to an image displayed in the social application, where the self-made emoticon file generated for the displayed image according to the emoticon file generation instruction will be invoked by the social application and used to implement the conversation function.
The social application is used to implement conversations between the logged-in user and other users, including one-to-one conversations between users, for example, a conversation between the user and a friend, as well as group conversations among users.
The corresponding conversation window implements the conversation interface. As the conversation proceeds, messages are sent and received, and images carried in the sent or received messages are displayed on the conversation interface together with text. At this point, an image displayed in the application is available, and triggering this image initiates the process of obtaining the self-made emoticon image corresponding to this image and then generating the self-made emoticon file.
Here, for the social application, the configured designated function is the function of transmitting emoticon images in a conversation.
Of course, in addition to social applications, the application may also be a web application or the like, for example, an emoticon store application loaded through a browser; the configured designated function varies with the application. For an emoticon store application, the configured designated function is publishing the emoticon image corresponding to an emoticon file and authorizing paying users to use the emoticon file.
For the application, any image displayed on it can directly trigger the generation of the self-made emoticon image corresponding to the image and the corresponding self-made emoticon file, which is ultimately applied in the application displaying the image. On the one hand, this enhances immediacy; on the other hand, it also enriches the emoticon files configured in the application, since any image the user obtains through the application can be configured as a corresponding emoticon file in that application.
In another exemplary embodiment, step S310 specifically includes: in the application or social application, activating the associated operation items of the displayed image, and obtaining the emoticon file generation instruction corresponding to the displayed image through the triggered selection of the emoticon making operation item among the associated operation items.
As described above, the application is one that can display images and needs to use emoticon images to implement its functions; the social application is one of them, but the other applications exemplified above also exist.
Whether for a social application or another application, the self-made emoticon file process for a specified image can be triggered and performed through the associated operation items of the image. The associated operation items of the image include the emoticon making operation item and other operation items. After the associated operation items of the displayed image are activated, the emoticon file generation instruction can be obtained according to the selection operation on the emoticon making operation item among the associated operation items.
It should be understood that the configuration and activation of the associated operation items are implemented through built-in resource files and interface functions. That is, the display style of the associated operation items and the various operation items they contain are stored as resource files; after the corresponding operation is detected, they can be called up under the action of the interface function and displayed on the application interface.
In step S330, an emoticon editing tool built into the application is invoked according to the instruction triggered on the image to generate an emoticon file.
The emoticon editing tool is used to implement instant processing of images in the application to obtain corresponding emoticon images. The emoticon editing tool is implemented by an image channel in the application, and renders and draws images and text under the control of the emoticon editing operations triggered by the user.
Following the receipt in step S310 of the instruction that the image is triggered to generate an emoticon file, the desired emoticon image inevitably needs to be obtained from the image, for example, by adding specific text to the image or changing the rendering effect of the image. Therefore, the emoticon editing tool built into the application is invoked according to the instruction that the image is triggered to generate an emoticon file.
In step S350, a self-made emoticon image corresponding to the image is obtained through the emoticon editing tool and an emoticon editing operation triggered on the image.
The image that is triggered to generate an emoticon file enters the emoticon editing state under the action of the emoticon editing tool built into the application, and image processing can be performed following the emoticon editing operations triggered by the user. For example, when a text input operation is triggered, text can be added to the image; in addition, a drag operation can be triggered on the added text to adjust its position in the image, and the input text can be rendered in a specified font, and so on. After the required emoticon editing operations are completed, the self-made emoticon image corresponding to the image is obtained.
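For illustration only, the Kotlin sketch below shows one way such editing operations could be applied to a bitmap: cropping to a centered square and drawing a caption at a given position. The crop rule, the font sizing, and the `applyEmoticonEdits` name are assumptions for this sketch, not details taken from the disclosure.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint

// Crop the source image to a centered square and draw caption text at (x, y).
// Sizes and styling are placeholders; a real editing tool would take them from user operations.
fun applyEmoticonEdits(source: Bitmap, caption: String, x: Float, y: Float): Bitmap {
    val side = minOf(source.width, source.height)
    val cropped = Bitmap.createBitmap(
        source,
        (source.width - side) / 2,
        (source.height - side) / 2,
        side,
        side
    )
    val editable = cropped.copy(Bitmap.Config.ARGB_8888, true)
    val canvas = Canvas(editable)
    val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        color = Color.WHITE
        textSize = side / 8f
        setShadowLayer(4f, 0f, 0f, Color.BLACK) // keep the caption readable on any background
    }
    canvas.drawText(caption, x, y, paint)
    return editable
}
```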
A self-made emoticon image is similar to an ordinary emoticon image in that both describe or supplement the content of the message they belong to. An ordinary emoticon image is downloaded from the application server and added to the application, or is added to the application after being received. A self-made emoticon image is different: it is custom-made using an image displayed in the application.
It can be understood that the emoticon editing operation triggered on the image may be triggered once or multiple times, which is not limited here, and one or more kinds of image editing operations may be triggered.
Of course, the image may also be used directly as the emoticon image without emoticon editing; in this case, the triggered image editing operation is the operation of confirming that the image is to be used as the self-made emoticon image.
It should be noted that the obtained self-made emoticon image corresponding to the image refers to an emoticon image obtained on the basis of the image displayed in the application; because of the image editing process, the obtained emoticon image is custom-made by the user.
In step S370, a self-made emoticon file configured in the application is generated from the self-made emoticon image, and the self-made emoticon file is invoked by the application to implement a designated function configured by the application itself, the designated function being distinct from the emoticon making function.
The self-made emoticon image finally obtained through the emoticon editing tool built into the application and the image editing operations is saved in the form of an emoticon file, that is, a self-made emoticon file is generated and configured in the application, which the application can invoke at any time. For example, the generated self-made emoticon file may be in the EIF format; the obtained self-made emoticon image only needs to be converted into the EIF format and stored to obtain the corresponding self-made emoticon file.
Generating the self-made emoticon file configured in the application, as opposed to keeping the self-made emoticon image as a plain image, helps guarantee the quality of the displayed image. Specifically, if an emoticon image is stored and transmitted in the form of an image, it is compressed in each storage and transmission, which makes the picture quality of the emoticon image worse and worse. When the emoticon image exists in the form of an emoticon file, that is, the aforementioned self-made emoticon file, with the emoticon image only as its display content, the image has already been compressed during image editing, so it does not undergo any secondary compression in the invoking and invoked processes of each subsequent sending and re-saving.
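As a rough illustration of why the file form avoids recompression, the sketch below writes the already-encoded image bytes into a container file with a small metadata header instead of re-encoding the bitmap. The container layout, magic string, and `writeEmoticonFile` name are invented for the example; the disclosure names the EIF format without specifying its structure.

```kotlin
import java.io.DataOutputStream
import java.io.File
import java.io.FileOutputStream

// Wrap already-encoded image bytes (e.g. the PNG/GIF produced once by the editing tool)
// into an emoticon file. The bytes are copied verbatim, so no second compression happens.
fun writeEmoticonFile(target: File, userId: String, encodedImage: ByteArray) {
    DataOutputStream(FileOutputStream(target)).use { out ->
        out.writeUTF("EMOTICON/1")       // invented magic string for this sketch
        out.writeUTF(userId)             // owner, used later as the storage index
        out.writeInt(encodedImage.size)
        out.write(encodedImage)          // original encoded bytes, untouched
    }
}
```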
In the exemplary embodiments of the present disclosure, the self-made emoticon image is obtained from the image displayed by the application, and the self-made emoticon file generated from the self-made emoticon image is then configured in the application. This effectively improves the immediacy and customizability of emoticon file making in the application's scenarios, keeps the user immersed in the application without being interrupted or disturbed by other applications, and helps increase the user's activity in and reliance on the application.
It should be noted that after the self-made emoticon file is configured in the application, it may be invoked immediately to implement the designated function, or invoked later. In other words, the designated function implemented by invoking the self-made emoticon file may be immediate, for example, spreading the self-made emoticon file after editing an image produced during a conversation; it may also be a non-immediate, purely making-oriented need, for example, the emoticon making function provided by an extension toolbar configured in a social application.
The implementation of the exemplary embodiments of the present disclosure in a social application enables the user to make a self-made emoticon file from a picture or video sent or received during a conversation and send it in the current conversation interface, without cumbersome operations such as jumping from a dedicated emoticon making application to the social application and then importing and saving. The material of the self-made emoticon file comes from the image the user is currently spreading, and the making process takes place entirely within the conversation, which keeps the user immersed in the social application.
Through the exemplary embodiments of the present disclosure, the making of an emoticon file can be triggered for an image displayed by the application, and the application performs its own configured designated function through emoticon files. In the application, the emoticon file generation instruction corresponding to the displayed image is first received; the emoticon editing tool built into the application is then invoked according to the instruction that the image is triggered to generate an emoticon file; the self-made emoticon image corresponding to the image is obtained through the emoticon editing tool and the emoticon editing operation triggered on the image; and finally the self-made emoticon file configured in the application is generated from the self-made emoticon image, the self-made emoticon file being invoked by the application to implement the designated function configured by the application itself, which is distinct from the emoticon making function. In this way, the whole process of making and configuring the emoticon file for the displayed image is completed within the application, and user operations are needed only at two points: triggering the generation of the emoticon file for the displayed image and performing emoticon editing on the image. Cumbersome operations are avoided, the making of emoticon files and their configuration in the application become simple instead of tedious, no additional emoticon-pack making application is required, and the path from the image displayed by the application to the configuration of the emoticon file in the application is extremely concise.
FIG. 4 is a flowchart describing details of step S350 according to the embodiment corresponding to FIG. 3. As shown in FIG. 4, step S350 may include the following steps.
In step S351, the emoticon editing state corresponding to the displayed image is entered through the emoticon editing tool.
As described above, the emoticon editing tool is built into the application and is implemented by the image channel deployed by the application, providing the image processing logic for performing the emoticon editing process.
The displayed image enters the emoticon editing state as the emoticon editing tool in the application is invoked. After the image enters the emoticon editing state, various emoticon editing operations can be freely triggered on the image.
Jumping into the emoticon editing state corresponding to the displayed image through the emoticon editing tool may mean putting the image displayed in the application directly into an editable state and calling up operation items capable of triggering various emoticon editing operations, so that the user can trigger an emoticon editing operation by selecting any operation item.
Alternatively, the application may jump to the operating interface corresponding to the emoticon editing tool, the image is loaded into the operating interface, and various emoticon editing operations can then be triggered on the image in the operating interface.
In step S353, an emoticon editing operation triggered on the image is received, and image processing corresponding to the emoticon editing operation is performed on the image in the emoticon editing state through the emoticon editing tool, to obtain the self-made emoticon image corresponding to the image.
The image processing in the emoticon editing tool is performed by the deployed image processing logic, and the corresponding image processing logic is executed according to the triggered emoticon editing operation.
FIG. 5 is a flowchart describing details of step S370 according to the embodiment corresponding to FIG. 3. As shown in FIG. 5, step S370 may include the following steps.
In step S371, the self-made emoticon file corresponding to the self-made emoticon image is generated.
After the instruction to generate an emoticon file is triggered for the image displayed by the application, and the emoticon image is obtained by performing emoticon editing on the displayed image, the self-made emoticon file can be generated.
In step S373, the user identifier logged in to the application is obtained, and the self-made emoticon file is stored on the application server and/or locally on the terminal device with the user identifier as an index.
The user logs in to the application, and the user identity is uniquely marked in the application by the user identifier. Since the self-made emoticon file is custom-made by the user in the application, the user identifier logged in to the application is necessarily obtained, and the self-made emoticon file is stored with this user identifier as an index.
The self-made emoticon file may be stored locally on the terminal device or on the application server, so as to implement local storage or remote storage of the self-made emoticon file.
Correspondingly, when the self-made emoticon file needs to be invoked, it only needs to be invoked with the user identifier logged in to the application as an index, so as to implement the designated function configured in the application.
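The sketch below illustrates, under assumptions, what indexing by the logged-in user identifier could look like: files are kept in a per-user directory locally and optionally mirrored to the application server through a hypothetical `uploadToServer` hook. The directory layout, class name, and hook are illustrative only, not a storage scheme prescribed by the disclosure.

```kotlin
import java.io.File

class EmoticonStore(
    private val baseDir: File,
    private val uploadToServer: ((userId: String, file: File) -> Unit)? = null // optional remote mirror
) {
    // Store the emoticon file under the logged-in user's directory; the user id is the index.
    fun store(userId: String, fileName: String, content: ByteArray): File {
        val userDir = File(baseDir, userId).apply { mkdirs() }
        val target = File(userDir, fileName)
        target.writeBytes(content)
        uploadToServer?.invoke(userId, target)
        return target
    }

    // Look up all self-made emoticon files indexed by this user id.
    fun listFor(userId: String): List<File> =
        File(baseDir, userId).listFiles()?.toList() ?: emptyList()
}
```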
In another exemplary embodiment, after step S370, the method for processing an image in an application further includes the following step.
A thumbnail of the self-made emoticon file is displayed in the emoticon panel of the application.
Thumbnail display of the self-made emoticon file in the emoticon panel refers to the process of displaying a thumbnail of the self-made emoticon file in the emoticon panel of the application, which is used to trigger the selection and invocation of the self-made emoticon file. As described above, the emoticon images corresponding to self-made emoticon files include dynamic images and static images. For a static image, the thumbnail display of the self-made emoticon file in the emoticon panel is the display of a thumbnail of the static image in the emoticon panel. A dynamic image is implemented by GIF (Graphics Interchange Format) sequence frames, that is, it consists of a group of consecutive picture frames; one picture frame is extracted as the thumbnail of the dynamic image and displayed in the emoticon panel of the application, implementing the thumbnail display of the dynamic image in the emoticon panel.
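A minimal sketch of extracting one frame of a (possibly animated) emoticon as the panel thumbnail is shown below; the fixed 96-pixel size and the `panelThumbnail` name are assumptions for the example. Android's `BitmapFactory` decodes only the first frame of a GIF, which is exactly what the thumbnail needs.

```kotlin
import android.graphics.Bitmap
import android.graphics.BitmapFactory

// Decode the first frame of the encoded emoticon and scale it down for the emoticon panel.
fun panelThumbnail(encodedEmoticon: ByteArray, sizePx: Int = 96): Bitmap {
    val firstFrame = BitmapFactory.decodeByteArray(encodedEmoticon, 0, encodedEmoticon.size)
    return Bitmap.createScaledBitmap(firstFrame, sizePx, sizePx, /* filter = */ true)
}
```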
In this way, through the thumbnail display of the self-made emoticon file in the emoticon panel, the invocation of the self-made emoticon file can be triggered through the thumbnail displayed in the emoticon panel, so that both the generation of the self-made emoticon file and the implementation of the designated function can be completed directly within the application.
In this exemplary embodiment, the thumbnail display of the self-made emoticon file in the emoticon panel makes the self-made emoticon file similar to other ordinary emoticon files and consistent with the existing emoticon thumbnail logic, giving it very high generality.
FIG. 6 is a flowchart of a method for processing an image in an application according to another exemplary embodiment. The thumbnail displayed for the self-made emoticon file is associated with the emoticon file. After the step of displaying the thumbnail of the self-made emoticon file in the emoticon panel of the application, the method for processing an image in an application, as shown in FIG. 6, may include the following steps.
In step S410, through a selection operation triggered in the emoticon panel on the thumbnail displayed for the self-made emoticon file, the emoticon file indexed locally on the terminal device or on the application server by the user identifier logged in to the application and the thumbnail is selected and invoked.
The emoticon panel is displayed in the application, and the emoticon panel contains a thumbnail for every emoticon file, that is, for both ordinary emoticon files and self-made emoticon files. In other words, the emoticon panel lays out the thumbnails corresponding to the emoticon files, and the user indexes the corresponding emoticon file locally on the terminal device or on the application server by triggering a selection operation on one of the thumbnails.
A thumbnail is associated with an emoticon file; specifically, the thumbnail corresponding to an emoticon file is associated with that emoticon file, and the emoticon file is stored locally on the terminal device and/or on the application server with the user identifier as an index.
Therefore, the corresponding emoticon file can be invoked according to the selected thumbnail and the user identifier logged in to the application.
In step S430, the self-made emoticon file is transmitted in the application.
The transmission of the self-made emoticon file in the application is used to implement the designated function configured by the application itself. The implementation of the transmission differs depending on the application. Specifically, the transmission of the self-made emoticon file in the application may be the application publishing the emoticon file to a specified page, or transmitting the emoticon file to an application logged in by another user, or some other implementation, which is not limited here.
For example, for the emoticon store application, the transmission of the self-made emoticon file is publishing the self-made emoticon image corresponding to the self-made emoticon file on a specified page of the emoticon store application, so that other users can browse the self-made emoticon images published on the specified page.
For another example, for the social application, the transmission of the self-made emoticon file includes, on the one hand, transmitting the emoticon file to the friend or group participating in the conversation, and on the other hand, displaying the emoticon image corresponding to the emoticon file in the conversation window corresponding to the conversation in the application.
In the social application, it can be understood that the emoticon file is sent to the social application client logged in by the friend, or the social application logged in by each member of the group, and the corresponding emoticon image is displayed in the conversation window of the social application receiving the emoticon file.
It can be understood that the sending of the emoticon file is implemented through forwarding by the application server, that is, the social application server corresponding to the social application.
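As an illustration of the send path only, the following sketch posts the selected emoticon file to the conversation through a hypothetical `SocialServerClient` interface and then echoes the emoticon image into the local conversation window. The interface, its `forwardEmoticon` method, and the callback names are assumptions made for the example (it also reuses the `EmoticonStore` sketch above); none of them is an API defined by this disclosure.

```kotlin
import java.io.File

// Hypothetical transport to the social application server (names assumed).
interface SocialServerClient {
    fun forwardEmoticon(fromUserId: String, toConversationId: String, emoticonFile: File)
}

class EmoticonSender(
    private val server: SocialServerClient,
    private val showInConversationWindow: (File) -> Unit // renders the emoticon image locally
) {
    // Invoke the file indexed by the logged-in user id and the selected thumbnail, then transmit it.
    fun send(userId: String, conversationId: String, store: EmoticonStore, fileName: String) {
        val file = store.listFor(userId).firstOrNull { it.name == fileName }
            ?: error("No self-made emoticon file named $fileName for user $userId")
        server.forwardEmoticon(userId, conversationId, file)  // server forwards to friend/group
        showInConversationWindow(file)                        // also display in the local window
    }
}
```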
Through this exemplary embodiment, for the application, the self-made emoticon file can be obtained without resorting to an emoticon-pack making application specially downloaded and deployed on the terminal device, which improves the simplicity and convenience of custom emoticon file making; the user can quickly and immersively make emoticon files in the application and immediately configure and spread them.
The exemplary embodiments of the present disclosure can fully integrate the custom making of emoticon files with the conversation experience based on the user's instant messaging interaction in the application. On the one hand, for a terminal device with a limited screen size, such as a smartphone, the operation cost is greatly reduced; on the other hand, the material of the self-made emoticon file comes from the image the user is spreading, that is, an image sent or received by the user, and the making of the self-made emoticon file is also completed entirely within the conversation, meeting the needs of instant communication and subject customization and enabling the most timely and rapid sending.
In addition, the self-made emoticon file obtained in the exemplary embodiments of the present disclosure can also be spread to other scenarios by saving a copy and forwarding, thereby reusing the self-made emoticon file.
Taking the implementation of a self-made emoticon image in an instant messaging tool as an example, the method described above is described with reference to a specific application scenario. The instant messaging tool runs on a smartphone held by the user.
Specifically, the user opens the instant messaging tool on the smartphone, selects a single friend or a group, and enters the corresponding conversation window. During the conversation, the user performs an operation of sending an image, or receives an image sent by the other party or by another friend in the group. For example, FIG. 7 is a schematic diagram of an interface of a conversation window between a user and a friend in the instant messaging tool according to an exemplary embodiment.
In the conversation window 510 between the user and the friend, as messages sent by the friend are received, an image 530 sent by the friend is displayed in the conversation window 510.
In the conversation window 510, the user long-presses the image 530 to call out the operation items 550, and triggers a selection operation on the make-emoticon operation item 551 among the operation items 550; at this point, the emoticon editing interface for the image 530 is entered. FIG. 8 is a schematic diagram of the interface of the conversation window with the called-out operation items according to the embodiment corresponding to FIG. 7.
After jumping into the emoticon editing of the image 530, the image 530 is first cropped into a size and shape conforming to an emoticon image, and a specific range of the picture is selected, as shown in FIG. 9. FIG. 9 is a schematic diagram of the emoticon editing interface for cropping the image according to the embodiment corresponding to FIG. 8.
A text input operation is then performed. The user may choose default copy or perform custom editing and input, and the input text can be dragged to adjust its position. After the text input is completed, the text is rendered in a preset font, and a mark may even be automatically stamped on a corner, so that an emoticon image is preliminarily obtained.
FIG. 10 is a schematic diagram of an interface for triggering text input on the cropped image according to the embodiment corresponding to FIG. 9; correspondingly, FIG. 11 is a schematic diagram of an interface after the text input is completed according to the embodiment corresponding to FIG. 10.
FIG. 12 is a schematic diagram of an interface after rendering of the input text is completed according to the embodiment corresponding to FIG. 11. Through FIG. 10 to FIG. 12, the self-made emoticon image is finally obtained.
A self-made emoticon file configured in the application is generated from the self-made emoticon image. At this point, the user can invoke the self-made emoticon file in the conversation window 510 to send it to the friend. FIG. 13 is a schematic diagram of the conversation window in which the self-made emoticon image is sent according to the embodiment corresponding to FIG. 12.
On the other hand, the self-made emoticon file configured in the application is displayed as a thumbnail in the emoticon panel 570 of the conversation window 510, that is, the thumbnail 580 shown in FIG. 14. FIG. 14 is a schematic diagram of the thumbnail display of the self-made emoticon file in the conversation window according to the embodiment corresponding to FIG. 13.
The implementation process involved in this application scenario is shown in FIG. 15. FIG. 15 is a schematic flowchart of implementing a self-made emoticon image in an instant messaging tool according to an exemplary embodiment.
As shown in FIG. 15, in the conversation scenario implemented by the instant messaging tool, an image is received, and emoticon making is triggered.
As shown in step 610, with the triggering of emoticon making, the instant messaging tool enters the image editing channel and opens the emoticon editing tool. Under the action of the emoticon editing tool, the flow shown in steps 620 to 640 can be performed to obtain the self-made emoticon image.
At this point, as shown in step 650, the user can send it or choose to continue editing. If the user chooses to continue editing, the process shown in steps 660 to 670 is performed.
After the self-made emoticon image is obtained in step 650 or step 670, step 710 can be performed to store the self-made emoticon image to the background emoticon channel, so as to store the self-made emoticon image in the instant messaging tool and complete its configuration in the instant messaging tool.
For the self-made emoticon image configured in the instant messaging tool, step 730 can be performed to send the self-made emoticon image in the conversation window; on the other hand, step 750 is also performed to capture the first frame of the dynamically displayed self-made emoticon image as a preview, implementing the thumbnail display of the self-made emoticon image in the emoticon panel.
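Putting the pieces together, the sketch below strings the earlier example functions into one flow that mirrors the ordering of steps 610 through 750. It reuses the hypothetical helpers defined above, and the file name and extension it generates are likewise made up; it illustrates only the ordering, not the implementation of this disclosure.

```kotlin
import android.graphics.Bitmap
import java.io.ByteArrayOutputStream

// End-to-end ordering: edit -> encode once -> store by user id -> send -> panel thumbnail.
fun makeAndSendEmoticon(
    source: Bitmap,
    caption: String,
    userId: String,
    conversationId: String,
    store: EmoticonStore,
    sender: EmoticonSender
): Bitmap {
    val edited = applyEmoticonEdits(source, caption, x = 20f, y = 40f)    // steps 620-670
    val encoded = ByteArrayOutputStream().also {
        edited.compress(Bitmap.CompressFormat.PNG, 100, it)               // single compression pass
    }.toByteArray()
    val file = store.store(userId, "self_made_${System.currentTimeMillis()}.emo", encoded) // step 710
    sender.send(userId, conversationId, store, file.name)                 // step 730
    return panelThumbnail(encoded)    // step 750: caller places this preview in the emoticon panel
}
```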
In this way, the self-made emoticon image is implemented based on the instant messaging relationship chain, and while it is invoked in the conversation, it can also be spread to other scenarios.
The following are apparatus embodiments of the present disclosure, which can be used to perform the above embodiments of the method for processing an image in an application of the present disclosure. For details not disclosed in the apparatus embodiments of the present disclosure, refer to the embodiments of the method for processing an image in an application of the present disclosure.
FIG. 16 is a block diagram of an apparatus for processing an image in an application according to an exemplary embodiment. As shown in FIG. 16, the apparatus for processing an image in an application may include, but is not limited to: an instruction receiver 810, an emoticon editing invoker 830, a self-made emoticon obtainer 850, and a configurator 870.
The instruction receiver 810 is configured to receive an emoticon file generation instruction corresponding to an image displayed in the application.
The emoticon editing invoker 830 is configured to invoke the emoticon editing tool built into the application according to the instruction triggered on the image to generate an emoticon file.
The self-made emoticon obtainer 850 is configured to obtain the self-made emoticon image corresponding to the image through the emoticon editing tool and the emoticon editing operation triggered on the image.
The configurator 870 is configured to generate, from the self-made emoticon image, a self-made emoticon file configured in the application, the self-made emoticon file being invoked by the application to implement the designated function configured by the application itself.
In an exemplary embodiment, the application is a social application and the image is an image received or sent in a conversation of the social application. The instruction receiver 810 is further configured to, while received and/or sent images are being displayed on the conversation interface of the social application, receive an emoticon file generation instruction corresponding to an image displayed in the social application, where the self-made emoticon file generated for the displayed image according to the emoticon file generation instruction will be invoked by the social application and used to implement the function of transmitting emoticon images in a conversation.
In another exemplary embodiment, the instruction receiver 810 is further configured to, in the application or social application, activate the associated operation items of the displayed image and obtain the emoticon file generation instruction corresponding to the displayed image through the triggered selection of the emoticon making operation item among the associated operation items.
FIG. 17 is a block diagram describing details of the self-made emoticon obtainer according to the embodiment corresponding to FIG. 16. As shown in FIG. 17, the self-made emoticon obtainer 850 may include, but is not limited to: a state jumper 851 and an image processing executor 853.
The state jumper 851 is configured to jump into the emoticon editing state corresponding to the displayed image through the emoticon editing tool.
The image processing executor 853 is configured to receive an emoticon editing operation triggered on the image in the emoticon editing state, and perform image processing corresponding to the emoticon editing operation on the image in the emoticon editing state through the emoticon editing tool, to obtain the self-made emoticon image corresponding to the image.
FIG. 18 is a block diagram describing details of the configurator according to the embodiment corresponding to FIG. 16. As shown in FIG. 18, the configurator 870 may include, but is not limited to: an emoticon file generator 871 and an index storage 873.
The emoticon file generator 871 is configured to generate the self-made emoticon file corresponding to the self-made emoticon image.
The index storage 873 is configured to obtain the user identifier logged in to the application, and store the self-made emoticon file on the application server and/or locally on the terminal device with the user identifier as an index.
In an exemplary embodiment, the apparatus for processing an image in an application may further include, but is not limited to, a panel display controller. The panel display controller is configured to display a thumbnail of the self-made emoticon file in the emoticon panel of the application.
FIG. 19 is a block diagram of an apparatus for processing an image in an application according to another exemplary embodiment. As shown in FIG. 19, the apparatus for processing an image in an application may further include, but is not limited to: an emoticon file invoker 910 and a transmitter 930.
The emoticon file invoker 910 is configured to, through a selection operation triggered in the emoticon panel on the thumbnail displayed for the self-made emoticon file, select and invoke the emoticon file indexed locally on the terminal device or on the application server by the user identifier logged in to the application and the thumbnail.
The transmitter 930 is configured to transmit the self-made emoticon file in the application.
Optionally, the present disclosure further provides a terminal device that performs all or some of the steps of the method for processing an image in an application shown in any of FIG. 3, FIG. 4, FIG. 5, and FIG. 6. The device includes:
a processor; and
a memory having computer readable instructions stored thereon which, when executed by the processor, implement the method for processing an image in an application as described above.
Optionally, the present disclosure further provides a computer readable storage medium having a computer program stored thereon which, when executed by a processor, implements the method for processing an image in an application as described above.
It should be understood that the present disclosure is not limited to the precise constructions described above and shown in the accompanying drawings, and various modifications and changes can be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

  1. A method for processing an image in an application, the method being performed by a terminal device and comprising:
    receiving an emoticon file generation instruction corresponding to an image displayed in an application running on the terminal device;
    invoking an emoticon editing tool built into the application according to the instruction triggered on the image to generate an emoticon file;
    obtaining a self-made emoticon image corresponding to the image through the emoticon editing tool and an emoticon editing operation triggered on the image; and
    generating, from the self-made emoticon image, a self-made emoticon file configured in the application, the self-made emoticon file being invoked by the application to implement a designated function configured by the application itself, the designated function being distinct from the emoticon making function.
  2. The method according to claim 1, wherein the application is a social application, the image is an image received or sent in a conversation of the social application, and the receiving an emoticon file generation instruction corresponding to an image displayed in the application comprises:
    while received and/or sent images are being displayed on a conversation interface of the social application, receiving an emoticon file generation instruction corresponding to an image displayed in the social application, wherein a self-made emoticon file generated for the displayed image according to the emoticon file generation instruction is to be invoked by the social application and used to implement a function of transmitting emoticon images in the conversation.
  3. The method according to claim 1 or 2, wherein the receiving an emoticon file generation instruction corresponding to an image displayed in the application or the social application comprises:
    in the application or the social application, activating associated operation items of the displayed image, and obtaining the emoticon file generation instruction corresponding to the displayed image through triggered selection of an emoticon making operation item among the associated operation items.
  4. The method according to claim 1, wherein the obtaining a self-made emoticon image corresponding to the image through the emoticon editing tool and an emoticon editing operation triggered on the image comprises:
    jumping into an emoticon editing state corresponding to the displayed image through the emoticon editing tool; and
    receiving an emoticon editing operation triggered on the image in the emoticon editing state, and performing, through the emoticon editing tool, image processing corresponding to the emoticon editing operation on the image in the emoticon editing state, to obtain the self-made emoticon image corresponding to the image.
  5. The method according to claim 1, wherein the generating, from the self-made emoticon image, a self-made emoticon file configured in the application comprises:
    generating the self-made emoticon file corresponding to the self-made emoticon image; and
    obtaining a user identifier logged in to the application, and storing the self-made emoticon file on an application server and/or locally on the terminal device with the user identifier as an index.
  6. The method according to claim 1, wherein after the generating, from the self-made emoticon image, a self-made emoticon file configured in the application, the method further comprises:
    displaying a thumbnail of the self-made emoticon file in an emoticon panel of the application.
  7. The method according to claim 6, wherein the thumbnail displayed for the self-made emoticon file is associated with the emoticon file, and after the displaying a thumbnail of the self-made emoticon file in an emoticon panel of the application, the method further comprises:
    through a selection operation triggered in the emoticon panel on the thumbnail displayed for the self-made emoticon file, selecting and invoking the emoticon file indexed locally on the terminal device or on the application server by the user identifier logged in to the application and the thumbnail; and
    transmitting the self-made emoticon file in the application.
  8. An apparatus for processing an image in an application, the apparatus being configured on a terminal device and comprising:
    an instruction receiver configured to receive an emoticon file generation instruction corresponding to an image displayed in the application;
    an emoticon editing invoker configured to invoke an emoticon editing tool built into the application according to the instruction triggered on the image to generate an emoticon file;
    a self-made emoticon obtainer configured to obtain a self-made emoticon image corresponding to the image through the emoticon editing tool and an emoticon editing operation triggered on the image; and
    a configurator configured to generate, from the self-made emoticon image, a self-made emoticon file configured in the application, the self-made emoticon file being invoked by the application to implement a designated function configured by the application itself.
  9. The apparatus according to claim 8, wherein the application is a social application, the image is an image received or sent in a conversation of the social application, and the instruction receiver is further configured to, while received and/or sent images are being displayed on a conversation interface of the social application, receive an emoticon file generation instruction corresponding to an image displayed in the social application, wherein a self-made emoticon file generated for the displayed image according to the emoticon file generation instruction is to be invoked by the social application and used to implement a function of transmitting emoticon images in the conversation.
  10. The apparatus according to claim 8 or 9, wherein the instruction receiver is further configured to, in the application or the social application, activate associated operation items of the displayed image, and obtain the emoticon file generation instruction corresponding to the displayed image through triggered selection of an emoticon making operation item among the associated operation items.
  11. The apparatus according to claim 8, wherein the self-made emoticon obtainer comprises:
    a state jumper configured to jump into an emoticon editing state corresponding to the displayed image through the emoticon editing tool; and
    an image processing executor configured to receive an emoticon editing operation triggered on the image in the emoticon editing state, and perform, through the emoticon editing tool, image processing corresponding to the emoticon editing operation on the image in the emoticon editing state, to obtain the self-made emoticon image corresponding to the image.
  12. The apparatus according to claim 8, wherein the configurator comprises:
    an emoticon file generator configured to generate the self-made emoticon file corresponding to the self-made emoticon image; and
    an index storage configured to obtain a user identifier logged in to the application, and store the self-made emoticon file on an application server and/or locally on the terminal device with the user identifier as an index.
  13. The apparatus according to claim 8, further comprising:
    a panel display controller configured to display a thumbnail of the self-made emoticon file in an emoticon panel of the application.
  14. The apparatus according to claim 13, further comprising:
    an emoticon file invoker configured to, through a selection operation triggered in the emoticon panel on the thumbnail displayed for the self-made emoticon file, select and invoke the emoticon file indexed locally on the terminal device or on the application server by the user identifier logged in to the application and the thumbnail; and
    a transmitter configured to transmit the self-made emoticon file in the application.
  15. A terminal device, comprising:
    a processor; and
    a memory having computer readable instructions stored thereon which, when executed by the processor, implement the method for processing an image in an application according to any one of claims 1 to 7.
  16. A computer readable storage medium having a computer program stored thereon which, when executed by a processor, implements the method for processing an image in an application according to any one of claims 1 to 7.
PCT/CN2018/103874 2017-09-07 2018-09-04 处理应用中图像的方法、装置、终端设备和存储介质 WO2019047809A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2020513636A JP7253535B2 (ja) 2017-09-07 2018-09-04 アプリケーションにおける画像を処理する方法、装置、機器端末及び記憶媒体
KR1020237022225A KR20230104999A (ko) 2017-09-07 2018-09-04 애플리케이션에서 이미지를 처리하기 위한 방법 및장치, 단말 장치 및 저장 매체
KR1020207007499A KR20200036937A (ko) 2017-09-07 2018-09-04 애플리케이션에서 이미지를 처리하기 위한 방법 및 장치, 단말 장치 및 저장 매체
KR1020227006348A KR20220028184A (ko) 2017-09-07 2018-09-04 애플리케이션에서 이미지를 처리하기 위한 방법 및 장치, 단말 장치 및 저장 매체
US16/794,001 US20200186484A1 (en) 2017-09-07 2020-02-18 Method and apparatus for processing image in application, terminal device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710811389.1 2017-09-07
CN201710811389.1A CN109472849B (zh) 2017-09-07 2017-09-07 处理应用中图像的方法、装置、终端设备和存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/794,001 Continuation US20200186484A1 (en) 2017-09-07 2020-02-18 Method and apparatus for processing image in application, terminal device and storage medium

Publications (1)

Publication Number Publication Date
WO2019047809A1 true WO2019047809A1 (zh) 2019-03-14

Family

ID=65634734

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/103874 WO2019047809A1 (zh) 2017-09-07 2018-09-04 处理应用中图像的方法、装置、终端设备和存储介质

Country Status (5)

Country Link
US (1) US20200186484A1 (zh)
JP (1) JP7253535B2 (zh)
KR (3) KR20230104999A (zh)
CN (1) CN109472849B (zh)
WO (1) WO2019047809A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI601994B (zh) 2015-12-15 2017-10-11 大立光電股份有限公司 取像用光學鏡頭組、取像裝置及電子裝置
CN110780955B (zh) * 2019-09-05 2023-08-22 连尚(新昌)网络科技有限公司 一种用于处理表情消息的方法与设备
CN112800365A (zh) * 2020-09-01 2021-05-14 腾讯科技(深圳)有限公司 表情包的处理方法、装置及智能设备
CN114693827A (zh) * 2022-04-07 2022-07-01 深圳云之家网络有限公司 表情生成方法、装置、计算机设备和存储介质
CN114880062B (zh) * 2022-05-30 2023-11-14 网易(杭州)网络有限公司 聊天表情展示方法、设备、电子设备及存储介质
CN115348225B (zh) * 2022-06-06 2023-11-07 钉钉(中国)信息技术有限公司 表情信息处理方法、电子设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101072207A (zh) * 2007-06-22 2007-11-14 腾讯科技(深圳)有限公司 即时通讯工具中的交流方法及即时通讯工具
CN102811184A (zh) * 2012-08-28 2012-12-05 腾讯科技(深圳)有限公司 自定义表情的共享方法、终端、服务器及系统
CN106709975A (zh) * 2017-01-11 2017-05-24 山东财经大学 一种交互式三维人脸表情动画编辑方法、系统及扩展方法
CN107368199A (zh) * 2017-07-01 2017-11-21 北京奇虎科技有限公司 基于移动终端的社交软件的表情管理方法及装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100471594B1 (ko) * 2002-11-26 2005-03-10 엔에이치엔(주) 사용자 정의 이모티콘 이미지를 이용한 컴퓨터 네트워크상에서의 데이터 전송 서비스 제공방법 및 그를 구현하기위한 응용 프로그램을 기록한 컴퓨터가 읽을 수 있는기록매체
CN101252550A (zh) * 2008-03-31 2008-08-27 腾讯科技(深圳)有限公司 自定义信息管理装置、方法及系统
CN106658079B (zh) * 2017-01-05 2019-04-30 腾讯科技(深圳)有限公司 自定义生成表情图像的方法及装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101072207A (zh) * 2007-06-22 2007-11-14 腾讯科技(深圳)有限公司 即时通讯工具中的交流方法及即时通讯工具
CN102811184A (zh) * 2012-08-28 2012-12-05 腾讯科技(深圳)有限公司 自定义表情的共享方法、终端、服务器及系统
CN106709975A (zh) * 2017-01-11 2017-05-24 山东财经大学 一种交互式三维人脸表情动画编辑方法、系统及扩展方法
CN107368199A (zh) * 2017-07-01 2017-11-21 北京奇虎科技有限公司 基于移动终端的社交软件的表情管理方法及装置

Also Published As

Publication number Publication date
JP2020533677A (ja) 2020-11-19
JP7253535B2 (ja) 2023-04-06
KR20220028184A (ko) 2022-03-08
CN109472849B (zh) 2023-04-07
KR20200036937A (ko) 2020-04-07
US20200186484A1 (en) 2020-06-11
KR20230104999A (ko) 2023-07-11
CN109472849A (zh) 2019-03-15

Similar Documents

Publication Publication Date Title
WO2019047809A1 (zh) 处理应用中图像的方法、装置、终端设备和存储介质
US10152207B2 (en) Method and device for changing emoticons in a chat interface
CN107566243B (zh) 一种基于即时通信的图片发送方法和设备
CN108156026B (zh) 对讲机配置方法及装置
CN106126025B (zh) 复制粘贴的交互方法及装置
CN111554382B (zh) 医学图像的处理方法及装置、电子设备和存储介质
WO2022078295A1 (zh) 一种设备推荐方法及电子设备
EP2950486B1 (en) Method and device for managing instant message
CN104239317A (zh) 在浏览器中实现图片编辑的方法及装置
US20160353406A1 (en) Media information sharing between networked mobile devices
CN110704030A (zh) 接口配置信息生成方法、装置、电子设备及存储介质
CN114153362A (zh) 信息处理方法及装置
CN108132736B (zh) 窗口中的显示控制方法和装置
CN106447747B (zh) 图像处理方法及装置
KR20220117070A (ko) 아바타 사용 권한 관리 방법 및 시스템
US11310177B2 (en) Message display method and terminal
US20150286361A1 (en) Single gesture video capture and share
KR102095666B1 (ko) 정보 입력 방법, 장치, 프로그램 및 저장매체
CN104994151A (zh) 信息发布方法和装置
CN114025317A (zh) 多媒体资源的传播方法、装置、服务器、终端及存储介质
CN109981729B (zh) 文件处理方法、装置、电子设备及计算机可读存储介质
CN114138413A (zh) 图标显示方法、装置、电子设备及存储介质
CN113919311A (zh) 数据展示方法、装置、电子设备及存储介质
CN110807116B (zh) 一种数据处理方法、装置和用于数据处理的装置
US20140214987A1 (en) Method and system of providing an instant messaging service

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18854582

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020513636

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20207007499

Country of ref document: KR

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 18854582

Country of ref document: EP

Kind code of ref document: A1