US20180188905A1 - Generating messaging streams with animated objects - Google Patents
- Publication number
- US20180188905A1 (application US 15/398,497)
- Authority
- US
- United States
- Prior art keywords
- user
- animated object
- animated
- action
- messaging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06Q50/50—
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
- G06Q50/01—Social networking
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/10—Multimedia information
Definitions
- Implementations generally relate to a computer-implemented method to generate a messaging stream where one or more messages are exchanged between a first user and a second user.
- the method may include generating a messaging stream where one or more messages are exchanged between a first user and a second user.
- the method further includes receiving a selection of an animated object from the first user for the messaging stream.
- the method further includes providing the animated object in the messaging stream.
- the method further includes receiving a first action from the first user related to the animated object.
- the method further includes modifying a display of the animated object based on the first action.
- the method further includes receiving a second action from the second user related to the animated object.
- the method further includes modifying the display of the animated object based on the second action.
- receiving the first action from the first user includes detecting a change in airflow near a microphone associated with a user device. In some implementations, receiving the first action from the first user includes detecting movement of a finger on a touch screen or detecting movement of a pointing device. In some implementations, receiving the first action from the first user includes detecting movement of a user device based on information received from a sensor associated with the user device. In some implementations, receiving the first action from the first user includes receiving a message from the first user and determining a context from the messaging stream, and modifying a display of the animated object based on the first action includes modifying the display of the animated object based on the context from the messaging stream.
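The airflow-detection implementation described above can be approximated by thresholding the energy of short microphone frames, since blowing into a microphone produces much higher broadband energy than ambient sound. The sketch below is a minimal illustration with made-up sample data and an assumed threshold, not the patent's actual implementation:

```python
import math

def detect_blow(samples, rms_threshold=0.3):
    """Return True if a frame's RMS energy suggests the user blew
    into the microphone. The 0.3 threshold is an illustrative value
    for normalized samples in [-1.0, 1.0], not from the patent."""
    if not samples:
        return False
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return rms >= rms_threshold

# A quiet ambient frame vs. a loud "blow" frame (synthetic data)
quiet = [0.01, -0.02, 0.015, -0.01]
blow = [0.6, -0.7, 0.65, -0.55]
```

A real client would pull frames from the platform's audio API and would likely combine the energy check with a low-frequency spectral test so that loud speech does not trigger the animation.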
- the method further comprises displaying the one or more messages within the messaging stream while the animated object remains fixed in a portion of the messaging stream.
- the first user is a chat bot and the animated object is related to the one or more messages exchanged between the first user and the second user.
- the display of the animated object is modified based on at least one of a word in the one or more messages, voice content, and a context of the messaging stream.
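The word-based modification described above can be sketched as a lookup from words in a message to reactions. The keyword table and reaction names below are illustrative assumptions; the patent leaves the exact mapping unspecified:

```python
# Hypothetical keyword-to-reaction table; entries are illustrative.
REACTIONS = {
    "cookie": "nibble",
    "happy": "smile",
    "snow": "shiver",
}

def react_to_message(text):
    """Return the first reaction triggered by a word in the message,
    or None if no keyword matches."""
    for word in text.lower().split():
        reaction = REACTIONS.get(word.strip(".,!?"))
        if reaction:
            return reaction
    return None
```

In practice the patent also mentions voice content and conversation context as triggers, which would feed richer signals (e.g., from speech recognition or a trained classifier) into the same dispatch step.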
- the method further comprises identifying a set of objects to provide as options to the first user based on a type of user device on which the messaging stream is displayed, wherein the animated object is part of the set of objects.
- the method includes generating a messaging stream where one or more messages are exchanged between a first user and a second user.
- the method further includes receiving a selection of an animated object from the first user for the messaging stream.
- the method further includes causing a version of the animated object to be displayed based on a type of user device on which the animated object is displayed.
- the method further includes receiving an action from the second user related to the animated object.
- the method further includes in response to receiving the action, modifying the display of the animated object.
- the selection is a first selection and the animated object is a first animated object, and the method further comprises providing the second user with a set of animated objects based on their relationship to the first animated object, receiving a second selection of a second animated object from the set of animated objects, and in response to receiving the second selection, providing the second animated object in the messaging stream.
- the version of the animated object to be displayed based on the type of user device on which the animated object is displayed includes a complex version for a desktop computer, a simpler version for a mobile device, and an even simpler version for a smart watch.
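The per-device versioning described above amounts to selecting an asset tier by device type. A minimal sketch, with hypothetical tier names and a conservative fallback for unrecognized devices:

```python
# Hypothetical mapping from device type to asset tier; the names are
# illustrative, not from the patent.
ASSET_VERSIONS = {
    "desktop": "full",    # complex version: full animation, effects
    "mobile": "reduced",  # simpler version: fewer frames, smaller assets
    "watch": "minimal",   # simplest version: low-resolution sprite
}

def select_object_version(device_type: str) -> str:
    """Pick the animated-object version for a device type, falling
    back to the most constrained tier for unknown devices."""
    return ASSET_VERSIONS.get(device_type, "minimal")
```

Falling back to the smallest version is a deliberate choice here: an unknown device can always render the minimal asset, whereas the full version might exceed its storage, bandwidth, or processing capacity.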
- the selection is a first selection
- the animated object is a first animated object
- the action from the second user includes a second selection of a second animated object from the second user for the messaging stream
- modifying the display of the first animated object includes the first animated object interacting with the second animated object
- the method may include means for generating a messaging stream where one or more messages are exchanged between a first user and a second user.
- the method further includes means for receiving a selection of an animated object from the first user for the messaging stream.
- the method further includes means for providing the animated object in the messaging stream.
- the method further includes means for receiving a first action from the first user related to the animated object.
- the method further includes means for modifying a display of the animated object based on the first action.
- the method further includes means for receiving a second action from the second user related to the animated object.
- the method further includes means for modifying the display of the animated object based on the second action.
- the various implementations described below provide messaging streams that include interactive animated objects.
- the animated objects described below may include multimedia features, such as features that can be displayed on a display screen, projected in a virtual reality environment, played back as audio, played back via haptic feedback, or a combination of such modalities.
- the animated objects react to user input, messages exchanged in the messaging stream, messaging context, and a combination of such factors.
- Interactive animated objects may provide several advantages. For example, such animated objects may enhance the user experience of using a messaging application that implements the messaging stream, e.g., by enabling users to express themselves in interactive ways, rather than being restricted to sending text, images, or stock items such as animated GIFs, stickers, and emoji.
- the animated objects may be acted upon concurrently, or sequentially, by multiple users.
- Such interactivity may provide geographically separated users an experience of collaboratively modifying an object.
- behaviors of the objects may be customizable by users.
- the animated objects may be preprogrammed for certain behaviors, e.g., may be configured to respond to various user actions, messaging context, etc. by being modified or displayed in accordance with the user actions or context.
- a technical advantage is that a large or even infinite number of behaviors of animated objects may be provided by a combination of preprogrammed behaviors, without the need to explicitly specify each behavior.
- This can permit complex animated objects to be stored and displayed with limited computational resources.
- Animated objects may be downloaded or accessed on demand, e.g., only upon insertion of the animated object in the messaging stream, which reduces storage space required to store the objects.
- Different versions of an animated object may be provided, as described below, for different user devices. Therefore, devices with limited storage space, network access capacity, and processing power, can still render a suitable version of an animated object.
- implementations described below provide animated objects in messaging streams for different types of user devices, without the need for device-specific reconfiguration.
- FIG. 1 illustrates a block diagram of an example system that generates a messaging stream with an animated object according to some implementations.
- FIG. 2 illustrates a block diagram of an example computing device that generates the messaging stream with the animated object according to some implementations.
- FIG. 3A illustrates an example user interface of a messaging stream that includes an animated cookie according to some implementations.
- FIG. 3B illustrates an example user interface of a messaging stream that includes the animated cookie with eyes that follow a pointing device controlled by a first user according to some implementations.
- FIG. 3C illustrates an example user interface of a messaging stream that includes the animated cookie that is viewable by multiple users according to some implementations.
- FIG. 3D illustrates an example user interface of a messaging stream that includes the animated cookie with eyes that follow a cursor controlled by a second user according to some implementations.
- FIG. 3E illustrates an example user interface of a messaging stream that includes the animated cookie that reacts to being moved according to some implementations.
- FIG. 3F illustrates an example user interface of a messaging stream that includes the animated cookie that reacts to being placed according to some implementations.
- FIG. 3G illustrates an example user interface of a messaging stream that includes the animated cookie with eyes that move to view text displayed within the messaging stream according to some implementations.
- FIG. 3H illustrates an example user interface of a messaging stream that includes the animated cookie that stays in a fixed location within the messaging stream according to some implementations.
- FIG. 3I illustrates an example user interface of a messaging stream that includes an animated cookie that reacts to a word in a message according to some implementations.
- FIG. 3J illustrates an example user interface of a messaging stream that includes the animated cookie that continues to react to the word in the message according to some implementations.
- FIG. 4A illustrates an example user interface of a messaging stream that includes an animated bear according to some implementations.
- FIG. 4B illustrates an example user interface of the messaging stream that includes the animated bear after a user performed a user action according to some implementations.
- FIG. 5A illustrates an example user interface of a messaging stream that includes animated bubbles according to some implementations.
- FIG. 5B illustrates the example user interface of the messaging stream that includes the animated bubbles after two users performed user actions according to some implementations.
- FIG. 6 illustrates an example user interface of a messaging stream that includes an animated money transfer object according to some implementations.
- FIG. 7A illustrates an example user interface of an animated takeout box according to some implementations.
- FIG. 7B illustrates the example user interface of the animated takeout box after a user performed a user action according to some implementations.
- FIG. 8A illustrates an example user interface of a messaging stream that includes an animated airplane according to some implementations.
- FIG. 8B illustrates an example user interface of the messaging stream that includes an animated couple that is displayed responsive to a user action related to the animated airplane according to some implementations.
- FIG. 9 illustrates a flowchart of an example method to generate a messaging stream that includes an animated object.
- a messaging application generates a messaging stream where messages are exchanged between a first user and a second user.
- the messaging application may receive a selection of an animated object from the first user for the messaging stream. For example, the first user may select an animated bubble to add to the messaging stream.
- the messaging application may add the animated object to the messaging stream. For example, the first user and the second user may view the animated bubble in the messaging stream.
- the messaging application may receive a first action from the first user related to the animated object. For example, the first user may blow into a user device and the user device detects a change in airflow near the microphone.
- the messaging application may modify a display of the animated object based on the first action. For example, the messaging application may show the bubble as moving upwards based on the user blowing into the user device.
- the messaging application may receive a second action from the second user related to the animated object. For example, the messaging application may receive indications based on the second user pulling the bubble with a finger on the user device.
- the messaging application may modify the display of the animated object based on the second action. For example, the messaging application may show the bubble as stretching in the direction corresponding to the movement of the finger and then popping.
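The sequence above (one user adds an object, both users act on it, and the display updates after each action) can be modeled as a small piece of shared state in the messaging stream. The bubble below is a toy sketch; the action names and rendered strings are assumptions, not the patent's specification:

```python
class AnimatedBubble:
    """Toy model of an animated object shared in a messaging stream.

    The patent describes behavior only at the level of 'modify the
    display based on the action'; the concrete actions ("blow",
    "pull") and state fields here are illustrative.
    """

    def __init__(self):
        self.height = 0
        self.popped = False

    def apply_action(self, user, action):
        """Update the object for an action from either user and
        return the new display state."""
        if action == "blow":
            self.height += 1   # bubble drifts upward
        elif action == "pull":
            self.popped = True  # stretching the bubble pops it
        return self.render()

    def render(self):
        return "popped" if self.popped else f"floating at {self.height}"

bubble = AnimatedBubble()
bubble.apply_action("first user", "blow")
state = bubble.apply_action("second user", "pull")
```

In a client-server deployment like FIG. 1, each action would be sent to the messaging server, which applies it to the shared object state and pushes the updated display to every participant's device.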
- FIG. 1 illustrates a block diagram of an example system 100 that generates a messaging stream that includes an animated object.
- the illustrated system 100 includes a messaging server 101 , user devices 115 a , 115 n , a second server 120 , and a network 105 . Users 125 a , 125 n may be associated with respective user devices 115 a , 115 n .
- the system 100 may include other servers or devices not shown in FIG. 1 .
- a letter after a reference number e.g., “ 115 a ,” represents a reference to the element having that particular reference number.
- a reference number in the text without a following letter, e.g., “ 115 ,” represents a general reference to implementations of the element bearing that reference number.
- the messaging server 101 may include a processor, a memory, and network communication capabilities.
- the messaging server 101 is a hardware server.
- the messaging server 101 is communicatively coupled to the network 105 via signal line 102 .
- Signal line 102 may be a wired connection, such as Ethernet, coaxial cable, fiber-optic cable, etc., or a wireless connection, such as Wi-Fi®, Bluetooth®, or other wireless technology.
- the messaging server 101 sends and receives data to and from one or more of the user devices 115 a , 115 n and the second server 120 via the network 105 .
- the messaging server 101 may include a messaging application 103 a and a database 199 .
- the messaging application 103 a may be code and routines operable to generate a messaging stream that includes an animated object.
- the messaging application 103 a may be implemented using hardware including a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).
- the messaging application 103 a may be implemented using a combination of hardware and software.
- the database 199 may store animated objects, messaging streams, etc.
- the database 199 may store the messages between a first user and a second user.
- the database 199 may also store social network data associated with users 125 , user preferences for the users 125 , etc.
- the user device 115 may be a computing device that includes a memory and a hardware processor.
- the user device may include a desktop computer, a mobile device, a tablet computer, a mobile telephone, a wearable device, a head-mounted display, a mobile email device, a portable game player, a portable music player, a reader device, or another electronic device capable of accessing a network 105 .
- user device 115 a is coupled to the network 105 via signal line 108 and user device 115 n is coupled to the network 105 via signal line 110 .
- Signal lines 108 and 110 may be a wired connection, such as Ethernet, coaxial cable, fiber-optic cable, etc., or a wireless connection, such as Wi-Fi®, Bluetooth®, or other wireless technology.
- User devices 115 a , 115 n are accessed by users 125 a , 125 n , respectively.
- the user devices 115 a , 115 n in FIG. 1 are used by way of example. While FIG. 1 illustrates two user devices, 115 a and 115 n , the disclosure applies to a system architecture having one or more user devices 115 .
- the user device 115 can be a user device that is included in a wearable device worn by the user 125 .
- the user device 115 is included as part of a clip (e.g., a wristband), part of jewelry, or part of a pair of glasses.
- the user device 115 can be a smart watch.
- the user 125 may view an animated object from the messaging application 103 on a display of the device worn by the user 125 .
- the user 125 may view the animated object on a display of a smart watch or a smart wristband.
- messaging application 103 b may be stored on a user device 115 a .
- the messaging application 103 may include a thin-client messaging application 103 b stored on the user device 115 a and a messaging application 103 a that is stored on the messaging server 101 .
- the messaging application 103 b stored on the user device 115 a may display a messaging stream that includes an animated object.
- the user device 115 a may identify a user action from a first user, such as shaking the user device 115 a to make snow fall over an animated object of a snowman.
- the user device 115 a may transmit information about the messaging stream and the user action to the messaging application 103 a stored on the messaging server 101 , which provides the information to a second user that accesses the messaging application 103 a from a desktop computer.
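The shake gesture in the snowman example is commonly detected from accelerometer data: a resting device reports a near-constant gravity magnitude (about 9.8 m/s²), while shaking produces high variance across a window of samples. The threshold below is an illustrative value, not from the patent:

```python
import statistics

def is_shaking(magnitudes, variance_threshold=1.0):
    """Guess whether a window of accelerometer magnitude samples
    (m/s^2) indicates the device is being shaken. The variance
    threshold is an assumed, illustrative value."""
    if len(magnitudes) < 2:
        return False
    return statistics.variance(magnitudes) >= variance_threshold

# Synthetic sample windows: a device at rest vs. one being shaken
resting = [9.80, 9.81, 9.79, 9.80]
shaking = [9.8, 15.2, 3.1, 18.7, 5.4]
```

A production implementation would read the platform's motion sensors, debounce repeated triggers, and tune the window length and threshold against real gesture data.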
- the second server 120 may include a processor, a memory, and network communication capabilities.
- the second server 120 may access the network 105 via signal line 109 .
- the second server 120 may receive information from the messaging application 103 about the messaging stream and provide information to the messaging application 103 .
- the second server 120 may be associated with a bank and the second server 120 may communicate with the messaging application 103 to pay a bill using the bank information.
- a user 125 may instruct the messaging application 103 to authorize the second server 120 to pay a bill by clicking on an animated object of a bag of money.
- the second server 120 may send the messaging application 103 a notification that the transaction is complete.
- the messaging application 103 may modify the animated object to show the money moving from the money bag to the bank associated with the second server 120 .
- the second server 120 may include a bot that performs functions for a user 125 , such as ordering food, scheduling an appointment, making a reservation, booking a flight, etc.
- the second server 120 may include a separate application that provides information to the messaging application 103 , such as a calendar application that sends, upon user consent, information about the user's meetings.
- the entities of the system 100 are communicatively coupled via a network 105 .
- the network 105 may be a conventional type, wired or wireless, and may have numerous different configurations including a star configuration, token ring configuration or other configurations.
- the network 105 may include a local area network (LAN), a wide area network (WAN) (e.g., the Internet), and/or other interconnected data paths across which multiple devices may communicate.
- the network 105 may be a peer-to-peer network.
- the network 105 may also be coupled to or include portions of a telecommunications network for sending data in a variety of different communication protocols.
- the network 105 includes Bluetooth® communication networks, Wi-Fi®, or a cellular communications network for sending and receiving data including via short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP), direct data connection, email, etc.
- While FIG. 1 illustrates one network 105 coupled to the user devices 115 and the messaging server 101, in practice one or more networks 105 may be coupled to these entities.
- In situations in which the systems and methods discussed herein may collect or use personal information about users (e.g., user data, information about a user's social network, a user's location, a user's biometric information, a user's activities and demographic information, or videos that the messaging server stores and analyzes), users are provided with opportunities to control whether information is collected, whether the personal information is stored, whether the personal information is used, whether the videos are analyzed, and how the information about the user is collected, stored, and used. That is, the systems and methods discussed herein collect, store, and/or use user personal information only upon receiving explicit authorization from the relevant users to do so. For example, a user is provided with control over whether programs or features collect user information about that particular user or other users relevant to the program or feature.
- Each user for which personal information is to be collected is presented with one or more options to allow control over the information collection relevant to that user, to provide permission or authorization as to whether the information is collected and as to which portions of the information are to be collected.
- users can be provided with one or more such control options over a communication network.
- certain data may be treated in one or more ways before it is stored or used so that personally identifiable information is removed.
- a user's identity information may be treated, e.g., anonymized, so that no personally identifiable information can be determined from a video.
- a user's geographic location may be generalized to a larger region so that the user's particular location cannot be determined.
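The location-generalization step above can be as simple as coarsening coordinates before they are stored or used. In this illustrative sketch, rounding to whole degrees maps a user to a region roughly 100 km across; the precision parameter is an assumption, not a value from the patent:

```python
def generalize_location(lat, lon, precision=0):
    """Coarsen a coordinate pair so the user's particular location
    cannot be recovered. precision=0 rounds to whole degrees
    (a region on the order of 100 km); higher values retain more
    detail."""
    return (round(lat, precision), round(lon, precision))

coarse = generalize_location(37.4220, -122.0841)
```

The same idea extends to other fields: timestamps can be truncated to the day, and identifiers replaced with random tokens, so that stored data supports the feature without exposing personally identifiable information.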
- FIG. 2 illustrates a block diagram of an example computing device 200 that generates a messaging stream with an animated object.
- the computing device 200 may be a messaging server 101 or a user device 115 .
- the computing device 200 may include a processor 235 , a memory 237 , a communication unit 239 , a display 241 , a speaker 243 , a sensor 245 , and a storage device 247 . Additional components may be present or some of the previous components may be omitted depending on the type of computing device 200 . For example, if the computing device 200 is the messaging server 101 , the computing device 200 may not include the display 241 , the speaker 243 , or the sensor 245 .
- a messaging application 103 may be stored in the memory 237 .
- the computing device 200 may not include storage device 247 .
- the computing device 200 may include other components not listed here, such as a battery, etc.
- the components of the computing device 200 may be communicatively coupled by a bus 220 .
- the processor 235 includes an arithmetic logic unit, a microprocessor, a general purpose controller or some other processor array to perform computations and provide instructions to a display device.
- Processor 235 processes data and may include various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, or an architecture implementing a combination of instruction sets.
- Although FIG. 2 includes a single processor 235, multiple processors 235 may be included.
- Other processors, operating systems, sensors, displays and physical configurations may be part of the computing device 200 .
- the processor 235 is coupled to the bus 220 for communication with the other components via signal line 222 .
- the memory 237 stores instructions that may be executed by the processor 235 and/or data.
- the instructions may include code for performing the techniques described herein.
- the memory 237 may be a dynamic random access memory (DRAM) device, a static RAM, or some other memory device.
- the memory 237 also includes a non-volatile memory, such as a static random access memory (SRAM) device or flash memory, or a similar permanent storage device and media including a hard disk drive, a compact disc read only memory (CD-ROM) device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device, or some other mass storage device for storing information on a more permanent basis.
- the memory 237 includes code and routines operable to execute the messaging application 103 , which is described in greater detail below.
- the memory 237 is coupled to the bus 220 for communication with the other components via signal line 224 .
- the communication unit 239 transmits and receives data to and from at least one of the user device 115 and the messaging server 101 depending upon where the messaging application 103 may be stored.
- the communication unit 239 includes a port for direct physical connection to the network 105 or to another communication channel.
- the communication unit 239 includes a universal serial bus (USB), secure digital (SD), category 5 cable (CAT-5) or similar port for wired communication with the user device 115 or the messaging server 101 , depending on where the messaging application 103 may be stored.
- the communication unit 239 includes a wireless transceiver for exchanging data with the user device 115 , messaging server 101 , or other communication channels using one or more wireless communication methods, including IEEE 802.11, IEEE 802.16, Bluetooth® or another suitable wireless communication method.
- the communication unit 239 is coupled to the bus 220 for communication with the other components via signal line 226 .
- the communication unit 239 includes a cellular communications transceiver for sending and receiving data over a cellular communications network including via short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP), direct data connection, e-mail or another suitable type of electronic communication.
- the communication unit 239 includes a wired port and a wireless transceiver.
- the communication unit 239 also provides other conventional connections to the network 105 for distribution of files and/or media objects using standard network protocols including, but not limited to, user datagram protocol (UDP), TCP/IP, HTTP, HTTP secure (HTTPS), simple mail transfer protocol (SMTP), SPDY, quick UDP internet connections (QUIC), etc.
- the display 241 may include hardware operable to display graphical data received from the messaging application 103 .
- the display 241 may render graphics to display an overlay and a resulting composite image.
- the display 241 is coupled to the bus 220 for communication with the other components via signal line 228 .
- the speaker 243 may include hardware operable to emit noises. For example, in response to a user performing an action, the action module 206 may instruct the speaker 243 to emit a sound.
- the speaker 243 is coupled to the bus 220 for communication with the other components via signal line 230.
- the sensor 245 may include hardware operable to detect changes to the user device 115 .
- the sensor 245 may include motion sensors that measure movement of the computing device 200 .
- the sensor 245 may include an accelerometer and a gyroscope that detect acceleration forces and rotational forces along the x, y, and z-axes.
- the sensor 245 may also include position sensors that measure the physical position of the user device 115 , such as orientation sensors and magnetometers.
- the sensor 245 may also include hardware that detects sounds and/or pressure changes, such as a microphone that detects changes in airflow when a user blows on the computing device 200 .
- the sensor 245 is coupled to the bus 220 for communication with the other components via signal line 232.
- the sensor 245 may detect contact on a touch screen of the computing device 200 .
- the sensor 245 may detect a user's finger touching the touch screen and movement of the user's finger.
- the storage device 247 may be a non-transitory computer-readable storage medium that stores data that provides the functionality described herein.
- the storage device 247 may include the database 199 in FIG. 1 .
- the storage device 247 may be a DRAM device, a SRAM device, flash memory or some other memory device.
- the storage device 247 also includes a non-volatile memory or similar permanent storage device and media including a hard disk drive, a CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device, or some other mass storage device for storing information on a permanent basis.
- the storage device 247 is coupled to the bus 220 for communication with the other components via signal line 234 .
- the messaging application 103 may include a messaging module 202 , an animation module 204 , an action module 206 , and a user interface module 208 .
- the messaging module 202 generates a messaging stream.
- the messaging module 202 includes a set of instructions executable by the processor 235 to generate the messaging stream.
- the messaging module 202 is stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235 .
- the messaging module 202 generates a messaging stream that includes data sent to and from users and chat bots, for example, by sending data to a user device 115 , a messaging server 101 , and/or a second server 120 .
- the messaging stream may include one or more messages where the messages have certain characteristics, such as a sender; a recipient; and message content including text, an animated object, images, video, and message metadata.
- the message metadata may include a timestamp, an originating device identifier, an expiration time, a retention time, various formats and effects applied, etc.
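The message characteristics and metadata listed above could be modeled as in the following sketch; the field names and types are illustrative assumptions, not code from the patent:

```python
from dataclasses import dataclass, field
from time import time
from typing import Optional

@dataclass
class Message:
    """One message in a messaging stream (hypothetical schema)."""
    sender: str
    recipient: str
    text: str = ""
    animated_object: Optional[str] = None  # identifier of an attached animated object
    # Message metadata: timestamp, originating device, expiration time.
    timestamp: float = field(default_factory=time)
    device_id: str = ""
    expiration_time: Optional[float] = None  # None = archived for an indeterminate time
```

For example, `Message("user_125a", "carl", text="Hey Carl!")` captures a sender, a recipient, text content, and an automatic timestamp.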
- the messaging stream includes a displayed messaging stream that includes messages displayed in chronological order within a user interface with various formats and effects applied.
- the messaging stream may be used as part of different messaging platforms, such as part of an instant messaging application, a short-message service (SMS), an email application, an enhanced-message service (EMS), a multimedia-message service (MMS), push messaging (e.g., HDML, WAP push, etc.), application-to-application messaging, etc.
- the messages may be available for a limited amount of time, archived for an indeterminate time, etc.
- the messages may be encrypted.
- the messaging module 202 generates messages that are independent of the animated objects and inaccessible to the animated objects.
- the messages are available to the animated objects and are used to modify the display of the animated objects, for example, when an animated object appears to react to content in a message.
- the messaging module 202 instructs the user interface module 208 to generate a user interface that includes the messaging stream.
- the user interface may include fields for entering text, videos, images, emojis, etc.
- the messaging module 202 receives messages between users and instructs the user interface module 208 to display the messages in the messaging stream. For example, user 125 a enters text via the user interface that says “Hey Carl!”
- the messaging module 202 on the user device 115 a transmits the message to user 125 n and the message is displayed on the user device 115 n .
- the message is transmitted from the user device 115 a to the messaging server 101 , which transmits the message to the user device 115 n.
- the animation module 204 generates animated objects.
- the animation module 204 includes a set of instructions executable by the processor 235 to generate the animated object.
- the animation module 204 is stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235 .
- the animation module 204 generates a group of animated objects.
- the animated objects may include cartoons; caricatures of people (famous people, avatars, generic people, etc.), animals (e.g., a bear), inanimate objects (e.g., a cookie); abstract moving objects (e.g., a swirling pattern with eyes and a mouth); etc.
- the animation module 204 may instruct the user interface module 208 to provide the group of animated objects in a user interface.
- the animation module 204 may organize the group of animated objects according to type and instruct the user interface module 208 to display an organized group of animated objects from which the user may select an animated object.
- the animation module 204 may instruct the user interface module 208 to display options for customizing an animated object.
- the user interface may include options for selecting a color or a size of an animated object.
- the options for customizing the animated object may be specific to the type of animated object.
- the user interface may include an option to select an eye color for animated objects that have eyes.
- the user interface may include options for limiting the amount of information and/or actions available to the animated object.
- the user interface may include options for disabling certain behaviors, such as nodding or producing a sound.
- the user interface may include options for disabling certain types of actions based on privacy concerns, such as disabling context-based animations that react based on words in the messages, while retaining animations that are in response to explicit user input, such as a user shaking the computing device 200 .
- the animation module 204 may instruct the user interface module 208 to provide a group of animated objects that are different depending on the type of computing device 200 being used to select an animated object.
- For example, if the computing device 200 is a smart watch, the group may include a subset of the animated objects that may be appropriate for display on the smart watch.
- the group of animated objects may include different versions that are used for different devices.
- An animated object may include a complicated version for a desktop computer, a simpler version for a mobile device, and an even simpler version for a smart watch.
- For example, the animated object on the desktop may include a cartoon of a man with a background scene, the animated object on the mobile device may include the cartoon of the man, and the animated object on the smart watch may include a cartoon of the man's face.
- the animation module 204 may instruct a computing device 200 to provide different attributes of the animated object based on the type of computing device 200 .
- the animation module 204 may instruct a computing device 200 that corresponds to a smart watch to provide vibrations and/or a sound, a computing device 200 that corresponds to a mobile device to provide a visual display, and a computing device 200 that corresponds to virtual reality goggles to provide a three-dimensional rendition of the user interface.
- the three-dimensional rendition may include placing the animated object at different depths in the user interface.
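The per-device selection described above can be sketched as a simple lookup; the device-type keys and version names are assumptions for illustration:

```python
# Versions of one animated object, ordered from most to least complex.
VERSIONS = {
    "desktop": "man_with_background_scene",  # complicated version
    "mobile": "man",                         # simpler version
    "smart_watch": "mans_face",              # simplest version
}

def select_version(device_type: str) -> str:
    """Pick the animated-object version appropriate for a device type."""
    # Unknown device types fall back to the simplest rendition.
    return VERSIONS.get(device_type, VERSIONS["smart_watch"])
```

A dispatch like this keeps the mapping between device classes and asset complexity in one place, so adding a new device tier only requires a new table entry.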
- the animation module 204 receives a selection of an animated object from a user.
- the animation module 204 generates the animated object and instructs the user interface module 208 to display the animated object in the messaging stream.
- the user interface module 208 may display the animated object in different parts of the messaging stream. For example, the user interface module 208 may display the animated object at the top of the user interface, at the bottom of the user interface or in the middle of the user interface. In another example, the user interface module 208 may display the one or more messages within the messaging stream while the animated object remains fixed in a portion of the messaging stream.
- the user interface module 208 may display the animated object according to a time that the first user selected the animated object so that the animated object is located after content that was provided before the animated object was selected and before content that is provided after the animated object was selected. In yet another example, the user interface module 208 may display the animated object in random locations. The user interface module 208 may also change the location of the animated object based on actions performed by a user, as described in greater detail below.
- the user interface module 208 displays the animated object within the user interface for each person that is viewing the same messaging stream.
- the messaging stream may be viewed by a single user that is making notes for himself, the messaging stream may be viewed by a first user and a second user, or the messaging stream may be viewed by a group of users.
- the user interface module 208 places the animated object in different locations based on characteristics associated with different users. For example, a first user may provide user input that the first user prefers to see the animated object at the bottom right-hand part of the screen.
- the animated object may be located in the center of the messaging stream for users from a group that interact with the animated object and at the top left of the messaging stream for users that have not interacted with the animated object.
- the animated object is displayed differently depending on the position of the animated object in the messaging stream and/or the time the animated object has been displayed and/or how long since a user action associated with the animated object occurred.
- the animated object may look different to multiple users that are viewing the same messaging stream based on metadata associated with each of the users.
- the metadata may include a time of day associated with a user device, a location associated with a user device, a time zone associated with a user device, user preferences associated with a user, etc.
- the animation module 204 may obtain information from the sensor 245 to determine information from the metadata. For example, the animation module 204 may determine the user's location, if the user has consented to such a determination by the animation module 204, based on a sensor 245 that provides global positioning system (GPS) information.
- the animation module 204 may modify an animated object based on the metadata.
- the animated object is an animation of a person
- the person may wear shorts if the user is located in an area with warm weather, a jacket if the user is located in an area with cold weather, etc.
- the animation module 204 may display the animation of the person with different clothing or different behavior based on the time of day of the user. For example, the animation module 204 may display the animation of the person in pajamas and the animation may fall asleep if the time of day of the user is at night.
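The metadata-driven appearance changes above (shorts in warm weather, a jacket in cold weather, pajamas at night) could be sketched as follows; the temperature and hour thresholds are illustrative assumptions:

```python
def choose_appearance(temperature_c: float, local_hour: int) -> dict:
    """Map user metadata to the animated person's clothing and behavior."""
    # At night (assumed 10pm-6am), display pajamas and let the animation
    # fall asleep, as described above.
    if local_hour >= 22 or local_hour < 6:
        return {"clothing": "pajamas", "behavior": "fall_asleep"}
    # Otherwise dress for the weather (assumed 20 C warm/cold boundary).
    clothing = "shorts" if temperature_c >= 20 else "jacket"
    return {"clothing": clothing, "behavior": "idle"}
```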
- the animation module 204 may modify the animated object based on user data.
- the animation module 204 may instruct the user interface module 208 to display the animated object with an alarm that goes off when the user has configured an alarm on the computing device 200 , the animated object may display a countdown associated with a timer that the user configured on the computing device 200 , the animated object may display an upcoming calendar event based on a calendar entry based on information associated with the second server 120 , etc.
- the animation module 204 instructs the user interface module 208 to provide the second user with a set of animated objects based on their relationships to the first animated object. For example, where the first animated object conveys a message, such as “I'll be home soon,” the set of animated objects may include responses to the message.
- the user interface module 208 may replace a first animated object with the second animated object or the second animated object may be displayed in the same messaging stream as the first animated object.
- the first animated object and the second animated object interact with each other.
- the action module 206 modifies the animated object based on a user action.
- the action module 206 includes a set of instructions executable by the processor 235 to modify the animated object.
- the action module 206 may program the animated object (e.g., the animated object may include stored code or a prerecorded animation) to react to the user action.
- the action module 206 is stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235 .
- the action module 206 receives a first action from a first user related to the animated object.
- the action module 206 determines a type of action that occurred and instructs the user interface module 208 to modify a display of the animated object based on the action. For example, when the first user taps on an animated object, the action module 206 instructs the user interface module 208 to display hearts coming off of the animated object. In some implementations, the users that view the same messaging stream will see the hearts coming off of the animated object.
- the action module 206 receives a second action from a second user related to the animated object. For example, the second user makes a swiping motion, which the action module 206 detects. The action module 206 instructs the user interface module 208 to modify the display based on the second action.
- the action module 206 instructs the user interface module 208 to display the animated objects as grabbing the hearts with the animated object's mouth.
- FIGS. 3A-3J below provide an example of different user actions that cause the action module 206 to determine an action and the user interface module 208 to modify a display of the animated object based on the action.
- the actions may include taps, swipes, making noise, changing pressure, moving a pointing device (e.g., a mouse moving an arrow), moving the computing device 200 , capturing an image, providing a message with a context, providing a message with a word in the message, providing voice content, selecting options provided by the user interface, selecting another animated object, etc.
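One way the action module 206 might map a detected action type to a display modification is a dispatch table, as in this sketch; the action names and animation identifiers are assumptions based on the examples above:

```python
# Action type -> animation to display (names are illustrative).
ANIMATIONS = {
    "tap": "show_hearts",                # hearts coming off the animated object
    "swipe": "grab_hearts_with_mouth",   # the object grabs the hearts
    "shake_device": "items_fall",        # e.g., snow falling from the sky
    "blow": "move_with_airflow",         # object reacts to detected airflow
}

def modify_display(action_type: str) -> str:
    """Return the animation for a detected action type."""
    # Unrecognized actions leave the animated object unchanged.
    return ANIMATIONS.get(action_type, "no_change")
```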
- an example user interface 300 is illustrated of a messaging stream that includes an animated cookie according to some implementations.
- the messaging stream may include a list 301 of the users that are participating in the messaging stream.
- a first user may be associated with a first user icon 302 in the messaging stream.
- the first user may select an animated cookie 303 as the animated object and the user interface module 208 may display the animated cookie 303 within a word bubble 304 .
- FIG. 3B illustrates an example user interface 310 of a messaging stream that includes the animated cookie 303 with eyes that follow a cursor 311 controlled by a pointing device that is controlled by a first user according to some implementations.
- the user interface module 208 illustrates the cursor 311 with a circle; however, other variations are possible, such as an arrow, a rectangle, etc.
- the action module 206 determines that a pointing device is moved above the location of the animated cookie 303 in the user interface. This is illustrated in FIG. 3B with the cursor 311 above the animated cookie 303 .
- the action module 206 instructs the user interface module 208 to modify a display of the animated cookie 303 .
- the animated cookie 303 is updated to show animated eyes that move and track the movement of the cursor 311 as the cursor 311 moves above the animated cookie 303 .
- the animated cookie 303 may follow the location of the cursor 311 with the same speed of movement as the cursor 311 .
- the animated cookie 303 may include eyes that are animated to look up and down as quickly as the cursor 311 .
- moving the cursor 311 at a particular speed could cause the animated object to react.
- moving the cursor 311 around the animated cookie 303 several times may cause the animated cookie 303 to look dizzy, for example, by depicting stars floating around the animated cookie 303 .
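The cursor-following eyes can be sketched geometrically: the pupil offset is the unit vector from the object's center toward the cursor, scaled by the eye radius. This is a hypothetical rendering calculation, not the patent's code:

```python
import math

def pupil_offset(center, cursor, eye_radius=3.0):
    """Offset (dx, dy) of the pupil so the eye looks toward the cursor."""
    dx, dy = cursor[0] - center[0], cursor[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)  # cursor directly over the eye: look straight ahead
    # Clamp the pupil to the eye's circumference in the cursor's direction.
    return (eye_radius * dx / dist, eye_radius * dy / dist)
```

Recomputing this offset on every cursor-move event makes the eyes track the cursor at the same speed the cursor moves, as described for the animated cookie 303.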
- FIG. 3C illustrates an example user interface 320 of a messaging stream that includes the animated cookie 303 that is viewable by multiple users according to some implementations.
- the second user is associated with a second user icon 321 in the messaging stream.
- the animated cookie 303 remains in the same location because the animated cookie 303 is part of the word bubble 304 .
- FIG. 3D illustrates an example user interface 330 of a messaging stream that includes the animated cookie 303 with eyes that follow a pointing device 331 controlled by the second user according to some implementations.
- the action module 206 determines a location of the cursor 311 and the user interface module 208 modifies the display of the animated cookie 303 to look above and to the left of where the cursor 311 is located.
- the second user moves the cursor 311 to the animated cookie 303 in order to move the animated cookie 303.
- the user interface module 208 also includes informative text below the second icon 321 to inform the users that the second user is moving the animated cookie 303 .
- FIG. 3E illustrates an example user interface 340 of a messaging stream that includes the animated cookie 303 that reacts to being moved according to some implementations.
- the user interface module 208 modifies the display of the animated cookie 303 to look as though the cursor 311 is tickling the animated cookie 303 by modifying the animated cookie 303 to have scrunched up eyes and a pursed smiling mouth. Because the second user moved the animated cookie 303 outside of the word bubble associated with the first user, the user interface module 208 creates a third icon 341 associated with the first user that indicates that the first user created the animated cookie by calling the animated cookie 303 an active sticker. The second user may move the animated cookie 303 anywhere within the messaging stream. In some implementations, the animated cookie stays fixed wherever a user places it.
- FIG. 3F illustrates an example user interface 350 of a messaging stream that includes an animated cookie 303 that reacts to being placed according to some implementations.
- the user interface module 208 illustrates the animated cookie 303 as continuing to laugh as if he was tickled by the pointing device.
- the user interface module 208 also modifies the text below the second icon 321 to inform the other users that the second user placed the animated cookie 303 .
- FIG. 3G illustrates an example user interface 360 of a messaging stream that includes an animated cookie 303 with eyes that move to view text displayed within the messaging stream according to some implementations.
- the action module 206 determines that the first user entered text within the messaging stream.
- the user interface module 208 modifies the animated cookie 303 to move the eye to look downward as if the animated cookie 303 is reading the text.
- the animated cookie 303 may not access the content of the text, but instead merely reacts to the appearance of text.
- In some implementations, upon user consent, the animated cookie 303 has access to the text. This feature may be disabled, for example, if the user declines permission for the action module 206 to detect that the user entered text or received a message.
- FIG. 3H illustrates an example user interface 370 of a messaging stream that includes an animated cookie 303 that stays in a fixed location within the messaging stream according to some implementations. As more messages are added within the messaging stream, the messages scroll upward. Since the animated cookie 303 is placed in a fixed position, it stays within the center of the messaging stream.
- FIG. 3I illustrates an example user interface 380 of a messaging stream that includes an animated cookie 303 that reacts to a word in the message 381 , if the users participating in the messaging stream provide consent to access the messages exchanged in the messaging stream, according to some implementations.
- the first user provides a message that states “Oh! I have to go . . . chat later.”
- the action module 206 determines that the words “I have to go . . . chat later” indicate that the first user is about to end the chat.
- the action module 206 may determine a reaction for the animated cookie 303 based on the message 381 by using machine learning, comparing words in the message to lists of words associated with different contexts, etc.
- such a determination may be made by a comparison of one or more of the words with words known to be associated with ending a chat, e.g., based upon prior training data, based upon clustering previous messages, etc.
- the user interface module 208 modifies the display of the animated cookie 303 to start to fall asleep (e.g., by illustrating droopy eyes) based on the first user's message.
- FIG. 3J illustrates an example user interface 390 of a messaging stream that includes the animated cookie 303 that continues to react to the word within the message according to some implementations.
- the action module 206 determines, when consent is provided by the second user for access to messages in the messaging stream, that the second user is also about to leave the messaging stream based on the words associated with the second user stating “Ok! Bye.” Based on the action module 206 determining a meaning of the words, the user interface module 208 modifies the display of the animated cookie 303 to fall asleep (e.g., by displaying the animated cookie 303 saying “ZZZ”).
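The word-list comparison used above to detect that a chat is ending could be sketched as follows; the phrase list is an illustrative assumption standing in for the prior training data or clustered messages the patent mentions:

```python
# Phrases assumed to be associated with ending a chat.
GOODBYE_PHRASES = {"have to go", "chat later", "bye", "gotta run"}

def is_ending_chat(message: str) -> bool:
    """Return True if the message suggests the user is about to leave."""
    text = message.lower()
    return any(phrase in text for phrase in GOODBYE_PHRASES)
```

When this returns True, the user interface module 208 could transition the animated cookie 303 through its droopy-eyed and then sleeping states.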
- the action from the user includes the user blowing on a computing device 200 .
- the user could blow on a mobile device or blow while wearing a headset.
- the sensor 245 may include a microphone that detects a change in airflow and transmits information about the change in airflow to the action module 206 .
- the action module 206 may determine a degree of the change in airflow and the action module 206 instructs the user interface module 208 to move the animated object based on the degree of the change in airflow.
- the animated object may be a box of tissues and blowing on the computing device 200 causes the tissues to move out of the box of tissues. If the user blows slightly on the computing device 200 , it causes a few of the tissues from the box of tissues to come out of the box.
- the animated object is a character, such as a cartoon fox, and blowing on the computing device 200 causes the fox to grab the side of the messaging stream window and hold on.
- the animated object is a bubble and blowing on the computing device 200 makes the bubble increase in size corresponding to the extent of the first user blowing on the computing device 200 .
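The degree-of-airflow logic above, where a slight blow frees a few tissues and a strong blow empties the box, might look like this sketch; the noise floor, scale, and tissue count are assumptions:

```python
def blow_strength(rms_amplitude: float, noise_floor: float = 0.05,
                  max_amplitude: float = 0.8) -> float:
    """Map a microphone RMS amplitude to a blow strength in [0, 1]."""
    if rms_amplitude <= noise_floor:
        return 0.0  # below the noise floor: not a blow
    strength = (rms_amplitude - noise_floor) / (max_amplitude - noise_floor)
    return min(strength, 1.0)

def tissues_out(rms_amplitude: float, total_tissues: int = 10) -> int:
    """Number of tissues that come out of the animated box of tissues."""
    return round(blow_strength(rms_amplitude) * total_tissues)
```

The same strength value could scale the bubble's size increase in the bubble example.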
- the action from the user includes moving the computing device 200 .
- the action module 206 may receive information from a sensor 245 (e.g., an accelerometer or a gyroscope) and determine a degree of the movement.
- the action module 206 may instruct the user interface module 208 to illustrate the animated object with additional changes. For example, the user shaking the user's smart watch or other mobile device causes the action module 206 to instruct the user interface module 208 to illustrate items moving, such as snow falling from the sky, nuts and bolts coming loose, a character's hair becoming disarrayed, the character shaking a fist at a user, etc.
- the action module 206 may receive information from the sensor 245 indicating that the user moved the computing device 200 to the user's ear and instruct the speaker 243 to emit a noise.
- the animated object emits a different noise depending on how the computing device 200 is moved, such as a yip if the user moves the computing device 200 to the user's left ear and a bark if the user moves the computing device 200 to the user's right ear.
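The movement detection above (shaking the device, turning it upside down) could be sketched from raw accelerometer readings; the axis convention (z positive when the screen faces up) and the shake threshold are assumptions:

```python
def classify_movement(accel_xyz, shake_threshold=15.0):
    """Classify an accelerometer sample (m/s^2) into a movement type."""
    x, y, z = accel_xyz
    magnitude = (x * x + y * y + z * z) ** 0.5
    if magnitude > shake_threshold:
        return "shake"        # e.g., snow falls, a character's hair is disarrayed
    if z < 0:
        return "upside_down"  # e.g., the animated bear 430 is shown upside down
    return "still"
```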
- an example user interface 400 of a messaging stream is illustrated that includes an animated bear 405 according to some implementations.
- a first user may select an animated bear 405 and place the animated bear 405 in the center of the messaging stream below the messages.
- the action module 206 may identify that Steve provided a message stating “There might be bears at that picnic spot.”
- the action module 206 may instruct the user interface module 208 to modify the display of the bear by animating the animated bear's 405 eyes to move back and forth to give the appearance that the animated bear 405 is reading the text.
- FIG. 4B illustrates an example user interface 425 of the messaging stream that includes the animated bear 430 after a user performed a user action according to some implementations.
- the second user may turn the user device 115 upside down.
- the action module 206 receives information from a sensor 245 and determines that the degree of the movement of the user device 115 is 180 degrees.
- the action module 206 instructs the user interface module 208 to modify the display to show the animated bear 430 as upside down.
- the action module 206, upon user consent, may identify that Karen responded to Steve with “We'll fight them off!” The action module 206 determines that fight is an instruction associated with the animated bear 430 and instructs the user interface module 208 to modify the display of the animated bear 430 to show the animated bear 430 initially fighting and then falling upside down.
- the action from the user includes movement within the messaging stream.
- the movement may be the user touching a touch screen of a mobile device or movement of a pointing device, such as a mouse.
- the animated object may be a character with eyes (e.g., a smiley face).
- the action module 206 may receive information about the movement from the sensor 245 (e.g., a touch screen), and determine the direction of the movement.
- the action module 206 instructs the user interface module 208 to move the eyes of the animated object to correspond to the direction of the movement from the first user.
- the movement may stretch the animated object.
- FIG. 5A an example user interface 500 of a messaging stream that includes animated bubbles is illustrated according to some implementations.
- the first user uses a pointing device associated with a computing device 200 to pull a first arrow 502 in a first direction.
- the second user uses a pointing device associated with another computing device 200 to pull a second arrow 503 in a second direction. The first arrow 502 and the second arrow 503 pull the bubble 501 in different directions.
- the action module 206 identifies the first action and the second action as pulling the bubble 501 in a tug of war by pulling in opposite directions.
- FIG. 5B illustrates the example user interface 525 of the messaging stream that includes the animated bubbles after two users performed user actions according to some implementations.
- the action module 206 determines how much each of the users pulls the bubble with the first arrow 502 and the second arrow 503 .
- the action module 206 applies animation rules to determine a winner of the tug of war by determining whether the first user or the second user first pulled the bubble a threshold number of pixels in the user interface. Other ways to determine a winner are possible, such as determining which user stretched the bubble a threshold pixel distance outside of a boundary.
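The threshold-based winner rule described above can be sketched as follows. The event format, function name, and the 120-pixel default are assumptions chosen for illustration, not values from the disclosure.

```python
def tug_of_war_winner(pull_history, threshold_px=120):
    """Determine the tug-of-war winner from a time-ordered list of
    (user, pixels_pulled) events: the first user whose cumulative pull
    crosses the threshold wins. Returns None if nobody has won yet."""
    totals = {}
    for user, pixels in pull_history:
        totals[user] = totals.get(user, 0) + pixels
        if totals[user] >= threshold_px:
            return user
    return None
```

Because events are processed in time order, whichever user reaches the threshold first is declared the winner even if the other user's eventual total is larger.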
- the action module 206 identifies a word in a message or a context of the messaging stream.
- the action module 206 may compare words in a message to a list of words associated with different meanings, such as different emotions (e.g., happy, sad, angry, etc.), different states (e.g., about to end the chat), etc.
- the action module 206 may use machine learning to predict a meaning associated with a user based on the message, where the machine learning may be based on all messages available to the action module 206 or, subject to user consent, the messages associated with a particular user.
- the context of the messaging stream may be based on user patterns, a time of day, a location of the computing device 200 , etc.
- the action module 206 may instruct the user interface module 208 to modify a display of the animated object based on the context.
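The word-list comparison described above is essentially keyword matching against per-meaning vocabularies. Below is a minimal sketch; the word lists and function names are invented placeholders, and a production system would likely use the machine-learning approach mentioned instead.

```python
# Hypothetical word lists mapping meanings to trigger words (illustrative only).
MEANING_WORDS = {
    "happy": {"great", "awesome", "yay", "fun"},
    "sad": {"sorry", "miss", "unfortunately"},
    "angry": {"fight", "annoyed", "ugh"},
}

def classify_message(text):
    """Return the first meaning whose word list overlaps the message, or None."""
    words = set(text.lower().split())
    for meaning, keywords in MEANING_WORDS.items():
        if words & keywords:
            return meaning
    return None
```

The resulting meaning could then be passed to the user interface module to select an animation, e.g., a bear that fights when the message contains "fight".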
- the action module 206 may, responsive to user consent, identify voice content from the user.
- the action module 206 may convert the speech to text and identify an intent of the voice content. For example, the action module 206 may identify the user providing verbal instructions for an abstract animated object that include “Jump around.” Based on the voice content, the action module 206 instructs the user interface to modify the display of the abstract animated object to show it moving up and down.
- a first user may be a chat bot (e.g., an automated chat program) that provides services to a user.
- the animated object may be related to one or more messages exchanged between the chat bot and the user. For example, the user may instruct the chat bot to make a reservation, a purchase, etc. by entering a message.
- the action module 206 may, upon user consent, instruct the user interface module 208 to modify the animated object based on the message. For example, if the user instructs the chat bot to make a reservation, the animated object includes a graphic associated with making a reservation.
- Referring to FIG. 6, an example user interface 600 of a messaging stream that includes an animated money transfer object 601 is illustrated according to some implementations.
- user Sara consents to the use of a chat bot to perform actions for her by instructing the Bankbot to pay off a credit card statement.
- the user provides a message 602 that states: “@Bankbot Pay off my credit card statement.”
- the Bankbot responds with a message 603 that states: “Confirmed. Payment posted.”
- the action module 206 determines that the user instructed the chat bot to transfer money from the user's account to a credit card company.
- the action module 206 instructs the user interface module 208 to display an animated money transfer object 601 that shows money moving from a money bag to a credit card.
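Recognizing the bot instruction in the Bankbot example involves spotting the `@Bankbot` mention and mapping the command to an animation. A hedged sketch follows; the parsing rule, function names, and the `"money_transfer"` identifier are assumptions for illustration.

```python
import re

def parse_bot_command(message, bot_name="Bankbot"):
    """Return the command text if the message addresses the bot, else None."""
    match = re.match(rf"@{bot_name}\s+(.*)", message, re.IGNORECASE)
    return match.group(1).strip() if match else None

def animation_for_command(command):
    """Pick an animated object identifier for a recognized bot instruction."""
    if command and "pay off" in command.lower():
        return "money_transfer"  # money bag -> credit card animation
    return None
```

Applied to the message 602 above, `parse_bot_command` extracts "Pay off my credit card statement." and `animation_for_command` selects the money transfer animation.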
- a messaging stream may include multiple users and a chat bot.
- the action module 206 may detect a movement of a user on the computing device 200 and instruct the user interface module 208 to modify a display of the animated object based on messages related to the chat bot and the movement of the user.
- FIG. 7A an example user interface 700 of an animated takeout box is illustrated according to some implementations.
- Kira and Sam exchange messages 701 about ordering delivery food from Tasty Thai.
- the users also consent to the chatbot accessing their messages to help the users.
- a food ordering chatbot places the order and informs the users that the food will arrive in 40 minutes.
- the first user selects a takeout box animated object 702 , which the user interface module 208 displays in the messaging stream.
- the second user touches the takeout box animated object 702 with a finger 703 .
- the action module 206 detects movement of the second user's finger 703 touching the animated object.
- FIG. 7B illustrates the example user interface 725 of the animated takeout box after a user performed a user action according to some implementations.
- the action module 206 determines the type of action that occurred and instructs the user interface module 208 to modify the takeout box animated object to display an opened takeout box animated object 726 .
- the action may include a second user selecting a second animated object.
- the second animated object may be from a subset of animated objects based on its relationship to a first animated object.
- the second animated object may be selected from a group of all animated objects.
- the user interface module 208 modifies a display by showing the first animated object interacting with the second animated object.
- the first animated object may be a fox with boxing gloves and the second animated object may be a kangaroo with boxing gloves.
- the user interface module 208 may display the two animated objects fighting with each other.
- the first user may control the first animated object and the second user may control the second animated object such that the first and second user engage in a boxing match using the two animated objects.
- the animated objects react differently depending on how they were added to the messaging stream.
- Two animated objects may react differently depending on their proximity to each other and a length of time, such that animated objects that are close to each other react to each other more than animated objects that are far away from each other. For example, when two animated people are next to each other they look at each other and touch each other. When the two animated people are at opposite sides of the messaging stream, they wave every two minutes but otherwise do not interact.
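The proximity-dependent reactions above reduce to a distance check that selects a behavior. The sketch below is illustrative; the 100-pixel "near" radius and the behavior names are assumptions, and a fuller version would also track the elapsed time mentioned above.

```python
import math

def interaction_for(pos_a, pos_b, near_px=100):
    """Choose how two animated objects react based on how close they are
    within the messaging stream (positions in pixels)."""
    distance = math.hypot(pos_b[0] - pos_a[0], pos_b[1] - pos_a[1])
    if distance <= near_px:
        return "look_and_touch"    # close objects interact continuously
    return "wave_periodically"     # distant objects wave occasionally
```

The periodic wave (e.g., every two minutes) would be driven by a timer in the rendering loop rather than by this selection function.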
- the user interface module 208 generates a user interface.
- the user interface module 208 includes a set of instructions executable by the processor 235 to generate the user interface.
- the user interface module 208 is stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235 .
- the user interface module 208 receives instructions from the animation module 204 to generate a user interface that includes a messaging stream.
- the user interface may include a group of animated objects for a first user to choose from.
- the user interface module 208 may receive a selection from the first user of one of the animated objects.
- the user interface module 208 displays the animated object in the messaging stream.
- the user interface module 208 receives instructions from the action module 206 to modify a display of the animated object based on a first action from the first user. For example, the user interface module 208 receives instructions to modify an animated object of marbles to show them rolling around the messaging stream based on movement of a computing device associated with the first user. The user interface module 208 receives instructions from the action module 206 to modify the display of the animated object based on a second action from a second user. For example, the user interface module 208 receives instructions to show the marbles bouncing in the messaging stream based on the second user touching a computing device 200 associated with the second user to simulate the marbles bouncing within the messaging stream.
- the user interface module 208 provides a user interface that includes interactive features to change the appearance of the animated object.
- the user interface module 208 provides a user interface that provides a scratchpad for drawing.
- the scratchpad may include a toolkit with various tools for drawing such as a pencil, a paintbrush, color options, etc.
- the user interface module 208 provides a user interface that includes an interactive keyboard for producing music, beeps, tones, etc. When a user touches a key on the keyboard, the action module 206 detects the touch and instructs the speaker 243 to emit a beep, tone, etc.
- the user interface module 208 provides a user interface that includes interactive graphics, such as charts, timelines, etc. where the user has options for changing the appearance of the interactive graphics.
- FIG. 8A illustrates an example user interface 800 of a messaging stream that includes an animated airplane according to some implementations.
- a first user selects an animated object of an airplane 801 , which represents a message to the second user that the first user is about to get on an airplane.
- the first user swipes across a screen of the smartwatch to cause the user interface module 208 to modify the display of the airplane 801 to show the airplane 801 moving across the screen.
- the animation module 204 provides the second user with a set of animated objects based on their relationship to the airplane 801 .
- the set of animated objects could be a hand waving, a thumbs up, and two people kissing.
- the second user selects the animated object of two people kissing.
- FIG. 8B illustrates an example user interface 825 of the messaging stream that includes an animated couple 826 that is displayed responsive to a user action related to the animated airplane according to some implementations.
- the user action is the second user selecting the animated object of the two people kissing.
- FIG. 9 illustrates a flowchart of an example method 900 to generate a messaging stream that includes an animated object.
- the method 900 is performed by a messaging application 103 stored on a computing device 200 , such as a user device 115 , a messaging server 101 , or in part a user device 115 and in part a messaging server 101 .
- a messaging stream is generated where one or more messages are exchanged between a first user and a second user.
- the first user and the second user send messages to each other using an instant messaging platform, via text, via SMS, etc.
- a selection is received of an animated object from the first user for the messaging stream.
- the first user selects the animated object from a group of animated objects that are displayed for the first user's device.
- the animated object is an animated version of a flamingo.
- the user may customize the animated object by choosing a color for the animated object, a style of eyes for the animated object, etc.
- the animated object is provided in the messaging stream.
- the user interface module 208 displays the animated object in a default location, such as the center of the user interface.
- a first action is received from the first user related to the animated object.
- the action module 206 detects the first user blowing into a sensor 245 , such as a microphone, of the computing device 200 .
- a display of the animated object is modified based on the first action.
- the user interface module 208 modifies the display of the animated object to show the flamingo blowing around in the wind with its feathers flying.
- a second action is received from the second user related to the animated object. For example, the second user shakes the second user's computing device 200 .
- the display of the animated object is modified based on the second action. For example, the display is modified to show the flamingo bouncing up and down with movement corresponding to the shaking of the second user's computing device 200 .
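The flow of method 900 above, selecting an object and updating its display as each user's action arrives, can be sketched as a small state machine. All names and the action-to-animation table below are illustrative assumptions, not part of the disclosed implementation.

```python
# Hypothetical mapping from (object, action) to a display state,
# following the flamingo example in method 900.
ANIMATIONS = {
    ("flamingo", "blow"): "feathers_flying",
    ("flamingo", "shake"): "bouncing",
}

class MessagingStream:
    """Minimal sketch of a messaging stream holding one animated object."""

    def __init__(self):
        self.messages = []
        self.animated_object = None
        self.display_state = None

    def select_object(self, name):
        """Provide the selected animated object in the stream (blocks 904-906)."""
        self.animated_object = name
        self.display_state = "idle"

    def apply_action(self, action):
        """Modify the display based on a user action (blocks 908-914).
        Unrecognized actions leave the current display state unchanged."""
        self.display_state = ANIMATIONS.get(
            (self.animated_object, action), self.display_state)
        return self.display_state
```

In this sketch, the first user's blow and the second user's shake each update the same shared display state, mirroring how both users' actions modify one animated object.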
- the implementations of the specification can also relate to a processor for performing one or more steps of the methods described above.
- the processor may be a special-purpose processor selectively activated or reconfigured by a computer program stored in the computer.
- a computer program may be stored in a non-transitory computer-readable storage medium, including, but not limited to, any type of disk including optical disks, ROMs, CD-ROMs, magnetic disks, RAMs, EPROMs, EEPROMs, magnetic or optical cards, flash memories including USB keys with non-volatile memory, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
- the specification can take the form of some entirely hardware implementations, some entirely software implementations, or some implementations containing both hardware and software elements.
- the specification is implemented in software, which includes, but is not limited to, firmware, resident software, microcode, etc.
- a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- a data processing system suitable for storing or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus.
- the memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
- the systems provide users with an opportunity to control whether programs or features collect user information (e.g., information about a user's social network, social actions or activities, profession, a user's preferences, or a user's current location), or control whether and/or how to receive content from the server that may be more relevant to the user.
- certain data may be treated in one or more ways before it is stored or used, so that personally identifiable information is removed.
- a user's identity may be treated so that no personally identifiable information can be determined for the user, or a user's geographic location may be generalized where location information is obtained (such as to a city, ZIP code, or state level), so that a particular location of a user cannot be determined.
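The location-generalization step above can be illustrated by stripping precise fields and keeping only coarse ones. The field names and dictionary representation below are assumptions for the sketch.

```python
def generalize_location(location):
    """Remove precise location fields, keeping only coarse ones
    (city, ZIP code, or state level) so a particular location of a
    user cannot be determined."""
    coarse_fields = ("city", "zip", "state")
    return {k: v for k, v in location.items() if k in coarse_fields}
```

Exact coordinates are dropped before storage, while city- and state-level fields survive for relevance purposes.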
- the user may have control over how information is collected about the user and used by the server.
Abstract
Description
- Current messaging applications allow for non-interactive, one-directional objects in the form of stickers, emojis, photos, GIFs, and sounds. However, these one-directional objects may be considered fleeting because they are limited in applicability. For example, a first user may send an emoji to a second user, who finds the emoji to be funny, but then forgets about the emoji entirely.
- The background description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
- Implementations generally relate to a computer-implemented method to generate a messaging stream where one or more messages are exchanged between a first user and a second user. The method may include generating a messaging stream where one or more messages are exchanged between a first user and a second user. The method further includes receiving a selection of an animated object from the first user for the messaging stream. The method further includes providing the animated object in the messaging stream. The method further includes receiving a first action from the first user related to the animated object. The method further includes modifying a display of the animated object based on the first action. The method further includes receiving a second action from the second user related to the animated object. The method further includes modifying the display of the animated object based on the second action.
- In some implementations, receiving the first action from the first user includes detecting a change in airflow near a microphone associated with a user device. In some implementations, receiving the first action from the first user includes detecting movement of a finger on a touch screen or detection of movement of a pointing device. In some implementations, receiving the first action from the first user includes detecting movement of a user device based on information received from a sensor associated with the user device. In some implementations, receiving the first action from the first user includes receiving a message from the first user and determining a context from the messaging stream and modifying a display of the animated object based on the first action includes modifying the display of the animated object based on the context from the messaging stream. In some implementations, the first action is pulling the animated object in a first direction and the second action is pulling the animated object in a second direction. In some implementations, the method further comprises displaying the one or more messages within the messaging stream while the animated object remains fixed in a portion of the messaging stream. In some implementations, the first user is a chat bot and the animated object is related to the one or more messages exchanged between the first user and the second user. In some implementations, the display of the animated object is modified based on at least one of a word in the one or more messages, voice content, and a context of the messaging stream. In some implementations, the method further comprises identifying a set of objects to provide as options to the first user based on a type of user device on which the messaging stream is displayed, wherein the animated object is part of the set of objects.
- In some implementations, the method includes generating a messaging stream where one or more messages are exchanged between a first user and a second user. The method further includes receiving a selection of an animated object from the first user for the messaging stream. The method further includes causing a version of the animated object to be displayed based on a type of user device on which the animated object is displayed. The method further includes receiving an action from the second user related to the animated object. The method further includes in response to receiving the action, modifying the display of the animated object.
- In some implementations, the selection is a first selection and the animated object is a first animated object and the method further comprises providing the second user with a set of animated objects based on their relationship to the first object, receiving a second selection of a second animated object from the set of animated objects, and in response to receiving the second selection, providing the second object in the messaging stream. In some implementations, the version of the animated object to be displayed based on the type of user device on which the animated object is displayed includes a complex version for a desktop computer, a simpler version for a mobile device, and an even simpler version for a smart watch. In some implementations, the selection is a first selection, the animated object is a first animated object, the action from the second user includes a second selection of a second animated object from the second user for the messaging stream, and modifying the display of the first animated object includes the first animated object interacting with the second animated object.
- In some implementations, the method may include means for generating a messaging stream where one or more messages are exchanged between a first user and a second user. The method further includes means for receiving a selection of an animated object from the first user for the messaging stream. The method further includes means for providing the animated object in the messaging stream. The method further includes means for receiving a first action from the first user related to the animated object. The method further includes means for modifying a display of the animated object based on the first action. The method further includes means for receiving a second action from the second user related to the animated object. The method further includes means for modifying the display of the animated object based on the second action.
- The various implementations described below provide messaging streams that include interactive animated objects. The animated objects described below may include multimedia features, such as features that can be displayed on a display screen, projected in a virtual reality environment, played back as audio, played back via haptic feedback, or a combination of such modalities. In various implementations, based on user consent, the animated objects react to user input, messages exchanged in the messaging stream, messaging context, and a combination of such factors. Interactive animated objects may provide several advantages. For example, such animated objects may enhance the user experience of using a messaging application that implements the messaging stream, e.g., by enabling users to express themselves in interactive ways, rather than being restricted to sending text, images, or stock items such as animated GIF, stickers, and emoji. Further, in implementations that provide animated objects in messaging streams that include multiple users, the animated objects may be acted upon concurrently, or sequentially, by multiple users. Such interactivity may provide geographically separated users an experience of collaboratively modifying an object. A further benefit is that behaviors of the objects may be customizable by users.
- The implementations provided herein efficiently provide interactive animated objects in a messaging stream. For example, the animated objects may be preprogrammed for certain behaviors, e.g., may be configured to respond to various user actions, messaging context, etc. by being modified or displayed in accordance with the user actions or context. In this manner, a technical advantage is that a large or even infinite number of behaviors of animated objects may be provided by a combination of preprogrammed behaviors, without the need to explicitly specify each behavior. This can permit complex animated objects to be stored and displayed with limited computational resources. Animated objects may be downloaded or accessed on demand, e.g., only upon insertion of the animated object in the messaging stream, which reduces storage space required to store the objects. Different versions of an animated object may be provided, as described below, for different user devices. Therefore, devices with limited storage space, network access capacity, and processing power, can still render a suitable version of an animated object. Thus, implementations described below provide animated objects in messaging streams for different types of user devices, without the need for device-specific reconfiguration.
- The disclosure is illustrated by way of example, and not by way of limitation in the figures of the accompanying drawings in which like reference numerals are used to refer to similar elements.
FIG. 1 illustrates a block diagram of an example system that generates a messaging stream with an animated object according to some implementations. -
FIG. 2 illustrates a block diagram of an example computing device that generates the messaging stream with the animated object according to some implementations. -
FIG. 3A illustrates an example user interface of a messaging stream that includes an animated cookie according to some implementations. -
FIG. 3B illustrates an example user interface of a messaging stream that includes the animated cookie with eyes that follow a pointing device controlled by a first user according to some implementations. -
FIG. 3C illustrates an example user interface of a messaging stream that includes the animated cookie that is viewable by multiple users according to some implementations. -
FIG. 3D illustrates an example user interface of a messaging stream that includes the animated cookie with eyes that follow a cursor controlled by a second user according to some implementations. -
FIG. 3E illustrates an example user interface of a messaging stream that includes the animated cookie that reacts to being moved according to some implementations. -
FIG. 3F illustrates an example user interface of a messaging stream that includes the animated cookie that reacts to being placed according to some implementations. -
FIG. 3G illustrates an example user interface of a messaging stream that includes the animated cookie with eyes that move to view text displayed within the messaging stream according to some implementations. -
FIG. 3H illustrates an example user interface of a messaging stream that includes the animated cookie that stays in a fixed location within the messaging stream according to some implementations. -
FIG. 3I illustrates an example user interface of a messaging stream that includes an animated cookie that reacts to a word in a message according to some implementations. -
FIG. 3J illustrates an example user interface of a messaging stream that includes the animated cookie that continues to react to the word in the message according to some implementations. -
FIG. 4A illustrates an example user interface of a messaging stream that includes an animated bear according to some implementations. -
FIG. 4B illustrates an example user interface of the messaging stream that includes the animated bear after a user performed a user action according to some implementations. -
FIG. 5A illustrates an example user interface of a messaging stream that includes animated bubbles according to some implementations. -
FIG. 5B illustrates the example user interface of the messaging stream that includes the animated bubbles after two users performed user actions according to some implementations. -
FIG. 6 illustrates an example user interface of a messaging stream that includes an animated money transfer object according to some implementations. -
FIG. 7A illustrates an example user interface of an animated takeout box according to some implementations. -
FIG. 7B illustrates the example user interface of the animated takeout box after a user performed a user action according to some implementations. -
FIG. 8A illustrates an example user interface of a messaging stream that includes an animated airplane according to some implementations. -
FIG. 8B illustrates an example user interface of the messaging stream that includes an animated couple that is displayed responsive to a user action related to the animated airplane according to some implementations. -
FIG. 9 illustrates a flowchart of an example method to generate a messaging stream that includes an animated object.
- In some implementations, a messaging application generates a messaging stream where messages are exchanged between a first user and a second user. The messaging application may receive a selection of an animated object from the first user for the messaging stream. For example, the first user may select an animated bubble to add to the messaging stream. The messaging application may add the animated object to the messaging stream. For example, the first user and the second user may view the animated bubble in the messaging stream.
- The messaging application may receive a first action from the first user related to the animated object. For example, the first user may blow into a user device and the user device detects a change in airflow near the microphone. The messaging application may modify a display of the animated object based on the first action. For example, the messaging application may show the bubble as moving upwards based on the user blowing into the user device. The messaging application may receive a second action from the second user related to the animated object. For example, the messaging application may receive indications based on the second user pulling the bubble with a finger on the user device. The messaging application may modify the display of the animated object based on the second action. For example, the messaging application may show the bubble as stretching in the direction corresponding to the movement of the finger and then popping.
FIG. 1 illustrates a block diagram of anexample system 100 that generates a messaging stream that includes an animated object. The illustratedsystem 100 includes amessaging server 101,user devices 115 a, 115 n, asecond server 120, and anetwork 105.Users 125 a, 125 n may be associated withrespective user devices 115 a, 115 n. In some implementations, thesystem 100 may include other servers or devices not shown inFIG. 1 . InFIG. 1 and the remaining figures, a letter after a reference number, e.g., “115 a,” represents a reference to the element having that particular reference number. A reference number in the text without a following letter, e.g., “115,” represents a general reference to implementations of the element bearing that reference number. - The
messaging server 101 may include a processor, a memory, and network communication capabilities. In some implementations, themessaging server 101 is a hardware server. Themessaging server 101 is communicatively coupled to thenetwork 105 viasignal line 102.Signal line 102 may be a wired connection, such as Ethernet, coaxial cable, fiber-optic cable, etc., or a wireless connection, such as Wi-Fi®, Bluetooth®, or other wireless technology. In some implementations, themessaging server 101 sends and receives data to and from one or more of theuser devices 115 a, 115 n and thesecond server 120 via thenetwork 105. Themessaging server 101 may include amessaging application 103 a and adatabase 199. - The
messaging application 103 a may be code and routines operable to generate a messaging stream that includes an animated object. In some implementations, themessaging application 103 a may be implemented using hardware including a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC). In some implementations, themessaging application 103 a may be implemented using a combination of hardware and software. - The
database 199 may store animated objects, messaging streams, etc. For example, thedatabase 199 may store the messages between a first user and a second user. Thedatabase 199 may also store social network data associated with users 125, user preferences for the users 125, etc. - The user device 115 may be a computing device that includes a memory and a hardware processor. For example, the user device may include a desktop computer, a mobile device, a tablet computer, a mobile telephone, a wearable device, a head-mounted display, a mobile email device, a portable game player, a portable music player, a reader device, or another electronic device capable of accessing a
network 105. - In the illustrated implementation, user device 115 a is coupled to the
network 105 via signal line 108 and user device 115n is coupled to the network 105 via signal line 110. Signal lines 108 and 110 may be wired or wireless connections. User devices 115a, 115n are accessed by users 125a, 125n, respectively. The user devices 115a, 115n in FIG. 1 are used by way of example. While FIG. 1 illustrates two user devices, 115a and 115n, the disclosure applies to a system architecture having one or more user devices 115. - In some implementations, the user device 115 can be a user device that is included in a wearable device worn by the user 125. For example, the user device 115 is included as part of a clip (e.g., a wristband), part of jewelry, or part of a pair of glasses. In another example, the user device 115 can be a smart watch. The user 125 may view an animated object from the
messaging application 103 on a display of the device worn by the user 125. For example, the user 125 may view the animated object on a display of a smart watch or a smart wristband. - In some implementations,
messaging application 103b may be stored on a user device 115a. The messaging application 103 may include a thin-client messaging application 103b stored on the user device 115a and a messaging application 103a that is stored on the messaging server 101. For example, the messaging application 103b stored on the user device 115a may display a messaging stream that includes an animated object. The user device 115a may identify a user action from a first user, such as shaking the user device 115a to make snow fall over an animated object of a snowman. The user device 115a may transmit information about the messaging stream and the user action to the messaging application 103a stored on the messaging server 101, which provides the information to a second user that accesses the messaging application 103a from a desktop computer. - The
second server 120 may include a processor, a memory, and network communication capabilities. The second server 120 may access the network 105 via signal line 109. The second server 120 may receive information from the messaging application 103 about the messaging stream and provide information to the messaging application 103. For example, the second server 120 may be associated with a bank and the second server 120 may communicate with the messaging application 103 to pay a bill using the bank information. A user 125 may instruct the messaging application 103 to authorize the second server 120 to pay a bill by clicking on an animated object of a bag of money. Once the transaction is complete, the second server 120 may send the messaging application 103 a notification that the transaction is complete. The messaging application 103 may modify the animated object to show the money moving from the money bag to the bank associated with the second server 120. In another example, the second server 120 may include a bot that performs functions for a user 125, such as ordering food, scheduling an appointment, making a reservation, booking a flight, etc. In yet another example, the second server 120 may include a separate application that provides information to the messaging application 103, such as a calendar application that sends, upon user consent, information about the user's meetings. - In the illustrated implementation, the entities of the
system 100 are communicatively coupled via a network 105. The network 105 may be a conventional type, wired or wireless, and may have numerous different configurations including a star configuration, token ring configuration, or other configurations. Furthermore, the network 105 may include a local area network (LAN), a wide area network (WAN) (e.g., the Internet), and/or other interconnected data paths across which multiple devices may communicate. In some implementations, the network 105 may be a peer-to-peer network. The network 105 may also be coupled to or include portions of a telecommunications network for sending data in a variety of different communication protocols. In some implementations, the network 105 includes Bluetooth® communication networks, WiFi®, or a cellular communications network for sending and receiving data including via short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP), direct data connection, email, etc. Although FIG. 1 illustrates one network 105 coupled to the user devices 115 and the messaging server 101, in practice one or more networks 105 may be coupled to these entities. - In situations in which the systems and methods discussed herein may collect or use personal information about users (e.g., user data, information about a user's social network, user's location, user's biometric information, user's activities and demographic information, videos that the messaging server stores and analyzes), users are provided with opportunities to control whether information is collected, whether the personal information is stored, whether the personal information is used, whether the videos are analyzed, and how the information about the user is collected, stored, and used. That is, the systems and methods discussed herein collect, store, and/or use user personal information only upon receiving explicit authorization from the relevant users to do so.
For example, a user is provided with control over whether programs or features collect user information about that particular user or other users relevant to the program or feature. Each user for which personal information is to be collected is presented with one or more options to allow control over the information collection relevant to that user, to provide permission or authorization as to whether the information is collected and as to which portions of the information are to be collected. For example, users can be provided with one or more such control options over a communication network. In addition, certain data may be treated in one or more ways before it is stored or used so that personally identifiable information is removed. As one example, a user's identity information may be treated, e.g., anonymized, so that no personally identifiable information can be determined from a video. As another example, a user's geographic location may be generalized to a larger region so that the user's particular location cannot be determined.
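The location-generalization step described above can be sketched as snapping coordinates to a coarse grid; the function name and grid size below are illustrative assumptions, not part of the disclosure.

```python
def generalize_location(lat, lon, grid_degrees=1.0):
    """Illustrative sketch: generalize a precise geographic location to a
    larger region by snapping it to the center of a coarse grid cell, so
    the user's particular location cannot be determined. The grid size is
    an assumed parameter, not specified by the disclosure."""
    def snap(value):
        # Round down to the cell boundary, then report the cell center.
        return (value // grid_degrees) * grid_degrees + grid_degrees / 2.0
    return (snap(lat), snap(lon))
```

Any coordinate inside the same one-degree cell maps to the same output, so stored data cannot be traced back to a precise location.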
-
FIG. 2 illustrates a block diagram of an example computing device 200 that generates a messaging stream that includes an animated object. The computing device 200 may be a messaging server 101 or a user device 115. The computing device 200 may include a processor 235, a memory 237, a communication unit 239, a display 241, a speaker 243, a sensor 245, and a storage device 247. Additional components may be present or some of the previous components may be omitted depending on the type of computing device 200. For example, if the computing device 200 is the messaging server 101, the computing device 200 may not include the display 241, the speaker 243, or the sensor 245. A messaging application 103 may be stored in the memory 237. In implementations where the computing device 200 is a wearable device, the computing device 200 may not include the storage device 247. In some implementations, the computing device 200 may include other components not listed here, such as a battery, etc. The components of the computing device 200 may be communicatively coupled by a bus 220. - The
processor 235 includes an arithmetic logic unit, a microprocessor, a general-purpose controller, or some other processor array to perform computations and provide instructions to a display device. Processor 235 processes data and may include various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, or an architecture implementing a combination of instruction sets. Although FIG. 2 includes a single processor 235, multiple processors 235 may be included. Other processors, operating systems, sensors, displays, and physical configurations may be part of the computing device 200. The processor 235 is coupled to the bus 220 for communication with the other components via signal line 222. - The
memory 237 stores instructions that may be executed by the processor 235 and/or data. The instructions may include code for performing the techniques described herein. The memory 237 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, or some other memory device. In some implementations, the memory 237 also includes a non-volatile memory, such as flash memory, or a similar permanent storage device and media including a hard disk drive, a compact disc read only memory (CD-ROM) device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, or some other mass storage device for storing information on a more permanent basis. The memory 237 includes code and routines operable to execute the messaging application 103, which is described in greater detail below. The memory 237 is coupled to the bus 220 for communication with the other components via signal line 224. - The
communication unit 239 transmits and receives data to and from at least one of the user device 115 and the messaging server 101, depending upon where the messaging application 103 may be stored. In some implementations, the communication unit 239 includes a port for direct physical connection to the network 105 or to another communication channel. For example, the communication unit 239 includes a universal serial bus (USB), secure digital (SD), category 5 cable (CAT-5), or similar port for wired communication with the user device 115 or the messaging server 101, depending on where the messaging application 103 may be stored. In some implementations, the communication unit 239 includes a wireless transceiver for exchanging data with the user device 115, messaging server 101, or other communication channels using one or more wireless communication methods, including IEEE 802.11, IEEE 802.16, Bluetooth®, or another suitable wireless communication method. The communication unit 239 is coupled to the bus 220 for communication with the other components via signal line 226. - In some implementations, the
communication unit 239 includes a cellular communications transceiver for sending and receiving data over a cellular communications network including via short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP), direct data connection, e-mail, or another suitable type of electronic communication. In some implementations, the communication unit 239 includes a wired port and a wireless transceiver. The communication unit 239 also provides other conventional connections to the network 105 for distribution of files and/or media objects using standard network protocols including, but not limited to, user datagram protocol (UDP), TCP/IP, HTTP, HTTP secure (HTTPS), simple mail transfer protocol (SMTP), SPDY, quick UDP internet connections (QUIC), etc. - The
display 241 may include hardware operable to display graphical data received from the messaging application 103. For example, the display 241 may render graphics to display the messaging stream and the animated object. The display 241 is coupled to the bus 220 for communication with the other components via signal line 228. - The
speaker 243 may include hardware operable to emit sounds. For example, in response to a user performing an action, the action module 206 may instruct the speaker 243 to emit a sound. The speaker 243 is coupled to the bus 220 for communication with the other components via signal line 230. - The
sensor 245 may include hardware operable to detect changes to the user device 115. For example, the sensor 245 may include motion sensors that measure movement of the computing device 200, such as an accelerometer and a gyroscope that detect acceleration forces and rotational forces along the x, y, and z axes. The sensor 245 may also include position sensors that measure the physical position of the user device 115, such as orientation sensors and magnetometers. The sensor 245 may also include hardware that detects sounds and/or pressure changes, such as a microphone that detects changes in airflow when a user blows on the computing device 200. The sensor 245 may detect contact on a touch screen of the computing device 200. For example, the sensor 245 may detect a user's finger touching the touch screen and movement of the user's finger. The sensor 245 is coupled to the bus 220 for communication with the other components via signal line 232. - The
storage device 247 may be a non-transitory computer-readable storage medium that stores data that provides the functionality described herein. In implementations where the computing device 200 is the messaging server 101, the storage device 247 may include the database 199 in FIG. 1. The storage device 247 may be a DRAM device, an SRAM device, flash memory, or some other memory device. In some implementations, the storage device 247 also includes a non-volatile memory or similar permanent storage device and media including a hard disk drive, a CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device, or some other mass storage device for storing information on a permanent basis. The storage device 247 is coupled to the bus 220 for communication with the other components via signal line 234. - The
messaging application 103 may include a messaging module 202, an animation module 204, an action module 206, and a user interface module 208. - The
messaging module 202 generates a messaging stream. In some implementations, the messaging module 202 includes a set of instructions executable by the processor 235 to generate the messaging stream. In some implementations, the messaging module 202 is stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235. - In some implementations, the
messaging module 202 generates a messaging stream that includes data sent to and from users and chat bots, for example, by sending data to a user device 115, a messaging server 101, and/or a second server 120. The messaging stream may include one or more messages, where the messages have certain characteristics, such as a sender, a recipient, and message content including text, an animated object, images, video, and message metadata. The message metadata may include a timestamp, an originating device identifier, an expiration time, a retention time, various formats and effects applied, etc. In some implementations, the messaging stream includes a displayed messaging stream that includes messages displayed in chronological order within a user interface with various formats and effects applied. - The messaging stream may be used as part of different messaging platforms, such as part of an instant messaging application, a short-message service (SMS), an email application, an enhanced-message service (EMS), a multimedia-message service (MMS), push messaging (e.g., HDML, WAP push, etc.), application-to-application messaging, etc. The messages may be available for a limited amount of time, archived for an indeterminate time, etc. The messages may be encrypted. In some implementations, the
messaging module 202 generates messages that are independent of the animated objects and inaccessible to the animated objects. In some implementations, the messages are available to the animated objects and are used to modify the display of the animated objects, for example, when an animated object appears to react to content in a message. - In some implementations, the
messaging module 202 instructs the user interface module 208 to generate a user interface that includes the messaging stream. The user interface may include fields for entering text, videos, images, emojis, etc. The messaging module 202 receives messages between users and instructs the user interface module 208 to display the messages in the messaging stream. For example, user 125a enters text via the user interface that says "Hey Carl!" The messaging module 202 on the user device 115a transmits the message to user 125n and the message is displayed on the user device 115n. In some implementations, the message is transmitted from the user device 115a to the messaging server 101, which transmits the message to the user device 115n. - The animation module 204 generates animated objects. In some implementations, the animation module 204 includes a set of instructions executable by the
processor 235 to generate the animated object. In some implementations, the animation module 204 is stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235. - In some implementations, the animation module 204 generates a group of animated objects. The animated objects may include cartoons; caricatures of people (famous people, avatars, generic people, etc.); animals (e.g., a bear); inanimate objects (e.g., a cookie); abstract moving objects (e.g., a swirling pattern with eyes and a mouth); etc.
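The grouping of animated objects by type described in this section might be sketched as follows; the type labels and object names in the catalog are illustrative assumptions, not part of the disclosure.

```python
from collections import defaultdict

# Illustrative catalog of (object_name, type) pairs modeled on the
# categories listed above; the specific entries are assumptions.
CATALOG = [
    ("bear", "animal"),
    ("cookie", "inanimate object"),
    ("swirl_with_eyes", "abstract moving object"),
    ("famous_person", "caricature"),
]

def organize_by_type(catalog):
    """Group animated objects by type so the user interface module can
    display an organized group from which the user selects an object."""
    groups = defaultdict(list)
    for name, kind in catalog:
        groups[kind].append(name)
    return dict(groups)
```

The user interface module would then render one section per type, which matches the "organized group of animated objects" presentation described below.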
- The animation module 204 may instruct the
user interface module 208 to provide the group of animated objects in a user interface. The animation module 204 may organize the group of animated objects according to type and instruct the user interface module 208 to display an organized group of animated objects from which the user may select an animated object. - In some implementations, the animation module 204 may instruct the
user interface module 208 to display options for customizing an animated object. For example, the user interface may include options for selecting a color or a size of an animated object. In some implementations, the options for customizing the animated object may be specific to the type of animated object. For example, the user interface may include an option to select an eye color for animated objects that have eyes. In some implementations, the user interface may include options for limiting the amount of information and/or actions available to the animated object. For example, the user interface may include options for disabling certain behaviors, such as nodding or producing a sound. In another example, the user interface may include options for disabling certain types of actions based on privacy concerns, such as disabling context-based animations that react based on words in the messages, while retaining animations that are in response to explicit user input, such as a user shaking the computing device 200. - In some implementations, the animation module 204 may instruct the
user interface module 208 to provide a group of animated objects that are different depending on the type of computing device 200 being used to select an animated object. For example, where the computing device 200 is a smart watch, the group may include a subset of the animated objects that may be appropriate for the display on the smart watch. In some implementations, the group of animated objects may include different versions that are used for different devices. An animated object may include a complicated version for a desktop computer, a simpler version for a mobile device, and an even simpler version for a smart watch. For example, an animated object on the desktop may include a cartoon of a man with a background scene, the animated object on the mobile device may include the cartoon of the man, and the animated object on the smart watch may include a cartoon of the man's face. - In some implementations, the animation module 204 may instruct a
computing device 200 to provide different attributes of the animated object based on the type of computing device 200. For example, for a single animated object, the animation module 204 may instruct a computing device 200 that corresponds to a smart watch to provide vibrations and/or a sound, a computing device 200 that corresponds to a mobile device to provide a visual display, and a computing device 200 that corresponds to virtual reality goggles to provide a three-dimensional rendition of the user interface. The three-dimensional rendition may include placing the animated object at different depths in the user interface. - The animation module 204 receives a selection of an animated object from a user. The animation module 204 generates the animated object and instructs the
user interface module 208 to display the animated object in the messaging stream. The user interface module 208 may display the animated object in different parts of the messaging stream. For example, the user interface module 208 may display the animated object at the top of the user interface, at the bottom of the user interface, or in the middle of the user interface. In another example, the user interface module 208 may display the one or more messages within the messaging stream while the animated object remains fixed in a portion of the messaging stream. In yet another example, the user interface module 208 may display the animated object according to a time that the first user selected the animated object, so that the animated object is located after content that was provided before the animated object was selected and before content that is provided after the animated object was selected. In yet another example, the user interface module 208 may display the animated object in random locations. The user interface module 208 may also change the location of the animated object based on actions performed by a user, as described in greater detail below. - The
user interface module 208 displays the animated object within the user interface for each person that is viewing the same messaging stream. For example, the messaging stream may be viewed by a single user that is making notes for himself, by a first user and a second user, or by a group of users. In some implementations, the user interface module 208 places the animated object in different locations based on characteristics associated with different users. For example, a first user may provide user input that the first user prefers to see the animated object at the bottom right-hand part of the screen. In another example, the animated object may be located in the center of the messaging stream for users from a group that interact with the animated object and at the top left of the messaging stream for users that have not interacted with the animated object. In some implementations, the animated object is displayed differently depending on the position of the animated object in the messaging stream, the time the animated object has been displayed, and/or how long since a user action associated with the animated object occurred. - In some implementations, the animated object may look different to multiple users that are viewing the same messaging stream based on metadata associated with each of the users. The metadata may include a time of day associated with a user device, a location associated with a user device, a time zone associated with a user device, user preferences associated with a user, etc. The animation module 204 may obtain information from the
sensor 245 to determine information from the metadata. For example, the animation module 204 may determine the user's location, if the user has consented to such a determination by the animation module 204, based on a sensor 245 that provides global positioning system (GPS) information. The animation module 204 may modify an animated object based on the metadata. For example, where the animated object is an animation of a person, the person may wear shorts if the user is located in an area with warm weather, a jacket if the user is located in an area with cold weather, etc. The animation module 204 may display the animation of the person with different clothing or different behavior based on the time of day of the user. For example, the animation module 204 may display the animation of the person in pajamas and the animation may fall asleep if the time of day of the user is at night. In some implementations, upon consent of the user, the animation module 204 may modify the animated object based on user data. For example, the animation module 204 may instruct the user interface module 208 to display the animated object with an alarm that goes off when the user has configured an alarm on the computing device 200, the animated object may display a countdown associated with a timer that the user configured on the computing device 200, the animated object may display an upcoming calendar event based on a calendar entry associated with the second server 120, etc. - In some implementations, the animation module 204 instructs the
user interface module 208 to provide the second user with a set of animated objects based on their relationships to the first animated object. For example, where the first animated object conveys a message, such as "I'll be home soon," the set of animated objects may include responses to the message. The user interface module 208 may replace a first animated object with the second animated object, or the second animated object may be displayed in the same messaging stream as the first animated object. In some implementations, the first animated object and the second animated object interact with each other. - The
action module 206 modifies the animated object based on a user action. In some implementations, the action module 206 includes a set of instructions executable by the processor 235 to modify the animated object. For example, the action module 206 may program the animated object (e.g., the animated object may include stored code or a prerecorded animation) to react to the user action. In some implementations, the action module 206 is stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235. - The
action module 206 receives a first action from a first user related to the animated object. The action module 206 determines a type of action that occurred and instructs the user interface module 208 to modify a display of the animated object based on the action. For example, when the first user taps on an animated object, the action module 206 instructs the user interface module 208 to display hearts coming off of the animated object. In some implementations, the users that view the same messaging stream will see the hearts coming off of the animated object. The action module 206 receives a second action from a second user related to the animated object. For example, the second user makes a swiping motion, which the action module 206 detects. The action module 206 instructs the user interface module 208 to modify the display based on the second action. Continuing with the example above, the action module 206 instructs the user interface module 208 to display the animated object as grabbing the hearts with the animated object's mouth. FIGS. 3A-3J below provide an example of different user actions that cause the action module 206 to determine an action and the user interface module 208 to modify a display of the animated object based on the action. - The actions may include taps, swipes, making noise, changing pressure, moving a pointing device (e.g., a mouse moving an arrow), moving the
computing device 200, capturing an image, providing a message with a context, providing a message with a word in the message, providing voice content, selecting options provided by the user interface, selecting another animated object, etc. - Turning to
FIG. 3A, an example user interface 300 is illustrated of a messaging stream that includes an animated cookie according to some implementations. In this example, the messaging stream may include a list 301 of the users that are participating in the messaging stream. A first user may be associated with a first user icon 302 in the messaging stream. The first user may select an animated cookie 303 as the animated object and the user interface module 208 may display the animated cookie 303 within a word bubble 304. -
FIG. 3B illustrates an example user interface 310 of a messaging stream that includes the animated cookie 303 with eyes that follow a cursor 311 controlled by a pointing device that is controlled by a first user according to some implementations. In this example, the user interface module 208 illustrates the cursor 311 with a circle; however, other variations are possible, such as an arrow, a rectangle, etc. The action module 206 determines that a pointing device is moved above the location of the animated cookie 303 in the user interface. This is illustrated in FIG. 3B with the cursor 311 above the animated cookie 303. The action module 206 instructs the user interface module 208 to modify a display of the animated cookie 303. For example, the animated cookie 303 is updated to show animated eyes that move and track the movement of the cursor 311 as the cursor 311 moves above the animated cookie 303. - The
animated cookie 303 may follow the location of the cursor 311 with the same speed of movement as the cursor 311. For example, if the cursor 311 moves up and down quickly, the animated cookie 303 may include eyes that are animated to look up and down as quickly as the cursor 311. In some implementations, moving the cursor 311 at a particular speed could cause the animated object to react. In this example, moving the cursor 311 around the animated cookie 303 several times may cause the animated cookie 303 to look dizzy, for example, by depicting stars floating around the animated cookie 303. -
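The cursor-tracking behavior described above amounts to recomputing a gaze direction from the animated object toward the cursor on each frame; the coordinate conventions and function name below are illustrative assumptions, not part of the disclosure.

```python
import math

def eye_gaze_angle(object_pos, cursor_pos):
    """Illustrative sketch: the angle, in degrees, at which the animated
    object's eyes should look to track the cursor (0 = to the right,
    90 = straight up; the coordinate convention is an assumption)."""
    dx = cursor_pos[0] - object_pos[0]
    dy = cursor_pos[1] - object_pos[1]
    # atan2 handles all four quadrants, so the eyes can track the
    # cursor anywhere around the object.
    return math.degrees(math.atan2(dy, dx))
```

Because the angle is recomputed every frame, the eyes move as quickly as the cursor does; a separate counter of completed revolutions around the object could trigger the "dizzy" reaction.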
FIG. 3C illustrates an example user interface 320 of a messaging stream that includes the animated cookie 303 that is viewable by multiple users according to some implementations. In this example, the second user is associated with a second user icon 321 in the messaging stream. The animated cookie 303 remains in the same location because the animated cookie 303 is part of the word bubble 304. -
FIG. 3D illustrates an example user interface 330 of a messaging stream that includes the animated cookie 303 with eyes that follow a pointing device 331 controlled by the second user according to some implementations. In this example, the action module 206 determines a location of the cursor 311 and the user interface module 208 modifies the display of the animated cookie 303 to look above and to the left of where the cursor 311 is located. - The second user moves the
cursor 311 to the animated cookie 303 in order to move the animated cookie. In this example, the user interface module 208 also includes informative text below the second icon 321 to inform the users that the second user is moving the animated cookie 303. -
FIG. 3E illustrates an example user interface 340 of a messaging stream that includes the animated cookie 303 that reacts to being moved according to some implementations. In this example, as the second user moves the animated cookie 303, the user interface module 208 modifies the display of the animated cookie 303 to look as though the cursor 311 is tickling the animated cookie 303 by modifying the animated cookie 303 to have scrunched-up eyes and a pursed smiling mouth. Because the second user moved the animated cookie 303 outside of the word bubble associated with the first user, the user interface module 208 creates a third icon 341 associated with the first user that indicates that the first user created the animated cookie by calling the animated cookie 303 an active sticker. The second user may move the animated cookie 303 anywhere within the messaging stream. In some implementations, the animated cookie stays fixed wherever a user places it. -
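The drag-and-place behavior in FIG. 3E, where the user may move the animated object anywhere within the messaging stream and it stays fixed where placed, can be sketched as clamping the drop position to the stream's bounds; the bounds-checking approach and function name are illustrative assumptions, not part of the disclosure.

```python
def place_animated_object(drop_x, drop_y, stream_width, stream_height):
    """Illustrative sketch: clamp the position where a user drops an
    animated object so the object is pinned at a point inside the
    messaging stream. Returns the (x, y) position where it stays fixed."""
    x = min(max(drop_x, 0), stream_width)
    y = min(max(drop_y, 0), stream_height)
    return (x, y)
```

A drop outside the visible stream snaps to the nearest edge, so the object always remains visible to the participants.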
FIG. 3F illustrates an example user interface 350 of a messaging stream that includes an animated cookie 303 that reacts to being placed according to some implementations. The user interface module 208 illustrates the animated cookie 303 as continuing to laugh as if it had been tickled by the pointing device. The user interface module 208 also modifies the text below the second icon 321 to inform the other users that the second user placed the animated cookie 303. -
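The sequence in FIGS. 3E-3F, where the cookie is tickled while being moved and continues to laugh after being placed, suggests a small animation state machine; the state names, event names, and transition table below are illustrative assumptions, not part of the disclosure.

```python
# Illustrative transition table: (current_state, event) -> next_state,
# modeled on the tickle/place/laugh sequence of FIGS. 3E-3F.
TRANSITIONS = {
    ("idle", "drag"): "tickled",
    ("tickled", "place"): "laughing",
    ("laughing", "timeout"): "idle",
}

def next_state(state, event):
    """Return the animated object's next display state, staying in the
    current state for events with no defined transition."""
    return TRANSITIONS.get((state, event), state)
```

Keeping the reaction as a state rather than a one-shot effect is what lets the cookie continue laughing after the drag ends.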
FIG. 3G illustrates an example user interface 360 of a messaging stream that includes an animated cookie 303 with eyes that move to view text displayed within the messaging stream according to some implementations. The action module 206 determines that the first user entered text within the messaging stream. The user interface module 208 modifies the animated cookie 303 to move the eyes to look downward as if the animated cookie 303 is reading the text. In some implementations, the animated cookie 303 may not access the content of the text, but instead merely reacts to the appearance of text. In some implementations, upon user consent, the animated cookie 303 has access to the text. This feature may be disabled, for example, if the user declines permission for the action module 206 to detect that the user entered text or received a message. -
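The consent gating described in FIG. 3G, where the animated object reacts only to the appearance of text unless the user has granted access to the content, might be sketched as follows; the reaction names and function name are illustrative assumptions, not part of the disclosure.

```python
def react_to_new_text(message, has_content_consent):
    """Illustrative sketch: choose a reaction when new text appears.
    Without consent, the object reacts only to the fact that text
    appeared; with consent, the content may inform the reaction.
    The reaction names are assumptions."""
    if not has_content_consent:
        return "look_at_text"   # reacts to the appearance of text only
    if "bye" in message.lower():
        return "fall_asleep"    # content-aware reaction
    return "look_at_text"
```

Routing every message through one gate like this keeps content-aware behavior impossible to reach when consent has not been given.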
FIG. 3H illustrates an example user interface 370 of a messaging stream that includes an animated cookie 303 that stays in a fixed location within the messaging stream according to some implementations. As more messages are added within the messaging stream, the messages scroll upward. Since the animated cookie 303 is placed in a fixed position, it stays within the center of the messaging stream. -
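One way to keep a placed object fixed while messages scroll, as described above, is to store the object's position in viewport coordinates so the scroll offset applies only to the message list. The sketch below is an assumption about how this could work, not the patent's actual code:

```python
# Illustrative sketch: on a scroll event, messages move by the scroll
# delta while the placed animated object keeps its viewport position.

def on_scroll(message_y: int, object_y: int, scroll_delta: int):
    """Return the new (message_y, object_y) after scrolling up by delta."""
    return message_y - scroll_delta, object_y
```

With this scheme, the animated cookie stays in the center of the stream no matter how far the conversation scrolls. -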
FIG. 3I illustrates an example user interface 380 of a messaging stream that includes an animated cookie 303 that reacts to a word in the message 381, if the users participating in the messaging stream provide consent to access the messages exchanged in the messaging stream, according to some implementations. In this example, the first user provides a message that states "Oh! I have to go . . . chat later." Based upon user consent to access the message, the action module 206 determines that the words "I have to go . . . chat later" indicate that the first user is about to end the chat. The action module 206 may determine a reaction for the animated cookie 303 based on the message 381 by using machine learning, comparing words in the message to lists of words associated with different contexts, etc. For example, such a determination may be made by a comparison of one or more of the words with words known to be associated with ending a chat, e.g., based upon prior training data, based upon clustering previous messages, etc. The user interface module 208 modifies the display of the animated cookie 303 to start to fall asleep (e.g., by illustrating droopy eyes) based on the first user's message. -
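The word-list comparison described above can be sketched in a few lines. The word lists, context names, and reaction cues below are made up for illustration; a real system might instead use the machine-learning approach the text mentions:

```python
# Hypothetical word-list matcher: words in a message are compared against
# lists associated with different contexts, and the first matching context
# selects the animated object's reaction. Vocabulary is illustrative.

CONTEXT_WORDS = {
    "ending_chat": {"bye", "later", "go", "goodnight"},
    "happy": {"yay", "great", "awesome"},
}

REACTIONS = {"ending_chat": "droopy_eyes", "happy": "smile"}

def reaction_for(message: str) -> str:
    """Pick a reaction cue based on the words in the message."""
    words = {w.strip(".!?,").lower() for w in message.split()}
    for context, vocab in CONTEXT_WORDS.items():
        if words & vocab:
            return REACTIONS[context]
    return "idle"
```

Running the example message from above through the matcher would hit the "ending_chat" list via the word "go", producing the droopy-eyes reaction. -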
FIG. 3J illustrates an example user interface 390 of a messaging stream that includes the animated cookie 303 that continues to react to the word within the message according to some implementations. The action module 206 determines, when consent is provided by the second user for access to messages in the messaging stream, that the second user is also about to leave the messaging stream based on the words associated with the second user stating "Ok! Bye." Based on the action module 206 determining a meaning of the words, the user interface module 208 modifies the display of the animated cookie 303 to fall asleep (e.g., by displaying the animated cookie 303 saying "ZZZ"). - In some implementations, the action from the user includes the user blowing on a
computing device 200. For example, the user could blow on a mobile device or blow while wearing a headset. The sensor 245 may include a microphone that detects a change in airflow and transmits information about the change in airflow to the action module 206. The action module 206 may determine a degree of the change in airflow and instruct the user interface module 208 to move the animated object based on the degree of the change in airflow. For example, the animated object may be a box of tissues, and blowing on the computing device 200 causes tissues to move out of the box. If the user blows slightly on the computing device 200, a few of the tissues come out of the box; if the user blows hard, several of the tissues come out. In another example, the animated object is a character, such as a cartoon fox, and blowing on the computing device 200 causes the fox to hang on to the side of the messaging stream window. In yet another example, the animated object is a bubble, and blowing on the computing device 200 makes the bubble increase in size corresponding to the extent of the first user's blowing. - In some implementations, the action from the user includes moving the
computing device 200. The action module 206 may receive information from a sensor 245 (e.g., an accelerometer or a gyroscope) and determine a degree of the movement. The action module 206 may instruct the user interface module 208 to illustrate the animated object with additional changes. For example, the user shaking the user's smartwatch or other mobile device causes the action module 206 to instruct the user interface module 208 to illustrate items moving, such as snow falling from the sky, nuts and bolts coming loose, a character's hair becoming disarrayed, the character shaking a fist at a user, etc. In another example, when the user moves the computing device 200 to the user's ear, the action module 206 may receive information from the sensor 245 indicating that the user moved the computing device 200 to the user's ear and instruct the speaker 243 to emit a noise. In some examples, the animated object emits a different noise depending on how the computing device 200 is moved, such as a yip if the user moves the computing device 200 to the user's left ear and a bark if the user moves the computing device 200 to the user's right ear. - Turning to
FIG. 4A, an example user interface 400 of a messaging stream is illustrated that includes an animated bear 405 according to some implementations. A first user may select an animated bear 405 and place the animated bear 405 in the center of the messaging stream below the messages. Upon user consent, the action module 206 may identify that Steve provided a message stating "There might be bears at that picnic spot." The action module 206 may instruct the user interface module 208 to modify the display of the bear by animating the eyes of the animated bear 405 to move back and forth to give the appearance that the animated bear 405 is reading the text. -
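The sensor-driven reactions described above, where the degree of a change in airflow or of a device movement scales the animation, could be sketched as below. The threshold values, sensor labels, and cue names are assumptions made for illustration:

```python
# Illustrative sketch (not the patent's code): the magnitude reported by a
# sensor selects how strongly the animated object responds.

def animate_from_sensor(kind: str, magnitude: float) -> str:
    """Map a sensor reading to an animation cue."""
    if kind == "airflow":
        # A slight blow frees a few tissues; a hard blow frees several.
        return "few_tissues" if magnitude < 0.5 else "many_tissues"
    if kind == "shake":
        # A stronger shake produces a more dramatic animation.
        return "hair_disarrayed" if magnitude >= 1.0 else "snow_falling"
    return "idle"
```

The action module would call such a mapping with the degree it computed from the microphone or accelerometer reading and pass the resulting cue to the user interface module. -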
FIG. 4B illustrates an example user interface 425 of the messaging stream that includes the animated bear 430 after a user performed a user action according to some implementations. For example, the second user may turn the user device 115 upside down. The action module 206 receives information from a sensor 245 and determines that the degree of the movement of the user device 115 is 180 degrees. The action module 206 instructs the user interface module 208 to modify the display to show the animated bear 430 as upside down. - In another example, the
action module 206, upon user consent, may identify that Karen responded to Steve with "We'll fight them off!" The action module 206 determines that "fight" is an instruction associated with the animated bear 430 and instructs the user interface module 208 to modify the display of the animated bear 430 to show the animated bear 430 initially fighting and then falling upside down. - In some implementations, the action from the user includes movement within the messaging stream. The movement may be the user touching a touch screen of a mobile device or movement of a pointing device, such as a mouse. The animated object may be a character with eyes (e.g., a smiley face). The
action module 206 may receive information about the movement from the sensor 245 (e.g., a touch screen) and determine the direction of the movement. The action module 206 instructs the user interface module 208 to move the eyes of the animated object to correspond to the direction of the movement from the first user. In another example, the movement may stretch the animated object. - Turning to
FIG. 5A, an example user interface 500 of a messaging stream that includes animated bubbles is illustrated according to some implementations. The first user uses a pointing device associated with a computing device 200 to pull a first arrow 502 in a first direction. The second user uses a pointing device associated with another computing device 200 to pull a second arrow 503 in a second direction. The first arrow 502 and the second arrow 503 pull the bubble 501 in opposite directions. The action module 206 identifies the first action and the second action as a tug of war on the bubble 501 because the two pulls are in opposite directions. -
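One simple way to recognize the opposing pulls described above is to treat each user's drag as a vector and check whether the two vectors point in roughly opposite directions, i.e., whether their dot product is negative. This is a sketch under that assumption, not the patent's stated method:

```python
# Hypothetical tug-of-war detector: two pull vectors oppose each other
# when their dot product is negative. Vectors are (dx, dy) drag deltas.

def is_tug_of_war(pull_a: tuple, pull_b: tuple) -> bool:
    dot = pull_a[0] * pull_b[0] + pull_a[1] * pull_b[1]
    return dot < 0  # negative dot product means opposing directions
```

For example, one user dragging left while the other drags right would register as a tug of war, while two drags in the same direction would not. -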
FIG. 5B illustrates the example user interface 525 of the messaging stream that includes the animated bubbles after two users performed user actions according to some implementations. The action module 206 determines how much each of the users pulls the bubble with the first arrow 502 and the second arrow 503. In some implementations, the action module 206 applies animation rules to determine a winner of the tug of war by determining whether the first user or the second user first pulled the bubble a threshold number of pixels in the user interface. Other ways to determine a winner are possible, such as determining which user stretched the bubble a threshold pixel distance outside of a boundary. - In some implementations, when participant users in the messaging stream consent to such use of messages, the
action module 206 identifies a word in a message or a context of the messaging stream. The action module 206 may compare words in a message to a list of words associated with different meanings, such as different emotions (e.g., happy, sad, angry, etc.), different states (e.g., about to end the chat), etc. The action module 206 may use machine learning to predict a meaning associated with a user based on the message, where the machine learning may be based on all messages available to the action module 206 or, subject to user consent, the messages associated with a particular user. The context of the messaging stream may be based on user patterns, a time of day, a location of the computing device 200, etc., and is determined only upon specific user consent to access such data. For example, a user may typically end a messaging stream at the end of a workday. As a result, the action module 206 may instruct the user interface module 208 to modify a display of the animated object based on the context. - In some implementations, the
action module 206 may, responsive to user consent, identify voice content from the user. The action module 206 may convert the speech to text and identify an intent of the voice content. For example, the action module 206 may identify the user providing verbal instructions for an abstract animated object that include "Jump around." Based on the verbal content, the action module 206 instructs the user interface module 208 to modify the display of the abstract animated object to show it moving up and down. - In some implementations, a first user may be a chat bot (e.g., an automated chat program) that provides services to a user. The animated object may be related to one or more messages exchanged between the chat bot and the user. For example, the user may instruct the chat bot to make a reservation, a purchase, etc. by entering a message. The
action module 206 may, upon user consent, instruct the user interface module 208 to modify the animated object based on the message. For example, if the user instructs the chat bot to make a reservation, the animated object includes a graphic associated with making a reservation. - Turning to
FIG. 6, an example user interface 600 of a messaging stream is illustrated that includes an animated money transfer object 601 according to some implementations. In this example, user Sara consents to the use of a chat bot to perform actions for her by instructing the Bankbot to pay off a credit card statement. The user provides a message 602 that states: "@Bankbot Pay off my credit card statement." The Bankbot responds with a message 603 that states: "Confirmed. Payment posted." Upon user consent for the action module 206 to access the user's message, the action module 206 determines that the user instructed the chat bot to transfer money from the user's account to a credit card company. The action module 206 instructs the user interface module 208 to display an animated money transfer object 601 that shows money moving from a money bag to a credit card. - In some implementations, various examples described above may be combined. For example, a messaging stream may include multiple users and a chat bot. The
action module 206 may detect a movement of a user on the computing device 200 and instruct the user interface module 208 to modify a display of the animated object based on messages related to the chat bot and the movement of the user. - Turning to
FIG. 7A, an example user interface 700 of an animated takeout box is illustrated according to some implementations. In this example, Kira and Sam exchange messages 701 about ordering delivery food from Tasty Thai. The users also consent to the chatbot accessing their messages to help the users. A food-ordering chatbot places the order and informs the users that the food will arrive in 40 minutes. The first user selects a takeout box animated object 702, which the user interface module 208 displays in the messaging stream. The second user touches the takeout box animated object 702 with a finger 703. The action module 206 detects movement of the second user's finger 703 touching the animated object. -
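Detecting that a finger touch lands on the animated object, as in the takeout-box example above, amounts to hit-testing the touch point against the object's on-screen bounds. The bounding-box representation below is an assumption for illustration:

```python
# Illustrative hit test: a touch counts as touching the animated object
# when the touch point falls inside the object's bounding rectangle.

def touch_hits(touch: tuple, box: tuple) -> bool:
    """touch is (x, y); box is (x, y, width, height) in screen coords."""
    tx, ty = touch
    x, y, w, h = box
    return x <= tx <= x + w and y <= ty <= y + h
```

When the test succeeds, the action module can classify the action type and ask the user interface module to swap in the opened-box animation. -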
FIG. 7B illustrates the example user interface 725 of the animated takeout box after a user performed a user action according to some implementations. Based on the second user touching the takeout box animated object 702 in FIG. 7A, the action module 206 determines the type of action that occurred and instructs the user interface module 208 to modify the takeout box animated object to display an opened takeout box animated object 726. - In some implementations, the action may include a second user selecting a second animated object. The second animated object may be from a subset of animated objects based on its relationship to a first animated object, or it may be selected from a group of all animated objects. In some implementations, the
user interface module 208 modifies a display by showing the first animated object interacting with the second animated object. For example, the first animated object may be a fox with boxing gloves and the second animated object may be a kangaroo with boxing gloves. The user interface module 208 may display the two animated objects fighting with each other. In some implementations, the first user may control the first animated object and the second user may control the second animated object such that the first and second users engage in a boxing match using the two animated objects. - In some implementations where multiple animated objects are displayed in the messaging stream, the animated objects react differently depending on how they were added to the messaging stream. Two animated objects may react differently depending on their proximity to each other and a length of time, such that animated objects that are close to each other react to each other more than animated objects that are far away from each other. For example, when two animated people are next to each other, they look at each other and touch each other. When the two animated people are at opposite sides of the messaging stream, they wave every two minutes but otherwise do not interact.
- The
user interface module 208 generates a user interface. In some implementations, the user interface module 208 includes a set of instructions executable by the processor 235 to generate the user interface. In some implementations, the user interface module 208 is stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235. - In some implementations, the
user interface module 208 receives instructions from the animation module 204 to generate a user interface that includes a messaging stream. The user interface may include a group of animated objects for a first user to choose from. The user interface module 208 may receive a selection from the first user of one of the animated objects. The user interface module 208 displays the animated object in the messaging stream. - The
user interface module 208 receives instructions from the action module 206 to modify a display of the animated object based on a first action from the first user. For example, the user interface module 208 receives instructions to modify an animated object of marbles to show them rolling around the messaging stream based on movement of a computing device associated with the first user. The user interface module 208 also receives instructions from the action module 206 to modify the display of the animated object based on a second action from a second user. For example, the user interface module 208 receives instructions to show the marbles bouncing in the messaging stream based on the second user touching a computing device 200 associated with the second user, to simulate the marbles bouncing within the messaging stream. - In some implementations, the
user interface module 208 provides a user interface that includes interactive features to change the appearance of the animated object. For example, the user interface module 208 provides a user interface that provides a scratchpad for drawing. The scratchpad may include a toolkit with various tools for drawing, such as a pencil, a paintbrush, color options, etc. In another example, the user interface module 208 provides a user interface that includes an interactive keyboard for producing music, beeps, tones, etc. When a user touches a key on the keyboard, the action module 206 detects the touch and instructs the speaker 243 to emit a beep, tone, etc. In yet another example, the user interface module 208 provides a user interface that includes interactive graphics, such as charts, timelines, etc., where the user has options for changing the appearance of the interactive graphics. -
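For the interactive keyboard above, one conventional way to turn a touched key into a tone is the standard equal-temperament formula, where each key is one semitone from its neighbor. The patent does not specify this mapping; the key numbering here is an assumption for illustration:

```python
# Illustrative key-to-tone mapping using equal temperament: frequency
# doubles every 12 semitones, anchored at A4 = 440 Hz.

def key_to_frequency(semitones_from_a4: int) -> float:
    """Return the tone frequency in Hz for a key relative to A4."""
    return 440.0 * (2 ** (semitones_from_a4 / 12))
```

The action module could pass the resulting frequency to the speaker 243 when it detects a key touch. -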
FIG. 8A illustrates an example user interface 800 of a messaging stream that includes an animated airplane according to some implementations. In this example, a first user selects an animated object of an airplane 801, which represents a message to the second user that the first user is about to get on an airplane. The first user swipes across a screen of the smartwatch to cause the user interface module 208 to modify the display of the airplane 801 to show the airplane 801 moving across the screen. The animation module 204 provides the second user with a set of animated objects based on their relationship to the airplane 801. For example, the set of animated objects could be a hand waving, a thumbs up, and two people kissing. The second user selects the animated object of two people kissing. -
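Offering the second user a subset of animated objects related to the first object, as in the airplane example above, could be backed by a simple relationship table. The table contents and fallback behavior below are invented for illustration:

```python
# Illustrative relationship lookup: each first object maps to the subset
# of objects offered in response; unknown objects fall back to the full set.

RELATED = {
    "airplane": ["hand_waving", "thumbs_up", "couple_kissing"],
}

def related_objects(first_object: str, all_objects: list) -> list:
    """Return the subset of animated objects offered to the second user."""
    return RELATED.get(first_object, all_objects)
```

The animation module would consult such a table after the first user places an object, and display the resulting subset to the second user. -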
FIG. 8B illustrates an example user interface 825 of the messaging stream that includes an animated couple 826 that is displayed responsive to a user action related to the animated airplane according to some implementations. In this example, the user action is the second user selecting the animated object of the two people kissing. -
FIG. 9 illustrates a flowchart of an example method 900 to generate a messaging stream that includes an animated object. The method 900 is performed by a messaging application 103 stored on a computing device 200, such as a user device 115, a messaging server 101, or in part a user device 115 and in part a messaging server 101. - At block 902, a messaging stream is generated where one or more messages are exchanged between a first user and a second user. For example, the first user and the second user send messages to each other using an instant messaging platform, via text, via SMS, etc. At block 904, a selection is received of an animated object from the first user for the messaging stream. For example, the first user selects the animated object from a group of animated objects that are displayed on the first user's device. In this example, the animated object is an animated version of a flamingo. The user may customize the animated object by choosing a color for the animated object, a style of eyes for the animated object, etc. At
block 906, the animated object is provided in the messaging stream. For example, the user interface module 208 displays the animated object in a default location, such as the center of the user interface. - At block 908, a first action is received from the first user related to the animated object. For example, the
action module 206 detects the first user blowing into a sensor 245, such as a microphone, of the computing device 200. At block 910, a display of the animated object is modified based on the first action. For example, based on the first user blowing into the computing device 200, the user interface module 208 modifies the display of the animated object to show the flamingo blowing around in the wind with its feathers flying. - At block 912, a second action is received from the second user related to the animated object. For example, the second user shakes the second user's
computing device 200. At block 914, the display of the animated object is modified based on the second action. For example, the display is modified to show the flamingo bouncing up and down with movement corresponding to the shaking of the second user's computing device 200. - In the above description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the specification. It will be apparent, however, to one skilled in the art that the disclosure can be practiced without these specific details. In some instances, structures and devices are shown in block diagram form in order to avoid obscuring the description. For example, the implementations are described above primarily with reference to user interfaces and particular hardware. However, the implementations can apply to any type of computing device that can receive data and commands, and any peripheral devices providing services.
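- The flow of blocks 902 through 914 of method 900 can be summarized in a short sketch. This is an illustrative outline of the sequence only; the state representation and display cues are invented:

```python
# Hypothetical outline of method 900: generate the stream (block 902),
# place the selected animated object (blocks 904-906), then modify the
# display for each user's action as it arrives (blocks 908-914).

def run_method_900(selection, first_action, second_action):
    stream = {"messages": [], "animated_object": None}   # block 902
    stream["animated_object"] = selection                # blocks 904-906
    displays = []
    for action in (first_action, second_action):         # blocks 908-914
        displays.append(f"{selection}:{action}")
    return stream, displays
```

For the flamingo example above, a blow from the first user and a shake from the second user would yield two successive display modifications of the same object.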
- Reference in the specification to “some implementations” or “some instances” means that a particular feature, structure, or characteristic described in connection with the implementations or instances can be included in at least one implementation of the description. The appearances of the phrase “in some implementations” in various places in the specification are not necessarily all referring to the same implementations.
- Some portions of the detailed descriptions above are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic data capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these data as bits, values, elements, symbols, characters, terms, numbers, or the like.
- It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms including “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission, or display devices.
- The implementations of the specification can also relate to a processor for performing one or more steps of the methods described above. The processor may be a special-purpose processor selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory computer-readable storage medium, including, but not limited to, any type of disk including optical disks, ROMs, CD-ROMs, magnetic disks, RAMs, EPROMs, EEPROMs, magnetic or optical cards, flash memories including USB keys with non-volatile memory, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
- The specification can take the form of some entirely hardware implementations, some entirely software implementations or some implementations containing both hardware and software elements. In some implementations, the specification is implemented in software, which includes, but is not limited to, firmware, resident software, microcode, etc.
- Furthermore, the description can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- A data processing system suitable for storing or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
- In situations in which the systems discussed above collect or use personal information, the systems provide users with an opportunity to control whether programs or features collect user information (e.g., information about a user's social network, social actions or activities, profession, a user's preferences, or a user's current location), or control whether and/or how to receive content from the server that may be more relevant to the user. In addition, certain data may be treated in one or more ways before it is stored or used, so that personally identifiable information is removed. For example, a user's identity may be treated so that no personally identifiable information can be determined for the user, or a user's geographic location may be generalized where location information is obtained (such as to a city, ZIP code, or state level), so that a particular location of a user cannot be determined. Thus, the user may have control over how information is collected about the user and used by the server.
Claims (20)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/398,497 US20180188905A1 (en) | 2017-01-04 | 2017-01-04 | Generating messaging streams with animated objects |
KR1020197009598A KR20190053207A (en) | 2017-01-04 | 2017-10-20 | Creation of messaging streams using animated objects |
CN201780061845.3A CN110114789A (en) | 2017-01-04 | 2017-10-20 | Generate the message flow with animation object |
JP2019520091A JP2019537117A (en) | 2017-01-04 | 2017-10-20 | Creating a messaging stream with animated objects |
PCT/US2017/057527 WO2018128663A1 (en) | 2017-01-04 | 2017-10-20 | Generating messaging streams with animated objects |
US16/702,432 US11003322B2 (en) | 2017-01-04 | 2019-12-03 | Generating messaging streams with animated objects |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/398,497 US20180188905A1 (en) | 2017-01-04 | 2017-01-04 | Generating messaging streams with animated objects |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/702,432 Continuation US11003322B2 (en) | 2017-01-04 | 2019-12-03 | Generating messaging streams with animated objects |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180188905A1 true US20180188905A1 (en) | 2018-07-05 |
Family
ID=60263047
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/398,497 Abandoned US20180188905A1 (en) | 2017-01-04 | 2017-01-04 | Generating messaging streams with animated objects |
US16/702,432 Active US11003322B2 (en) | 2017-01-04 | 2019-12-03 | Generating messaging streams with animated objects |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/702,432 Active US11003322B2 (en) | 2017-01-04 | 2019-12-03 | Generating messaging streams with animated objects |
Country Status (5)
Country | Link |
---|---|
US (2) | US20180188905A1 (en) |
JP (1) | JP2019537117A (en) |
KR (1) | KR20190053207A (en) |
CN (1) | CN110114789A (en) |
WO (1) | WO2018128663A1 (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170236318A1 (en) * | 2016-02-15 | 2017-08-17 | Microsoft Technology Licensing, Llc | Animated Digital Ink |
US20180293483A1 (en) * | 2017-04-11 | 2018-10-11 | Microsoft Technology Licensing, Llc | Creating a Conversational Chat Bot of a Specific Person |
USD881923S1 (en) * | 2018-10-18 | 2020-04-21 | Facebook, Inc. | Display panel of a programmed computer system with a graphical user interface |
USD881922S1 (en) * | 2018-10-18 | 2020-04-21 | Facebook, Inc. | Display panel of a programmed computer system with a graphical user interface |
USD881921S1 (en) * | 2018-10-18 | 2020-04-21 | Facebook, Inc. | Display panel of a programmed computer system with a graphical user interface |
US10726603B1 (en) * | 2018-02-28 | 2020-07-28 | Snap Inc. | Animated expressive icon |
USD912076S1 (en) * | 2018-10-16 | 2021-03-02 | Facebook, Inc. | Display screen with graphical user interface |
US10943380B1 (en) * | 2019-08-15 | 2021-03-09 | Rovi Guides, Inc. | Systems and methods for pushing content |
US11003322B2 (en) * | 2017-01-04 | 2021-05-11 | Google Llc | Generating messaging streams with animated objects |
US20210165559A1 (en) * | 2017-11-13 | 2021-06-03 | Snap Inc. | Interface to display animated icon |
US11038832B2 (en) * | 2017-04-07 | 2021-06-15 | International Business Machines Corporation | Response status management in a social networking environment |
US11308110B2 (en) | 2019-08-15 | 2022-04-19 | Rovi Guides, Inc. | Systems and methods for pushing content |
US11402975B2 (en) * | 2020-05-18 | 2022-08-02 | Illuni Inc. | Apparatus and method for providing interactive content |
US20220291815A1 (en) * | 2020-05-20 | 2022-09-15 | Tencent Technology (Shenzhen) Company Limited | Message transmitting method and apparatus, message receiving method and apparatus, device, and medium |
US11470127B2 (en) * | 2020-05-06 | 2022-10-11 | LINE Plus Corporation | Method, system, and non-transitory computer-readable record medium for displaying reaction during VoIP-based call |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD803238S1 (en) * | 2016-06-12 | 2017-11-21 | Apple Inc. | Display screen or portion thereof with graphical user interface |
JP6869216B2 (en) * | 2018-11-07 | 2021-05-12 | スカパーJsat株式会社 | Augmented reality terminal |
CA3031479A1 (en) * | 2019-01-25 | 2020-07-25 | Jonathan Gagne | Computer animation methods and systems |
USD910056S1 (en) * | 2019-07-12 | 2021-02-09 | Google Llc | Display screen with graphical user interface |
JP7442091B2 (en) * | 2020-04-30 | 2024-03-04 | グリー株式会社 | Video distribution device, video distribution method, and video distribution program |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010033298A1 (en) * | 2000-03-01 | 2001-10-25 | Benjamin Slotznick | Adjunct use of instant messenger software to enable communications to or between chatterbots or other software agents |
US20090019117A1 (en) * | 2007-07-09 | 2009-01-15 | Jeffrey Bonforte | Super-emoticons |
US20090158170A1 (en) * | 2007-12-14 | 2009-06-18 | Rajesh Narayanan | Automatic profile-based avatar generation |
US20110218992A1 (en) * | 2010-03-05 | 2011-09-08 | Apple Inc. | Relevancy ranking for map-related search |
US20110248992A1 (en) * | 2010-04-07 | 2011-10-13 | Apple Inc. | Avatar editing environment |
US20110265018A1 (en) * | 2010-04-23 | 2011-10-27 | Ganz | Emotion and mood control of virtual characters in a virtual world |
US20120270578A1 (en) * | 2011-04-21 | 2012-10-25 | Walking Thumbs, LLC. | System and Method for Graphical Expression During Text Messaging Communications |
US8508469B1 (en) * | 1995-12-01 | 2013-08-13 | Immersion Corporation | Networked applications including haptic feedback |
US20140139450A1 (en) * | 2012-11-20 | 2014-05-22 | Immersion Corporation | System and Method for Simulated Physical Interactions With Haptic Effects |
US20150011277A1 (en) * | 2013-07-02 | 2015-01-08 | Kabam, Inc. | System and method for determining in-game capabilities based on device information |
US20160259526A1 (en) * | 2015-03-03 | 2016-09-08 | Kakao Corp. | Display method of scenario emoticon using instant message service and user device therefor |
US20180026925A1 (en) * | 2016-07-19 | 2018-01-25 | David James Kennedy | Displaying customized electronic messaging graphics |
Family Cites Families (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2180899A1 (en) * | 1995-07-12 | 1997-01-13 | Yasuaki Honda | Synchronous updating of sub objects in a three dimensional virtual reality space sharing system and method therefore |
US6219045B1 (en) * | 1995-11-13 | 2001-04-17 | Worlds, Inc. | Scalable virtual world chat client-server system |
US5880731A (en) * | 1995-12-14 | 1999-03-09 | Microsoft Corporation | Use of avatars with automatic gesturing and bounded interaction in on-line chat session |
KR100363952B1 (en) * | 2000-08-08 | 2002-12-12 | 학교법인 인하학원 | A Method for Multimedia Communication on Mobile PCs |
US7925703B2 (en) * | 2000-12-26 | 2011-04-12 | Numedeon, Inc. | Graphical interactive interface for immersive online communities |
US20070113181A1 (en) * | 2003-03-03 | 2007-05-17 | Blattner Patrick D | Using avatars to communicate real-time information |
US20070168863A1 (en) * | 2003-03-03 | 2007-07-19 | Aol Llc | Interacting avatars in an instant messaging communication session |
EP1652392A1 (en) * | 2003-07-31 | 2006-05-03 | Neomtel Co., Ltd. | Method for providing multimedia message |
US7342587B2 (en) * | 2004-10-12 | 2008-03-11 | Imvu, Inc. | Computer-implemented system and method for home page customization and e-commerce support |
US20070180402A1 (en) * | 2006-02-02 | 2007-08-02 | Richard Bassemir | Method for managing dynamic chat objects |
JP4281925B2 (en) * | 2006-06-19 | 2009-06-17 | 株式会社スクウェア・エニックス | Network system |
US9050528B2 (en) * | 2006-07-14 | 2015-06-09 | Ailive Inc. | Systems and methods for utilizing personalized motion control in virtual environment |
US8260263B2 (en) * | 2006-10-13 | 2012-09-04 | Dialogic Corporation | Dynamic video messaging |
US20080109741A1 (en) * | 2006-11-02 | 2008-05-08 | Ripl Corp. | User-generated content with instant-messaging functionality |
WO2008070184A1 (en) * | 2006-12-06 | 2008-06-12 | Prism Technologies Inc. | System, method, and apparatus for data-driven interactive wayfinding and associated services |
US8683353B2 (en) * | 2006-12-12 | 2014-03-25 | Motorola Mobility Llc | Method and system for distributed collaborative communications |
US8239487B1 (en) * | 2007-05-30 | 2012-08-07 | Rocketon, Inc. | Method and apparatus for promoting desired on-line activities using on-line games |
US20080309618A1 (en) * | 2007-06-12 | 2008-12-18 | Kazuyuki Okada | Methods and systems for controlling an input device, for generating collision data, and for controlling a camera angle |
US20090128567A1 (en) * | 2007-11-15 | 2009-05-21 | Brian Mark Shuster | Multi-instance, multi-user animation with coordinated chat |
US8612888B2 (en) * | 2008-04-01 | 2013-12-17 | Litl, Llc | Method and apparatus for managing digital media content |
US9003315B2 (en) * | 2008-04-01 | 2015-04-07 | Litl Llc | System and method for streamlining user interaction with electronic content |
US8531447B2 (en) * | 2008-04-03 | 2013-09-10 | Cisco Technology, Inc. | Reactive virtual environment |
US20090327899A1 (en) * | 2008-06-25 | 2009-12-31 | Steven Bress | Automated Creation of Virtual Worlds for Multimedia Presentations and Gatherings |
US8471843B2 (en) * | 2008-07-07 | 2013-06-25 | International Business Machines Corporation | Geometric and texture modifications of objects in a virtual universe based on real world user characteristics |
KR101956999B1 (en) * | 2008-07-15 | 2019-03-11 | 임머숀 코퍼레이션 | Systems and methods for transmitting haptic messages |
US20100083139A1 (en) * | 2008-09-26 | 2010-04-01 | International Business Machines Corporation | Virtual universe avatar companion |
US8683354B2 (en) * | 2008-10-16 | 2014-03-25 | At&T Intellectual Property I, L.P. | System and method for distributing an avatar |
US8464167B2 (en) * | 2008-12-01 | 2013-06-11 | Palo Alto Research Center Incorporated | System and method for synchronized authoring and access of chat and graphics |
US20100153858A1 (en) * | 2008-12-11 | 2010-06-17 | Paul Gausman | Uniform virtual environments |
US8630961B2 (en) * | 2009-01-08 | 2014-01-14 | Mycybertwin Group Pty Ltd | Chatbots |
US20110021109A1 (en) * | 2009-07-21 | 2011-01-27 | Borei Corporation | Toy and companion avatar on portable electronic device |
KR101686913B1 (en) * | 2009-08-13 | 2016-12-16 | 삼성전자주식회사 | Apparatus and method for providing of event service in a electronic machine |
US8458602B2 (en) * | 2009-08-31 | 2013-06-04 | Ganz | System and method for limiting the number of characters displayed in a common area |
US8924261B2 (en) * | 2009-10-30 | 2014-12-30 | Etsy, Inc. | Method for performing interactive online shopping |
EP2354897A1 (en) * | 2010-02-02 | 2011-08-10 | Deutsche Telekom AG | Around device interaction for controlling an electronic device, for controlling a computer game and for user verification |
CN102906667B (en) * | 2010-04-23 | 2016-11-23 | 意美森公司 | For providing the system and method for haptic effect |
US9634855B2 (en) * | 2010-05-13 | 2017-04-25 | Alexander Poltorak | Electronic personal interactive device that determines topics of interest using a conversational agent |
US8296151B2 (en) * | 2010-06-18 | 2012-10-23 | Microsoft Corporation | Compound gesture-speech commands |
US9525752B2 (en) * | 2010-10-22 | 2016-12-20 | Litl Llc | Method and apparatus for providing contextual services |
WO2013039748A2 (en) * | 2011-09-16 | 2013-03-21 | Social Communications Company | Capabilities based management of virtual areas |
US9645733B2 (en) * | 2011-12-06 | 2017-05-09 | Google Inc. | Mechanism for switching between document viewing windows |
US20140122619A1 (en) * | 2012-10-26 | 2014-05-01 | Xiaojiang Duan | Chatbot system and method with interactive chat log |
US9705829B2 (en) * | 2012-12-07 | 2017-07-11 | Linkedin Corporation | Communication systems and methods |
JP2014194747A (en) * | 2013-02-28 | 2014-10-09 | Canon Inc | Information processor, information processing method and computer program |
US20140300612A1 (en) * | 2013-04-03 | 2014-10-09 | Tencent Technology (Shenzhen) Company Limited | Methods for avatar configuration and realization, client terminal, server, and system |
WO2014201147A1 (en) * | 2013-06-12 | 2014-12-18 | Feghali John C | System and method for action-based input text messaging communication |
US20140372923A1 (en) * | 2013-06-14 | 2014-12-18 | Microsoft Corporation | High Performance Touch Drag and Drop |
US9766773B2 (en) * | 2013-07-31 | 2017-09-19 | Disney Enterprises, Inc. | Dynamic player activity environment response |
US20150058140A1 (en) * | 2013-08-21 | 2015-02-26 | Electronic Arts, Inc. | Systems and methods for in-application offers |
KR102161764B1 (en) * | 2013-10-31 | 2020-10-05 | 삼성전자주식회사 | Method and computer readable recording medium for displaying a communication window of a messenger using a cartoon image |
US9454840B2 (en) * | 2013-12-13 | 2016-09-27 | Blake Caldwell | System and method for interactive animations for enhanced and personalized video communications |
IN2014DE00332A (en) * | 2014-02-05 | 2015-08-07 | Nitin Vats | |
US20170124770A1 (en) * | 2014-03-15 | 2017-05-04 | Nitin Vats | Self-demonstrating object features and/or operations in interactive 3d-model of real object for understanding object's functionality |
US20170080346A1 (en) * | 2014-05-01 | 2017-03-23 | Mohamad Abbas | Methods and systems relating to personalized evolving avatars |
US20150332534A1 (en) * | 2014-05-15 | 2015-11-19 | Narvii Inc. | Systems and methods implementing user interface objects |
US9648062B2 (en) * | 2014-06-12 | 2017-05-09 | Apple Inc. | Systems and methods for multitasking on an electronic device with a touch-sensitive display |
US20160005320A1 (en) * | 2014-07-02 | 2016-01-07 | Christopher deCharms | Technologies for brain exercise training |
CA2898949A1 (en) * | 2014-08-14 | 2016-02-14 | Bassam Sibai | Selling an item in an interactive marketplace over a computer network |
CN105991394A (en) * | 2015-01-28 | 2016-10-05 | 阿里巴巴集团控股有限公司 | Group chat method and device |
CN104615747B (en) * | 2015-02-11 | 2017-05-17 | 腾讯科技(深圳)有限公司 | Information processing method, client side and server |
US20160357407A1 (en) * | 2015-06-04 | 2016-12-08 | Victorious, Inc. | Emotive Ballistics |
KR20170017289A (en) * | 2015-08-06 | 2017-02-15 | 삼성전자주식회사 | Apparatus and method for tranceiving a content |
US11477139B2 (en) * | 2016-02-25 | 2022-10-18 | Meta Platforms, Inc. | Techniques for messaging bot rich communication |
US20170246545A1 (en) * | 2016-02-28 | 2017-08-31 | Sqor, Inc. | Sports app with chat and pushed event information |
US10831802B2 (en) * | 2016-04-11 | 2020-11-10 | Facebook, Inc. | Techniques to respond to user requests using natural-language machine learning based on example conversations |
US20170345079A1 (en) * | 2016-05-31 | 2017-11-30 | Accenture Global Solutions Limited | Network of smart appliances |
US11108708B2 (en) * | 2016-06-06 | 2021-08-31 | Global Tel*Link Corporation | Personalized chatbots for inmates |
US10514822B2 (en) * | 2016-08-24 | 2019-12-24 | Motorola Solutions, Inc. | Systems and methods for text entry for multi-user text-based communication |
US10089793B2 (en) * | 2016-09-02 | 2018-10-02 | Russell Holmes | Systems and methods for providing real-time composite video from multiple source devices featuring augmented reality elements |
US10929917B2 (en) * | 2016-12-16 | 2021-02-23 | Paypal, Inc. | Accessing chat sessions via chat bots for cart generation |
US10796295B2 (en) * | 2016-12-22 | 2020-10-06 | Facebook, Inc. | Processing payment transactions using artificial intelligence messaging services |
US10853716B2 (en) * | 2016-12-27 | 2020-12-01 | Microsoft Technology Licensing, Llc | Systems and methods for a mathematical chat bot |
US20180188905A1 (en) * | 2017-01-04 | 2018-07-05 | Google Inc. | Generating messaging streams with animated objects |
- 2017
- 2017-01-04 US US15/398,497 patent/US20180188905A1/en not_active Abandoned
- 2017-10-20 KR KR1020197009598A patent/KR20190053207A/en not_active Application Discontinuation
- 2017-10-20 JP JP2019520091A patent/JP2019537117A/en not_active Withdrawn
- 2017-10-20 WO PCT/US2017/057527 patent/WO2018128663A1/en active Application Filing
- 2017-10-20 CN CN201780061845.3A patent/CN110114789A/en active Pending
- 2019
- 2019-12-03 US US16/702,432 patent/US11003322B2/en active Active
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170236318A1 (en) * | 2016-02-15 | 2017-08-17 | Microsoft Technology Licensing, Llc | Animated Digital Ink |
US11003322B2 (en) * | 2017-01-04 | 2021-05-11 | Google Llc | Generating messaging streams with animated objects |
US11038832B2 (en) * | 2017-04-07 | 2021-06-15 | International Business Machines Corporation | Response status management in a social networking environment |
US20180293483A1 (en) * | 2017-04-11 | 2018-10-11 | Microsoft Technology Licensing, Llc | Creating a Conversational Chat Bot of a Specific Person |
US10853717B2 (en) * | 2017-04-11 | 2020-12-01 | Microsoft Technology Licensing, Llc | Creating a conversational chat bot of a specific person |
US11775134B2 (en) * | 2017-11-13 | 2023-10-03 | Snap Inc. | Interface to display animated icon |
US20210165559A1 (en) * | 2017-11-13 | 2021-06-03 | Snap Inc. | Interface to display animated icon |
US10726603B1 (en) * | 2018-02-28 | 2020-07-28 | Snap Inc. | Animated expressive icon |
US11468618B2 (en) | 2018-02-28 | 2022-10-11 | Snap Inc. | Animated expressive icon |
US11880923B2 (en) | 2018-02-28 | 2024-01-23 | Snap Inc. | Animated expressive icon |
US11688119B2 (en) | 2018-02-28 | 2023-06-27 | Snap Inc. | Animated expressive icon |
US11120601B2 (en) | 2018-02-28 | 2021-09-14 | Snap Inc. | Animated expressive icon |
USD912076S1 (en) * | 2018-10-16 | 2021-03-02 | Facebook, Inc. | Display screen with graphical user interface |
USD881922S1 (en) * | 2018-10-18 | 2020-04-21 | Facebook, Inc. | Display panel of a programmed computer system with a graphical user interface |
USD881923S1 (en) * | 2018-10-18 | 2020-04-21 | Facebook, Inc. | Display panel of a programmed computer system with a graphical user interface |
USD881921S1 (en) * | 2018-10-18 | 2020-04-21 | Facebook, Inc. | Display panel of a programmed computer system with a graphical user interface |
US11308110B2 (en) | 2019-08-15 | 2022-04-19 | Rovi Guides, Inc. | Systems and methods for pushing content |
US10943380B1 (en) * | 2019-08-15 | 2021-03-09 | Rovi Guides, Inc. | Systems and methods for pushing content |
US11470127B2 (en) * | 2020-05-06 | 2022-10-11 | LINE Plus Corporation | Method, system, and non-transitory computer-readable record medium for displaying reaction during VoIP-based call |
US11792241B2 (en) | 2020-05-06 | 2023-10-17 | LINE Plus Corporation | Method, system, and non-transitory computer-readable record medium for displaying reaction during VoIP-based call |
US11402975B2 (en) * | 2020-05-18 | 2022-08-02 | Illuni Inc. | Apparatus and method for providing interactive content |
US20220291815A1 (en) * | 2020-05-20 | 2022-09-15 | Tencent Technology (Shenzhen) Company Limited | Message transmitting method and apparatus, message receiving method and apparatus, device, and medium |
Also Published As
Publication number | Publication date |
---|---|
JP2019537117A (en) | 2019-12-19 |
WO2018128663A1 (en) | 2018-07-12 |
US20200104017A1 (en) | 2020-04-02 |
KR20190053207A (en) | 2019-05-17 |
US11003322B2 (en) | 2021-05-11 |
CN110114789A (en) | 2019-08-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11003322B2 (en) | Generating messaging streams with animated objects | |
US10169897B1 (en) | Systems and methods for character composition | |
US11438288B2 (en) | Displaying customized electronic messaging graphics | |
US20220284650A1 (en) | Animated chat presence | |
CN110945840B (en) | Method and system for providing embedded application associated with messaging application | |
CN117749754A (en) | Generating and displaying custom avatars in media overlays | |
CN117669605A (en) | Parsing electronic conversations for presentation in alternative interfaces | |
CN115641424A (en) | Augmented reality object manipulation | |
US20220197027A1 (en) | Conversation interface on an eyewear device | |
US20220198603A1 (en) | Recentering ar/vr content on an eyewear device | |
US20230400965A1 (en) | Media content player on an eyewear device | |
EP4268060A1 (en) | Recentering ar/vr content on an eyewear device | |
US11583779B2 (en) | Message interface expansion system | |
KR20230119005A (en) | Gesture control on eyewear devices | |
KR20230119004A (en) | Conversational interface on eyewear devices | |
US20220206924A1 (en) | Accessing third party resources via client application with messaging capability | |
US11893166B1 (en) | User avatar movement control using an augmented reality eyewear device | |
US20240104789A1 (en) | Text-guided cameo generation | |
US20240070950A1 (en) | Avatar call on an eyewear device | |
US20240119679A1 (en) | External screen streaming for an eyewear device | |
US20240111391A1 (en) | Presenting extended reality content in different physical environments | |
WO2024044184A1 (en) | External computer vision for an eyewear device | |
CN116635771A (en) | Conversational interface on eyewear device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TRAN, THANH;WILCOX, ERIC;REEL/FRAME:040867/0687 Effective date: 20170104 |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044567/0001 Effective date: 20170929 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |