US20220300132A1 - Facilitating the editing of multimedia as part of sending the multimedia in a message - Google Patents

Facilitating the editing of multimedia as part of sending the multimedia in a message

Info

Publication number
US20220300132A1
US20220300132A1
Authority
US
United States
Prior art keywords
content item
multimedia content
editing
preview
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/806,230
Inventor
Benjamin S. Langholz
William McMillan Tyler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Platforms Inc
Original Assignee
Meta Platforms Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meta Platforms Inc filed Critical Meta Platforms Inc
Priority to US17/806,230
Assigned to META PLATFORMS, INC. reassignment META PLATFORMS, INC. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: FACEBOOK, INC.
Assigned to FACEBOOK, INC. reassignment FACEBOOK, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LANGHOLZ, BENJAMIN S., TYLER, WILLIAM MCMILLAN
Publication of US20220300132A1

Classifications

    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886: Touch-screen interaction by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/16: Sound input; sound output
    • G06T11/001: Texturing; colouring; generation of texture or colour
    • H04L51/02: User-to-user messaging using automatic reactions or user delegation, e.g. automatic replies or chatbot-generated messages
    • H04L51/066: Format adaptation, e.g. format conversion or compression
    • H04L51/10: Messages characterised by the inclusion of multimedia information
    • H04L51/216: Handling conversation history, e.g. grouping of messages in sessions or threads
    • H04L51/42: Mailbox-related aspects, e.g. synchronisation of mailboxes
    • H04L51/52: User-to-user messaging for supporting social networking services
    • H04N23/62: Control of camera parameters via user interfaces
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N5/23216; H04N5/23293 (legacy camera-control codes)
    • H04N7/142: Constructional details of videophone terminal equipment, e.g. arrangements of the camera and the display
    • G06F2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G06T2200/24: Image data processing or generation involving graphical user interfaces [GUIs]
    • H04L51/04: Real-time or near real-time messaging, e.g. instant messaging [IM]

Definitions

  • One or more embodiments relate generally to electronic messaging systems and methods. More specifically, one or more embodiments relate to systems and methods for increasing functionality in an electronic messaging system.
  • Computing devices provide numerous ways for people to connect and communicate with one another.
  • A variety of electronic messaging systems provide various methods to send and receive electronic messages.
  • A computing device can allow a user to communicate with other users using text messaging, instant messaging, social network posting, and other forms of electronic communication.
  • An electronic communication may include a variety of content including text, images, video, audio, and/or other multimedia.
  • Electronic communication has become a popular way for people to connect and communicate with one another.
  • Including multimedia in electronic communications has become an especially popular way to add humor, context, and information to an electronic communication session. For example, a user may send a digital photograph to a co-user indicating his location. Similarly, a user may send a video laughing in response to an electronic message that includes a joke.
  • Including multimedia in an electronic communication is an easy way to add a layer of expression to an electronic communication session that is typically difficult with only textual messages.
  • A user is typically not able to easily edit multimedia for inclusion in an electronic communication session. For example, a user may wish to edit light contrast, color saturation, or some other characteristic in a digital photograph and then include the edited digital photograph in an electronic communication.
  • In order to include edited multimedia in an electronic communication, a user first must edit the multimedia using software specifically dedicated to that purpose. Thus, a user is typically not able to edit multimedia without navigating away from the electronic communication session. This adds extra steps and hassle to the process of composing an electronic communication.
  • One or more embodiments provide benefits and/or solve one or more of the foregoing and other problems in the art with methods and systems that provide enhanced functionality for electronic messaging systems.
  • Methods and systems described herein allow users greater functionality for including multimedia content items in electronic messages.
  • One or more embodiments can provide the foregoing or other benefits through an intuitive user interface.
  • Systems and methods of one or more embodiments allow a user to select an existing multimedia content item for inclusion in an electronic message without navigating away from a communication thread.
  • A user interface displays both a communication thread with electronic messages sent between co-users and a collection of stored multimedia content items.
  • A user may browse and select a stored multimedia content item without navigating away from the communication thread.
  • Systems and methods of one or more embodiments allow a user to easily edit multimedia content items for inclusion in an electronic message.
  • A user may edit a digital photograph or video without having to utilize a separate piece of software. This provides the user with a more intuitive and streamlined way to include edited multimedia content items in an electronic message.
  • FIG. 1 illustrates a schematic diagram of an electronic messaging system in accordance with one or more embodiments.
  • FIG. 2 illustrates a block diagram of an environment for implementing the system of FIG. 1 in accordance with one or more embodiments.
  • FIGS. 3A-3I illustrate user interfaces for selecting, editing, and sending a multimedia content item in accordance with one or more embodiments.
  • FIGS. 4A-4E illustrate user interfaces for selecting, editing, and sending a multimedia content item in accordance with one or more additional embodiments.
  • FIG. 5 illustrates a flowchart of a series of acts in a method of selecting and including multimedia content items in an electronic message in accordance with one or more embodiments.
  • FIG. 6 illustrates a flowchart of a series of acts in another method of selecting and including multimedia content items in an electronic message in accordance with one or more embodiments.
  • FIG. 7 illustrates a block diagram of an exemplary computing device in accordance with one or more embodiments.
  • FIG. 8 is an example network environment of a social networking system in accordance with one or more embodiments.
  • One or more embodiments include an electronic messaging system that provides users with efficient and effective user experiences when sending electronic communications including multimedia content. More specifically, one or more embodiments described herein allow users to easily and intuitively select multimedia content for inclusion in an electronic communication. For example, one or more embodiments allow a user to select a multimedia content item for inclusion in an electronic message without navigating away from a communication thread.
  • One or more embodiments allow a user to browse and select a multimedia content item from a gallery of selectable multimedia content items without navigating away from the communication thread.
  • One or more embodiments display a graphical user interface that includes a communication thread with electronic messages sent between co-users, as well as a display area or gallery of selectable multimedia content items.
  • One or more embodiments allow a user to receive and read messages while simultaneously browsing multimedia content items.
  • The display area or gallery includes a preview of multimedia content items likely to be selected by the user.
  • The display area or gallery can include a predetermined number of the most recent multimedia content items.
  • The display area or gallery can include multimedia content items related to a participant in a communication session, content items related to a topic of the session, content items most often sent as messages, or content items selected by other criteria.
  • The electronic messaging system can modify a preview of multimedia content items to aid in viewing and selection of the content items.
  • The electronic messaging system can crop content items based on the size of the display area or gallery.
  • The electronic messaging system can auto-play videos in the display area or gallery.
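The preview-cropping behavior described above can be sketched as follows. This is an illustrative example only: the function name and the square preview cell are assumptions, since the patent leaves the exact crop geometry to the size of the display area.

```python
def center_crop_to_square(width, height):
    """Compute the largest centered square crop of a width x height item,
    returned as a (left, top, right, bottom) box in pixel coordinates.
    A square preview cell is assumed purely for illustration; a real
    gallery would derive the target aspect ratio from its display area."""
    side = min(width, height)
    left = (width - side) // 2
    top = (height - side) // 2
    return (left, top, left + side, top + side)

# A 400x300 (landscape) photo keeps its full height and trims the sides:
# center_crop_to_square(400, 300) -> (50, 0, 350, 300)
```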
  • The electronic messaging system can allow a user to easily and effectively edit a multimedia content item for inclusion in a message.
  • The electronic messaging system can allow a user to start an editing process or edit a multimedia content item in the preview area or gallery without navigating away from the electronic messaging system.
  • One or more embodiments allow a user to edit a multimedia content item without having to utilize separate software outside of the electronic messaging system.
  • The electronic messaging system provides the preview area or gallery of multimedia content items below a communication thread.
  • The electronic messaging system allows a user to horizontally scroll through the multimedia content items.
  • The electronic messaging system can provide options to edit the multimedia content item or send it as a message. If the edit option is selected, the electronic messaging system can make the multimedia content item available for editing.
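The edit-or-send choice described above amounts to a small dispatch on the selected option. The sketch below is hypothetical; the callback names stand in for the messaging system's real send and edit entry points, which the patent does not name.

```python
def handle_preview_selection(item, option, send_message, open_editor):
    """Dispatch a selected preview to the 'send' or 'edit' path.
    `send_message` and `open_editor` are hypothetical callbacks standing
    in for the system's actual send and edit handlers."""
    if option == "send":
        return send_message(item)       # attach the item to the message
    if option == "edit":
        return open_editor(item)        # make the item available for editing
    raise ValueError(f"unknown option: {option!r}")
```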
  • FIG. 1 illustrates an example embodiment of an electronic messaging system 100 .
  • The electronic messaging system 100 may include, but is not limited to, a user interface manager 102 (or simply “UI manager”), a user input detector 104, a content item manager 106, a communication manager 108, and a data storage 110.
  • Each of the components 102-110 of the electronic messaging system 100 may be in communication with one another using any suitable communication technologies.
  • Although the disclosure herein shows the components 102-110 to be separate in FIG. 1, any of the components 102-110 may be combined into fewer components, such as into a single facility or module, or divided into more components as may serve one or more embodiments.
  • The components 102-110 may be located on, or implemented by, one or more computing devices, such as those described below in relation to FIG. 7.
  • Portions of the electronic messaging system 100 can be located on a computing device, while other portions of the electronic messaging system 100 are located on, or form part of, a social networking system, such as that described below in reference to FIG. 8.
  • The components 102-110 can comprise software, hardware, or both.
  • The components 102-110 can comprise one or more instructions stored on a computer readable storage medium and executable by a processor of one or more computing devices. When executed by the one or more processors, the computer-executable instructions of the electronic messaging system 100 can cause one or more computing devices to perform the methods described herein.
  • The components 102-110 can comprise hardware, such as a special-purpose processing device to perform a certain function. Additionally or alternatively, the components 102-110 can comprise a combination of computer-executable instructions and hardware.
  • The electronic messaging system 100 can include a user interface manager 102.
  • The user interface manager 102 provides, manages, updates, and/or controls graphical user interfaces (or simply “user interfaces”) that allow a user to view and interact with display elements.
  • The user interface manager 102 may identify, display, update, or otherwise provide various user interfaces that contain one or more display elements in various layouts.
  • The user interface manager 102 can display a variety of display elements within a graphical user interface.
  • The user interface manager 102 may display a graphical user interface on a display of a computing device.
  • Display elements include, but are not limited to: buttons, text boxes, menus, thumbnails, scroll bars, hyperlinks, etc.
  • The user interface manager 102 can display and format display elements in any one of a variety of layouts.
  • The user interface manager 102 can also update, remove, resize, or reposition display elements in response to user interactions.
  • The electronic messaging system 100 may detect user input in a variety of ways. For instance, in one or more embodiments, detected user input may cause the user interface manager 102 to update a graphical user interface based on that input. Similarly, detected user input may cause the user interface manager 102 to resize one or more display elements, to reposition one or more display elements within the graphical user interface, or to otherwise change or remove one or more display elements within the graphical user interface.
  • The user interface manager 102 can selectively update certain areas of a user interface in response to user interactions. For example, detected user input may cause the user interface manager 102 to update or change only one area of a graphical user interface. Upon a detected user interaction, the user interface manager 102 may update one area within a user interface from one type of display to a second type of display, while continuing to display another area within the user interface with no updates.
  • The user interface manager 102 can reorganize a user interface in response to user interactions. For example, detected user input may cause the user interface manager 102 to split a graphical user interface into two or more areas. Upon a detected user interaction, the user interface manager 102 may reorganize a user interface from displaying only one area with a first collection of display elements to displaying two areas, with the first collection of display elements in the first area and a second collection of display elements in the second area. Likewise, the user interface manager 102 may also consolidate or remove areas within a graphical user interface in response to detected user interactions.
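The split-and-consolidate behavior described above can be modeled as a tiny state machine. The sketch below is a toy illustration only; the class and area names are assumptions, not API defined by the patent.

```python
class SplitView:
    """Toy model of the reorganization described above: an interface that
    starts as a single message-thread area and can be split to also show
    a gallery area, then consolidated back. Names are illustrative."""

    def __init__(self):
        self.areas = ["thread"]           # one area with the first collection of elements

    def open_gallery(self):
        # Split the interface into two areas: thread plus gallery.
        if "gallery" not in self.areas:
            self.areas.append("gallery")

    def close_gallery(self):
        # Consolidate back to a single thread area.
        self.areas = [a for a in self.areas if a != "gallery"]
```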
  • The electronic messaging system 100 may further include a user input detector 104.
  • The user input detector 104 detects, receives, and/or facilitates user input in any suitable manner.
  • The user input detector 104 detects one or more user interactions.
  • A “user interaction” means a single input, or combination of inputs, received from a user by way of one or more input devices or via one or more touch gestures as described above.
  • A user interaction can have variable duration and may take place anywhere on the graphical user interface managed by the user interface manager 102 described above.
  • The user input detector 104 can detect a user interaction from a keyboard, mouse, touch screen display, or any other input device.
  • The user input detector 104 can detect one or more touch gestures that form a user interaction (e.g., tap gestures, swipe gestures, pinch gestures, etc.) provided by a user by way of the touch screen.
  • The user input detector 104 can detect touch gestures in relation to and/or directed at one or more display elements displayed as part of the graphical user interface presented on the touch screen display.
  • The user input detector 104 may report any detected touch gesture in relation to and/or directed at one or more display elements to the user interface manager 102.
  • The user input detector 104 may additionally, or alternatively, receive data representative of a user interaction.
  • The user input detector 104 may receive one or more user-configurable parameters from a user, one or more user commands from the user, and/or any other suitable user input.
  • The user input detector 104 can receive voice commands or otherwise sense, detect, or receive user input.
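A minimal sketch of the gesture detection described above is a classifier over a touch's travel distance and duration. The thresholds and gesture names below are illustrative guesses, not values from the patent.

```python
def classify_gesture(start, end, duration_s, tap_max_dist=10.0, tap_max_s=0.3):
    """Classify a touch as a tap or a directional swipe from its travel
    distance and duration. Thresholds are illustrative, not specified by
    the patent. Coordinates are (x, y) with y growing downward, as is
    conventional for screen space."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance <= tap_max_dist and duration_s <= tap_max_s:
        return "tap"
    if abs(dx) >= abs(dy):                    # mostly horizontal movement
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"
```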
  • The electronic messaging system 100 may further include a content item manager 106.
  • The content item manager 106 manages multimedia content items (such as digital files), tracks recent multimedia content items, creates previews of multimedia content items, manages edits to multimedia content items, and otherwise handles actions affecting multimedia content items. For example, in one or more embodiments, the content item manager 106 determines the most recent content items stored on a system, creates or retrieves a preview of each of the recent content items, and presents the previews of the recent content items as part of a user interface.
  • The electronic messaging system 100 may be implemented on a computing device with data storage.
  • The content item manager 106 searches the data storage of the computing device for multimedia content items suitable for inclusion in a communication session.
  • The content item manager 106 may search the data storage of the computing device for digital photographs, digital videos, and/or sound recordings.
  • The content item manager 106 may create a linked list linking to the multimedia files stored in the data storage of the computing device.
  • The content item manager 106 may create a copy of the multimedia content items stored in the data storage of the computing device.
  • The content item manager 106 may create or retrieve a preview of each multimedia content item. For example, in one or more embodiments, the content item manager 106 may create a preview of a digital photograph by cropping the multimedia content item based on an aspect ratio of a user interface. For instance, the content item manager 106 may crop a rectangular preview of a digital photograph and/or digital video such that the preview is square. Additionally, in one or more embodiments, the content item manager 106 may create a preview of a digital video that includes a portion of the digital video that automatically plays within a portion of the user interface.
  • The content item manager 106 may search files stored on the computing device for only the most recent multimedia content items. For example, in one or more embodiments, the content item manager 106 may identify the ten most recent multimedia content items stored on the computing device. Alternatively, the content item manager 106 may identify a percentage of the most recent multimedia content items stored on the computing device. In one or more embodiments, the content item manager 106 may determine the recentness of a multimedia content item by identifying a timestamp associated with the multimedia content item that indicates when it was created or added to the computing device. The content item manager 106 may then compare the identified timestamp to the timestamps of other multimedia content items. In one or more alternative embodiments, the number or percentage of recent multimedia content items identified by the content item manager 106 may be configurable by the user or can be a set predetermined number.
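The timestamp comparison described above can be sketched as a sort-and-take-n over the files in a storage directory. Using the filesystem modification time as the "timestamp" is an assumption for illustration; the patent only says a creation/added timestamp is compared. The function name and extension list are likewise hypothetical.

```python
import os

def most_recent_items(directory, extensions=(".jpg", ".png", ".mp4"), n=10):
    """Return paths of the n most recently modified multimedia files in
    `directory`, newest first. Filesystem mtime stands in for the
    creation/added timestamp the patent describes."""
    candidates = [
        os.path.join(directory, name)
        for name in os.listdir(directory)
        if name.lower().endswith(extensions)
    ]
    candidates.sort(key=os.path.getmtime, reverse=True)
    return candidates[:n]
```

The default of ten items mirrors the "ten most recent" example above; in a real system `n` would be user-configurable or predetermined.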
  • the content item manager 106 may manage editing of multimedia content items. For example, in one or more embodiments, the content item manager 106 may create a copy of a multimedia content items, and present the copy for editing. In one or more embodiments, the content item manager 106 may track edits made to the copy of the multimedia content items and provide the edited copy for sending to one or more co-users. For instance, after a user selects a preview from the user interface, the content item manager 106 may create a copy of the multimedia content items associated with the preview and present the copy of the multimedia content items e to the user for editing. In one or more embodiments, the content item manager 106 may track edits to the copy of the multimedia content items and provide the edited copy of the multimedia content items for sending to one or more co-users as an electronic communication.
  • the content item manager 106 may provide the original multimedia content item for editing, rather than a copy of the multimedia content item, as discussed above. For example, in one or more embodiments, after a user selects a preview from the user interface, the content item manager 106 may present the multimedia content item associated with the preview to the user for editing. In that embodiment, the content item manager 106 may track edits to the multimedia content item and provide the edited multimedia content item for sending to one or more co-users as an electronic communication. Accordingly, in some embodiments, the content item manager 106 stores edits made to the multimedia content item within the data storage of a computing device, such that the edited multimedia content item is available to other applications on the computing device. In one or more alternative embodiments, as described above, the content item manager 106 discards edits made to a copy of the multimedia content item once the copy has been provided for sending to one or more co-users of the communication system.
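The two editing modes described above (editing a throwaway copy versus editing the stored original in place) can be sketched as follows. The class and parameter names are illustrative assumptions; the patent does not specify an implementation.

```python
import copy

class ContentItemManager:
    """Sketch of the two editing modes: a deep copy whose edits are
    discarded after sending, or the original, whose edits persist."""

    def __init__(self, edit_copy=True):
        self.edit_copy = edit_copy

    def open_for_editing(self, item):
        # Copy mode leaves the stored original untouched; in-place
        # mode hands back the original so edits persist on the device.
        return copy.deepcopy(item) if self.edit_copy else item
```

In copy mode, edits made on the returned object never reach the stored item, which matches the session-scoped editing embodiment; in-place mode matches the embodiment where edits remain visible to other applications.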
  • the content item manager 106 may enable a variety of edits to be performed in connection with either a multimedia content item or a copy of a multimedia content item, as described above.
  • the content item manager 106 may allow for edits such as altering the color contrast, altering brightness, altering sharpness, altering clarity, converting color (e.g., color to sepia, color to black and white), adding text or image overlays, adding blended image effects, or any other type of edit suitable to be made to a digital photograph and/or digital video.
  • the content item manager 106 may allow for edits such as adding scene transitions, adding front or end credits, shortening or lengthening the runtime of the video, adding a sound track, or any other type of edit suitable to be made in connection with a digital video.
  • for sound recordings, the content item manager 106 may allow for edits such as altering volume, adding reverb, adding sound effects, concatenating additional recordings, or any other type of edit suitable to be made in connection with a sound recording.
  • the electronic messaging system 100 may further include a communication manager 108 .
  • the communication manager 108 can facilitate receiving and sending data to and from the electronic messaging system 100 , or a device upon which the electronic messaging system 100 is implemented.
  • the communication manager 108 can instruct or activate one or more communication interfaces of a computing device, as described below to send or receive data, particularly data related to electronic communications.
  • the communication manager 108 can package or format content items to be sent or received from the electronic messaging system 100 in any necessary form that is able to be sent through one or more communication channels and using an appropriate communication protocol, as described further below with reference to FIG. 7 .
  • the electronic messaging system 100 can include a data storage 110 , as illustrated in FIG. 1 .
  • the data storage 110 may maintain content item data 112 representative of data associated with multimedia content items available for inclusion as an electronic communication.
  • the content item data 112 may include, but is not limited to: digital photographs, digital videos, recordings of sound inputs, as well as other data representing edits to the digital photographs, digital videos, and/or recordings of sound inputs.
  • FIG. 2 is a schematic diagram illustrating an example system 200 , within which one or more embodiments of the electronic messaging system 100 can be implemented.
  • the system 200 can include computing devices 202 , 204 , a network 206 , and a communication server 208 .
  • the computing devices 202 , 204 , the network 206 , and the communication server 208 may be communicatively coupled, as shown in FIG. 2 .
  • although FIG. 2 illustrates a particular arrangement of the computing devices 202 , 204 , the network 206 , and the communication server 208 , various additional arrangements are possible.
  • the computing devices 202 , 204 may directly communicate with the communication server 208 , bypassing the network 206 , or alternatively, may directly communicate with each other.
  • the computing devices 202 , 204 , the network 206 , and the communication server 208 may communicate using any communication platforms and technologies suitable for transporting data and/or communication signals.
  • the computing devices 202 , 204 , the network 206 , and the communication server 208 may communicate via any known communication technologies, devices, media, and protocols supportive of remote data communications, examples of which will be described in more detail below with respect to FIG. 7 .
  • the computing devices 202 , 204 , and the communication server 208 may communicate via the network 206 , which may include one or more social networks as described further below with respect to FIG. 8 .
  • the communication server 208 may generate, store, receive, and transmit electronic communication data.
  • the communication server 208 may receive an electronic communication from the computing device 202 and send the received electronic communication to the computing device 204 .
  • the communication server 208 can transmit electronic messages between one or more users of the system 200 .
  • the communication server 208 can receive a wide range of electronic communication types, including but not limited to, text messages, instant messages, social-networking messages, social-networking posts, emails, and any other form of electronic communication. Additional details regarding the communication server 208 will be discussed below with respect to FIG. 7 .
  • the network 206 may represent a network or collection of networks (such as the Internet, a corporate intranet, a virtual private network (VPN), a local area network (LAN), a wireless local area network (WLAN), a cellular network, a wide area network (WAN), a metropolitan area network (MAN), or a combination of two or more such networks).
  • the network 206 may be any suitable network over which the computing device 202 may access the communication server 208 and/or the computing device 204 , or vice versa.
  • the network 206 will be discussed in more detail below with regard to FIGS. 7 and 8 .
  • FIG. 2 illustrates that a user 210 can be associated with the computing device 202 , and that a user 212 can be associated with the computing device 204 .
  • the system 200 can include a large number of users, with each of the users interacting with the system 200 through one or more computing devices.
  • the user 210 can interact with the computing device 202 for the purpose of composing and sending an electronic communication (e.g., an instant message).
  • the user 210 may interact with the computing device 202 by way of a user interface, managed by the user interface manager 102 , on the computing device 202 .
  • the user 210 can utilize the user interface to cause the computing device 202 to compose and send an electronic communication to one or more of the plurality of users of the system 200 .
  • the components 102 - 110 may be implemented on one or more of the computing devices 202 , 204 and the communication server 208 .
  • the computing devices 202 , 204 , and the communication server 208 may communicate across the network 206 via the communication manager 108 of the electronic messaging system 100 .
  • the computing devices 202 , 204 may receive user inputs via the user input detector 104 .
  • the computing devices 202 , 204 may provide graphical user interfaces via the user interface manager 102 .
  • each of the computing devices 202 , 204 can include an instance of the electronic messaging system 100 .
  • each of the components 102 - 110 of the electronic messaging system 100 as described with regard to FIGS. 1 and 2 can provide, alone and/or in combination with the other components of the electronic messaging system 100 , one or more graphical user interfaces.
  • the components 102 - 110 can allow a user to interact with a collection of display elements for a variety of purposes.
  • FIGS. 3A-4E and the description that follows illustrate various example embodiments of the user interfaces and features that are in accordance with general principles as described above.
  • a computing device (i.e., computing device 202 , 204 of FIG. 2 ) can implement one or more components of the electronic messaging system 100 .
  • FIG. 3A illustrates a computing device 300 that may implement one or more of the components 102 - 110 of the electronic messaging system 100 .
  • the computing device 300 is a handheld device, such as a mobile phone device (e.g., a smartphone).
  • the term “handheld device” refers to a device sized and configured to be held/operated in a single hand of a user.
  • any other suitable computing device, such as, but not limited to, a tablet device, a handheld device, a larger wireless device, a laptop or desktop computer, or a personal digital assistant device, can perform one or more of the processes and/or operations described herein.
  • the computing device 300 can include any of the features and components described below in reference to a computing device 700 of FIG. 7 .
  • the computing device 300 includes a touch screen display 302 that can display or provide user interfaces and by way of which user input may be received and/or detected.
  • a “touch screen display” refers to the display of a touch screen device.
  • a touch screen device may be a computing device 202 , 204 with at least one surface upon which a user 210 , 212 may perform touch gestures (e.g., a laptop, a tablet computer, a personal digital assistant, a media player, a mobile phone).
  • the computing device 300 may include any other suitable input device, such as a touch pad or those described below in reference to FIG. 7 .
  • FIG. 3A illustrates a touch screen display 302 of the computing device 300 displaying one embodiment of a graphical user interface, in particular a messaging graphical user interface 304 .
  • the user interface manager 102 provides various display areas and display elements as part of the messaging graphical user interface 304 .
  • the messaging graphical user interface 304 includes a communication thread 306 , as well as a message input control palette or toolbar 310 .
  • the communication manager 108 can facilitate receiving and sending data. In one or more embodiments, the communication manager 108 facilitates receiving and sending of electronic communications between the computing devices 202 , 204 . Also in one or more embodiments, the user interface manager 102 displays electronic communications sent and received via the communication manager 108 . In one or more embodiments, the user interface manager 102 can display electronic communications sent and received via the communication manager 108 in the communication thread 306 within the messaging graphical user interface 304 .
  • the user interface manager 102 provides the communication thread 306 that includes electronic messages 308 a sent from an account of a user of the communication device 300 .
  • the communication thread 306 can include electronic messages 308 b received by the account of the user of the computing device 300 .
  • the user interface manager 102 organizes the communication thread 306 such that new messages are added to the bottom of the communication thread 306 so that older messages are displayed at the top of the communication thread 306 .
  • the user interface manager 102 may organize the messages 308 a , 308 b in any manner that may indicate to a user the chronological or other relationship between the messages 308 a , 308 b.
  • the user interface manager 102 provides a variety of electronic communication characteristics to help a user distinguish between electronic communications in the communication thread 306 .
  • the user interface manager 102 displays the electronic messages 308 a sent from an account of the user of the computing device 300 pointed toward one side (i.e., the right side) of the messaging graphical user interface 304 .
  • the user interface manager 102 displays the electronic messages 308 b received by the communication manager 108 pointed toward the opposite side (i.e., the left side) of the messaging graphical user interface 304 .
  • the positioning and orientation of the electronic messages 308 a , 308 b provides a clear indicator to a user of the computing device 300 of the origin of the various electronic communications displayed within the messaging graphical user interface 304 .
  • Another characteristic provided by the user interface manager 102 that helps a user distinguish electronic communications may be a color of the electronic communications.
  • the user interface manager 102 displays sent electronic messages 308 a in a first color and received electronic messages 308 b in a second color.
  • the first and second colors may be black and white, respectively, with an inverted typeface color.
  • the user interface manager 102 may display the electronic messages 308 a , 308 b with white backgrounds and different colored outlines.
  • the user interface manager 102 may display the electronic messages 308 a , 308 b with backgrounds of different patterns, in different fonts, in different sizes or in any other manner that may distinguish the sent electronic messages 308 a from the received electronic messages 308 b .
  • the user interface manager 102 displays sent electronic messages 308 a with white typeface on a blue background.
  • the user interface manager 102 displays received electronic messages 308 b with black typeface on a grey background.
  • the user interface manager 102 may also provide a message input control palette or toolbar 310 .
  • the user interface manager 102 displays the message input control palette or toolbar 310 as part of the messaging graphical user interface 304 .
  • the message input control palette or toolbar 310 includes a variety of selectable message input controls that provide a user with various message input options or other options.
  • the message input control palette or toolbar 310 includes a text input control 312 a , a photo or video input control 312 b , a multimedia input control 312 c , a symbol input control 312 d , and a sound input control 312 e .
  • the message input control palette or toolbar 310 may provide the input controls 312 a - 312 e in a different order, may provide other input controls not displayed in FIG. 3A , or may omit one or more of the input controls 312 a - 312 e shown in FIG. 3A .
  • a user may interact with any of the input controls 312 a - 312 e in order to compose and send different types of electronic communications.
  • the user interface manager 102 may provide a touch screen display keyboard in a portion of the messaging graphical user interface 304 that the user may utilize to compose a textual message.
  • the user interface manager 102 may provide a camera viewfinder interface within a portion of the messaging graphical user interface 304 that the user may utilize to add a photo to the communication thread 306 .
  • the user interface manager 102 may provide a sound recording control by way of which the user can record a voice or other sound message. Likewise, as will be described in more detail below, if a user interacts with the multimedia input control 312 c , the user interface manager 102 may provide a multimedia content item display area with multimedia content items that the user can select to send as a message.
  • a user may interact with any of the message input controls 312 a - e in order to compose and send a message to one or more co-users via the electronic messaging system 100 .
  • a finger 314 of a user's hand is shown interacting with the multimedia input control 312 c .
  • the user input detector 104 can detect interactions (e.g., a tap touch gesture) of the finger 314 with the multimedia input control 312 c .
  • the user interface manager 102 may display an input control indicator 320 to indicate which input control 312 a - e is currently active. Additionally, the user interface manager 102 can display a multimedia content item display area 316 containing one or more multimedia content item previews 318 a , 318 b.
  • the user interface manager 102 can provide the communication thread 306 in a first portion (i.e., the upper portion) of the messaging user interface 304 .
  • the user interface manager 102 can provide the multimedia content item display area 316 in a second portion (i.e., the lower portion) of the messaging user interface 304 .
  • the user interface manager 102 can allow the user to view the communication thread 306 and any new messages while also viewing and browsing multimedia content items.
  • the user interface manager 102 can arrange the communication thread 306 and the multimedia content item display area 316 horizontally or in another arrangement other than a vertical arrangement.
  • the content item manager 106 provides the multimedia content item previews 318 a , 318 b based on multimedia content items stored on computing device 300 .
  • the content item manager 106 may provide the multimedia content item preview 318 a based on a digital photograph stored on the computing device 300 .
  • the content item manager 106 provides the multimedia content item preview 318 a in the multimedia content item display area 316 based on the recentness of the multimedia content item associated with the content item preview 318 a .
  • the multimedia content item associated with the content item preview 318 a is the most recently stored multimedia content item on the computing device 300 or the most recently captured or created multimedia content item on the computing device 300 .
  • the content item manager 106 may crop the multimedia content item to create the content item preview 318 a .
  • the multimedia content item associated with the multimedia content item preview 318 a may be a rectangular digital photograph stored on the computing device 300 .
  • the content item manager 106 may crop the rectangular digital photograph so that the content item preview 318 a is square and sized for presentation in the multimedia content item display area 316 .
  • the content item manager 106 may tailor the cropped content item preview 318 a to an aspect ratio of the multimedia content item display area 316 .
  • the content item manager 106 may tailor the cropped content item preview 318 a such that it is square within the multimedia content item display area 316 even if the multimedia content item display area 316 takes up a larger portion of the messaging graphical user interface 304 than is shown in FIG. 3B .
  • the content item manager 106 can help ensure that more than one content item preview 318 a , 318 b can be shown in the multimedia content item display area 316 . Additionally, by cropping the multimedia content items, the content item manager 106 can help ensure that the content items can fit within the communication thread 306 . Still another benefit of cropping the multimedia content items is to reduce a file size of the multimedia content items to enable quicker sending, receiving, and displaying of multimedia content items.
  • the content item manager 106 may provide the multimedia content item preview 318 b based on a digital video stored on the computing device 300 .
  • the content item manager 106 may provide a preview of a digital video that includes at least a portion of the digital video that auto plays within the multimedia content item display area 316 .
  • multimedia content item preview 318 b may be a portion of a digital video that auto plays within the multimedia content item display area 316 .
  • the portion of the digital video that auto plays may only be a few seconds long.
  • the portion of the digital video that auto plays may be a percentage of the total length of the digital video (e.g., a preview consisting of 10% of a 60-second digital video would be 6 seconds long).
  • the entire digital video can auto play within the multimedia content item display area 316 .
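The preview-length strategies above (a fixed few seconds, a percentage of the runtime, or the full video) can be sketched in one helper. The function and parameter names are illustrative assumptions, not from the patent.

```python
def preview_duration(video_seconds, mode="percentage", fixed_seconds=3.0, percent=10.0):
    """Pick an auto-play preview length using one of the strategies
    described above: fixed seconds, a percentage of the total
    runtime, or the entire video."""
    if mode == "fixed":
        # Never return a preview longer than the video itself.
        return min(fixed_seconds, video_seconds)
    if mode == "percentage":
        return video_seconds * percent / 100.0
    return float(video_seconds)  # "full": auto play the entire video

# 10% of a 60-second video gives a 6-second preview.
duration = preview_duration(60)
```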
  • the content item manager 106 may crop the multimedia content item preview 318 b such that the preview 318 b is square and sized for presentation in the multimedia content item display area 316 .
  • the content item manager 106 may crop and tailor multimedia content item previews 318 a , 318 b such that the previews 318 a , 318 b are the same size within the multimedia content item display area 316 , even though the multimedia content items associated with the previews 318 a , 318 b are of differing sizes.
  • the user interface manager 102 may display the multimedia content item display area 316 such that the display area 316 is horizontally scrollable within the messaging graphical user interface 304 .
  • the user input detector 104 may detect a user interaction of the finger 314 of a user's hand within the multimedia content item display area 316 .
  • the detected user interaction may be a swipe touch gesture of the finger 314 moving across the touch screen display 302 along the direction of the arrow 322 .
  • in response to the detected touch gesture, the user interface manager 102 may update the multimedia content item display area 316 to appear to be scrolling along the direction of the arrow 322 .
  • the multimedia content item preview 318 b will continue to auto play through the horizontal scroll.
  • the content item manager 106 may provide content item previews of multimedia content items likely to be selected by the user. For example, the content item manager 106 may provide content item previews of the most recent multimedia content items stored on the computing device 300 . For example, in one embodiment, the content item manager 106 may provide content item previews of any multimedia content items stored on the computing device 300 within the last 24 hours. In an alternative embodiment, the content item manager 106 may provide content item previews of a predetermined number of content items (e.g., 10, 20, 30) stored on the computing device 300 .
  • the content item manager 106 may provide content item previews based on other criteria besides recentness.
  • the electronic messaging system 100 can provide content item previews of multimedia content items that include participants in the conversation of the communication thread 306 .
  • the content item manager 106 may provide content item previews of multimedia items that were created or “taken” at a particular location (e.g., all the pictures and videos taken at the lake house).
  • the content item manager 106 may provide content item previews related in some way to content of the messages in the communication thread 306 .
  • the user interface manager 102 may update the messaging graphical user interface 304 to include a camera roll in response to a detected over-scroll within the multimedia content item display area 316 .
  • the content item manager 106 may have provided only the multimedia content item previews 318 a and 318 b for display in the multimedia content item display area 316 .
  • the multimedia content item preview 318 b is the last preview displayed in the multimedia content item display area 316 .
  • the user interface manager 102 may indicate the multimedia content item preview 318 b is the last preview in the multimedia content item display area 316 by displaying a blank area 324 to the right of the multimedia content item preview 318 b .
  • the user interface manager 102 may indicate the multimedia content item preview 318 b is the last preview by providing an automatic stop to a scrolling motion once the content preview 318 b is reached.
  • the user input detector 104 may detect a swipe gesture of the user's finger 314 across the multimedia content item display area 316 along the direction of the arrow 322 .
  • the user interface manager 102 may sequentially display the content item previews.
  • the user interface manager 102 may provide a camera roll of the computing device in response to a detected over-scroll. An over-scroll occurs when a user continues to horizontally scroll the multimedia content item display area 316 along the direction of arrow 322 beyond the last content item preview 318 b ; in response, the user interface manager 102 updates the messaging graphical user interface 304 to include a camera roll of the computing device 300 .
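The over-scroll decision described above can be sketched as a comparison between the requested scroll offset and the maximum in-bounds offset of the preview strip. This is a simplified sketch; the function name and the offset model are assumptions, not the patent's implementation.

```python
def resolve_scroll(offset, viewport_width, content_width):
    """Classify a horizontal scroll offset: within the preview strip
    it scrolls previews; past the last preview it is an over-scroll
    that opens the device's camera roll."""
    # Largest offset that still shows in-bounds preview content.
    max_offset = max(0, content_width - viewport_width)
    if offset <= max_offset:
        return "scroll-previews"
    return "open-camera-roll"
```

A scroll gesture that stops at the last preview simply updates the strip, while a gesture that pushes past it triggers the camera roll transition.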
  • the user interface manager 102 displays a camera roll 326 .
  • the camera roll 326 provides a content item preview 318 ′ for each multimedia content item stored on the computing device 300 .
  • the content item manager 106 may crop and tailor each camera roll preview 318 ′ in the same manner as described with regard to the multimedia content item previews 318 a and 318 b above.
  • the content item manager 106 may provide a portion of the digital video that auto plays from within the camera roll 326 . Accordingly, in one or more embodiments, the user interface manager 102 may display the camera roll 326 such that some of the camera roll content item previews 318 ′ are still images (i.e., based on digital photograph multimedia items) and some of the camera roll content item previews 318 ′ are auto playing video clips (i.e., based on digital video multimedia items). In one or more alternative embodiments, the content item manager 106 may not provide any specialized content item previews for the camera roll. Rather, in an alternative embodiment, the user interface manager 102 may simply display a minimized version of each digital picture and/or a minimized version of the first frame of each digital video within the camera roll 326 .
  • the camera roll 326 can occupy the entire touch screen display 302 .
  • the display of the camera roll 326 can cause a navigation away from the communication thread 306 .
  • the user interface manager 102 can provide the camera roll 326 within the area of the messaging user interface 304 previously occupied by the content item display area 316 . In such embodiments, the user can access the camera roll 326 without navigating away from the communication thread 306 .
  • a user may select a content item preview either from the camera roll 326 (i.e., one of the camera roll previews 318 ′ of FIG. 3E ) or from the multimedia content item display area 316 , as shown in FIG. 3F .
  • the user input detector 104 may detect a user selection of the multimedia content item preview 318 c from the multimedia content item display area 316 .
  • the user input detector 104 may detect a tap touch gesture of the finger 314 of the user with the content item preview 318 c .
  • the user input detector 104 may detect other types of user input such as a spoken command, an upward swipe touch gesture, or any other suitable type of user input.
  • the communication manager 108 may send the multimedia content item associated with the content item preview 318 c to one or more co-users. For example, the communication manager 108 may send the multimedia content item to the communication server 208 , which may then forward the multimedia content item to one or more computing devices 204 associated with one or more intended recipients. In that case, the user interface manager 102 may also add the multimedia content item to the communication thread 306 .
  • the user interface manager 102 may alter the display of the content item preview 318 c so as to indicate the selection of the content item preview 318 c .
  • the user interface manager 102 may present a blurred version of the content item preview 318 c ′, as shown in FIG. 3G .
  • the user interface manager 102 may alter the selected multimedia content item preview 318 c in other ways in order to indicate the selection.
  • the user interface manager 102 may alter the color scheme of the selected content item preview 318 c , may black out the selected preview 318 c , or may alter the selected content item preview 318 c in any other way suitable for this purpose.
  • the selected content item preview 318 c can also be enlarged or zoomed in.
  • the user interface manager 102 may also present one or more controls overlaid on the selected preview 318 c ′. For example, as illustrated in FIG. 3G , in response to the selection of the preview 318 c , the user interface manager 102 may overlay an edit control 328 and a send control 330 on the selected multimedia content item preview 318 c ′. In one or more embodiments, the user interface manager 102 may overlay other or additional controls on the selected preview 318 c ′. For instance, in one embodiment, the user interface manager 102 may also overlay a delete control over the selected preview 318 c′.
  • in response to a detected selection of the delete control, the content item manager 106 may remove the preview 318 c from the multimedia content item display area 316 . Additionally or alternatively, the content item manager 106 may permanently delete the multimedia content item associated with the preview 318 c from the computing device 300 . In one or more embodiments, the content item manager 106 may be configurable by a user in order to specify the actions taken with regard to the controls, including the edit control 328 , the send control 330 , and the delete control.
  • in response to a detected selection of the send control 330 , the communication manager 108 may send the multimedia content item associated with the selected preview 318 c ′ to one or more co-users. Additionally, in one or more embodiments, the user interface manager 102 may also add the multimedia content item associated with the selected preview 318 c ′ to the communication thread 306 . In one or more embodiments, a user may unselect the selected preview 318 c ′ simply by tapping anywhere else on the messaging graphical user interface 304 .
  • the content item manager 106 may provide the multimedia content item associated with the multimedia content item preview 318 c ′ for editing in response to a detected selection of the edit control 328 .
  • the user input detector 104 may detect a user interaction (e.g., tap touch gesture) of a user's finger 314 with the edit control 328 .
  • the user interface manager 102 may present the multimedia content item 319 associated with the selected preview 318 c ′ for editing in response to the detected user interaction.
  • the user interface manager 102 may also present a variety of editing controls within the messaging graphical user interface 304 .
  • the user interface manager 102 may display a crop editing control 328 a (i.e., allows a user to crop the multimedia content item 319 ), an auto-edit control 328 b (i.e., performs preconfigured edits to the multimedia content item 319 ), a color edit control 328 c (i.e., allows a user to edit the colors of the multimedia content item 319 ), and a writing edit control 328 d (i.e., allows a user to add text or other drawn effects to the multimedia content item 319 ).
  • the user interface manager 102 may display other or additional edit controls while the multimedia content item 319 associated with the selected preview 318 c ′ is available for editing.
  • the content item manager 106 may allow edits to a copy of a multimedia content item 319 , rather than allowing edits to the original multimedia content item 319 .
  • This feature allows a user to send one or more co-users a copy of a multimedia content item that has been edited only for purposes related to a specific communication session.
  • edits made within the electronic messaging system 100 may not be reflected to an original copy of the content item stored on the computing device 300 .
  • the content item manager 106 may receive one or more edits to multimedia content item 319 from a user. For example, as shown in FIG. 3H , a user's finger 314 may perform an edit 334 . In the embodiment shown, the edit 334 is writing added to the multimedia content item 319 , in connection with a selection of the writing edit control 328 d . In additional alternative embodiments, the content item manager 106 may receive further edits to the multimedia content item 319 in accordance with any of the edit controls 328 a - d described above.
  • the user interface manager 102 may present additional controls while the multimedia content item 319 is available for editing.
  • the user interface manager 102 may also present a send control 330 and a cancel control 332 .
  • a detected selection of the send control 330 may cause the communication manager 108 to send the multimedia content item 319 along with the edit 334 to one or more co-users.
  • a detected selection of the cancel control 332 may cause the content item manager 106 to discard any edits made to the multimedia content item 319 .
  • the user interface manager 102 may also present a save control while the multimedia content item 319 is available for editing.
  • a detected selection of the save control may cause the content item manager 106 to save the multimedia content item 319 along with the edits 334 to the computing device 300 .
  • the user interface manager 102 may again display the communication thread 306 , the message input control palette or toolbar 310 , and the multimedia content item display area 316 in response to a detected selection of the send control 330 , the cancel control 332 , or the save control as described with regard to FIG. 3H .
  • the user interface manager 102 , in response to a detected selection of the send control 330 , can display the communication thread 306 , the message input control palette or toolbar 310 , and the multimedia content item display area 316 within the messaging graphical user interface 304 .
  • the user interface manager 102 may automatically add the multimedia content item 319 with the edit 334 to the communication thread 306 in response to the detected selection of the send control 330 .
  • the user interface manager 102 may display the multimedia content item preview 318 c within the multimedia content item display area 316 such that the content item preview 318 c does not reflect any edits. For example, because the content item manager 106 provided a copy of the content item 319 (shown in FIG. 3G ) for editing, the multimedia content item associated with the content item preview 318 c was not affected by the edits 334 . Accordingly, as shown in FIG. 3I , the user interface manager 102 displays the content item preview 318 c unchanged within the multimedia content item display area 316 . In one or more alternative embodiments, the content item manager 106 may provide the original multimedia content item for editing, rather than a copy 319 . In that alternative embodiment, the user interface manager 102 may display the preview 318 c including the edits 334 within the multimedia content item display area 316 .
  • the content item 319 , when sent to one or more co-users and when added to the communication thread 306 , can have a size configured for display within the communication thread 306 .
  • the content item 319 can occupy less than the entire communication thread 306 both in a vertical direction and a horizontal direction.
  • the communication thread 306 can display both the content item 319 and one or more messages as shown by FIG. 3I .
  • the content item 319 can be positioned on one side of the communication thread 306 so as to indicate whether the content item 319 was a sent or received message.
  • FIGS. 3A-3I illustrate the process for selecting and editing a digital photograph multimedia content item for inclusion in a communication session.
  • the electronic messaging system 100 also allows a user to select and edit a digital video multimedia content item for inclusion in a communication session.
  • the process for selecting and editing a digital video will now be discussed in relation to FIGS. 4A-4E .
  • the user input detector 104 may detect a selection of a multimedia content item preview 318 b from within the multimedia content item display area 316 .
  • the user input detector 104 may detect a tap touch gesture of the user's finger 314 on the content item preview 318 b .
  • the detected selection may be by way of another type of user interaction, such as a press-and-hold touch gesture, a spoken command, or any other type of user interaction suitable for this purpose.
  • the multimedia content item associated with the content item preview 318 b may be a digital video.
  • the communication manager 108 may immediately send the digital video associated with the multimedia content item preview 318 b in response to a particular user interaction. For example, the communication manager 108 may immediately send the digital video associated with the content item preview 318 b to one or more co-users upon detection of a selection of the content item preview 318 b . Additionally, the user interface manager 102 may immediately add the digital video to the communication thread 306 in response to detection of a selection of the content item preview 318 b.
  • the user interface manager 102 may alter the display of the content item preview 318 b so as to indicate the selection of the content item preview 318 b .
  • the user interface manager 102 may present a blurred version of the preview 318 b ′, as shown in FIG. 4B .
  • the content item preview 318 b ′ of a digital video is a portion of the digital video that auto plays within the multimedia content item display area 316 .
  • the user interface manager 102 may display a blurred portion of the digital video that auto plays within the multimedia content item display area 316 .
  • the user interface manager 102 may display only a blurred single frame from the digital video.
  • the user interface manager 102 may also present one or more controls overlaid on the selected preview 318 b ′, as illustrated in FIG. 4B .
  • the user interface manager 102 may overlay the edit control 328 and the send control 330 on the selected multimedia content item preview 318 b ′.
  • the user interface manager 102 may overlay other or additional controls on the selected content item preview 318 b ′.
  • the user interface manager 102 may also overlay a delete control over the selected preview 318 b′.
  • the communication manager 108 may send the multimedia content item associated with the selected content item preview 318 b ′ to one or more co-users. Additionally, in one or more embodiments, the user interface manager 102 may also add the multimedia content item associated with the selected content item preview 318 b ′ to the communication thread 306 . In one or more embodiments, a user may unselect the selected content item preview 318 b ′ simply by tapping anywhere else on the messaging graphical user interface 304 .
  • the content item manager 106 may provide the multimedia content item associated with the multimedia content item preview 318 b ′ (or a copy thereof) for editing in response to a detected selection of the edit control 328 .
  • the user input detector 104 may detect a user interaction of the user's finger 314 with the edit control 328 .
  • the detected user interaction may be a tap touch gesture.
  • the detected user interaction may be an upward swipe touch gesture, a spoken command, or another type of user input suitable for this purpose.
  • the user interface manager 102 may present the content item 319 b associated with the selected preview 318 b ′ for editing within the messaging graphical user interface 304 in response to the detected user interaction.
  • the user interface manager 102 may also present a variety of editing controls within the messaging graphical user interface 304 .
  • the user interface manager 102 may display the crop editing control 328 a , the auto-edit control 328 b , the color edit control 328 c , and the writing edit control 328 d .
  • the user interface manager 102 may display other or additional edit controls as described above.
  • the editing controls 328 a - d may take on different functionality depending on the type of multimedia content item currently available for editing.
  • the multimedia content item type was a digital photograph.
  • the crop editing control 328 a may function to remove portions of the digital photograph, thus changing the displayed portion of the digital photograph.
  • the multimedia content item type is a digital video.
  • the crop editing control 328 a may function to remove portions of the digital video, thus changing the runtime of the digital video.
  • the other editing controls 328 b - d may have similarly alterable functionalities depending on the multimedia content item type.
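The type-dependent behavior of the crop control described above can be sketched as follows. This is a hedged illustration under assumed class names (`Photo`, `Video`); the patent does not specify an implementation:

```python
class Photo:
    def __init__(self, width, height):
        self.width, self.height = width, height

class Video:
    def __init__(self, duration_s):
        self.duration_s = duration_s

def crop(item, *bounds):
    """Crop control 328a: trims pixels from a photo, but trims runtime from a video."""
    if isinstance(item, Photo):
        left, top, right, bottom = bounds
        item.width, item.height = right - left, bottom - top
    elif isinstance(item, Video):
        start_s, end_s = bounds
        item.duration_s = end_s - start_s
    return item

photo = crop(Photo(1920, 1080), 100, 100, 900, 700)  # 800 x 600 region remains
video = crop(Video(30.0), 5.0, 20.0)                 # 15-second clip remains
```

The same control identifier (328 a) dispatches to different logic depending on the content item type, which is the alterable functionality the text describes.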
  • the content item manager 106 may receive one or more edits to the content item 319 b via one or more touch gestures.
  • the content item manager 106 has received an edit 334 b changing a display property of the content item 319 b .
  • the edit 334 b includes the addition of a border around the content item 319 b
  • the content item manager 106 may receive further edits to the content item 319 b in accordance with any of the edit controls 328 a - d described above.
  • the user interface manager 102 may present additional controls while the content item 319 b is available for editing.
  • the user interface manager 102 may also present the send control 330 and the cancel control 332 .
  • a detected selection of the send control 330 may cause the communication manager 108 to send the content item 319 b along with the edits 334 b to one or more co-users.
  • a detected selection of the cancel control 332 may cause the content item manager 106 to discard any edits made to the content item 319 b .
  • the user interface manager 102 may also present a save control while the copy 319 b is available for editing.
  • a detected selection of the save control may cause the content item manager 106 to save the content item 319 b along with the edits 334 b to the computing device 300 .
  • the user interface manager 102 may display the communication thread 306 , the message input control palette or toolbar 310 , and the multimedia content item display area 316 in response to a detected selection of the send control 330 , the cancel control 332 , or the save control as described with regard to FIG. 4C .
  • the user input detector 104 may detect a touch gesture performed by the user's finger 314 on the send control 330 .
  • the user interface manager 102 may display the communication thread 306 , the message input control palette or toolbar 310 , and the multimedia content item display area 316 within the messaging graphical user interface 304 , as shown in FIG. 4D .
  • the user interface manager 102 may automatically add the content item 319 b of the multimedia content item with the edits 334 b to the communication thread 306 in response to the detected selection of the send control 330 .
  • the content item preview 318 b may not include the edits 334 b made to the content item 319 b.
  • the content item manager 106 may also package the content item 319 b of the multimedia content item with the edits 334 b such that it may be played from within the communication thread 306 .
  • the content item manager 106 has added the content item 319 b with the edits 334 b and a playback control 336 to the communication thread 306 .
  • the playback control 336 may further include a time remaining indicator, a pause button, or any other additional controls suitable for a video.
  • the user interface manager 102 may replace the multimedia content item display area 316 with another control.
  • the user input detector 104 may detect a user interaction of the user's finger 314 interacting with the text input control 312 a within the message input control palette or toolbar 310 .
  • the user interface manager 102 in response to this detected selection of the text input control 312 a , may replace the multimedia content item display area 316 with a touch screen display keyboard 338 .
  • the user interface manager 102 may replace the multimedia content item display area 316 with other types of controls in response to the detected selection of any of the input controls 312 a - 312 b.
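The panel-swapping behavior described above can be sketched as a small state holder. The control identifiers and panel names below are hypothetical stand-ins for the patent's reference numerals:

```python
class MessagingUI:
    """The second portion of the split-screen interface holds one panel at a time."""
    def __init__(self):
        self.second_portion = "multimedia_display_area"

    def on_input_control_selected(self, control_id):
        # Selecting an input control swaps the matching panel into the second portion.
        panels = {
            "text_input_312a": "keyboard",
            "multimedia_input_312c": "multimedia_display_area",
        }
        self.second_portion = panels.get(control_id, self.second_portion)

ui = MessagingUI()
ui.on_input_control_selected("text_input_312a")
print(ui.second_portion)  # keyboard
```

Selecting the multimedia input control again would swap the display area back in, matching the behavior described for the other input controls 312 a - 312 b.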
  • FIGS. 1-4E , the corresponding text, and the examples provide a number of different systems and devices for selecting and including multimedia content items in a communication session.
  • embodiments of the present invention can also be described in terms of flowcharts comprising acts and steps in a method for accomplishing a particular result.
  • FIGS. 5 and 6 illustrate flowcharts of exemplary methods in accordance with one or more embodiments of the present invention. The methods described in relation to FIGS. 5 and 6 may be performed with fewer or more steps/acts, or the steps/acts may be performed in differing orders. Additionally, the steps/acts described herein may be repeated or performed in parallel with one another or in parallel with different instances of the same or similar steps/acts.
  • FIG. 5 illustrates a flowchart of one example method 500 of selecting and including multimedia content items in a communication session.
  • the method 500 includes an act 502 of providing a messaging graphical user interface.
  • the act 502 can involve providing, on a client device, a messaging graphical user interface 304 .
  • the messaging graphical user interface 304 may include a communication thread 306 in a first portion.
  • the communication thread may include a plurality of electronic messages 308 a , 308 b exchanged between a user and one or more co-users.
  • the method 500 further includes an act 504 of detecting a selection of a multimedia input control.
  • the act 504 can include detecting a tap touch gesture selection of the multimedia input control 312 c .
  • detecting a selection of a multimedia input control 312 c can include detecting the selection of the multimedia input control 312 c from a palette of input controls 310 .
  • the method 500 also includes an act 506 of providing a multimedia content item display area.
  • the act 506 can involve, in response to the detected selection of the multimedia input control 312 c , providing a multimedia content item display area 316 in a second portion of the messaging graphical user interface 304 .
  • the multimedia content item display area 316 may provide a preview 318 a , 318 b of one or more multimedia content items stored on the client device 300 available for sending as an electronic message 308 a , 308 b.
  • providing the multimedia content item display area 316 may include providing a preview 318 a , 318 b of one or more recently stored multimedia content items.
  • one or more recently stored multimedia content items may include multimedia items that were stored within a given time limit, or window of time.
  • the one or more multimedia content items stored on the client device 300 available for sending as an electronic message 308 a , 308 b may include one or more digital photographs or digital videos.
  • providing a preview 318 a , 318 b of one or more multimedia content items may include providing a digital video that auto plays within the second portion of the messaging graphical user interface 304 .
  • the provided multimedia content item display area 316 may be horizontally scrollable. Furthermore, in one or more embodiments, the method 500 may also include detecting a horizontal over-scroll of the horizontally scrollable multimedia content item display area 316 . In response to the detected horizontal over-scroll, the method 500 may include displaying a camera roll 326 associated with the client device 300 .
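The over-scroll detection described above can be sketched numerically. The threshold value and function name are assumptions for illustration, not values from the patent:

```python
def handle_horizontal_scroll(offset, content_width, viewport_width,
                             overscroll_threshold=40):
    """Return which view to show: the scrollable display area, or the camera
    roll once the user drags past the end by more than the threshold (px)."""
    max_offset = max(0, content_width - viewport_width)
    if offset > max_offset + overscroll_threshold:
        return "camera_roll"
    return "display_area"

print(handle_horizontal_scroll(500, 800, 320))  # display_area (within bounds)
print(handle_horizontal_scroll(530, 800, 320))  # camera_roll (over-scrolled)
```

A drag that merely reaches the last preview keeps the display area; only dragging meaningfully past the end triggers the camera roll 326.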
  • the method 500 may further include cropping the preview 318 a , 318 b of each of the one or more multimedia content items stored on the client device 300 available for sending as an electronic message 308 a , 308 b .
  • cropping the preview 318 a , 318 b of each of the one or more multimedia content items may further include tailoring the cropped preview 318 a , 318 b of each of the one or more multimedia content items to an aspect ratio of the second portion of the messaging graphical user interface 304 .
  • tailoring the cropped preview 318 a , 318 b of each of the one or more multimedia content items to an aspect ratio of the second portion of the messaging graphical user interface 304 may include displaying the previews 318 a , 318 b in the second portion of the messaging graphical user interface 304 such that the previews 318 a , 318 b are square.
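The square cropping described above amounts to a center-crop computation. A minimal sketch, assuming a simple center-crop policy (the patent does not specify how the crop region is chosen):

```python
def square_crop_rect(width, height):
    """Center-crop rectangle (left, top, right, bottom) that makes a preview
    square, matching the square tiles of the display area's aspect ratio."""
    side = min(width, height)
    left = (width - side) // 2
    top = (height - side) // 2
    return (left, top, left + side, top + side)

print(square_crop_rect(1920, 1080))  # (420, 0, 1500, 1080) -- landscape photo
print(square_crop_rect(600, 800))    # (0, 100, 600, 700)   -- portrait photo
```

Both landscape and portrait items yield a square region whose side is the item's shorter dimension, so every preview tile renders uniformly.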
  • the method 500 may also include detecting a selection of a preview 318 a , 318 b of a multimedia content item provided in the multimedia content item display area 316 in the second portion of the messaging graphical user interface 304 .
  • in response to the detected selection of the preview 318 a , 318 b of the multimedia content item, the method 500 can involve sending the multimedia content item 319 corresponding to the selected preview 318 a , 318 b to one or more co-users.
  • the method 500 , in response to sending the multimedia content item 319 , may include adding the multimedia content item 319 to the communication thread 306 in the first portion of the messaging graphical user interface 304 .
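The acts of method 500 can be sketched as a walk over detected events. The event and action names below are hypothetical labels for the acts described, not identifiers from the patent:

```python
def method_500(events):
    """Acts 502-506 of method 500, plus the optional send/add steps.
    Returns the UI actions taken, in order."""
    actions = ["provide_messaging_gui"]            # act 502: show thread in first portion
    if "multimedia_input_selected" in events:      # act 504: detect input control selection
        actions.append("provide_display_area")     # act 506: show previews in second portion
    if "preview_selected" in events:               # optional: selection of a preview
        actions.append("send_item")
        actions.append("add_item_to_thread")
    return actions

print(method_500(["multimedia_input_selected", "preview_selected"]))
```

With no detected events, only the messaging interface is provided; each later act is gated on its detected user interaction.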
  • FIG. 6 illustrates a flowchart of a method 600 of selecting and including multimedia content items in a communication session.
  • the method 600 includes an act 602 of providing a split-screen messaging graphical user interface.
  • the act 602 can involve providing a split-screen messaging graphical user interface 304 including two portions.
  • the first portion of the split-screen messaging graphical user interface 304 may include a communication thread 306 including a plurality of electronic messages 308 a , 308 b exchanged between a user and one or more co-users.
  • the second portion of the split-screen messaging graphical user interface 304 may include a multimedia content items display 316 including a preview 318 a , 318 b of one or more multimedia content items.
  • the preview 318 a , 318 b of one or more multimedia content items may be tailored to an aspect ratio of the second portion or the first portion of the messaging graphical user interface 304 .
  • the method 600 further includes an act 604 of detecting a selection of a preview of a multimedia content item.
  • the act 604 can involve detecting a selection of a preview 318 a , 318 b of a multimedia content item from the multimedia content item display 316 .
  • detecting a selection of a preview 318 a , 318 b of a multimedia content item may include detecting a tap touch gesture interacting with the multimedia content item.
  • the method 600 further includes an act 606 of overlaying a control on the selected preview of the multimedia content item.
  • the act 606 can involve, in response to the detected selection of the preview 318 a , 318 b of the multimedia content item from the multimedia content item display 316 , overlaying a first control on the selected preview 318 a , 318 b of the multimedia content item.
  • the first control overlaid on the selected multimedia content item may be an editing control 328 .
  • the act 606 may also include overlaying a second control on the selected preview 318 a , 318 b of the multimedia content item.
  • the second control overlaid on the selected preview 318 a , 318 b of the multimedia content item may be a send control 330 .
  • the act 606 may also include blurring the selected preview 318 a , 318 b of the multimedia content item.
  • the method 600 may further include detecting a selection of the editing control 328 overlaid on the selected preview 318 a , 318 b of the multimedia content item.
  • the detected selection of the editing control 328 may be a tap touch gesture.
  • the method 600 may also include presenting a copy 319 of the multimedia content item associated with the selected preview 318 a , 318 b for editing.
  • the method 600 may also include receiving one or more edits 334 , 334 b to the copy 319 of the multimedia content item. For example, receiving one or more edits 334 , 334 b to the copy 319 of the multimedia content item may be in response to presenting the copy 319 of the multimedia content item for editing.
  • the method 600 may further include adding the copy 319 of the multimedia content item with the one or more edits 334 , 334 b to the communication thread 306 . Additionally, the method 600 may include sending the copy 319 of the multimedia content item with the one or more edits 334 , 334 b .
  • the method 600 may include sending the copy of the multimedia content item with the one or more edits 334 , 334 b to the one or more co-users.
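The selection path of method 600 can be sketched as a single state transition. The dictionary keys are hypothetical labels for the display state, chosen for illustration:

```python
def method_600(preview):
    """Acts 602-606 of method 600 applied to one selected preview.
    Returns the resulting display state."""
    state = {"portions": ["communication_thread", "content_item_display"]}  # act 602
    state["selected"] = preview                                             # act 604
    state["blurred"] = True                            # blur the selected preview
    state["overlays"] = ["edit_control_328", "send_control_330"]            # act 606
    return state

state = method_600("318b")
print(state["overlays"])  # ['edit_control_328', 'send_control_330']
```

From this state, a tap on the edit control leads into the copy-editing path described above, while a tap on the send control transmits the item directly.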
  • Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below.
  • Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures.
  • one or more of the processes described herein may be implemented at least in part as instructions embodied in a non-transitory computer-readable medium and executable by one or more computing devices (e.g., any of the media content access devices described herein).
  • a processor receives instructions, from a non-transitory computer-readable medium, (e.g., a memory, etc.), and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein.
  • Computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
  • Computer-readable media that store computer-executable instructions are non-transitory computer-readable storage media (devices).
  • Computer-readable media that carry computer-executable instructions are transmission media.
  • embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: non-transitory computer-readable storage media (devices) and transmission media.
  • Non-transitory computer-readable storage media includes RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • a “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
  • a network or another communications connection can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
  • program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to non-transitory computer-readable storage media (devices) (or vice versa).
  • computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media (devices) at a computer system.
  • non-transitory computer-readable storage media (devices) can be included in computer system components that also (or even primarily) utilize transmission media.
  • Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
  • computer-executable instructions are executed on a general-purpose computer to turn the general-purpose computer into a special purpose computer implementing elements of the invention.
  • the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
  • the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, and the like.
  • the invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
  • program modules may be located in both local and remote memory storage devices.
  • cloud computing is defined as a model for enabling on-demand network access to a shared pool of configurable computing resources.
  • cloud computing can be employed in the marketplace to offer ubiquitous and convenient on-demand access to the shared pool of configurable computing resources.
  • the shared pool of configurable computing resources can be rapidly provisioned via virtualization and released with low management effort or service provider interaction, and then scaled accordingly.
  • a cloud-computing model can be composed of various characteristics such as, for example, on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, and so forth.
  • a cloud-computing model can also expose various service models, such as, for example, Software as a Service (“SaaS”), Platform as a Service (“PaaS”), and Infrastructure as a Service (“IaaS”).
  • a cloud-computing model can also be deployed using different deployment models such as private cloud, community cloud, public cloud, hybrid cloud, and so forth.
  • a “cloud-computing environment” is an environment in which cloud computing is employed.
  • FIG. 7 illustrates a block diagram of exemplary computing device 700 that may be configured to perform one or more of the processes described above.
  • Any of the computing devices 202 , 204 , 300 or communication server 208 can comprise a computing device 700 .
  • the computing device 700 can comprise a processor 702 , a memory 704 , a storage device 706 , an I/O interface 708 , and a communication interface 710 , which may be communicatively coupled by way of a communication infrastructure 712 .
  • While an exemplary computing device 700 is shown in FIG. 7 , the components illustrated in FIG. 7 are not intended to be limiting. Additional or alternative components may be used in other embodiments.
  • the computing device 700 can include fewer components than those shown in FIG. 7 . Components of the computing device 700 shown in FIG. 7 will now be described in additional detail.
  • the processor 702 includes hardware for executing instructions, such as those making up a computer program.
  • the processor 702 may retrieve (or fetch) the instructions from an internal register, an internal cache, the memory 704 , or the storage device 706 and decode and execute them.
  • the processor 702 may include one or more internal caches for data, instructions, or addresses.
  • the processor 702 may include one or more instruction caches, one or more data caches, and one or more translation lookaside buffers (TLBs). Instructions in the instruction caches may be copies of instructions in the memory 704 or the storage 706 .
  • the memory 704 may be used for storing data, metadata, and programs for execution by the processor(s).
  • the memory 704 may include one or more of volatile and non-volatile memories, such as Random Access Memory (“RAM”), Read Only Memory (“ROM”), a solid state disk (“SSD”), Flash, Phase Change Memory (“PCM”), or other types of data storage.
  • the memory 704 may be internal or distributed memory.
  • the storage device 706 includes storage for storing data or instructions.
  • storage device 706 can comprise a non-transitory storage medium described above.
  • the storage device 706 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these.
  • the storage device 706 may include removable or non-removable (or fixed) media, where appropriate.
  • the storage device 706 may be internal or external to the computing device 700 .
  • the storage device 706 is non-volatile, solid-state memory.
  • the storage device 706 includes read-only memory (ROM).
  • this ROM may be mask programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory or a combination of two or more of these.
  • the I/O interface 708 allows a user to provide input to, receive output from, and otherwise transfer data to and receive data from computing device 700 .
  • the I/O interface 708 may include a mouse, a keypad or a keyboard, a touch screen, a camera, an optical scanner, a network interface, a modem, other known I/O devices, or a combination of such I/O interfaces.
  • the I/O interface 708 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers.
  • the I/O interface 708 is configured to provide graphical data to a display for presentation to a user.
  • the graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.
  • the communication interface 710 can include hardware, software, or both. In any event, the communication interface 710 can provide one or more interfaces for communication (such as, for example, packet-based communication) between the computing device 700 and one or more other computing devices or networks. As an example and not by way of limitation, the communication interface 710 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI.
  • the communication interface 710 may facilitate communications with an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet or a combination of two or more of these.
  • the communication interface 710 may facilitate communications with a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination thereof.
  • the communication interface 710 may facilitate communications using various communication protocols.
  • Examples of communication protocols include, but are not limited to, data transmission media, communications devices, Transmission Control Protocol (“TCP”), Internet Protocol (“IP”), File Transfer Protocol (“FTP”), Telnet, Hypertext Transfer Protocol (“HTTP”), Hypertext Transfer Protocol Secure (“HTTPS”), Session Initiation Protocol (“SIP”), Simple Object Access Protocol (“SOAP”), Extensible Mark-up Language (“XML”) and variations thereof, Simple Mail Transfer Protocol (“SMTP”), Real-Time Transport Protocol (“RTP”), User Datagram Protocol (“UDP”), Global System for Mobile Communications (“GSM”) technologies, Code Division Multiple Access (“CDMA”) technologies, Time Division Multiple Access (“TDMA”) technologies, Short Message Service (“SMS”), Multimedia Message Service (“MMS”), radio frequency (“RF”) signaling technologies, Long Term Evolution (“LTE”) technologies, wireless communication technologies, in-band and out-of-band signaling technologies, and other suitable communications networks and technologies.
  • the communication infrastructure 712 may include hardware, software, or both that couples components of the computing device 700 to each other.
  • the communication infrastructure 712 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination thereof.
  • the network 206 and/or communication server 208 can comprise a social-networking system.
  • a social-networking system may enable its users (such as persons or organizations) to interact with the system and with each other.
  • the social-networking system may, with input from a user, create and store in the social-networking system a user profile associated with the user.
  • the user profile may include demographic information, communication-channel information, and information on personal interests of the user.
  • the social-networking system may also, with input from a user, create and store a record of relationships of the user with other users of the social-networking system, as well as provide services (e.g. wall posts, photo-sharing, event organization, messaging, games, or advertisements) to facilitate social interaction between or among users.
  • the social-networking system may store records of users and relationships between users in a social graph comprising a plurality of nodes and a plurality of edges connecting the nodes.
  • the nodes may comprise a plurality of user nodes and a plurality of concept nodes.
  • a user node of the social graph may correspond to a user of the social-networking system.
  • a user may be an individual (human user), an entity (e.g., an enterprise, business, or third party application), or a group (e.g., of individuals or entities).
  • a user node corresponding to a user may comprise information provided by the user and information gathered by various systems, including the social-networking system.
  • the user may provide his or her name, profile picture, city of residence, contact information, birth date, gender, marital status, family status, employment, educational background, preferences, interests, and other demographic information to be included in the user node.
  • Each user node of the social graph may have a corresponding web page (typically known as a profile page).
  • the social-networking system can access a user node corresponding to the user name, and construct a profile page including the name, a profile picture, and other information associated with the user.
  • a profile page of a first user may display to a second user all or a portion of the first user's information based on one or more privacy settings set by the first user and the relationship between the first user and the second user.
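This privacy-based filtering amounts to comparing the viewer's relationship against a per-field visibility setting. In the sketch below, the tier names, field names, and mapping are illustrative assumptions, not part of the disclosure:

```python
def visible_profile(profile, privacy, viewer):
    """Return only the profile fields the viewer is allowed to see.

    `privacy` maps a field name to the minimum audience that may view it
    ("public", "friends", or "only_me"); unlisted fields default to public.
    `viewer` is the viewer's relationship to the profile owner.
    """
    required = {"public": 0, "friends": 1, "only_me": 2}
    viewer_tier = {"stranger": 0, "friend": 1, "self": 2}[viewer]
    return {
        field: value
        for field, value in profile.items()
        if viewer_tier >= required[privacy.get(field, "public")]
    }


profile = {"name": "Ada", "city": "London", "birthday": "1815-12-10"}
privacy = {"city": "friends", "birthday": "only_me"}
print(visible_profile(profile, privacy, "stranger"))  # {'name': 'Ada'}
```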
  • a concept node may correspond to a concept of the social-networking system.
  • a concept can represent a real-world entity, such as a movie, a song, a sports team, a celebrity, a group, a restaurant, or a place or a location.
  • An administrative user of a concept node corresponding to a concept may create or update the concept node by providing information of the concept (e.g., by filling out an online form), causing the social-networking system to associate the information with the concept node.
  • information associated with a concept can include a name or a title, one or more images (e.g., an image of the cover page of a book), a website (e.g., a URL) or contact information (e.g., a phone number, an email address).
  • Each concept node of the social graph may correspond to a web page.
  • the social-networking system can access a concept node corresponding to the name, and construct a web page including the name and other information associated with the concept.
  • An edge between a pair of nodes may represent a relationship between the pair of nodes.
  • an edge between two user nodes can represent a friendship between two users.
  • the social-networking system may construct a web page (or a structured document) of a concept node (e.g., a restaurant, a celebrity), incorporating one or more selectable buttons (e.g., “like”, “check in”) in the web page.
  • a user can access the page using a web browser hosted by the user's client device and select a selectable button, causing the client device to transmit to the social-networking system a request to create an edge between a user node of the user and a concept node of the concept, indicating a relationship between the user and the concept (e.g., the user checks in a restaurant, or the user “likes” a celebrity).
  • a user may provide (or change) his or her city of residence, causing the social-networking system to create an edge between a user node corresponding to the user and a concept node corresponding to the city declared by the user as his or her city of residence.
  • the degree of separation between any two nodes is defined as the minimum number of hops required to traverse the social graph from one node to the other.
  • a degree of separation between two nodes can be considered a measure of relatedness between the users or the concepts represented by the two nodes in the social graph.
  • two users having user nodes that are directly connected by an edge may be described as “connected users” or “friends.”
  • two users having user nodes that are connected only through another user node (i.e., second-degree nodes) may be described as "friends of friends."
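The degree of separation defined above (the minimum number of hops between two nodes) is exactly what a breadth-first search over the social graph computes. A minimal sketch with hypothetical user names:

```python
from collections import deque


def degree_of_separation(edges, start, goal):
    """Minimum number of hops between two nodes in a social graph,
    or None if the nodes are not connected."""
    # Build an adjacency list from undirected friendship edges.
    adjacency = {}
    for a, b in edges:
        adjacency.setdefault(a, set()).add(b)
        adjacency.setdefault(b, set()).add(a)

    # Breadth-first search guarantees the first path found is shortest.
    queue = deque([(start, 0)])
    visited = {start}
    while queue:
        node, hops = queue.popleft()
        if node == goal:
            return hops
        for neighbor in adjacency.get(node, ()):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append((neighbor, hops + 1))
    return None


friendships = [("alice", "bob"), ("bob", "carol"), ("carol", "dave")]
print(degree_of_separation(friendships, "alice", "carol"))  # 2 (a friend of a friend)
```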
  • a social-networking system may support a variety of applications, such as photo sharing, on-line calendars and events, gaming, instant messaging, and advertising.
  • the social-networking system may also include media sharing capabilities.
  • the social-networking system may allow users to post photographs and other multimedia files to a user's profile page (typically known as “wall posts” or “timeline posts”) or in a photo album, both of which may be accessible to other users of the social-networking system depending upon the user's configured privacy settings.
  • the social-networking system may also allow users to configure events. For example, a first user may configure an event with attributes including time and date of the event, location of the event and other users invited to the event. The invited users may receive invitations to the event and respond (such as by accepting the invitation or declining it).
  • the social-networking system may allow users to maintain a personal calendar. Similarly to events, the calendar entries may include times, dates, locations and identities of other users.
  • FIG. 8 illustrates an example network environment of a social-networking system.
  • a social-networking system 802 may comprise one or more data stores.
  • the social-networking system 802 may store a social graph comprising user nodes, concept nodes, and edges between nodes as described earlier.
  • Each user node may comprise one or more data objects corresponding to information associated with or describing a user.
  • Each concept node may comprise one or more data objects corresponding to information associated with a concept.
  • Each edge between a pair of nodes may comprise one or more data objects corresponding to information associated with a relationship between users (or between a user and a concept, or between concepts) corresponding to the pair of nodes.
  • the social-networking system 802 may comprise one or more computing devices (e.g., servers) hosting functionality directed to operation of the social-networking system.
  • a user of the social-networking system 802 may access the social-networking system 802 using a client device such as client device 806 .
  • client device 806 can interact with the social-networking system 802 through a network 804 .
  • the client device 806 may be a desktop computer, laptop computer, tablet computer, personal digital assistant (PDA), in- or out-of-car navigation system, smart phone or other cellular or mobile phone, mobile gaming device, other mobile device, or other suitable computing device.
  • Client device 806 may execute one or more client applications, such as a web browser (e.g., Microsoft Windows Internet Explorer, Mozilla Firefox, Apple Safari, Google Chrome, Opera, etc.) or a native or special-purpose client application (e.g., Facebook for iPhone or iPad, Facebook for Android, etc.), to access and view content over a network 804 .
  • Network 804 may represent a network or collection of networks (such as the Internet, a corporate intranet, a virtual private network (VPN), a local area network (LAN), a wireless local area network (WLAN), a cellular network, a wide area network (WAN), a metropolitan area network (MAN), or a combination of two or more such networks) over which client devices 806 may access the social-networking system 802 .

Abstract

One or more embodiments described herein include methods and systems of sending multimedia content items as electronic communications. More specifically, systems and methods described herein provide a user the ability to easily and effectively select multimedia content items stored on a computing device for inclusion in a communication session without navigating away from the communication session. Additionally, systems and methods described herein provide a user the ability to edit multimedia content items for inclusion in the communication session.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. application Ser. No. 15/826,032 filed Nov. 29, 2017, which is a continuation of U.S. application Ser. No. 14/312,481 filed Jun. 23, 2014, which claims priority to and the benefit of U.S. Provisional Application No. 61/985,456 filed Apr. 28, 2014. Each of the aforementioned patents and applications is hereby incorporated by reference in its entirety.
  • BACKGROUND
  • 1. Technical Field
  • One or more embodiments relate generally to electronic messaging systems and methods. More specifically, one or more embodiments relate to systems and methods for increasing functionality in an electronic messaging system.
  • 2. Background and Relevant Art
  • Computing devices (e.g., computers, tablets, and smart phones) provide numerous ways for people to connect and communicate with one another. For example, a variety of electronic messaging systems provide various methods to send and receive electronic messages. For instance, a computing device can allow a user to communicate with other users using text messaging, instant messaging, social network posting, and other forms of electronic communication. In addition, an electronic communication may include a variety of content including text, images, video, audio, and/or other multimedia. In general, electronic communication has become a popular way for people to connect and communicate with one another.
  • Including multimedia in electronic communications has become an especially popular way to add humor, context, and information to an electronic communication session. For example, a user may send a digital photograph to a co-user indicating his location. Similarly, a user may send a video laughing in response to an electronic message that includes a joke. Thus, including multimedia in an electronic communication is an easy way to add a layer of expression to an electronic communication session that is typically difficult with only textual messages.
  • Conventional processes for including multimedia in an electronic communication session are generally problematic. For example, a user typically navigates through several different user interfaces in order to select existing multimedia for inclusion in an electronic message. Navigating away from the electronic communication interface, however, can cause a user to miss messages or otherwise make adding multimedia time consuming and frustrating for a user.
  • Additionally, a user is typically not able to easily edit multimedia for inclusion in an electronic communication session. For example, a user may wish to edit light contrast, color saturation, or some other characteristic in a digital photograph and then include the edited digital photograph in an electronic communication. Generally, in order to include edited multimedia in an electronic communication, a user must first edit the multimedia using software specifically dedicated to that purpose. Thus, a user is typically not able to edit multimedia without navigating away from the electronic communication session. This adds extra steps and hassle to the process of composing an electronic communication.
  • Thus, there are several disadvantages to current methods for including multimedia in an electronic message.
  • SUMMARY
  • One or more embodiments provide benefits and/or solve one or more of the foregoing and other problems in the art with methods and systems that provide enhanced functionality for electronic messaging systems. For example, methods and systems described herein allow users greater functionality for including multimedia content items in electronic messages. Furthermore, one or more embodiments can provide the foregoing or other benefits through an intuitive user interface.
  • In addition to the foregoing, systems and methods of one or more embodiments allow a user to select an existing multimedia content item for inclusion in an electronic message without navigating away from a communication thread. For example, in one or more embodiments, a user interface displays both a communication thread with electronic messages sent between co-users and a collection of stored multimedia content items. Thus, a user may browse and select a stored multimedia content item without navigating away from the communication thread.
  • Furthermore, systems and methods of one or more embodiments allow a user to easily edit multimedia content items for inclusion in an electronic message. For example, in one or more embodiments, a user may edit a digital photograph or video without having to utilize a separate piece of software. This provides the user with a more intuitive and streamlined way for including edited multimedia content items in an electronic message.
  • Additional features and advantages of the present invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of such exemplary embodiments. The features and advantages of such embodiments may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features will become more fully apparent from the following description and appended claims, or may be learned by the practice of such exemplary embodiments as set forth hereinafter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In order to describe the manner in which the above recited and other advantages and features can be obtained, a more particular description briefly described above will be rendered by reference to specific embodiments thereof that are illustrated in the appended drawings. It should be noted that the figures are not drawn to scale, and that elements of similar structure or function are generally represented by like reference numerals for illustrative purposes throughout the figures. Understanding that these drawings depict only typical embodiments and are not therefore to be considered to be limiting of scope, one or more embodiments of the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
  • FIG. 1 illustrates a schematic diagram of an electronic messaging system in accordance with one or more embodiments;
  • FIG. 2 illustrates a block diagram of an environment for implementing the system of FIG. 1 in accordance with one or more embodiments;
  • FIGS. 3A-3I illustrate user interfaces for selecting, editing, and sending a multimedia content item in accordance with one or more embodiments;
  • FIGS. 4A-4E illustrate user interfaces for selecting, editing, and sending a multimedia content item in accordance with one or more additional embodiments;
  • FIG. 5 illustrates a flowchart of a series of acts in a method of selecting and including multimedia content items in an electronic message in accordance with one or more embodiments;
  • FIG. 6 illustrates a flowchart of a series of acts in another method of selecting and including multimedia content items in an electronic message in accordance with one or more embodiments;
  • FIG. 7 illustrates a block diagram of an exemplary computing device in accordance with one or more embodiments; and
  • FIG. 8 is an example network environment of a social networking system in accordance with one or more embodiments.
  • DETAILED DESCRIPTION
  • One or more embodiments include an electronic messaging system that provides users with efficient and effective user experiences when sending electronic communications including multimedia content. More specifically, one or more embodiments described herein allow users to easily and intuitively select multimedia content for inclusion in an electronic communication. For example, one or more embodiments allow a user to select a multimedia content item for inclusion in an electronic message without navigating away from a communication thread.
  • In particular, one or more embodiments allow a user to browse and select a multimedia content item from a gallery of selectable multimedia content items without navigating away from the communication thread. For example, one or more embodiments display a graphical user interface that includes a communication thread with electronic messages sent between co-users, as well as a display area or gallery of selectable multimedia content items. Thus, one or more embodiments allow a user to receive and read messages while simultaneously browsing multimedia content items.
  • In one or more embodiments, the display area or gallery includes a preview of multimedia content items likely to be selected by the user. For example, the display area or gallery can include a predetermined number of the most recent multimedia content items. Alternatively, the display area or gallery can include multimedia content items related to a participant in a communication session, content items related to a topic of the session, content items most often sent as messages, or content items selected according to other criteria.
  • In addition to the foregoing, the electronic messaging system can modify a preview of multimedia content items to aid in viewing and selection of the content items. For example, the electronic messaging system can crop content items based on the size of the display area or gallery. Furthermore, the electronic messaging system can auto-play videos in the display area or gallery.
  • Furthermore, the electronic messaging system can allow a user to easily and effectively edit a multimedia content item for inclusion in a message. For example, the electronic messaging system can allow a user to start an editing process or edit a multimedia content item in the preview area or gallery without navigating away from the electronic messaging system. Thus, one or more embodiments allow a user to edit a multimedia content item without having to utilize separate software outside of the electronic messaging system.
  • In one or more embodiments, the electronic messaging system provides the preview area or gallery of multimedia content items below a communication thread. The electronic messaging system allows a user to horizontally scroll through the multimedia content items. Upon detecting a selection of a multimedia content item, the electronic messaging system can provide options to edit or send the multimedia content item as a message. If the edit option is selected, the electronic messaging system can make the multimedia content item available for editing.
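The interaction flow just described (scroll the gallery, select a preview, then choose edit or send) can be sketched as a small state object. Everything below, including the class and method names, is an illustrative assumption rather than part of the disclosed implementation:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class GalleryState:
    """Tracks which preview is selected and whether the edit/send
    options are currently shown to the user."""
    selected: Optional[str] = None
    options_visible: bool = False

    def select_item(self, item_id):
        # Selecting a preview surfaces the edit/send options.
        self.selected = item_id
        self.options_visible = True
        return ["edit", "send"]

    def choose(self, option):
        if not self.options_visible:
            raise RuntimeError("no multimedia content item is selected")
        self.options_visible = False
        if option == "send":
            return "sending {} as a message".format(self.selected)
        if option == "edit":
            return "opening {} for editing".format(self.selected)
        raise ValueError("unknown option: " + option)
```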
  • FIG. 1 illustrates an example embodiment of an electronic messaging system 100. As shown, the electronic messaging system 100 may include, but is not limited to, a user interface manager 102 (or simply “UI manager”), a user input detector 104, a content item manager 106, a communication manager 108, and a data storage 110. Each of the components 102-110 of the electronic messaging system 100 may be in communication with one another using any suitable communication technologies. Although the disclosure herein shows the components 102-110 to be separate in FIG. 1, any of the components 102-110 may be combined into fewer components, such as into a single facility or module, or divided into more components as may serve one or more embodiments. In addition, the components 102-110 may be located on, or implemented by, one or more computing devices, such as those described below in relation to FIG. 7. Alternatively, portions of the electronic messaging system 100 can be located on a computing device, while other portions of the electronic messaging system 100 are located on, or form part of, a social networking system, such as that described below in reference to FIG. 8.
  • The components 102-110 can comprise software, hardware, or both. For example, the components 102-110 can comprise one or more instructions stored on a computer readable storage medium and executable by a processor of one or more computing devices. When executed by the one or more processors, the computer-executable instructions of the electronic messaging system 100 can cause a computing device(s) to perform the methods described herein. Alternatively, the components 102-110 can comprise hardware, such as a special-purpose processing device to perform a certain function. Additionally or alternatively, the components 102-110 can comprise a combination of computer-executable instructions and hardware.
  • As mentioned above, and as shown in FIG. 1, the electronic messaging system 100 can include a user interface manager 102. The user interface manager 102 provides, manages, updates, and/or controls graphical user interfaces (or simply “user interfaces”) that allow a user to view and interact with display elements. For example, the user interface manager 102 may identify, display, update, or otherwise provide various user interfaces that contain one or more display elements in various layouts.
  • More specifically, the user interface manager 102 can display a variety of display elements within a graphical user interface. For example, the user interface manager 102 may display a graphical user interface on a display of a computing device. For instance, display elements include, but are not limited to: buttons, text boxes, menus, thumbnails, scroll bars, hyperlinks, etc. In one or more embodiments, the user interface manager 102 can display and format display elements in any one of a variety of layouts.
  • Furthermore, the user interface manager 102 can also update, remove, resize, or reposition display elements in response to user interactions. For example, as will be described in more detail below, the electronic messaging system 100 may detect user input in a variety of ways. For instance, in one or more embodiments, the detected user input may cause the user interface manager 102 to update a graphical user interface based on the detected input. Similarly, in one or more embodiments, the detected user input may cause the user interface manager 102 to resize one or more display elements, to reposition one or more display elements within the graphical user interface, or to otherwise change or remove one or more display elements within the graphical user interface.
  • Additionally, the user interface manager 102 can selectively update certain areas of a user interface in response to user interactions. For example, in one or more embodiments, detected user input may cause the user interface manager 102 to update or change only one area of a graphical user interface. In one or more embodiments, upon a detected user interaction, the user interface manager 102 may update one area within a user interface from one type of display to a second type of display, while continuing to display another area within the user interface with no updates.
  • Along similar lines, the user interface manager 102 can reorganize a user interface in response to user interactions. For example, in one or more embodiments, detected user input may cause the user interface manager 102 to split a graphical user interface into two or more areas. In one or more embodiments, upon a detected user interaction, the user interface manager 102 may reorganize a user interface from only displaying one area with a first collection of display elements to displaying two areas with the first collection of display elements in the first area and a second collection of display elements in the second area. Likewise, in one or more embodiments, the user interface manager 102 may also consolidate or remove areas within a graphical user interface in response to detected user interactions.
  • As mentioned above, and as illustrated in FIG. 1, the electronic messaging system 100 may further include a user input detector 104. The user input detector 104 detects, receives, and/or facilitates user input in any suitable manner. In some examples, the user input detector 104 detects one or more user interactions. As referred to herein, a "user interaction" means a single input or combination of inputs received from a user by way of one or more input devices, or via one or more touch gestures as described above. A user interaction can have variable duration and may take place anywhere on the graphical user interface managed by the user interface manager 102 described above.
  • For example, the user input detector 104 can detect a user interaction from a keyboard, mouse, touch screen display, or any other input device. In the event a touch screen display is utilized, the user input detector 104 can detect one or more touch gestures that form a user interaction (e.g., tap gestures, swipe gestures, pinch gestures, etc.) provided by a user by way of the touch screen. In some examples, the user input detector 104 can detect touch gestures in relation to and/or directed at one or more display elements displayed as part of the graphical user interface presented on the touch screen display. In one or more embodiments, the user input detector 104 may report any detected touch gesture in relation to and/or directed at one or more display elements to user interface manager 102.
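A tap, swipe, or long-press can be distinguished from raw touch coordinates and timing alone. The following sketch uses illustrative thresholds (10 px of movement, 500 ms hold) that are assumptions; real input systems tune these per device:

```python
def classify_gesture(start, end, elapsed_ms, move_threshold=10.0, hold_ms=500):
    """Classify a touch that went down at `start` and up at `end`
    (screen coordinates, y grows downward) after `elapsed_ms`."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    distance = (dx * dx + dy * dy) ** 0.5

    # Little movement: a tap, or a long-press if held long enough.
    if distance < move_threshold:
        return "long-press" if elapsed_ms >= hold_ms else "tap"

    # Otherwise a swipe; report the dominant axis of movement.
    if abs(dx) >= abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"


print(classify_gesture((0, 0), (50, 5), 200))  # swipe-right
```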
  • The user input detector 104 may additionally, or alternatively, receive data representative of a user interaction. For example, the user input detector 104 may receive one or more user configurable parameters from a user, one or more user commands from the user, and/or any other suitable user input. For example, the user input detector 104 can receive voice commands or otherwise sense, detect, or receive user input.
  • As mentioned above, and as illustrated in FIG. 1, the electronic messaging system 100 may further include a content item manager 106. The content item manager 106 manages multimedia content items (such as digital files), tracks recent multimedia content items, creates previews of multimedia content items, manages edits to multimedia content items, and otherwise handles all actions affecting multimedia content items. For example, in one or more embodiments, the content item manager 106 determines the most recent content items stored on a system, creates or retrieves a preview of each of the recent content items, and presents the previews of the recent content items as part of a user interface.
  • For example, as will be described in more detail below, the electronic messaging system 100 may be implemented on a computing device with data storage. In one or more embodiments, the content item manager 106 searches the data storage of the computing device for multimedia content items suitable for inclusion in a communication session. For example, in one or more embodiments, the content item manager 106 may search the data storage of the computing device for digital photographs, digital videos, and/or sound recordings. In one or more embodiments, the content item manager 106 may create a linked list linking to the multimedia files stored in the data storage of the computing device. Alternatively or additionally, the content item manager 106 may create a copy of the multimedia content items stored in the data storage of the computing device.
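A search of the device's data storage for suitable multimedia files might look like the following sketch. The extension list is an assumption, and a real implementation would more likely query the platform's media database than walk the filesystem:

```python
import os

# Illustrative set of photo, video, and audio extensions.
MULTIMEDIA_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif",
                         ".mp4", ".mov", ".m4a", ".mp3"}


def find_multimedia(root):
    """Walk `root` and collect paths to multimedia files suitable
    for inclusion in a communication session."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            extension = os.path.splitext(name)[1].lower()
            if extension in MULTIMEDIA_EXTENSIONS:
                found.append(os.path.join(dirpath, name))
    return found
```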
  • Once the content item manager 106 has identified the multimedia content items stored on the computing device, the content item manager 106 may create or retrieve a preview of each multimedia content item. For example, in one or more embodiments, the content item manager 106 may create a preview of a digital photograph by cropping the multimedia content item based on an aspect ratio of a user interface. For example, in one or more embodiments, the content item manager 106 may crop a rectangular preview of a digital photograph and/or digital video such that the preview is square. Additionally, in one or more embodiments, the content item manager 106 may create a preview of a digital video that includes a portion of the digital video that automatically plays within a portion of the user interface.
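The square-crop step reduces to computing a centered square region from the item's pixel dimensions. A minimal sketch, assuming the crop is centered (the disclosure does not specify where within the frame the crop is taken):

```python
def square_crop_box(width, height):
    """Return a centered square crop for a rectangular photo or video
    frame, as (left, top, right, bottom) pixel coordinates."""
    side = min(width, height)          # largest square that fits
    left = (width - side) // 2         # center horizontally
    top = (height - side) // 2         # center vertically
    return (left, top, left + side, top + side)


print(square_crop_box(1920, 1080))  # (420, 0, 1500, 1080)
```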
  • In one or more embodiments, the content item manager 106 may search files stored on the computing device only for the most recent multimedia content items. For example, in one or more embodiments, the content item manager 106 may identify the ten most recent multimedia content items stored on the computing device. Alternatively, the content item manager 106 may identify a percentage of the most recent multimedia content items stored on the computing device. In one or more embodiments, the content item manager 106 may determine the recentness of a multimedia content item by identifying a timestamp associated with the multimedia content item that indicates when the multimedia content item was created or added to the computing device. The content item manager 106 may then compare the identified timestamp to timestamps of other multimedia content items. In one or more alternative embodiments, the number or percentage of recent multimedia content items identified by the content item manager 106 may be a number configurable by the user or can be a set predetermined number.
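  • The timestamp comparison described above can be sketched as a simple sort-and-slice; this is one assumed implementation, and the `timestamp` field name is illustrative rather than drawn from the disclosure:

```python
def most_recent_items(items, limit=10):
    """Return the `limit` most recently stored items, newest first.
    Each item is assumed to carry a numeric `timestamp` recording when
    it was created or added to the device."""
    return sorted(items, key=lambda item: item["timestamp"], reverse=True)[:limit]
```

The `limit` argument corresponds to the configurable or predetermined number of recent items the paragraph above describes.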
  • Additionally, in one or more embodiments, the content item manager 106 may manage editing of multimedia content items. For example, in one or more embodiments, the content item manager 106 may create a copy of a multimedia content item and present the copy for editing. In one or more embodiments, the content item manager 106 may track edits made to the copy of the multimedia content item and provide the edited copy for sending to one or more co-users. For instance, after a user selects a preview from the user interface, the content item manager 106 may create a copy of the multimedia content item associated with the preview and present the copy of the multimedia content item to the user for editing. In one or more embodiments, the content item manager 106 may track edits to the copy of the multimedia content item and provide the edited copy of the multimedia content item for sending to one or more co-users as an electronic communication.
  • In one or more alternative embodiments, the content item manager 106 may provide the original multimedia content item for editing, rather than a copy of the multimedia content item, as discussed above. For example, in one or more embodiments, after a user selects a preview from the user interface, the content item manager 106 may present the multimedia content item associated with the preview to the user for editing. In that embodiment, the content item manager 106 may track edits to the multimedia content item and provide the edited multimedia content item for sending to one or more co-users as an electronic communication. Accordingly, in some embodiments, the content item manager 106 stores edits made to the multimedia content item within the data storage of a computing device, such that the edited multimedia content item is available to other applications on the computing device. In one or more alternative embodiments, as described above, the content item manager 106 discards edits made to a copy of the multimedia content item once the copy of the multimedia content item has been provided for sending to one or more co-users of the communication system.
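  • The two editing variants above (edit a discardable copy vs. persist edits back to the original) can be captured in a single hypothetical sketch; the class and flag names are assumptions for illustration, not from the patent:

```python
import copy

class ContentItemEditor:
    """Sketch of the copy-and-edit flow: edits are applied to a working
    copy, and on finalize() the copy is either returned for sending and
    discarded, or written back to the original (persistent edits)."""

    def __init__(self, original, persist_edits=False):
        self.original = original
        self.persist_edits = persist_edits
        self.working_copy = copy.deepcopy(original)  # original untouched during editing

    def apply_edit(self, key, value):
        self.working_copy[key] = value

    def finalize(self):
        """Return the edited item for sending; optionally persist edits."""
        if self.persist_edits:
            self.original.update(self.working_copy)
        return self.working_copy
```

With `persist_edits=False` the original item is unchanged after sending, matching the discard behavior; with `persist_edits=True` the edits become visible to other applications reading the same stored item.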
  • The content item manager 106 may enable a variety of edits to be performed in connection with either a multimedia content item, or a copy of a multimedia content item as described above. For example, with regard to digital photographs and/or digital videos, the content item manager 106 may allow for edits such as altering the color contrast, altering brightness, altering sharpness, altering clarity, converting color (e.g., color to sepia, color to black and white), adding text or image overlays, adding blended image effects, or any other type of edit suitable to be made to a digital photograph and/or digital video. Furthermore, with regard to digital videos, the content item manager 106 may allow for edits such as adding scene transitions, adding front or end credits, shortening or lengthening the runtime of the video, adding a sound track, or any other type of edit suitable to be made in connection with a digital video. Additionally, with regard to sound recordings, the content item manager 106 may allow for edits such as altering volume, adding reverb, adding sound effects, concatenating additional recordings, or any other type of edit suitable to be made in connection with a sound recording.
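  • As an assumed illustration of one of the simpler photographic edits listed above, a brightness adjustment can be expressed as a per-channel shift clamped to the valid pixel range; the patent does not specify any such implementation:

```python
def adjust_brightness(pixels, delta):
    """Apply a simple brightness edit: shift each 8-bit channel value by
    `delta`, clamped to the 0-255 range. One of many possible edit
    operations a content item manager could support."""
    return [max(0, min(255, p + delta)) for p in pixels]
```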
  • As mentioned above, and as illustrated in FIG. 1, the electronic messaging system 100 may further include a communication manager 108. The communication manager 108 can facilitate receiving and sending data to and from the electronic messaging system 100, or a device upon which the electronic messaging system 100 is implemented. In particular, the communication manager 108 can instruct or activate one or more communication interfaces of a computing device, as described below to send or receive data, particularly data related to electronic communications. Furthermore, the communication manager 108 can package or format content items to be sent or received from the electronic messaging system 100 in any necessary form that is able to be sent through one or more communication channels and using an appropriate communication protocol, as described further below with reference to FIG. 7.
  • As discussed above, the electronic messaging system 100 can include a data storage 110, as illustrated in FIG. 1. The data storage 110 may maintain content item data 112 representative of data associated with multimedia content items available for inclusion in an electronic communication. For example, the content item data 112 may include, but is not limited to: digital photographs, digital videos, recordings of sound inputs, as well as other data representing edits to the digital photographs, digital videos, and/or recordings of sound inputs.
  • FIG. 2 is a schematic diagram illustrating an example system 200, within which one or more embodiments of the electronic messaging system 100 can be implemented. As illustrated in FIG. 2, the system 200 can include computing devices 202, 204, a network 206, and a communication server 208. The computing devices 202, 204, the network 206, and the communication server 208 may be communicatively coupled, as shown in FIG. 2. Although FIG. 2 illustrates a particular arrangement of the computing devices 202, 204, the network 206, and the communication server 208, various additional arrangements are possible. For example, the computing devices 202, 204 may directly communicate with the communication server 208, bypassing the network 206, or alternatively, may directly communicate with each other.
  • The computing devices 202, 204, the network 206, and the communication server 208 may communicate using any communication platforms and technologies suitable for transporting data and/or communication signals. For example, the computing devices 202, 204, the network 206, and the communication server 208 may communicate via any known communication technologies, devices, media, and protocols supportive of remote data communications, examples of which will be described in more detail below with respect to FIG. 7. In addition, in certain embodiments, the computing devices 202, 204, and the communication server 208 may communicate via the network 206, which may include one or more social networks as described further below with respect to FIG. 8.
  • The communication server 208 may generate, store, receive, and transmit electronic communication data. For example, the communication server 208 may receive an electronic communication from the computing device 202 and send the received electronic communication to the computing device 204. In particular, the communication server 208 can transmit electronic messages between one or more users of the system 200. The communication server 208 can receive a wide range of electronic communication types, including but not limited to, text messages, instant messages, social-networking messages, social-networking posts, emails, and any other form of electronic communication. Additional details regarding the communication server 208 will be discussed below with respect to FIG. 7.
  • The network 206 may represent a network or collection of networks (such as the Internet, a corporate intranet, a virtual private network (VPN), a local area network (LAN), a wireless local network (WLAN), a cellular network, a wide area network (WAN), a metropolitan area network (MAN), or a combination of two or more such networks). Thus, the network 206 may be any suitable network over which the computing device 202 may access the communication server 208 and/or the computing device 204, or vice versa. The network 206 will be discussed in more detail below with regard to FIGS. 7 and 8.
  • In addition to the system and network elements of the system 200, FIG. 2 illustrates that a user 210 can be associated with the computing device 202, and that a user 212 can be associated with the computing device 204. Although FIG. 2 illustrates only two users 210, 212, the system 200 can include a large number of users, with each of the users interacting with the system 200 through one or more computing devices. For example, the user 210 can interact with the computing device 202 for the purpose of composing and sending an electronic communication (e.g., an instant message). The user 210 may interact with the computing device 202 by way of a user interface, managed by the user interface manager 102, on the computing device 202. For example, the user 210 can utilize the user interface to cause the computing device 202 to compose and send an electronic communication to one or more of the plurality of users of the system 200.
  • In one or more embodiments, the components 102-110, as described with regard to FIG. 1, may be implemented on one or more of the computing devices 202, 204 and the communication server 208. For example, the computing devices 202, 204, and the communication server 208 may communicate across the network 206 via the communication manager 108 of the electronic messaging system 100. In one or more embodiments, the computing devices 202, 204 may receive user inputs via the user input detector 104. Likewise, in one or more embodiments, the computing devices 202, 204 may provide graphical user interfaces via the user interface manager 102. Furthermore, in one or more embodiments each of the computing devices 202, 204 can include an instance of the electronic messaging system 100.
  • As will be described in more detail below, each of the components 102-110 of the electronic messaging system 100 as described with regard to FIGS. 1 and 2, can provide, alone and/or in combination with the other components of the electronic messaging system 100, one or more graphical user interfaces. In particular, the components 102-110 can allow a user to interact with a collection of display elements for a variety of purposes. In particular, FIGS. 3A-4E and the description that follows illustrate various example embodiments of the user interfaces and features that are in accordance with general principles as described above.
  • In some examples, a computing device (i.e., computing device 202, 204 of FIG. 2) can implement part or all of the electronic messaging system 100. For example, FIG. 3A illustrates a computing device 300 that may implement one or more of the components 102-110 of the electronic messaging system 100. As illustrated in FIG. 3A, the computing device 300 is a handheld device, such as a mobile phone device (e.g., a smartphone). As used herein, the term “handheld device” refers to a device sized and configured to be held/operated in a single hand of a user. In additional or alternative examples, however, any other suitable computing device, such as, but not limited to, a tablet device, a handheld device, larger wireless devices, a laptop or desktop computer, a personal-digital assistant device, and/or any other suitable computing device can perform one or more of the processes and/or operations described herein.
  • The computing device 300 can include any of the features and components described below in reference to a computing device 700 of FIG. 7. As illustrated in FIG. 3A, the computing device 300 includes a touch screen display 302 that can display or provide user interfaces and by way of which user input may be received and/or detected. As used herein, a “touch screen display” refers to the display of a touch screen device. In one or more embodiments, a touch screen device may be a computing device 202, 204 with at least one surface upon which a user 210, 212 may perform touch gestures (e.g., a laptop, a tablet computer, a personal digital assistant, a media player, a mobile phone). Additionally or alternatively, the computing device 300 may include any other suitable input device, such as a touch pad or those described below in reference to FIG. 7.
  • FIG. 3A illustrates a touch screen display 302 of the computing device 300 displaying one embodiment of a graphical user interface, in particular a messaging graphical user interface 304. For example, the user interface manager 102 provides various display areas and display elements as part of the messaging graphical user interface 304. In one or more embodiments, the messaging graphical user interface 304 includes a communication thread 306, as well as a message input control palette or toolbar 310.
  • As described above, the communication manager 108 can facilitate receiving and sending data. In one or more embodiments, the communication manager 108 facilitates receiving and sending of electronic communications between the computing devices 202, 204. Also in one or more embodiments, the user interface manager 102 displays electronic communications sent and received via the communication manager 108. In one or more embodiments, the user interface manager 102 can display electronic communications sent and received via the communication manager 108 in the communication thread 306 within the messaging graphical user interface 304.
  • For example, as illustrated in FIG. 3A, the user interface manager 102 provides the communication thread 306 that includes electronic messages 308 a sent from an account of a user of the computing device 300. Similarly, the communication thread 306 can include electronic messages 308 b received by the account of the user of the computing device 300. In one or more embodiments, the user interface manager 102 organizes the communication thread 306 such that new messages are added to the bottom of the communication thread 306 so that older messages are displayed at the top of the communication thread 306. In alternative embodiments, the user interface manager 102 may organize the messages 308 a, 308 b in any manner that may indicate to a user the chronological or other relationship between the messages 308 a, 308 b.
  • The user interface manager 102 provides a variety of electronic communication characteristics to help a user distinguish between electronic communications in the communication thread 306. For example, as illustrated in FIG. 3A, the user interface manager 102 displays the electronic messages 308 a sent from an account of the user of the computing device 300 pointed toward one side (i.e., the right side) of the messaging graphical user interface 304. On the other hand, the user interface manager 102 displays the electronic messages 308 b received by the communication manager 108 pointed toward the opposite side (i.e., the left side) of the messaging graphical user interface 304. In one or more embodiments, the positioning and orientation of the electronic messages 308 a, 308 b provides a clear indicator to a user of the computing device 300 of the origin of the various electronic communications displayed within the messaging graphical user interface 304.
  • Another characteristic provided by the user interface manager 102 that helps a user distinguish electronic communications may be a color of the electronic communications. For example, as shown in FIG. 3A, the user interface manager 102 displays sent electronic messages 308 a in a first color and received electronic messages 308 b in a second color. In one or more embodiments, the first and second colors may be black and white, respectively, with an inverted typeface color. In an alternative embodiment, the user interface manager 102 may display the electronic messages 308 a, 308 b with white backgrounds and different colored outlines.
  • In yet another alternative embodiment, the user interface manager 102 may display the electronic messages 308 a, 308 b with backgrounds of different patterns, in different fonts, in different sizes or in any other manner that may distinguish the sent electronic messages 308 a from the received electronic messages 308 b. For example, in one or more embodiments, the user interface manager 102 displays sent electronic messages 308 a with white typeface on a blue background. Likewise, in one or more embodiments, the user interface manager 102 displays received electronic messages 308 b with black typeface on a grey background.
  • As mentioned above, the user interface manager 102 may also provide a message input control palette or toolbar 310. As illustrated in FIG. 3A, the user interface manager 102 displays the message input control palette or toolbar 310 as part of the messaging graphical user interface 304. In one or more embodiments, the message input control palette or toolbar 310 includes a variety of selectable message input controls that provide a user with various message input options or other options. For example, in FIG. 3A, the message input control palette or toolbar 310 includes a text input control 312 a, a photo or video input control 312 b, a multimedia input control 312 c, a symbol input control 312 d, and a sound input control 312 e. In one or more alternative embodiments, the message input control palette or toolbar 310 may provide the input controls 312 a-312 e in a different order, may provide other input controls not displayed in FIG. 3A, or may omit one or more of the input controls 312 a-312 e shown in FIG. 3A.
  • As will be described below in greater detail, a user may interact with any of the input controls 312 a-312 e in order to compose and send different types of electronic communications. For example, if a user interacts with the text input control 312 a, the user interface manager 102 may provide a touch screen display keyboard in a portion of the messaging graphical user interface 304 that the user may utilize to compose a textual message. Similarly, if a user interacts with the photo input control 312 b, the user interface manager 102 may provide a camera viewfinder interface within a portion of the messaging graphical user interface 304 that the user may utilize to add a photo to the communication thread 306. Furthermore, if a user interacts with the sound input control 312 e, the user interface manager 102 may provide a sound recording control by way of which the user can record a voice or other sound message. Likewise, as will be described in more detail below, if a user interacts with the multimedia input control 312 c, the user interface manager 102 may provide a multimedia content item display area with multimedia content items that the user can select to send as a message.
  • A user may interact with any of the message input controls 312 a-e in order to compose and send a message to one or more co-users via the electronic messaging system 100. For example, in FIG. 3B, a finger 314 of a user's hand is shown interacting with the multimedia input control 312 c. In one or more embodiments, the user input detector 104 can detect interactions (e.g., a tap touch gesture) of the finger 314 with the multimedia input control 312 c. In one or more embodiments, the user interface manager 102 may display an input control indicator 320 to indicate which input control 312 a-e is currently active. Additionally, as shown in FIG. 3B, upon the user input detector 104 detecting a tap touch gesture on the multimedia input control 312 c, the user interface manager 102 can display a multimedia content item display area 316 containing one or more multimedia content item previews 318 a, 318 b.
  • In particular, as illustrated by FIG. 3F, the user interface manager 102 can provide the communication thread 306 in a first portion (i.e., the upper portion) of the messaging user interface 304. The user interface manager 102 can provide the multimedia content item display area 316 in a second portion (i.e., the lower portion) of the messaging user interface 304. Thus, the user interface manager 102 can allow the user to both view the communication thread 306 and any new messages, while also being able to view and browse content items. In alternative embodiments the user interface manager 102 can arrange the communication thread 306 and the multimedia content item display area 316 horizontally or in another arrangement other than a vertical arrangement.
  • As discussed above, in one or more embodiments, the content item manager 106 provides the multimedia content item previews 318 a, 318 b based on multimedia content items stored on the computing device 300. For example, as shown in FIG. 3B, the content item manager 106 may provide the multimedia content item preview 318 a based on a digital photograph stored on the computing device 300. In one or more embodiments, the content item manager 106 provides the multimedia content item preview 318 a in the multimedia content item display area 316 based on the recentness of the multimedia content item associated with the content item preview 318 a. In other words, in one or more embodiments, the multimedia content item associated with the content item preview 318 a is the most recently stored multimedia content item on the computing device 300 or the most recently captured or created multimedia content item on the computing device 300.
  • In one or more embodiments, the content item manager 106 may crop the multimedia content item to create the content item preview 318 a. For example, the multimedia content item associated with the multimedia content item preview 318 a may be a rectangular digital photograph stored on the computing device 300. Accordingly, in one or more embodiments, the content item manager 106 may crop the rectangular digital photograph so that the content item preview 318 a is square and sized for presentation in the multimedia content item display area 316. Additionally, in one or more embodiments, the content item manager 106 may tailor the cropped content item preview 318 a to an aspect ratio of the multimedia content item display area 316. In other words, in one or more embodiments, the content item manager 106 may tailor the cropped content item preview 318 a such that it is square within the multimedia content item display area 316 even if the multimedia content item display area 316 takes up a larger portion of the messaging graphical user interface 304 than is shown in FIG. 3B.
  • By cropping the multimedia content items, the content item manager 106 can help ensure that more than one content item preview 318 a, 318 b can be shown in the multimedia content item display area 316. Additionally, by cropping the multimedia content items, the content item manager 106 can help ensure that the content items can fit within the communication thread 306. Still another benefit of cropping the multimedia content items is to reduce a file size of the multimedia content items to enable quicker sending, receiving, and displaying of multimedia content items.
  • In one or more embodiments, as shown in FIG. 3B, the content item manager 106 may provide the multimedia content item preview 318 b based on a digital video stored on the computing device 300. For example, in one or more embodiments, the content item manager 106 may provide a preview of a digital video that includes at least a portion of the digital video that auto plays within the multimedia content item display area 316. For instance, as illustrated in FIG. 3B, the multimedia content item preview 318 b may be a portion of a digital video that auto plays within the multimedia content item display area 316. In one or more embodiments, the portion of the digital video that auto plays may only be a few seconds long. In one or more alternative embodiments, the portion of the digital video that auto plays may be a percentage of the total length of the digital video (e.g., a preview consisting of 10% of a 60 second digital video may be 6 seconds long). Alternatively, the entire digital video can auto play within the multimedia content item display area 316.
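  • The percentage-based preview length above is simple arithmetic; a hedged sketch, with an optional fixed-length cap that the patent does not mention but that reconciles the "few seconds long" variant (all names are illustrative):

```python
def preview_duration(video_seconds, fraction=0.10, max_seconds=None):
    """Compute the auto-play preview length as a fraction of the total
    video length (e.g., 10% of a 60-second video is 6 seconds),
    optionally capped at a fixed maximum number of seconds."""
    length = video_seconds * fraction
    if max_seconds is not None:
        length = min(length, max_seconds)
    return length
```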
  • As with the multimedia content item preview 318 a discussed above, the content item manager 106 may crop the multimedia content item preview 318 b such that the preview 318 b is square and sized for presentation in the multimedia content item display area 316. In one or more embodiments, the content item manager 106 may crop and tailor multimedia content item previews 318 a, 318 b such that the previews 318 a, 318 b are the same size within the multimedia content item display area 316, even though the multimedia content items associated with the previews 318 a, 318 b are of differing sizes.
  • As shown, the user interface manager 102 may display the multimedia content item display area 316 such that the display area 316 is horizontally scrollable within the messaging graphical user interface 304. For example, as shown in FIG. 3C, the user input detector 104 may detect a user interaction of the finger 314 of a user's hand within the multimedia content item display area 316. For instance, the detected user interaction may be a swipe touch gesture of the finger 314 moving across the touch screen display 302 along the direction of the arrow 322. In one or more embodiments, in response to the detected touch gesture, the user interface manager 102 may update the multimedia content item display area 316 to appear to be scrolling along the direction of the arrow 322. In a preferred embodiment, the multimedia content item preview 318 b will continue to auto play through the horizontal scroll.
  • As discussed above, in one or more embodiments, the content item manager 106 may provide content item previews of multimedia content items likely to be selected by the user. For example, the content item manager 106 may provide content item previews of the most recent multimedia content items stored on the computing device 300. For example, in one embodiment, the content item manager 106 may provide content item previews of any multimedia content items stored on the computing device 300 within the last 24 hours. In an alternative embodiment, the content item manager 106 may provide content item previews of a predetermined number of content items (e.g., 10, 20, 30) stored on the computing device 300.
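  • The 24-hour variant above is a time-window filter rather than a fixed count; one assumed implementation (field names are illustrative, not from the disclosure) is:

```python
import time

def items_within_window(items, window_seconds=24 * 3600, now=None):
    """Filter to multimedia items stored within the last `window_seconds`
    (e.g., the last 24 hours). Each item is assumed to carry a numeric
    `timestamp` recording when it was stored on the device."""
    now = time.time() if now is None else now
    return [item for item in items if now - item["timestamp"] <= window_seconds]
```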
  • In yet additional alternative embodiments, the content item manager 106 may provide content item previews based on other criteria besides recentness. For example, the electronic messaging system 100 can provide content item previews of multimedia content items that include participants in the conversation of the communication thread 306. Alternatively, the content item manager 106 may provide content item previews of multimedia items that were created or “taken” at a particular location (e.g., all the pictures and videos taken at the lake house). In still further embodiments, the content item manager 106 may provide content item previews related in some way to content of the messages in the communication thread 306.
  • In one or more embodiments, the user interface manager 102 may update the messaging graphical user interface 304 to include a camera roll in response to a detected over-scroll within the multimedia content item display area 316. For example, as shown in FIG. 3D, the content item manager 106 may have provided only the multimedia content item previews 318 a and 318 b for display in the multimedia content item display area 316. Accordingly, the multimedia content item preview 318 b is the last preview displayed in the multimedia content item display area 316. In one or more embodiments, the user interface manager 102 may indicate the multimedia content item preview 318 b is the last preview in the multimedia content item display area 316 by displaying a blank area 324 to the right of the multimedia content item preview 318 b. Additionally or alternatively, the user interface manager 102 may indicate the multimedia content item preview 318 b is the last preview by providing an automatic stop to a scrolling motion once the content preview 318 b is reached.
  • In one or more embodiments, the user input detector 104 may detect a swipe gesture of the user's finger 314 across the multimedia content item display area 316 along the direction of the arrow 322. In response to the detected swipe gesture, in one or more embodiments, the user interface manager 102 may sequentially display the content item previews. In one or more embodiments, the user interface manager 102 may provide a camera roll of the computing device in response to a detected over-scroll. As illustrated in FIG. 3D, an over-scroll occurs when a user continues to horizontally scroll the multimedia content item display area 316 along the direction of arrow 322 beyond the last content item preview 318 b. In response to the over-scroll, the user interface manager 102 updates the messaging graphical user interface 304 to include a camera roll of the computing device 300.
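  • The over-scroll behavior described above can be sketched as a threshold check on the scroll offset; the patent does not disclose such an implementation, and the threshold value and names below are assumptions for illustration:

```python
def handle_scroll(offset, content_width, viewport_width, overscroll_threshold=40):
    """Hypothetical over-scroll check: once the user scrolls past the end
    of the preview strip by more than a threshold (in pixels), signal that
    the camera roll should be shown (as in FIGS. 3D-3E)."""
    max_offset = max(0, content_width - viewport_width)  # farthest normal scroll position
    if offset > max_offset + overscroll_threshold:
        return "show_camera_roll"
    return "scroll"
```

A scroll that stops within the threshold past the last preview simply snaps back as a normal scroll; only a deliberate pull past the end triggers the camera roll transition.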
  • For example, as illustrated in FIG. 3E, upon the detected over-scroll, the user interface manager 102 displays a camera roll 326. In one or more embodiments, the camera roll 326 provides a content item preview 318′ for each multimedia content item stored on the computing device 300. In one or more embodiments, the content item manager 106 may crop and tailor each camera roll preview 318′ in the same manner as described with regard to the multimedia content item previews 318 a and 318 b above.
  • Additionally, in one or more embodiments, if any of the camera roll previews 318′ is associated with a digital video multimedia content item, the content item manager 106 may provide a portion of the digital video that auto plays from within the camera roll 326. Accordingly, in one or more embodiments, the user interface manager 102 may display the camera roll 326 such that some of the camera roll content item previews 318′ are still images (i.e., based on digital photograph multimedia items) and some of the camera roll content item previews 318′ are auto playing video clips (i.e., based on digital video multimedia items). In one or more alternative embodiments, the content item manager 106 may not provide any specialized content item previews for the camera roll. Rather, in an alternative embodiment, the user interface manager 102 may simply display a minimized version of each digital picture and/or a minimized version of the first frame of each digital video within the camera roll 326.
  • As shown in FIG. 3E, in one or more embodiments the camera roll 326 can occupy the entire touch screen display 302. In other words, the display of the camera roll 326 can cause a navigation away from the communication thread 306. In alternative embodiments, the user interface manager 102 can provide the camera roll 326 within the area of the messaging user interface 304 previously occupied by the content item display area 316. In such embodiments, the user can access the camera roll 326 without navigating away from the communication thread 306.
  • In one or more embodiments, a user may select a content item preview either from the camera roll 326 (i.e., one of the camera roll previews 318′ of FIG. 3E) or from the multimedia content item display area 316, as shown in FIG. 3F. For example, as illustrated in FIG. 3F, the user input detector 104 may detect a user selection of the multimedia content item preview 318 c from the multimedia content item display area 316. For instance, the user input detector 104 may detect a tap touch gesture of the finger 314 of the user with the content item preview 318 c. In one or more alternative embodiments, the user input detector 104 may detect other types of user input such as a spoken command, an upward swipe touch gesture, or any other suitable type of user input.
  • Upon detection of a selection of a content item preview 318 c, the communication manager 108 may send the multimedia content item associated with the content item preview 318 c to one or more co-users. For example, the communication manager 108 may send the multimedia content item to the communication server 208, which may then forward the multimedia content item to one or more computing devices 204 associated with one or more intended recipients. In that case, the user interface manager 102 may also add the multimedia content item to the communication thread 306.
  • Alternatively, in response to selection of the multimedia content item preview 318 c, the user interface manager 102 may alter the display of the content item preview 318 c so as to indicate the selection of the content item preview 318 c. For example, in response to the selection of the multimedia content item preview 318 c, the user interface manager 102 may present a blurred version of the content item preview 318 c′, as shown in FIG. 3G. In one or more alternative embodiments, the user interface manager 102 may alter the selected multimedia content item preview 318 c in other ways in order to indicate the selection. For example, in one or more alternative embodiments, the user interface manager 102 may alter the color scheme of the selected content item preview 318 c, may black out the selected preview 318 c, or may alter the selected content item preview 318 c in any other way suitable for this purpose. Furthermore, in one or more embodiments the selected content item preview 318 c can also be enlarged or zoomed in.
  • Additionally, in response to a selection of the multimedia content item preview 318 c, the user interface manager 102 may also present one or more controls overlaid on the selected preview 318 c′. For example, as illustrated in FIG. 3G, in response to the selection of the preview 318 c, the user interface manager 102 may overlay an edit control 328 and a send control 330 on the selected multimedia content item preview 318 c′. In one or more embodiments, the user interface manager 102 may overlay other or additional controls on the selected preview 318 c′. For instance, in one embodiment, the user interface manager 102 may also overlay a delete control over the selected preview 318 c′.
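The selection behavior described in the preceding paragraphs (blurring the selected preview and overlaying edit and send controls) might be modeled as a small state transition. The state shape and the control identifiers below are illustrative assumptions, not the disclosed implementation.

```typescript
interface SelectionState {
  selectedId: string | null;
  blurred: boolean;
  overlaidControls: string[]; // e.g. "edit", "send", optionally "delete"
}

// Selecting a preview blurs it and overlays the edit and send controls.
function selectPreview(previewId: string): SelectionState {
  return { selectedId: previewId, blurred: true, overlaidControls: ["edit", "send"] };
}

// Tapping anywhere else on the messaging interface clears the selection.
function deselect(): SelectionState {
  return { selectedId: null, blurred: false, overlaidControls: [] };
}
```

An embodiment with a delete control, as mentioned above, would simply add `"delete"` to the overlaid control list.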
  • In response to a detected selection of a delete control, the content item manager 106 may remove the preview 318 c from the multimedia content item display area 316. Additionally or alternatively, the content item manager 106 may permanently delete the multimedia content item associated with the preview 318 c from the computing device 300. In one or more embodiments, the content item manager 106 may be configurable by a user in order to specify the actions taken with regard to the controls including the edit control 328, the send control 330, and the delete control.
  • In response to a detected selection of the send control 330, the communication manager 108 may send the multimedia content item associated with the selected preview 318 c′ to one or more co-users. Additionally, in one or more embodiments, the user interface manager 102 may also add the multimedia content item associated with the selected preview 318 c′ to the communication thread 306. In one or more embodiments, a user may unselect the selected preview 318 c′ simply by tapping anywhere else on the messaging graphical user interface 304.
  • The content item manager 106 may provide the multimedia content item associated with the multimedia content item preview 318 c′ for editing in response to a detected selection of the edit control 328. For example, as shown in FIG. 3G, the user input detector 104 may detect a user interaction (e.g., tap touch gesture) of a user's finger 314 with the edit control 328. As shown in FIG. 3H, the user interface manager 102 may present the multimedia content item 319 associated with the selected preview 318 c′ for editing in response to the detected user interaction. In one or more embodiments, the user interface manager 102 may also present a variety of editing controls within the messaging graphical user interface 304. For example, the user interface manager 102 may display a crop editing control 328 a (i.e., allows a user to crop the multimedia content item 319), an auto-edit control 328 b (i.e., performs preconfigured edits to the multimedia content item 319), a color edit control 328 c (i.e., allows a user to edit the colors of the multimedia content item 319), and a writing edit control 328 d (i.e., allows a user to add text or other drawn effects to the multimedia content item 319). In one or more alternative embodiments, the user interface manager 102 may display other or additional edit controls while the multimedia content item 319 associated with the selected preview 318 c′ is available for editing.
  • In one or more embodiments, the content item manager 106 may allow edits to a copy of a multimedia content item 319, rather than allowing edits to the original multimedia content item 319. This feature allows a user to send a copy of a multimedia content item to one or more co-users that has been edited only for purposes related to a specific communication session. Thus, in one or more embodiments, edits made within the electronic messaging system 100 may not be reflected in the original copy of the content item stored on the computing device 300.
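The copy-on-edit behavior described above can be sketched as a function that returns an edited duplicate while leaving the stored original untouched. The item shape and names are hypothetical illustrations.

```typescript
interface ContentItem {
  id: string;
  edits: string[]; // descriptions of edits applied for this session
}

// Apply an edit to a fresh copy so the original on the device is unchanged.
function editCopy(original: ContentItem, edit: string): ContentItem {
  return { ...original, edits: [...original.edits, edit] };
}
```

Because the function never mutates its argument, the stored original keeps an empty edit list, matching the behavior where session-specific edits are not reflected in the original.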
  • The content item manager 106 may receive one or more edits to multimedia content item 319 from a user. For example, as shown in FIG. 3H, a user's finger 314 may perform an edit 334. In the embodiment shown, the edit 334 is writing added to the multimedia content item 319, in connection with a selection of the writing edit control 328 d. In additional alternative embodiments, the content item manager 106 may receive further edits to the multimedia content item 319 in accordance with any of the edit controls 328 a-d described above.
  • Additionally, as shown in FIG. 3H, the user interface manager 102 may present additional controls while the multimedia content item 319 is available for editing. For example, the user interface manager 102 may also present a send control 330 and a cancel control 332. In one or more embodiments, a detected selection of the send control 330 may cause the communication manager 108 to send the multimedia content item 319 along with the edit 334 to one or more co-users. Additionally, in one or more embodiments, a detected selection of the cancel control 332 may cause the content item manager 106 to discard any edits made to the multimedia content item 319. In an alternative embodiment, the user interface manager 102 may also present a save control while the multimedia content item 319 is available for editing. In one or more embodiments, a detected selection of the save control may cause the content item manager 106 to save the multimedia content item 319 along with the edits 334 to the computing device 300.
  • In one or more embodiments, the user interface manager 102 may again display the communication thread 306, the message input control palette or toolbar 310, and the multimedia content item display area 316 in response to a detected selection of the send control 330, the cancel control 332, or the save control as described with regard to FIG. 3H. For example, as shown in FIG. 3I, in response to a detected selection of the send control 330, the user interface manager 102 can display the communication thread 306, the message input control palette or toolbar 310, and the multimedia content item display area 316 within the messaging graphical user interface 304. In one or more embodiments, as shown in FIG. 3I, the user interface manager 102 may automatically add the multimedia content item 319 with the edit 334 to the communication thread 306 in response to the detected selection of the send control 330.
  • In one or more embodiments, the user interface manager 102 may display the multimedia content item preview 318 c within the multimedia content item display area 316 such that the content item preview 318 c does not reflect any edits. For example, because the content item manager 106 provided a copy of the content item 319 (shown in FIG. 3G) for editing, the multimedia content item associated with the content item preview 318 c was not affected by the edits 334. Accordingly, as shown in FIG. 3I, the user interface manager 102 displays the content item preview 318 c unchanged within the multimedia content item display area 316. In one or more alternative embodiments, the content item manager 106 may provide the original multimedia content item for editing, rather than a copy 319. In that alternative embodiment, the user interface manager 102 may display the preview 318 c including the edits 334 within the multimedia content item display area 316.
  • As shown in FIG. 3I, when sent to one or more co-users and when added to the communication thread 306, the content item 319 can have a size configured for display within a communication thread 306. In particular, the content item 319 can occupy less than the entire communication thread 306 both in a vertical direction and a horizontal direction. By not occupying the total vertical area of the communication thread 306, the communication thread 306 can display both the content item 319 and one or more messages as shown by FIG. 3I. Along related lines, by not occupying the total horizontal area of the communication thread 306, the content item 319 can be positioned on one side of the communication thread 306 so as to indicate whether the content item 319 was a sent or received message.
  • FIGS. 3A-3I illustrate the process for selecting and editing a digital photograph multimedia content item for inclusion in a communication session. As discussed above, the electronic messaging system 100 also allows a user to select and edit a digital video multimedia content item for inclusion in a communication session. The process for selecting and editing a digital video will now be discussed in relation to FIGS. 4A-4E.
  • As described above, the user input detector 104 may detect a selection of a multimedia content item preview 318 b from within the multimedia content item display area 316. For example, as shown in FIG. 4A, the user input detector 104 may detect a tap touch gesture of the user's finger 314 on the content item preview 318 b. In one or more alternative embodiments, the detected selection may be by way of another type of user interaction, such as a press-and-hold touch gesture, a spoken command, or any other type of user interaction suitable for this purpose. In one or more embodiments, as described above, the multimedia content item associated with the content item preview 318 b may be a digital video.
  • As discussed above with regard to FIG. 3G, the communication manager 108 may immediately send the digital video associated with the multimedia content item preview 318 b in response to a particular user interaction. For example, the communication manager 108 may immediately send the digital video associated with the content item preview 318 b to one or more co-users upon detection of a selection of the content item preview 318 b. Additionally, the user interface manager 102 may immediately add the digital video to the communication thread 306 in response to detection of a selection of the content item preview 318 b.
  • Additionally or alternatively, in response to a touch gesture selection of the multimedia content item preview 318 b, the user interface manager 102 may alter the display of the content item preview 318 b so as to indicate the selection of the content item preview 318 b. For example, in response to the selection of the multimedia content item preview 318 b in FIG. 4A, the user interface manager 102 may present a blurred version of the preview 318 b′, as shown in FIG. 4B. As discussed above, the content item preview 318 b′ of a digital video is a portion of the digital video that auto plays within the multimedia content item display area 316. Accordingly, in response to a selection of the content item preview 318 b′, the user interface manager 102 may display a blurred portion of the digital video that auto plays within the multimedia content item display area 316. In one or more alternative embodiments, in response to a selection of the content item preview 318 b′, the user interface manager may display only a blurred single frame from the digital video.
  • Additionally, in response to a selection of the multimedia content item preview 318 b, the user interface manager 102 may also present one or more controls overlaid on the selected preview 318 b′, as illustrated in FIG. 4B. For example, as illustrated in FIG. 4B, in response to the selection of the preview 318 b in FIG. 4A, the user interface manager 102 may overlay the edit control 328 and the send control 330 on the selected multimedia content item preview 318 b′. In one or more embodiments, the user interface manager 102 may overlay other or additional controls on the selected content item preview 318 b′. For instance, in one embodiment, the user interface manager 102 may also overlay a delete control over the selected preview 318 b′.
  • In response to a detected selection of the send control 330, the communication manager 108 may send the multimedia content item associated with the selected content item preview 318 b′ to one or more co-users. Additionally, in one or more embodiments, the user interface manager 102 may also add the multimedia content item associated with the selected content item preview 318 b′ to the communication thread 306. In one or more embodiments, a user may unselect the selected content item preview 318 b′ simply by tapping anywhere else on the messaging graphical user interface 304.
  • As described above, the content item manager 106 may provide the multimedia content item associated with the multimedia content item preview 318 b′ (or a copy thereof) for editing in response to a detected selection of the edit control 328. For example, as shown in FIG. 4B, the user input detector 104 may detect a user interaction of the user's finger 314 with the edit control 328. In one or more embodiments, the detected user interaction may be a tap touch gesture. In one or more alternative embodiments, the detected user interaction may be an upward swipe touch gesture, a spoken command, or another type of user input suitable for this purpose.
  • As shown in FIG. 4C, the user interface manager 102 may present the content item 319 b associated with the selected preview 318 b′ for editing within the messaging graphical user interface 304 in response to the detected user interaction. In one or more embodiments, and as described above, the user interface manager 102 may also present a variety of editing controls within the messaging graphical user interface 304. For example, the user interface manager 102 may display the crop editing control 328 a, the auto-edit control 328 b, the color edit control 328 c, and the writing edit control 328 d. In one or more alternative embodiments, the user interface manager 102 may display other or additional edit controls as described above.
  • In one or more embodiments, the editing controls 328 a-d may take on different functionality depending on the type of multimedia content item currently available for editing. For example, with regard to FIG. 3H, the multimedia content item type was a digital photograph. Accordingly, in one or more embodiments associated with a digital photograph, the crop editing control 328 a may function to remove portions of the digital photograph, thus changing the displayed portion of the digital photograph. However, with regard to FIG. 4C, the multimedia content item type is a digital video. As such, in one or more embodiments associated with a digital video, the crop editing control 328 a may function to remove portions of the digital video, thus changing the runtime of the digital video. In one or more embodiments, the other editing controls 328 b-d may have similarly alterable functionalities depending on the multimedia content item type.
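The type-dependent crop behavior described above (spatial cropping for a digital photograph, runtime trimming for a digital video) might be dispatched on the media type roughly as follows. This is an illustrative sketch under assumed names, not the disclosed implementation.

```typescript
type Media =
  | { type: "photo"; widthPx: number; heightPx: number }
  | { type: "video"; runtimeSeconds: number };

// For a photo, cropping removes spatial area; for a video, the same crop
// control trims runtime instead, as the paragraph above describes.
function applyCrop(media: Media, keepFraction: number): Media {
  if (media.type === "photo") {
    return {
      type: "photo",
      widthPx: Math.round(media.widthPx * keepFraction),
      heightPx: Math.round(media.heightPx * keepFraction),
    };
  }
  return { type: "video", runtimeSeconds: media.runtimeSeconds * keepFraction };
}
```

The other editing controls (auto-edit, color, writing) could be dispatched on the media type in the same way.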
  • As shown by FIG. 4C, the content item manager 106 may receive one or more edits to the content item 319 b via one or more touch gestures. In the embodiment shown, the content item manager 106 has received an edit 334 b changing a display property of the content item 319 b. In particular, the edit 334 b includes the addition of a border around the content item 319 b. In additional alternative embodiments, the content item manager 106 may receive further edits to the content item 319 b in accordance with any of the edit controls 328 a-d described above.
  • Additionally, as shown in FIG. 4C, the user interface manager 102 may present additional controls while the content item 319 b is available for editing. For example, the user interface manager 102 may also present the send control 330 and the cancel control 332. In one or more embodiments, a detected selection of the send control 330 may cause the communication manager 108 to send the content item 319 b along with the edits 334 b to one or more co-users. Additionally, in one or more embodiments, a detected selection of the cancel control 332 may cause the content item manager 106 to discard any edits made to the content item 319 b. In an alternative embodiment, the user interface manager 102 may also present a save control while the content item 319 b is available for editing. In one or more embodiments, a detected selection of the save control may cause the content item manager 106 to save the content item 319 b along with the edits 334 b to the computing device 300.
  • In one or more embodiments, the user interface manager 102 may display the communication thread 306, the message input control palette or toolbar 310, and the multimedia content item display area 316 in response to a detected selection of the send control 330, the cancel control 332, or the save control as described with regard to FIG. 4C. For example, as shown in FIG. 4C, the user input detector 104 may detect a touch gesture performed by the user's finger 314 on the send control 330. In response to a detected selection of the send control 330, in one or more embodiments, the user interface manager 102 may display the communication thread 306, the message input control palette or toolbar 310, and the multimedia content item display area 316 within the messaging graphical user interface 304, as shown in FIG. 4D. In one or more embodiments, as shown in FIG. 4D, the user interface manager 102 may automatically add the content item 319 b with the edits 334 b to the communication thread 306 in response to the detected selection of the send control 330. As described above, the content item preview 318 b may not include the edits 334 b made to the content item 319 b.
  • As described above, the content item manager 106 may also package the content item 319 b with the edits 334 b such that it may be played from within the communication thread 306. For example, as shown in FIG. 4D, the content item manager 106 has added the content item 319 b with the edits 334 b and a playback control 336 to the communication thread 306. In one or more alternative embodiments, the playback control 336 may further include a time remaining indicator, a pause button, or any other additional controls suitable for a video.
  • In response to a detected user interaction, the user interface manager 102 may replace the multimedia content item display area 316 with another control. For example, as illustrated in FIG. 4E, the user input detector 104 may detect a user interaction of the user's finger 314 interacting with the text input control 312 a within the message input control palette or toolbar 310. In one or more embodiments, in response to this detected selection of the text input control 312 a, the user interface manager 102 may replace the multimedia content item display area 316 with a touch screen display keyboard 338. In one or more alternative embodiments, the user interface manager 102 may replace the multimedia content item display area 316 with other types of controls in response to the detected selection of any of the input controls 312 a-312 b.
  • FIGS. 1-4E, the corresponding text, and the examples provide a number of different systems and devices for selecting and including multimedia content items in a communication session. In addition to the foregoing, embodiments of the present invention can also be described in terms of flowcharts comprising acts and steps in a method for accomplishing a particular result. For example, FIGS. 5 and 6 illustrate flowcharts of exemplary methods in accordance with one or more embodiments of the present invention. The methods described in relation to FIGS. 5 and 6 may be performed with fewer or more steps/acts, or the steps/acts may be performed in differing orders. Additionally, the steps/acts described herein may be repeated or performed in parallel with one another or in parallel with different instances of the same or similar steps/acts.
  • FIG. 5 illustrates a flowchart of one example method 500 of selecting and including multimedia content items in a communication session. The method 500 includes an act 502 of providing a messaging graphical user interface. In particular, the act 502 can involve providing, on a client device, a messaging graphical user interface 304. In one or more embodiments, the messaging graphical user interface 304 may include a communication thread 306 in a first portion. In one or more embodiments, the communication thread may include a plurality of electronic messages 308 a, 308 b exchanged between a user and one or more co-users.
  • The method 500 further includes an act 504 of detecting a selection of a multimedia input control. In particular, the act 504 can include detecting a tap touch gesture selection of the multimedia input control 312 c. In one or more embodiments, detecting a selection of a multimedia input control 312 c can include detecting the selection of the multimedia input control 312 c from a palette of input controls 310.
  • The method 500 also includes an act 506 of providing a multimedia content item display area. In particular, the act 506 can involve, in response to the detected selection of the multimedia input control 312 c, providing a multimedia content item display area 316 in a second portion of the messaging graphical user interface 304. In one or more embodiments, the multimedia content item display area 316 may provide a preview 318 a, 318 b of one or more multimedia content items stored on the client device 300 available for sending as an electronic message 308 a, 308 b.
  • In one or more embodiments, providing the multimedia content item display area 316 may include providing a preview 318 a, 318 b of one or more recently stored multimedia content items. For example, one or more recently stored multimedia content items may include multimedia items that were stored within a given time limit, or window of time. Additionally, in one or more embodiments, the one or more multimedia content items stored on the client device 300 available for sending as an electronic message 308 a, 308 b may include one or more digital photographs or digital videos. Furthermore, providing a preview 318 a, 318 b of one or more multimedia content items may include providing a digital video that auto plays within the second portion of the messaging graphical user interface 304.
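A "recently stored" filter over a window of time, as described above, could be implemented along these lines. The item shape, names, and window semantics below are assumptions for illustration only.

```typescript
interface StoredItem {
  id: string;
  storedAtMs: number; // epoch milliseconds when the item was stored
}

// Keep only items stored within the given time window, newest first,
// mirroring a display area that surfaces the most recent content items.
function recentlyStored(
  items: StoredItem[],
  nowMs: number,
  windowMs: number
): StoredItem[] {
  return items
    .filter((item) => nowMs - item.storedAtMs <= windowMs)
    .sort((a, b) => b.storedAtMs - a.storedAtMs);
}
```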
  • Additionally, in one or more embodiments, the provided multimedia content item display area 316 may be horizontally scrollable. Furthermore, in one or more embodiments, the method 500 may also include detecting a horizontal over-scroll of the horizontally scrollable multimedia content item display area 316. In response to the detected horizontal over-scroll, the method 500 may include displaying a camera roll 326 associated with the client device 300.
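Detecting the horizontal over-scroll described above amounts to checking whether the scroll offset has moved past the end of the scrollable content by more than some threshold. The threshold value and parameter names here are assumptions, not disclosed values.

```typescript
// Returns true when the user has scrolled past the end of the horizontally
// scrollable strip by more than thresholdPx, which would trigger display of
// the camera roll per the description above.
function isOverScrolled(
  scrollOffsetPx: number,
  contentWidthPx: number,
  viewportWidthPx: number,
  thresholdPx: number
): boolean {
  const maxOffset = Math.max(0, contentWidthPx - viewportWidthPx);
  return scrollOffsetPx > maxOffset + thresholdPx;
}
```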
  • The method 500 may further include cropping the preview 318 a, 318 b of each of the one or more multimedia content items stored on the client device 300 available for sending as an electronic message 308 a, 308 b. In one or more embodiments, cropping the preview 318 a, 318 b of each of the one or more multimedia content items may further include tailoring the cropped preview 318 a, 318 b of each of the one or more multimedia content items to an aspect ratio of the second portion of the messaging graphical user interface 304. For example, in one or more embodiments, tailoring the cropped preview 318 a, 318 b of each of the one or more multimedia content items to an aspect ratio of the second portion of the messaging graphical user interface 304 may include displaying the previews 318 a, 318 b in the second portion of the messaging graphical user interface 304 such that the previews 318 a, 318 b are square.
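Tailoring each preview to a square display area, as described above, can be done with a centered square crop. This is a minimal sketch with assumed names; the disclosed method does not specify how the crop rectangle is computed.

```typescript
interface CropRect {
  x: number;
  y: number;
  size: number; // side length of the square crop
}

// Compute a centered square crop so every preview matches the square
// aspect ratio of the display area.
function squareCrop(widthPx: number, heightPx: number): CropRect {
  const size = Math.min(widthPx, heightPx);
  return {
    x: Math.floor((widthPx - size) / 2),
    y: Math.floor((heightPx - size) / 2),
    size,
  };
}
```

A landscape 400x300 image, for example, yields a 300x300 crop offset 50 px from the left edge.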
  • The method 500 may also include detecting a selection of a preview 318 a, 318 b of a multimedia content item provided in the multimedia content item display area 316 in the second portion of the messaging graphical user interface 304. In one or more embodiments, in response to the detected selection of the preview 318 a, 318 b of the multimedia content item, the method can involve sending the multimedia content item 319 corresponding to the selected preview 318 a, 318 b to one or more co-users. Additionally, in one or more embodiments, in response to sending the multimedia content item 319, the method 500 may include adding the multimedia content item 319 to the communication thread 306 in the first portion of the messaging graphical user interface 304.
  • FIG. 6 illustrates a flowchart of a method 600 of selecting and including multimedia content items in a communication session. The method 600 includes an act 602 of providing a split-screen messaging graphical user interface. In particular, the act 602 can involve providing a split-screen messaging graphical user interface 304 including two portions. In one or more embodiments, the first portion of the split-screen messaging graphical user interface 304 may include a communication thread 306 including a plurality of electronic messages 308 a, 308 b exchanged between a user and one or more co-users. Additionally in one or more embodiments, the second portion of the split-screen messaging graphical user interface 304 may include a multimedia content items display 316 including a preview 318 a, 318 b of one or more multimedia content items. In one or more embodiments, the preview 318 a, 318 b of one or more multimedia content items may be tailored to an aspect ratio of the second portion or the first portion of the messaging graphical user interface 304.
  • The method 600 further includes an act 604 of detecting a selection of a preview of a multimedia content item. In particular, the act 604 can involve detecting a selection of a preview 318 a, 318 b of a multimedia content item from the multimedia content item display 316. For example, in one or more embodiments, detecting a selection of a preview 318 a, 318 b of a multimedia content item may include detecting a tap touch gesture interacting with the multimedia content item.
  • The method 600 further includes an act 606 of overlaying a control on the selected preview of the multimedia content item. In particular, the act 606 can involve, in response to the detected selection of the preview 318 a, 318 b of the multimedia content item from the multimedia content item display 316, overlaying a first control on the selected preview 318 a, 318 b of the multimedia content item. For example, in one or more embodiments, the first control overlaid on the selected multimedia content item may be an editing control 328. Additionally, in response to the detected selection of the preview 318 a, 318 b of the multimedia content item from the multimedia content item display 316, the act 606 may also include overlaying a second control on the selected preview 318 a, 318 b of the multimedia content item. For example, in one or more embodiments, the second control overlaid on the selected preview 318 a, 318 b of the multimedia content items may be a send control 330. Furthermore, in response to the detected selection of the preview 318 a, 318 b of the multimedia content item from the multimedia content item display 316, the act 606 may also include blurring the selected preview 318 a, 318 b of the multimedia content item.
  • The method 600 may further include detecting a selection of the editing control 328 overlaid on the selected preview 318 a, 318 b of the multimedia content item. For example, in one or more embodiments, the detected selection of the editing control 328 may be a tap touch gesture. Furthermore, in response to the detected selection of the editing control 328 overlaid on the selected preview 318 a, 318 b of the multimedia content item, the method 600 may also include presenting a copy 319 of the multimedia content item associated with the selected preview 318 a, 318 b for editing.
  • The method 600 may also include receiving one or more edits 334, 334 b to the copy 319 of the multimedia content item. For example, receiving one or more edits 334, 334 b to the copy 319 of the multimedia content item may be in response to presenting the copy 319 of the multimedia content item for editing. The method 600 may further include adding the copy 319 of the multimedia content item with the one or more edits 334, 334 b to the communication thread 306. Additionally, the method 600 may include sending the copy 319 of the multimedia content item with the one or more edits 334, 334 b. For example, the method 600 may include sending the copy of the multimedia content item with the one or more edits 334, 334 b to the one or more co-users.
  • Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. In particular, one or more of the processes described herein may be implemented at least in part as instructions embodied in a non-transitory computer-readable medium and executable by one or more computing devices (e.g., any of the media content access devices described herein). In general, a processor (e.g., a microprocessor) receives instructions, from a non-transitory computer-readable medium, (e.g., a memory, etc.), and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein.
  • Computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are non-transitory computer-readable storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: non-transitory computer-readable storage media (devices) and transmission media.
  • Non-transitory computer-readable storage media (devices) include RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
  • Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to non-transitory computer-readable storage media (devices) (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media (devices) at a computer system. Thus, it should be understood that non-transitory computer-readable storage media (devices) can be included in computer system components that also (or even primarily) utilize transmission media.
  • Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. In some embodiments, computer-executable instructions are executed on a general-purpose computer to turn the general-purpose computer into a special purpose computer implementing elements of the invention. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
  • Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
  • One or more embodiments of the invention can also be implemented in cloud computing environments. In this description, “cloud computing” is defined as a model for enabling on-demand network access to a shared pool of configurable computing resources. For example, cloud computing can be employed in the marketplace to offer ubiquitous and convenient on-demand access to the shared pool of configurable computing resources. The shared pool of configurable computing resources can be rapidly provisioned via virtualization and released with low management effort or service provider interaction, and then scaled accordingly.
  • A cloud-computing model can be composed of various characteristics such as, for example, on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, and so forth. A cloud-computing model can also expose various service models, such as, for example, Software as a Service (“SaaS”), Platform as a Service (“PaaS”), and Infrastructure as a Service (“IaaS”). A cloud-computing model can also be deployed using different deployment models such as private cloud, community cloud, public cloud, hybrid cloud, and so forth. In this description and in the claims, a “cloud-computing environment” is an environment in which cloud computing is employed.
  • FIG. 7 illustrates a block diagram of exemplary computing device 700 that may be configured to perform one or more of the processes described above. One will appreciate that one or more computing devices such as the computing device 700 may implement the electronic messaging system 100. Any of the computing devices 202, 204, 300 or communication server 208 can comprise a computing device 700. As shown by FIG. 7, the computing device 700 can comprise a processor 702, a memory 704, a storage device 706, an I/O interface 708, and a communication interface 710, which may be communicatively coupled by way of a communication infrastructure 712. While an exemplary computing device 700 is shown in FIG. 7, the components illustrated in FIG. 7 are not intended to be limiting. Additional or alternative components may be used in other embodiments. Furthermore, in certain embodiments, the computing device 700 can include fewer components than those shown in FIG. 7. Components of the computing device 700 shown in FIG. 7 will now be described in additional detail.
  • In one or more embodiments, the processor 702 includes hardware for executing instructions, such as those making up a computer program. As an example and not by way of limitation, to execute instructions, the processor 702 may retrieve (or fetch) the instructions from an internal register, an internal cache, the memory 704, or the storage device 706 and decode and execute them. In one or more embodiments, the processor 702 may include one or more internal caches for data, instructions, or addresses. As an example and not by way of limitation, the processor 702 may include one or more instruction caches, one or more data caches, and one or more translation lookaside buffers (TLBs). Instructions in the instruction caches may be copies of instructions in the memory 704 or the storage 706.
  • The memory 704 may be used for storing data, metadata, and programs for execution by the processor(s). The memory 704 may include one or more of volatile and non-volatile memories, such as Random Access Memory (“RAM”), Read Only Memory (“ROM”), a solid state disk (“SSD”), Flash, Phase Change Memory (“PCM”), or other types of data storage. The memory 704 may be internal or distributed memory.
  • The storage device 706 includes storage for storing data or instructions. As an example and not by way of limitation, storage device 706 can comprise a non-transitory storage medium described above. The storage device 706 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. The storage device 706 may include removable or non-removable (or fixed) media, where appropriate. The storage device 706 may be internal or external to the computing device 700. In one or more embodiments, the storage device 706 is non-volatile, solid-state memory. In other embodiments, the storage device 706 includes read-only memory (ROM). Where appropriate, this ROM may be mask programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory or a combination of two or more of these.
  • The I/O interface 708 allows a user to provide input to, receive output from, and otherwise transfer data to and receive data from computing device 700. The I/O interface 708 may include a mouse, a keypad or a keyboard, a touch screen, a camera, an optical scanner, network interface, modem, other known I/O devices or a combination of such I/O interfaces. The I/O interface 708 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers. In certain embodiments, the I/O interface 708 is configured to provide graphical data to a display for presentation to a user. The graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.
  • The communication interface 710 can include hardware, software, or both. In any event, the communication interface 710 can provide one or more interfaces for communication (such as, for example, packet-based communication) between the computing device 700 and one or more other computing devices or networks. As an example and not by way of limitation, the communication interface 710 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network.
  • Additionally or alternatively, the communication interface 710 may facilitate communications with an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. As an example, the communication interface 710 may facilitate communications with a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination thereof.
  • Additionally, the communication interface 710 may facilitate communications using various communication protocols. Examples of communication protocols that may be used include, but are not limited to, data transmission media, communications devices, Transmission Control Protocol (“TCP”), Internet Protocol (“IP”), File Transfer Protocol (“FTP”), Telnet, Hypertext Transfer Protocol (“HTTP”), Hypertext Transfer Protocol Secure (“HTTPS”), Session Initiation Protocol (“SIP”), Simple Object Access Protocol (“SOAP”), Extensible Mark-up Language (“XML”) and variations thereof, Simple Mail Transfer Protocol (“SMTP”), Real-Time Transport Protocol (“RTP”), User Datagram Protocol (“UDP”), Global System for Mobile Communications (“GSM”) technologies, Code Division Multiple Access (“CDMA”) technologies, Time Division Multiple Access (“TDMA”) technologies, Short Message Service (“SMS”), Multimedia Message Service (“MMS”), radio frequency (“RF”) signaling technologies, Long Term Evolution (“LTE”) technologies, wireless communication technologies, in-band and out-of-band signaling technologies, and other suitable communications networks and technologies.
  • The communication infrastructure 712 may include hardware, software, or both that couples components of the computing device 700 to each other. As an example and not by way of limitation, the communication infrastructure 712 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination thereof.
  • As mentioned above, the network 206 and/or communication server 208 can comprise a social-networking system. A social-networking system may enable its users (such as persons or organizations) to interact with the system and with each other. The social-networking system may, with input from a user, create and store in the social-networking system a user profile associated with the user. The user profile may include demographic information, communication-channel information, and information on personal interests of the user. The social-networking system may also, with input from a user, create and store a record of relationships of the user with other users of the social-networking system, as well as provide services (e.g. wall posts, photo-sharing, event organization, messaging, games, or advertisements) to facilitate social interaction between or among users.
  • The social-networking system may store records of users and relationships between users in a social graph comprising a plurality of nodes and a plurality of edges connecting the nodes. The nodes may comprise a plurality of user nodes and a plurality of concept nodes. A user node of the social graph may correspond to a user of the social-networking system. A user may be an individual (human user), an entity (e.g., an enterprise, business, or third party application), or a group (e.g., of individuals or entities). A user node corresponding to a user may comprise information provided by the user and information gathered by various systems, including the social-networking system.
  • For example, the user may provide his or her name, profile picture, city of residence, contact information, birth date, gender, marital status, family status, employment, educational background, preferences, interests, and other demographic information to be included in the user node. Each user node of the social graph may have a corresponding web page (typically known as a profile page). In response to a request including a user name, the social-networking system can access a user node corresponding to the user name, and construct a profile page including the name, a profile picture, and other information associated with the user. A profile page of a first user may display to a second user all or a portion of the first user's information based on one or more privacy settings by the first user and the relationship between the first user and the second user.
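The profile-page flow described above (look up the user node for a requested name, then show a second user only the fields allowed by the first user's privacy settings and the relationship between the two) can be sketched as follows. This is a minimal illustration, not code from the patent; the names `UserNode`, `build_profile_view`, and the field and audience labels are all hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class UserNode:
    """Hypothetical user node: profile fields plus per-field privacy settings."""
    name: str
    fields: dict                                  # e.g. {"city": "Springfield"}
    privacy: dict = field(default_factory=dict)   # field name -> "public" | "friends"

def build_profile_view(user: UserNode, viewer_is_friend: bool) -> dict:
    """Return only the profile fields the viewer is allowed to see.

    In this sketch, a field marked "friends" is shown only to first-degree
    connections; any field without an explicit setting is treated as public.
    """
    visible = {"name": user.name}
    for key, value in user.fields.items():
        audience = user.privacy.get(key, "public")
        if audience == "public" or (audience == "friends" and viewer_is_friend):
            visible[key] = value
    return visible

alice = UserNode(
    name="Alice",
    fields={"city": "Springfield", "birth_date": "1990-01-01"},
    privacy={"birth_date": "friends"},
)
print(build_profile_view(alice, viewer_is_friend=False))  # city only
print(build_profile_view(alice, viewer_is_friend=True))   # city and birth date
```

A real social-networking system would resolve the viewer relationship from the social graph itself rather than take it as a boolean, but the filtering step is the same.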
  • A concept node may correspond to a concept of the social-networking system. For example, a concept can represent a real-world entity, such as a movie, a song, a sports team, a celebrity, a group, a restaurant, or a place or a location. An administrative user of a concept node corresponding to a concept may create or update the concept node by providing information of the concept (e.g., by filling out an online form), causing the social-networking system to associate the information with the concept node. For example and without limitation, information associated with a concept can include a name or a title, one or more images (e.g., an image of cover page of a book), a web site (e.g., an URL address) or contact information (e.g., a phone number, an email address). Each concept node of the social graph may correspond to a web page. For example, in response to a request including a name, the social-networking system can access a concept node corresponding to the name, and construct a web page including the name and other information associated with the concept.
  • An edge between a pair of nodes may represent a relationship between the pair of nodes. For example, an edge between two user nodes can represent a friendship between two users. For another example, the social-networking system may construct a web page (or a structured document) of a concept node (e.g., a restaurant, a celebrity), incorporating one or more selectable buttons (e.g., “like”, “check in”) in the web page. A user can access the page using a web browser hosted by the user's client device and select a selectable button, causing the client device to transmit to the social-networking system a request to create an edge between a user node of the user and a concept node of the concept, indicating a relationship between the user and the concept (e.g., the user checks in a restaurant, or the user “likes” a celebrity).
  • As an example, a user may provide (or change) his or her city of residence, causing the social-networking system to create an edge between a user node corresponding to the user and a concept node corresponding to the city declared by the user as his or her city of residence. In addition, the degree of separation between any two nodes is defined as the minimum number of hops required to traverse the social graph from one node to the other. A degree of separation between two nodes can be considered a measure of relatedness between the users or the concepts represented by the two nodes in the social graph. For example, two users having user nodes that are directly connected by an edge (i.e., are first-degree nodes) may be described as “connected users” or “friends.” Similarly, two users having user nodes that are connected only through another user node (i.e., are second-degree nodes) may be described as “friends of friends.”
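The degree-of-separation definition above (the minimum number of hops between two nodes) is a shortest-path length on an unweighted graph, which breadth-first search computes directly. The sketch below uses a made-up adjacency-list graph and is only an illustration of the definition, not an implementation from the patent.

```python
from collections import deque

def degree_of_separation(graph: dict, start: str, target: str) -> int:
    """Breadth-first search: minimum number of edges (hops) from start to target.

    graph is an adjacency list, e.g. {"alice": ["bob"], ...}. Returns -1 if
    the two nodes are not connected.
    """
    if start == target:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor == target:
                return dist + 1
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return -1

# First-degree nodes are "friends"; second-degree are "friends of friends".
graph = {
    "alice": ["bob"],
    "bob": ["alice", "carol"],
    "carol": ["bob"],
}
print(degree_of_separation(graph, "alice", "bob"))    # 1 (friends)
print(degree_of_separation(graph, "alice", "carol"))  # 2 (friends of friends)
```

Because every edge counts as one hop, BFS visits nodes in order of increasing distance, so the first time the target is reached is guaranteed to be along a minimum-hop path.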
  • A social-networking system may support a variety of applications, such as photo sharing, on-line calendars and events, gaming, instant messaging, and advertising. For example, the social-networking system may also include media sharing capabilities. Also, the social-networking system may allow users to post photographs and other multimedia files to a user's profile page (typically known as “wall posts” or “timeline posts”) or in a photo album, both of which may be accessible to other users of the social-networking system depending upon the user's configured privacy settings. The social-networking system may also allow users to configure events. For example, a first user may configure an event with attributes including time and date of the event, location of the event and other users invited to the event. The invited users may receive invitations to the event and respond (such as by accepting the invitation or declining it). Furthermore, the social-networking system may allow users to maintain a personal calendar. Similarly to events, the calendar entries may include times, dates, locations and identities of other users.
  • FIG. 8 illustrates an example network environment of a social-networking system. In one or more embodiments, a social-networking system 802 may comprise one or more data stores. For example, the social-networking system 802 may store a social graph comprising user nodes, concept nodes, and edges between nodes as described earlier. Each user node may comprise one or more data objects corresponding to information associated with or describing a user. Each concept node may comprise one or more data objects corresponding to information associated with a concept. Each edge between a pair of nodes may comprise one or more data objects corresponding to information associated with a relationship between users (or between a user and a concept, or between concepts) corresponding to the pair of nodes.
  • In one or more embodiments, the social-networking system 802 may comprise one or more computing devices (e.g., servers) hosting functionality directed to operation of the social-networking system. A user of the social-networking system 802 may access the social-networking system 802 using a client device such as client device 806. In particular, the client device 806 can interact with the social-networking system 802 through a network 804.
  • The client device 806 may be a desktop computer, laptop computer, tablet computer, personal digital assistant (PDA), in- or out-of-car navigation system, smart phone or other cellular or mobile phone, or mobile gaming device, other mobile device, or other suitable computing devices. Client device 806 may execute one or more client applications, such as a web browser (e.g., Microsoft Windows Internet Explorer, Mozilla Firefox, Apple Safari, Google Chrome, Opera, etc.) or a native or special-purpose client application (e.g., Facebook for iPhone or iPad, Facebook for Android, etc.), to access and view content over a network 804.
  • Network 804 may represent a network or collection of networks (such as the Internet, a corporate intranet, a virtual private network (VPN), a local area network (LAN), a wireless local area network (WLAN), a cellular network, a wide area network (WAN), a metropolitan area network (MAN), or a combination of two or more such networks) over which client devices 806 may access the social-networking system 802.
  • While these methods, systems, and user interfaces utilize both publicly available information as well as information provided by users of the social-networking system, all use of such information is to be explicitly subject to all privacy settings of the involved users and the privacy policy of the social-networking system as a whole.
  • In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. Various embodiments and aspects of the invention(s) are described with reference to details discussed herein, and the accompanying drawings illustrate the various embodiments. The description above and drawings are illustrative of the invention and are not to be construed as limiting the invention. Numerous specific details are described to provide a thorough understanding of various embodiments of the present invention.
  • The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (20)

What is claimed is:
1. A non-transitory computer readable storage medium having stored thereon computer-executable instructions that, when executed by a processor, cause a mobile client device to:
provide a messaging graphical user interface of a messaging application, the messaging graphical user interface comprising:
a communication thread including a plurality of exchanged electronic messages in a first portion; and
a second portion comprising a preview of one or more multimedia content items from a camera roll of a mobile device;
receive a selection of a preview of a multimedia content item;
provide, without navigating away from the messaging application, the multimedia content item together with one or more editing controls;
edit the multimedia content item based on user interaction with an editing control of the one or more editing controls;
send the edited multimedia content item as an electronic message; and
add the edited multimedia content item to the communication thread.
2. The non-transitory computer readable storage medium as recited in claim 1, wherein the one or more multimedia content items comprise one or more digital photographs or digital videos.
3. The non-transitory computer readable storage medium as recited in claim 1, wherein the computer-executable instructions, when executed by the processor, cause the mobile client device to edit the multimedia content item without editing a corresponding version of the multimedia content item stored within the camera roll.
4. The non-transitory computer readable storage medium as recited in claim 1, wherein the computer-executable instructions, when executed by the processor, cause the mobile client device to provide the multimedia content item together with one or more editing controls by providing one or more of a crop editing control, an auto-edit control, or a color edit control.
5. The non-transitory computer readable storage medium as recited in claim 1, wherein the computer-executable instructions, when executed by the processor, cause the mobile client device to edit the multimedia content item based on user interaction with an editing control of the one or more editing controls without utilizing a separate editing application.
6. The non-transitory computer readable storage medium as recited in claim 1, wherein the computer-executable instructions, when executed by the processor, further cause the mobile client device to.
7. The non-transitory computer readable storage medium as recited in claim 1, wherein the computer-executable instructions, when executed by the processor, cause the mobile client device to provide, without navigating away from the messaging application, the multimedia content item together with one or more editing controls by replacing the communication thread and the second portion with an enlarged version of the multimedia content item compared to the preview of the multimedia content item with the one or more editing controls.
8. A mobile device comprising:
a touch screen; and
at least one processor configured to cause the mobile device to:
provide a messaging graphical user interface of a messaging application via the touch screen, the messaging graphical user interface comprising:
a communication thread including a plurality of exchanged electronic messages in a first portion; and
a second portion comprising a preview of one or more multimedia content items from a camera roll of a mobile device;
receive a selection of a preview of a multimedia content item;
provide, without navigating away from the messaging application, the multimedia content item together with one or more editing controls;
edit the multimedia content item based on user interaction with an editing control of the one or more editing controls;
send the edited multimedia content item as an electronic message; and
add the edited multimedia content item to the communication thread.
9. The mobile device as recited in claim 8, wherein the at least one processor is configured to cause the mobile device to provide the multimedia content item together with one or more editing controls by providing one or more of a crop editing control, an auto-edit control, or a color edit control.
10. The mobile device as recited in claim 8, wherein the at least one processor is configured to further cause the mobile device to:
in response to a detected press-and-hold touch gesture selection of the preview of the multimedia content item, provide a first control, wherein the first control is a first editing control; and
provide the multimedia content item together with one or more editing controls in response to selection of the first editing control.
11. The mobile device as recited in claim 10, wherein the at least one processor is configured to cause the mobile device to, in response to the detected press-and-hold touch gesture selection of the preview of the multimedia content item, provide a second control adjacent to the first control, wherein the second control overlaid on the selected preview of the multimedia content item is a send control.
12. The mobile device as recited in claim 11, wherein the at least one processor is configured to cause the mobile device to, in response to the detected press-and-hold touch gesture selection of the preview of the multimedia content item, blur the selected preview of the multimedia content item underlying the first control and the second control.
13. The mobile device as recited in claim 8, wherein the at least one processor is configured to cause the mobile device to:
provide the multimedia content item together with one or more editing controls by providing a copy of the multimedia content item in a user interface in place of the messaging graphical user interface; and
edit the multimedia content item based on user interaction with the one or more editing controls by editing the copy of the multimedia content item.
14. The mobile device as recited in claim 13, wherein the at least one processor is configured to further cause the mobile device to:
crop the preview of each of the one or more multimedia content items to a size that allows multiple previews to fit within a scrollable multimedia content item display area; and
provide a plurality of cropped previews simultaneously together within the scrollable multimedia content item display area.
15. The mobile device as recited in claim 14, wherein the at least one processor is configured to cause the mobile device to edit the copy of the multimedia content item without editing a corresponding version of the multimedia content item stored within the camera roll.
16. The mobile device as recited in claim 8, wherein the at least one processor is configured to further cause the mobile device to edit the multimedia content item based on user interaction with an editing control of the one or more editing controls without utilizing a separate editing application.
17. A method comprising:
providing a messaging graphical user interface of a messaging application, the messaging graphical user interface comprising:
a communication thread including a plurality of exchanged electronic messages in a first portion; and
a second portion comprising a preview of one or more multimedia content items from a camera roll of a mobile device;
receiving a selection of a preview of a multimedia content item;
providing, without navigating away from the messaging application, the multimedia content item together with one or more editing controls;
editing the multimedia content item based on user interaction with an editing control of the one or more editing controls;
sending the edited multimedia content item as an electronic message; and
adding the edited multimedia content item to the communication thread.
18. The method as recited in claim 17, further comprising:
providing the multimedia content item together with one or more editing controls by providing a copy of the multimedia content item in a user interface in place of the messaging graphical user interface; and
editing the multimedia content item based on user interaction with the one or more editing controls by editing the copy of the multimedia content item.
19. The method as recited in claim 18, further comprising editing the copy of the multimedia content item without editing a corresponding version of the multimedia content item stored within the camera roll.
20. The method as recited in claim 18, further comprising editing the multimedia content item based on user interaction with an editing control of the one or more editing controls without utilizing a separate editing application.
US17/806,230 2014-04-28 2022-06-09 Facilitating the editing of multimedia as part of sending the multimedia in a message Abandoned US20220300132A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/806,230 US20220300132A1 (en) 2014-04-28 2022-06-09 Facilitating the editing of multimedia as part of sending the multimedia in a message

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201461985456P 2014-04-28 2014-04-28
US14/312,481 US9836207B2 (en) 2014-04-28 2014-06-23 Facilitating the sending of multimedia as a message
US15/826,032 US11397523B2 (en) 2014-04-28 2017-11-29 Facilitating the sending of multimedia as a message
US17/806,230 US20220300132A1 (en) 2014-04-28 2022-06-09 Facilitating the editing of multimedia as part of sending the multimedia in a message

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/826,032 Continuation US11397523B2 (en) 2014-04-28 2017-11-29 Facilitating the sending of multimedia as a message

Publications (1)

Publication Number Publication Date
US20220300132A1 true US20220300132A1 (en) 2022-09-22

Family

ID=50982798

Family Applications (9)

Application Number Title Priority Date Filing Date
US14/308,188 Active 2036-02-13 US10845982B2 (en) 2014-04-28 2014-06-18 Providing intelligent transcriptions of sound messages in a messaging application
US14/312,481 Active 2034-11-17 US9836207B2 (en) 2014-04-28 2014-06-23 Facilitating the sending of multimedia as a message
US14/311,758 Active US9391933B2 (en) 2014-04-28 2014-06-23 Composing messages within a communication thread
US14/314,623 Active US9391934B2 (en) 2014-04-28 2014-06-25 Capturing and sending multimedia as electronic messages
US15/179,584 Active 2034-09-03 US10809908B2 (en) 2014-04-28 2016-06-10 Composing messages within a communication thread
US15/191,157 Active 2035-04-05 US10976915B2 (en) 2014-04-28 2016-06-23 Capturing and sending multimedia as electronic messages
US15/826,032 Active 2034-09-14 US11397523B2 (en) 2014-04-28 2017-11-29 Facilitating the sending of multimedia as a message
US17/197,615 Active US11455093B2 (en) 2014-04-28 2021-03-10 Capturing and sending multimedia as electronic messages
US17/806,230 Abandoned US20220300132A1 (en) 2014-04-28 2022-06-09 Facilitating the editing of multimedia as part of sending the multimedia in a message

Family Applications Before (8)

Application Number Title Priority Date Filing Date
US14/308,188 Active 2036-02-13 US10845982B2 (en) 2014-04-28 2014-06-18 Providing intelligent transcriptions of sound messages in a messaging application
US14/312,481 Active 2034-11-17 US9836207B2 (en) 2014-04-28 2014-06-23 Facilitating the sending of multimedia as a message
US14/311,758 Active US9391933B2 (en) 2014-04-28 2014-06-23 Composing messages within a communication thread
US14/314,623 Active US9391934B2 (en) 2014-04-28 2014-06-25 Capturing and sending multimedia as electronic messages
US15/179,584 Active 2034-09-03 US10809908B2 (en) 2014-04-28 2016-06-10 Composing messages within a communication thread
US15/191,157 Active 2035-04-05 US10976915B2 (en) 2014-04-28 2016-06-23 Capturing and sending multimedia as electronic messages
US15/826,032 Active 2034-09-14 US11397523B2 (en) 2014-04-28 2017-11-29 Facilitating the sending of multimedia as a message
US17/197,615 Active US11455093B2 (en) 2014-04-28 2021-03-10 Capturing and sending multimedia as electronic messages

Country Status (11)

Country Link
US (9) US10845982B2 (en)
EP (4) EP3862862A1 (en)
JP (2) JP6262362B2 (en)
KR (2) KR101733637B1 (en)
CN (2) CN106255949B (en)
AU (2) AU2014392592B2 (en)
BR (1) BR112016025271A2 (en)
CA (2) CA2938529C (en)
IL (2) IL247022A (en)
MX (2) MX2016014088A (en)
WO (2) WO2015167589A1 (en)

Families Citing this family (382)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8677377B2 (en) 2005-09-08 2014-03-18 Apple Inc. Method and apparatus for building an intelligent automated assistant
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US8554868B2 (en) 2007-01-05 2013-10-08 Yahoo! Inc. Simultaneous sharing communication interface
US8977255B2 (en) 2007-04-03 2015-03-10 Apple Inc. Method and system for operating a multi-function portable electronic device using voice-activation
US9954996B2 (en) 2007-06-28 2018-04-24 Apple Inc. Portable electronic device with conversation management for incoming instant messages
US10002189B2 (en) 2007-12-20 2018-06-19 Apple Inc. Method and apparatus for searching using an active ontology
US9330720B2 (en) 2008-01-03 2016-05-03 Apple Inc. Methods and apparatus for altering audio output signals
US8996376B2 (en) 2008-04-05 2015-03-31 Apple Inc. Intelligent text-to-speech conversion
US20100030549A1 (en) 2008-07-31 2010-02-04 Lee Michael M Mobile device having human language translation capability with positional feedback
US8676904B2 (en) 2008-10-02 2014-03-18 Apple Inc. Electronic devices with voice command and contextual data processing capabilities
US8779265B1 (en) * 2009-04-24 2014-07-15 Shindig, Inc. Networks of portable electronic devices that collectively generate sound
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
US10706373B2 (en) 2011-06-03 2020-07-07 Apple Inc. Performing actions associated with task items that represent tasks to perform
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US10276170B2 (en) 2010-01-18 2019-04-30 Apple Inc. Intelligent automated assistant
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
US8280954B2 (en) * 2010-03-25 2012-10-02 Scomm, Inc. Method and system for providing live real-time communication via text between mobile user devices
TWI439960B (en) 2010-04-07 2014-06-01 Apple Inc Avatar editing environment
US9262612B2 (en) 2011-03-21 2016-02-16 Apple Inc. Device access using voice authentication
US10057736B2 (en) 2011-06-03 2018-08-21 Apple Inc. Active transport based notifications
IL300140B2 (en) 2011-07-12 2024-02-01 Snap Inc Methods and systems of providing visual content editing functions
US11734712B2 (en) 2012-02-24 2023-08-22 Foursquare Labs, Inc. Attributing in-store visits to media consumption based on data collected from user devices
US8972357B2 (en) 2012-02-24 2015-03-03 Placed, Inc. System and method for data collection to validate location data
US10134385B2 (en) 2012-03-02 2018-11-20 Apple Inc. Systems and methods for name pronunciation
US9154456B2 (en) 2012-04-17 2015-10-06 Trenda Innovations, Inc. Messaging system and method
US10155168B2 (en) 2012-05-08 2018-12-18 Snap Inc. System and method for adaptable avatars
US10990270B2 (en) 2012-05-09 2021-04-27 Apple Inc. Context-specific user interfaces
US9804759B2 (en) 2012-05-09 2017-10-31 Apple Inc. Context-specific user interfaces
US9547425B2 (en) 2012-05-09 2017-01-17 Apple Inc. Context-specific user interfaces
US10613743B2 (en) 2012-05-09 2020-04-07 Apple Inc. User interface for receiving user input
US10417037B2 (en) 2012-05-15 2019-09-17 Apple Inc. Systems and methods for integrating third party services with a digital assistant
US9721563B2 (en) 2012-06-08 2017-08-01 Apple Inc. Name recognition system
US9547647B2 (en) 2012-09-19 2017-01-17 Apple Inc. Voice-based media searching
JP2016508007A (en) 2013-02-07 2016-03-10 アップル インコーポレイテッド Voice trigger for digital assistant
US10652394B2 (en) 2013-03-14 2020-05-12 Apple Inc. System and method for processing voicemail
US10748529B1 (en) 2013-03-15 2020-08-18 Apple Inc. Voice activated device for use with a voice-based digital assistant
WO2014197334A2 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
WO2014197335A1 (en) 2013-06-08 2014-12-11 Apple Inc. Interpreting and acting upon commands that involve sharing information with remote devices
CN110442699A (en) 2013-06-09 2019-11-12 苹果公司 Operate method, computer-readable medium, electronic equipment and the system of digital assistants
US10176167B2 (en) 2013-06-09 2019-01-08 Apple Inc. System and method for inferring user intent from speech inputs
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US20150124566A1 (en) 2013-10-04 2015-05-07 Thalmic Labs Inc. Systems, articles and methods for wearable electronic devices employing contact sensors
US10188309B2 (en) 2013-11-27 2019-01-29 North Inc. Systems, articles, and methods for electromyography sensors
KR102138515B1 (en) 2013-10-01 2020-07-28 엘지전자 주식회사 Mobile terminal and method for controlling thereof
US9246961B2 (en) 2013-11-27 2016-01-26 Facebook, Inc. Communication user interface systems and methods
US10296160B2 (en) 2013-12-06 2019-05-21 Apple Inc. Method for extracting salient dialog usage from live data
US9628950B1 (en) 2014-01-12 2017-04-18 Investment Asset Holdings Llc Location-based messaging
US10845982B2 (en) 2014-04-28 2020-11-24 Facebook, Inc. Providing intelligent transcriptions of sound messages in a messaging application
US9396354B1 (en) 2014-05-28 2016-07-19 Snapchat, Inc. Apparatus and method for automated privacy protection in distributed images
US9537811B2 (en) 2014-10-02 2017-01-03 Snap Inc. Ephemeral gallery of ephemeral messages
US9966065B2 (en) 2014-05-30 2018-05-08 Apple Inc. Multi-command single utterance input method
US9715875B2 (en) 2014-05-30 2017-07-25 Apple Inc. Reducing the need for manual start/end-pointing and trigger phrases
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
US9430463B2 (en) 2014-05-30 2016-08-30 Apple Inc. Exemplar-based natural language processing
US10170123B2 (en) 2014-05-30 2019-01-01 Apple Inc. Intelligent assistant for home automation
US10382378B2 (en) 2014-05-31 2019-08-13 Apple Inc. Live location sharing
US20150350141A1 (en) 2014-05-31 2015-12-03 Apple Inc. Message user interfaces for capture and transmittal of media and location content
EP2955686A1 (en) 2014-06-05 2015-12-16 Mobli Technologies 2010 Ltd. Automatic article enrichment by social media trends
US9113301B1 (en) 2014-06-13 2015-08-18 Snapchat, Inc. Geo-location based event gallery
CN106462340B (en) 2014-06-27 2019-09-13 苹果公司 The user interface that size reduces
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
US9225897B1 (en) 2014-07-07 2015-12-29 Snapchat, Inc. Apparatus and method for supplying content aware photo filters
KR20160009915A (en) * 2014-07-17 2016-01-27 삼성전자주식회사 Method for processing data and electronic device thereof
EP3195098A2 (en) 2014-07-21 2017-07-26 Apple Inc. Remote user interface
CN106605201B (en) 2014-08-06 2021-11-23 苹果公司 Reduced size user interface for battery management
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
KR102016160B1 (en) 2014-09-02 2019-08-29 애플 인크. Reduced-size interfaces for managing alerts
EP4209872A1 (en) 2014-09-02 2023-07-12 Apple Inc. Phone user interface
US10254948B2 (en) 2014-09-02 2019-04-09 Apple Inc. Reduced-size user interfaces for dynamically updated application overviews
EP3189416B1 (en) * 2014-09-02 2020-07-15 Apple Inc. User interface for receiving user input
US9818400B2 (en) 2014-09-11 2017-11-14 Apple Inc. Method and apparatus for discovering trending terms in speech requests
US10423983B2 (en) 2014-09-16 2019-09-24 Snap Inc. Determining targeting information based on a predictive targeting model
US10824654B2 (en) 2014-09-18 2020-11-03 Snap Inc. Geolocation-based pictographs
US11216869B2 (en) 2014-09-23 2022-01-04 Snap Inc. User interface to augment an image using geolocation
US10074360B2 (en) 2014-09-30 2018-09-11 Apple Inc. Providing an indication of the suitability of speech recognition
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US10127911B2 (en) 2014-09-30 2018-11-13 Apple Inc. Speaker identification and unsupervised speaker adaptation techniques
US10284508B1 (en) * 2014-10-02 2019-05-07 Snap Inc. Ephemeral gallery of ephemeral messages with opt-in permanence
US20160105388A1 (en) * 2014-10-09 2016-04-14 Footspot, Inc. System and method for digital media capture and related social networking
KR102250777B1 (en) * 2014-10-21 2021-05-11 삼성전자주식회사 Method for providing content and electronic device thereof
US9754624B2 (en) * 2014-11-08 2017-09-05 Wooshii Ltd Video creation platform
US9015285B1 (en) 2014-11-12 2015-04-21 Snapchat, Inc. User interface for accessing media at a geographic location
US9385983B1 (en) 2014-12-19 2016-07-05 Snapchat, Inc. Gallery of messages from individuals with a shared interest
US9854219B2 (en) 2014-12-19 2017-12-26 Snap Inc. Gallery of videos set to an audio time line
US10311916B2 (en) 2014-12-19 2019-06-04 Snap Inc. Gallery of videos set to an audio time line
JP2016126445A (en) * 2014-12-26 2016-07-11 Line株式会社 Server, control method thereof, and program
US9754355B2 (en) 2015-01-09 2017-09-05 Snap Inc. Object recognition based photo filters
US11388226B1 (en) 2015-01-13 2022-07-12 Snap Inc. Guided personal identity based actions
US10133705B1 (en) 2015-01-19 2018-11-20 Snap Inc. Multichannel system
US9521515B2 (en) 2015-01-26 2016-12-13 Mobli Technologies 2010 Ltd. Content request by location
KR102377277B1 (en) * 2015-02-27 2022-03-23 삼성전자주식회사 Method and apparatus for supporting communication in electronic device
US10152299B2 (en) 2015-03-06 2018-12-11 Apple Inc. Reducing response latency of intelligent automated assistants
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
US9886953B2 (en) 2015-03-08 2018-02-06 Apple Inc. Virtual assistant activation
US9721566B2 (en) 2015-03-08 2017-08-01 Apple Inc. Competing devices responding to voice triggers
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US10223397B1 (en) 2015-03-13 2019-03-05 Snap Inc. Social graph based co-location of network users
KR102163528B1 (en) 2015-03-18 2020-10-08 스냅 인코포레이티드 Geo-fence authorization provisioning
US9692967B1 (en) 2015-03-23 2017-06-27 Snap Inc. Systems and methods for reducing boot time and power consumption in camera systems
US10210563B2 (en) 2015-03-23 2019-02-19 Hipmunk, Inc. Search machine with dynamically updated flag on graphical slider
CN108353126B (en) 2015-04-23 2019-08-23 苹果公司 Handle method, electronic equipment and the computer readable storage medium of the content of camera
US9881094B2 (en) 2015-05-05 2018-01-30 Snap Inc. Systems and methods for automated local story generation and curation
US10135949B1 (en) 2015-05-05 2018-11-20 Snap Inc. Systems and methods for story and sub-story navigation
US10460227B2 (en) 2015-05-15 2019-10-29 Apple Inc. Virtual assistant in a communication session
US10200824B2 (en) 2015-05-27 2019-02-05 Apple Inc. Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device
US10083688B2 (en) 2015-05-27 2018-09-25 Apple Inc. Device voice control for selecting a displayed affordance
US9578173B2 (en) 2015-06-05 2017-02-21 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
US20160373388A1 (en) * 2015-06-19 2016-12-22 Voxer Ip Llc Messaging application for recording and inserting a video message into a chat
US11284170B1 (en) 2015-06-29 2022-03-22 Twitter, Inc. Video preview mechanism
US20160378747A1 (en) 2015-06-29 2016-12-29 Apple Inc. Virtual assistant for media playback
US10993069B2 (en) 2015-07-16 2021-04-27 Snap Inc. Dynamically adaptive media content delivery
US10817898B2 (en) 2015-08-13 2020-10-27 Placed, Llc Determining exposures to content presented by physical objects
US10003938B2 (en) 2015-08-14 2018-06-19 Apple Inc. Easy location sharing
US10740384B2 (en) 2015-09-08 2020-08-11 Apple Inc. Intelligent automated assistant for media search and playback
US10331312B2 (en) 2015-09-08 2019-06-25 Apple Inc. Intelligent automated assistant in a media environment
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US20170078240A1 (en) * 2015-09-16 2017-03-16 Whatsapp Inc. Techniques to select and configure media for media messaging
US9787819B2 (en) * 2015-09-18 2017-10-10 Microsoft Technology Licensing, Llc Transcription of spoken communications
US20170090718A1 (en) 2015-09-25 2017-03-30 International Business Machines Corporation Linking selected messages in electronic message threads
US10212116B2 (en) * 2015-09-29 2019-02-19 International Business Machines Corporation Intelligently condensing transcript thread history into a single common reduced instance
US10157039B2 (en) * 2015-10-05 2018-12-18 Motorola Mobility Llc Automatic capturing of multi-mode inputs in applications
CN105282013A (en) 2015-10-30 2016-01-27 腾讯科技(深圳)有限公司 Item message notification method, device and system
US9652896B1 (en) 2015-10-30 2017-05-16 Snap Inc. Image based tracking in augmented reality systems
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10956666B2 (en) 2015-11-09 2021-03-23 Apple Inc. Unconventional virtual assistant interactions
CN105450930B (en) * 2015-11-09 2018-11-13 广州多益网络股份有限公司 A kind of self-timer method and device
US10474321B2 (en) 2015-11-30 2019-11-12 Snap Inc. Network resource location linking and visual content sharing
US9984499B1 (en) 2015-11-30 2018-05-29 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US9792896B2 (en) * 2015-12-15 2017-10-17 Facebook, Inc. Providing intelligent transcriptions of sound messages in a messaging application
CN105429860A (en) * 2015-12-15 2016-03-23 浙江吉利控股集团有限公司 Instant communication device and method
US10354425B2 (en) 2015-12-18 2019-07-16 Snap Inc. Method and system for providing context relevant media augmentation
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
CN107038023A (en) * 2016-02-02 2017-08-11 腾讯科技(深圳)有限公司 The exchange method and device of interaction comment
US11023514B2 (en) 2016-02-26 2021-06-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US10679389B2 (en) 2016-02-26 2020-06-09 Snap Inc. Methods and systems for generation, curation, and presentation of media collections
US10285001B2 (en) 2016-02-26 2019-05-07 Snap Inc. Generation, curation, and presentation of media collections
DK201670539A1 (en) * 2016-03-14 2017-10-02 Apple Inc Dictation that allows editing
KR20170109283A (en) * 2016-03-21 2017-09-29 현대자동차주식회사 Vehicle and method for controlling vehicle
US10339365B2 (en) 2016-03-31 2019-07-02 Snap Inc. Automated avatar generation
BR112018073693A2 (en) 2016-05-18 2019-02-26 Apple Inc devices, methods, and graphical user interfaces for messaging
US9959037B2 (en) * 2016-05-18 2018-05-01 Apple Inc. Devices, methods, and graphical user interfaces for messaging
DK180169B1 (en) * 2016-05-18 2020-07-13 Apple Inc Devices, procedures, and graphical messaging user interfaces
US11227589B2 (en) 2016-06-06 2022-01-18 Apple Inc. Intelligent list reading
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
US10586535B2 (en) 2016-06-10 2020-03-10 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
US10368208B2 (en) 2016-06-12 2019-07-30 Apple Inc. Layers in messaging applications
US10009536B2 (en) 2016-06-12 2018-06-26 Apple Inc. Applying a simulated optical effect based on data received from multiple camera sensors
US9681265B1 (en) 2016-06-28 2017-06-13 Snap Inc. System to track engagement of media items
US10430838B1 (en) 2016-06-28 2019-10-01 Snap Inc. Methods and systems for generation, curation, and presentation of media collections with automated advertising
US10387514B1 (en) 2016-06-30 2019-08-20 Snap Inc. Automated content curation and communication
US10348662B2 (en) 2016-07-19 2019-07-09 Snap Inc. Generating customized electronic messaging graphics
WO2018022602A1 (en) 2016-07-25 2018-02-01 Ctrl-Labs Corporation Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
WO2020112986A1 (en) 2018-11-27 2020-06-04 Facebook Technologies, Inc. Methods and apparatus for autocalibration of a wearable electrode sensor system
US10664157B2 (en) * 2016-08-03 2020-05-26 Google Llc Image search query predictions by a keyboard
KR102420857B1 (en) 2016-08-30 2022-07-15 스냅 인코포레이티드 Systems and methods for simultaneous localization and mapping
US10474753B2 (en) 2016-09-07 2019-11-12 Apple Inc. Language identification using recurrent neural networks
US10133916B2 (en) 2016-09-07 2018-11-20 Steven M. Gottlieb Image and identity validation in video chat events
DK179471B1 (en) 2016-09-23 2018-11-26 Apple Inc. Image data for enhanced user interactions
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US10942971B2 (en) * 2016-10-14 2021-03-09 NewsRx, LLC Inserting elements into artificial intelligence content
US10432559B2 (en) 2016-10-24 2019-10-01 Snap Inc. Generating and displaying customized avatars in electronic messages
CN112738408B (en) 2016-11-07 2022-09-16 斯纳普公司 Selective identification and ordering of image modifiers
US11281993B2 (en) 2016-12-05 2022-03-22 Apple Inc. Model and ensemble compression for metric learning
US10203855B2 (en) 2016-12-09 2019-02-12 Snap Inc. Customized user-controlled media overlays
CN108459781B (en) * 2016-12-13 2021-03-12 阿里巴巴(中国)有限公司 Input box display control method and device and user terminal
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
US11204787B2 (en) 2017-01-09 2021-12-21 Apple Inc. Application integration with a digital assistant
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US10454857B1 (en) 2017-01-23 2019-10-22 Snap Inc. Customized digital avatar accessories
US10915911B2 (en) 2017-02-03 2021-02-09 Snap Inc. System to determine a price-schedule to distribute media content
KR101894928B1 (en) 2017-02-14 2018-09-05 (주)스톤아이 Bonus calculating apparatus using number of visit and method thereof
KR20180094331A (en) * 2017-02-15 2018-08-23 삼성전자주식회사 Electronic apparatus and method for outputting message data thereof
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US11250075B1 (en) 2017-02-17 2022-02-15 Snap Inc. Searching social media content
US10074381B1 (en) 2017-02-20 2018-09-11 Snap Inc. Augmented reality speech balloon system
US10565795B2 (en) 2017-03-06 2020-02-18 Snap Inc. Virtual vision system
US10523625B1 (en) 2017-03-09 2019-12-31 Snap Inc. Restricted group content collection
US10581782B2 (en) 2017-03-27 2020-03-03 Snap Inc. Generating a stitched data stream
US10582277B2 (en) 2017-03-27 2020-03-03 Snap Inc. Generating a stitched data stream
US11170393B1 (en) 2017-04-11 2021-11-09 Snap Inc. System to calculate an engagement score of location based media content
TWI647609B (en) * 2017-04-14 2019-01-11 緯創資通股份有限公司 Instant messaging method, system and electronic device and server
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US10212541B1 (en) 2017-04-27 2019-02-19 Snap Inc. Selective location-based identity communication
CN111343075B (en) 2017-04-27 2022-09-16 斯纳普公司 Location privacy association on map-based social media platform
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
US10467147B1 (en) 2017-04-28 2019-11-05 Snap Inc. Precaching unlockable data elements
DK201770383A1 (en) 2017-05-09 2018-12-14 Apple Inc. User interface for correcting recognition errors
US10417266B2 (en) 2017-05-09 2019-09-17 Apple Inc. Context-aware ranking of intelligent response suggestions
DK180048B1 (en) 2017-05-11 2020-02-04 Apple Inc. MAINTAINING THE DATA PROTECTION OF PERSONAL INFORMATION
DK201770439A1 (en) 2017-05-11 2018-12-13 Apple Inc. Offline personal assistant
US10726832B2 (en) 2017-05-11 2020-07-28 Apple Inc. Maintaining privacy of personal information
US10395654B2 (en) 2017-05-11 2019-08-27 Apple Inc. Text normalization based on a data-driven learning network
DK179496B1 (en) 2017-05-12 2019-01-15 Apple Inc. USER-SPECIFIC Acoustic Models
DK179745B1 (en) 2017-05-12 2019-05-01 Apple Inc. SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT
US11301477B2 (en) 2017-05-12 2022-04-12 Apple Inc. Feedback analysis of a digital assistant
DK201770429A1 (en) 2017-05-12 2018-12-14 Apple Inc. Low-latency intelligent automated assistant
DK201770431A1 (en) 2017-05-15 2018-12-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
DK201770432A1 (en) 2017-05-15 2018-12-21 Apple Inc. Hierarchical belief states for digital assistants
DK179867B1 (en) 2017-05-16 2019-08-06 Apple Inc. RECORDING AND SENDING EMOJI
US10403278B2 (en) 2017-05-16 2019-09-03 Apple Inc. Methods and systems for phonetic matching in digital assistant services
US10303715B2 (en) 2017-05-16 2019-05-28 Apple Inc. Intelligent automated assistant for media exploration
DK179549B1 (en) 2017-05-16 2019-02-12 Apple Inc. Far-field extension for digital assistant services
KR20230144661A (en) 2017-05-16 2023-10-16 애플 인크. Emoji recording and sending
US10311144B2 (en) 2017-05-16 2019-06-04 Apple Inc. Emoji word sense disambiguation
US20180336892A1 (en) 2017-05-16 2018-11-22 Apple Inc. Detecting a trigger of a digital assistant
US10803120B1 (en) 2017-05-31 2020-10-13 Snap Inc. Geolocation based playlists
US10657328B2 (en) 2017-06-02 2020-05-19 Apple Inc. Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling
DK180859B1 (en) 2017-06-04 2022-05-23 Apple Inc USER INTERFACE CAMERA EFFECTS
US10796103B2 (en) 2017-07-12 2020-10-06 T-Mobile Usa, Inc. Word-by-word transmission of real time text
US10404632B2 (en) 2017-07-12 2019-09-03 T-Mobile Usa, Inc. Determining when to partition real time text content and display the partitioned content within separate conversation bubbles
CN107592415B (en) * 2017-08-31 2020-08-21 努比亚技术有限公司 Voice transmission method, terminal, and computer-readable storage medium
US11475254B1 (en) 2017-09-08 2022-10-18 Snap Inc. Multimodal entity identification
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
US10445429B2 (en) 2017-09-21 2019-10-15 Apple Inc. Natural language understanding using vocabularies with compressed serialized tries
US10755051B2 (en) 2017-09-29 2020-08-25 Apple Inc. Rule-based natural language processing
US10499191B1 (en) 2017-10-09 2019-12-03 Snap Inc. Context sensitive presentation of content
CN112040858A (en) 2017-10-19 2020-12-04 脸谱科技有限责任公司 System and method for identifying biological structures associated with neuromuscular source signals
US10573043B2 (en) 2017-10-30 2020-02-25 Snap Inc. Mobile-based cartographic control of display content
US10636424B2 (en) 2017-11-30 2020-04-28 Apple Inc. Multi-turn canned dialog
US11265273B1 (en) 2017-12-01 2022-03-01 Snap, Inc. Dynamic media overlay with smart widget
WO2019117361A1 (en) * 2017-12-14 2019-06-20 라인 가부시키가이샤 Interaction method and system in messaging service environment
US10749831B2 (en) * 2017-12-15 2020-08-18 Microsoft Technology Licensing, Llc Link with permission protected data preview
US11017173B1 (en) 2017-12-22 2021-05-25 Snap Inc. Named entity recognition visual context and caption data
US10523606B2 (en) 2018-01-02 2019-12-31 Snap Inc. Generating interactive messages with asynchronous media content
US10567321B2 (en) 2018-01-02 2020-02-18 Snap Inc. Generating interactive messages with asynchronous media content
US10678818B2 (en) 2018-01-03 2020-06-09 Snap Inc. Tag distribution visualization system
US10733982B2 (en) 2018-01-08 2020-08-04 Apple Inc. Multi-directional dialog
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11150730B1 (en) 2019-04-30 2021-10-19 Facebook Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US10733375B2 (en) 2018-01-31 2020-08-04 Apple Inc. Knowledge-based framework for improving natural language understanding
US11112964B2 (en) 2018-02-09 2021-09-07 Apple Inc. Media capture lock affordance for graphical user interface
US11507614B1 (en) 2018-02-13 2022-11-22 Snap Inc. Icon based tagging
US10979752B1 (en) 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
US10885136B1 (en) 2018-02-28 2021-01-05 Snap Inc. Audience filtering system
US10789959B2 (en) 2018-03-02 2020-09-29 Apple Inc. Training speaker recognition models for digital assistants
US10327096B1 (en) 2018-03-06 2019-06-18 Snap Inc. Geo-fence selection system
US10592604B2 (en) 2018-03-12 2020-03-17 Apple Inc. Inverse text normalization for automatic speech recognition
KR102494540B1 (en) 2018-03-14 2023-02-06 스냅 인코포레이티드 Creation of collectible items based on location information
US10818288B2 (en) 2018-03-26 2020-10-27 Apple Inc. Natural assistant interaction
US11163941B1 (en) 2018-03-30 2021-11-02 Snap Inc. Annotating a collection of media content items
US10909331B2 (en) 2018-03-30 2021-02-02 Apple Inc. Implicit identification of translation payload with neural machine translation
US10219111B1 (en) 2018-04-18 2019-02-26 Snap Inc. Visitation tracking system
DK180078B1 (en) 2018-05-07 2020-03-31 Apple Inc. USER INTERFACE FOR AVATAR CREATION
US10375313B1 (en) * 2018-05-07 2019-08-06 Apple Inc. Creative camera
US11145294B2 (en) 2018-05-07 2021-10-12 Apple Inc. Intelligent automated assistant for delivering content from user experiences
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US10928918B2 (en) 2018-05-07 2021-02-23 Apple Inc. Raise to speak
DK201870380A1 (en) 2018-05-07 2020-01-29 Apple Inc. Displaying user interfaces associated with physical activities
US10592001B2 (en) * 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10984780B2 (en) 2018-05-21 2021-04-20 Apple Inc. Global semantic word embeddings using bi-directional recurrent neural networks
US10896197B1 (en) 2018-05-22 2021-01-19 Snap Inc. Event detection system
DK179822B1 (en) 2018-06-01 2019-07-12 Apple Inc. Voice interaction at a primary device to access call functionality of a companion device
DK201870355A1 (en) 2018-06-01 2019-12-16 Apple Inc. Virtual assistant operation in multi-device environments
US10892996B2 (en) 2018-06-01 2021-01-12 Apple Inc. Variable latency device coordination
DK180639B1 (en) 2018-06-01 2021-11-04 Apple Inc DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT
US11386266B2 (en) 2018-06-01 2022-07-12 Apple Inc. Text correction
US10496705B1 (en) 2018-06-03 2019-12-03 Apple Inc. Accelerated task performance
US11063889B2 (en) 2018-06-08 2021-07-13 Snap Inc. Generating interactive messages with entity assets
KR20200002610A (en) * 2018-06-29 2020-01-08 캐논 가부시끼가이샤 Electronic device, control method for electronic device, and computer readable medium
US10679393B2 (en) 2018-07-24 2020-06-09 Snap Inc. Conditional modification of augmented reality object
USD890198S1 (en) 2018-08-21 2020-07-14 Facebook, Inc. Display screen with graphical user interface
USD894921S1 (en) 2018-08-21 2020-09-01 Facebook, Inc. Display screen with graphical user interface
US11017164B1 (en) * 2018-08-27 2021-05-25 Facebook, Inc. Systems and methods for collecting multiple forms of digital content using a single landing screen
US10942978B1 (en) 2018-08-27 2021-03-09 Facebook, Inc. Systems and methods for creating interactive metadata elements in social media compositions
US10842407B2 (en) 2018-08-31 2020-11-24 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system
US11025582B1 (en) 2018-09-05 2021-06-01 Facebook, Inc. Systems and methods for creating multiple renditions of a social media composition from inputs to a single digital composer
DK201870623A1 (en) 2018-09-11 2020-04-15 Apple Inc. User interfaces for simulated depth effects
EP3853698A4 (en) 2018-09-20 2021-11-17 Facebook Technologies, LLC Neuromuscular text entry, writing and drawing in augmented reality systems
US11770601B2 (en) 2019-05-06 2023-09-26 Apple Inc. User interfaces for capturing and managing visual media
US10674072B1 (en) 2019-05-06 2020-06-02 Apple Inc. User interfaces for capturing and managing visual media
US11010561B2 (en) 2018-09-27 2021-05-18 Apple Inc. Sentiment prediction from textual data
CN109120866B (en) * 2018-09-27 2020-04-03 Tencent Technology (Shenzhen) Co., Ltd. Dynamic expression generation method and device, computer readable storage medium and computer equipment
US11321857B2 (en) 2018-09-28 2022-05-03 Apple Inc. Displaying and editing images with depth information
US11462215B2 (en) 2018-09-28 2022-10-04 Apple Inc. Multi-modal inputs for voice commands
US10698583B2 (en) 2018-09-28 2020-06-30 Snap Inc. Collaborative achievement interface
US11170166B2 (en) 2018-09-28 2021-11-09 Apple Inc. Neural typographical error modeling via generative adversarial networks
US10839159B2 (en) 2018-09-28 2020-11-17 Apple Inc. Named entity normalization in a spoken dialog system
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
CN112214152B (en) * 2018-09-30 2022-01-18 Guangdong Power Grid Co., Ltd. Image classification method and device and electronic terminal
US11475898B2 (en) 2018-10-26 2022-10-18 Apple Inc. Low-latency multi-speaker speech recognition
US10778623B1 (en) 2018-10-31 2020-09-15 Snap Inc. Messaging and gaming applications communication platform
USD941320S1 (en) * 2018-11-06 2022-01-18 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD941850S1 (en) * 2018-11-06 2022-01-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US10447842B1 (en) * 2018-11-28 2019-10-15 Motorola Solutions, Inc. Push-to-talk to messaging application graphical interface
US10939236B1 (en) 2018-11-30 2021-03-02 Snap Inc. Position service to determine relative position to map features
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US11638059B2 (en) 2019-01-04 2023-04-25 Apple Inc. Content playback on multiple devices
US11032670B1 (en) 2019-01-14 2021-06-08 Snap Inc. Destination sharing in location sharing system
US10939246B1 (en) 2019-01-16 2021-03-02 Snap Inc. Location-based context information sharing in a messaging system
US11107261B2 (en) 2019-01-18 2021-08-31 Apple Inc. Virtual avatar animation based on facial feature movement
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US10936066B1 (en) 2019-02-13 2021-03-02 Snap Inc. Sleep detection in a location sharing system
US10838599B2 (en) 2019-02-25 2020-11-17 Snap Inc. Custom media overlay system
US10964082B2 (en) 2019-02-26 2021-03-30 Snap Inc. Avatar based on weather
US10852918B1 (en) 2019-03-08 2020-12-01 Snap Inc. Contextual information in chat
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11348573B2 (en) 2019-03-18 2022-05-31 Apple Inc. Multimodality in digital assistant systems
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11012390B1 (en) * 2019-03-28 2021-05-18 Snap Inc. Media content response in a messaging system
US11249614B2 (en) 2019-03-28 2022-02-15 Snap Inc. Generating personalized map interface with enhanced icons
US10810782B1 (en) 2019-04-01 2020-10-20 Snap Inc. Semantic texture mapping system
CN114938360B (en) * 2019-04-12 2023-04-18 Tencent Technology (Shenzhen) Co., Ltd. Data processing method and device based on instant messaging application
DK201970509A1 (en) 2019-05-06 2021-01-15 Apple Inc Spoken notifications
DK201970530A1 (en) 2019-05-06 2021-01-28 Apple Inc Avatar integration with multiple applications
US11706521B2 (en) 2019-05-06 2023-07-18 Apple Inc. User interfaces for capturing and managing visual media
US11475884B2 (en) 2019-05-06 2022-10-18 Apple Inc. Reducing digital assistant latency when a language is incorrectly determined
US11423908B2 (en) 2019-05-06 2022-08-23 Apple Inc. Interpreting spoken requests
US11307752B2 (en) 2019-05-06 2022-04-19 Apple Inc. User configurable task triggers
CN110134515A (en) * 2019-05-16 2019-08-16 Jizhi (Shanghai) Enterprise Management Consulting Co., Ltd. Multichannel short message service system, method and device
US11140099B2 (en) 2019-05-21 2021-10-05 Apple Inc. Providing message response suggestions
US10560898B1 (en) 2019-05-30 2020-02-11 Snap Inc. Wearable device location systems
US10582453B1 (en) 2019-05-30 2020-03-03 Snap Inc. Wearable device location systems architecture
DK180129B1 (en) 2019-05-31 2020-06-02 Apple Inc. User activity shortcut suggestions
US11289073B2 (en) 2019-05-31 2022-03-29 Apple Inc. Device text to speech
DK201970511A1 (en) 2019-05-31 2021-02-15 Apple Inc Voice identification in digital assistant systems
US11496600B2 (en) 2019-05-31 2022-11-08 Apple Inc. Remote execution of machine-learned models
US11360641B2 (en) 2019-06-01 2022-06-14 Apple Inc. Increasing the relevance of new available information
US11468890B2 (en) 2019-06-01 2022-10-11 Apple Inc. Methods and user interfaces for voice-based control of electronic devices
US11106342B1 (en) * 2019-06-03 2021-08-31 Snap Inc. User interfaces to facilitate multiple modes of electronic communication
US10893385B1 (en) 2019-06-07 2021-01-12 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11307747B2 (en) 2019-07-11 2022-04-19 Snap Inc. Edge gesture interface with smart interactions
KR20210030470A (en) * 2019-07-16 2021-03-17 LG Electronics Inc. Display device
CN110674618A (en) * 2019-09-03 2020-01-10 Beijing Dajia Internet Information Technology Co., Ltd. Content display method, device, equipment and medium
US11488406B2 (en) 2019-09-25 2022-11-01 Apple Inc. Text detection using global geometry estimators
US11821742B2 (en) 2019-09-26 2023-11-21 Snap Inc. Travel based notifications
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11429618B2 (en) 2019-12-30 2022-08-30 Snap Inc. Surfacing augmented reality objects
US11169658B2 (en) 2019-12-31 2021-11-09 Snap Inc. Combined map icon with action indicator
US11343323B2 (en) 2019-12-31 2022-05-24 Snap Inc. Augmented reality objects registry
US20210210099A1 (en) * 2020-01-06 2021-07-08 Soundhound, Inc. Multi Device Proxy
US11228551B1 (en) 2020-02-12 2022-01-18 Snap Inc. Multiple gateway message exchange
US11265274B1 (en) 2020-02-28 2022-03-01 Snap Inc. Access and routing of interactive messages
US11516167B2 (en) 2020-03-05 2022-11-29 Snap Inc. Storing data based on device location
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US10956743B1 (en) 2020-03-27 2021-03-23 Snap Inc. Shared augmented reality system
US11430091B2 (en) 2020-03-27 2022-08-30 Snap Inc. Location mapping for large scale augmented-reality
DK181103B1 (en) 2020-05-11 2022-12-15 Apple Inc User interfaces related to time
US11079913B1 (en) 2020-05-11 2021-08-03 Apple Inc. User interface for status indicators
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
US11061543B1 (en) 2020-05-11 2021-07-13 Apple Inc. Providing relevant data items based on context
US11183193B1 (en) 2020-05-11 2021-11-23 Apple Inc. Digital assistant hardware abstraction
US11039074B1 (en) 2020-06-01 2021-06-15 Apple Inc. User interfaces for managing media
AU2020239717B2 (en) * 2020-06-01 2022-06-16 Apple Inc. User interfaces for managing media
JP2023529126A (en) 2020-06-08 2023-07-07 Apple Inc. Presentation of avatars in a 3D environment
WO2021253048A1 (en) * 2020-06-10 2021-12-16 Snap Inc. Contextual sending menu
US11503432B2 (en) 2020-06-15 2022-11-15 Snap Inc. Scalable real-time location sharing framework
US11483267B2 (en) 2020-06-15 2022-10-25 Snap Inc. Location sharing using different rate-limited links
US11290851B2 (en) 2020-06-15 2022-03-29 Snap Inc. Location sharing using offline and online objects
US11314776B2 (en) 2020-06-15 2022-04-26 Snap Inc. Location sharing using friend list versions
US11308327B2 (en) 2020-06-29 2022-04-19 Snap Inc. Providing travel-based augmented reality content with a captured image
US11490204B2 (en) 2020-07-20 2022-11-01 Apple Inc. Multi-device audio adjustment coordination
US11438683B2 (en) 2020-07-21 2022-09-06 Apple Inc. User identification using headphones
US11349797B2 (en) 2020-08-31 2022-05-31 Snap Inc. Co-location connection service
US11212449B1 (en) 2020-09-25 2021-12-28 Apple Inc. User interfaces for media capture and management
US11297281B1 (en) * 2021-03-22 2022-04-05 Motorola Mobility Llc Manage a video conference session in a multi-tasking environment
US11606756B2 (en) 2021-03-29 2023-03-14 Snap Inc. Scheduling requests for location data
US11645324B2 (en) 2021-03-31 2023-05-09 Snap Inc. Location-based timeline media content system
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
US11778339B2 (en) 2021-04-30 2023-10-03 Apple Inc. User interfaces for altering visual media
US11539876B2 (en) 2021-04-30 2022-12-27 Apple Inc. User interfaces for altering visual media
US11714536B2 (en) 2021-05-21 2023-08-01 Apple Inc. Avatar sticker editor user interfaces
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen
CN115562540A (en) * 2021-07-01 2023-01-03 Zoom Video Communications, Inc. Method for applying video effects in a video communication session
US11868596B2 (en) * 2021-07-28 2024-01-09 Capital One Services, Llc Color-based system for generating notifications
WO2023034025A1 (en) * 2021-09-02 2023-03-09 Snap Inc. Scan-based messaging for electronic eyewear devices
US11829834B2 (en) 2021-10-29 2023-11-28 Snap Inc. Extended QR code
CN114780190B (en) * 2022-04-13 2023-12-22 Lemon Inc. Message processing method, device, electronic equipment and storage medium
KR20240037705A (en) * 2022-09-15 2024-03-22 Samsung SDS Co., Ltd. Receiving device and method of managing attention message

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100169772A1 (en) * 2008-12-31 2010-07-01 Verizon Data Services Llc Tabbed content view on a touch-screen device
US20120009896A1 (en) * 2010-07-09 2012-01-12 Microsoft Corporation Above-lock camera access
US20120190388A1 (en) * 2010-01-07 2012-07-26 Swakker Llc Methods and apparatus for modifying a multimedia object within an instant messaging session at a mobile communication device
US20120210200A1 (en) * 2011-02-10 2012-08-16 Kelly Berger System, method, and touch screen graphical user interface for managing photos and creating photo books
US20140085487A1 (en) * 2012-09-25 2014-03-27 Samsung Electronics Co. Ltd. Method for transmitting image and electronic device thereof

Family Cites Families (111)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5477264A (en) 1994-03-29 1995-12-19 Eastman Kodak Company Electronic imaging system using a removable software-enhanced storage device
JPH11308434A (en) 1998-04-24 1999-11-05 Sanyo Electric Co Ltd Image printer
US6473769B1 (en) 1999-03-31 2002-10-29 Microsoft Corporation Property linking in object-oriented computing environments
US7315848B2 (en) 2001-12-12 2008-01-01 Aaron Pearse Web snippets capture, storage and retrieval system and method
JP2004304297A (en) 2003-03-28 2004-10-28 Kyocera Corp Mobile information terminal
GB2421879A (en) * 2003-04-22 2006-07-05 Spinvox Ltd Converting voicemail to text message for transmission to a mobile telephone
US20070054678A1 (en) * 2004-04-22 2007-03-08 Spinvox Limited Method of generating a sms or mms text message for receipt by a wireless information device
FR2872660B1 (en) 2004-07-05 2006-12-22 Eastman Kodak Co SHOOTING APPARATUS AND METHOD FOR FORMATION OF ANNOTATED IMAGES
EP1653398A1 (en) 2004-10-29 2006-05-03 Research In Motion Limited Extended User Interface for Email Composition
US7376699B2 (en) * 2004-12-02 2008-05-20 Scenera Technologies, Llc System and method for sending an image from a communication device
US20060174203A1 (en) 2005-01-31 2006-08-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Viewfinder for shared image device
US20060215242A1 (en) 2005-03-24 2006-09-28 Mohammad Besharat Communication device configured to automatically append an image to a message and method therefore
US7685530B2 (en) 2005-06-10 2010-03-23 T-Mobile Usa, Inc. Preferred contact group centric interface
US20070136750A1 (en) 2005-12-13 2007-06-14 Microsoft Corporation Active preview for media items
KR100746943B1 (en) * 2006-02-06 2007-08-07 KT Freetel Co., Ltd. Mobile terminal for chatting by using SMS and method thereof
US9304675B2 (en) * 2006-09-06 2016-04-05 Apple Inc. Portable electronic device for instant messaging
US8330773B2 (en) 2006-11-21 2012-12-11 Microsoft Corporation Mobile data and handwriting screen capture and forwarding
JP5196777B2 (en) 2006-12-11 2013-05-15 Square Enix Co., Ltd. Game equipment
US8031170B2 (en) 2007-05-09 2011-10-04 Research In Motion Limited User interface for selecting a photo tag
US8694379B2 (en) 2007-05-14 2014-04-08 Microsoft Corporation One-click posting
KR101396974B1 (en) 2007-07-23 2014-05-20 LG Electronics Inc. Portable terminal and method for processing call signal in the portable terminal
US8422550B2 (en) 2007-07-27 2013-04-16 Lagavulin Limited Apparatuses, methods, and systems for a portable, automated contractual image dealer and transmitter
US8407603B2 (en) 2008-01-06 2013-03-26 Apple Inc. Portable electronic device for instant messaging multiple recipients
US8913176B2 (en) * 2008-09-05 2014-12-16 Lg Electronics Inc. Mobile terminal and method of performing multi-focusing and photographing image including plurality of objects using the same
US20100124906A1 (en) 2008-11-14 2010-05-20 Nokia Corporation Method and Apparatus for Transmitting and Receiving Data
US8584031B2 (en) 2008-11-19 2013-11-12 Apple Inc. Portable touch screen device, method, and graphical user interface for using emoji characters
US8406386B2 (en) * 2008-12-15 2013-03-26 Verizon Patent And Licensing Inc. Voice-to-text translation for visual voicemail
US20100223345A1 (en) * 2009-03-02 2010-09-02 Microsoft Corporation Communications application having conversation and meeting environments
EP2228711A3 (en) 2009-03-12 2014-06-04 Lg Electronics Inc. Mobile terminal and method for providing user interface thereof
US8564541B2 (en) 2009-03-16 2013-10-22 Apple Inc. Zhuyin input interface on a device
US20100262924A1 (en) 2009-04-08 2010-10-14 Kalu Onuka Kalu System and method for linking items to a group
KR101588730B1 (en) 2009-04-21 2016-01-26 LG Electronics Inc. Mobile terminal and method for communicating using instant messaging service thereof
KR101590766B1 (en) 2009-06-26 2016-02-02 Samsung Electronics Co., Ltd. Apparatus and method for grouping and displaying messages
US9251428B2 (en) * 2009-07-18 2016-02-02 Abbyy Development Llc Entering information through an OCR-enabled viewfinder
KR101598632B1 (en) 2009-10-01 2016-02-29 Microsoft Technology Licensing, LLC Mobile terminal and method for editing tag thereof
EP2312800B1 (en) 2009-10-14 2014-07-30 BlackBerry Limited System and Method for Managing Messages in Conversational-Type Messaging Applications
US8239783B2 (en) 2009-11-19 2012-08-07 Microsoft Corporation Integrated viewfinder and digital media
US8698762B2 (en) 2010-01-06 2014-04-15 Apple Inc. Device, method, and graphical user interface for navigating and displaying content in context
US8554731B2 (en) 2010-03-31 2013-10-08 Microsoft Corporation Creating and propagating annotated information
US9003306B2 (en) 2010-05-04 2015-04-07 Qwest Communications International Inc. Doodle-in-chat-context
US20110276901A1 (en) 2010-05-04 2011-11-10 Qwest Communications International Inc. Family chat
KR101606727B1 (en) 2010-06-25 2016-03-28 LG Electronics Inc. Mobile terminal and operation method thereof
KR101701836B1 (en) 2010-07-05 2017-02-02 LG Electronics Inc. Mobile terminal and method for controlling message transmission thereof
US8645136B2 (en) * 2010-07-20 2014-02-04 Intellisist, Inc. System and method for efficiently reducing transcription error using hybrid voice transcription
KR20120023405A (en) 2010-09-03 2012-03-13 Samsung Electronics Co., Ltd. Method and apparatus for providing user interface
WO2012038771A1 (en) 2010-09-21 2012-03-29 Sony Ericsson Mobile Communications Ab System and method of enhancing messages
US8760488B2 (en) 2010-10-22 2014-06-24 Litl Llc Video integration
US8676891B2 (en) 2010-11-01 2014-03-18 Google Inc. Visibility inspector in social networks
US9203796B2 (en) 2010-11-12 2015-12-01 Facebook, Inc. Messaging system with multiple messaging channels
KR101879702B1 (en) * 2010-11-12 2018-07-18 페이스북, 인크. Messaging System with Multiple Messaging Channels
US8738705B2 (en) 2010-12-21 2014-05-27 Facebook, Inc. Categorizing social network objects based on user affiliations
EP2503769B1 (en) * 2011-03-18 2019-06-19 BlackBerry Limited Method and apparatus for join selection of a conference call
ES2870588T3 (en) 2011-04-29 2021-10-27 American Greetings Corp Systems, procedures and apparatus for creating, editing, distributing and displaying electronic greeting cards
KR101704549B1 (en) * 2011-06-10 2017-02-22 Samsung Electronics Co., Ltd. Method and apparatus for providing interface for inputting character
US8635293B2 (en) 2011-06-13 2014-01-21 Microsoft Corporation Asynchronous video threads
US8660582B2 (en) 2011-09-20 2014-02-25 Steve Y. Chen System and method for electronic communications between users in a similar geographic location
US20130083215A1 (en) 2011-10-03 2013-04-04 Netomat, Inc. Image and/or Video Processing Systems and Methods
KR101789626B1 (en) 2011-10-10 2017-10-25 LG Electronics Inc. Mobile terminal and method for controlling the same
US8819154B2 (en) 2011-10-14 2014-08-26 Blackberry Limited User interface methods and apparatus for use in communicating text and photo messages
EP2582120A1 (en) 2011-10-14 2013-04-17 Research In Motion Limited User interface methods and apparatus for use in communicating text and photo messages
US9042527B2 (en) * 2011-10-17 2015-05-26 At&T Intellectual Property I, L.P. Visual voice mail delivery mechanisms
JP5793054B2 (en) 2011-10-20 2015-10-14 Kyocera Corporation Portable terminal device, program, and execution suppression method
KR101301794B1 (en) 2011-11-04 2013-08-29 Kakao Corp. Method for providing instant messaging service using dynamic emoticon and mobile phone therefor
US20130117378A1 (en) 2011-11-06 2013-05-09 Radoslav P. Kotorov Method for collaborative social shopping engagement
KR101833825B1 (en) 2011-11-11 2018-03-02 LG Electronics Inc. Mobile terminal and method for controlling the same
KR20130054071A (en) 2011-11-16 2013-05-24 Samsung Electronics Co., Ltd. Mobile apparatus for processing multiple applications and method thereof
US9743357B2 (en) * 2011-12-16 2017-08-22 Joseph Akwo Tabe Energy harvesting computer device in association with a communication device configured with apparatus for boosting signal reception
KR101655876B1 (en) 2012-01-05 2016-09-09 Samsung Electronics Co., Ltd. Operating method for conversation based on a message and device supporting the same
JP5859330B2 (en) 2012-02-02 2016-02-10 Konami Digital Entertainment Co., Ltd. Message transmission system, control method and program
KR102008495B1 (en) 2012-02-24 2019-08-08 Samsung Electronics Co., Ltd. Method for sharing content and mobile terminal thereof
US9363220B2 (en) 2012-03-06 2016-06-07 Apple Inc. Context-sensitive help for image viewing and editing application
EP2637128B1 (en) * 2012-03-06 2018-01-17 beyo GmbH Multimodal text input by a keyboard/camera text input module replacing a conventional keyboard text input module on a mobile device
US10282055B2 (en) * 2012-03-06 2019-05-07 Apple Inc. Ordered processing of edits for a media editing application
US20130246138A1 (en) 2012-03-16 2013-09-19 J. Aaron Johnson Systems and methods propagating advertising materials in a social media environment
JP5706868B2 (en) 2012-03-30 2015-04-22 Line Corporation System and method for providing avatar/game/entertainment functionality on a messenger platform
US9154456B2 (en) 2012-04-17 2015-10-06 Trenda Innovations, Inc. Messaging system and method
KR101873761B1 (en) 2012-04-20 2018-07-03 LG Electronics Inc. Mobile terminal and method for controlling the same
JP6023879B2 (en) * 2012-05-18 2016-11-09 Apple Inc. Apparatus, method and graphical user interface for operating a user interface based on fingerprint sensor input
US9135751B2 (en) 2012-06-05 2015-09-15 Apple Inc. Displaying location preview
KR101917691B1 (en) 2012-06-07 2018-11-13 LG Electronics Inc. Mobile terminal and control method thereof
US20160050289A1 (en) 2012-06-26 2016-02-18 Google Inc. Automatic sharing of digital content
US20130346904A1 (en) * 2012-06-26 2013-12-26 International Business Machines Corporation Targeted key press zones on an interactive display
US9911222B2 (en) 2012-07-06 2018-03-06 Tangome, Inc. Animation in threaded conversations
US20150178968A1 (en) 2012-07-13 2015-06-25 Entetrainer Oy Imaging module in mobile device
KR102001215B1 (en) 2012-07-20 2019-07-17 Samsung Electronics Co., Ltd. Method and system for sharing content, device and computer readable recording medium thereof
US20140040773A1 (en) 2012-07-31 2014-02-06 Apple Inc. Transient Panel Enabling Message Correction Capabilities Prior to Data Submission
US9785314B2 (en) 2012-08-02 2017-10-10 Facebook, Inc. Systems and methods for displaying an animation to confirm designation of an image for sharing
US8428453B1 (en) 2012-08-08 2013-04-23 Snapchat, Inc. Single mode visual media capture
US9443271B2 (en) 2012-08-15 2016-09-13 Imvu, Inc. System and method for increasing clarity and expressiveness in network communications
WO2014031899A1 (en) 2012-08-22 2014-02-27 Goldrun Corporation Augmented reality virtual content platform apparatuses, methods and systems
CN103634198B (en) * 2012-08-29 2018-07-27 Shenzhen Jinghongquan Intelligent Technology Co., Ltd. Interface display method and device for instant messaging software
US8928724B2 (en) * 2012-08-31 2015-01-06 Microsoft Corporation Unified user experience for mobile calls
JP5631947B2 (en) 2012-09-21 2014-11-26 Konami Digital Entertainment Co., Ltd. Management device, message management method and program
US8762895B2 (en) 2012-10-28 2014-06-24 Google Inc. Camera zoom indicator in mobile devices
US9509645B2 (en) 2012-12-28 2016-11-29 Lg Electronics Inc. Mobile terminal, message transceiving server and controlling method thereof
KR101331444B1 (en) 2013-02-08 2013-11-21 SK Planet Co., Ltd. Method for instant messaging service, storage medium recording program and device therefor
KR102074940B1 (en) 2013-03-07 2020-03-02 Samsung Electronics Co., Ltd. Method for transmitting and receiving message and apparatus for the same
US9261985B2 (en) * 2013-03-11 2016-02-16 Barnes & Noble College Booksellers, Llc Stylus-based touch-sensitive area for UI control of computing device
US9946365B2 (en) * 2013-03-11 2018-04-17 Barnes & Noble College Booksellers, Llc Stylus-based pressure-sensitive area for UI control of computing device
US9419935B2 (en) 2013-08-02 2016-08-16 Whatsapp Inc. Voice communications with real-time status notifications
CN110673769A (en) 2013-08-16 2020-01-10 Lenovo (Beijing) Co., Ltd. Information processing method and electronic equipment
US9804760B2 (en) 2013-08-22 2017-10-31 Apple Inc. Scrollable in-line camera for capturing and sharing content
US20150089389A1 (en) 2013-09-24 2015-03-26 Sap Ag Multiple mode messaging
KR20160086848A (en) 2013-11-27 2016-07-20 Facebook, Inc. Communication user interface systems and methods
US9246961B2 (en) 2013-11-27 2016-01-26 Facebook, Inc. Communication user interface systems and methods
CN103761216B (en) * 2013-12-24 2018-01-16 Shanghai Feixun Data Communication Technology Co., Ltd. Method and mobile terminal for editing text
US20150264303A1 (en) 2014-03-17 2015-09-17 Microsoft Corporation Stop Recording and Send Using a Single Action
US10845982B2 (en) 2014-04-28 2020-11-24 Facebook, Inc. Providing intelligent transcriptions of sound messages in a messaging application

Also Published As

Publication number Publication date
US11455093B2 (en) 2022-09-27
EP2940570A1 (en) 2015-11-04
MX2016014088A (en) 2017-02-09
EP3825835A1 (en) 2021-05-26
IL247021A (en) 2017-04-30
JP2017521736A (en) 2017-08-03
CN106255949B (en) 2020-04-24
CA2938529C (en) 2017-03-21
JP6266805B2 (en) 2018-01-24
WO2015167589A1 (en) 2015-11-05
US20150312182A1 (en) 2015-10-29
US9836207B2 (en) 2017-12-05
US20160283109A1 (en) 2016-09-29
CN106255949A (en) 2016-12-21
EP2940569A1 (en) 2015-11-04
AU2014392592A1 (en) 2016-09-01
EP3862862A1 (en) 2021-08-11
US9391934B2 (en) 2016-07-12
US10976915B2 (en) 2021-04-13
AU2014392593B2 (en) 2017-04-06
KR101733637B1 (en) 2017-05-10
JP6262362B2 (en) 2018-01-17
US20150312184A1 (en) 2015-10-29
AU2014392593A1 (en) 2016-09-01
US20180081518A1 (en) 2018-03-22
US20150312175A1 (en) 2015-10-29
US9391933B2 (en) 2016-07-12
KR20160113309A (en) 2016-09-28
BR112016025271A2 (en) 2017-08-15
JP2017521737A (en) 2017-08-03
WO2015167590A1 (en) 2015-11-05
CA2938528A1 (en) 2015-11-05
CA2938529A1 (en) 2015-11-05
US20160299658A1 (en) 2016-10-13
EP2940569B1 (en) 2021-03-17
IL247022A (en) 2017-12-31
MX2016014087A (en) 2017-02-09
CN106255989A (en) 2016-12-21
US20150312185A1 (en) 2015-10-29
KR20160118336A (en) 2016-10-11
US10845982B2 (en) 2020-11-24
US11397523B2 (en) 2022-07-26
CA2938528C (en) 2017-03-21
KR101734312B1 (en) 2017-05-11
MX365297B (en) 2019-05-29
US10809908B2 (en) 2020-10-20
AU2014392592B2 (en) 2017-02-16
US20210191589A1 (en) 2021-06-24

Similar Documents

Publication Publication Date Title
US20220300132A1 (en) Facilitating the editing of multimedia as part of sending the multimedia in a message
US10698575B2 (en) Communication user interface systems and methods
US10397156B2 (en) Providing message status notifications during electronic messaging
US20150149927A1 (en) Communication user interface systems and methods

Legal Events

Date Code Title Description
AS Assignment

Owner name: META PLATFORMS, INC., CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:FACEBOOK, INC.;REEL/FRAME:060329/0155

Effective date: 20211028

Owner name: FACEBOOK, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LANGHOLZ, BENJAMIN S.;TYLER, WILLIAM MCMILLAN;REEL/FRAME:060154/0671

Effective date: 20140814

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION