US20160226806A1 - Digital media messages and files - Google Patents
- Publication number
- US20160226806A1 (U.S. application Ser. No. 15/094,557)
- Authority
- US
- United States
- Prior art keywords
- digital
- media message
- electronic device
- content segment
- digital content
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/10—Multimedia information
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- H04L51/22—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/42—Mailbox-related aspects, e.g. synchronisation of mailboxes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/06—Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/08—Annexed information, e.g. attachments
Definitions
- electronic device messaging applications have been developed to assist the user in creating digital messages that include, for example, images, audio, video, and other content.
- the functionality of existing messaging applications is limited.
- such applications generally do not enable the user to combine a wide array of digital content segments (e.g., an audio segment, a video segment, a digital image, etc.) such that two or more content segments, such as segments from different sources, can be presented to the recipient simultaneously as an integrated component of the digital message.
- although content segments may be captured and/or saved locally on a user device, sharing such content segments with other user devices as part of the digital message and via such applications can be cumbersome due to the size of the audio files, video files, and/or other components of such content segments.
- large audio files, video files, and the like can require significant memory for storage locally on a user device, and can also require a substantial amount of bandwidth to upload and/or download on most wireless networks.
- handling such content segments can place significant strain on user device resources and network resources, and can hinder device and/or network performance.
- traditional techniques for rendering digital messages require that the video rendering and content upload processes begin only after a user has finished generating and/or composing the message. This constraint can result in prolonged upload/render times once the message is complete.
- Example embodiments of the present disclosure are directed toward curing one or more of the deficiencies described above.
- FIG. 1 is a schematic diagram of an illustrative computing environment for implementing various embodiments of digital media message generation.
- FIG. 2 is a schematic diagram of illustrative components in an example server computer that may be used in an example digital media message generation environment.
- FIG. 3 is a schematic diagram of illustrative components in an example electronic device that may be used in an example digital media message generation environment.
- FIG. 4 shows an illustrative user interface screen displayed on an electronic device that enables users to generate a portion of an example digital media message.
- FIG. 5 shows another illustrative user interface screen displayed on an electronic device that enables users to generate a portion of an example digital media message.
- FIG. 6 shows still another illustrative user interface screen displayed on an electronic device that enables users to generate a portion of an example digital media message.
- FIG. 7 shows yet another illustrative user interface screen displayed on an electronic device that enables users to generate a portion of an example digital media message.
- FIG. 8 shows an illustrative user interface screen displayed on an electronic device that enables users to share an example digital media message.
- FIG. 9 is a flow diagram illustrating an example method of the present disclosure.
- FIG. 10 shows an illustrative user interface screen displayed on an electronic device, as well as example audio and video tracks.
- FIG. 11 shows another illustrative user interface screen displayed on an electronic device, as well as example audio and video tracks.
- FIG. 12 shows an illustrative electronic file of the present disclosure.
- FIG. 13 is a flow diagram illustrating another example method of the present disclosure.
- FIG. 14 shows another illustrative user interface screen displayed on an electronic device that enables users to create and/or modify a digital media message.
- the disclosure is directed to devices and techniques for generating digital media messages that can be easily shared between users of electronic devices as a means of communication.
- the techniques described herein enable users to combine a variety of different digital content segments into a single digital media message.
- the user may create a digital media message by capturing audio content segments, video content segments, digital images, web content, and the like.
- Such content segments may be captured by the user during generation of the digital media message.
- such content segments may be captured by the user prior to generating the digital media message and may be saved in a memory of the electronic device, or in a cloud-based memory, for incorporation into the digital media message.
- replacing, for example, part of a video track of an underlying digital video segment with a digital image may reduce the file size of the resulting digital media message.
- the replaced portion of the video track may typically be rendered at approximately 30 frames/second for a duration of the portion of the video track, and may be characterized by a commensurate memory and/or file size (e.g., in bytes).
- the selected digital image may comprise a single frame that may be rendered for the duration of the replaced portion of the video track.
- replacing a portion of the video track of the underlying digital video segment with the digital image may reduce the number of frames/second of the underlying video segment, thereby reducing file size thereof.
- a digital media message generated using such techniques may have a smaller file size and may require/take up less memory than a corresponding digital media message generated using the underlying digital video segment with the video track unchanged (e.g., without replacing a portion of the video track with a selected digital image).
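The file-size saving described above can be approximated with a simple back-of-the-envelope calculation. The following sketch is illustrative only; the frame rate, duration, and per-frame size are assumed example values, not figures from the disclosure:

```python
def estimated_track_bytes(fps: float, duration_s: float, bytes_per_frame: int) -> int:
    """Rough size of a video-track portion stored as a sequence of frames."""
    return int(fps * duration_s * bytes_per_frame)

def savings_from_image_replacement(fps: float, replaced_s: float, bytes_per_frame: int) -> int:
    """Bytes saved by replacing a portion of the video track with a single
    still image that is rendered for the full duration of that portion."""
    original = estimated_track_bytes(fps, replaced_s, bytes_per_frame)
    replacement = bytes_per_frame  # one frame, displayed for the whole portion
    return original - replacement

# Illustrative numbers: 30 fps, a 5-second replaced portion, ~50 kB per frame.
saved = savings_from_image_replacement(30, 5.0, 50_000)
```

Under these assumed numbers the single still image replaces 150 frames, which is where essentially all of the reduction comes from.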
- such a reduced file size may reduce the server and/or electronic device memory required to receive and/or store such messages. Such a reduced file size may also reduce the processor load required to provide, render, display, and/or otherwise process such digital media messages. As a result, such a reduction in file size and/or memory requirements will reduce overall network load/traffic, and will improve network, server, and/or electronic device performance and efficiency. Additionally, various embodiments of the present disclosure may enable digital content segments associated with the digital media message to be uploaded to a server computer or other cloud-based resource substantially in real time. In some examples, various digital content segments associated with the digital media message being created may be transferred to such a server computer while such digital content segments are being captured by a user device. As a result, the time required for transferring and/or rendering the finished digital media message on an additional user device may be greatly reduced.
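The "upload while capturing" behavior described above can be sketched as a producer/consumer pair in which recorded chunks are queued for transfer as soon as they arrive, rather than after capture completes. The chunk contents and the in-memory `server` list below are illustrative stand-ins for a real capture pipeline and network client:

```python
import queue
import threading

def capture(chunks, out_q):
    """Producer: emit chunks as they are 'captured' by the user device."""
    for chunk in chunks:
        out_q.put(chunk)
    out_q.put(None)  # end-of-capture sentinel

def upload(out_q, server):
    """Consumer: transfer each chunk as soon as it becomes available,
    concurrently with ongoing capture."""
    while True:
        chunk = out_q.get()
        if chunk is None:
            break
        server.append(chunk)

recorded = [b"frame-batch-%d" % i for i in range(4)]
q = queue.Queue()
server = []
worker = threading.Thread(target=upload, args=(q, server))
worker.start()
capture(recorded, q)
worker.join()
# server now holds every chunk, transferred while capture was in progress
```

Because the queue is FIFO, chunks arrive at the server in capture order, which matters if the server begins rendering before the message is complete.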
- FIG. 1 is a schematic diagram of an illustrative computing environment 100 for implementing various embodiments of digital media message generation.
- the computing environment 100 may include server(s) 102 and one or more electronic devices 104 ( 1 )- 104 (N) (collectively “electronic devices 104 ”) that are communicatively connected by a network 106 .
- the network 106 may be a local area network (“LAN”), a larger network such as a wide area network (“WAN”), or a collection of networks, such as the Internet. Protocols for network communication, such as TCP/IP, may be used to implement the network 106 .
- embodiments are described herein as using a network such as the Internet, other distribution techniques may be implemented that transmit information via memory cards, flash memory, or other portable memory devices.
- a media message engine 108 on the electronic devices 104 and/or a media message engine 110 on the server(s) 102 may receive one or more digital content segments 112 ( 1 )- 112 (N) (collectively, “digital content segments 112 ” or “content segments 112 ”) and may generate one or more digital media messages 114 (or “media messages 114 ”) based on the content segments 112 .
- the media message engine 108 may receive one or more content segments 112 via interaction of a user 116 with an electronic device 104 .
- the media message engine 108 may provide such content segments 112 to the media message engine 110 on the server 102 , via the network 106 , to generate at least a portion of the media message 114 .
- the media message 114 may be generated by the media message engine 108 of the respective electronic device 104 .
- the media message 114 may be directed to one or more additional electronic devices 118 ( 1 )- 118 (N) (collectively “electronic devices 118 ”) via the network 106 .
- Such electronic devices 118 may be disposed at a location remote from the electronic devices 104 , and one or more users 120 may consume the digital media message 114 via one or more of the electronic devices 118 .
- Each of the electronic devices 104 may include a display component, a digital camera, and an audio input and transmission component. Such audio input and transmission components may include one or more microphones.
- the electronic devices 104 may also include hardware and/or software that support voice over Internet Protocol (VoIP) as well as any of the display, input, and/or output components described herein.
- Each of the electronic devices 104 may further include a web browser that enables the user 116 to navigate to a web page via the network 106 .
- the user 116 may generate and/or capture one or more digital content segments 112 using, for example, the camera and the microphone.
- the user 116 may capture one or more digital images using the camera and/or may capture one or more video clips using the camera in conjunction with the microphone.
- each web page may present content that the user 116 may capture via the electronic device 104 , using various copy and/or save commands included in the web browser of the electronic device 104 , and the user may incorporate such content into one or more content segments 112 .
- Any of the content segments 112 described herein may be provided to one or both of the media message engines 108 , 110 , and the media message engines 108 , 110 may incorporate such content segments 112 into the media message 114 .
- the media message engines 108 , 110 may tag the respective content segments 112 with associated metadata.
- the associated metadata may include profile information about the type of content (e.g., image, video, audio, text, animation, etc.), the source of the content segment 112 (e.g., camera, microphone, internet web page, etc.), and/or a position in a play sequence of the digital media message 114 with which the content segment 112 is to be associated.
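The metadata fields listed above can be modeled as a small record attached to each content segment. The field names and values below are illustrative, not a schema from the disclosure:

```python
from dataclasses import dataclass, asdict

@dataclass
class SegmentMetadata:
    content_type: str   # e.g. "image", "video", "audio", "text", "animation"
    source: str         # e.g. "camera", "microphone", "internet web page"
    play_position: int  # position in the play sequence of the media message

def tag_segment(segment: bytes, meta: SegmentMetadata) -> dict:
    """Bundle a raw content segment with its associated metadata."""
    return {"data": segment, "meta": asdict(meta)}

tagged = tag_segment(b"...", SegmentMetadata("audio", "microphone", 0))
```

A record like this is what would let the integration module later decide where, and alongside what, each segment is presented.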
- the media message engines 108 , 110 described herein may integrate and/or otherwise combine two or more digital content segments 112 to form the digital media message 114 .
- the digital content segments 112 may be presented to the user sequentially when the media message 114 is played.
- the media message engines 108 , 110 may combine two or more digital content segments 112 such that the combined digital content segments 112 are presented simultaneously when the media message 114 is played.
- the media message engines 108 , 110 may also distribute the finalized media message 114 to one or more of the electronic devices 118 .
- Various example components and functionality of the media message engines 108 , 110 will be described in greater detail below with respect to, for example, FIGS. 2 and 3 .
- the electronic devices 104 , 118 may include a mobile phone, a portable computer, a tablet computer, an electronic book reader device (an “eBook reader device”), or other devices.
- Each of the electronic devices 104 , 118 may have software and hardware components that enable the display of digital content segments 112 , either separately or combined, as well as the various digital media messages 114 described herein.
- the electronic devices 104 , 118 noted above are merely examples, and other electronic devices that are equipped with network communication components, data processing components, electronic displays for displaying data, and audio output capabilities may also be employed.
- FIG. 2 is a schematic diagram of illustrative components in example server(s) 102 of the present disclosure.
- the server(s) 102 may include one or more processor(s) 202 and memory 204 .
- the memory 204 may include computer readable media.
- Computer readable media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. As defined herein, computer readable media does not include communication media in the form of modulated data signals, such as carrier waves, or other transmission mechanisms.
- the media message engine 110 may be a hardware or a software component of the server(s) 102 and in some embodiments, the media message engine 110 may comprise a component of the memory 204 . As shown in FIG. 2 , in some embodiments the media message engine 110 may include one or more of a content presentation module 206 , a segment collection module 208 , an analysis module 210 , an integration module 212 , and a distribution module 214 . The modules may include routines, programs, instructions, objects, and/or data structures that perform particular tasks or implement particular abstract data types. The server(s) 102 may also implement a data store 216 that stores data, digital content segments 112 , and/or other information or content used by the media message engine 110 .
- the content presentation module 206 may enable a user 116 to select digital content segments 112 for the purpose of including the selected digital content segments 112 in a digital media message 114 .
- the content presentation module 206 may present a web page to a user 116 of an electronic device 104 , such as via the network 106 .
- the content presentation module 206 may present digital content, information, and/or one or more digital content segments 112 to the user 116 of an electronic device 104 via the network 106 .
- the content presentation module 206 may also enable the user 116 to select content, information, and/or one or more digital content segments 112 .
- the content presentation module 206 may present further content, information, and/or digital content segments 112 to the user 116 .
- the content presentation module 206 may also tag the selected digital content segment 112 for inclusion in the digital media message 114 .
- the segment collection module 208 may collect audio recordings, video recordings, images, files, web content, audio files, video files, web addresses, and/or other digital content segments 112 identified, selected, and/or captured by the user 116 . Additionally, the segment collection module 208 may label each digital content segment 112 with metadata.
- the metadata may include profile information about the type of content (e.g., image, video, audio, text, animation, etc.), the source of the content segment 112 (e.g., camera, microphone, internet web page, etc.), and/or a position in a play sequence of the digital media message 114 with which the content segment 112 is to be associated.
- the metadata for an audio recording may include identification information identifying the digital content segment 112 as comprising an audio recording, information indicating that the digital content segment 112 was captured using a microphone of an electronic device 104 , information indicating the date and time of recordation, the length of the recording, and/or other information.
- Such metadata may be provided to the content presentation module 206 by the segment collection module 208 , or alternatively, such metadata may be provided to the segment collection module 208 by the content presentation module 206 .
- the analysis module 210 may be used by the segment collection module 208 to determine whether a collected content segment 112 meets certain quality criteria.
- the quality criteria may include whether a background noise level in the content segment 112 is below a maximum noise level, whether video and/or image quality in the content segment 112 is above a minimum pixel or other like quality threshold, and so forth.
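A quality check of the kind performed by the analysis module 210 can be sketched as a predicate over segment measurements. The threshold values below are assumed for illustration; the disclosure does not specify numeric limits:

```python
def meets_quality_criteria(noise_db: float, pixel_height: int,
                           max_noise_db: float = -30.0,
                           min_pixel_height: int = 480) -> bool:
    """Return True when a segment's background noise level is at or below
    the maximum allowed level and its resolution meets the minimum
    pixel-height threshold."""
    return noise_db <= max_noise_db and pixel_height >= min_pixel_height
```

A segment failing the check could be flagged to the integration module so that it is not recommended for inclusion.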
- the integration module 212 may use at least a portion of the metadata described above to assess and/or otherwise determine which content segment 112 to select for integration into the digital media message 114 . Additionally or alternatively, the integration module 212 may use results received from the analysis module 210 to make one or more such determinations. Such determinations may be provided to the user 116 of the electronic device 104 while a digital media message 114 is being generated as a way of guiding the user with regard to the combination of one or more content segments 112 . For instance, the integration module 212 may provide advice, suggestions, or recommendations to the user 116 as to which content segment 112 to select for integration into the digital media message 114 based on one or more of the factors described above.
- the distribution module 214 may facilitate presentation of the digital media message 114 to one or more users 120 of the electronic devices 118 . For example, once completed, the distribution module 214 may direct the digital media message 114 to one or more of the electronic devices 118 via the network 106 . Additionally or alternatively, the distribution module 214 may be configured to direct one or more digital content segments 112 between the servers 102 and one or more of the electronic devices 104 . In such embodiments, the distribution module 214 may comprise one or more kernels, drivers, or other like components configured to provide communication between the servers 102 and one or more of the electronic devices 104 , 118 .
- the data store 216 may store any of the metadata, content, information, or other data utilized in creating one or more content segments 112 and/or digital media messages 114 .
- the data store 216 may store any of the images, video files, audio files, web links, media, or other content that is captured or otherwise received via the electronic device 104 .
- Such content may be, for example, provided to the data store 216 via the network during creation of a content segment 112 and/or a digital media message 114 .
- such content may be provided to the data store 216 prior to generating a content segment 112 and/or a digital media message 114 .
- such content may be obtained and/or received from the data store 216 during generation of a content segment 112 and/or a digital media message 114 .
- one or more modules of the media message engine 110 described above may be combined or omitted. Additionally, one or more modules of the media message engine 110 may also be included in the media message engine 108 of the electronic device 104 .
- the example methods and techniques of the present disclosure such as methods of generating a digital media message, may be performed solely on either the server 102 or one of the electronic devices 104 . Alternatively, in further embodiments, methods and techniques of the present disclosure may be performed, at least in part, on both the server 102 and one of the electronic devices 104 .
- FIG. 3 is a schematic diagram of illustrative components in an example electronic device 104 that is used to prepare and/or consume digital content segments 112 and digital media messages 114 .
- the electronic device 104 shown in FIG. 3 may include one or more of the components described above with respect to the server 102 such that digital content segments 112 and/or digital media messages 114 may be created and/or consumed solely on the electronic device 104 .
- the electronic device 104 may include one or more processor(s) 302 and memory 304 .
- the memory 304 may include computer readable media.
- Computer readable media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. As defined herein, computer readable media does not include communication media in the form of modulated data signals, such as a carrier wave, or other transmission mechanisms.
- the memory 304 of the electronic device 104 may also include a media message engine 108 , and the engine 108 may include any of the modules or other components described above with respect to the media message engine 110 .
- the media message engine 108 of the electronic device 104 may include one or more of a content interface module 306 , a content display module 308 , a user interface module 310 , and a data store 312 similar to the data store 216 described above.
- the modules may include routines, programs, instructions, objects, and/or data structures that perform particular tasks or implement particular abstract data types.
- the electronic device 104 may also include one or more cameras, microphones, displays (e.g., a touch screen display), keyboards, mice, touch pads, proximity sensors, capacitance sensors, or other user interface devices 314 .
- the content interface module 306 may enable the user to request and download content, digital content segments 112 , or other information from the server(s) 102 and/or from the internet.
- the content interface module 306 may download such content via any wireless or wired communication interfaces, such as Universal Serial Bus (USB), Ethernet, Bluetooth®, Wi-Fi, and/or the like.
- the content interface module 306 may include and/or enable one or more search engines or other applications on the electronic device 104 to enable the user 116 to search for images, video, audio, and/or other content to be included in a digital media message 114 .
- the content display module 308 may present content, digital content segments 112 , digital media messages 114 , or other information on a display of the electronic device 104 for viewing.
- the content display module 308 may provide functionalities that enable the user 116 to manipulate individual digital content segments 112 or other information as a digital media message 114 is being generated.
- the content display module 308 may provide editing functionality enabling the user 116 to delete, move, modify, augment, cut, paste, copy, save, or otherwise alter portions of each digital content segment 112 as part of generating a digital media message 114 .
- FIG. 4 shows an illustrative user interface 400 that enables the user 116 to generate a digital media message 114 .
- the user interface 400 may be displayed on an electronic device 104 that enables users to create, capture, search for, and/or select digital content segments 112 , and to generate and/or consume digital media messages 114 .
- the user interface 400 may be displayed, for example, on a display 402 of the electronic device 104 .
- the user interface 400 may be a web page that is presented to the user 116 via a web browser on the electronic device 104 .
- the user interface 400 may be an interface generated and provided by the content display module 308 as part of a digital media message generation application operating locally on the electronic device 104 .
- example embodiments in which the user interface 400 is generated and provided by the content display module 308 and/or the message generation engine 108 as part of a digital media message generation application operating locally on the electronic device 104 will be described unless otherwise noted.
- the message generation engine 108 may present a user interface 400 that includes a first portion 404 displaying an image 406 , and a second portion 408 that includes one or more thumbnails 410 ( 1 )- 410 (N) (collectively “thumbnails 410 ”).
- the image 406 displayed in the first portion 404 may be one or more images, photos, or first frames of a video stored in the memory 304 of the electronic device 104 .
- the content display module 308 may present one or more images 406 in the first portion 404 that are obtained in real time via, for example, a camera or other user interface device 314 of the electronic device 104 .
- the first portion 404 may provide an image 406 of objects that are within a field of view of the camera, and at least the first portion 404 may be receptive to user input such as, for example, touch input, touch and hold input, swipe input, tap input, double tap input, pinch input, and/or other gestures.
- the message generation engine 108 may receive input from a user of the electronic device 104 via either the first portion 404 or the second portion 408 .
- such input may comprise one or more gestures such as a touch and hold command within the first portion 404 . Receipt of such an input in the first portion 404 may cause the message generation engine 108 to capture and/or otherwise receive a first digital content segment 112 via, for example, the camera or other user interface device 314 of the electronic device 104 .
- the received digital content segment 112 may be displayed within the first portion 404 as the content segment 112 is being recorded and/or otherwise captured by the camera.
- the message generation engine 108 may associate the digital content segment 112 with a desired position in a play sequence of a digital media message 114 .
- the message generation engine 108 may receive input from the user of the electronic device 104 that includes a touch and hold command on one or more of the thumbnails 410 provided in the second portion 408 . Receipt of such an input in the second portion 408 may cause the message generation engine 108 to receive a video segment and/or an image associated with the respective thumbnail 410 for inclusion in the digital media message 114 .
- the message generation engine 108 may also associate digital content segments 112 received by selection of one or more of the thumbnails 410 with the respective desired position in the play sequence of the digital media message 114 .
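The association of digital content segments with desired positions in a play sequence, as described above, can be sketched as an ordered list into which segments are appended as they are captured or inserted at a user-selected position. The class and method names below are illustrative assumptions and do not appear in the disclosure:

```python
class PlaySequence:
    """Ordered list of digital content segments forming a digital media message."""

    def __init__(self):
        self.segments = []

    def add_segment(self, segment, position=None):
        # Append by default, or insert at a position selected by the user.
        if position is None:
            self.segments.append(segment)
        else:
            self.segments.insert(position, segment)

    def remove_segment(self, position):
        # Supports an "undo"-style control that removes a segment.
        return self.segments.pop(position)


sequence = PlaySequence()
sequence.add_segment("camera_capture_1")     # captured via touch-and-hold input
sequence.add_segment("library_video_2")      # received via a thumbnail selection
sequence.add_segment("camera_capture_3", 1)  # associated with a desired position
```

Here `remove_segment` corresponds loosely to the "undo" editing control described below with respect to the editing controls 418.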
- each of the thumbnails 410 may be representative and/or otherwise indicative of a respective photo, image, and/or video stored in the memory 304 .
- such content may have been captured by a user 116 of the electronic device 104 prior to commencing generation of the digital media message 114 .
- one or more photos, images, videos, and/or other content corresponding to one or more of the thumbnails 410 may be captured during generation of the digital media message 114 .
- the second portion 408 may comprise a scrollable thumbnail library including respective thumbnails 410 that may be selected by the user 116 for inclusion in the digital media message 114 .
- the user interface 400 may also include one or more controls configured to assist the user 116 in capturing one or more digital content segments 112 , modifying one or more of the digital content segments, and/or generating one or more digital media messages 114 .
- the user interface 400 may include a zoom control 412 configured to enlarge or reduce, for example, the size of the image 406 shown in the first portion 404 and/or to enlarge or reduce the size of the first portion 404 itself.
- the user interface 400 may also include a user interface device control 414 configured to control one or more operations of a user interface device 314 of the electronic device 104 .
- the user interface device control 414 may be configured to control activation of one or more cameras of the device 104 .
- the user interface device control 414 may be configured to select and/or toggle between a first camera of the electronic device 104 on a first side of the electronic device 104 and a second camera on a second side of the electronic device 104 opposite the first side.
- the user interface 400 may also include a plurality of additional controls including one or more navigation controls 416 and/or one or more editing controls 418 .
- the user interface 400 may include a navigation control 416 that, upon selection thereof by the user 116 , may enable the user to browse backward or forward between different user interfaces 400 while generating a digital media message 114 .
- a first navigation control 416 may comprise a “back” control while a second navigation control 416 may comprise a “forward” control.
- one or more of the editing controls 418 may enable a user 116 to add, remove, cut, paste, draw, rotate, flip, shade, color, fade, darken, and/or otherwise modify various aspects of the digital media message 114 and/or various digital content segments 112 .
- one or more of the editing controls 418 may comprise an “undo” control that enables the user 116 to delete and/or otherwise remove one or more digital content segments 112 from a play sequence of the digital media message 114 .
- one or more additional controls may be presented to the user 116 by the media message engine 108 .
- such editing controls 418 may further comprise any audio, video, image, or other editing tools known in the art.
- at least one of the controls described herein may be configured to modify a first digital content segment 112 before a second, third, or other additional digital content segment 112 is received by the media message engine 108 .
- the user interface 400 may also include a message bar 420 configured to provide guidance to the user 116 before, during, and/or after generation of the digital media message 114 .
- the message bar 420 may provide instructions to the user 116 and/or other guidance related to use of one or more of the controls described above, next steps to be taken in order to generate the digital media message 114 , the completion status of the digital media message 114 , and/or other information.
- the message bar 420 may be disposed between the first portion 404 and the second portion 408 .
- the message bar 420 may be disposed above the first portion 404 , below the second portion 408 , and/or at any other position on the user interface 400 .
- the message bar 420 may instruct the user 116 to touch and hold, for example, the first portion 404 or the second portion 408 in order to begin generating a digital media message 114 .
- FIG. 5 illustrates another example user interface 500 of the present disclosure.
- the media message engine 108 may provide such an example user interface 500 during the process of generating a digital media message 114 and, for example, after at least one digital content segment 112 has been received by the media message engine 108 via the electronic device 104 .
- the user interface 500 may include visual indicia of a play sequence 502 associated with the digital media message 114 that is currently being generated. Such visual indicia may include a first portion corresponding to a first digital content segment 112 received by the media message engine 108 , and at least one additional portion corresponding to a respective additional digital content segment 112 received by the media message engine 108 .
- the visual indicia of the play sequence 502 may include one or more thumbnails 504 illustrating and/or otherwise indicative of respective digital content segments 112 that have previously been added to and/or otherwise associated with the digital media message 114 .
- the visual indicia of the play sequence 502 may include various thumbnails 504 provided in the sequential order in which each respective content segment 112 has been received by the media message engine 108 .
- digital content segments 112 received earlier in time during the generation of a digital media message 114 may be represented by respective thumbnails 504 disposed further to the left-hand side of the display 402 than additional thumbnails 504 representing respective digital content segments 112 received relatively later in time.
- each respective thumbnail 504 may illustrate one or more scenes from a video, a representation of a photo or image, and/or any other visual representation of the respective digital content segment 112 to which the thumbnail 504 corresponds.
- the thumbnails 504 provided as part of the visual indicia of the play sequence 502 may assist the user 116 in recalling the content and/or general flow of the digital media message 114 during creation thereof.
- the example thumbnail 504 illustrated in FIG. 5 is representative of, for example, the image 406 described above with respect to FIG. 4 .
- a video, photo, image, or other such content associated with a digital content segment 112 received via the user interface 400 of FIG. 4 may, for example, be associated with a first position in the play sequence associated with the user interface 500 of FIG. 5 .
- the user interface 500 may also include one or more controls associated with the visual indicia of the play sequence 502 .
- such controls may include a play control 506 .
- the play control 506 may be configured to play, display, and/or otherwise provide a preview of the digital media message 114 to the user via, for example, the first portion 404 of the display 402 .
- the media message engine 108 may play one or more portions of the digital media message 114 currently being generated in response to receiving a touch input and/or other input via the play control 506 .
- further functionality may be provided to the user 116 via the play control 506 and/or via one or more additional controls associated with the play control 506 .
- the play control 506 and/or other associated controls may enable the user 116 to increase or decrease the speed at which the preview of the digital media message 114 is provided.
- the play control 506 and/or other associated controls may also enable the user 116 to skip between multiple digital content segments 112 associated with the corresponding play sequence.
- the play control 506 and/or other associated controls may enable the user 116 to pause the preview of the digital media message 114 .
- such functionality may be accessed via multiple taps, multiple touches, or other gestures such as swipe gestures, and the like received via the first portion 404 .
- such additional play controls may be rendered, displayed, and/or otherwise provided via the display 402 at a location, for example, proximate the play control 506 .
- the media message engine 108 may provide an image 508 to the user 116 via the first portion 404 .
- the image 508 may correspond to one or more of the thumbnails 410 shown in the second portion 408 .
- the user 116 may select a thumbnail 410 of the second portion 408 by touching and/or holding the desired thumbnail 410 with the hand 422 of the user 116 .
- an image 508 corresponding to the selected thumbnail 410 may be displayed in the first portion 404 .
- the media message engine 108 may receive at least one digital content segment 112 in response to selection of one or more such thumbnails 410 by the user 116 .
- the media message engine 108 may not only receive a first digital content segment 112 comprising a photo, video, image, and/or other content corresponding to the selected thumbnail 410 , but may also receive a different additional content segment 112 while the surface and/or portion of the display 402 corresponding to the thumbnail 410 is contacted by the hand 422 of the user 116 .
- the additional digital content segment 112 may comprise audio or other like input captured by a microphone or other user interface device 314 of the electronic device 104 while the surface and/or portion of the display 402 corresponding to the thumbnail 410 is contacted by the hand 422 of the user 116 .
- both of the respective digital content segments may be added to the play sequence of the digital media message 114 such that the respective digital content segments 112 are presented simultaneously when the digital media message 114 is played.
- the image 508 may correspond to the thumbnail 410 ( 2 ) currently being contacted by the hand 422 of the user 116 .
- FIG. 6 illustrates a further user interface 600 provided by the media message engine 108 .
- the media message engine 108 may provide such an example user interface 600 during the process of generating a digital media message 114 and, for example, after a plurality of digital content segments 112 have been received by the media message engine 108 via the electronic device 104 .
- the user interface 600 may include visual indicia of the play sequence 502 that includes the thumbnail 504 described above with respect to FIG. 5 , as well as a thumbnail 602 illustrating and/or otherwise indicative of a digital content segment 112 associated with the image 508 described above with respect to FIG. 5 .
- the various thumbnails 504 , 602 included in the visual indicia of the play sequence 502 may be provided in the sequential order in which each respective content segment 112 has been received by the media message engine 108 .
- the thumbnail 504 is disposed further to the left-hand side of the display 402 than the thumbnail 602 , thereby indicating that the digital content segment 112 corresponding to the thumbnail 504 was received earlier in time than the digital content segment 112 corresponding to the thumbnail 602 .
- the media message engine 108 may provide an image 604 to the user 116 via the first portion 404 .
- the image 604 may correspond to one or more of the thumbnails 410 shown in the second portion 408 .
- the user 116 may select a thumbnail 410 ( 3 ) of the second portion 408 by touching and/or holding a section and/or surface of the display 402 associated with the desired thumbnail 410 ( 3 ).
- the image 604 corresponding to the selected thumbnail 410 ( 3 ) may be displayed in the first portion 404 .
- the media message engine 108 may receive at least one digital content segment 112 in response to selection of one or more such thumbnails 410 ( 3 ) by the user 116 .
- the media message engine 108 may not only receive a first digital content segment 112 comprising a photo, video, image, and/or other content corresponding to the selected thumbnail 410 ( 3 ), but may also receive a different second content segment 112 while the surface and/or portion of the display 402 corresponding to the thumbnail 410 ( 3 ) is contacted by the hand 422 of the user 116 .
- the second digital content segment 112 may comprise audio or other like input captured by a microphone or other user interface device 314 of the electronic device 104 while the surface and/or portion of the display 402 corresponding to the thumbnail 410 ( 3 ) is contacted by the hand 422 of the user 116 .
- receiving such first and second content segments 112 may cause, for example, the media message engine 108 or other components of the electronic device 104 to store at least one of the first and second content segments 112 in the memory 304 and/or in the memory 204 of the server 102 .
- the first digital content segment 112 may be stored separately from the second digital content segment 112 .
- the first and second digital content segments 112 may be added to the play sequence of the digital media message 114 such that the respective digital content segments 112 are presented simultaneously when the digital media message 114 is played.
- the media message engine 108 may combine such first and second digital content segments 112 .
- the second digital content segment 112 may be presented simultaneously with the first digital content segment 112 (e.g., a photo, video, image, audio, or other content) when the digital media message 114 is played.
- Combining digital content segments 112 in this way may include generating a combined segment that is configured such that, for example, audio from the second content segment 112 described above is presented simultaneously with at least one of a photo, video, image, audio, or other content of the first content segment 112 when a portion of the digital media message 114 corresponding to the combined segment is played.
- the media message engine 108 may associate the combined segment with any position in the play sequence desired by the user 116 .
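Receiving a visual segment together with audio captured while the corresponding thumbnail is held, and presenting both simultaneously, can be sketched as pairing the two segments into a single combined entry occupying one position in the play sequence. The dictionary layout below is an assumption; the disclosure does not specify a data format:

```python
def combine_segments(visual_segment, audio_segment):
    # Pair the selected photo/video with audio captured while the
    # thumbnail was held, so both occupy one slot in the play sequence
    # and are presented simultaneously on playback.
    return {
        "visual": visual_segment,
        "audio": audio_segment,
        "simultaneous": True,
    }


combined = combine_segments("photo_410_3.jpg", "microphone_capture.aac")
play_sequence = ["first_clip", combined]  # combined segment at the desired position
```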
- the user interface 600 may also include one or more controls configured to enable the user 116 to share the digital media message 114 with other users, such as users 120 of remote electronic devices 118 .
- the user interface 600 may include one or more share controls 606 .
- the media message engine 108 may provide, such as via the display 402 , a plurality of additional controls configured to assist the user 116 in providing the digital media message 114 for sharing with a remote electronic device 118 .
- such additional controls will be described in greater detail below.
- FIG. 7 illustrates yet another example user interface 700 of the present disclosure.
- the media message engine 108 may provide such an example user interface 700 during the process of generating a digital media message 114 and, for example, after a final digital content segment 112 has been received by the media message engine 108 via the electronic device 104 .
- the user interface 700 may include visual indicia of the play sequence 502 that includes the thumbnails described above with respect to FIGS. 5 and 6 , as well as a thumbnail 702 illustrating and/or otherwise indicative of a digital content segment 112 associated with the image 604 .
- the user interface 700 may also include an image 704 , and the image 704 may be one or more images, photos, or first frames of a video stored in the memory 304 of the electronic device 104 .
- the content display module 308 may present one or more images 704 in the first portion 404 that are obtained in real time via, for example, a camera or other user interface device 314 of the electronic device 104 .
- the first portion 404 may provide an image 704 of objects that are within a field of view of the camera.
- the media message engine 108 may receive an input, such as a touch input, indicative of selection of the control 606 by the user 116 .
- the media message engine 108 may provide the example user interface 800 illustrated in FIG. 8 .
- Such an example user interface 800 may include, among other things, a message thumbnail 801 indicating and/or otherwise identifying the digital media message 114 that the user 116 desires to share.
- a message thumbnail 801 may be similar to one or more of the thumbnails 504 , 602 , 702 described above.
- the message thumbnail 801 may be larger than one or more of the thumbnails 504 , 602 , 702 , and/or may have one or more visual characteristics (e.g., highlighting, shading, a label, a frame, etc.) configured to make it easier for the user 116 to distinguish the message thumbnail 801 from one or more of the thumbnails 504 , 602 , 702 .
- the message thumbnail 801 may comprise, for example, a first frame and/or any other image or content indicative of the digital media message 114 being generated by the user 116 .
- Such an example user interface 800 may also include a plurality of controls configured to assist the user 116 in providing the digital media message 114 for sharing with, for example, a remote electronic device 118 , such as via the network 106 .
- one or more of the controls 802 may enable the user 116 to add a title, a name, and/or other identifier to the media message 114 such that the media message 114 may be easily recognizable and/or identifiable by one or more users 120 of the remote electronic device 118 .
- the title and/or other identifier added to the media message 114 may be provided to the user 120 simultaneously and/or otherwise in conjunction with the digital media message 114 when the user 120 consumes at least a portion of the digital media message 114 on the remote electronic device 118 .
- the user interface 800 may include one or more controls 804 , 806 configured to enable the user 116 to privatize the digital media message 114 prior to providing the digital media message 114 for sharing with a remote electronic device 118 .
- one or more such controls 804 may enable the user 116 to encrypt and/or otherwise configure the digital media message 114 such that only an approved user 120 or plurality of users 120 may receive and/or access the digital media message 114 .
- the media message engine 108 may receive an input, such as a touch input, indicative of selection of the control 804 by the user 116 .
- the media message engine 108 may enable the user 116 to browse, for example, an address book or other like directory stored in the memory 304 of the electronic device 104 and/or in the memory 204 of the server 102 . Upon browsing such a directory, the user 116 may select one or more contacts approved by the user 116 to have access to the digital media message 114 . Additionally or alternatively, in response to receiving such an input, the media message engine 108 may enable the user 116 to password protect and/or otherwise encrypt the digital media message 114 prior to sharing.
- one or more of the controls 806 may comprise a slide bar and/or other like icon indicating whether the user 116 has privatized the digital media message 114 . For example, such a control 806 may change color, transition between a “no” indication and a “yes” indication, and/or may otherwise provide a visual indication of the privacy status/level of the digital media message 114 .
- the user interface 800 may also include one or more controls 808 configured to enable the user 116 to select one or more means of providing the digital media message 114 for sharing with a remote electronic device 118 .
- one or more such controls 808 may enable the user 116 to select from a plurality of common social media websites and/or other portals useful in sharing the digital media message 114 .
- the media message engine 108 may receive an input, such as a touch input, indicative of selection of the control 808 by the user 116 . In response to receiving such an input, the media message engine 108 may enable the user 116 to access an existing account on the selected social media portal. Once such an account has been accessed, the media message engine 108 may provide the digital media message 114 to the selected social media portal for sharing with remote users 120 via the selected portal.
- One or more such controls 808 may also enable the user 116 to select between email, text messaging (SMS), instant messaging, and/or other like means for sharing the digital media message 114 .
- the media message engine 108 may receive an input, such as a touch input, indicative of selection of the control 808 by the user 116 .
- the media message engine 108 may enable the user 116 to browse, for example, an address book or other like directory stored in the memory 304 of the electronic device 104 and/or in the memory 204 of the server 102 .
- the user 116 may select one or more contacts with which the user 116 desires to share the digital media message 114 .
- the user 116 may provide the digital media message 114 to the selected users by providing an input, such as a touch input, indicative of selection of a share control 810 .
- FIG. 9 shows an example method 900 associated with generating and sharing a digital media message 114 .
- the example method 900 is illustrated as a collection of steps in a logical flow diagram, which represents operations that can be implemented in hardware, software, or a combination thereof.
- the steps represent computer-executable instructions stored in memory. When such instructions are executed by one or more processors, such instructions may cause the processor, various components of the electronic device, and/or the electronic device, generally, to perform the recited operations.
- Such computer-executable instructions may include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types.
- the electronic device 104 may receive an input from the user 116 of the electronic device 104 .
- such an input may be a touch input, a touch and hold input, a swipe, a drag, a voice command, and/or any other input described herein.
- the media message engine 108 may receive a touch and hold input via the touch-sensitive display 402 of the electronic device 104 .
- the input received at block 902 may indicate selection of and/or may otherwise correspond to an image, a video, an audio segment, and/or any other digital content segment 112 ( 1 ) stored in the memory 304 of the electronic device 104 and/or in the memory 204 of the server computer 102 .
- the electronic device 104 may display a plurality of thumbnails 410 on the display 402 .
- the user 116 may provide a touch and hold input via one or more of the thumbnails 410 (e.g., thumbnail 410 ( 1 ), as shown in FIG. 10 ), and such an input may indicate that the user 116 has selected the digital image corresponding to the particular thumbnail 410 ( 1 ) for inclusion in a digital media message 114 .
- the selected digital image (e.g., a first digital content segment 112 ( 1 )) may be stored in the memory 304 of the electronic device 104 and/or in the memory 204 , and visual indicia of a play sequence 502 associated with the digital media message 114 being generated may be displayed on the display 402 . Further, the input described with respect to block 902 may be received via the location on the display 402 at which the thumbnail 410 ( 1 ) is being displayed.
- receiving the input at block 902 may cause the electronic device 104 to capture a video segment, audio segment, photo, digital image, or other such digital content segment (e.g., a second digital content segment 112 ( 2 )) using one or more of the user interface devices 314 .
- the processor 302 may cause a microphone, digital camera, and/or other user interface 314 of the electronic device 104 to capture a digital content segment 112 ( 2 ) in response to the input received at block 902 .
- the media message engine 108 may cause the captured digital content segment 112 ( 2 ) to be stored in the memory 304 and/or the memory 204 of the server 102 for future use.
- each digital content segment 112 used to generate the digital media message 114 may be stored in the memory 204 of the server 102 .
- the first and second digital content segments 112 ( 1 ), 112 ( 2 ) may not be provided to the server computer 102 at block 904 .
- the user 116 of the electronic device 104 may access the first and second digital content segments 112 ( 1 ), 112 ( 2 ) on the server 102 , via the network 106 , and may select one or more of the digital content segments 112 ( 1 ), 112 ( 2 ) for use in generating the digital media message 114 using the electronic device 104 .
- the first and second digital content segments 112 ( 1 ), 112 ( 2 ) may comprise respective portions of a clip of the digital media message 114 being generated.
- the digital media message 114 may comprise one or more clips 1000 ( 1 ), 1000 ( 2 ), . . . 1000 (N) (collectively, “clips 1000 ”), and each clip 1000 of the digital media message 114 may comprise one or more frames.
- the digital media message 114 may comprise an audio track 1002 and a corresponding video track 1004 .
- each clip 1000 may make up a respective sequential portion of the audio and video tracks 1002 , 1004 .
- the digital content segment 112 ( 1 ) (e.g., a digital image) may make up each frame of the video track 1004 and the corresponding digital content segment 112 ( 2 ) (e.g., an audio segment) may make up each frame of the audio track 1002 . It is understood that using the digital image 112 ( 1 ) to make up each frame of a video track 1004 may reduce the device and/or server memory required for storing the video track 1004 , the network bandwidth required to transfer the video track 1004 , and/or other device resource requirements relative to comparable video tracks 1004 comprising actual video footage.
- using the digital image 112 ( 1 ) to make up each frame of the video track 1004 may reduce the overall size of the resulting digital media message 114 , thus reducing the network bandwidth required to transfer the digital media message 114 and reducing the memory, processor resources, and/or other server or device resources needed to render the digital media message 114 .
- the various clips 1000 may comprise any of a variety of formats.
- such clips 1000 may be made from audio segments, video segments, digital images, and/or any of the other types of digital content segments 112 described herein.
- the digital media message 114 being created comprises a first clip 1000 ( 1 ) in which the audio track 1002 comprises an audio segment 112 ( 2 ) and in which the video track 1004 comprises a digital image segment 112 ( 1 ).
- the digital media message 114 also includes a second clip 1000 ( 2 ) comprising a digital video.
- the video track 1004 of the digital media message 114 corresponding to the second clip 1000 ( 2 ) may comprise a video segment 112 ( 3 ) of the digital video .
- the audio track 1002 of the digital media message 114 corresponding to the second clip 1000 ( 2 ) may comprise an audio segment 112 ( 4 ) of the digital video.
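The clip structure described above — an audio track 1002 paired with a video track 1004, with a single digital image repeated as every frame of an image-based clip — might be modeled as follows. The function and field names are assumptions; referencing the same image for each frame illustrates why such a clip can require fewer resources than a comparable clip of actual video footage:

```python
def make_image_clip(image, audio_frames):
    # Video track repeats one digital image for every frame so that the
    # clip's duration matches the simultaneously captured audio segment.
    # Each frame references the same image rather than distinct frame data.
    return {
        "audio_track": list(audio_frames),
        "video_track": [image] * len(audio_frames),
    }


clip = make_image_clip("image_508.jpg", ["a0", "a1", "a2", "a3"])
```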
- the electronic device 104 may generate an identifier that is unique to the digital content segment 112 ( 1 ) described above with respect to block 902 .
- the media message engine 108 may cause the processor 302 and/or other components of the electronic device 104 to generate a unique series of numbers, letters, symbols, and/or other identifiers.
- the identifier generated at block 906 may be randomly generated by the media message engine 108 and/or other components of the electronic device 104 .
- the electronic device 104 may link, couple, attach, and/or otherwise associate the identifier with the digital content segment 112 ( 1 ) in the memory 304 .
- the media message engine 108 and/or the processor 302 may attach, save, embed, and/or otherwise associate the identifier with the digital content segment 112 ( 1 ) such that the identifier becomes part of, integrated with, and/or is otherwise carried with the digital content segment 112 ( 1 ).
- the identifier generated at block 906 may comprise a label and/or other like moniker by which one or more components of the computing environment 100 may identify the digital content segment 112 ( 1 ).
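Although the disclosure describes the identifier as a randomly generated series of numbers, letters, and/or other symbols, the duplicate detection performed by the server computer 102 (described below with respect to block 910) relies on identical content yielding identical identifiers. A content hash is one common realization satisfying that property and is shown here purely as an assumption:

```python
import hashlib


def content_identifier(segment_bytes):
    # SHA-256 digest of the segment's raw bytes; identical digital images
    # produce identical identifiers, which enables the duplicate check,
    # while distinct content is effectively guaranteed a distinct identifier.
    return hashlib.sha256(segment_bytes).hexdigest()


first_id = content_identifier(b"digital-image-bytes")
second_id = content_identifier(b"digital-image-bytes")
```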
- the electronic device 104 may generate an identifier that is unique to the digital content segment 112 ( 2 ) described above with respect to block 904 .
- the media message engine 108 may cause the processor 302 and/or other components of the electronic device 104 to generate a unique series of numbers, letters, symbols, and/or other identifiers.
- the electronic device 104 may link, couple, attach, and/or otherwise associate the identifier generated at block 908 with the digital content segment 112 ( 2 ).
- the processes and/or operations performed by the media message engine 108 , the processor 302 , and/or other components of the electronic device 104 at block 908 with respect to the digital content segment 112 ( 2 ) may be similar to and/or the same as the processes and/or operations described above with respect to block 906 .
- the electronic device 104 may provide one or more of the digital content segments 112 ( 1 ), 112 ( 2 ) described herein to the server computer 102 via the network 106 .
- the media message engine 108 may cause the processor 302 and/or one or more communication interfaces or other like hardware and/or software devices or components of the electronic device 104 to transfer at least one of the digital content segments 112 ( 1 ), 112 ( 2 ) from the electronic device 104 to the server computer 102 .
- the identifier described above with respect to block 906 may be transferred in association with the digital content segment 112 ( 1 ) (e.g., the digital image) if the digital content segment 112 ( 1 ) is transferred by the electronic device 104 at block 910 .
- the identifier described above with respect to block 908 may be transferred in association with the digital content segment 112 ( 2 ) (e.g., the audio segment) if the digital content segment 112 ( 2 ) is transferred by the electronic device 104 at block 910 .
- at least one of the digital content segments 112 ( 1 ), 112 ( 2 ) may be transferred by the electronic device 104 to the server computer 102 substantially in real time at block 910 .
- the electronic device 104 may begin transferring and/or otherwise transfer such a digital content segment 112 to the server computer 102 as the digital content segment 112 is being captured.
- digital content segments 112 transferred to the server computer 102 by electronic device 104 may be stored in the memory 204 and, over time, may form a global repository of digital content by which the various digital media messages 114 described herein may be formed.
- the server computer 102 may protect against storing multiple copies of the same digital content segments 112 .
- digital content segments 112 comprising audio segments, video segments, and the like may typically be unique due to the manner in which such digital content segments 112 are created at the user level.
- Digital content segments 112 comprising digital images have a greater tendency to be duplicates (e.g., stored locally on multiple electronic devices 104 , 118 and/or selected for use in a digital media message 114 by multiple different users through web-based searches) since users tend to utilize the same clipart, publicly available images, and/or other digital images when generating digital media messages 114 .
- the server computer 102 may compare the respective unique identifier associated with the one or more digital content segments 112 (generated at blocks 906 and 908 ) to a plurality of identifiers stored in the memory 304 .
- the electronic device 104 may provide the respective unique identifiers to the server computer 102 at block 910 before providing the one or more digital content segments 112 at block 910 .
- the electronic device 104 may provide the respective unique identifiers to the server computer 102 at block 910 along with the transferred digital content segments 112 .
- the server computer 102 may accept the digital content segment 112 for storage in the memory 304 if no match is found between the unique identifier associated with the received digital content segment 112 and the plurality of identifiers stored in the memory 304 .
- the server computer 102 may deny and/or otherwise prohibit storage of the corresponding digital content segment 112 in the memory 304 based on such a comparison and/or the resulting match.
- in examples in which the electronic device 104 provides the respective unique identifiers to the server computer 102 at block 910 before providing the one or more digital content segments 112 , the digital content segment 112 corresponding to a matched unique identifier may not be sent to the server computer 102 based on and/or as a result of finding such a match.
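The identifier comparison and storage-denial logic described above can be sketched as follows. The patent does not specify how the unique identifiers are derived or how the server stores them; using a SHA-256 content hash as the identifier and an in-memory dictionary as the store are hypothetical assumptions for illustration only.

```python
import hashlib


def make_identifier(segment_bytes: bytes) -> str:
    """Derive an identifier unique to a digital content segment.

    A content hash is one hypothetical way to realize the 'unique series
    of numbers, letters, symbols' described in the disclosure; identical
    segments (e.g., the same clipart image) yield identical identifiers.
    """
    return hashlib.sha256(segment_bytes).hexdigest()


class SegmentStore:
    """Hypothetical server-side store that protects against storing
    multiple copies of the same digital content segment."""

    def __init__(self):
        self._segments = {}  # identifier -> segment bytes

    def accept(self, identifier: str, segment_bytes: bytes) -> bool:
        # Compare the incoming identifier to the plurality of identifiers
        # already stored; deny storage when a match is found.
        if identifier in self._segments:
            return False
        self._segments[identifier] = segment_bytes
        return True


store = SegmentStore()
image = b"example jpeg bytes"
ident = make_identifier(image)
print(store.accept(ident, image))  # True  (first copy is stored)
print(store.accept(ident, image))  # False (duplicate storage is denied)
```

Because the device can compute the identifier locally, sending identifiers before the segments themselves (as described above) lets the server reject a duplicate before any segment bytes cross the network.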
- the electronic device 104 may generate one or more electronic files providing a sequential clip listing associated with rendering the digital media message 114 .
- the media message engine 108 may cause the processor 302 and/or one or more other components of the electronic device 104 to generate an edit decision list (hereinafter “EDL”), a text file, a data file, a spreadsheet, a gif file, a tif file, and/or other file indicating an order in which various clips of the digital media message 114 are to be rendered upon playback.
- FIG. 12 illustrates an example electronic file 1200 of the present disclosure.
- an example electronic file 1200 may include a title 1202 of the digital media message 114 being generated.
- the electronic file 1200 may also include a time-based, frame-based, and/or otherwise sequential clip listing 1204 setting forth the content, components, and/or other parameters of each clip 1000 in the digital media message 114 , and the sequential order in which each clip 1000 is to be rendered.
- the clip listing 1204 may include information 1206 , 1208 specific to each clip 1000 of the digital media message 114 .
- the information 1206 , 1208 shown in FIG. 12 corresponds to the example clips 1000 ( 1 ), 1000 ( 2 ) described above with respect to FIGS. 10 and 11 .
- such information 1206 , 1208 may include the content identifiers 1210 , 1212 described above with respect to blocks 906 , 908 .
- Such information 1206 , 1208 may also include a first indicator 1214 identifying a first frame (e.g., a start frame) of the digital media message 114 corresponding to a digital content segment 112 being rendered in the respective clip.
- the information 1206 , 1208 may also include a second indicator 1216 identifying a second frame (e.g., an end frame) of the digital media message 114 corresponding to a digital content segment 112 being rendered in the respective clip.
- the first and second identifiers 1214 , 1216 may indicate the frames of the digital media message 114 at which the corresponding digital content segment(s) 112 is/are to start and stop, respectively.
- the information 1206 , 1208 may include at least one additional indicator 1218 identifying a volume level corresponding to the digital content segment 112 being rendered in the respective clip.
- the various indicators described above with respect to the electronic file 1200 are merely examples.
- the electronic files 1200 of the present disclosure may include more than, less than, and/or different indicators than those described herein.
- the order in which the clips 1000 ( 1 ), 1000 ( 2 ) are set forth in the clip listing 1204 may correspond to the order in which the digital content segments 112 are generated, organized, and/or otherwise arranged during the process of generating a digital media message 114 .
- Such an order may correspond to, for example, the order set forth in the play sequence 502 .
- such an order may be defined by the frames indicated by the identifiers 1214 , 1216 .
- the sequential clip listing 1204 included in the electronic file 1200 may be made up of the various clips 1000 ( 1 ), 1000 ( 2 ) described above, and such clips 1000 ( 1 ), 1000 ( 2 ) may each comprise respective frame groups formed from the plurality of frames to be included in the digital media message 114 . In this way, each frame group may comprise a respective clip 1000 of the digital media message 114 . Further, rendering the digital media message 114 in accordance with the sequential clip listing 1204 may cause the digital content segments 112 to be rendered in accordance with the indicators 1214 , 1216 corresponding to each respective clip 1000 .
- rendering a digital media message 114 in accordance with an electronic file 1200 may cause a digital image to be presented and/or otherwise rendered on the electronic device simultaneously with an audio segment, and/or other digital content segment 112 , from a first frame of the digital media message 114 to a second frame.
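The electronic file 1200 of FIG. 12 — a title 1202 plus a sequential clip listing 1204 carrying, for each clip, a content identifier, start frame, end frame, and volume level — can be sketched as a simple data structure. The field names, identifier values, and frame numbers below are hypothetical assumptions; the disclosure leaves the concrete file format (EDL, text file, data file, etc.) open.

```python
# Hypothetical in-memory representation of electronic file 1200,
# mirroring items 1202 (title), 1204 (clip listing), 1210/1212 (content
# identifiers), 1214 (start frame), 1216 (end frame), 1218 (volume).
electronic_file = {
    "title": "My message",
    "clips": [
        {"content_id": "a1b2", "start_frame": 0,   "end_frame": 119, "volume": 1.0},
        {"content_id": "c3d4", "start_frame": 120, "end_frame": 299, "volume": 0.8},
    ],
}


def render_order(edl):
    """Yield (content_id, start, end) in the sequential order defined by
    the frames indicated by the first and second indicators."""
    for clip in sorted(edl["clips"], key=lambda c: c["start_frame"]):
        yield clip["content_id"], clip["start_frame"], clip["end_frame"]


for cid, start, end in render_order(electronic_file):
    print(f"render segment {cid} from frame {start} to frame {end}")
```

Note that the file references segments only by their identifiers; the segment bytes themselves live elsewhere, which is what allows the file to be transferred separately from the content it sequences.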
- the electronic device 104 may provide one or more of the electronic files 1200 described herein to the server computer 102 via the network 106 .
- the media message engine 108 may cause the processor 302 , and/or one or more communication interfaces or other like hardware and/or software devices or components of the electronic device 104 to transfer an EDL file or other like file 1200 from the electronic device 104 to the server computer 102 .
- the electronic file 1200 may be transferred by the electronic device 104 to the server computer 102 substantially in real time at block 914 .
- the electronic file 1200 may be transferred to the server computer 102 as the digital content segment 112 is being generated.
- the electronic device 104 may transfer the electronic file 1200 to the server computer 102 separately from at least one of the digital content segments 112 transferred to the server computer 102 by electronic device 104 at block 910 .
- at least one of the digital content segments 112 may be transferred to the server computer 102 by a first signal generated by the electronic device 104
- electronic file 1200 may be transferred to the server computer 102 by a second signal generated by the electronic device 104 different from the first signal.
- a first digital content segment 112 ( 1 ) may be transferred to the server computer 102 via a first signal generated by the electronic device 104 and sent using the network 106
- a second digital content segment 112 ( 2 ) may be transferred to the server computer 102 via a second signal generated by the electronic device 104 separate from the first signal and sent using the network 106
- electronic file 1200 may be transferred to the server computer 102 via a third signal generated by the electronic device 104 , separate from the first and second signals, and sent using the network 106 .
- the various digital content segments 112 described herein may be sent to the server computer 102 for storage and/or for further use in generating and/or rendering a digital media message 114 , and the digital content segments 112 may be provided to the server computer 102 bearing no association with the respective electronic device 104 .
- the only information linking the digital content segments 112 provided to the server computer 102 at block 910 with the digital media message 114 being generated by a user 116 of the electronic device 104 may be the electronic file 1200 identifying the digital content segments 112 by their respective unique identifiers 1210 , 1212 .
- the digital content segments 112 provided to the server computer 102 at block 910 may be stored in a first database, data store, division, and/or other first portion of the memory 204
- the electronic file 1200 provided to the server computer 102 at block 914 may be stored in a second database, data store, division, and/or other second portion of the memory 204 different from the first portion.
- the electronic file 1200 may be stored in the second portion of the memory 204 in association with a telephone number, serial number, id number, and/or other like indicator uniquely identifying an electronic device 118 of an intended recipient of the digital media message 114 .
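The two-portion storage scheme described above — segments stored with no association to the sending device, and the electronic file stored separately under an indicator identifying the intended recipient — can be sketched as follows. The dictionary-based stores and the example phone number are hypothetical assumptions, not the disclosed implementation.

```python
# First portion of memory 204: segments keyed only by their unique
# identifiers, bearing no association with the sending electronic device.
segment_store = {}

# Second portion of memory 204: electronic files 1200 keyed by an
# indicator (e.g., a telephone number) uniquely identifying the
# intended recipient's electronic device 118.
edl_store = {}


def store_segment(identifier: str, data: bytes) -> None:
    segment_store[identifier] = data  # no link back to the sender


def store_edl(recipient_indicator: str, edl: dict) -> None:
    # Only the electronic file ties identifiers to a particular message
    # and recipient; the segments alone reveal neither.
    edl_store[recipient_indicator] = edl


store_segment("a1b2", b"digital image bytes")
store_edl("+15551234567", {"title": "Hi", "clips": [{"content_id": "a1b2"}]})
print("+15551234567" in edl_store)  # True
```

This separation means a segment in the first store is meaningless on its own; reconstructing a message requires the electronic file from the second store.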
- the electronic device 104 may provide instructions to the server computer 102 via the network 106 .
- the media message engine 108 may cause the processor 302 , and/or one or more communication interfaces or other like hardware and/or software devices or components of the electronic device 104 to generate and send a signal comprising instructions to share the digital media message 114 with a second electronic device 118 different from the electronic device 104 via the network 106 .
- the instructions may include a telephone number, serial number, id number, and/or other like indicator uniquely identifying the electronic device 118 , and the electronic device 118 may belong to an intended recipient of the digital media message 114 .
- the server computer 102 may generate one or more additional electronic files using the various digital content segments 112 provided by the electronic device 104 at block 910 .
- One or more such additional electronic files may also be generated by the server computer 102 based on the electronic file 1200 provided by the electronic device 104 at block 914 .
- the server computer 102 may generate one or more such additional electronic files in response to receiving the instructions provided at block 916 or in response to receiving the electronic file 1200 at block 914 .
- the server computer 102 may generate a first additional electronic file comprising a plurality of frame groups formed from at least one of the digital content segments 112 provided by the electronic device 104 at block 910 .
- the first additional electronic file may comprise a plurality of frames of the digital media message 114 , and may be optimized by the server computer 102 for the purpose of streaming the digital media message 114 from the server computer 102 to an additional electronic device 118 via the network 106 .
- the server computer 102 may break up one or more of the clips 1000 identified by the electronic file 1200 into frame groups.
- the frame groups may be subgroups of frames that, together, make up the one or more clips 1000 of the digital media message 114 .
- the server computer 102 may then remove digital content from each of the frame groups (such as from the audio track 1002 , the video track 1004 , etc.) and/or may otherwise modify the quality and/or fidelity level of the individual frames within each of the frame groups. In some examples, one or more entire frames may be removed from one or more of the frame groups. As a result, the various different frame groups formed by the server computer 102 may have different respective quality levels. For example, a first frame group formed by the server computer 102 may have a first level of fidelity, and a second frame group formed by the server computer 102 may have a second level of fidelity different from the first level of fidelity. This difference in the level of fidelity may, in some instances, be perceptible when the resulting digital media message (i.e., the first additional electronic file) is streamed and/or otherwise rendered.
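The frame-group splitting and fidelity reduction described above can be sketched as follows. The group size, the per-group frame-dropping ratios, and the helper names are hypothetical assumptions chosen for illustration; the disclosure does not prescribe how frames are removed.

```python
def split_into_frame_groups(frames, group_size):
    """Break a clip's frames into consecutive frame groups (subgroups of
    frames that, together, make up the clip)."""
    return [frames[i:i + group_size] for i in range(0, len(frames), group_size)]


def reduce_fidelity(group, keep_every):
    """Drop frames from a group to lower its fidelity: keep_every=1 keeps
    every frame, keep_every=2 keeps every other frame, and so on."""
    return group[::keep_every]


frames = list(range(12))                       # 12 frames of one clip 1000
groups = split_into_frame_groups(frames, 4)    # three frame groups of 4
# Give the groups different fidelity levels, as described above.
streamable = [reduce_fidelity(g, k) for g, k in zip(groups, (1, 2, 2))]
print([len(g) for g in streamable])  # [4, 2, 2] -> differing fidelity
```

The resulting first additional electronic file would be assembled from these reduced frame groups, trading perceptible fidelity for a stream that is cheaper to deliver over the network.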
- the server computer 102 may generate a second additional electronic file that is optimized for uploading by the server computer 102 to an additional electronic device 118 and/or for downloading by the electronic device 118 .
- the second additional electronic file may comprise a digitally compressed and/or otherwise modified version of one or more frames of the plurality of frames making up the digital media message 114 .
- the server computer 102 may compress the various frames of digital media message 114 identified by the electronic file 1200 . Processing and/or handling such an example second additional electronic file may require less network bandwidth, fewer processor resources, and/or reduced memory storage space relative to a corresponding digital media message 114 , rendered using the digital content segments 112 provided by the electronic device 104 at block 910 .
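The second additional electronic file — a compressed version of the message's frames optimized for upload/download rather than streaming — can be sketched as follows. Using zlib as the compressor is a hypothetical assumption; the disclosure does not name a compression scheme.

```python
import zlib


def make_download_file(frame_payloads):
    """Hypothetical sketch of the second additional electronic file: a
    digitally compressed version of the frames making up the digital
    media message, so that transferring it needs less bandwidth and
    less storage than the original segments."""
    raw = b"".join(frame_payloads)
    return zlib.compress(raw, level=9)


frames = [b"frame-data " * 100 for _ in range(10)]
compressed = make_download_file(frames)
print(len(compressed) < len(b"".join(frames)))  # True
```

A file built this way can be uploaded to the recipient device 118 for local rendering, in contrast to the first additional electronic file, which stays on the server and is streamed.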
- the instructions provided at block 916 may cause the server computer 102 to generate and/or provide a signal to the electronic device 118 via the network 106 , and using the indicator uniquely identifying the electronic device 118 .
- the signal generated by the server computer 102 may be provided to the electronic device 118 via at least one of a text message, an email, and/or a website, such as a social media website.
- the signal provided by the server computer 102 may include a request for permission associated with sharing the digital media message 114 .
- such a signal may request that a user 120 of the electronic device 118 provide permission to the server computer 102 for sharing the digital media message 114 .
- the server computer 102 may receive additional instructions from the electronic device 118 via the network 106 .
- the user 120 of the electronic device 118 may grant permission to the server computer 102 to share the media message 114 .
- the server computer 102 may stream the first additional electronic file described above, via the network 106 .
- the user 120 may view the digital media message 114 without downloading and/or otherwise receiving the digital content segments 112 utilized to render the digital media message 114 .
- the server computer 102 may transfer the second additional electronic file described above to the electronic device 118 via the network 106 .
- the server computer 102 may upload the compressed versions of the digital content segments 112 to the electronic device 118 for local rendering and/or viewing on the electronic device 118 .
- FIG. 13 illustrates another example method 1300 associated with generating and sharing a digital media message 114 .
- the example method 1300 is illustrated as a collection of steps in a logical flow diagram, which represents operations that can be implemented in hardware, software, or a combination thereof.
- the steps represent computer-executable instructions stored in memory. When such instructions are executed by one or more processors, such instructions may cause the processor, various components of a server computer, and/or the server computer, generally, to perform the recited operations.
- Such computer-executable instructions may include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types.
- the server computer 102 may receive various digital content and corresponding unique identifiers from one or more electronic devices 104 via the network 106 .
- the electronic device 104 may receive an input indicating selection of and/or otherwise corresponding to an image, a video, an audio segment, and/or any other digital content segment 112 ( 1 ) stored in the memory 304 of the electronic device 104 .
- a user 116 may provide a touch and hold input via the display 402 and such an input may indicate that the user 116 has selected a digital image corresponding to a particular thumbnail 410 ( 1 ) for inclusion in a digital media message 114 .
- receiving such an input may cause the electronic device 104 to capture a video segment, audio segment, photo, digital image, or other such digital content segment (e.g., a second digital content segment 112 ( 2 )) using one or more of the user interface devices 314 .
- the processor 302 may cause a microphone, digital camera, and/or other user interface 314 of the electronic device 104 to capture a digital content segment 112 ( 2 ) in response to the input described above.
- the electronic device 104 may generate a first identifier that is unique to the digital content segment 112 ( 1 ) and a second identifier that is unique to the digital content segment 112 ( 2 ).
- the media message engine 108 may cause the processor 302 and/or other components of the electronic device 104 to generate a unique series of numbers, letters, symbols, and/or other identifiers 1210 , 1212 .
- the electronic device 104 may link, couple, attach, and/or otherwise associate the first identifier 1210 with the digital content segment 112 ( 1 ) in the memory 304 .
- the electronic device 104 may also link, couple, attach, and/or otherwise associate the second identifier 1212 with the digital content segment 112 ( 2 ) in the memory 304 .
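The device-side steps above — generating a first and second identifier unique to each segment, then associating each identifier with its segment in the memory 304 — can be sketched as follows. A content hash (or a UUID for non-reproducible identifiers) is a hypothetical choice; the disclosure says only that the identifiers are a unique series of numbers, letters, and/or symbols.

```python
import hashlib
import uuid


def generate_identifier(segment_bytes: bytes = None) -> str:
    """Generate an identifier unique to a digital content segment.

    Hashing the segment's bytes keeps duplicates detectable across
    devices; a random UUID would also satisfy 'a unique series of
    numbers, letters, symbols'. Both are illustrative assumptions.
    """
    if segment_bytes is not None:
        return hashlib.sha256(segment_bytes).hexdigest()
    return uuid.uuid4().hex


local_memory = {}  # device memory 304: identifier -> segment

segment_1 = b"digital image bytes"   # e.g., segment 112(1)
segment_2 = b"audio segment bytes"   # e.g., segment 112(2)

id_1 = generate_identifier(segment_1)
id_2 = generate_identifier(segment_2)

# Associate each identifier with its segment in local memory.
local_memory[id_1] = segment_1
local_memory[id_2] = segment_2
print(id_1 != id_2)  # True
```

At the transfer step, each identifier would accompany its segment (or precede it, enabling the server-side duplicate check described earlier).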
- the electronic device 104 may provide one or more of the digital content segments 112 ( 1 ), 112 ( 2 ) described herein to the server computer 102 via the network 106 .
- the media message engine 108 may cause the processor 302 , and/or one or more communication interfaces or other like hardware and/or software devices or components of the electronic device 104 to transfer at least one of the digital content segments 112 ( 1 ), 112 ( 2 ) from the electronic device 104 to the server computer 102 .
- the first identifier 1210 may be transferred in association with the digital content segment 112 ( 1 ) (e.g., the digital image) when the digital content segment 112 ( 1 ) is transferred by the electronic device 104 to the server computer 102
- the second identifier 1212 may be transferred in association with the digital content segment 112 ( 2 ) (e.g., the audio segment) when the digital content segment 112 ( 2 ) is transferred by the electronic device 104 to the server computer 102
- the server computer 102 may receive the digital content segments 112 ( 1 ), 112 ( 2 ) and identifiers 1210 , 1212 described above.
- the server computer 102 may receive an electronic file 1200 from the electronic device 104 , via the network 106 .
- the electronic device 104 may generate one or more electronic files 1200 providing a sequential clip listing associated with rendering the digital media message 114 .
- An example electronic file 1200 may include a title 1202 of the digital media message 114 being generated.
- the electronic file 1200 may also include a time-based, frame-based, and/or otherwise sequential clip listing 1204 setting forth the content, components, and/or other parameters of each clip 1000 in the digital media message 114 , and the sequential order in which each clip 1000 is to be rendered.
- the electronic file 1200 may further include the content identifiers 1210 , 1212 described above, a first indicator 1214 identifying a first frame (e.g., a start frame) of the digital media message 114 corresponding to a digital content segment 112 being rendered in the respective clip, and a second indicator 1216 identifying a second frame (e.g., an end frame) of the digital media message 114 corresponding to a digital content segment 112 being rendered in the respective clip.
- the electronic file 1200 may also include at least one additional indicator 1218 identifying a volume level corresponding to the digital content segment 112 being rendered in a respective clip 1000 .
- the electronic device 104 may transfer such an electronic file 1200 to the server computer 102 , and the server computer 102 may receive the electronic file 1200 at block 1304 .
- the server computer 102 may receive instructions from the electronic device 104 via the network 106 to share the digital media message with an additional electronic device 118 via the network 106 .
- Such instructions may include, for example, a third indicator (e.g., a telephone number, serial number, id number, and/or other like indicator) uniquely identifying the electronic device 118 .
- the media message engine 108 may cause the processor 302 , and/or one or more communication interfaces or other like hardware and/or software devices or components of the electronic device 104 to generate and send an instruction signal to the server computer 102 via the network 106 .
- the server computer 102 may receive such a signal at block 1306 .
- the server computer 102 may generate one or more additional electronic files using the various digital content segments 112 received from the electronic device 104 .
- One or more such additional electronic files may also be generated by the server computer 102 based on the electronic file 1200 received from the electronic device 104 .
- the server computer 102 may generate one or more such additional electronic files in response to receiving the instructions or in response to receiving the electronic file 1200 .
- the server computer 102 may generate a first additional electronic file comprising a plurality of frame groups formed from at least one of the digital content segments 112 provided by the electronic device 104 at block 910 .
- the first additional electronic file may comprise a plurality of frames of the digital media message 114 , and may be optimized by the server computer 102 for the purpose of streaming the digital media message 114 from the server computer 102 to the electronic device 118 via the network 106 .
- the server computer 102 may break up one or more of the clips 1000 identified by the electronic file 1200 into frame groups.
- the various different frame groups formed by the server computer 102 at block 1308 may have different respective quality levels.
- a first frame group formed by the server computer 102 may have a first level of fidelity
- a second frame group formed by the server computer 102 may have a second level of fidelity different from the first level of fidelity.
- the server computer 102 may generate a second additional electronic file that is optimized for uploading by the server computer 102 to the electronic device 118 and/or for downloading by the electronic device 118 .
- the second additional electronic file may comprise a digitally compressed and/or otherwise modified version of one or more frames of the plurality of frames making up the digital media message 114 .
- the server computer 102 may generate one or more signals and may provide one or more such signals to the electronic device 118 via at least one of a text message, an email, and/or a website, such as a social media website. Additionally, in such examples, the signal provided by the server computer 102 may include a request for permission associated with sharing the digital media message 114 . For example, such a signal may request that a user 120 of the electronic device 118 provide permission to the server computer 102 for sharing the digital media message 114 .
- the server computer 102 may, at block 1312 , receive additional instructions from the electronic device 118 via the network 106 .
- the user 120 of the electronic device 118 may grant permission to the server computer 102 to share the media message 114 .
- the server computer 102 may, at block 1314 , stream the first additional electronic file described above, via the network 106 .
- the user 120 may view the digital media message 114 without downloading and/or otherwise receiving the digital content segments 112 utilized to render the digital media message 114 .
- the server computer 102 may, at block 1316 , transfer the second additional electronic file described above to the electronic device 118 via the network 106 .
- the server computer 102 may upload the compressed versions of the digital content segments 112 to the electronic device 118 for local rendering and/or viewing on the electronic device 118 .
- embodiments of the present disclosure may enable a user 120 of the electronic device 118 to edit, alter, and/or otherwise modify a digital media message 114 generated by the user 116 of the device 104 .
- the electronic device 118 may render the digital media message 114 corresponding to the second additional electronic file locally.
- the user 120 may then add content, delete content, reorder content, and/or otherwise modify the digital media message 114 locally on the electronic device 118 .
- the server computer 102 may transfer the master content (e.g., the original/native digital content segments 112 used to generate the digital media message 114 ) to the electronic device 118 for rendering and/or modification.
- the electronic device 118 may generate a second electronic file similar to the electronic file 1200 described above with respect to FIG. 12 .
- a second electronic file may comprise an additional EDL file, text file, data file, and/or other like file, and may include a time-based, frame-based, and/or otherwise sequential clip listing setting forth the content, components, and/or other parameters of each clip in the modified digital media message, and the sequential order in which each clip is to be rendered.
- Electronic device 118 may transfer the second electronic file to the server computer 102 via the network 106 .
- the server computer 102 may store the second electronic file with the first electronic file 1200 in the memory 204 . In this way, the server computer may preserve various editing decisions over time, by saving various EDLs generated by multiple users 116 , 120 .
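Storing the second electronic file alongside the first so that editing decisions are preserved over time can be sketched as a revision history keyed by message. The message key and the list-of-revisions layout are hypothetical assumptions for illustration.

```python
# Hypothetical sketch: the server keeps every electronic file (EDL)
# generated for a message, preserving the editing decisions of
# successive users (e.g., users 116 and 120) over time.
edl_history = {}  # message id -> list of EDL revisions, oldest first


def save_edl(message_id: str, edl: dict) -> None:
    edl_history.setdefault(message_id, []).append(edl)


# Original message from user 116, then a modified version from user 120
# that adds a second clip.
save_edl("msg-1", {"clips": [{"content_id": "a1b2"}]})
save_edl("msg-1", {"clips": [{"content_id": "a1b2"},
                             {"content_id": "c3d4"}]})
print(len(edl_history["msg-1"]))  # 2 revisions preserved
```

Because each revision references segments only by identifier, keeping every EDL is cheap relative to keeping every rendered copy of the message.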
- FIG. 14 illustrates an example user interface 1400 of the present disclosure configured to assist the user 120 in modifying a digital media message 114 rendered on the electronic device 118 .
- the user interface 1400 may include one or more controls configured to assist the user 120 in making further modifications to one or more of the digital content segments 112 , the play sequence, and/or other components of the digital media message 114 .
- the user interface 1400 may include a control 1402 configured to enable the user 120 to add one or more images, videos, photos, audio clips, and/or other content to the digital media message 114 .
- the media message engine 108 may receive an input, such as a touch input, indicative of selection of the control 1402 by the user 120 .
- the media message engine 108 may enable the user 120 to browse various photos, images, videos, and/or other content stored in the memory 304 and/or in the memory 204 of the server 102 . Additionally and/or alternatively, in response to receiving such an input, the media message engine 108 may enable the user 120 to perform a web-based search, such as via one or more search engines or applications of the electronic device 118 , for such content. The user 120 may be permitted to select one or more such content items for use. Upon selection of such a content item, the media message engine 108 may add the selected item to the play sequence of the digital media message 114 and/or may combine the selected item with one or more content segments 112 of the digital media message 114 . Additionally, the user interface 1400 may include one or more controls 1404 configured to enable the user 120 to delete one or more audio clips, video clips, segments, files, and/or other content from the digital media message 114 .
- the user interface 1400 may further include one or more controls 1406 configured to enable the user 120 to modify one or more of the digital content segments 112 , the play sequence, and/or other components of the digital media message 114 .
- Such controls 1406 may comprise, among other things, any audio, video, image, or other editing tools known in the art.
- such controls 1406 may provide editing functionality enabling the user 120 to move, modify, augment, cut, paste, copy, save, or otherwise alter portions of each digital content segment 112 as part of generating a digital media message 114 .
- control 1406 may enable the user 120 to cut, paste, draw, rotate, flip, shade, color, fade, darken, and/or otherwise modify various aspects of the digital media message 114 and/or various digital content segments 112 included in the play sequence thereof.
- at least one of the controls 1402 , 1404 , 1406 may be similar to and/or the same as one or more of the controls 418 described above.
- the user interface 1400 may also include one or more navigation controls 1408 configured to enable the user 120 to save the modified digital media message 114 locally and/or at the server computer 102 .
- the user interface 1400 or other user interfaces described herein may provide one or more additional controls operable to enable the user 120 of the device 118 to observe the creation of a digital media message 114 (by another user 116 on a separate electronic device 104 ) substantially in real-time.
- Such functionality may be possible in examples in which the clips 1000 , digital content segments 112 , and/or other components of the digital media message 114 are transferred to the server computer 102 in substantially real-time (such as at block 910 ) as the digital media message 114 is being created.
- example embodiments of the present disclosure provide devices and methods for generating digital media messages as a means for communication between users in remote locations.
- Such digital media messages include various combinations of audio, video, images, photos, and/or other digital content segments, and can be quickly and artfully created by each user with little effort.
- the methods of generating such a digital media message described herein may reduce the file size and/or memory requirements of such messages.
- a digital media message 114 generated using the techniques described herein will require less memory and/or storage space than a corresponding digital media message generated by other methods.
- Such digital media messages will also enable the various networks 106 , servers 102 , and/or electronic devices 104 , 118 described herein to transfer, store, and/or process such digital media messages 114 more quickly and with fewer network, server, and/or device resources.
- a reduction in file size and/or memory requirements will reduce overall network load/traffic, and will improve network, server, and/or electronic device performance.
- Such a reduction in file size and/or memory requirements will enable the various networks 106 , servers 102 , and/or electronic devices 104 , 118 described herein to provide, render, display, and/or otherwise process such digital media messages 114 more quickly and with fewer network, server, and/or device resources.
Description
- This application is a continuation-in-part of U.S. patent application Ser. No. 14/683,779, filed on Apr. 10, 2015, which is a continuation-in-part of U.S. patent application Ser. No. 14/569,169, filed Dec. 12, 2014, which claims the benefit of U.S. Provisional Application No. 62/042,114, filed Aug. 26, 2014, and U.S. Provisional Application No. 62/038,493, filed Aug. 18, 2014. This application also claims the benefit of U.S. Provisional Application No. 62/146,045, filed Apr. 10, 2015. The entire disclosures of each of the above applications are incorporated herein by reference.
- It is common for users of electronic devices to communicate with other remote users by voice, email, text messaging, instant messaging, and the like. While these means of electronic communication may be convenient in various situations, such means are only suited for transferring a limited amount of rich content between users. For instance, while text messages and email may be used to transmit written dialogue between users, and audio, video, web content, or other files may be transmitted with the text or email messages as attachments, such files are not integrated with the various components of the text or email message in any way.
- As a result of these shortcomings, electronic device messaging applications have been developed to assist the user in creating digital messages that include, for example, images, audio, video, and other content. However, the functionality of existing messaging applications is limited. For example, such applications generally do not enable the user to combine a wide array of digital content segments (e.g., an audio segment, a video segment, a digital image, etc.) such that two or more content segments, such as segments from different sources, can be presented to the recipient simultaneously as an integrated component of the digital message. Additionally, while such content segments may be captured and/or saved locally on a user device, sharing such content segments with other user devices as part of the digital message and via such applications can be cumbersome due to the size of the audio files, video files, and/or other components of such content segments. For example, large audio files, video files, and the like can require significant memory for storage locally on a user device, and can also require a substantial amount of bandwidth to upload and/or download on most wireless networks. As a result, handling such content segments can place significant strain on user device resources and network resources, and can hinder device and/or network performance. Further, traditional techniques for rendering digital messages require that the video rendering and content upload processes begin only after a user has finished generating and/or composing the message. This constraint can result in prolonged upload/render times once the message is complete.
- Example embodiments of the present disclosure are directed toward curing one or more of the deficiencies described above.
- The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same reference numbers in different figures indicate similar or identical items.
- FIG. 1 is a schematic diagram of an illustrative computing environment for implementing various embodiments of digital media message generation.
- FIG. 2 is a schematic diagram of illustrative components in an example server computer that may be used in an example digital media message generation environment.
- FIG. 3 is a schematic diagram of illustrative components in an example electronic device that may be used in an example digital media message generation environment.
- FIG. 4 shows an illustrative user interface screen displayed on an electronic device that enables users to generate a portion of an example digital media message.
- FIG. 5 shows another illustrative user interface screen displayed on an electronic device that enables users to generate a portion of an example digital media message.
- FIG. 6 shows still another illustrative user interface screen displayed on an electronic device that enables users to generate a portion of an example digital media message.
- FIG. 7 shows yet another illustrative user interface screen displayed on an electronic device that enables users to generate a portion of an example digital media message.
- FIG. 8 shows an illustrative user interface screen displayed on an electronic device that enables users to share an example digital media message.
- FIG. 9 is a flow diagram illustrating an example method of the present disclosure.
- FIG. 10 shows an illustrative user interface screen displayed on an electronic device, as well as example audio and video tracks.
- FIG. 11 shows another illustrative user interface screen displayed on an electronic device, as well as example audio and video tracks.
- FIG. 12 shows an illustrative electronic file of the present disclosure.
- FIG. 13 is a flow diagram illustrating another example method of the present disclosure.
- FIG. 14 shows another illustrative user interface screen displayed on an electronic device that enables users to create and/or modify a digital media message.
- The disclosure is directed to devices and techniques for generating digital media messages that can be easily shared between users of electronic devices as a means of communication. The techniques described herein enable users to combine a variety of different digital content segments into a single digital media message. For example, the user may create a digital media message by capturing audio content segments, video content segments, digital images, web content, and the like. Such content segments may be captured by the user during generation of the digital media message. Alternatively, such content segments may be captured by the user prior to generating the digital media message and may be saved in a memory of the electronic device, or in a cloud-based memory, for incorporation into the digital media message.
- In various embodiments, replacing, for example, part of a video track of an underlying digital video segment with a digital image may reduce the file size of the resulting digital media message. In particular, the replaced portion of the video track may typically be rendered at approximately 30 frames/second for the duration of the portion of the video track, and may be characterized by a commensurate memory and/or file size (e.g., in bytes). The selected digital image, on the other hand, may comprise a single frame that may be rendered for the duration of the replaced portion of the video track. Thus, replacing a portion of the video track of the underlying digital video segment with the digital image may reduce the number of frames/second of the underlying video segment, thereby reducing file size thereof. As a result, a digital media message generated using such techniques may have a smaller file size and may require/take up less memory than a corresponding digital media message generated using the underlying digital video segment with the video track unchanged (e.g., without replacing a portion of the video track with a selected digital image).
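The frame-count effect described above can be sketched roughly as follows. This is an illustrative estimate only: the function name, its parameters, and the fixed-frame-rate assumption are not taken from the disclosure.

```python
# Illustrative sketch: estimate how many frames remain after a span of a video
# track is replaced by a single still image held for the same duration.
# The 30 fps default is an assumed conventional frame rate.

def frames_after_replacement(duration_s: float, replaced_s: float, fps: float = 30.0) -> int:
    if not 0.0 <= replaced_s <= duration_s:
        raise ValueError("replaced span must lie within the video duration")
    remaining = round((duration_s - replaced_s) * fps)  # frames kept from the video track
    still = 1 if replaced_s > 0 else 0                  # the still image contributes one frame
    return remaining + still
```

Under these assumptions, a 10-second clip rendered at 30 frames/second comprises 300 frames; replacing 4 seconds of its video track with a still image leaves 181 frames, with a commensurate reduction in file size.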
- Reducing the file size and/or memory requirements of digital media messages in this way has many technical effects and/or advantages. For example, such a reduction in file size and/or memory requirements will enable the various networks, servers, and/or electronic devices described herein to transfer and/or render digital media messages more quickly and with fewer network, server, and/or device resources. As a result, such a reduction in file size and/or memory requirements will reduce overall network load/traffic, and will improve network, server, and/or electronic device performance. As another example, such a reduction in file size and/or memory requirements will enable the various networks, servers, and/or electronic devices described herein to provide, render, display, and/or otherwise process such digital media messages more quickly and with fewer network, server, and/or device resources. In particular, such a reduced file size may reduce the server and/or electronic device memory required to receive and/or store such messages. Such a reduced file size may also reduce the processor load required to provide, render, display, and/or otherwise process such digital media messages. As a result, such a reduction in file size and/or memory requirements will reduce overall network load/traffic, and will improve network, server, and/or electronic device performance and efficiency. Additionally, various embodiments of the present disclosure may enable digital content segments associated with the digital media message to be uploaded to a server computer or other cloud-based resource substantially in real time. In some examples, various digital content segments associated with the digital media message being created may be transferred to such a server computer while the digital content segment is being captured by a user device. 
As a result, the time required for transferring and/or rendering the finished digital media message on an additional user device may be greatly reduced.
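The real-time upload behavior described above can be sketched as a simple producer-consumer handoff. This is a sketch under stated assumptions: `chunks` stands in for the device's capture loop and `upload` for whatever server transfer the system uses; neither is an API from the disclosure.

```python
import queue
import threading

def capture_and_upload(chunks, upload):
    """Hand each captured chunk to an uploader thread as soon as it exists,
    so transfer overlaps capture instead of starting only after the message
    is complete. `chunks` and `upload` are caller-supplied stand-ins."""
    pending: queue.Queue = queue.Queue()

    def uploader() -> None:
        while True:
            chunk = pending.get()
            if chunk is None:      # sentinel: capture finished
                return
            upload(chunk)

    worker = threading.Thread(target=uploader)
    worker.start()
    for chunk in chunks:           # stand-in for the capture loop
        pending.put(chunk)         # upload begins while capture continues
    pending.put(None)
    worker.join()
```

Because upload proceeds while capture continues, only the final chunk remains to be transferred once the user finishes composing the message.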
- Illustrative environments, devices, and techniques for generating digital media messages are described below. However, the described message generation techniques may be implemented in other environments and by other devices or techniques, and this disclosure should not be interpreted as being limited to the example environments, devices, and techniques described herein.
-
FIG. 1 is a schematic diagram of an illustrative computing environment 100 for implementing various embodiments of digital media message generation. The computing environment 100 may include server(s) 102 and one or more electronic devices 104(1)-104(N) (collectively "electronic devices 104") that are communicatively connected by a network 106. The network 106 may be a local area network ("LAN"), a larger network such as a wide area network ("WAN"), or a collection of networks, such as the Internet. Protocols for network communication, such as TCP/IP, may be used to implement the network 106. Although embodiments are described herein as using a network such as the Internet, other distribution techniques may be implemented that transmit information via memory cards, flash memory, or other portable memory devices. - A
media message engine 108 on the electronic devices 104 and/or a media message engine 110 on the server(s) 102 may receive one or more digital content segments 112(1)-112(N) (collectively, "digital content segments 112" or "content segments 112") and may generate one or more digital media messages 114 (or "media messages 114") based on the content segments 112. In example embodiments, the media message engine 108 may receive one or more content segments 112 via interaction of a user 116 with an electronic device 104. In some embodiments, the media message engine 108 may provide such content segments 112 to the media message engine 110 on the server 102, via the network 106, to generate at least a portion of the media message 114. Alternatively, at least a portion of the media message 114 may be generated by the media message engine 108 of the respective electronic device 104. In either example, the media message 114 may be directed to one or more additional electronic devices 118(1)-118(N) (collectively "electronic devices 118") via the network 106. Such electronic devices 118 may be disposed at a location remote from the electronic devices 104, and one or more users 120 may consume the digital media message 114 via one or more of the electronic devices 118. - Each of the
electronic devices 104 may include a display component, a digital camera, and an audio input and transmission component. Such audio input and transmission components may include one or more microphones. The electronic devices 104 may also include hardware and/or software that support voice over Internet Protocol (VoIP) as well as any of the display, input, and/or output components described herein. Each of the electronic devices 104 may further include a web browser that enables the user 116 to navigate to a web page via the network 106. In some embodiments, the user 116 may generate and/or capture one or more digital content segments 112 using, for example, the camera and the microphone. For example, the user 116 may capture one or more digital images using the camera and/or may capture one or more video clips using the camera in conjunction with the microphone. Additionally, each web page may present content that the user 116 may capture via the electronic device 104, using various copy and/or save commands included in the web browser of the electronic device 104, and the user may incorporate such content into one or more content segments 112. Any of the content segments 112 described herein may be provided to one or both of the media message engines 108, 110, and the media message engines 108, 110 may incorporate such content segments 112 into the media message 114. - Upon receiving the
content segments 112 described herein, the media message engines 108, 110 may tag the respective content segments 112 with associated metadata. The associated metadata may include profile information about the type of content (e.g., image, video, audio, text, animation, etc.), the source of the content segment 112 (e.g., camera, microphone, internet web page, etc.), and/or a position in a play sequence of the digital media message 114 with which the content segment 112 is to be associated. - The
media message engines 108, 110 may combine the digital content segments 112 to form the digital media message 114. In some examples, the digital content segments 112 may be presented to the user sequentially when the media message 114 is played. Alternatively, the media message engines 108, 110 may combine the digital content segments 112 such that the combined digital content segments 112 are presented simultaneously when the media message 114 is played. The media message engines 108, 110 may also direct the media message 114 to one or more of the electronic devices 118. Various example components and functionality of the media message engines 108, 110 are described below with respect to FIGS. 2 and 3. - In various embodiments, the
electronic devices 104, 118 may be configured to display and/or otherwise present the digital content segments 112, either separately or combined, as well as the various digital media messages 114 described herein. -
FIG. 2 is a schematic diagram of illustrative components in example server(s) 102 of the present disclosure. The server(s) 102 may include one or more processor(s) 202 and memory 204. The memory 204 may include computer readable media. Computer readable media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. As defined herein, computer readable media does not include communication media in the form of modulated data signals, such as carrier waves, or other transmission mechanisms. - The
media message engine 110 may be a hardware or a software component of the server(s) 102 and, in some embodiments, the media message engine 110 may comprise a component of the memory 204. As shown in FIG. 2, in some embodiments the media message engine 110 may include one or more of a content presentation module 206, a segment collection module 208, an analysis module 210, an integration module 212, and a distribution module 214. The modules may include routines, programs, instructions, objects, and/or data structures that perform particular tasks or implement particular abstract data types. The server(s) 102 may also implement a data store 216 that stores data, digital content segments 112, and/or other information or content used by the media message engine 110. - The
content presentation module 206 may enable a human reader to select digital content segments 112 for the purpose of including the selected digital content segments 112 in a digital media message 114. In various embodiments, the content presentation module 206 may present a web page to a user 116 of an electronic device 104, such as via the network 106. In further embodiments, the content presentation module 206 may present digital content, information, and/or one or more digital content segments 112 to the user 116 of an electronic device 104 via the network 106. The content presentation module 206 may also enable the user 116 to select content, information, and/or one or more digital content segments 112. Once the user 116 has selected, for example, a digital content segment 112, the content presentation module 206 may present further content, information, and/or digital content segments 112 to the user 116. The content presentation module 206 may also tag the selected digital content segment 112 for inclusion in the digital media message 114. - The
segment collection module 208 may collect audio recordings, video recordings, images, files, web content, audio files, video files, web addresses, and/or other digital content segments 112 identified, selected, and/or captured by the user 116. Additionally, the segment collection module 208 may label each digital content segment 112 with metadata. The metadata may include profile information about the type of content (e.g., image, video, audio, text, animation, etc.), the source of the content segment 112 (e.g., camera, microphone, internet web page, etc.), and/or a position in a play sequence of the digital media message 114 with which the content segment 112 is to be associated. For example, the metadata for an audio recording may include identification information identifying the digital content segment 112 as comprising an audio recording, information indicating that the digital content segment 112 was captured using a microphone of an electronic device 104, information indicating the date and time of recordation, the length of the recording, and/or other information. Such metadata may be provided to the content presentation module 206 by the segment collection module 208 or, alternatively, such metadata may be provided to the segment collection module 208 by the content presentation module 206. - The
analysis module 210 may be used by the segment collection module 208 to determine whether a collected content segment 112 meets certain quality criteria. In various embodiments, the quality criteria may include whether a background noise level in the content segment 112 is below a maximum noise level, whether video and/or image quality in the content segment 112 is above a minimum pixel or other like quality threshold, and so forth. - The
integration module 212 may use at least a portion of the metadata described above to assess and/or otherwise determine which content segment 112 to select for integration into the digital media message 114. Additionally or alternatively, the integration module 212 may use results received from the analysis module 210 to make one or more such determinations. Such determinations may be provided to the user 116 of the electronic device 104 while a digital media message 114 is being generated as a way of guiding the user with regard to the combination of one or more content segments 112. For instance, the integration module 212 may provide advice, suggestions, or recommendations to the user 116 as to which content segment 112 to select for integration into the digital media message 114 based on one or more of the factors described above. - The
distribution module 214 may facilitate presentation of the digital media message 114 to one or more users 120 of the electronic devices 118. For example, once completed, the distribution module 214 may direct the digital media message 114 to one or more of the electronic devices 118 via the network 106. Additionally or alternatively, the distribution module 214 may be configured to direct one or more digital content segments 112 between the servers 102 and one or more of the electronic devices 104. In such embodiments, the distribution module 214 may comprise one or more kernels, drivers, or other like components configured to provide communication between the servers 102 and one or more of the electronic devices 104, 118. - The
data store 216 may store any of the metadata, content, information, or other data utilized in creating one or more content segments 112 and/or digital media messages 114. For example, the data store 216 may store any of the images, video files, audio files, web links, media, or other content that is captured or otherwise received via the electronic device 104. Such content may be, for example, provided to the data store 216 via the network during creation of a content segment 112 and/or a digital media message 114. Alternatively, such content may be provided to the data store 216 prior to generating a content segment 112 and/or a digital media message 114. In such examples, such content may be obtained and/or received from the data store 216 during generation of a content segment 112 and/or a digital media message 114. - In example embodiments, one or more modules of the
media message engine 110 described above may be combined or omitted. Additionally, one or more modules of the media message engine 110 may also be included in the media message engine 108 of the electronic device 104. As a result, the example methods and techniques of the present disclosure, such as methods of generating a digital media message, may be performed solely on either the server 102 or one of the electronic devices 104. Alternatively, in further embodiments, methods and techniques of the present disclosure may be performed, at least in part, on both the server 102 and one of the electronic devices 104. -
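The segment metadata described above (content type, source, and play-sequence position) might be represented as a simple record. This is a minimal sketch: the class and field names are illustrative assumptions, not identifiers from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class ContentSegment:
    # Metadata fields mirroring the profile information described above;
    # all names and example values are assumptions for illustration.
    content_type: str          # e.g. "image", "video", "audio", "text", "animation"
    source: str                # e.g. "camera", "microphone", "internet web page"
    play_position: int         # position in the play sequence of the media message
    extra: dict = field(default_factory=dict)  # e.g. capture date/time, recording length

segment = ContentSegment("audio", "microphone", 0,
                         {"captured": "2015-04-10T12:00", "length_s": 12.5})
```

Tagging each segment with such a record would let either engine sort segments into a play sequence and report a segment's type and source without inspecting its raw data.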
FIG. 3 is a schematic diagram of illustrative components in an example electronic device 104 that is used to prepare and/or consume digital content segments 112 and digital media messages 114. As noted above, the electronic device 104 shown in FIG. 3 may include one or more of the components described above with respect to the server 102 such that digital content segments 112 and/or digital media messages 114 may be created and/or consumed solely on the electronic device 104. Additionally and/or alternatively, the electronic device 104 may include one or more processor(s) 302 and memory 304. The memory 304 may include computer readable media. Computer readable media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. As defined herein, computer readable media does not include communication media in the form of modulated data signals, such as a carrier wave, or other transmission mechanisms. - Similar to the
memory 204 of the server 102, the memory 304 of the electronic device 104 may also include a media message engine 108, and the engine 108 may include any of the modules or other components described above with respect to the media message engine 110. Additionally or alternatively, the media message engine 108 of the electronic device 104 may include one or more of a content interface module 306, a content display module 308, a user interface module 310, and a data store 312 similar to the data store 216 described above. The modules may include routines, programs, instructions, objects, and/or data structures that perform particular tasks or implement particular abstract data types. The electronic device 104 may also include one or more cameras, microphones, displays (e.g., a touch screen display), keyboards, mice, touch pads, proximity sensors, capacitance sensors, or other user interface devices 314. - The
content interface module 306 may enable the user to request and download content, digital content segments 112, or other information from the server(s) 102 and/or from the internet. The content interface module 306 may download such content via any wireless or wired communication interfaces, such as Universal Serial Bus (USB), Ethernet, Bluetooth®, Wi-Fi, and/or the like. Additionally, the content interface module 306 may include and/or enable one or more search engines or other applications on the electronic device 104 to enable the user 116 to search for images, video, audio, and/or other content to be included in a digital media message 114. - The
content display module 308 may present content, digital content segments 112, digital media messages 114, or other information on a display of the electronic device 104 for viewing. In various embodiments, the content display module 308 may provide functionalities that enable the user 116 to manipulate individual digital content segments 112 or other information as a digital media message 114 is being generated. For example, the content display module 308 may provide editing functionality enabling the user 116 to delete, move, modify, augment, cut, paste, copy, save, or otherwise alter portions of each digital content segment 112 as part of generating a digital media message 114. -
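Two of the editing operations attributed to the content display module above, deleting a segment and moving a segment within the play sequence, can be sketched as list manipulations. The function names and the list-of-labels shape are illustrative assumptions, not the disclosure's data model.

```python
def remove_segment(play_sequence: list, index: int) -> list:
    """Return the play sequence with the segment at `index` removed,
    as an 'undo'-style delete might do."""
    return play_sequence[:index] + play_sequence[index + 1:]

def move_segment(play_sequence: list, src: int, dst: int) -> list:
    """Return the play sequence with one segment moved to a new position."""
    updated = list(play_sequence)
    updated.insert(dst, updated.pop(src))
    return updated
```

Both sketches return a new sequence rather than mutating the original, which keeps earlier states available for an "undo" control.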
FIG. 4 shows an illustrative user interface 400 that enables the user 116 to generate a digital media message 114. For example, the user interface 400 may be displayed on an electronic device 104 that enables users to create, capture, search for, and/or select digital content segments 112, and to generate and/or consume digital media messages 114. The user interface 400 may be displayed, for example, on a display 402 of the electronic device 104. In some examples, the user interface 400 may be a web page that is presented to the user 116 via a web browser on the electronic device 104. Alternatively, the user interface 400 may be an interface generated and provided by the content display module 308 as part of a digital media message generation application operating locally on the electronic device 104. For the duration of this disclosure, example embodiments in which the user interface 400 is generated and provided by the content display module 308 and/or the message generation engine 108 as part of a digital media message generation application operating locally on the electronic device 104 will be described unless otherwise noted. - As shown, the
message generation engine 108 may present a user interface 400 that includes a first portion 404 displaying an image 406, and a second portion 408 that includes one or more thumbnails 410(1)-410(N) (collectively "thumbnails 410"). In example embodiments, the image 406 displayed in the first portion 404 may be one or more images, photos, or first frames of a video stored in the memory 304 of the electronic device 104. Alternatively, the content display module 308 may present one or more images 406 in the first portion 404 that are obtained in real time via, for example, a camera or other user interface device 314 of the electronic device 104. For example, the first portion 404 may provide an image 406 of objects that are within a field of view of the camera, and at least the first portion 404 may be receptive to user input such as, for example, touch input, touch and hold input, swipe input, tap input, double tap input, pinch input, and/or other gestures. - As will be described in greater detail below, the
message generation engine 108 may receive input from a user of the electronic device 104 via either the first portion 404 or the second portion 408. In some embodiments, such input may comprise one or more gestures such as a touch and hold command within the first portion 404. Receipt of such an input in the first portion 404 may cause the message generation engine 108 to capture and/or otherwise receive a first digital content segment 112 via, for example, the camera or other user interface device 314 of the electronic device 104. In such embodiments, the received digital content segment 112 may be displayed within the first portion 404 as the content segment 112 is being recorded and/or otherwise captured by the camera. The message generation engine 108 may associate the digital content segment 112 with a desired position in a play sequence of a digital media message 114. - Additionally and/or alternatively, the
message generation engine 108 may receive input from the user of the electronic device 104 that includes a touch and hold command on one or more of the thumbnails 410 provided in the second portion 408. Receipt of such an input in the second portion 408 may cause the message generation engine 108 to receive a video segment and/or an image associated with the respective thumbnail 410 for inclusion in the digital media message 114. The message generation engine 108 may also associate digital content segments 112 received by selection of one or more of the thumbnails 410 with the respective desired position in the play sequence of the digital media message 114. - In example embodiments, each of the
thumbnails 410 may be representative and/or otherwise indicative of a respective photo, image, and/or video stored in the memory 304. For example, such content may have been captured by a user 116 of the electronic device 104 prior to commencing generation of the digital media message 114. Alternatively, one or more photos, images, videos, and/or other content corresponding to one or more of the thumbnails 410 may be captured during generation of the digital media message 114. Thus, in some embodiments, the second portion 408 may comprise a scrollable thumbnail library including respective thumbnails 410 that may be selected by the user 116 for inclusion in the digital media message 114. - As shown in
FIG. 4, the user interface 400 may also include one or more controls configured to assist the user 116 in capturing one or more digital content segments 112, modifying one or more of the digital content segments, and/or generating one or more digital media messages 114. For example, the user interface 400 may include a zoom control 412 configured to enlarge or reduce, for example, the size of the image 406 shown in the first portion 404 and/or to enlarge or reduce the size of the first portion 404 itself. The user interface 400 may also include a user interface device control 414 configured to control one or more operations of the user interface devices 314 of the electronic device 104. For example, the user interface device control 414 may be configured to control activation of one or more cameras of the device 104. In particular, the user interface device control 414 may be configured to select and/or toggle between a first camera of the electronic device 104 on a first side of the electronic device 104 and a second camera on a second side of the electronic device 104 opposite the first side. - The
user interface 400 may also include a plurality of additional controls, including one or more navigation controls 416 and/or one or more editing controls 418. For example, the user interface 400 may include a navigation control 416 that, upon selection thereof by the user 116, may enable the user to browse backward or forward between different user interfaces 400 while generating a digital media message 114. For example, a first navigation control 416 may comprise a "back" control while a second navigation control 416 may comprise a "forward" control. - Additionally, one or more of the editing controls 418 may enable a
user 116 to add, remove, cut, paste, draw, rotate, flip, shade, color, fade, darken, and/or otherwise modify various aspects of the digital media message 114 and/or various digital content segments 112. For example, one or more of the editing controls 418 may comprise an "undo" control that enables the user 116 to delete and/or otherwise remove one or more digital content segments 112 from a play sequence of the digital media message 114. Although a variety of different controls have been described above with regard to the user interface 400, it is understood that in further example embodiments one or more additional controls may be presented to the user 116 by the media message engine 108. For example, such editing controls 418 may further comprise any audio, video, image, or other editing tools known in the art. In some examples, at least one of the controls described herein may be configured to modify a first digital content segment 112 before a second, third, or other additional digital content segment 112 is received by the media message engine 108. - The
user interface 400 may also include a message bar 420 configured to provide guidance to the user 116 before, during, and/or after generation of the digital media message 114. For example, the message bar 420 may provide instructions to the user 116 and/or other guidance related to use of one or more of the controls described above, next steps to be taken in order to generate the digital media message 114, the completion status of the digital media message 114, and/or other information. As shown in FIG. 4, in example embodiments the message bar 420 may be disposed between the first portion 404 and the second portion 408. Alternatively, in further example embodiments the message bar 420 may be disposed above the first portion 404, below the second portion 408, and/or at any other position on the user interface 400. In an example embodiment, the message bar 420 may instruct the user 116 to touch and hold, for example, the first portion 404 or the second portion 408 in order to begin generating a digital media message 114. -
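The touch-and-hold behavior described above, capturing a new segment when the gesture lands in the first portion and pulling a saved segment when it lands on a thumbnail in the second portion, can be sketched as a simple dispatch. The region labels and returned action strings are illustrative assumptions, not identifiers from the disclosure.

```python
def handle_touch_and_hold(region: str, thumbnail_index=None) -> str:
    """Map a touch-and-hold gesture to an action based on where it lands."""
    if region == "first_portion":
        # Record a new segment via the camera and/or microphone.
        return "capture_new_segment"
    if region == "second_portion" and thumbnail_index is not None:
        # Add the saved photo/video behind the selected thumbnail.
        return f"add_saved_segment:{thumbnail_index}"
    raise ValueError(f"unhandled gesture target: {region!r}")
```

In either case, the engine would then associate the resulting segment with the desired position in the message's play sequence.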
FIG. 5 illustrates another example user interface 500 of the present disclosure. In example embodiments, the media message engine 108 may provide such an example user interface 500 during the process of generating a digital media message 114 and, for example, after at least one digital content segment 112 has been received by the media message engine 108 via the electronic device 104. For example, the user interface 500 may include visual indicia of a play sequence 502 associated with the digital media message 114 that is currently being generated. Such visual indicia may include a first portion corresponding to a first digital content segment 112 received by the media message engine 108, and at least one additional portion corresponding to a respective additional digital content segment 112 received by the media message engine 108. - In some examples, the visual indicia of the
play sequence 502 may include one or more thumbnails 504 illustrating and/or otherwise indicative of respective digital content segments 112 that have previously been added to and/or otherwise associated with the digital media message 114. In example embodiments, the visual indicia of the play sequence 502 may include various thumbnails 504 provided in the sequential order in which each respective content segment 112 has been received by the media message engine 108. For example, digital content segments 112 received earlier in time during the generation of a digital media message 114 may be represented by respective thumbnails 504 disposed further to the left-hand side of the display 402 than additional thumbnails 504 representing respective digital content segments 112 received relatively later in time. Further, in example embodiments each respective thumbnail 504 may illustrate one or more scenes from a video, a representation of a photo or image, and/or any other visual representation of the respective digital content segment 112 to which the thumbnail 504 corresponds. In this way, the thumbnails 504 provided as part of the visual indicia of the play sequence 502 may assist the user 116 in recalling the content and/or general flow of the digital media message 114 during creation thereof. The example thumbnail 504 illustrated in FIG. 5 is representative of, for example, the image 406 described above with respect to FIG. 4. In such an example, a video, photo, image, or other such content associated with a digital content segment 112 received via the user interface 400 of FIG. 4 may, for example, be associated with a first position in the play sequence associated with the user interface 500 of FIG. 5. - The
user interface 500 may also include one or more controls associated with the visual indicia of the play sequence 502. For example, such controls may include a play control 506. In example embodiments, the play control 506 may be configured to play, display, and/or otherwise provide a preview of the digital media message 114 to the user via, for example, the first portion 404 of the display 402. In such embodiments, the media message engine 108 may play one or more portions of the digital media message 114 currently being generated in response to receiving a touch input and/or other input via the play control 506. In some embodiments, further functionality may be provided to the user 116 via the play control 506 and/or via one or more additional controls associated with the play control 506. - For example, the
play control 506 and/or other associated controls may enable the user 116 to increase or decrease the speed at which the preview of the digital media message 114 is provided. The play control 506 and/or other associated controls may also enable the user 116 to skip between multiple digital content segments 112 associated with the corresponding play sequence. Additionally, the play control 506 and/or other associated controls may enable the user 116 to pause the preview of the digital media message 114. In examples in which such functionality is provided via the play control 506, such functionality may be accessed via multiple taps, multiple touches, or other gestures such as swipe gestures, and the like received via the first portion 404. Alternatively, in examples in which such functionality is provided via one or more additional play controls, such additional play controls may be rendered, displayed, and/or otherwise provided via the display 402 at a location, for example, proximate the play control 506. - As shown in
FIG. 5, in some examples the media message engine 108 may provide an image 508 to the user 116 via the first portion 404. In such examples, the image 508 may correspond to one or more of the thumbnails 410 shown in the second portion 408. For example, in some embodiments the user 116 may select a thumbnail 410 of the second portion 408 by touching and/or holding the desired thumbnail 410 with the hand 422 of the user 116. When such a thumbnail 410 is selected in this way, an image 508 corresponding to the selected thumbnail 410 may be displayed in the first portion 404. For example, the media message engine 108 may receive at least one digital content segment 112 in response to selection of one or more such thumbnails 410 by the user 116. In example embodiments, when one or more of the thumbnails 410 is selected in this way, the media message engine 108 may not only receive a first digital content segment 112 comprising a photo, video, image, and/or other content corresponding to the selected thumbnail 410, but may also receive a different additional content segment 112 while the surface and/or portion of the display 402 corresponding to the thumbnail 410 is contacted by the hand 422 of the user 116. - For example, in such embodiments the additional
digital content segment 112 may comprise audio or other like input captured by a microphone or other user interface device 314 of the electronic device 104 while the surface and/or portion of the display 402 corresponding to the thumbnail 410 is contacted by the hand 422 of the user 116. In such embodiments, both of the respective digital content segments may be added to the play sequence of the digital media message 114 such that the respective digital content segments 112 are presented simultaneously when the digital media message 114 is played. As shown in FIG. 5, the image 508 may correspond to the thumbnail 410(2) currently being contacted by the hand 422 of the user 116. -
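The behavior described above, in which a selected image and audio recorded during the touch-and-hold become a single play-sequence entry whose parts are presented simultaneously on playback, can be modeled as shown below. All names are illustrative assumptions rather than the disclosure's actual implementation.

```python
# Hypothetical sketch: a selected image segment is paired with audio captured
# while the corresponding thumbnail is held, so both are presented
# simultaneously when the digital media message is played.

def build_clip(image_segment_id, audio_segment_id):
    # One play-sequence entry pairing visual content with concurrent audio.
    return {"video": image_segment_id, "audio": audio_segment_id}

play_sequence = []

# User touches and holds thumbnail 410(2): the corresponding image is
# received, and microphone audio is captured for the duration of the hold.
play_sequence.append(build_clip("image-410-2", "mic-capture-001"))

# On playback, each entry's video and audio components are rendered together.
for clip in play_sequence:
    print(clip["video"], clip["audio"])
```

Each additional touch-and-hold would append a further paired entry, preserving the sequential order in which the segments were received.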
FIG. 6 illustrates a further user interface 600 provided by the media message engine 108. In example embodiments, the media message engine 108 may provide such an example user interface 600 during the process of generating a digital media message 114 and, for example, after a plurality of digital content segments 112 have been received by the media message engine 108 via the electronic device 104. For example, the user interface 600 may include visual indicia of the play sequence 502 that includes the thumbnail 504 described above with respect to FIG. 5, as well as a thumbnail 602 illustrating and/or otherwise indicative of a digital content segment 112 associated with the image 508 described above with respect to FIG. 5. - As noted above, the
various thumbnails 504, 602 of the play sequence 502 may be provided in the sequential order in which each respective content segment 112 has been received by the media message engine 108. For example, the thumbnail 504 is disposed further to the left-hand side of the display 402 than the thumbnail 602, thereby indicating that a digital content segment 112 corresponding to the thumbnail 504 was received earlier in time than a digital content segment 112 corresponding to the thumbnail 602. - As shown in
FIG. 6, in some examples the media message engine 108 may provide an image 604 to the user 116 via the first portion 404. In such examples, the image 604 may correspond to one or more of the thumbnails 410 shown in the second portion 408. For example, in the embodiment of FIG. 6 the user 116 may select a thumbnail 410(3) of the second portion 408 by touching and/or holding a section and/or surface of the display 402 associated with the desired thumbnail 410(3). As described above with respect to FIG. 5, when such a thumbnail 410(3) is selected by the user 116 in this way, the image 604 corresponding to the selected thumbnail 410(3) may be displayed in the first portion 404. - For example, the
media message engine 108 may receive at least one digital content segment 112 in response to selection of one or more such thumbnails 410(3) by the user 116. In example embodiments, when the thumbnail 410(3) is selected in this way, the media message engine 108 may not only receive a first digital content segment 112 comprising a photo, video, image, and/or other content corresponding to the selected thumbnail 410(3), but may also receive a different second content segment 112 while the surface and/or portion of the display 402 corresponding to the thumbnail 410(3) is contacted by the hand 422 of the user 116. For example, in such embodiments the second digital content segment 112 may comprise audio or other like input captured by a microphone or other user interface device 314 of the electronic device 104 while the surface and/or portion of the display 402 corresponding to the thumbnail 410(3) is contacted by the hand 422 of the user 116. In such embodiments, receiving such first and second content segments 112 may cause, for example, the media message engine 108 or other components of the electronic device 104 to store at least one of the first and second content segments 112 in the memory 304 and/or in the memory 204 of the server 102. In some embodiments, the first digital content segment 112 may be stored separately from the second digital content segment 112. Additionally, the first and second digital content segments 112 may be added to the play sequence of the digital media message 114 such that the respective digital content segments 112 are presented simultaneously when the digital media message 114 is played. - In particular, in example embodiments in which the first
digital content segment 112 comprises a photo, video, image, audio, and/or other content corresponding to the selected thumbnail 410(3), and a second digital content segment 112 comprises an audio segment, a video segment, or other like input captured by a user interface device 314 of the electronic device 104 while the surface and/or portion of the display 402 corresponding to the thumbnail 410(3) is contacted by the hand 422, the media message engine 108 may combine such first and second digital content segments 112. By combining such digital content segments, the second digital content segment 112 (e.g., an audio segment or a video segment) may be presented simultaneously with the first digital content segment 112 (e.g., a photo, video, image, audio, or other content) when the digital media message 114 is played. Combining digital content segments 112 in this way may include generating a combined segment that is configured such that, for example, audio from the second content segment 112 described above is presented simultaneously with at least one of a photo, video, image, audio, or other content of the first content segment 112 when a portion of the digital media message 114 corresponding to the combined segment is played. In such examples, the media message engine 108 may associate the combined segment with any position in the play sequence desired by the user 116. - Further, the
user interface 600 may also include one or more controls configured to enable the user 116 to share the digital media message 114 with other users, such as users 120 of remote electronic devices 118. For example, the user interface 600 may include one or more share controls 606. In example embodiments, when one or more such share controls 606 is actuated by the user 116, the media message engine 108 may provide, such as via the display 402, a plurality of additional controls configured to assist the user 116 in providing the digital media message 114 for sharing with a remote electronic device 118. Such additional controls will be described in greater detail below. -
FIG. 7 illustrates yet another example user interface 700 of the present disclosure. In example embodiments, the media message engine 108 may provide such an example user interface 700 during the process of generating a digital media message 114 and, for example, after a final digital content segment 112 has been received by the media message engine 108 via the electronic device 104. For example, the user interface 700 may include visual indicia of the play sequence 502 that includes the thumbnails described above with respect to FIGS. 5 and 6, as well as a thumbnail 702 illustrating and/or otherwise indicative of a digital content segment 112 associated with the image 604. - The
user interface 700 may also include an image 704, and the image 704 may be one or more images, photos, or first frames of a video stored in the memory 304 of the electronic device 104. Alternatively, the content display module 308 may present one or more images 704 in the first portion 404 that are obtained in real time via, for example, a camera or other user interface device 314 of the electronic device 104. For example, the first portion 404 may provide an image 704 of objects that are within a field of view of the camera. - In some embodiments, the
media message engine 108 may receive an input, such as a touch input, indicative of selection of the control 606 by the user 116. In response to receiving such an input, the media message engine 108 may provide the example user interface 800 illustrated in FIG. 8. Such an example user interface 800 may include, among other things, a message thumbnail 801 indicating and/or otherwise identifying the digital media message 114 that the user 116 desires to share. In example embodiments, such a message thumbnail 801 may be similar to one or more of the thumbnails described above. In some examples, however, the message thumbnail 801 may be larger than one or more of those thumbnails, thereby enabling the user 116 to distinguish the message thumbnail 801 from one or more of the thumbnails of the play sequence 502. In example embodiments, the message thumbnail 801 may comprise, for example, a first frame and/or any other image or content indicative of the digital media message 114 being generated by the user 116. - Such an
example user interface 800 may also include a plurality of controls configured to assist the user 116 in providing the digital media message 114 for sharing with, for example, a remote electronic device 118, such as via the network 106. For example, one or more of the controls 802 may enable the user 116 to add a title, a name, and/or other identifier to the media message 114 such that the media message 114 may be easily recognizable and/or identifiable by one or more users 120 of the remote electronic device 118. In some examples, the title and/or other identifier added to the media message 114 may be provided to the user 120 simultaneously and/or otherwise in conjunction with the digital media message 114 when the user 120 consumes at least a portion of the digital media message 114 on the remote electronic device 118. - In addition, the
user interface 800 may include one or more controls configured to enable the user 116 to privatize the digital media message 114 prior to providing the digital media message 114 for sharing with a remote electronic device 118. For example, one or more such controls 804 may enable the user 116 to encrypt and/or otherwise configure the digital media message 114 such that only an approved user 120 or plurality of users 120 may receive and/or access the digital media message 114. In example embodiments, the media message engine 108 may receive an input, such as a touch input, indicative of selection of the control 804 by the user 116. In response to receiving such an input, the media message engine 108 may enable the user 116 to browse, for example, an address book or other like directory stored in the memory 304 of the electronic device 104 and/or in the memory 204 of the server 102. Upon browsing such a directory, the user 116 may select one or more contacts approved by the user 116 to have access to the digital media message 114. Additionally or alternatively, in response to receiving such an input, the media message engine 108 may enable the user 116 to password protect and/or otherwise encrypt the digital media message 114 prior to sharing. In any of the example embodiments described herein, one or more of the controls 806 may comprise a slide bar and/or other like icon indicating whether the user 116 has privatized the digital media message 114. For example, such a control 806 may change color, transition between a "no" indication and a "yes" indication, and/or may otherwise provide a visual indication of the privacy status/level of the digital media message 114. - The
user interface 800 may also include one or more controls 808 configured to enable the user 116 to select one or more means of providing the digital media message 114 for sharing with a remote electronic device 118. For example, one or more such controls 808 may enable the user 116 to select from a plurality of common social media websites and/or other portals useful in sharing the digital media message 114. In such example embodiments, the media message engine 108 may receive an input, such as a touch input, indicative of selection of the control 808 by the user 116. In response to receiving such an input, the media message engine 108 may enable the user 116 to access an existing account on the selected social media portal. Once such an account has been accessed, the media message engine 108 may provide the digital media message 114 to the selected social media portal for sharing with remote users 120 via the selected portal. - One or more
such controls 808 may also enable the user 116 to select between email, text messaging (SMS), instant messaging, and/or other like means for sharing the digital media message 114. In such example embodiments, the media message engine 108 may receive an input, such as a touch input, indicative of selection of the control 808 by the user 116. In response to receiving such an input, the media message engine 108 may enable the user 116 to browse, for example, an address book or other like directory stored in the memory 304 of the electronic device 104 and/or in the memory 204 of the server 102. Upon browsing such a directory, the user 116 may select one or more contacts with which the user 116 desires to share the digital media message 114. Upon selecting such contacts, the user 116 may provide the digital media message 114 to the selected users by providing an input, such as a touch input, indicative of selection of a share control 810. -
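The sharing flow described above, in which a title, an optional privacy restriction to approved contacts, and a delivery means such as email or SMS are gathered before the message is provided for sharing, can be sketched as a small settings structure. The field and function names below are illustrative assumptions; the disclosure does not specify a data model.

```python
# Hypothetical sketch of the share settings gathered via controls 802-810.

def build_share_settings(title, private, approved_contacts, channel):
    # channel: e.g., "email", "sms", "instant_message", or a social portal.
    if private and not approved_contacts:
        raise ValueError("a private message needs at least one approved contact")
    return {
        "title": title,                        # identifier added via control 802
        "private": private,                    # privacy toggle (controls 804/806)
        "approved_contacts": approved_contacts,
        "channel": channel,                    # delivery means (control 808)
    }

settings = build_share_settings(
    title="Weekend trip",
    private=True,
    approved_contacts=["alice@example.com"],
    channel="email",
)
print(settings["private"], settings["channel"])
```

Actuating the share control 810 would then hand such a settings structure, together with the digital media message 114, to the selected delivery mechanism.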
FIG. 9 shows an example method 900 associated with generating and sharing a digital media message 114. The example method 900 is illustrated as a collection of steps in a logical flow diagram, which represents operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the steps represent computer-executable instructions stored in memory. When such instructions are executed by one or more processors, such instructions may cause the processor, various components of the electronic device, and/or the electronic device, generally, to perform the recited operations. Such computer-executable instructions may include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described steps can be combined in any order and/or in parallel to implement the process. For discussion purposes, and unless otherwise specified, the method 900 is described with reference to the environment 100 of FIG. 1. - At
block 902, the electronic device 104 may receive an input from the user 116 of the electronic device 104. In some examples, such an input may be a touch input, a touch and hold input, a swipe, a drag, a voice command, and/or any other input described herein. For example, at block 902 the media message engine 108 may receive a touch and hold input via the touch-sensitive display 402 of the electronic device 104. In such embodiments, the input received at block 902 may indicate selection of and/or may otherwise correspond to an image, a video, an audio segment, and/or any other digital content segment 112(1) stored in the memory 304 of the electronic device 104 and/or in the memory 204 of the server computer 102. For example, as described above with respect to at least FIGS. 4-7, and as further illustrated by FIGS. 10 and 11, in some examples the electronic device 104 may display a plurality of thumbnails 410 on the display 402. In such examples, the user 116 may provide a touch and hold input via one or more of the thumbnails 410 (e.g., thumbnail 410(1), as shown in FIG. 10), and such an input may indicate that the user 116 has selected the digital image corresponding to the particular thumbnail 410(1) for inclusion in a digital media message 114. In such examples, the selected digital image (e.g., a first digital content segment 112(1)) may be stored in the memory 304 of the electronic device 104 and/or in the memory 204, and visual indicia of a play sequence 502 associated with the digital media message 114 being generated may be displayed on the display 402. Further, the input described with respect to block 902 may be received via the location on the display 402 at which the thumbnail 410(1) is being displayed. - In some embodiments, receiving the input at
block 902 may cause the electronic device 104 to capture a video segment, audio segment, photo, digital image, or other such digital content segment (e.g., a second digital content segment 112(2)) using one or more of the user interface devices 314. For example, at block 904, the processor 302 may cause a microphone, digital camera, and/or other user interface 314 of the electronic device 104 to capture a digital content segment 112(2) in response to the input received at block 902. In such an example, the media message engine 108 may cause the captured digital content segment 112(2) to be stored in the memory 304 and/or the memory 204 of the server 102 for future use. Examples in which at least one of the digital content segments 112(1), 112(2) is transferred to the server 102 for storage in the memory 204 will be described below with respect to block 910. It is understood, however, that in still further example embodiments, each digital content segment 112 used to generate the digital media message 114 may be stored in the memory 204 of the server 102. In such examples, the first and second digital content segments 112(1), 112(2) may not be provided to the server computer 102 at block 904. Instead, the user 116 of the electronic device 104 may access the first and second digital content segments 112(1), 112(2) on the server 102, via the network 106, and may select one or more of the digital content segments 112(1), 112(2) for use in generating the digital media message 114 using the electronic device 104. - As shown schematically in
FIGS. 10 and 11, the first and second digital content segments 112(1), 112(2) may comprise respective portions of a clip of the digital media message 114 being generated. In particular, the digital media message 114 may comprise one or more clips 1000(1), 1000(2), . . . 1000(N) (collectively, "clips 1000"), and each clip 1000 of the digital media message 114 may comprise one or more frames. Further, as shown in FIGS. 10 and 11, the digital media message 114 may comprise an audio track 1002 and a corresponding video track 1004. Thus, in such embodiments, each clip 1000 may make up a respective sequential portion of the audio and video tracks 1002, 1004. As shown in FIG. 10, in the first clip 1000(1) of the digital media message 114, the digital content segment 112(1) (e.g., a digital image) may make up each frame of the video track 1004 and the corresponding digital content segment 112(2) (e.g., an audio segment) may make up each frame of the audio track 1002. It is understood that using the digital image 112(1) to make up each frame of a video track 1004 may reduce the device and/or server memory required for storing the video track 1004, the network bandwidth required to transfer the video track 1004, and/or other device resource requirements relative to comparable video tracks 1004 comprising actual video footage. As a result, using the digital image 112(1) to make up each frame of the video track 1004 may reduce the overall size of the resulting digital media message 114, thus reducing the network bandwidth required to transfer the digital media message 114 and reducing the memory, processor resources, and/or other server or device resources needed to render the digital media message 114. - As illustrated in
FIG. 11, in example embodiments in which a digital media message 114 comprises more than one clip (e.g., first and second clips 1000(1), 1000(2)), the various clips 1000 may comprise any of a variety of formats. For example, such clips 1000 may be made from audio segments, video segments, digital images, and/or any of the other types of digital content segments 112 described herein. In the example shown in FIG. 11, the digital media message 114 being created comprises a first clip 1000(1) in which the audio track 1002 comprises an audio segment 112(2) and in which the video track 1004 comprises a digital image segment 112(1). The digital media message 114 also includes a second clip 1000(2) comprising a digital video. In such examples, the video track 1004 of the digital media message 114 corresponding to the second clip 1000(2) may comprise a video segment 112(3) of the digital video, and the audio track 1002 of the digital media message 114 corresponding to the second clip 1000(2) may comprise an audio segment 112(4) of the digital video. - At
block 906, the electronic device 104 may generate an identifier that is unique to the digital content segment 112(1) described above with respect to block 902. For example, the media message engine 108 may cause the processor 302 and/or other components of the electronic device 104 to generate a unique series of numbers, letters, symbols, and/or other identifiers. In some examples, the identifier generated at block 906 may be randomly generated by the media message engine 108 and/or other components of the electronic device 104. Additionally, at block 906, the electronic device 104 may link, couple, attach, and/or otherwise associate the identifier with the digital content segment 112(1) in the memory 304. For example, in embodiments in which the digital content segment 112(1) comprises a digital file, the media message engine 108 and/or the processor 302 may attach, save, embed, and/or otherwise associate the identifier with the digital content segment 112(1) such that the identifier becomes part of, integrated with, and/or is otherwise carried with the digital content segment 112(1). In some examples, the identifier generated at block 906 may comprise a label and/or other like moniker by which one or more components of the computing environment 100 may identify the digital content segment 112(1). - At
block 908, the electronic device 104 may generate an identifier that is unique to the digital content segment 112(2) described above with respect to block 904. For example, the media message engine 108 may cause the processor 302 and/or other components of the electronic device 104 to generate a unique series of numbers, letters, symbols, and/or other identifiers. Additionally, at block 908, the electronic device 104 may link, couple, attach, and/or otherwise associate the identifier generated at block 908 with the digital content segment 112(2). The processes and/or operations performed by the media message engine 108, the processor 302, and/or other components of the electronic device 104 at block 908 with respect to the digital content segment 112(2) may be similar to and/or the same as the processes and/or operations described above with respect to block 906. - At
block 910, the electronic device 104 may provide one or more of the digital content segments 112(1), 112(2) described herein to the server computer 102 via the network 106. For example, at block 910 the media message engine 108 may cause the processor 302 and/or one or more communication interfaces or other like hardware and/or software devices or components of the electronic device 104 to transfer at least one of the digital content segments 112(1), 112(2) from the electronic device 104 to the server computer 102. In such examples, it is understood that the identifier described above with respect to block 906 may be transferred in association with the digital content segment 112(1) (e.g., the digital image) if the digital content segment 112(1) is transferred by the electronic device 104 at block 910. Additionally, the identifier described above with respect to block 908 may be transferred in association with the digital content segment 112(2) (e.g., the audio segment) if the digital content segment 112(2) is transferred by the electronic device 104 at block 910. In some examples, at least one of the digital content segments 112(1), 112(2) may be transferred by the electronic device 104 to the server computer 102 substantially in real time at block 910. For example, in embodiments in which the digital content segment 112(2) captured at block 904 comprises an audio segment, a video segment, or other like digital segment, the electronic device 104 may begin transferring and/or otherwise transfer such a digital content segment 112 to the server computer 102 as the digital content segment 112 is being captured. - In some examples,
digital content segments 112 transferred to the server computer 102 by the electronic device 104 may be stored in the memory 204 and, over time, may form a global repository of digital content by which the various digital media messages 114 described herein may be formed. In such examples, the server computer 102 may protect against storing multiple copies of the same digital content segments 112 in the memory 204. For example, digital content segments 112 comprising audio segments, video segments, and the like may typically be unique due to the nature in which such digital content segments 112 are created at the user level. Digital content segments 112 comprising digital images, on the other hand, have a greater tendency to be duplicates (e.g., stored locally on multiple electronic devices and/or obtained for inclusion in a digital media message 114 by multiple different users through web-based searches), since users tend to utilize the same clipart, publicly available images, and/or other digital images when generating digital media messages 114. As a result, in some embodiments the server computer 102 may compare the respective unique identifier associated with the one or more digital content segments 112 (generated at blocks 906 and 908) to a plurality of identifiers stored in the memory 204. In some examples, the electronic device 104 may provide the respective unique identifiers to the server computer 102 at block 910 before providing the one or more digital content segments 112 at block 910. Alternatively, the electronic device 104 may provide the respective unique identifiers to the server computer 102 at block 910 along with the transferred digital content segments 112. In any of the examples described herein, the server computer 102 may accept the digital content segment 112 for storage in the memory 204 if no match is found between the unique identifier associated with the received digital content segment 112 and the plurality of identifiers stored in the memory 204.
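The deduplication scheme described here, in which each digital content segment carries a unique identifier (blocks 906, 908) and the server accepts a segment for storage only when no matching identifier is already present, can be sketched as follows. The class and function names are illustrative assumptions; and while the disclosure describes identifiers that may be randomly generated, a content-derived hash is used below as one assumed way of producing an identifier that is the same for identical files.

```python
# Illustrative sketch of the segment-identifier deduplication described above.
# A content hash stands in for the unique identifier; all names are assumed.

import hashlib

def segment_identifier(content: bytes) -> str:
    # Derive an identifier unique to the segment's content (cf. blocks 906/908).
    return hashlib.sha256(content).hexdigest()

class SegmentStore:
    """Hypothetical server-side repository guarding against duplicate copies."""

    def __init__(self):
        self._segments = {}  # identifier -> segment content

    def accept(self, identifier: str, content: bytes) -> bool:
        # Accept for storage only if no matching identifier exists; otherwise
        # deny storage, mirroring the comparison the server is described as
        # performing against its stored plurality of identifiers.
        if identifier in self._segments:
            return False
        self._segments[identifier] = content
        return True

store = SegmentStore()
clipart = b"...same publicly available image bytes..."
ident = segment_identifier(clipart)

print(store.accept(ident, clipart))  # True: first copy is stored
print(store.accept(ident, clipart))  # False: duplicate copy is denied
```

Providing the identifier ahead of the segment, as the text describes, would additionally let the device skip transferring the segment bytes at all when a match is found.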
On the other hand, if the server computer 102 finds a match between the unique identifier associated with the received digital content segment 112 and an identifier of the plurality of identifiers stored in the memory 204, the server computer 102 may deny and/or otherwise prohibit storage of the corresponding digital content segment 112 in the memory 204 based on such a comparison and/or the resulting match. In examples in which the electronic device 104 provides the respective unique identifiers to the server computer 102 at block 910 before providing the one or more digital content segments 112, the digital content segment 112 corresponding to the matched unique identifier may not be sent to the server computer 102, based on and/or as a result of finding such a match. - At
block 912, the electronic device 104 may generate one or more electronic files providing a sequential clip listing associated with rendering the digital media message 114. For example, at block 912 the media message engine 108 may cause the processor 302 and/or one or more other components of the electronic device 104 to generate an edit decision list (hereinafter “EDL”), a text file, a data file, a spreadsheet, a gif file, a tif file, and/or other file indicating an order in which various clips of the digital media message 114 are to be rendered upon playback. FIG. 12 illustrates an example electronic file 1200 of the present disclosure. - As shown in
FIG. 12, an example electronic file 1200 may include a title 1202 of the digital media message 114 being generated. The electronic file 1200 may also include a time-based, frame-based, and/or otherwise sequential clip listing 1204 setting forth the content, components, and/or other parameters of each clip 1000 in the digital media message 114, and the sequential order in which each clip 1000 is to be rendered. For example, as shown in FIG. 12, the clip listing 1204 may include information associated with each respective clip 1000 of the digital media message 114. The information shown in FIG. 12 corresponds to the example clips 1000(1), 1000(2) described above with respect to FIGS. 10 and 11. In such examples, such information may include the content identifiers 1210, 1212 generated at blocks 906 and 908. Such information may also include a first indicator 1214 identifying a first frame (e.g., a start frame) of the digital media message 114 corresponding to a digital content segment 112 being rendered in the respective clip. The information may further include a second indicator 1216 identifying a second frame (e.g., an end frame) of the digital media message 114 corresponding to a digital content segment 112 being rendered in the respective clip. In such examples, the first and second indicators 1214, 1216 may identify the frames of the digital media message 114 at which the corresponding digital content segment(s) 112 is/are to start and stop, respectively. In still further examples, the information may include at least one additional indicator 1218 identifying a volume level corresponding to the digital content segment 112 being rendered in the respective clip. - It is understood that the various indicators described above with respect to the
electronic file 1200 are merely examples. In further embodiments, the electronic files 1200 of the present disclosure may include more, fewer, and/or different indicators than those described herein. Additionally, the order in which the clips 1000(1), 1000(2) are set forth in the clip listing 1204 may correspond to the order in which the digital content segments 112 are generated, organized, and/or otherwise arranged during the process of generating a digital media message 114. Such an order may correspond to, for example, the order set forth in the play sequence 502. Additionally, such an order may be defined by the frames indicated by the indicators 1214, 1216. - It is also understood that the
sequential clip listing 1204 included in the electronic file 1200 may be made up of the various clips 1000(1), 1000(2) described above, and such clips 1000(1), 1000(2) may each comprise respective frame groups formed from the plurality of frames to be included in the digital media message 114. In this way, each frame group may comprise a respective clip 1000 of the digital media message 114. Further, rendering the digital media message 114 in accordance with the sequential clip listing 1204 may cause the digital content segments 112 to be rendered in accordance with the indicators included in the clip listing 1204 for each respective clip 1000. For example, when rendering the digital media message 114 in accordance with the electronic file 1200 shown in FIG. 12, in clip 1000(1) the digital content segment 112 associated with the identifier 1210 will be rendered from frame 0 of the digital media message 114 to frame 121. Likewise, in clip 1000(1) the digital content segment 112 associated with the identifier 1212 will be rendered from frame 0 of the digital media message 114 to frame 121. Accordingly, in some examples, rendering a digital media message 114 in accordance with an electronic file 1200, such as the example electronic file 1200 shown in FIG. 12, may cause a digital image to be presented and/or otherwise rendered on the electronic device simultaneously with an audio segment, and/or other digital content segment 112, from a first frame of the digital media message 114 to a second frame. - At
block 914 the electronic device 104 may provide one or more of the electronic files 1200 described herein to the server computer 102 via the network 106. For example, at block 914 the media message engine 108 may cause the processor 302, and/or one or more communication interfaces or other like hardware and/or software devices or components of the electronic device 104, to transfer an EDL file or other like file 1200 from the electronic device 104 to the server computer 102. In some examples, the electronic file 1200 may be transferred by the electronic device 104 to the server computer 102 substantially in real time at block 914. For example, in such embodiments the electronic file 1200 may be transferred to the server computer 102 as the digital content segment 112 is being generated. In some example embodiments, the electronic device 104 may transfer the electronic file 1200 to the server computer 102 separately from at least one of the digital content segments 112 transferred to the server computer 102 by the electronic device 104 at block 910. In such examples, at least one of the digital content segments 112 may be transferred to the server computer 102 by a first signal generated by the electronic device 104, and the electronic file 1200 may be transferred to the server computer 102 by a second signal generated by the electronic device 104 different from the first signal. In still further examples, a first digital content segment 112(1) may be transferred to the server computer 102 via a first signal generated by the electronic device 104 and sent using the network 106, a second digital content segment 112(2) may be transferred to the server computer 102 via a second signal generated by the electronic device 104 separate from the first signal and sent using the network 106, and the electronic file 1200 may be transferred to the server computer 102 via a third signal generated by the electronic device 104, separate from the first and second signals, and sent using the network 106. - In this way, the various
digital content segments 112 described herein may be sent to the server computer 102 for storage and/or for further use in generating and/or rendering a digital media message 114, and the digital content segments 112 may be provided to the server computer 102 bearing no association with the respective electronic device 104. In some examples, the only information linking the digital content segments 112 provided to the server computer 102 at block 910 with the digital media message 114 being generated by a user 116 of the electronic device 104 may be the electronic file 1200 identifying the digital content segments 112 by their respective unique identifiers. Additionally, the digital content segments 112 provided to the server computer 102 at block 910 may be stored in a first database, data store, division, and/or other first portion of the memory 204, and the electronic file 1200 provided to the server computer 102 at block 914 may be stored in a second database, data store, division, and/or other second portion of the memory 204 different from the first portion. In some examples, the electronic file 1200 may be stored in the second portion of the memory 204 in association with a telephone number, serial number, id number, and/or other like indicator uniquely identifying an electronic device 118 of an intended recipient of the digital media message 114. - At
block 916 the electronic device 104 may provide instructions to the server computer 102 via the network 106. For example, at block 916 the media message engine 108 may cause the processor 302, and/or one or more communication interfaces or other like hardware and/or software devices or components of the electronic device 104, to generate and send a signal comprising instructions to share the digital media message 114 with a second electronic device 118 different from the electronic device 104 via the network 106. In such examples, the instructions may include a telephone number, serial number, id number, and/or other like indicator uniquely identifying the electronic device 118, and the electronic device 118 may belong to an intended recipient of the digital media message 114. - In any of the example embodiments described herein, the
server computer 102 may generate one or more additional electronic files using the various digital content segments 112 provided by the electronic device 104 at block 910. One or more such additional electronic files may also be generated by the server computer 102 based on the electronic file 1200 provided by the electronic device 104 at block 914. The server computer 102 may generate one or more such additional electronic files in response to receiving the instructions provided at block 916 or in response to receiving the electronic file 1200 at block 914. - For example, the
server computer 102 may generate a first additional electronic file comprising a plurality of frame groups formed from at least one of the digital content segments 112 provided by the electronic device 104 at block 910. In such examples, the first additional electronic file may comprise a plurality of frames of the digital media message 114, and may be optimized by the server computer 102 for the purpose of streaming the digital media message 114 from the server computer 102 to an additional electronic device 118 via the network 106. For example, in generating such a first additional electronic file, the server computer 102 may break up one or more of the clips 1000 identified by the electronic file 1200 into frame groups. In such examples, the frame groups may be subgroups of frames that, together, make up the one or more clips 1000 of the digital media message 114. The server computer 102 may then remove digital content from each of the frame groups (such as from the audio track 1002, the video track 1004, etc.) and/or may otherwise modify the quality and/or fidelity level of the individual frames within each of the frame groups. In some examples, one or more entire frames may be removed from one or more of the frame groups. As a result, the various frame groups formed by the server computer 102 may have different respective quality levels. For example, a first frame group formed by the server computer 102 may have a first level of fidelity, and a second frame group formed by the server computer 102 may have a second level of fidelity different from the first level of fidelity. This difference in the level of fidelity may, in some instances, be perceptible when the resulting digital media message (i.e., the first additional electronic file) is streamed and/or otherwise rendered. - Additionally or alternatively, the
server computer 102 may generate a second additional electronic file that is optimized for uploading by the server computer 102 to an additional electronic device 118 and/or for downloading by the electronic device 118. In such examples, the second additional electronic file may comprise a digitally compressed and/or otherwise modified version of one or more frames of the plurality of frames making up the digital media message 114. Accordingly, in generating such an additional electronic file, the server computer 102 may compress the various frames of the digital media message 114 identified by the electronic file 1200. Processing and/or handling such an example second additional electronic file may require less network bandwidth, fewer processor resources, and/or reduced memory storage space relative to a corresponding digital media message 114 rendered using the digital content segments 112 provided by the electronic device 104 at block 910. - Moreover, in any of the example embodiments described herein, the instructions provided at
block 916 may cause the server computer 102 to generate and/or provide a signal to the electronic device 118 via the network 106, using the indicator uniquely identifying the electronic device 118. In such examples, the signal generated by the server computer 102 may be provided to the electronic device 118 via at least one of a text message, an email, and/or a website, such as a social media website. Additionally, in such examples, the signal provided by the server computer 102 may include a request for permission associated with sharing the digital media message 114. For example, such a signal may request that a user 120 of the electronic device 118 provide permission to the server computer 102 for sharing the digital media message 114. - In some examples, in response to requesting permission associated with sharing the
digital media message 114, the server computer 102 may receive additional instructions from the electronic device 118 via the network 106. For example, in response to receiving the signal provided by the server computer 102, the user 120 of the electronic device 118 may grant permission to the server computer 102 to share the media message 114. In such examples, in response to receiving such additional instructions and/or such permission from the electronic device 118, the server computer 102 may stream the first additional electronic file described above via the network 106. In such examples, the user 120 may view the digital media message 114 without downloading and/or otherwise receiving the digital content segments 112 utilized to render the digital media message 114. Additionally or alternatively, in response to receiving such additional instructions and/or such permission from the electronic device 118, the server computer 102 may transfer the second additional electronic file described above to the electronic device 118 via the network 106. In such examples, the server computer 102 may upload the compressed versions of the digital content segments 112 to the electronic device 118 for local rendering and/or viewing on the electronic device 118.
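One way to picture the streaming-optimized first additional electronic file described above (clips broken into frame groups, with some groups reduced in fidelity by removing frames) is the following sketch. The function names, group size, and the drop-every-other-frame rule are illustrative assumptions; the disclosure describes the general technique, not this particular policy.

```python
def make_frame_groups(frames, group_size):
    # Break a clip's frames into consecutive frame groups (subgroups of
    # frames that, together, make up the clip).
    return [frames[i:i + group_size] for i in range(0, len(frames), group_size)]

def reduce_fidelity(group, keep_every=2):
    # Lower a group's fidelity by removing entire frames (the disclosure
    # also mentions removing content from the audio/video tracks themselves).
    return group[::keep_every]

clip_frames = list(range(12))                         # stand-in for 12 frames
groups = make_frame_groups(clip_frames, group_size=4)  # three groups of four
# Keep the first group at full fidelity and thin out the later groups, so
# the groups end up with different respective quality levels.
streamable = [groups[0]] + [reduce_fidelity(g) for g in groups[1:]]
```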
FIG. 13 illustrates another example method 1300 associated with generating and sharing a digital media message 114. Similar to the method 900, the example method 1300 is illustrated as a collection of steps in a logical flow diagram, which represents operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the steps represent computer-executable instructions stored in memory. When such instructions are executed by one or more processors, such instructions may cause the processor, various components of a server computer, and/or the server computer, generally, to perform the recited operations. Such computer-executable instructions may include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described steps can be combined in any order and/or in parallel to implement the process. For discussion purposes, and unless otherwise specified, the method 1300 is described with reference to the environment 100 of FIG. 1 and, in particular, with reference to the server computer 102. - At
block 1302, the server computer 102 may receive various digital content and corresponding unique identifiers from one or more electronic devices 104 via the network 106. For example, as described above with respect to FIG. 9, the electronic device 104 may receive an input indicating selection of and/or otherwise corresponding to an image, a video, an audio segment, and/or any other digital content segment 112(1) stored in the memory 304 of the electronic device 104. For example, a user 116 may provide a touch and hold input via the display 402, and such an input may indicate that the user 116 has selected a digital image corresponding to a particular thumbnail 410(1) for inclusion in a digital media message 114. Additionally, receiving such an input may cause the electronic device 104 to capture a video segment, audio segment, photo, digital image, or other such digital content segment (e.g., a second digital content segment 112(2)) using one or more of the user interface devices 314. For example, the processor 302 may cause a microphone, digital camera, and/or other user interface 314 of the electronic device 104 to capture a digital content segment 112(2) in response to the input described above. - Moreover, as described with respect to
FIG. 9, the electronic device 104 may generate a first identifier that is unique to the digital content segment 112(1) and a second identifier that is unique to the digital content segment 112(2). For example, the media message engine 108 may cause the processor 302 and/or other components of the electronic device 104 to generate a unique series of numbers, letters, symbols, and/or other such identifiers. The electronic device 104 may link, couple, attach, and/or otherwise associate the first identifier 1210 with the digital content segment 112(1) in the memory 304. The electronic device 104 may also link, couple, attach, and/or otherwise associate the second identifier 1212 with the digital content segment 112(2) in the memory 304. - Further, the
electronic device 104 may provide one or more of the digital content segments 112(1), 112(2) described herein to the server computer 102 via the network 106. For example, the media message engine 108 may cause the processor 302, and/or one or more communication interfaces or other like hardware and/or software devices or components of the electronic device 104, to transfer at least one of the digital content segments 112(1), 112(2) from the electronic device 104 to the server computer 102. Additionally, the first identifier 1210 may be transferred in association with the digital content segment 112(1) (e.g., the digital image) when the digital content segment 112(1) is transferred by the electronic device 104 to the server computer 102, and the second identifier 1212 may be transferred in association with the digital content segment 112(2) (e.g., the audio segment) when the digital content segment 112(2) is transferred by the electronic device 104 to the server computer 102. Accordingly, at block 1302 the server computer 102 may receive the digital content segments 112(1), 112(2) and the identifiers 1210, 1212. - At
block 1304 the server computer 102 may receive an electronic file 1200 from the electronic device 104 via the network 106. For example, as noted above, the electronic device 104 may generate one or more electronic files 1200 providing a sequential clip listing associated with rendering the digital media message 114. An example electronic file 1200 may include a title 1202 of the digital media message 114 being generated. The electronic file 1200 may also include a time-based, frame-based, and/or otherwise sequential clip listing 1204 setting forth the content, components, and/or other parameters of each clip 1000 in the digital media message 114, and the sequential order in which each clip 1000 is to be rendered. The electronic file 1200 may further include the content identifiers 1210, 1212, a first indicator 1214 identifying a first frame (e.g., a start frame) of the digital media message 114 corresponding to a digital content segment 112 being rendered in the respective clip, and a second indicator 1216 identifying a second frame (e.g., an end frame) of the digital media message 114 corresponding to a digital content segment 112 being rendered in the respective clip. The electronic file 1200 may also include at least one additional indicator 1218 identifying a volume level corresponding to the digital content segment 112 being rendered in a respective clip 1000. The electronic device 104 may transfer such an electronic file 1200 to the server computer 102, and the server computer 102 may receive the electronic file 1200 at block 1304. - At
block 1306 the server computer 102 may receive instructions from the electronic device 104 via the network 106 to share the digital media message with an additional electronic device 118 via the network 106. Such instructions may include, for example, a third indicator (e.g., a telephone number, serial number, id number, and/or other like indicator) uniquely identifying the electronic device 118. For example, as noted above, the media message engine 108 may cause the processor 302, and/or one or more communication interfaces or other like hardware and/or software devices or components of the electronic device 104, to generate and send an instruction signal to the server computer 102 via the network 106. The server computer 102 may receive such a signal at block 1306. - At
block 1308 the server computer 102 may generate one or more additional electronic files using the various digital content segments 112 received from the electronic device 104. One or more such additional electronic files may also be generated by the server computer 102 based on the electronic file 1200 received from the electronic device 104. The server computer 102 may generate one or more such additional electronic files in response to receiving the instructions or in response to receiving the electronic file 1200. - For example, at
block 1308 the server computer 102 may generate a first additional electronic file comprising a plurality of frame groups formed from at least one of the digital content segments 112 provided by the electronic device 104 at block 910. In such examples, the first additional electronic file may comprise a plurality of frames of the digital media message 114, and may be optimized by the server computer 102 for the purpose of streaming the digital media message 114 from the server computer 102 to the electronic device 118 via the network 106. For example, in generating such a first additional electronic file, the server computer 102 may break up one or more of the clips 1000 identified by the electronic file 1200 into frame groups. As noted above, the various frame groups formed by the server computer 102 at block 1308 may have different respective quality levels. For example, a first frame group formed by the server computer 102 may have a first level of fidelity, and a second frame group formed by the server computer 102 may have a second level of fidelity different from the first level of fidelity. - Additionally or alternatively, at
block 1308 the server computer 102 may generate a second additional electronic file that is optimized for uploading by the server computer 102 to the electronic device 118 and/or for downloading by the electronic device 118. In such examples, the second additional electronic file may comprise a digitally compressed and/or otherwise modified version of one or more frames of the plurality of frames making up the digital media message 114. - At
block 1310 the server computer 102 may generate one or more signals and may provide one or more such signals to the electronic device 118 via at least one of a text message, an email, and/or a website, such as a social media website. Additionally, in such examples, the signal provided by the server computer 102 may include a request for permission associated with sharing the digital media message 114. For example, such a signal may request that a user 120 of the electronic device 118 provide permission to the server computer 102 for sharing the digital media message 114. - In response to requesting permission associated with sharing the
digital media message 114, the server computer 102 may, at block 1312, receive additional instructions from the electronic device 118 via the network 106. For example, in response to receiving the signal provided by the server computer 102, the user 120 of the electronic device 118 may grant permission to the server computer 102 to share the media message 114. In such examples, in response to receiving such additional instructions and/or such permission from the electronic device 118, the server computer 102 may, at block 1314, stream the first additional electronic file described above via the network 106. In such examples, the user 120 may view the digital media message 114 without downloading and/or otherwise receiving the digital content segments 112 utilized to render the digital media message 114. - Additionally or alternatively, in response to receiving such additional instructions and/or such permission from the
electronic device 118, the server computer 102 may, at block 1316, transfer the second additional electronic file described above to the electronic device 118 via the network 106. In such examples, the server computer 102 may upload the compressed versions of the digital content segments 112 to the electronic device 118 for local rendering and/or viewing on the electronic device 118. - In some examples, embodiments of the present disclosure may enable a
user 120 of the electronic device 118 to edit, alter, and/or otherwise modify a digital media message 114 generated by the user 116 of the device 104. For example, in embodiments in which the second additional electronic file is transferred to the electronic device 118 at block 1316, the electronic device 118 may render the digital media message 114 corresponding to the second additional electronic file locally. The user 120 may then add content, delete content, reorder content, and/or otherwise modify the digital media message 114 locally on the electronic device 118. Additionally or alternatively, the server computer 102 may transfer the master content (e.g., the original/native digital content segments 112 used to generate the digital media message 114) to the electronic device 118 for rendering and/or modification. In any of the example embodiments described herein, the electronic device 118 may generate a second electronic file similar to the electronic file 1200 described above with respect to FIG. 12. For example, such a second electronic file may comprise an additional EDL file, text file, data file, and/or other like file, and may include a time-based, frame-based, and/or otherwise sequential clip listing setting forth the content, components, and/or other parameters of each clip in the modified digital media message, and the sequential order in which each clip is to be rendered. The electronic device 118 may transfer the second electronic file to the server computer 102 via the network 106. Upon receiving the second electronic file from the electronic device 118, the server computer 102 may store the second electronic file with the first electronic file 1200 in the memory 204. In this way, the server computer 102 may preserve various editing decisions over time by saving the various EDLs generated by multiple users.
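The edit-history behavior just described, where the server keeps each EDL-like file a message accumulates, might be sketched as follows. The dictionary-keyed store and all field names are illustrative assumptions rather than structures named by the disclosure.

```python
# edl_history maps a message title to every EDL-like file received for that
# message, oldest first, so earlier editing decisions remain recoverable.
edl_history = {}

def store_edl(history, title, electronic_file):
    # Append the new file instead of overwriting the previous version.
    history.setdefault(title, []).append(electronic_file)

original = {"title": "My Message", "clips": [{"content_ids": ["a", "b"]}]}
revised = {"title": "My Message", "clips": [{"content_ids": ["a", "b", "c"]}]}
store_edl(edl_history, "My Message", original)  # file from the creating device
store_edl(edl_history, "My Message", revised)   # file from the recipient's device
```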
FIG. 14 illustrates an example user interface 1400 of the present disclosure configured to assist the user 120 in modifying a digital media message 114 rendered on the electronic device 118. In example embodiments, the user interface 1400 may include one or more controls configured to assist the user 120 in making further modifications to one or more of the digital content segments 112, the play sequence, and/or other components of the digital media message 114. For example, the user interface 1400 may include a control 1402 configured to enable the user 120 to add one or more images, videos, photos, audio clips, and/or other content to the digital media message 114. In example embodiments, the media message engine 108 may receive an input, such as a touch input, indicative of selection of the control 1402 by the user 120. In response to receiving such an input, the media message engine 108 may enable the user 120 to browse various photos, images, videos, and/or other content stored in the memory 304 and/or in the memory 204 of the server 102. Additionally and/or alternatively, in response to receiving such an input, the media message engine 108 may enable the user 120 to perform a web-based search, such as via one or more search engines or applications of the electronic device 118, for such content. The user 120 may be permitted to select one or more such content items for use. Upon selection of such a content item, the media message engine 108 may add the selected item to the play sequence of the digital media message 114 and/or may combine the selected item with one or more content segments 112 of the digital media message 114. Additionally, the user interface 1400 may include one or more controls 1404 configured to enable the user 120 to delete one or more audio clips, video clips, segments, files, and/or other content from the digital media message 114. - The
user interface 1400 may further include one or more controls 1406 configured to enable the user 120 to modify one or more of the digital content segments 112, the play sequence, and/or other components of the digital media message 114. Such controls 1406 may comprise, among other things, any audio, video, image, or other editing tools known in the art. In example embodiments, such controls 1406 may provide editing functionality enabling the user 120 to move, modify, augment, cut, paste, copy, save, or otherwise alter portions of each digital content segment 112 as part of generating a digital media message 114. Additionally, the control 1406 may enable the user 120 to cut, paste, draw, rotate, flip, shade, color, fade, darken, and/or otherwise modify various aspects of the digital media message 114 and/or various digital content segments 112 included in the play sequence thereof. In some embodiments, at least one of the controls 1402, 1404, 1406 may be similar to the controls 418 described above. The user interface 1400 may also include one or more navigation controls 1408 configured to enable the user 120 to save the modified digital media message 114 locally and/or at the server computer 102. In addition, the user interface 1400 or other user interfaces described herein may provide one or more additional controls operable to enable the user 120 of the device 118 to observe the creation of a digital media message 114 (by another user 116 on a separate electronic device 104) substantially in real time. Such functionality may be possible in examples in which the clips 1000, digital content segments 112, and/or other components of the digital media message 114 are transferred to the server computer 102 substantially in real time (such as at block 910) as the digital media message 114 is being created. - In summary, example embodiments of the present disclosure provide devices and methods for generating digital media messages as a means for communication between users in remote locations.
Such digital media messages include various combinations of audio, video, images, photos, and/or other digital content segments, and can be quickly and artfully created by each user with little effort. The methods of generating such a digital media message described herein may reduce the file size and/or memory requirements of such messages. As a result, a
digital media message 114 generated using the techniques described herein will require less memory than a corresponding digital media message generated by other methods. Such digital media messages will also enable the various networks 106, servers 102, and/or electronic devices described herein to transfer and/or render the digital media messages 114 more quickly and with fewer network, server, and/or device resources. As a result, such a reduction in file size and/or memory requirements will reduce overall network load/traffic and will improve network, server, and/or electronic device performance. - Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the claims.
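As a closing illustration, the electronic file 1200 and sequential clip listing 1204 described above might be modeled as follows. Every field name and value here is an illustrative assumption; the derived schedule shows how segments sharing a clip's frame range (cf. indicators 1214, 1216, 1218) render simultaneously.

```python
# Hypothetical in-memory form of an electronic file 1200: a title plus a
# sequential clip listing giving, for each clip, its content identifiers,
# start/end frames, and a volume level.
electronic_file = {
    "title": "My Message",
    "clips": [
        {"content_ids": ["img-1210", "aud-1212"], "start": 0, "end": 121, "volume": 1.0},
        {"content_ids": ["vid-xyz"], "start": 122, "end": 240, "volume": 0.8},
    ],
}

def render_schedule(electronic_file):
    """Walk the sequential clip listing and return (content_id, start, end,
    volume) tuples in playback order; segments listed in the same clip share
    one frame range and therefore render at the same time."""
    schedule = []
    for clip in electronic_file["clips"]:
        for content_id in clip["content_ids"]:
            schedule.append((content_id, clip["start"], clip["end"], clip["volume"]))
    return schedule

schedule = render_schedule(electronic_file)
```

Here the image and audio segment of the first clip both span frames 0 through 121, matching the simultaneous-rendering example discussed for FIG. 12.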
Claims (20)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/094,557 US20160226806A1 (en) | 2014-08-18 | 2016-04-08 | Digital media messages and files |
PCT/US2017/026234 WO2017176940A1 (en) | 2016-04-08 | 2017-04-05 | Digital media messages and files |
US15/683,120 US10735360B2 (en) | 2014-08-18 | 2017-08-22 | Digital media messages and files |
US16/909,870 US10992623B2 (en) | 2014-08-18 | 2020-06-23 | Digital media messages and files |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462038493P | 2014-08-18 | 2014-08-18 | |
US201462042114P | 2014-08-26 | 2014-08-26 | |
US14/569,169 US9973459B2 (en) | 2014-08-18 | 2014-12-12 | Digital media message generation |
US201562146045P | 2015-04-10 | 2015-04-10 | |
US14/683,779 US10037185B2 (en) | 2014-08-18 | 2015-04-10 | Digital media message generation |
US15/094,557 US20160226806A1 (en) | 2014-08-18 | 2016-04-08 | Digital media messages and files |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/683,779 Continuation-In-Part US10037185B2 (en) | 2014-08-18 | 2015-04-10 | Digital media message generation |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/683,120 Continuation US10735360B2 (en) | 2014-08-18 | 2017-08-22 | Digital media messages and files |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160226806A1 true US20160226806A1 (en) | 2016-08-04 |
Family
ID=56554905
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/094,557 Abandoned US20160226806A1 (en) | 2014-08-18 | 2016-04-08 | Digital media messages and files |
US15/683,120 Active 2035-10-11 US10735360B2 (en) | 2014-08-18 | 2017-08-22 | Digital media messages and files |
US16/909,870 Active US10992623B2 (en) | 2014-08-18 | 2020-06-23 | Digital media messages and files |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/683,120 Active 2035-10-11 US10735360B2 (en) | 2014-08-18 | 2017-08-22 | Digital media messages and files |
US16/909,870 Active US10992623B2 (en) | 2014-08-18 | 2020-06-23 | Digital media messages and files |
Country Status (1)
Country | Link |
---|---|
US (3) | US20160226806A1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10038657B2 (en) | 2014-08-18 | 2018-07-31 | Nightlight Systems Llc | Unscripted digital media message generation |
US10691408B2 (en) | 2014-08-18 | 2020-06-23 | Nightlight Systems Llc | Digital media message generation |
US10735361B2 (en) | 2014-08-18 | 2020-08-04 | Nightlight Systems Llc | Scripted digital media message generation |
US10735360B2 (en) | 2014-08-18 | 2020-08-04 | Nightlight Systems Llc | Digital media messages and files |
US11237708B2 (en) * | 2020-05-27 | 2022-02-01 | Bank Of America Corporation | Video previews for interactive videos using a markup language |
US11252274B2 (en) * | 2019-09-30 | 2022-02-15 | Snap Inc. | Messaging application sticker extensions |
US11461535B2 (en) * | 2020-05-27 | 2022-10-04 | Bank Of America Corporation | Video buffering for interactive videos using a markup language |
Family Cites Families (79)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7096358B2 (en) | 1998-05-07 | 2006-08-22 | Maz Technologies, Inc. | Encrypting file system |
US6538665B2 (en) | 1999-04-15 | 2003-03-25 | Apple Computer, Inc. | User interface for presenting media information |
US6624826B1 (en) | 1999-09-28 | 2003-09-23 | Ricoh Co., Ltd. | Method and apparatus for generating visual representations for audio documents |
JP2004508757A (en) | 2000-09-08 | 2004-03-18 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | A playback device that provides a color slider bar |
US7254775B2 (en) | 2001-10-03 | 2007-08-07 | 3M Innovative Properties Company | Touch panel system and method for distinguishing multiple touch inputs |
JP2003256461A (en) | 2002-03-04 | 2003-09-12 | Fuji Photo Film Co Ltd | Method and device for retrieving image, and program |
US6987221B2 (en) * | 2002-05-30 | 2006-01-17 | Microsoft Corporation | Auto playlist generation with multiple seed songs |
GB2391149B (en) | 2002-07-19 | 2005-10-26 | Autodesk Canada Inc | Processing scene objects |
US7519910B2 (en) | 2002-10-10 | 2009-04-14 | International Business Machines Corporation | Method for transferring files from one machine to another using adjacent desktop displays in a virtual network |
US7353282B2 (en) | 2002-11-25 | 2008-04-01 | Microsoft Corporation | Methods and systems for sharing a network resource with a user without current access |
US8108342B2 (en) | 2007-09-10 | 2012-01-31 | Robert Salinas | Methods and systems of content mobilization, mobile search, and video editing through a web interface |
US7209167B2 (en) | 2003-01-15 | 2007-04-24 | Hewlett-Packard Development Company, L.P. | Method and apparatus for capture of sensory data in association with image data |
ATE341894T1 (en) | 2003-10-15 | 2006-10-15 | Sony Ericsson Mobile Comm Ab | EVENT NOTIFICATION THROUGH MMS IN A PORTABLE COMMUNICATION DEVICE |
US7873911B2 (en) | 2004-08-31 | 2011-01-18 | Gopalakrishnan Kumar C | Methods for providing information services related to visual imagery |
IL173222A0 (en) | 2006-01-18 | 2006-06-11 | Clip In Touch Internat Ltd | Apparatus and method for creating and transmitting unique dynamically personalized multimedia messages |
US8683362B2 (en) | 2008-05-23 | 2014-03-25 | Qualcomm Incorporated | Card metaphor for activities in a computing device |
CN101485193B (en) | 2006-10-19 | 2012-04-18 | 松下电器产业株式会社 | Image generating device and image generating method |
US8819724B2 (en) | 2006-12-04 | 2014-08-26 | Qualcomm Incorporated | Systems, methods and apparatus for providing sequences of media segments and corresponding interactive data on a channel in a media distribution system |
US8054969B2 (en) | 2007-02-15 | 2011-11-08 | Avaya Inc. | Transmission of a digital message interspersed throughout a compressed information signal |
US20080229204A1 (en) | 2007-03-12 | 2008-09-18 | Brian David Johnson | Apparatus, System And Method For The Navigation Of Aggregated Content Using Skipping And Content Metadata |
USD621845S1 (en) | 2007-06-23 | 2010-08-17 | Apple Inc. | Graphical user interface for a display screen or portion thereof |
US8055640B2 (en) | 2007-07-02 | 2011-11-08 | Lg Electronics Inc. | System and method for transmitting multimedia contents |
CN101114260A (en) | 2007-08-24 | 2008-01-30 | 华南理工大学 | Intelligent demonstration equipment used for electric document demonstration |
US20090100068A1 (en) | 2007-10-15 | 2009-04-16 | Ravi Gauba | Digital content Management system |
USD597101S1 (en) | 2008-01-08 | 2009-07-28 | Apple Inc. | Animated image for a portion of a display screen |
US8155505B2 (en) * | 2008-06-06 | 2012-04-10 | Apple Inc. | Hybrid playlist |
US9390169B2 (en) | 2008-06-28 | 2016-07-12 | Apple Inc. | Annotation of movies |
US9032320B2 (en) | 2008-09-08 | 2015-05-12 | Disney Enterprises, Inc. | Time and location based GUI for accessing media |
US20100077289A1 (en) | 2008-09-08 | 2010-03-25 | Eastman Kodak Company | Method and Interface for Indexing Related Media From Multiple Sources |
AU2009301595A1 (en) | 2008-10-08 | 2010-04-15 | Jeremie Salvatore De Villiers | System and method for the automated customization of audio and video media |
US20100125791A1 (en) | 2008-11-14 | 2010-05-20 | Rebelvox, Llc | User interface for a telecommunication and multimedia management system and method |
WO2010058334A1 (en) | 2008-11-21 | 2010-05-27 | Koninklijke Philips Electronics N.V. | Merging of a video and still pictures of the same event, based on global motion vectors of this video |
US8862097B2 (en) | 2008-12-03 | 2014-10-14 | Entersekt International Limited | Secure transaction authentication |
CA2748309A1 (en) | 2008-12-23 | 2010-07-01 | Vericorder Technology Inc. | Digital media editing interface |
US8737815B2 (en) | 2009-01-23 | 2014-05-27 | The Talk Market, Inc. | Computer device, method, and graphical user interface for automating the digital transformation, enhancement, and editing of personal and professional videos |
US8290777B1 (en) | 2009-06-12 | 2012-10-16 | Amazon Technologies, Inc. | Synchronizing the playing and displaying of digital content |
US8736561B2 (en) | 2010-01-06 | 2014-05-27 | Apple Inc. | Device, method, and graphical user interface with content display modes and display rotation heuristics |
US8698762B2 (en) | 2010-01-06 | 2014-04-15 | Apple Inc. | Device, method, and graphical user interface for navigating and displaying content in context |
USD626138S1 (en) | 2010-02-04 | 2010-10-26 | Microsoft Corporation | User interface for a display screen |
US20110202835A1 (en) | 2010-02-13 | 2011-08-18 | Sony Ericsson Mobile Communications Ab | Item selection method for touch screen devices |
US8839118B2 (en) | 2010-06-30 | 2014-09-16 | Verizon Patent And Licensing Inc. | Users as actors in content |
US9619100B2 (en) | 2010-08-30 | 2017-04-11 | Nokia Technologies Oy | Method, apparatus, and computer program product for adapting a content segment based on an importance level |
US9037971B2 (en) | 2010-09-15 | 2015-05-19 | Verizon Patent And Licensing Inc. | Secondary audio content by users |
US20130212521A1 (en) | 2010-10-11 | 2013-08-15 | Teachscape, Inc. | Methods and systems for use with an evaluation workflow for an evidence-based evaluation |
EP2630594A1 (en) | 2010-10-19 | 2013-08-28 | Sony Ericsson Mobile Communications AB | System and method for generating a visual summary of previously viewed multimedia content |
US9646352B2 (en) | 2010-12-10 | 2017-05-09 | Quib, Inc. | Parallel echo version of media content for comment creation and delivery |
CN102651731B (en) | 2011-02-24 | 2016-06-29 | 腾讯科技(深圳)有限公司 | A kind of video broadcasting method and device thereof |
US9117483B2 (en) | 2011-06-03 | 2015-08-25 | Michael Edward Zaletel | Method and apparatus for dynamically recording, editing and combining multiple live video clips and still photographs into a finished composition |
JP2013038453A (en) | 2011-08-03 | 2013-02-21 | Sony Corp | Information processing apparatus and display method |
US9933935B2 (en) | 2011-08-26 | 2018-04-03 | Apple Inc. | Device, method, and graphical user interface for editing videos |
CN103988519B (en) | 2011-10-14 | 2018-06-05 | 谷歌有限责任公司 | Album Cover Art is created for Media Browser |
US8869068B2 (en) | 2011-11-22 | 2014-10-21 | Backplane, Inc. | Content sharing application utilizing radially-distributed menus |
US9960932B2 (en) | 2011-12-28 | 2018-05-01 | Evernote Corporation | Routing and accessing content provided by an authoring application |
USD701533S1 (en) | 2012-01-06 | 2014-03-25 | Samsung Electronics Co., Ltd. | Display screen portion with icon |
USD686238S1 (en) | 2012-01-09 | 2013-07-16 | Milyoni, Inc. | Display screen with a graphical user interface of a social network presentation system |
USD704205S1 (en) | 2012-02-22 | 2014-05-06 | Blackberry Limited | Display screen with a graphical user interface |
US9940629B2 (en) | 2012-04-18 | 2018-04-10 | Apple Inc. | Personalizing digital gifts |
US9832519B2 (en) | 2012-04-18 | 2017-11-28 | Scorpcast, Llc | Interactive video distribution system and video player utilizing a client server architecture |
US20130294746A1 (en) | 2012-05-01 | 2013-11-07 | Wochit, Inc. | System and method of generating multimedia content |
US9396758B2 (en) | 2012-05-01 | 2016-07-19 | Wochit, Inc. | Semi-automatic generation of multimedia content |
USD728590S1 (en) | 2012-05-02 | 2015-05-05 | Pantech Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD727931S1 (en) | 2012-05-02 | 2015-04-28 | Pantech Co., Ltd. | Display screen or portion thereof with graphical user interface |
US9449523B2 (en) | 2012-06-27 | 2016-09-20 | Apple Inc. | Systems and methods for narrating electronic books |
US8428453B1 (en) | 2012-08-08 | 2013-04-23 | Snapchat, Inc. | Single mode visual media capture |
US20140055633A1 (en) | 2012-08-27 | 2014-02-27 | Richard E. MARLIN | Device and method for photo and video capture |
US10552030B2 (en) | 2012-10-15 | 2020-02-04 | Kirusa, Inc. | Multi-gesture media recording system |
US20140163957A1 (en) | 2012-12-10 | 2014-06-12 | Rawllin International Inc. | Multimedia message having portions of media content based on interpretive meaning |
US20140163980A1 (en) | 2012-12-10 | 2014-06-12 | Rawllin International Inc. | Multimedia message having portions of media content with audio overlay |
WO2014100893A1 (en) | 2012-12-28 | 2014-07-03 | Jérémie Salvatore De Villiers | System and method for the automated customization of audio and video media |
US9323733B1 (en) | 2013-06-05 | 2016-04-26 | Google Inc. | Indexed electronic book annotations |
EP3036925A1 (en) | 2013-08-19 | 2016-06-29 | Doowapp Limited | Method and arrangement for processing and providing media content |
US20150092006A1 (en) | 2013-10-01 | 2015-04-02 | Filmstrip, Inc. | Image with audio conversation system and method utilizing a wearable mobile device |
US9977591B2 (en) * | 2013-10-01 | 2018-05-22 | Ambient Consulting, LLC | Image with audio conversation system and method |
US9426523B2 (en) | 2014-06-25 | 2016-08-23 | International Business Machines Corporation | Video composition by dynamic linking |
US20160226806A1 (en) | 2014-08-18 | 2016-08-04 | KnowMe Systems, Inc. | Digital media messages and files |
US10038657B2 (en) | 2014-08-18 | 2018-07-31 | Nightlight Systems Llc | Unscripted digital media message generation |
US20160048313A1 (en) | 2014-08-18 | 2016-02-18 | KnowMe Systems, Inc. | Scripted digital media message generation |
US10037185B2 (en) | 2014-08-18 | 2018-07-31 | Nightlight Systems Llc | Digital media message generation |
US9973459B2 (en) | 2014-08-18 | 2018-05-15 | Nightlight Systems Llc | Digital media message generation |
- 2016-04-08 US US15/094,557 patent/US20160226806A1/en not_active Abandoned
- 2017-08-22 US US15/683,120 patent/US10735360B2/en active Active
- 2020-06-23 US US16/909,870 patent/US10992623B2/en active Active
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10038657B2 (en) | 2014-08-18 | 2018-07-31 | Nightlight Systems Llc | Unscripted digital media message generation |
US10691408B2 (en) | 2014-08-18 | 2020-06-23 | Nightlight Systems Llc | Digital media message generation |
US10728197B2 (en) | 2014-08-18 | 2020-07-28 | Nightlight Systems Llc | Unscripted digital media message generation |
US10735361B2 (en) | 2014-08-18 | 2020-08-04 | Nightlight Systems Llc | Scripted digital media message generation |
US10735360B2 (en) | 2014-08-18 | 2020-08-04 | Nightlight Systems Llc | Digital media messages and files |
US10992623B2 (en) | 2014-08-18 | 2021-04-27 | Nightlight Systems Llc | Digital media messages and files |
US11082377B2 (en) | 2014-08-18 | 2021-08-03 | Nightlight Systems Llc | Scripted digital media message generation |
US11252274B2 (en) * | 2019-09-30 | 2022-02-15 | Snap Inc. | Messaging application sticker extensions |
US11616875B2 (en) * | 2019-09-30 | 2023-03-28 | Snap Inc. | Messaging application sticker extensions |
US11237708B2 (en) * | 2020-05-27 | 2022-02-01 | Bank Of America Corporation | Video previews for interactive videos using a markup language |
US11461535B2 (en) * | 2020-05-27 | 2022-10-04 | Bank Of America Corporation | Video buffering for interactive videos using a markup language |
US11481098B2 (en) | 2020-05-27 | 2022-10-25 | Bank Of America Corporation | Video previews for interactive videos using a markup language |
Also Published As
Publication number | Publication date |
---|---|
US20170353412A1 (en) | 2017-12-07 |
US20200322296A1 (en) | 2020-10-08 |
US10992623B2 (en) | 2021-04-27 |
US10735360B2 (en) | 2020-08-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10992623B2 (en) | Digital media messages and files | |
US10691408B2 (en) | Digital media message generation | |
US10728197B2 (en) | Unscripted digital media message generation | |
US11082377B2 (en) | Scripted digital media message generation | |
US9990349B2 (en) | Streaming data associated with cells in spreadsheets | |
US9977591B2 (en) | Image with audio conversation system and method | |
US9514157B2 (en) | Multi-dimensional browsing of content | |
US9753624B2 (en) | Non-destructive collaborative editing | |
US9973459B2 (en) | Digital media message generation | |
US20140376887A1 (en) | Mobile device video selection and edit | |
US20100299601A1 (en) | Configuring channels for sharing media | |
WO2016134415A1 (en) | Generation of combined videos | |
GB2500968A (en) | Extended applications of multimedia content previews in the cloud-based content management system | |
US20150092006A1 (en) | Image with audio conversation system and method utilizing a wearable mobile device | |
US9449646B2 (en) | Methods and systems for media file management | |
WO2017176940A1 (en) | Digital media messages and files | |
WO2015131700A1 (en) | File storage method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: KNOWME SYSTEMS, INC., NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WEIL, JOSEPH;MARTINEZ, WILLIAM JOSEPH;JARECKI, ANDREW;AND OTHERS;SIGNING DATES FROM 20160806 TO 20160922;REEL/FRAME:039845/0977 |
|
AS | Assignment |
Owner name: NIGHTLIGHT SYSTEMS LLC, DELAWARE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KNOWME SYSTEMS, INC.;REEL/FRAME:041332/0974 Effective date: 20160928 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |