US20170352379A1 - Video editing using mobile terminal and remote computer - Google Patents
- Publication number
- US20170352379A1 (application US 15/192,209)
- Authority
- US
- United States
- Prior art keywords
- video
- alpha
- visual effect
- format
- computer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/11—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
- G11B27/038—Cross-faders therefor
- G11B27/36—Monitoring, i.e. supervising the progress of recording or reproducing
Definitions
- the present disclosure relates to video editing. More specifically, the present disclosure relates to video editing using a mobile terminal and at least one remote computer.
- One aspect of the invention provides a method of video editing.
- the method comprises: providing a video editing mobile application on a mobile terminal, wherein the video editing mobile application does not have the capability of video editing to overlay a user selected visual effect video clip over a user selected user video for generating a single resulting video while the video editing mobile application offers video editing of adding the user selected visual effect video clip to the user selected user video by computing power of one or more remote computers, wherein the video editing mobile application comprises alpha-format still images corresponding to each of a plurality of visual effects offered therein; receiving a user command for selecting a first user video for editing; receiving at least one user command for adding a first one of the plurality of visual effects to the user selected first user video, wherein adding involves selecting the first visual effect, and selecting a first time window for adding the first visual effect within a time span of the first user video; providing a preview displaying a series of alpha-format still images over the first user video to emulate the first visual effect over the first user video without generating a single video clip in which the first visual effect overlays the first user video
- the command data identifies the first user video, identifies the first visual effect, and specifies the first time window for adding the first visual effect within the time span of the first user video. Adding further involves selecting at least one location for adding the first visual effect within a display area of the first user video, wherein the command data further specifies the at least one location for adding the first visual effect within the display of the first user video.
- the series of alpha-format still images comprises a first alpha-format still image and a second alpha-format still image immediately following the first alpha-format still image, wherein in the preview the series of alpha-format still images are displayed in sequence such that at a first point in time of the preview, the first alpha-format still image is displayed alone and that at a second point in time of the preview subsequent to the first point, the second alpha-format still image is displayed alone, wherein there is no overlap of two or more alpha-format still images at a given point in time of the preview.
- the series of alpha-format still images comprises a first alpha-format still image and a second alpha-format still image immediately following the first alpha-format still image, wherein in the preview the series of alpha-format still images are displayed in sequence such that at a first point in time of the preview, the first alpha-format still image is displayed alone and that at a second point in time of the preview subsequent to the first point, the first and second alpha-format still images are displayed together. At a third point in time of the preview subsequent to the second point, the second alpha-format still image is displayed alone.
- the mobile application comprises a visual effect library storing the series of alpha-format still images for the first visual effect, wherein the visual effect library does not store or comprise an alpha-format video clip for the first visual effect.
- a mobile terminal comprising a touch screen display, a memory and at least one processor
- the mobile terminal comprises video editing mobile application software stored in the memory for executing using the at least one processor
- the video editing mobile application software does not have the capability of video editing to overlay a user selected visual effect video clip over a user selected user video for generating a single resulting video while offering video editing of adding the user selected visual effect video clip to the user selected user video by computing power of one or more remote computers
- the video editing mobile application comprising alpha-format still images corresponding to each of a plurality of visual effects offered therein
- the video editing mobile application software configured: to receive a user command for selecting a first user video for editing; to receive at least one user command for adding a first one of the plurality of visual effects to the user selected first user video, wherein adding the first visual effect to the first user video involves selecting the first visual effect, and selecting a first time window for adding the first visual effect within a time span of the first user video; to provide a preview displaying a series of alpha-format still images over the first user video to emulate the first visual effect over the first user video without generating a single video clip in which the first visual effect overlays the first user video
- the command data identifies the first user video, identifies the first visual effect, specifies the first time window for adding the first visual effect within the time span of the first user video, and specifies the at least one location for adding the first visual effect within the display of the first user video. Adding further involves selecting at least one location for adding the first visual effect within a display area of the first user video, wherein the command data further specifies the at least one location for adding the first visual effect within the display of the first user video.
- the series of alpha-format still images comprises a first alpha-format still image and a second alpha-format still image immediately following the first alpha-format still image, wherein in the preview the series of alpha-format still images are displayed at a regular time interval in sequence such that at a first point in time of the preview, the first alpha-format still image is displayed alone and that at a second point in time of the preview subsequent to the first point, the second alpha-format still image is displayed alone, wherein there is no overlap of two or more alpha-format still images at a given point in time of the preview.
- the series of alpha-format still images comprises a first alpha-format still image and a second alpha-format still image immediately following the first alpha-format still image, wherein in the preview the series of alpha-format still images are displayed at a regular time interval in sequence such that at a first point in time of the preview, the first alpha-format still image is displayed alone and that at a second point in time of the preview subsequent to the first point, the first and second alpha-format still images are displayed together. At a third point in time of the preview subsequent to the second point, the second alpha-format still image is displayed alone.
- the mobile application comprises a visual effect library storing the series of alpha-format still images for the first visual effect, wherein the visual effect library does not store or comprise an alpha-format video clip for the first visual effect.
- FIG. 1 illustrates a video editing system according to embodiments.
- FIG. 2 illustrates a video editing preview on a mobile terminal and corresponding video editing on a server according to embodiments.
- FIG. 3 illustrates components of a video editing system according to embodiments.
- FIG. 4 illustrates a visual effect library on a mobile terminal and a corresponding visual effect library on a server according to embodiments.
- FIG. 5 illustrates superimposing visual effects over a user video according to embodiments.
- FIG. 6 illustrates a procedure of video editing according to embodiments.
- FIG. 7A illustrates an interface of mobile application when a user video is selected according to embodiments.
- FIG. 7B illustrates an interface of mobile application when a user navigates a user video according to embodiments.
- FIG. 7C illustrates an interface of mobile application when a user enters a command for adding a visual effect according to embodiments.
- FIG. 7D illustrates an interface of mobile application when a user enters a command for adding a visual effect according to embodiments.
- FIG. 7E illustrates an interface of mobile application when a visual effect is selected to be added to a user video according to embodiments.
- FIG. 8A illustrates an example timeline of displaying still images of visual effect according to embodiments.
- FIG. 8B illustrates an example timeline of displaying still images of visual effect according to embodiments.
- FIGS. 9A-9C illustrate setting locations of visual effect on a mobile application according to embodiments.
- FIGS. 10A-10C illustrate a preview of visual effect changing its locations according to embodiments.
- the present invention provides a video editing system and method that utilize at least one mobile terminal for user interface and at least one remote computer for editing user videos.
- a mobile terminal 100 is connected to a server 200 by a wired or wireless connection via the Internet or information network 300 .
- the mobile terminal 100 includes a mobile application for video editing.
- the server 200 includes software for editing user videos.
- a user enters video editing instructions to the mobile application.
- the mobile application of the mobile terminal 100 presents a preview for the user's review and confirmation.
- the mobile application of the mobile terminal 100 sends a video editing request to the server 200 .
- the server 200 performs editing of the user video and generates a resulting edited video.
- the resulting video 210 generated by the server 200 corresponds to the preview 110 presented on the mobile terminal 100 in a manner in which for each visual effect included in the preview 110 , the resulting video 210 includes a corresponding visual effect.
- the resulting video 210 is a single video clip superimposing the user video and at least one visual effect.
- the corresponding preview is not a single video clip but rather a visual representation of the user video along with still images that correspond to the at least one visual effect.
- the mobile terminal 100 stores still images of visual effects.
- the server 200 stores video clips that correspond to the still images stored in the mobile terminal's library.
- FIG. 3 illustrates components of the mobile terminal 100 and the server 200 of a video editing system according to embodiments.
- the mobile terminal 100 includes a mobile application 120 , a visual effect library 140 and a user video storage 160 .
- the server 200 includes video editing software 220 and a visual effect library 240 .
- the term “mobile terminal” refers to mobile consumer electronic devices, such as smartphones, tablet computers, laptop computers, wearable computing devices, and other mobile computing devices.
- the mobile terminal includes a display, a user input device, a memory and at least one processor for executing software.
- the mobile terminal includes a touch screen display although not limited thereto.
- the term “server” refers to one or more computers that are typically stationary rather than mobile, although not limited thereto.
- the server is at least one networked computer of a service provider for providing video editing services.
- the mobile application 120 is software installed on mobile terminal 100 and capable of accessing components of mobile terminal 100 for providing user interfaces for video editing.
- the mobile application 120 communicates with the video editing software 220 of the server 200 for delegating a video editing task to the server 200 .
- the mobile application 120 provides user interfaces for receiving user commands for video editing, providing a video editing preview, and playing an edited or resulting video from the server 200 .
- the visual effect library 140 is a data store of visual effects for use in the mobile application 120 .
- the visual effect library 140 stores a plurality of sets of still images, in which each set of still images represents one visual effect. As illustrated in FIG. 4 , the visual effect library 140 further stores information and data relating to the visual effects (A, BB, . . . AZ).
- the visual effect library 140 stores an identification (ID), an accompanying sound recording, the number of still images (image count), etc.
- the visual effect library 140 includes addresses or locations of the still images for each visual effect.
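The library records described above (ID, accompanying sound recording, image count, and still-image locations) might be modeled as follows. This is a minimal Python sketch; the field names, file paths, and the `VisualEffectEntry` type are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class VisualEffectEntry:
    """One record of the mobile-side visual effect library 140.

    Field names and path layout are hypothetical illustrations."""
    effect_id: str                    # identification (ID) of the visual effect
    image_count: int                  # number of still images for the effect
    image_paths: List[str]            # addresses/locations of the still images
    sound_path: Optional[str] = None  # accompanying sound recording, if any

# Hypothetical entry for visual effect "A" with ten still images A01-A10
effect_a = VisualEffectEntry(
    effect_id="A",
    image_count=10,
    image_paths=[f"effects/A/A{i:02d}.png" for i in range(1, 11)],
    sound_path="effects/A/A.m4a",
)
```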
- the term “visual effect” or “visual effects” refers to one or more visual objects for adding to a user video.
- the visual object may be stationary or moving on a screen.
- the visual object may be colored and translucent, but is not a filter applied to the full screen of frames of the user video.
- a visual effect may be accompanied by a sound recording.
- the set of still images for each visual effect consists of snapshots or frames of a corresponding visual effect video.
- each still image includes one or more non-transparent objects or portions on a transparent background, referred to as an alpha (α) format still image.
- the still images of a visual effect are to be overlaid over user video frames on the mobile application 120 in a preview without forming an integrated video.
- the user video storage 160 is a data store for user videos.
- the user videos stored in the user video storage 160 include videos captured at the mobile terminal 100 and/or videos downloaded from other sources.
- the video editing software 220 is software of the server 200 for performing video editing tasks based on a request from the mobile application 120 .
- Video editing by the video editing software 220 includes, among other things, combining a user video and at least one visual effect video such that the resulting video is in a single file and the visual effect video overlaps some frames of the user video.
- the visual effect library 240 is server-side data store of visual effects.
- the visual effects library 240 of the server 200 stores video clips for visual effects (A, BB, . . . , AZ), e.g., one video clip for a visual effect.
- the visual effect library 240 further stores information and data relating to the visual effects such as an identification (ID), a frame rate of the video clip (frame per second, fps), an accompanying sound recording, etc.
- Each video clip for a visual effect includes a transparent background and one or more non-transparent objects or portions, referred to as an alpha (α) format video clip.
- the frames of the video clip are to be integrated with user video frames to form a single edited video by video editing.
- an alpha (α) format that supports an alpha (α) channel for storing transparency information of each pixel is used for the visual effect video clip.
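The alpha (α) channel behavior described above can be illustrated with a per-pixel blend of an alpha-format effect image over a user-video frame. The sketch below assumes 8-bit channels and straight (non-premultiplied) alpha; the function name and conventions are hypothetical.

```python
def composite_pixel(effect_rgba, frame_rgb):
    """Alpha-blend one pixel of an alpha-format effect image over a
    user-video pixel.

    effect_rgba: (r, g, b, a) with a in [0, 255]; a == 0 is fully transparent.
    frame_rgb:   (r, g, b) pixel of the user video frame.
    """
    r, g, b, a = effect_rgba
    alpha = a / 255.0
    # out = alpha * foreground + (1 - alpha) * background, per channel
    return tuple(
        round(alpha * fg + (1.0 - alpha) * bg)
        for fg, bg in zip((r, g, b), frame_rgb)
    )

# Transparent-background pixels of the effect image leave the video unchanged,
# which is what lets the effect overlay only part of the display area.
```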
- each visual effect has still image data in the mobile terminal 100 and video data in the server 200 .
- for the visual effect A, for example, a series of still images (A 1 -A 10 ) are stored in the visual effect library 140 of the mobile terminal 100 , and a video clip (A) is stored in the visual effect library 240 .
- the video clip stored in the server 200 directly corresponds to the still images stored in the mobile terminal.
- the still images A 1 -A 10 are a subset of frames selected from the corresponding video of visual effect A.
- each still image is a snapshot or frame of the video clip or a modified or simplified version of the snapshot or frame.
- the number of still images is substantially less than the number of frames in the corresponding video (video frame count).
- the video frame count is 5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100 or 200 times greater than the image frame count for the visual effect.
- the ratio of the video frame count to the image frame count is in a range formed by any two numbers listed in the previous sentence.
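One plausible way to obtain such a reduced set of stills from a visual effect video clip is even subsampling of its frames. The selection rule below is an assumption for illustration; the patent does not specify how the still images are derived from the clip.

```python
def select_still_frames(video_frame_count, image_count):
    """Pick `image_count` evenly spaced frame indices out of a visual effect
    video clip, so the mobile-side library holds far fewer images than the
    server-side clip has frames (the selection rule is assumed)."""
    if image_count <= 0 or image_count > video_frame_count:
        raise ValueError("image_count must be between 1 and video_frame_count")
    step = video_frame_count / image_count
    return [int(i * step) for i in range(image_count)]

# e.g. a 300-frame clip reduced to 10 stills: a 30x frame-to-image ratio,
# within the 5x-200x range discussed above.
indices = select_still_frames(300, 10)
```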
- FIG. 5 illustrates an example timeline of video editing according to embodiments, in which a user video for editing runs from t 0 through t 6 .
- the visual effect A 520 is superimposed over the user video 510 from t 1 to t 3
- the visual effect B 530 is superimposed over the user video 510 from t 2 to t 5 .
- from t 2 to t 3 , both visual effect A and visual effect B are superimposed over the user video 510 .
- FIG. 6 illustrates an example procedure of video editing.
- a user activates the mobile application 120 on the mobile terminal 100 .
- the user selects a user video 510 .
- the user selects a visual effect and at 630 selects parameters for adding the visual effect to the selected user video.
- the user may add more than one visual effect as in FIG. 5 .
- the mobile application 120 plays a preview for the user's review and confirmation of adding the visual effects.
- in response to confirmation, at 650 , the mobile application 120 generates a video editing request for sending to the server 200 at 660 .
- the server 200 edits the user video in accordance with the request. Subsequently, at 680 , completion of the video editing is notified to the mobile application 120 , and at 690 the user may play the edited video on the mobile application 120 .
- the user selects a user video from user videos stored in the user video storage 160 .
- the mobile application 120 displays a scene of the user video 510 and provides a user interface for navigating a timeline of the selected user video 510 .
- the user may select a video for editing from the Internet or a network. Then, the mobile application 120 may download the selected user video or part of the selected video for displaying on its user interface.
- the user selects one or more visual effects to add to the user video.
- the mobile application 120 provides a user interface for selecting a visual effect from the visual effects available in the visual effect library of the mobile terminal 100 .
- the user sets one or more parameters for the selected visual effect A via the user interface of the mobile application 120 .
- the parameters include time frame (start-end), size, orientation, location within the screen, and display strengths (degree of transparency) of the visual effect A.
- the mobile application 120 saves the user selections and settings.
- a preview of video editing may be displayed at the user's request.
- the preview is a play of the selected user video along with the still images of the selected visual effects that are superimposed over the user video frames in accordance with the user's setting of the parameters.
- the preview is not an integrated video, in which the still images are incorporated or integrated with the user video.
- in other embodiments, at least part of the still images may be integrated with or incorporated into the user video to provide the preview. The user may approve the preview or go back to steps 620 and 630 for changes.
- the mobile application 120 generates and sends a video editing request to the server 200 at the user's command.
- the video editing request includes details of the user's selections and settings for video editing.
- the video editing request includes the selected user video, identification of selected visual effect and parameters for each visual effect.
- the video editing request includes identification or location information of the user video rather than including the user video data itself.
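As a concrete illustration, such a request might be serialized as a JSON payload like the following. The field names and structure are hypothetical; the patent specifies only what information the request carries (video identification, effect identification, and parameters), not its wire format.

```python
import json

# Hypothetical JSON payload for the video editing request; identifies the
# user video rather than embedding the video data itself.
editing_request = {
    "user_video": {"id": "video-510"},  # identification/location of the video
    "effects": [
        {
            "effect_id": "A",
            "time_window": {"start": 1.0, "end": 6.0},  # seconds in the video
            "locations": [                               # screen positions
                {"t": 1.0, "x": 0.25, "y": 0.40},
                {"t": 3.5, "x": 0.60, "y": 0.55},
            ],
            "size": 0.3,              # relative size within the display area
            "display_strength": 1.0,  # 1.0 = fully opaque
        }
    ],
}
payload = json.dumps(editing_request)
```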
- the video editing software 220 of the server 200 performs video editing in accordance with the video editing request from the mobile application 120 .
- the video editing software 220 retrieves the selected user video if needed. Also, the video editing software 220 retrieves the video clip for each visual effect identified in the video editing request. Then, the video editing at the server 200 relates to combining the video clip of the selected visual effect with the selected user video as specified by the parameters included in the request from the mobile application 120 .
- the video editing software 220 superimposes frames of the video clip of the selected visual effect over frames of the user video based on the timeframe specified in the video request. Specifically, in the video editing, a frame of the visual effect video clip and a frame of the user video are integrated as a single frame such that the video editing results in a single integrated video incorporating the visual effect from the video clip in the user video. To do the frame-by-frame integration, in some embodiments, the video editing software 220 may determine whether the frame rates of the user video and the visual effect video clip match and adjust the frame rate of the visual effect video clip to match the frame rate of the user video.
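The frame-rate adjustment mentioned above could, in the simplest case, be done by duplicating or dropping effect frames. The sketch below illustrates that idea under the assumption of constant frame rates; production editors may interpolate frames instead.

```python
def match_frame_rate(effect_frames, effect_fps, video_fps):
    """Resample a visual effect clip's frames so its frame rate matches the
    user video's, by duplicating or dropping frames (a simple sketch; the
    patent does not prescribe the resampling method)."""
    duration = len(effect_frames) / effect_fps        # clip length in seconds
    out_count = round(duration * video_fps)           # frames at target rate
    return [
        effect_frames[min(int(i * effect_fps / video_fps),
                          len(effect_frames) - 1)]
        for i in range(out_count)
    ]
```

After resampling, frame i of the effect clip lines up with frame i of the user video within the selected time window, so the two can be integrated frame by frame.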
- FIGS. 7A-7E illustrate example user interfaces of the mobile application 120 for video editing.
- a first window 810 of the mobile application 120 displays the starting frame (t 0 ) of the user video 510 .
- a second window 820 of the mobile application 120 displays a video time-bar 822 showing frames of the user video 510 .
- the mobile application 120 provides an indicator 824 indicating position of the current frame (scene) displayed in the first window 810 on the video time-bar 822 .
- the mobile application 120 graphically provides icons 830 representing video editing functions available in the mobile application. The icons 830 respectively represent color adjusting, adding background music, overlaying visual effects, and trimming.
- the mobile application 120 displays the first scene (t 1 ) of the user video 510 as the user moves the video time-bar 822 .
- FIGS. 7C-7E illustrate interfaces of the mobile application 120 when the user selects the visual effect A and selects parameters of the visual effect A at the steps of 620 and 630 .
- the mobile application 120 displays multiple icons 840 showing visual effect categories available when the user selects an icon 831 of overlaying visual effects among the icons 830 .
- the mobile application 120 displays multiple icons 843 representing visual effects in the selected category when the user selects an icon 842 representing heart-shape visual effects.
- the mobile application 120 displays an icon 860 indicating location and size of the visual effect A in the first window 810 over a scene of the user video 510 .
- the mobile application 120 displays a time line 850 for indicating start/end of the visual effect A.
- the timeline 850 is sized and aligned with the video time-bar 822 for indicating a corresponding portion of the user video 510 where the visual effect A will be combined.
- the mobile application 120 provides the icon of the Visual Effect A using at least one of the still images A 1 -A 10 .
- the user can set a starting point of the visual effect by moving the time line 850 relative to the time bar 822 of the user video.
- the user can adjust size of the visual effect A by dragging a size-adjusting mark 862 provided along with the icon 860 .
- the mobile application 120 provides a preview of the visual effect A over the user video 510 .
- FIG. 8A illustrates an example timeline of displaying still images of a visual effect A ( FIG. 5 ) in a preview.
- the visual effect A begins at t 1 and continues until t 3 (t 1 +5 seconds).
- each still image stays for 0.5 sec. in the preview such that each still image is presented along with multiple frames of the user video 510 .
- the first still image A 01 is displayed over the frames of the user video 510 in the same time segment.
- the second still image A 02 is displayed over the frames of the user video 510 in the same time segment.
- the time segment for each still image may change.
- the mobile application 120 displays the still images A 1 -A 10 one after another in sequence without overlapping with each other as in FIG. 8A .
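For the non-overlapping scheme of FIG. 8A, choosing which still image to overlay at a given preview time reduces to simple arithmetic, as this sketch shows. The function name and defaults are illustrative; the 0.5 s segment and ten images match the example above.

```python
def still_image_index(t, effect_start, segment=0.5, image_count=10):
    """Return which still image (0-based) to overlay at preview time `t`
    under the FIG. 8A scheme: each of the `image_count` stills is shown for
    one `segment`, one after another, with no two stills visible at once.
    Returns None outside the effect's time window."""
    if t < effect_start or t >= effect_start + segment * image_count:
        return None  # no overlay: the user video plays by itself
    return int((t - effect_start) / segment)
```

The preview layer simply draws the returned still over the current user-video frame, which emulates the effect without ever producing an integrated video file.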
- two or more still images may be displayed at a given time during preview.
- the transparency (display strength) of the object(s) included in each still image is constant and does not change over time during the time segment in which the particular still image is presented in the preview.
- the transparency (display strength) of the object(s) included in each still image changes over time during the time segment in which the particular still image is presented in the preview.
- each of the still images A 01 -A 10 is displayed at either 100% display strength or 0% display strength during display of the visual effect A in the preview.
- still image A 01 is at 100% during the first time segment and at 0% for the rest of time segments.
- Display strength of 100% is 0% transparency, which displays the object(s) of each still image as original.
- Display strength of 0% corresponds to 100% transparency, which results in no display of the object(s) of the visual effect.
- an alpha (α) format that supports an alpha (α) channel for storing transparency information of each pixel of the still images is used for changing display strength.
- the still image A 09 is displayed in the time segment between t 1 +4.0 and t 1 +5.0.
- the display strength of the still image A 09 gradually increases (fade-in) to its peak at t 1 +4.5 and then gradually decreases (fade-out) until t 1 +5.0.
- the still image A 09 is displayed together with the still image A 08 as the still image A 08 fades out.
- the still image A 09 is displayed together with the still image A 10 as the still image A 10 fades in.
- in some embodiments, the beginning of the still image A 10 may be delayed such that there is some time period in which only the still image A 09 is displayed as the visual effect along with the user video frames. Also, in other embodiments, the still image A 10 may begin before complete disappearance of the still image A 08 .
- the overlapping display and staggered display strengths, individually and in combination, are advantageous because they can generate smoother motion with a smaller number of still images than the on-off display strength scheme illustrated in FIG. 8A .
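The staggered fade-in/fade-out behavior of FIG. 8B can be modeled as a per-image strength curve. The linear (triangular) ramp below is an assumption; the patent describes only a gradual increase to a peak followed by a gradual decrease, with adjacent stills briefly shown together.

```python
def display_strength(t, image_index, effect_start, segment=0.5):
    """Display strength (0.0-1.0) of still image `image_index` (0-based) at
    preview time `t` under an overlapping fade scheme: each still fades in
    over one segment, peaks, then fades out over the next segment, so
    adjacent stills overlap. (The linear ramp shape is an assumption.)"""
    start = effect_start + image_index * segment  # fade-in begins
    peak = start + segment                        # full strength
    end = peak + segment                          # fully faded out
    if t <= start or t >= end:
        return 0.0
    if t <= peak:
        return (t - start) / segment              # fade-in
    return (end - t) / segment                    # fade-out
```

With `segment=0.5` and `effect_start=0.0`, image A 09 (index 8) rises from 4.0, peaks at 4.5, and fades out by 5.0, matching the example; at 4.25 it overlaps equally with the fading-out A 08.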
- FIGS. 9A to 9C illustrate setting different locations of the visual effect A at multiple points of the timeline of the user video.
- FIGS. 10A to 10C illustrate a preview of the visual effect A changing its locations along the timeline of the user video.
- visual effect icons 962 , 964 , 966 represent a visual representation of the visual effect A at corresponding points 952 , 954 , 956 in the timeline 850 .
- the user can adjust the location of the visual effect A at the point 952 by moving the visual effect icon 962 as in FIG. 9A .
- the user can change locations of the visual effect A at the points 954 , 956 in the timeline 850 by moving the icons 964 , 966 .
- for each of the points 952 , 954 , 956 , the mobile application 120 stores the location of the visual effect as part of the parameters for video editing.
- the mobile application 120 includes the stored locations in a request for server-side video editing.
- based on the request, the server 200 generates a resulting video in which the visual effect A moves along a trace connecting the multiple locations of the icons 962 , 964 , 966 .
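The trace connecting the user-set locations could be realized by interpolating between the stored keyframe locations. Linear interpolation is an assumed choice for illustration; the patent only states that the effect moves along a trace connecting the locations.

```python
def effect_location(t, keyframes):
    """Interpolate the visual effect's on-screen location at time `t` from
    the user-set locations: a time-sorted list of (time, x, y) tuples, one
    per point the user placed on the timeline. Before the first keyframe and
    after the last, the nearest keyframe's location is held."""
    if t <= keyframes[0][0]:
        return keyframes[0][1:]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1:]
    for (t0, x0, y0), (t1, x1, y1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)  # fraction of the way through segment
            return (x0 + w * (x1 - x0), y0 + w * (y1 - y0))
```

Both the preview (moving the still images) and the server-side editing (moving the video clip's frames) could evaluate such a function per frame to place the effect.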
- parameters of the visual effect A for overlaying include different settings of the visual effect A at two or more points in the timeline of the visual effect A.
- the mobile application 120 does not perform, by itself, video editing to combine visual effects with user videos because such a video editing task is too heavy for the limited computational power of the mobile terminal 100 . Instead, the mobile application 120 provides a preview of the video editing and delegates the video editing to the server 200 to take advantage of the computational power of the server 200 .
- a preview of visual effects is provided using still images representing the visual effects without using a video clip of the visual effects. The process of providing a video editing preview is not a simplified version of the corresponding video editing at the server because the mobile application 120 does not modify the user video or create a new video file incorporating the visual effects into the user video.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Television Signal Processing For Recording (AREA)
Description
- The present disclosure relates to video editing. More specifically, the present disclosure relates to video editing using a mobile terminal and at least one remote computer.
- People use smartphones to take and edit videos. Simple video editing like trimming can be done on smartphones. Smartphones may not have processing power or functions of more complex video editing like superimposing visual objects. More complex editing is typically performed using desktop computers or specialized systems.
- One aspect of the invention provides a method of video editing. The method comprises: providing a video editing mobile application on a mobile terminal, wherein the video editing mobile application does not have the capability of video editing to overlay a user selected visual effect video clip over a user selected user video for generating a single resulting video while the video editing mobile application offers video editing of adding the user selected visual effect video clip to the user selected user video by computing power of one or more remote computers, wherein the video editing mobile application comprises alpha-format still images corresponding to each of a plurality of visual effects offered therein; receiving a user command for selecting a first user video for editing; receiving at least one user command for adding a first one of the plurality of visual effects to the user selected first user video, wherein adding involves selecting the first visual effect, and selecting a first time window for adding the first visual effect within a time span of the first user video; providing a preview displaying a series of alpha-format still images over the first user video to emulate the first visual effect over the first user video without generating a single video clip in which the first visual effect overlays the first user video; in response to a user's confirmation of the preview, uploading, to at least one remote computer, command data for adding the first visual effect to the first user video at the first time window and at the at least one location such that the at least one remote computer performs video editing of combining a first alpha-format video clip for the first visual effect with the first user video in accordance with the command data to generate a single resulting video in which the first alpha-format video clip for the first visual effect overlays the first user video at the first time window and at the at least one location; and receiving 
the single resulting video for playing on the mobile terminal, wherein each still image of the series of alpha-format still images comprises a non-transparent visual effect with a transparent background, wherein the first alpha-format video clip comprises a non-transparent visual effect with a transparent background.
- In the above-described method, the command data identifies the first user video, identifies the first visual effect, and specifies the first time window for adding the first visual effect within the time span of the first user video. Adding further involves selecting at least one location for adding the first visual effect within a display area of the first user video, wherein the command data further specifies the at least one location for adding the first visual effect within the display of the first user video. The series of alpha-format still images comprises a first alpha-format still image and a second alpha-format still image immediately following the first alpha-format still image, wherein in the preview the series of alpha-format still images are displayed in sequence such that at a first point in time of the preview, the first alpha-format still image is displayed alone and that at a second point in time of the preview subsequent to the first point, the second alpha-format still image is displayed alone, wherein there is no overlap of two or more alpha-format still images at a given point in time of the preview. Still in the above-described method, the series of alpha-format still images comprises a first alpha-format still image and a second alpha-format still image immediately following the first alpha-format still image, wherein in the preview the series of alpha-format still images are displayed in sequence such that at a first point in time of the preview, the first alpha-format still image is displayed alone and that at a second point in time of the preview subsequent to the first point, the first and second alpha-format still images are displayed together. At a third point in time of the preview subsequent to the second point, the second alpha-format still image is displayed alone.
At the second point, display strength of the non-transparent visual effect of the first alpha-format still image is lower than display strength of the non-transparent visual effect of the first alpha-format still image displayed at the first point, such that the non-transparent visual effect of the first alpha-format still image fades out over time from the first point to the second point. The mobile application comprises a visual effect library storing the series of alpha-format still images for the first visual effect, wherein the visual effect library does not store or comprise an alpha-format video clip for the first visual effect.
- Another aspect of the invention provides a mobile terminal comprising a touch screen display, a memory and at least one processor, wherein the mobile terminal comprises video editing mobile application software stored in the memory for executing using the at least one processor, wherein the video editing mobile application software does not have the capability of video editing to overlay a user selected visual effect video clip over a user selected user video for generating a single resulting video while offering video editing of adding the user selected visual effect video clip to the user selected user video by computing power of one or more remote computers, the video editing mobile application comprising alpha-format still images corresponding to each of a plurality of visual effects offered therein, the video editing mobile application software configured: to receive a user command for selecting a first user video for editing; to receive at least one user command for adding a first one of the plurality of visual effects to the user selected first user video, wherein adding the first visual effect to the first user video involves selecting the first visual effect, and selecting a first time window for adding the first visual effect within a time span of the first user video; to provide a preview displaying a series of alpha-format still images over the first user video to emulate the first visual effect over the first user video without generating a single video clip in which the first visual effect overlays the first user video; in response to a user's confirmation of the preview, to upload, to at least one remote computer, command data for adding the first visual effect to the first user video at the first time window and at the at least one location such that the at least one remote computer performs video editing of combining a first alpha-format video clip for the first visual effect with the first user video in accordance with the command data to 
generate a single resulting video in which the first alpha-format video clip for the first visual effect overlays the first user video at the first time window and at the at least one location; and to receive the single resulting video for playing on the mobile terminal, wherein each still image of the series of alpha-format still images comprises a non-transparent visual effect with a transparent background, wherein the first alpha-format video clip comprises a non-transparent visual effect with a transparent background.
- In the above-described mobile terminal, the command data identifies the first user video, identifies the first visual effect, specifies the first time window for adding the first visual effect within the time span of the first user video, and specifies the at least one location for adding the first visual effect within the display of the first user video. Adding further involves selecting at least one location for adding the first visual effect within a display area of the first user video, wherein the command data further specifies the at least one location for adding the first visual effect within the display of the first user video. The series of alpha-format still images comprises a first alpha-format still image and a second alpha-format still image immediately following the first alpha-format still image, wherein in the preview the series of alpha-format still images are displayed at a regular time interval in sequence such that at a first point in time of the preview, the first alpha-format still image is displayed alone and that at a second point in time of the preview subsequent to the first point, the second alpha-format still image is displayed alone, wherein there is no overlap of two or more alpha-format still images at a given point in time of the preview.
- Still in the above-described mobile terminal, the series of alpha-format still images comprises a first alpha-format still image and a second alpha-format still image immediately following the first alpha-format still image, wherein in the preview the series of alpha-format still images are displayed at a regular time interval in sequence such that at a first point in time of the preview, the first alpha-format still image is displayed alone and that at a second point in time of the preview subsequent to the first point, the first and second alpha-format still images are displayed together. At a third point in time of the preview subsequent to the second point, the second alpha-format still image is displayed alone. At the second point, display strength of the non-transparent visual effect of the first alpha-format still image is lower than display strength of the non-transparent visual effect of the first alpha-format still image displayed at the first point, such that the non-transparent visual effect of the first alpha-format still image fades out over time from the first point to the second point. The mobile application comprises a visual effect library storing the series of alpha-format still images for the first visual effect, wherein the visual effect library does not store or comprise an alpha-format video clip for the first visual effect.
-
FIG. 1 illustrates a video editing system according to embodiments. -
FIG. 2 illustrates a video editing preview on a mobile terminal and corresponding video editing on a server according to embodiments. -
FIG. 3 illustrates components of a video editing system according to embodiments. -
FIG. 4 illustrates a visual effect library on a mobile terminal and a corresponding visual effect library on a server according to embodiments. -
FIG. 5 illustrates superimposing visual effects over a user video according to embodiments. -
FIG. 6 illustrates a procedure of video editing according to embodiments. -
FIG. 7A illustrates an interface of the mobile application when a user video is selected according to embodiments. -
FIG. 7B illustrates an interface of the mobile application when a user navigates a user video according to embodiments. -
FIG. 7C illustrates an interface of the mobile application when a user enters a command for adding a visual effect according to embodiments. -
FIG. 7D illustrates an interface of the mobile application when a user enters a command for adding a visual effect according to embodiments. -
FIG. 7E illustrates an interface of the mobile application when a visual effect is selected to be added to a user video according to embodiments. -
FIG. 8A illustrates an example timeline of displaying still images of a visual effect according to embodiments. -
FIG. 8B illustrates an example timeline of displaying still images of a visual effect according to embodiments. -
FIGS. 9A-9C illustrate setting locations of a visual effect on a mobile application according to embodiments. -
FIGS. 10A-10C illustrate a preview of a visual effect changing its locations according to embodiments. - The drawings are provided to illustrate examples and embodiments described herein and are not intended to limit the scope of the invention.
- Embodiments of the invention will now be described with reference to the accompanying drawings. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the invention.
- With the improvement of computing power of smartphones, editing videos can be performed on a smartphone. However, for better management of smartphone resources, and for more sophisticated video editing, it may be desirable to delegate video editing tasks to a computer having more resources and more video editing functionalities. The present invention provides a video editing system and method that utilize at least one mobile terminal for user interface and at least one remote computer for editing user videos.
- User Instructions on Mobile Terminal and Video Editing on Server
- Referring to
FIG. 1, a mobile terminal 100 is connected to a server 200 by wire or wirelessly via the Internet or an information network 300. The mobile terminal 100 includes a mobile application for video editing. The server 200 includes software for editing user videos. In embodiments, a user enters video editing instructions into the mobile application. The mobile application of the mobile terminal 100 presents a preview for the user's review and confirmation. In response to the user's confirmation for editing, the mobile application of the mobile terminal 100 sends a video editing request to the server 200. In response to the video editing request, the server 200 performs editing of the user video and generates a resulting edited video. - As illustrated in
FIG. 2, in embodiments, the resulting video 210 generated by the server 200 corresponds to the preview 110 presented on the mobile terminal 100 in a manner in which, for each visual effect included in the preview 110, the resulting video 210 includes a corresponding visual effect. The resulting video 210 is a single video clip superimposing the user video and at least one visual effect. On the other hand, the corresponding preview is not a single video clip but rather a visual representation of the user video along with still images that correspond to the at least one visual effect. In embodiments, to present the preview of video editing, the mobile terminal 100 stores still images of visual effects. On the other hand, the server 200 stores video clips that correspond to the still images stored in the mobile terminal's library. -
FIG. 3 illustrates components of the mobile terminal 100 and the server 200 of a video editing system according to embodiments. The mobile terminal 100 includes a mobile application 120, a visual effect library 140 and a user video storage 160. The server 200 includes video editing software 220 and a visual effect library 240. - In this disclosure, the term “mobile terminal” refers to mobile consumer electronic devices, such as smartphones, tablet computers, laptop computers, wearable computing devices, and other mobile computing devices. In embodiments, the mobile terminal includes a display, a user input device, a memory and at least one processor for executing software. In some embodiments, the mobile terminal includes a touch screen display, although not limited thereto.
- In this disclosure, the term “server” refers to one or more computers that are typically stationary rather than mobile, although not limited thereto. In some embodiments, the server is at least one networked computer of a service provider for providing video editing services.
- The
mobile application 120 is software installed on the mobile terminal 100 and capable of accessing components of the mobile terminal 100 for providing user interfaces for video editing. The mobile application 120 communicates with the video editing software 220 of the server 200 for delegating a video editing task to the server 200. The mobile application 120 provides user interfaces for receiving user commands for video editing, providing a video editing preview, and playing an edited or resulting video from the server 200. - The
visual effect library 140 is a data store of visual effects for use in the mobile application 120. In embodiments, the visual effect library 140 stores a plurality of sets of still images, in which each set of still images represents one visual effect. As illustrated in FIG. 4, the visual effect library 140 further stores information and data relating to the visual effects (A, BB, . . . AZ). For each visual effect, in embodiments, the visual effect library 140 stores an identification (ID), an accompanying sound recording, the number of still images (image count), etc. In other embodiments, the visual effect library 140 includes addresses or locations of the still images for each visual effect. - In the present disclosure, the term “visual effect” or “visual effects” refers to one or more visual objects for adding to a user video. The visual object may be stationary or moving on a screen. The visual object may be colored and translucent, but is not a filter applied to a full screen of frames of the user video. When added to the user video, a visual effect may be accompanied by a sound recording.
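- The per-effect records described above (an identification, an accompanying sound recording, and an image count or image locations) can be sketched as a simple data structure. The field names and file paths below are illustrative assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical sketch of one entry in the mobile-side visual effect
# library 140: an ID, an optional accompanying sound recording, the
# still-image count, and the locations of the alpha-format still images.
@dataclass
class VisualEffectEntry:
    effect_id: str                    # e.g. "A", "BB", "AZ"
    sound_file: Optional[str]         # accompanying sound recording, if any
    image_count: int                  # number of alpha-format still images
    image_paths: List[str] = field(default_factory=list)

# Example entry for the visual effect A with still images A01-A10.
effect_a = VisualEffectEntry(
    effect_id="A",
    sound_file="effects/A/sound.m4a",  # assumed path
    image_count=10,
    image_paths=[f"effects/A/A{i:02d}.png" for i in range(1, 11)],
)
print(effect_a.image_paths[0])  # effects/A/A01.png
```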
- A set of still images for each visual effect comprises snapshots or frames of a corresponding visual effect video. In embodiments, each still image includes one or more non-transparent objects or portions on a transparent background, referred to as an alpha (α) format still image. The still images of a visual effect are to be overlaid over user video frames on the
mobile application 120 in a preview without forming an integrated video. - The
user video storage 160 is a data store for user videos. In embodiments, the user videos stored in the user video storage 160 include videos captured at the mobile terminal 100 and/or videos downloaded from other sources. - The
video editing software 220 is software of the server 200 for performing video editing tasks based on a request from the mobile application 120. Video editing by the video editing software 220 involves, among others, combining a user video and at least one visual effect video such that the resulting video is in a single file and the visual effect video overlaps some frames of the user video. - The
visual effect library 240 is a server-side data store of visual effects. In embodiments, the visual effect library 240 of the server 200 stores video clips for visual effects (A, BB, . . . , AZ), e.g., one video clip for a visual effect. As illustrated in FIG. 4, the visual effect library 240 further stores information and data relating to the visual effects, such as an identification (ID), a frame rate of the video clip (frames per second, fps), an accompanying sound recording, etc. - Each video clip for a visual effect includes a transparent background and one or more non-transparent objects or portions, referred to as an alpha (α) format video clip. The frames of the video clip are to be integrated with user video frames to form a single edited video by video editing. In embodiments, an alpha (α) format that supports an alpha (α) channel for storing transparency information of each pixel is used for the visual effect video clip.
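- The server-side integration of an alpha-format frame with a user video frame can be illustrated with the standard per-pixel alpha compositing ("over") formula. The sketch below is a generic illustration of that formula, assuming 8-bit RGB pixels and a 0.0-1.0 alpha value; it is not the patent's actual implementation.

```python
# Blend one pixel of an alpha-format effect frame over one pixel of a
# user video frame: result = alpha * effect + (1 - alpha) * video.
def composite_pixel(effect_rgb, effect_alpha, video_rgb):
    return tuple(
        round(effect_alpha * e + (1.0 - effect_alpha) * v)
        for e, v in zip(effect_rgb, video_rgb)
    )

# A fully opaque effect pixel replaces the video pixel, a fully
# transparent (background) pixel leaves the video pixel unchanged,
# and an intermediate alpha blends the two.
print(composite_pixel((255, 0, 0), 1.0, (10, 20, 30)))   # (255, 0, 0)
print(composite_pixel((255, 0, 0), 0.0, (10, 20, 30)))   # (10, 20, 30)
print(composite_pixel((200, 0, 0), 0.5, (100, 50, 30)))  # (150, 25, 15)
```

Because the still images and video clips store per-pixel transparency in the alpha channel, the same formula covers both the translucent-object case and the transparent-background case.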
- Referring to
FIG. 4, in embodiments, each visual effect has still image data in the mobile terminal 100 and video data in the server 200. With regard to the visual effect A, for example, a series of still images (A1-A10) are stored in the visual effect library 140 of the mobile terminal 100, and a video clip (A) is stored in the visual effect library 240. - For each visual effect, the video clip stored in the
server 200 directly corresponds to the still images stored in the mobile terminal. In some embodiments, the still images A1-A10 are a subset of frames selected from the corresponding video of visual effect A. In some embodiments, each still image is a snapshot or frame of the video clip, or a modified or simplified version of the snapshot or frame. In embodiments, for each visual effect, the number of still images (image frame count) is substantially less than the number of frames in the corresponding video (video frame count). For example, the video frame count may be 5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100 or 200 times greater than the image frame count for the visual effect. In embodiments, the ratio of the video frame count to the image frame count is in a range formed by any two numbers listed in the previous sentence. -
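Selecting the still images as an evenly spaced subset of the video clip's frames can be sketched as follows. The sampling strategy and counts are illustrative assumptions; with a 300-frame clip and 10 still images the ratio is 30, within the range described above.

```python
# Pick image_frame_count evenly spaced frame indices out of the
# video_frame_count frames of the visual effect video clip.
def select_still_frames(video_frame_count, image_frame_count):
    step = video_frame_count / image_frame_count
    return [int(i * step) for i in range(image_frame_count)]

print(select_still_frames(300, 10))
# [0, 30, 60, 90, 120, 150, 180, 210, 240, 270]
```
-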
FIG. 5 illustrates an example timeline of video editing according to embodiments, in which a user video for editing runs from t0 through t6. In the illustrated example, the visual effect A 520 is superimposed over the user video 510 from t1 to t3, and the visual effect B 530 is superimposed over the user video 510 from t2 to t5. In the example, between t2 and t3, both visual effect A and visual effect B are superimposed with the user video 510. -
FIG. 6 illustrates an example procedure of video editing. First, a user activates the mobile application 120 on the mobile terminal 100. Then, at 610, the user selects a user video 510. Subsequently, at 620 the user selects a visual effect and at 630 selects parameters for adding the visual effect to the selected user video. In embodiments, the user may add more than one visual effect as in FIG. 5. Subsequently, at 640 the mobile application 120 plays a preview for the user's review and confirmation of adding the visual effects. In response to confirmation, at 650, the mobile application generates a video editing request for sending to the server 200 at 660. In response to the video editing request, at 670 the server 200 edits the user video in accordance with the request. Subsequently, at 680 completion of the video editing is notified to the mobile application 120, and at 690 the user may play the edited video on the mobile application 120. - At 610 the user selects a user video from user videos stored in the
user video storage 160. In response, the mobile application 120 displays a scene of the user video 510 and provides a user interface for navigating a timeline of the selected user video 510. As an alternative to selecting one from the user video storage 160, the user may select a video for editing from the Internet or a network. Then, the mobile application 120 may download the selected user video or part of the selected video for displaying on its user interface. - At 620, the user selects one or more visual effects to add to the user video. In embodiments, the
mobile application 120 provides a user interface for selecting a visual effect from the visual effects available in the visual effect library of the mobile terminal 100. Subsequent to selection of each visual effect, e.g. visual effect A, at 630, the user sets one or more parameters for the selected visual effect A via the user interface of the mobile application 120. In embodiments, the parameters include time frame (start-end), size, orientation, location within the screen, and display strength (degree of transparency) of the visual effect A. As the user selects visual effects and their parameters, the mobile application 120 saves the user selections and settings. - Subsequently, at 640, a preview of video editing may be displayed at the user's request. In embodiments, the preview is a play of the selected user video along with the still images of the selected visual effects that are superimposed over the user video frames in accordance with the user's setting of the parameters. In embodiments, the preview is not an integrated video, in which the still images are incorporated or integrated with the user video. In other embodiments, at least part of the still images may be integrated with or incorporated into the user video to provide the preview. The user may approve the preview or go back to
steps 620 and 630. - Subsequently, at 650, the
mobile application 120 generates and sends a video editing request to the server 200 at the user's command. The video editing request includes details of the user's selections and settings for video editing. In embodiments, the video editing request includes the selected user video, identification of each selected visual effect, and parameters for each visual effect. In some embodiments, the video editing request includes identification or location information of the user video rather than including the user video data itself. - At 670, the
video editing software 220 of the server 200 performs video editing in accordance with the video editing request from the mobile application 120. The video editing software 220 retrieves the selected user video if needed. Also, the video editing software 220 retrieves the video clip for each visual effect identified in the video editing request. Then, the video editing at the server 200 combines the video clip of each selected visual effect with the selected user video as specified by the parameters included in the request from the mobile application 120. - During video editing, the
video editing software 220 superimposes frames of the video clip of the selected visual effect over frames of the user video based on the time frame specified in the video editing request. Specifically, in the video editing, a frame of the visual effect video clip and a frame of the user video are integrated as a single frame, such that the video editing results in a single integrated video incorporating the visual effect from the video clip in the user video. For the frame-by-frame integration, in some embodiments, the video editing software 220 may determine whether the frame rates of the user video and the visual effect video clip match, and adjust the frame rate of the visual effect video clip to match the frame rate of the user video. -
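The frame-rate adjustment can be sketched as nearest-frame resampling: for each user video frame in the effect's time window, pick the effect-clip frame closest in time. This is one simple strategy, assumed for illustration; the disclosure does not prescribe a specific resampling method.

```python
# Map a user-video frame index (counted from the start of the effect's
# time window) to a frame index of the visual effect video clip,
# clamped to the clip's last frame.
def effect_frame_index(user_frame_index, user_fps, effect_fps, effect_frame_count):
    t = user_frame_index / user_fps              # elapsed time in the window
    return min(int(t * effect_fps), effect_frame_count - 1)

# A 30 fps user video with a 15 fps effect clip: every two consecutive
# user frames share one effect frame.
print([effect_frame_index(i, 30, 15, 100) for i in range(6)])  # [0, 0, 1, 1, 2, 2]
```
-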
FIGS. 7A-7E illustrate example user interfaces of the mobile application 120 for video editing. Referring to FIG. 7A, a first window 810 of the mobile application 120 displays the starting frame (t0) of the user video 510. A second window 820 of the mobile application 120 displays a video time-bar 822 showing frames of the user video 510. In the second window 820, the mobile application 120 provides an indicator 824 indicating the position of the current frame (scene) displayed in the first window 810 on the video time-bar 822. The mobile application 120 graphically provides icons 830 representing the video editing functions available in the mobile application. Each of the icons 830 represents one of color adjusting, adding background music, overlaying visual effects, and trimming. In FIG. 7B, the mobile application 120 displays the first scene (t1) of the user video 510 as the user moves the video time-bar 822. -
FIGS. 7C-7E illustrate interfaces of the mobile application 120 when the user selects the visual effect A and selects parameters of the visual effect A at steps 620 and 630. In FIG. 7C, the mobile application 120 displays multiple icons 840 showing the visual effect categories available when the user selects an icon 831 for overlaying visual effects among the icons 830. In FIG. 7D, the mobile application 120 displays multiple icons 843 representing visual effects in the selected category when the user selects an icon 842 representing heart-shape visual effects. - When the user selects the icon 844 (representing the Visual Effect A 520), the
mobile application 120 displays an icon 860 indicating the location and size of the visual effect A in the first window 810 over a scene of the user video 510. The mobile application 120 displays a timeline 850 for indicating the start/end of the visual effect A. The timeline 850 is sized and aligned with the video time-bar 822 for indicating a corresponding portion of the user video 510 where the visual effect A will be combined. In embodiments, the mobile application 120 provides the icon of the Visual Effect A using at least one of the still images A1-A10. In embodiments, the user can set a starting point of the visual effect by moving the timeline 850 relative to the time-bar 822 of the user video. In some embodiments, the user can adjust the size of the visual effect A by dragging a size-adjusting mark 862 provided along with the icon 860. When the user selects the confirm icon 870, the mobile application 120 provides a preview of the visual effect A over the user video 510. - Presentation of Visual Effect in Preview
-
FIG. 8A illustrates an example timeline of displaying still images of the visual effect A (FIG. 5) in a preview. The visual effect A begins at t1 and continues until t3 (t1+5 seconds). In the illustrated example of FIG. 8A, each still image stays for 0.5 sec. in the preview such that each still image is presented along with multiple frames of the user video 510. Specifically, in the first time segment from t1 to t1+0.5, the first still image A01 is displayed over the frames of the user video 510 in the same time segment. For the following segment, from t1+0.5 to t1+1.0, the second still image A02 is displayed over the frames of the user video 510 in the same time segment. In other embodiments, based on user settings and/or input, the time segment for each still image may change. - In some embodiments, the
mobile application 120 displays the still images A1-A10 one after another in sequence without overlapping with each other, as in FIG. 8A. In other embodiments, as in FIG. 8B, two or more still images may be displayed at a given time during the preview. In some embodiments, the transparency (display strength) of the object(s) included in each still image is constant and does not change over time during the time segment in which the particular still image is presented in the preview. In other embodiments, the transparency (display strength) of the object(s) included in each still image changes over time during the time segment in which the particular still image is presented in the preview. - In the example of
FIG. 8A, each of the still images A01-A10 is displayed at either 100% display strength or 0% display strength during display of the visual effect A in the preview. For example, still image A01 is at 100% during the first time segment and at 0% for the rest of the time segments. Display strength of 100% is 0% transparency, which displays the object(s) of each still image as original. Display strength of 0% corresponds to 100% transparency, which results in no display of the object(s) of the visual effect. In embodiments, an alpha (α) format that supports an alpha (α) channel for storing transparency information of each pixel of the still images is used for changing display strength. - In the example of
FIG. 8B, the still image A09 is displayed in the time segment between t1+4.0 and t1+5.0. The display strength of the still image A09 gradually increases (fade-in) to its peak at t1+4.5 and then gradually decreases (fade-out) until t1+5.0. From t1+4.0 to t1+4.5, the still image A09 is displayed together with the still image A08 as the still image A08 fades out. Similarly, from t1+4.5 to t1+5.0, the still image A09 is displayed together with the still image A10 as the still image A10 fades in. In the illustrated example, while the ending of still image A08 and the beginning of still image A10 coincide at the time t1+4.5, in other embodiments, the beginning of still image A10 may be delayed such that there is some time period in which only the still image A09 is displayed as the visual effect along with the user video frames. Also, in other embodiments, the still image A10 may begin before complete disappearance of the still image A08. The overlapping display and staggered display strengths, individually and in combination, are advantageous as the display of these still images can generate smoother motion with a smaller number of still images than the on-off display strength illustrated in FIG. 8A. -
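The two preview schedules can be sketched together: the FIG. 8A on-off schedule maps elapsed preview time to exactly one still image per 0.5 s segment, and the FIG. 8B cross-fade ramps each image's display strength up to a peak and back down so that adjacent images overlap while one fades out and the next fades in. The linear (triangular) ramp is an assumed profile for illustration.

```python
# FIG. 8A schedule: one still image (1-based index, A01..A10) per fixed
# 0.5 s segment of the 5 s effect window; None outside the window.
def still_image_for(elapsed, segment=0.5, image_count=10):
    if elapsed < 0 or elapsed >= segment * image_count:
        return None
    return int(elapsed // segment) + 1

# FIG. 8B cross-fade: display strength rises linearly to 100% at the
# image's peak time and falls back to 0%, so images with peaks spaced
# half_width apart overlap during their fade-out/fade-in.
def display_strength(elapsed, peak, half_width=0.5):
    d = abs(elapsed - peak)
    return 0.0 if d >= half_width else 1.0 - d / half_width

print(still_image_for(0.2))  # 1  -> A01 in the first segment
print(still_image_for(4.9))  # 10 -> A10 in the last segment
# Midway between the peaks of A08 (at 4.0) and A09 (at 4.5), both are
# shown at 50% strength.
print(display_strength(4.25, 4.0), display_strength(4.25, 4.5))  # 0.5 0.5
```
-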
FIGS. 9A to 9C illustrate setting different locations of the visual effect A at multiple points of the timeline of the user video. FIGS. 10A to 10C illustrate a preview of the visual effect A changing its locations along the timeline of the user video. Referring to FIGS. 9A to 9C, visual effect icons are displayed at corresponding points of the timeline 850. The user can adjust the location of the visual effect A at the point 952 by moving the visual effect icon 962 as in FIG. 9A. Similarly, the user can change the locations of the visual effect A at the other points of the timeline 850 by moving the corresponding icons. For each of the points, the mobile application 120 stores the location of the visual effect as part of the parameters for video editing. The mobile application 120 includes the stored locations in a request for server-side video editing. Based on the request, the server 200 generates a resulting video in which the visual effect A moves along a trace connecting the multiple locations of 962, 964, 966. In some embodiments, parameters of the visual effect A for overlaying include different settings of the visual effect A at two or more points in the timeline of the visual effect A. - Mobile Application Not Performing Video Editing by Itself
- In embodiments, mobile'
application 120 does not perform, by itself, a video editing to combine visual effects to user videos because such video editing task is too heavy for limited computational power of themobile terminal 100. Instead, themobile application 120 provide of a video editing and delegates the video editing to theserver 200 to take advantage of computational power of theserver 200. In embodiments, a preview of visual effects is provided using still-images representing the visual effects without using a video clip of the visual effects. Process of providing a video editing preview is not a simplified version of corresponding video editing at the server because themobile application 120 does not modify the user video or create a new video file incorporating the visual effects to the user video. - Although the invention has been disclosed in the context of certain embodiments and examples, it will be understood by those skilled in the art that various features and aspects of the present invention extend beyond the specifically disclosed embodiments to other alternative embodiments. In addition, while a number of variations have been shown and described in detail, other modifications, which are within the scope of the invention, will be readily apparent to those of skill in the art based upon this disclosure. It is also contemplated that various combinations or sub-combinations of the specific features and aspects of the embodiments may be made and still fall within the invention. Accordingly, it should be understood that various features and aspects of the disclosed embodiments can be combined with or substituted for one another in order to form varying modes of the disclosed invention. 
Thus, it is intended that the scope of the present invention herein disclosed should not be limited by the particular disclosed embodiments described above, and that various changes in form and details may be made without departing from the spirit and scope of the present disclosure as set forth in the following claims.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/192,209 US9852768B1 (en) | 2016-06-03 | 2016-06-24 | Video editing using mobile terminal and remote computer |
PCT/IB2017/000833 WO2017208080A1 (en) | 2016-06-03 | 2017-06-02 | Video editing using mobile terminal and remote computer |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/173,586 US9773524B1 (en) | 2016-06-03 | 2016-06-03 | Video editing using mobile terminal and remote computer |
US15/192,209 US9852768B1 (en) | 2016-06-03 | 2016-06-24 | Video editing using mobile terminal and remote computer |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/173,586 Continuation US9773524B1 (en) | 2016-06-03 | 2016-06-03 | Video editing using mobile terminal and remote computer |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170352379A1 true US20170352379A1 (en) | 2017-12-07 |
US9852768B1 US9852768B1 (en) | 2017-12-26 |
Family
ID=60479195
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/192,209 Expired - Fee Related US9852768B1 (en) | 2016-06-03 | 2016-06-24 | Video editing using mobile terminal and remote computer |
Country Status (2)
Country | Link |
---|---|
US (1) | US9852768B1 (en) |
WO (1) | WO2017208080A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111010591B (en) * | 2019-12-05 | 2021-09-17 | 北京中网易企秀科技有限公司 | Video editing method, browser and server |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5546518A (en) * | 1995-01-06 | 1996-08-13 | Microsoft Corporation | System and method for composing a display frame of multiple layered graphic sprites |
KR20040065479A (en) * | 2003-01-14 | 2004-07-22 | 삼성전자주식회사 | System and method for editing multimedia file using internet |
US20060277209A1 (en) * | 2005-06-06 | 2006-12-07 | Javaground Usa, Inc. | Efficient and automatic software application development system for wireless devices |
WO2007137240A2 (en) * | 2006-05-21 | 2007-11-29 | Motionphoto, Inc. | Methods and apparatus for remote motion graphics authoring |
US20080046925A1 (en) * | 2006-08-17 | 2008-02-21 | Microsoft Corporation | Temporal and spatial in-video marking, indexing, and searching |
US20080177630A1 (en) * | 2007-01-19 | 2008-07-24 | Babak Maghfourian | Method apparatus, system, media, and signals for billing a sponsor of an object link in interactive sequenced media |
EP2111588A2 (en) * | 2007-02-13 | 2009-10-28 | Nidvid, Inc. | Media editing system and method |
JP5237174B2 (en) * | 2009-04-09 | 2013-07-17 | Kddi株式会社 | Content editing method, content server, system, and program for editing original content by portable terminal |
US8818172B2 (en) * | 2009-04-14 | 2014-08-26 | Avid Technology, Inc. | Multi-user remote video editing |
US9270927B2 (en) * | 2010-06-22 | 2016-02-23 | New Blue, Inc. | System and method for distributed media personalization |
KR20120027563A (en) * | 2010-09-13 | 2012-03-22 | 유비벨록스(주) | Remote processing service method and system |
KR101191776B1 (en) | 2010-10-15 | 2012-12-20 | 주식회사 디엔디엔 | System and method for managing car parking |
US20120251080A1 (en) | 2011-03-29 | 2012-10-04 | Svendsen Jostein | Multi-layer timeline content compilation systems and methods |
EP2766816A4 (en) * | 2011-10-10 | 2016-01-27 | Vivoom Inc | Network-based rendering and steering of visual effects |
KR20130116644A (en) | 2012-04-16 | 2013-10-24 | 주식회사 케이티 | Method and apparatus for providing services of social interactive video |
US9710950B2 (en) * | 2012-04-27 | 2017-07-18 | Adobe Systems Incorporated | Extensible sprite sheet generation mechanism for declarative data formats and animation sequence formats |
KR20140017303A (en) | 2012-07-31 | 2014-02-11 | 삼성전기주식회사 | Apparatus and method for testing printed circuit board |
US9591347B2 (en) | 2012-10-31 | 2017-03-07 | Google Inc. | Displaying simulated media content item enhancements on mobile devices |
US8745500B1 (en) * | 2012-12-10 | 2014-06-03 | VMIX Media, Inc. | Video editing, enhancement and distribution platform for touch screen computing devices |
WO2014115147A1 (en) | 2013-01-24 | 2014-07-31 | Telesofia Medical Ltd. | System and method for flexible video construction |
US20150050009A1 (en) * | 2013-08-13 | 2015-02-19 | Wevideo, Inc. | Texture-based online multimedia editing |
KR101528312B1 (en) | 2014-02-14 | 2015-06-11 | 주식회사 케이티 | Method for editing video and apparatus therefor |
US9509827B2 (en) | 2014-03-12 | 2016-11-29 | Intel IP Corporation | Apparatus, system and method of managing at a mobile device execution of an application by a computing device |
EP3127118A4 (en) | 2014-03-31 | 2017-12-06 | GoPro, Inc. | Distributed video processing and selective video upload in a cloud environment |
US20150301708A1 (en) * | 2014-04-21 | 2015-10-22 | VMIX Media, Inc. | Video Editing Graphical User Interface |
- 2016-06-24: US application US15/192,209 filed; granted as US9852768B1 (not active, Expired - Fee Related)
- 2017-06-02: WO application PCT/IB2017/000833 filed; published as WO2017208080A1 (active Application Filing)
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10622021B2 (en) * | 2016-02-19 | 2020-04-14 | Avcr Bilgi Teknolojileri A.S | Method and system for video editing |
US11962889B2 (en) | 2016-06-12 | 2024-04-16 | Apple Inc. | User interface for camera effects |
US11641517B2 (en) | 2016-06-12 | 2023-05-02 | Apple Inc. | User interface for camera effects |
US11165949B2 (en) | 2016-06-12 | 2021-11-02 | Apple Inc. | User interface for capturing photos with different camera magnifications |
US11245837B2 (en) | 2016-06-12 | 2022-02-08 | Apple Inc. | User interface for camera effects |
US11687224B2 (en) | 2017-06-04 | 2023-06-27 | Apple Inc. | User interface camera effects |
US11204692B2 (en) * | 2017-06-04 | 2021-12-21 | Apple Inc. | User interface camera effects |
US10915610B2 (en) * | 2017-07-17 | 2021-02-09 | Tata Consultancy Services Limited | Systems and methods for inclusive captcha |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US11178335B2 (en) | 2018-05-07 | 2021-11-16 | Apple Inc. | Creative camera |
US11468625B2 (en) | 2018-09-11 | 2022-10-11 | Apple Inc. | User interfaces for simulated depth effects |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11669985B2 (en) | 2018-09-28 | 2023-06-06 | Apple Inc. | Displaying and editing images with depth information |
US11895391B2 (en) | 2018-09-28 | 2024-02-06 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
US11223771B2 (en) | 2019-05-06 | 2022-01-11 | Apple Inc. | User interfaces for capturing and managing visual media |
US11039074B1 (en) | 2020-06-01 | 2021-06-15 | Apple Inc. | User interfaces for managing media |
US11617022B2 (en) | 2020-06-01 | 2023-03-28 | Apple Inc. | User interfaces for managing media |
US11054973B1 (en) | 2020-06-01 | 2021-07-06 | Apple Inc. | User interfaces for managing media |
US11330184B2 (en) | 2020-06-01 | 2022-05-10 | Apple Inc. | User interfaces for managing media |
EP4171006A4 (en) * | 2020-07-23 | 2023-11-22 | Beijing Bytedance Network Technology Co., Ltd. | Previewing method and apparatus for effect application, and device and storage medium |
US11941728B2 (en) | 2020-07-23 | 2024-03-26 | Beijing Bytedance Network Technology Co., Ltd. | Previewing method and apparatus for effect application, and device, and storage medium |
US11212449B1 (en) | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
USD1013715S1 (en) * | 2021-04-01 | 2024-02-06 | Instasize, Inc. | Display screen or portion thereof with a graphical user interface |
USD1013716S1 (en) * | 2021-04-01 | 2024-02-06 | Instasize, Inc. | Display screen or portion thereof with a graphical user interface |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US11416134B1 (en) | 2021-04-30 | 2022-08-16 | Apple Inc. | User interfaces for altering visual media |
US11350026B1 (en) | 2021-04-30 | 2022-05-31 | Apple Inc. | User interfaces for altering visual media |
US11418699B1 (en) | 2021-04-30 | 2022-08-16 | Apple Inc. | User interfaces for altering visual media |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
Also Published As
Publication number | Publication date |
---|---|
WO2017208080A9 (en) | 2018-03-15 |
US9852768B1 (en) | 2017-12-26 |
WO2017208080A1 (en) | 2017-12-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9852768B1 (en) | Video editing using mobile terminal and remote computer | |
CN108989691B (en) | Video shooting method and device, electronic equipment and computer readable storage medium | |
US11450350B2 (en) | Video recording method and apparatus, video playing method and apparatus, device, and storage medium | |
US10735798B2 (en) | Video broadcast system and a method of disseminating video content | |
US9773524B1 (en) | Video editing using mobile terminal and remote computer | |
US11049522B2 (en) | Digital media editing | |
WO2020107297A1 (en) | Video clipping control method, terminal device, system | |
JP7053869B2 (en) | Video generation methods, devices, electronics and computer readable storage media | |
CN109275028B (en) | Video acquisition method, device, terminal and medium | |
US20170024110A1 (en) | Video editing on mobile platform | |
US10090018B2 (en) | Method and device for generating video slides | |
KR101528312B1 (en) | Method for editing video and apparatus therefor | |
US8151179B1 (en) | Method and system for providing linked video and slides from a presentation | |
CN111491174A (en) | Virtual gift acquisition and display method, device, equipment and storage medium | |
US9966110B2 (en) | Video-production system with DVE feature | |
CN112653920B (en) | Video processing method, device, equipment and storage medium | |
CN113806306B (en) | Media file processing method, device, equipment, readable storage medium and product | |
CN113727140A (en) | Audio and video processing method and device and electronic equipment | |
CN111352560B (en) | Screen splitting method and device, electronic equipment and computer readable storage medium | |
KR101915792B1 (en) | System and Method for Inserting an Advertisement Using Face Recognition | |
KR101703321B1 (en) | Method and apparatus for providing contents complex | |
US20180213288A1 (en) | Systems and methods for creating video compositions | |
WO2017106960A1 (en) | Methods, apparatus and computer-readable media for customized media production and templates therefor | |
CN115424125A (en) | Media content processing method, device, equipment, readable storage medium and product | |
US11765333B1 (en) | Systems and methods for improved transitions in immersive media |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| AS | Assignment | Owner name: KNOBBE, MARTENS, OLSON & BEAR, LLP, CALIFORNIA. Free format text: SECURITY INTEREST; ASSIGNOR: MAVERICK CO., LTD.; REEL/FRAME: 045692/0243. Effective date: 20171215 |
| FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
| FP | Lapsed due to failure to pay maintenance fee | Effective date: 20211226 |