WO2015020497A1 - Method for processing a video-editing user interface executed on a touch-screen mobile device, mobile device, and recording medium


Info

Publication number
WO2015020497A1
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
frames
frame
video
user
Prior art date
Application number
PCT/KR2014/007412
Other languages
English (en)
Korean (ko)
Inventor
정춘선
김경중
김종득
Original Assignee
넥스트리밍(주)
Priority date
Filing date
Publication date
Application filed by 넥스트리밍(주)
Publication of WO2015020497A1

Classifications

    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F3/04817 GUI interaction techniques using icons
    • G06F3/0484 GUI interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G06F3/0488 GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 GUI interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/34 Indicating arrangements

Definitions

  • The present invention relates to a graphical user interface (GUI) for video editing software executed on a mobile device.
  • FIG. 1 illustrates the typical user interface of such video editing software for a personal computer (PC).
  • Video files are loaded into memory, and a part of a video is selected so that a video clip corresponding to the selected part can be inserted into the timeline.
  • When rendering is performed to obtain the final result, the video editing software generates the final edited video by rendering each video clip arranged in the timeline in chronological order.
  • These user interfaces are implemented so that the user can accurately select the portion to be edited from the video file and easily choose the desired length.
  • For example, a part of a video file may be selected by loading the file into memory and then setting a start position and an end position in seconds.
  • An instance corresponding to the selected portion may then be created and inserted at an appropriate position on the timeline by dragging it with a mouse.
  • A prominent topic in the recent software field is mobile computing using smartphones.
  • Video editing software is no exception: video editing software running on smartphones is being developed and distributed one after another.
  • However, mobile devices such as smartphones and tablet computers have rather small screens, from 3 or 4 inches up to about 10 inches, which is inherently disadvantageous for video editing.
  • Patent Literature 1: Korean Intellectual Property Office, Patent Publication No. 10-2009-0058305, "Video recording editing apparatus and method"
  • Patent Literature 2: Korean Intellectual Property Office, Patent Registration No. 10-0860510, "Generating a slide show with visual effects inserted into a mobile device"
  • Non-Patent Literature 1: Choi, Young-Young, Choi, "Study on Mobile Video Editing Techniques in the Smartphone Market," Journal of the Korea Contents Association 10(5), 115-123, ISSN 1738-6764, KCI, 2010
  • The present invention was conceived to solve the problems of the prior art described above. Its purpose is to present a user interface for mobile video editing software that lets the user select a portion of a video, create a video clip, and insert it into the timeline with only simple and intuitive operations.
  • Another object of the present invention is to display, on a relatively narrow screen, a plurality of frames that visually convey the contents of the video, to generate a video clip by selecting some of them, and to provide a user interface that dynamically adjusts the time interval of the frames through a simple and intuitive operation.
  • According to one aspect of the present invention, an apparatus for providing a video editing user interface includes: user interface processing means for displaying a first user interface on one side of a touch screen and a second user interface on the other side of the screen;
  • frame acquiring means for acquiring, from the video data selected by the user, the frames to be displayed on the first user interface using a predetermined frame acquisition interval;
  • frame interval adjusting means for setting the frame acquisition interval according to a user operation;
  • and rendering means for generating a video file containing the video clips by rendering the video clips arranged in the second user interface.
  • The user interface processing means displays the frames acquired by the frame acquiring means on the first user interface. As the user touches any two frames displayed on the first user interface and drags them toward the second user interface, a video clip is generated spanning from the video position corresponding to the chronologically earlier of the two selected frames to the video position corresponding to the later frame.
  • An icon representing the clip is then arranged in the second user interface, that is, the timeline.
  • The frame interval adjusting means changes the frame acquisition interval in real time as the user touches any two points on the first user interface and then, while maintaining contact, narrows or widens the distance between the two contact points.
  • When the frame acquisition interval changes in real time, the frame acquiring means re-acquires frames from the video data using the changed interval.
  • The user interface processing means then updates the display by showing the re-acquired frames on the first user interface.
  • According to another aspect, a method for providing a video editing user interface includes: a) acquiring frames at a predetermined frame acquisition interval from a portion of the video data;
  • and d) changing the frame acquisition interval in real time as the user, while touching two points in the first user interface area and maintaining contact, narrows or widens the distance between the two contact points, and branching back to step a).
  • The user can thus create a clip from any two displayed frames simply by touching them and dragging in the direction of the timeline.
  • FIG. 1 is a view for explaining the user interface configuration of conventional video editing software.
  • FIG. 2 is a diagram illustrating a user interface of mobile video editing software according to the present invention.
  • FIG. 3 is a view for explaining the configuration of a video editing user interface providing apparatus using a touch screen gesture input according to the present invention
  • FIG. 4 is a diagram illustrating a process of generating a video clip and inserting it into a timeline through a touch screen gesture input.
  • FIG. 5 is a flowchart illustrating a method of providing a video editing user interface using a touch screen gesture input in a time series.
  • FIG. 6 is a diagram illustrating a process of adjusting a frame interval in real time through a touch screen gesture input.
  • A mobile device having a touch screen comprises: user interface processing means for displaying a first user interface on one side of the touch screen and a second user interface on the other side of the screen; frame acquiring means for acquiring, from the video data selected by the user, the frames to be displayed on the first user interface using a predetermined frame acquisition interval; frame interval adjusting means for setting the frame acquisition interval according to a user operation; and rendering means for generating a video file containing the video clips by rendering the video clips arranged in the second user interface.
  • The user interface processing means displays the frames acquired by the frame acquiring means on the first user interface. As the user touches any two frames displayed on the first user interface and drags them toward the second user interface, a video clip is generated from the video position corresponding to the chronologically earlier of the two selected frames to the video position corresponding to the later frame, and an icon representing the clip is arranged in the second user interface.
  • When the user touches any two points on the first user interface and then, while maintaining contact, narrows or widens the distance between the two contact points, the frame acquisition interval is changed in real time; the frame acquiring means re-acquires frames from the video data using the changed interval, and the user interface processing means updates the first user interface with the re-acquired frames.
  • A video editing user interface processing method executed on a mobile device having a touch screen comprises: a) a frame acquisition step of acquiring frames at a predetermined frame acquisition interval from the video data; b) displaying at least a portion of the acquired frames in chronological order along one side of the touch screen; and c) as the user selects any two of the frames displayed in chronological order, creating a video clip whose start point is the video position corresponding to the chronologically earlier of the two selected frames and whose end point is the video position corresponding to the later frame.
  • In step d), as the frame acquisition interval is changed in real time, frames are acquired again according to the changed interval, and at least some of the acquired frames are redisplayed in chronological order along one side of the screen, updating the display in real time.
  • The mobile device inserts the generated video clip at the timeline position corresponding to the coordinate where the drag ends.
  • Step e) further comprises a rendering step of rendering the video clips inserted in the timeline in timeline order.
  • In step d), if the user touches any two points in the touch screen area where the frames are displayed and drags the two contact points apart, the frame acquisition interval is widened; if the user drags them together, the interval is narrowed, after which the method branches back to the frame acquisition step.
  • In step d), when the user drags the two contact points apart, the newly acquired frames are displayed in both directions around the frame adjacent to the center coordinate of the initial contact points.
  • In step d), as the user drags the two contact points apart, an animation is displayed in which the frames move away in both directions from the center coordinate of the initial contact points, after which the newly acquired frames are displayed.
  • In step d), when the user drags so that the distance between the two contact points narrows, the newly acquired frames are displayed in both directions around the frame adjacent to the center coordinate of the final contact positions.
  • In step d), as the user drags the two contact points together, an animation is displayed in which the frames on both sides move toward the center coordinate of the final contact positions, after which the newly acquired frames are displayed.
  • A computer-readable recording medium stores program code for: a) acquiring frames at a predetermined frame acquisition interval from a partial section of the video data; b) displaying at least a portion of the acquired frames in a first user interface region shown on one side of the touch screen; c) when any two of the frames displayed in the first user interface area are selected by the user and dragged toward the second user interface area on the other side of the touch screen, creating a video clip whose start point is the video position corresponding to the chronologically earlier of the two selected frames and whose end point is the video position corresponding to the later frame, and inserting the generated clip at the timeline position corresponding to the point where the drag ends in the second user interface area; and d) changing the frame acquisition interval in real time as the user, while touching two points in the first user interface area and maintaining contact, narrows or widens the distance between the two contact points, and branching back to step a).
  • The suffix "means" denotes a unit that processes at least one function or operation; each such unit may be implemented in software, in hardware, or in a combination of the two.
  • The timeline literally means the time axis.
  • The user arranges a number of video clips along the timeline, sets effects on each of the arranged clips, and then renders the resulting video file.
  • Here, the timeline denotes the time position occupied by each video clip in the video file rendered as the final result.
  • For example, a video clip may occupy the timeline position corresponding to the section "1 minute 10 seconds to 1 minute 30 seconds".
  • A video clip refers to a piece or portion of a video that is split off for authoring a video edit.
  • The user can insert a locally stored movie file into the timeline as a whole, or select a portion of the file to insert; in either case the inserted piece may be referred to as a video clip in that it is arranged on the timeline and forms part of the final rendered movie file.
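As a rough illustration of this data model (class and field names are hypothetical, not taken from the patent), a timeline can be sketched as clips keyed by the output position they occupy:

```python
from dataclasses import dataclass

@dataclass
class VideoClip:
    source: str       # path of the original movie file
    src_start: float  # start position within the source, in seconds
    src_end: float    # end position within the source, in seconds

class Timeline:
    """Clips keyed by the time position they occupy in the final output."""
    def __init__(self):
        self.entries = []  # list of (timeline_position_seconds, clip)

    def insert(self, position, clip):
        self.entries.append((position, clip))
        self.entries.sort(key=lambda e: e[0])

# A clip occupying "1 minute 10 seconds to 1 minute 30 seconds" of the output.
tl = Timeline()
tl.insert(70.0, VideoClip("movie.mp4", src_start=5.0, src_end=25.0))
print(tl.entries[0][0])  # 70.0
```

Note that a clip here is only start/end metadata over the original file, matching the later remark that clip creation need not copy any video data.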
  • FIG. 2 is a diagram illustrating a user interface of mobile video editing software according to the present invention.
  • the first user interface 11 is displayed at the top of the screen.
  • the first user interface 11 displays various scenes of the video in time series.
  • the first user interface 11 displays a plurality of frames in chronological order. Each frame visually displays the scenes of the video.
  • the user may generate a video clip by selecting a desired portion of the entire video through the first user interface 11.
  • the second user interface 12 is displayed at the bottom of the screen.
  • the second user interface 12 visually represents the timeline.
  • the user can place multiple movie clips at appropriate locations on the second user interface 12.
  • A second user interface 12 extending to the left and right is shown, and icons of video clips are displayed on it.
  • the position at which the icon of the movie clip is displayed corresponds to the position on the timeline of the movie to be edited.
  • The present invention presupposes the user interface configuration of mobile video editing software as shown in FIG. 2.
  • On this basis, a user interface is further proposed for generating a video clip by accurately selecting a part of the video through a simple operation.
  • The video editing user interface providing apparatus provides such a user interface; it is software executed on the mobile device 1 and may also be implemented in the form of firmware.
  • FIG. 3 is a view for explaining the configuration of a video editing user interface providing apparatus according to the present invention.
  • The apparatus 100 for providing a video editing user interface includes a user interface processing unit 110, a frame acquiring unit 120, a frame interval adjusting unit 130, and a rendering unit 140.
  • The user interface processing unit 110 displays on the touch screen a first user interface 11, which shows a plurality of frames acquired from a video in chronological order, and a second user interface 12, which corresponds to the timeline.
  • The frame acquiring means 120 acquires a plurality of frames from the video data using a predetermined frame acquisition interval.
  • For example, the frame acquiring unit 120 may acquire a plurality of frames by decoding the video and generating a still image at 10-second intervals.
  • Alternatively, the key frame nearest each 10-second mark may be acquired.
  • The frame acquisition interval is not necessarily limited to a time interval; it may also be set as, for example, "every fifth keyframe", in which case the frame acquiring means 120 acquires one frame for every five key frames.
  • The frames are preferably acquired in the form of thumbnails.
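A minimal sketch of such interval-based acquisition (the function name is illustrative and actual frame decoding is elided; only the timestamps at which thumbnails would be decoded are computed):

```python
def acquire_frame_positions(duration, interval, start=0.0):
    """Return the timestamps (in seconds) at which thumbnail frames
    would be decoded: one per `interval`, never past `duration`."""
    positions = []
    t = start
    while t < duration:
        positions.append(t)
        t += interval
    return positions

# A 65-second video sampled every 10 seconds yields 7 thumbnails.
print(acquire_frame_positions(65.0, 10.0))
# [0.0, 10.0, 20.0, 30.0, 40.0, 50.0, 60.0]
```

For a keyframe-based interval such as "every fifth keyframe", the same idea applies with a keyframe index list in place of the time axis.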
  • the user interface processing means 110 displays the plurality of frames thus obtained on the first user interface 11 in chronological order.
  • thumbnails of frames may be displayed in chronological order from left to right.
  • the scenes of the partial sections of the video are displayed in chronological order through the first user interface 11.
  • the user may check various scenes of the video at a glance through the first user interface 11.
  • FIG. 4 is a diagram illustrating a process of generating a video clip and inserting it into a timeline through a touch screen gesture input.
  • The user touches any two of the frames shown in the first user interface 11, drags them to a specific position in the second user interface 12, and then releases the touch.
  • The user interface processing means 110 then generates a video clip spanning from the video position corresponding to the chronologically earlier of the two frames to the position corresponding to the later frame.
  • The generated video clip is inserted at the timeline position corresponding to the coordinate on the second user interface 12 where the user lifted the finger, and an icon of the clip is displayed there.
  • The frames touched and dragged by the user are not necessarily included in the video clip; depending on the algorithm used to extract the portion of the video, the clip may be created based on the keyframe immediately before or after each touched frame.
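One way such extraction could work, sketched under the assumption that each endpoint snaps to the nearest preceding keyframe so the clip begins on a decodable frame (the patent leaves the exact algorithm open, and these names are illustrative):

```python
import bisect

def make_clip(frame_a, frame_b, keyframes):
    """Build a (start, end) clip from the two touched frames' positions.
    The chronologically earlier frame becomes the start; each endpoint
    is snapped back to the nearest preceding keyframe timestamp."""
    start, end = sorted((frame_a, frame_b))

    def snap_back(t):
        i = bisect.bisect_right(keyframes, t) - 1
        return keyframes[max(i, 0)]

    return snap_back(start), snap_back(end)

# Keyframes every 2 seconds; the user touched frames at 7.3 s and 3.1 s.
keyframes = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
print(make_clip(7.3, 3.1, keyframes))  # (2.0, 6.0)
```

The touch order does not matter: the two positions are sorted first, so dragging the later frame with the first finger still yields a forward-running clip.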
  • In some cases, the frames corresponding to the first and last scenes of the section the user wants may not be simultaneously displayed on the first user interface 11.
  • In that case, the user may widen the time interval of the frames displayed on the first user interface 11 by touching any two points of the first user interface 11 and spreading them apart.
  • Conversely, the time interval of the displayed frames may be narrowed by touching two points of the first user interface 11 and bringing them together toward the center.
  • In response, the frame interval adjusting means 130 dynamically changes the frame acquisition interval.
  • The frame acquiring means 120 then newly acquires frames from the video using the dynamically changed frame acquisition interval.
  • The user interface processing means 110 updates the display by showing the newly acquired frames on the first user interface 11.
  • The user can thus create a video clip of the desired length from the desired scene of the video through a simple operation.
  • the generated video clip may be arranged on the timeline of the second user interface 12 with a simple operation.
  • After editing, the user can obtain the final edited video through a rendering process, as in conventional video editing software.
  • The rendering means 140 generates the video file that is the final edited product by rendering the video clips arranged in the second user interface 12.
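The rendering order can be sketched as follows (a simplification under illustrative names: real decode/encode work is elided, and each clip is reduced to a tuple of timeline position, source path, and source start/end):

```python
def render_plan(timeline_entries):
    """Visit the arranged clips in timeline order and return the list
    of source sections to concatenate into the final video file."""
    return [(source, start, end)
            for _pos, source, start, end in sorted(timeline_entries)]

# Two clips placed out of order on the timeline are rendered in order.
entries = [(30.0, "b.mp4", 0.0, 5.0), (0.0, "a.mp4", 10.0, 40.0)]
print(render_plan(entries))
# [('a.mp4', 10.0, 40.0), ('b.mp4', 0.0, 5.0)]
```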
  • Hereinafter, a video editing user interface providing method using touch screen gesture input according to the present invention will be described with reference to FIGS. 4 to 6.
  • FIG. 5 is a flowchart illustrating a method of providing a video editing user interface using a touch screen gesture input in time series
  • FIG. 6 is a diagram illustrating a process of adjusting a frame interval in real time through a touch screen gesture input.
  • The method describes the process by which a mobile device 1 having a touch screen provides the user interface shown in FIGS. 4 and 6 while a video editing app is running.
  • The method of the present invention comprises a frame acquisition step (step a), a frame display step (step b), a clip selection step (step c), an acquisition interval adjustment step (step d), and a rendering step (step e).
  • First, the mobile device 1 acquires frames at a predetermined frame acquisition interval from the video data (step a).
  • the frame is preferably obtained in the form of a thumbnail.
  • The frames need not be acquired at regular intervals over the entire video data; only the frames for the portion to be displayed on the first user interface 11 may be acquired.
  • After acquiring the frames in this way, in the frame display step at least a part of the acquired frames is displayed on the first user interface 11.
  • The frames are preferably displayed in one direction within the first user interface 11 region, in chronological order.
  • frames leading in chronological order may be displayed on the first user interface 11 as shown in FIG. 2.
  • The frames displayed on the first user interface 11 may be frames acquired from only a portion of the video; when the user wants to check frames from another portion, the user can do so by dragging the progress bar.
  • The mobile device 1 may then dynamically acquire frames for the corresponding portion of the video and display them on the first user interface 11.
  • In the clip selection step, a video clip is created whose start point is the video position corresponding to the chronologically earlier of the two frames selected by the user and whose end point is the video position corresponding to the later frame.
  • Although this is referred to as the creation of a video clip, it does not necessarily mean creating an instance corresponding to the selected portion and storing it as independent data; it may also be treated as generating information about the start and end positions of the selected region, with the original video data left in place.
  • The user could select the frames corresponding to the start point and the end point one at a time from the frames displayed on the first user interface 11 and then create the clip by invoking a clip-creation function.
  • However, so that a video clip can be selected and created more intuitively and simply through a touch screen operation, the mobile device 1 allows the user to touch any two frames shown in the first user interface 11 as in FIG. 4 and drag them to the lower second user interface 12, which creates a video clip and automatically places it on the timeline.
  • The mobile device 1 creates a video clip corresponding to the section between the two frames and inserts it on the timeline at the position on the second user interface 12 where the user's drag ends.
  • An editing process that would otherwise take four steps (selecting the start position of the video clip, selecting its end position, pressing the clip creation button, and dragging the created clip onto the timeline) is thus completed by a single multi-touch followed by a drag operation.
  • the frames corresponding to the start position and the end position that the user wants to select in the first user interface 11 should be simultaneously displayed.
  • To adjust which frames are shown, the user touches two points on the first user interface 11 with two fingers, for example the thumb and index finger, and then drags the fingers apart.
  • For instance, two adjacent frames may be touched simultaneously and then dragged left and right, as if spreading the two frames apart.
  • FIG. 6A illustrates the case in which the user touches two points on the first user interface 11 and then performs a control operation that widens the distance between the two contact points.
  • In response, the mobile 1 widens the frame acquisition interval in real time in the acquisition interval adjustment step (step d) and branches to step a).
  • In step a), frames are re-acquired according to the changed frame acquisition interval.
  • In step b), the newly acquired frames are displayed on the first user interface 11, updating the view.
  • As a result, the frames displayed simultaneously on the first user interface 11 are spaced further apart in time.
  • The user can repeat the same operation as needed.
  • So that the user can intuitively see that the time interval between the frames (that is, the interval at which the frames are played back in the video) has widened, the physical interval between the frames (that is, their distance on screen) is animated: as the user moves the fingers apart, the frames move away from each other.
  • The animation is centred on the midpoint of the two touch points; that is, when the user touches the touch screen with two fingers held together, the frames move apart to the left and right of the coordinate between the two fingers.
  • While the frame nearest the coordinate between the two fingers is kept on the first user interface 11 as far as possible, the newly acquired frames are displayed around it.
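The loop just described (step d adjusting the interval, then branching back through steps a and b) can be sketched as follows. The function names and the linear scaling rule are illustrative assumptions; the patent only requires that spreading the contact points widen the acquisition interval in real time and that re-sampling stay anchored near the midpoint of the touch points.

```python
def on_spread(interval, start_distance, current_distance):
    """Step d): scale the frame acquisition interval by how far apart the
    two contact points have moved since the touch began."""
    return interval * (current_distance / start_distance)


def reacquire_frames(video_duration, anchor_time, interval, count=7):
    """Steps a) and b): re-sample `count` frame timestamps at the changed
    interval, centred on the frame nearest the midpoint of the two contact
    points (anchor_time), clamped to the video's duration."""
    half = count // 2
    times = [anchor_time + (i - half) * interval for i in range(count)]
    return [min(max(t, 0.0), video_duration) for t in times]
```

Doubling the finger distance doubles the interval, so five frames anchored at 10 s with a 2 s interval become the timestamps 6, 8, 10, 12 and 14 seconds.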
  • Conversely, the user may narrow the time interval of the frames shown on the first user interface 11.
  • To do so, the user touches two points on the first user interface 11 with two fingers held apart and then drags them together in a pinching motion.
  • FIG. 6B illustrates the case in which the user touches two points on the first user interface 11 and then performs a control operation that narrows the distance between the two contact points.
  • In response, the mobile 1 narrows the frame acquisition interval in real time in the acquisition interval adjustment step (step d), re-acquires frames according to the changed interval in step a), and proceeds to step b) to display the newly acquired frames on the first user interface 11, updating the view.
  • So that the user can intuitively see that the time interval between the frames has narrowed, the physical intervals between the frames on screen are animated to move closer together during the pinch drag.
  • As before, the frame nearest the midpoint of the two contact points is kept on screen as far as possible, and the newly acquired frames are displayed around it.
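The pinch-in path mirrors the spread gesture. A sketch under the same assumptions, with an added floor so the interval cannot drop below one frame period (the 30 fps figure is an assumption, not from the patent):

```python
def on_pinch(interval, start_distance, current_distance, min_interval=1.0 / 30.0):
    """Narrow the acquisition interval in proportion to how far the two
    contact points have moved together, but never below the duration of a
    single frame at the assumed frame rate."""
    return max(interval * (current_distance / start_distance), min_interval)
```

Halving the finger distance halves the interval; once the scaled interval would fall below one frame period, the floor takes over.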
  • Once editing is complete, the mobile 1 generates the final edited video by rendering the video clips arranged on the second user interface 12 in timeline order.
  • Depending on the user's selection or settings, the mobile 1 may store the result locally or automatically upload it over a network to a social network service or cloud.
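Rendering in timeline order amounts to resolving each clip's (start, end) information against the original video into a sequential edit list. The sketch below is illustrative only (names assumed); a real renderer would decode and re-encode the referenced sections at this point.

```python
def render_timeline(timeline):
    """Turn the ordered clips of the second user interface into an edit list:
    each entry says which source section plays at which output time."""
    edit_list, out_time = [], 0.0
    for clip in timeline:
        edit_list.append({"src_start": clip["start"],
                          "src_end": clip["end"],
                          "out_at": out_time})
        out_time += clip["end"] - clip["start"]
    return edit_list, out_time  # out_time is the final video's duration
```

A 4-second clip followed by a 3-second clip yields output offsets 0 s and 4 s and a 7-second result, regardless of where the sections lie in the source video.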
  • Each step of the method by which the mobile 1 provides a user interface for dynamically widening or narrowing the time interval of frames for selecting a video clip can be implemented as software, that is, as a set of computer-readable instructions, and recorded on a recording medium.
  • The recording medium may be any medium readable by a computer, for example a DVD-ROM, CD-ROM, hard disk, USB memory, or flash memory.
  • The expression "contained in a recording medium" covers not only recording on a tangible medium of this kind but also provision over a communication line in the form of an intangible carrier wave.
  • In the above, the expression "mobile" takes a smartphone and a tablet computer as examples, but is not limited to them: it means any portable electronic device equipped with a touch screen and running video editing software.
  • It may be a well-known type of device such as a mobile communication terminal or a PDA, or a new type of device satisfying the above conditions.
  • The present invention is applicable to mobile user interface technology.

Abstract

The present invention relates to a method for processing a video editing user interface executed on a touch-screen mobile device. According to the invention, a frame is acquired at each predefined frame acquisition interval from certain sections of a video chosen by the user and is then displayed in a first user interface region. Then, when the user drags any two of the displayed frames onto a second user interface region on the other side of the screen, a video clip corresponding to the section between the two chosen frames is generated and inserted at the position on a timeline corresponding to the point at which the drag ended. Furthermore, when the user simultaneously touches two points on the first user interface and then drags the two contact points closer together or further apart, the frame acquisition interval is dynamically reset, frames are re-acquired from the video, and the above processing is repeated.
PCT/KR2014/007412 2013-08-09 2014-08-08 Procédé de traitement d'interface utilisateur d'édition vidéo exécutée sur un dispositif mobile à écran tactile, dispositif mobile et support d'enregistrement WO2015020497A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2013-0094881 2013-08-09
KR1020130094881A KR101399234B1 (ko) 2013-08-09 2013-08-09 터치스크린을 갖는 모바일 디바이스에서 실행되는 동영상 편집 사용자 인터페이스 처리방법, 모바일 디바이스 및 기록매체

Publications (1)

Publication Number Publication Date
WO2015020497A1 true WO2015020497A1 (fr) 2015-02-12

Family

ID=50895235

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/007412 WO2015020497A1 (fr) 2013-08-09 2014-08-08 Procédé de traitement d'interface utilisateur d'édition vidéo exécutée sur un dispositif mobile à écran tactile, dispositif mobile et support d'enregistrement

Country Status (2)

Country Link
KR (1) KR101399234B1 (fr)
WO (1) WO2015020497A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106911953A (zh) 2016-06-02 2017-06-30 阿里巴巴集团控股有限公司 一种视频播放控制方法、装置及视频播放系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090058305A (ko) * 2007-12-04 2009-06-09 삼성전자주식회사 동영상 촬영 편집 장치 및 방법
KR20100086136A (ko) * 2009-01-22 2010-07-30 (주)코드엑트 동영상 편집 시스템
JP2012156686A (ja) * 2011-01-25 2012-08-16 Grafficia Inc 検索方法および検索装置、ならびに動画編集装置
KR20130027412A (ko) * 2011-09-07 2013-03-15 이-린 첸 개인 비디오를 제작하는데 사용되는 편집시스템

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130052753A (ko) * 2011-08-16 2013-05-23 삼성전자주식회사 터치스크린을 이용한 어플리케이션 실행 방법 및 이를 지원하는 단말기

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113032623A (zh) * 2021-03-10 2021-06-25 珠海安士佳电子有限公司 一种智能视频数据检索方法
CN113032623B (zh) * 2021-03-10 2024-04-05 珠海安士佳电子有限公司 一种智能视频数据检索方法

Also Published As

Publication number Publication date
KR101399234B1 (ko) 2014-05-27

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14834308

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02.06.2016)

122 Ep: pct application non-entry in european phase

Ref document number: 14834308

Country of ref document: EP

Kind code of ref document: A1