WO2015122627A1 - Image editing method and apparatus therefor - Google Patents

Image editing method and apparatus therefor

Info

Publication number
WO2015122627A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
special effect
editing
effect object
preview
Prior art date
Application number
PCT/KR2015/000532
Other languages
English (en)
Korean (ko)
Inventor
정민
곽별샘
김동훈
오주현
Original Assignee
주식회사 케이티
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 케이티
Publication of WO2015122627A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • H04N9/8211 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being a sound signal
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034 Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 Indicating arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring

Definitions

  • the present invention relates to an image editing technology, and more particularly, to an image editing method and apparatus for providing an image editing interface that can be conveniently used by a user.
  • Korean Patent Publication No. 10-2006-0111116 discloses a method and system for providing a video data editing function to a mobile communication terminal.
  • the conventional video editing technology using a portable device provides only a uniform interface which does not consider user convenience at all.
  • the conventional image editing technology mounted in a portable device provides the user with only simple editing techniques, such as brightness adjustment, size adjustment, and simple synthesis of pre-registered images with captured images.
  • the conventional video editing technology mounted on a portable device also has the problem that, without considering the processing performance of the portable device, it consumes a large amount of the device's resources during editing, thereby degrading the performance of the portable device.
  • the present invention has been proposed to solve such a conventional problem, and an object thereof is to provide an image editing method and apparatus for providing an image editing interface so that a user can conveniently edit the image.
  • Another object of the present invention is to provide an image editing method for providing a user with an image editing result using a small resource, and an apparatus therefor.
  • an image editing apparatus according to the present invention comprises: a display unit configured to display an image editing interface including an image to be edited and an editing tool; a position adjustment module configured to identify an editing section, among the entire playback section of the video, in which a special effect is synthesized, to select a predetermined number of scenes in which the special effect object is expressed and display the selected scenes on the display unit, and, when the special effect object is moved by the user in any of the displayed scenes, to reset the position of the special effect object in that scene; and a special effect processing module configured to generate edit history data including the position of the special effect object for each scene.
  • a video editing method for synthesizing a special effect with a video includes: checking an editing section, among all playback sections of the image to be edited, in which the special effect is synthesized; selecting and displaying a predetermined number of scenes of the editing section in which the special effect object is expressed; resetting the position of the special effect object in a scene when the special effect object is moved by the user in any of the displayed special effect scenes; and generating edit history data including the location of the special effect object for each scene.
  • the present invention provides the user with an interface for reassigning the location of a special effect to each scene, and enables the special effect to be synthesized with the original image accurately at the user's intended location through the interface.
  • in the present invention, when the control bar located on the timeline of the editing section is moved, the image synthesized with the special effect is displayed in a designated area of the screen corresponding to the position of the control bar, so that the user can immediately confirm the edit result, thereby improving user convenience.
  • the present invention also has the advantage of saving the resources of the image editing apparatus consumed when generating the preview image, because the preview image is displayed using only partial images expressing the special effect rather than the entire graphic image along which the special effect object moves.
  • the present invention transmits the edit history data, rather than the entire synthesized video, to the server so that the video synthesis is performed at the server based on the edit history data, thereby minimizing the data traffic generated between the server and the video editing apparatus and reducing the load on the editing apparatus.
  • FIG. 1 is a diagram illustrating a configuration of an image editing system according to an exemplary embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a configuration of an image editing apparatus according to an embodiment of the present invention.
  • FIGS. 3A and 3B are flowcharts illustrating a method of performing image editing through an image editing interface in an image editing apparatus according to an embodiment of the present invention.
  • FIG. 4 is a diagram illustrating an image editing interface according to an embodiment of the present invention.
  • FIG. 5 is a diagram illustrating a state where an expression position and a playback section for a special effect are designated in an image editing interface according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating a state in which a special effect adjustment icon is output in an image editing interface according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating an effect position adjustment interface according to an embodiment of the present invention.
  • FIG. 8 is a diagram illustrating a state in which a special effect object is repositioned in the effect position adjusting interface according to an embodiment of the present invention.
  • FIG. 9 is a diagram illustrating an image editing interface on which a preview image is output, according to an embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a method of generating a preview image in an image editing apparatus according to an embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a method of registering edit history data in a video editing system and providing a synthesized image to a communication device based on the registered edit history data according to an embodiment of the present invention.
  • FIG. 1 is a diagram illustrating a configuration of an image editing system according to an exemplary embodiment of the present invention.
  • an image editing system includes an image editing apparatus 100 and an image providing server 200.
  • the image editing apparatus 100 and the image providing server 200 communicate with each other through a network 300.
  • the network 300 includes broadband wired communication networks and mobile communication networks, which are well known in the art to which the present invention pertains, and thus a detailed description thereof is omitted.
  • the image providing server 200 stores a plurality of images, special effect data, and editing history data.
  • the edit history data includes composite data (special effect identification information, the location of the special effect object for each scene, and the reproduction time of the special effect) together with identification information of the original image; an illustrative sketch of this structure is given below.
  • the image providing server 200 receives and stores edit history data of a specific image from the image editing apparatus 100.
  • the image providing server 200 may receive and store the edited original image file from the image editing apparatus 100, or may receive and store identification information (e.g., a URL) of the original image file from the image editing apparatus 100.
  • when the video providing server 200 receives a request for a specific video from the video editing apparatus 100 or another communication device 400, it extracts the edit history data of that video, synthesizes the requested video with the special effect based on the edit history data, and transmits the synthesized image to the corresponding video editing apparatus 100 or communication device 400.
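  • As a rough illustration only (the field names below are assumptions for this sketch and are not taken from the publication), the edit history data described above could be represented in Python as follows:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class CompositeData:
    """Composite data: which effect is used, where it appears per scene, and when it plays."""
    effect_id: str                                    # special effect identification information
    positions_by_scene: Dict[float, Tuple[int, int]]  # scene playback time (s) -> (x, y) of the effect object
    effect_start: float                               # start of the effect's reproduction time (s)
    effect_end: float                                 # end of the effect's reproduction time (s)

@dataclass
class EditHistoryData:
    """Edit history data: composite data plus identification of the original image."""
    composite: CompositeData
    original_image_id: str                            # e.g. a URL or file identifier of the original video

# Example instance for a 2-second halo effect starting at 60 s (all values illustrative)
history = EditHistoryData(
    composite=CompositeData(
        effect_id="halo",
        positions_by_scene={60.0: (120, 80), 60.7: (125, 82), 61.3: (131, 85), 62.0: (140, 90)},
        effect_start=60.0,
        effect_end=62.0,
    ),
    original_image_id="https://example.com/videos/original_0001.mp4",
)
```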
  • the image editing apparatus 100 provides a user with an image editing interface that allows the user to edit the image more conveniently.
  • when the image is edited through the image editing interface, the image editing apparatus 100 generates the edit history data of the image, and the edit history data is stored locally or transmitted to the image providing server 200.
  • the video editing apparatus 100 receives, from the user, a repositioning signal for repositioning the display position of the special effect through the video editing interface, and generates composite data based on the repositioning signal.
  • the image editing apparatus 100 may display the preview image by using a partial image expressing a special effect without using the entire graphic image constituting the animation of the object.
  • FIG. 2 is a diagram illustrating a configuration of an image editing apparatus according to an embodiment of the present invention.
  • the image editing apparatus 100 may include a display unit 110, a storage unit 120, a camera 130, a communication unit 140, and an editing interface providing unit 150.
  • the display unit 110 is a display means for displaying various information processed by the image editing apparatus 100, and may use liquid crystal display (LCD) technology, light emitting polymer display (LPD) technology, or light emitting diode (LED) technology.
  • the display unit 110 may be a touch display such as a resistive type or an infrared type. When the display unit 110 is a touch display, an output interface and an input interface are provided to the user.
  • the storage unit 120 is a storage means for storing data necessary for image editing.
  • the storage unit 120 stores an image file and stores various special effect data necessary for image synthesis.
  • the storage unit 120 stores data for various special effects, for example data for a special effect in which a flashing light moves in a predetermined direction, data for a special effect in which text shrinks and disappears while moving in a predetermined direction, and data for a special effect in which text expands in a predetermined direction and then disappears.
  • the storage unit 120 stores edit history data including the synthesis data and the identification information of the original video.
  • the composite data includes special effect identification information, positions of special effect objects for each scene, and playback time of the special effects.
  • the storage unit 120 stores special effect data of a smaller size than the special effect data stored in the image providing server 200. That is, the storage unit 120 stores audio and effect animation images of lower quality than those included in each special effect data stored in the image providing server 200. For example, when the image providing server 200 stores a 30-frame-per-second special effect animation and 384 kbps audio data, the storage unit 120 of the image editing apparatus 100 may store a 4-frame-per-second special effect animation and 16 kbps audio data. More preferably, the storage unit 120 stores the special effect object images and the audio separated from the special effect animation included in the special effect data.
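  • A minimal sketch of how such lower-quality preview assets could be derived from the full-quality effect animation kept on the server; the frame rates follow the example above, while the function itself is an assumption for illustration:

```python
def downsample_effect_frames(frames: list, source_fps: int = 30, target_fps: int = 4) -> list:
    """Keep roughly every (source_fps // target_fps)-th frame of the effect animation,
    e.g. reduce the 30 fps server-side animation to about 4 fps for the editing apparatus."""
    step = max(1, source_fps // target_fps)
    return frames[::step]

# 60 frames of a 2-second, 30 fps effect animation are reduced to 9 frames (step = 7)
print(len(downsample_effect_frames(list(range(60)))))   # 9
```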
  • the storage unit 120 may include high speed random access memory, and may also include one or more magnetic disk storage devices, nonvolatile memory such as flash memory devices, or other nonvolatile semiconductor memory devices.
  • the camera 130 has a lens to capture a peripheral image through the lens.
  • the communication unit 140 performs a function of communicating with various servers, in particular, the image providing server 200 via the network 300.
  • the editing interface providing unit 150 generates user interfaces related to image editing and displays them on the display unit 110. In detail, when the user requests editing of a specific image, the editing interface providing unit 150 generates an image editing interface including the image and an editing tool for editing the image, and outputs the image editing interface to the display unit 110.
  • the editing interface providing unit 150 includes a special effect processing module 151, a position adjusting module 152, and a preview providing module 153.
  • the special effect processing module 151 generates composite data according to the input information entered by the user on the image editing interface.
  • the special effect processing module 151 extracts the object for the selected special effect from the storage unit 120, outputs the object to the image editing interface, and receives from the user the initial position at which the special effect object is to be displayed.
  • the special effect processing module 151 receives, from the user, information on the playback section in which the special effect is to be expressed among the entire playback section of the original video to be edited, and confirms that playback section as the editing section.
  • after editing of the image is finished, the special effect processing module 151 generates the edit history data including the composite data and the identification information of the original image and stores it in the storage unit 120. In addition, the special effect processing module 151 may transmit the edit history data to the image providing server 200 through the communication unit 140.
  • when a position adjustment signal for the special effect is input by the user, the position adjustment module 152 generates an effect position adjustment interface for adjusting the position of the special effect object in the editing section and outputs it to the display unit 110.
  • specifically, the position adjusting module 152 captures a predetermined number (e.g., four) of images from among the frames constituting the image of the editing section, generates for each captured image a special effect scene in which the special effect object appears on that image, and outputs to the display unit 110 an effect position adjusting interface including the plurality of special effect scenes thus generated.
  • the position adjusting module 152 may divide the image of the editing section into preset unit times and capture one image from each divided segment, or it may capture a plurality of images in which the scene change is greatest among the frames constituting the image of the editing section, as sketched below.
  • the position adjusting module 152 checks the position of the special effect object expressed in each scene, and expresses the special effect object for each captured image at the identified position. Meanwhile, when the position adjustment of the user is completed through the effect position adjustment interface, the position adjustment module 152 checks the position information of the special effect object for each scene and transmits it to the special effect processing module 151.
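  • The two capture strategies mentioned above (even division of the editing section by unit time, or selection of the frames with the largest scene change) could be sketched as follows; the frame representation and the `frame_difference` callable are assumptions for illustration:

```python
from typing import Callable, List, Sequence

def capture_by_unit_time(frames: Sequence, fps: float, unit_time: float) -> List:
    """Divide the editing section into segments of `unit_time` seconds and take the first frame of each."""
    step = max(1, int(round(unit_time * fps)))
    return [frames[i] for i in range(0, len(frames), step)]

def capture_by_scene_change(frames: Sequence, count: int, frame_difference: Callable) -> List:
    """Pick `count` frames of the editing section where the change from the previous frame is largest."""
    if not frames:
        return []
    diffs = [(frame_difference(frames[i - 1], frames[i]), i) for i in range(1, len(frames))]
    picked = sorted(diffs, reverse=True)[:count - 1]       # indices with the largest scene change
    indices = sorted({0} | {i for _, i in picked})         # always keep the first frame of the section
    return [frames[i] for i in indices]
```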
  • the preview providing module 153 generates and provides a preview image synthesized with the special effect to the user.
  • the preview providing module 153 generates a preview image in which the special effect selected by the user is synthesized with the image of the editing section.
  • the preview providing module 153 checks the playback time corresponding to the position of the endpoint control icon and outputs, from the entire synthesized image, the preview image or frame corresponding to that playback time to the display unit 110.
  • the preview providing module 153 outputs the preview image on a sub screen having a predetermined size (see 28 of FIG. 9).
  • the preview providing module 153 does not use the entire graphic image to which the special effect object is moved in order to reduce the resources of the image editing apparatus 100 required when displaying the preview image. Instead, the preview image may be displayed on the display unit 110 using some images expressing a special effect.
  • FIGS. 3A and 3B are flowcharts illustrating a method of performing image editing through an image editing interface in an image editing apparatus according to an embodiment of the present invention.
  • the editing interface providing unit 150 displays the video editing interface including the video and the editing tool on the display 110 (S301).
  • the image to be edited may be an image already stored in the storage 120, an image captured in real time through the camera 130, or an image received from the image providing server 200.
  • FIG. 4 is a diagram illustrating an image editing interface according to an embodiment of the present invention.
  • the editing interface providing unit 150 displays on the display unit 110 an image editing interface that includes the video selected by the user for editing, a timeline 22 representing the playback time of the video, a playback control bar 23, a playback button 21, and editing tools 24a, 24b, and 24c capable of synthesizing various special effects.
  • the playback control bar 23 is a graphic element representing the playback point that is currently playing. When the user changes the position of the playback control bar 23 on the timeline 22, the video is output from the playback point at which the playback control bar 23 is located. Among the special effect tools of FIG. 4, the star-shaped special effect tool 24a is an editing tool used to synthesize a halo effect for emphasizing a specific object, that is, an effect of light shining at the position of the specific object.
  • reference numeral 24b of FIG. 4 denotes a tool used to apply computer graphics processing to a specific object, and reference numeral 24c denotes an editing tool used to synthesize text input by the user (or pre-registered text) with the image.
  • Each special effect has a minimum duration (eg, 0.5 seconds) and also has a default duration (eg, 2 seconds).
  • each special effect has an animation effect that moves in a constant direction.
  • the image editing interface providing unit 150 receives selection information regarding any one of a plurality of editing tools listed in the image editing interface from the user (S303).
  • the special effect processing module 151 checks the special effect associated with the editing tool selected by the user, extracts the object for that special effect from the storage unit 120, outputs the object to the image editing interface, and receives from the user the initial position at which the special effect object is to be expressed (S305).
  • the special effect processing module 151 receives from the user information about the playback section in which the special effect is to be displayed among the entire playback section of the image, and confirms that playback section as the editing section (S307). That is, the video editing interface providing unit 150 sets the playback section in which the special effect is expressed, among the entire playback section of the video, as the editing section.
  • FIG. 5 is a diagram illustrating a state where an expression position and a playback section for a special effect are designated in an image editing interface according to an embodiment of the present invention.
  • the initial position of the special effect object 25 is set at the face of a person.
  • the user may set the initial position of the special effect object 25 in the editing section by touching the object and dragging it to the face.
  • the special effect of FIG. 5 illustrates the halo effect in which light shines.
  • the video editing interface of FIG. 5 sets a playback section in which the special effect is synthesized with the original video according to a user's manipulation.
  • the video editing interface may set the point at which video playback is paused, that is, the point where the playback control bar 23 is located, as the start point at which the special effect appears, and may set the point obtained by adding the default playback time of the special effect to the start point as the end point at which the special effect ends.
  • for example, in an image having a total playback time of 2 minutes, if the halo effect having a default playback time of 2 seconds starts at 1 minute 00 seconds, the video editing interface providing unit 150 sets the editing section to 1 minute 00 seconds to 1 minute 02 seconds (see the sketch below).
  • the start point control icon 26a and the end point control icon 26b located on the timeline 22 represent the start point and the end point at which the special effect is played, respectively, and the playback section corresponding to the portion of the timeline between the start point control icon 26a and the end point control icon 26b becomes the editing section.
  • the image editing interface providing unit 150 can change the start point at which the special effect is displayed. That is, when the user drags the start point control icon 26a to a new position on the timeline 22, the image editing interface providing unit 150 changes the point in time at which the special effect is expressed within the entire playback time of the image. In this case, the image editing interface providing unit 150 also moves the end point control icon 26b by the same amount that the start point control icon 26a was moved, so that the positions of the two icons change together.
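  • A minimal sketch of how the editing section could be derived from the paused playback point and the effect's default playback time; the function name and the clamping to the total playback time are assumptions, while the numbers follow the example above:

```python
def compute_editing_section(start_point: float, default_duration: float,
                            total_duration: float, min_duration: float = 0.5) -> tuple:
    """Return (start, end) of the editing section in seconds.

    The start point is where the playback control bar is paused; the end point is the
    start point plus the effect's default playback time, clamped to the video length."""
    end_point = min(start_point + default_duration, total_duration)
    if end_point - start_point < min_duration:        # respect the effect's minimum duration
        start_point = max(0.0, end_point - min_duration)
    return start_point, end_point

# A 2-second halo effect starting at 1 min 00 s in a 2-minute video
print(compute_editing_section(60.0, 2.0, 120.0))      # (60.0, 62.0)
```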
  • the special effect processing module 151 may receive a special effect adjustment signal from a user who wants to fine-tune the special effect (S309).
  • the special effect processing module 151 outputs a plurality of icons 27a and 27b for controlling the special effect when the special effect object 25 is clicked in the video editing interface, and when the user clicks the adjustment icon 27a among these icons, it recognizes that a special effect adjustment signal has been received from the user.
  • FIG. 6 is a diagram illustrating a state in which a special effect adjustment icon is output in an image editing interface according to an embodiment of the present invention.
  • when the special effect object 25 is clicked in the video editing interface, the special effect processing module 151 outputs a plurality of icons 27a and 27b for controlling the special effect.
  • the adjustment icon 27a is an icon used to finely adjust the position of the special effect object, and the trash can icon 27b is an icon used to remove the special effect.
  • when the adjustment icon 27a is clicked, the special effect processing module 151 recognizes that a special effect adjustment signal has been received from the user.
  • the special effect processing module 151 activates the position adjusting module 152 when a special effect adjustment signal is received from the user, and the position adjusting module 152 captures a predetermined number of images (scenes) from the image of the editing section (S311).
  • the position adjustment module 152 may divide the image of the editing section by a predetermined unit time, and capture an image for each of the divided images.
  • the position adjusting module 152 may capture a plurality of images having a large degree of scene change among the images constituting the image of the editing section.
  • that is, a predetermined number of images in which the movement of the object changes greatly may be captured in the editing section.
  • the position adjustment module 152 checks the position of the special effect object in each of the captured images (S313). At this time, the position adjustment module 152 can determine the position at which the special effect object is expressed in each captured image based on the initial position of the special effect object received from the user, the playback time of the editing section, and the default movement direction and speed of the special effect object (see the sketch below).
  • the position adjusting module 152 generates, for each captured image, a special effect scene in which the special effect object is expressed at the identified position, and outputs to the display unit 110 an effect position adjusting interface in which the generated special effect scenes are arranged in separate regions of the screen (S315).
  • the special effect object for each scene is used as basic data to interpolate the special effect animation.
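  • As a sketch of step S313 under stated assumptions (a straight-line default motion and a pixels-per-second velocity are illustrative choices, not taken from the publication), the default position of the special effect object in each captured scene can be computed from the user-supplied initial position, each scene's playback time, and the effect's default movement direction and speed:

```python
from typing import List, Tuple

def default_object_positions(initial_pos: Tuple[float, float],
                             scene_times: List[float],
                             section_start: float,
                             velocity: Tuple[float, float]) -> List[Tuple[float, float]]:
    """For each captured scene time, place the effect object along its default motion path.

    initial_pos : (x, y) entered by the user at the start of the editing section
    scene_times : playback times (s) of the captured scenes
    velocity    : default movement of the effect object in pixels per second (dx, dy)"""
    positions = []
    for t in scene_times:
        elapsed = t - section_start
        positions.append((initial_pos[0] + velocity[0] * elapsed,
                          initial_pos[1] + velocity[1] * elapsed))
    return positions

# Four scenes captured across a 2-second editing section starting at 60 s
print(default_object_positions((120, 80), [60.0, 60.7, 61.3, 62.0], 60.0, (10.0, 5.0)))
```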
  • FIG. 7 is a diagram illustrating an effect position adjustment interface according to an embodiment of the present invention.
  • the position adjusting module 152 may output to the display unit 110 an effect position adjusting interface including four special effect scenes, based on the images corresponding to the editing section and the identified positions of the special effect object.
  • the number displayed in the upper left represents a playback order for the scene in the editing section.
  • some special effect objects are not synthesized as intended by the user, but are displayed at the wrong position.
  • the presentation position of the special effect object is not positioned at the position intended by the user, that is, at the face image border, but is located elsewhere.
  • in FIG. 7, the halo effect is displayed on the face image border only in the first scene, and at the subsequent playback times the halo effect appears at unintended positions. If the special effect (i.e., the halo effect) and the original image were synthesized in the state shown in FIG. 7, the special effect would be displayed at the location intended by the user only near the first playback time, and thereafter the effect would not be expressed at the intended location.
  • the position adjusting module 152 receives a repositioning signal of the special effect for one or more of the special effect scenes included in the effect position adjusting interface (S317). That is, the position adjusting module 152 receives a movement signal of at least one special effect object from a plurality of special effect scenes included in the effect position adjusting interface. In this case, the position adjustment module 152 may receive a movement signal of the special effect object from the user through touch drag.
  • FIG. 8 is a diagram illustrating a state in which a special effect object is repositioned in the effect position adjusting interface according to an embodiment of the present invention.
  • the position of the special effect object in FIG. 8 may be compared with that in FIG. 7. That is, in the effect position adjustment interface of FIG. 8, it can be seen that the positions of the special effect objects in the second, third, and fourth special effect scenes have been repositioned near the face border according to the user's input signal.
  • the image editing apparatus 100 provides an effect position adjusting interface for precisely adjusting the special effects, and allows the display position of the special effect object to be adjusted by the user in each special effect scene through the effect position adjusting interface.
  • the position adjustment module 152 checks the position of the special effect object for each scene and transmits it to the special effect processing module 151 (S319). At this time, the position adjustment module 152 checks the playback time indicated by each scene and transmits it to the special effect processing module 151.
  • the special effect processing module 151 generates the composite data including the reproduction time of each scene, the display position of the special effect object for each scene, the reproduction time of the special effect, and the special effect identification information (for example, a halo effect) (S321).
  • if a special effect adjustment signal is not received from the user in step S309, the special effect processing module 151 generates composite data including the initial position information of the special effect set by the user, the playback time of the special effect, and the special effect identification information.
  • the user may select the endpoint control icon 26b corresponding to the end point of the editing section on the image editing interface. That is, the image editing interface providing unit 150 may receive a click signal on the endpoint control icon 26b corresponding to the end point of the editing section from a user who wants to check a preview of the composite image.
  • then, the preview providing module 153 generates a preview image by synthesizing the original image corresponding to the editing section with the special effect, extracts from the generated preview image the partial image (or frame) corresponding to the playback time at which the endpoint control icon 26b is located, and displays it in a predetermined area on the image editing interface, thereby providing the synthesized preview to the user (S325).
  • the preview providing module 153 outputs the preview image on the sub screen.
  • when the position of the endpoint control icon 26b is changed, the preview providing module 153 checks the playback time corresponding to the changed position and displays the preview image (or frame) of that playback time and the corresponding section on the sub screen.
  • specifically, the preview providing module 153 places the special effect object at the designated position for each scene based on the composite data generated in step S321, animates the arranged special effect objects through an animation interpolation technique, synthesizes the animated special effect with the original image, and displays the synthesized image on the sub screen as a preview image.
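  • The animation interpolation step can be illustrated by linearly interpolating between the per-scene positions recorded in the composite data; linear interpolation is an assumption here, since the publication only refers to an animation interpolation technique:

```python
from bisect import bisect_right
from typing import Dict, Tuple

def interpolate_position(positions_by_scene: Dict[float, Tuple[float, float]],
                         t: float) -> Tuple[float, float]:
    """Interpolate the special effect object's position at playback time t
    from the per-scene key positions stored in the composite data."""
    times = sorted(positions_by_scene)
    if t <= times[0]:
        return positions_by_scene[times[0]]
    if t >= times[-1]:
        return positions_by_scene[times[-1]]
    i = bisect_right(times, t)
    t0, t1 = times[i - 1], times[i]
    (x0, y0), (x1, y1) = positions_by_scene[t0], positions_by_scene[t1]
    w = (t - t0) / (t1 - t0)
    return (x0 + (x1 - x0) * w, y0 + (y1 - y0) * w)

# Per-scene positions set through the effect position adjustment interface (illustrative values)
keys = {60.0: (120, 80), 60.7: (200, 90), 61.3: (210, 95), 62.0: (215, 100)}
print(interpolate_position(keys, 61.0))   # position half-way between the second and third scenes
```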
  • FIG. 9 is a diagram illustrating an image editing interface on which a preview image is output, according to an embodiment of the present invention.
  • when the endpoint control icon 26b is selected in the image editing interface, the preview providing module 153 generates a preview image 28 synthesized with the special effect and outputs the preview image 28 to the sub screen of the image editing interface.
  • the preview providing module 153 checks a playback time corresponding to the position of the endpoint control icon 26b in the editing section, and outputs a part of the preview image corresponding to the playback time.
  • the preview providing module 153 checks the playback time at which the end point control icon 26b is located and displays the partial image (or partial video) of the preview image corresponding to that playback time. Accordingly, the user can check the edited image simply by moving the endpoint control icon 26b.
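  • A minimal sketch of the mapping between a control icon's position on the timeline and the preview frame that is shown; the pixel-based timeline geometry and the preview frame rate are assumptions for illustration:

```python
def playback_time_from_timeline(icon_x: float, timeline_x: float,
                                timeline_width: float, total_duration: float) -> float:
    """Map the horizontal position of a control icon on the timeline to a playback time in seconds."""
    ratio = min(max((icon_x - timeline_x) / timeline_width, 0.0), 1.0)
    return ratio * total_duration

def preview_frame_index(playback_time: float, fps: float) -> int:
    """Index of the preview frame to display for the given playback time."""
    return int(playback_time * fps)

# Endpoint icon dragged to the middle of a 600-pixel timeline of a 2-minute video previewed at 4 fps
t = playback_time_from_timeline(icon_x=400, timeline_x=100, timeline_width=600, total_duration=120.0)
print(t, preview_frame_index(t, fps=4))   # 60.0 seconds -> frame 240
```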
  • when the special effect processing module 151 receives an edit end signal from the user (S327), it generates the edit history data including the composite data generated in step S321 and the identification information of the original video, and stores it in the storage unit 120 (S329).
  • the special effect processing module 151 may transmit the edit history data stored in the storage 120 to the image providing server 200 to register the edit history data of the user in the image providing server 200.
  • when the original image is an image stored only in the storage unit 120 of the image editing apparatus 100, the special effect processing module 151 transmits the original image itself to the image providing server 200, thereby also registering the original image with the image providing server 200.
  • the image editing apparatus 100 may display the preview image using some images expressing a special effect in order to reduce resources consumed when processing the preview image.
  • FIG. 10 is a flowchart illustrating a method of generating a preview image in an image editing apparatus according to an embodiment of the present invention.
  • the preview providing module 153 checks an image belonging to an editing section among all reproduction sections of the original video (S1001).
  • the preview providing module 153 checks the special effects set by the user in the editing section, and extracts the image and audio of the low quality special effect animation implementing the special effects from the storage 120 (S1003). That is, the preview providing module 153 extracts a special effect image having a low frame rate per second and audio having a low bit rate compared to the special effect animation and audio stored in the image providing server 200.
  • the preview providing module 153 combines the special effect animation images and the audio with the image corresponding to the editing section (S1005).
  • specifically, the preview providing module 153 displays the special effect animation images at the corresponding playback times and mixes the special effect audio with the audio of the original image, so that the special effect images and audio are combined with the video of the editing section.
  • the preview providing module 153 may adjust the position of each image (i.e., the object image) of the special effect animation based on the position of the special effect object for each scene recorded in the composite data.
  • the preview providing module 153 sets an overlap scene transition effect between consecutive special effect object images so that the images of the special effect objects are connected smoothly (S1009). That is, when the second special effect object image is to be displayed while the first special effect object image is still displayed, the preview providing module 153 gradually fades out the first special effect object image while gradually fading in the second special effect object image, so that the two images transition into each other naturally.
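  • A sketch of this overlap transition (S1009) as a simple alpha crossfade between two consecutive special effect object images; the use of NumPy arrays as the image representation is an assumption for illustration:

```python
import numpy as np

def overlap_transition(first_obj: np.ndarray, second_obj: np.ndarray, steps: int) -> list:
    """Crossfade: the first special effect object image gradually fades out
    while the second gradually fades in, so the two connect smoothly."""
    frames = []
    for k in range(steps + 1):
        alpha = k / steps            # 0.0 -> only the first image, 1.0 -> only the second image
        blended = (1.0 - alpha) * first_obj.astype(np.float32) + alpha * second_obj.astype(np.float32)
        frames.append(blended.astype(first_obj.dtype))
    return frames

# Two 64x64 RGBA object images from the low-quality (e.g. 4 fps) effect animation
a = np.zeros((64, 64, 4), dtype=np.uint8)
b = np.full((64, 64, 4), 255, dtype=np.uint8)
transition_frames = overlap_transition(a, b, steps=8)
print(len(transition_frames))        # 9 blended frames
```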
  • the preview providing module 153 displays on the display unit 110 a preview image in which the special effect object is expressed on the original image at each playback time.
  • the preview image is displayed on the sub screen, and it may also be displayed on the full screen of the image editing apparatus 100.
  • FIG. 11 is a flowchart illustrating a method of registering edit history data in a video editing system and providing a synthesized image to a communication device based on the registered edit history data according to an embodiment of the present invention.
  • the editing interface providing unit 150 of the video editing apparatus 100 receives from the user a selection of a video to be edited and displays the video editing interface including the video and the editing tool on the display unit 110 (S1101).
  • the special effect processing module 151 receives editing information such as the position of the special effect object, the playback section of the special effect, and the special effect identification information from the user through the video editing interface (S1103), and generates composite data based on this information.
  • the position adjusting module 152 may receive a reassigned position of the special effect object from the user through the effect position adjusting interface.
  • the preview providing module 153 may generate and display a preview image synthesized with a special effect.
  • when the user finishes editing the video, the special effect processing module 151 generates edit history data including the composite data and the identification information of the original video (S1105).
  • the special effect processing module 151 transmits the generated edit history data to the image providing server 200 by using the communication unit 140 (S1107).
  • the special effect processing module 151 may transmit the original image itself (that is, the original image file) to the image providing server 200.
  • the image providing server 200 stores the edit history data received from the image editing apparatus 100 (S1109).
  • when the image providing server 200 receives the image source file from the image editing apparatus 100, it stores the image source file and the edit history data together.
  • the image providing server 200 may receive an image request message for requesting the image from the communication device 400 (S1111). Then, the image providing server 200 extracts the edit history data of the image (S1113). In this case, the image providing server 200 receives image identification information from the communication device 400 and extracts editing history data including the image identification information.
  • the image providing server 200 extracts an original image based on the image identification information, and combines the original image and a special effect based on the edit history data (S1115).
  • specifically, the image providing server 200 selects the special effect image to be synthesized based on the special effect identification information recorded in the edit history data, and identifies the section of the original video in which the special effect is to be synthesized based on the playback time of the special effect recorded in the edit history data.
  • in addition, based on the scene-specific special effect object positions recorded in the edit history data, the image providing server 200 identifies, among the frame images of the original image belonging to the playback section of the special effect (i.e., the editing section), the frame image corresponding to each scene.
  • the image providing server 200 arranges the special effect object at a designated position (that is, the special effect object position) in each of the checked frame images, and animates the special effect object through an animation interpolation technique.
  • the image providing server 200 synthesizes the animated special effect object and the original image.
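  • A high-level sketch of this server-side synthesis (S1113 to S1117); all helper callables (`load_video`, `load_effect_frames`, `composite_frame`, `position_at`) and the `history` structure are assumptions for illustration and do not come from the publication:

```python
def synthesize_from_history(history, load_video, load_effect_frames, composite_frame, position_at):
    """Re-create the edited video on the server from the edit history data alone.

    history            : edit history data (composite data + identification of the original image)
    load_video         : original_image_id -> iterable of (playback_time, frame)
    load_effect_frames : effect_id -> list of effect object images (full-quality animation)
    composite_frame    : (frame, effect_image, (x, y)) -> synthesized frame
    position_at        : (positions_by_scene, playback_time) -> interpolated (x, y) of the effect object"""
    frames = load_video(history.original_image_id)
    effect_frames = load_effect_frames(history.composite.effect_id)
    start, end = history.composite.effect_start, history.composite.effect_end

    output = []
    for t, frame in frames:
        if start <= t <= end and effect_frames:
            # choose the effect animation frame for this moment within the effect's playback section
            idx = min(int((t - start) / (end - start + 1e-9) * len(effect_frames)),
                      len(effect_frames) - 1)
            pos = position_at(history.composite.positions_by_scene, t)
            frame = composite_frame(frame, effect_frames[idx], pos)
        output.append(frame)
    return output
```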
  • the image providing server 200 transmits the synthesized image to the communication device 400 (S1117).
  • the communication device 400 confirms the video edited by the video editing apparatus 100 by outputting the video received from the video providing server 200 to the screen.
  • the method of the present invention as described above may be implemented as a program and stored in a recording medium (CD-ROM, RAM, ROM, floppy disk, hard disk, magneto-optical disk, etc.) in a computer-readable form. Since this process can be easily implemented by those skilled in the art, it will not be described in further detail.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Television Signal Processing For Recording (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to an image editing method for providing an image editing interface that a user can use conveniently, and to an apparatus therefor. An image editing apparatus according to the present invention comprises: a display unit for displaying an image editing interface including an image to be edited and an editing tool; a position adjustment module for identifying an editing section of the entire playback section of the image in which a special effect is synthesized, selecting a predetermined number of scenes of the editing section in which a special effect object is expressed and displaying the selected scenes on the display unit, and, when the special effect object is moved by the user in the displayed scenes, resetting the position of the special effect object in those scenes; and a special effect processing module for generating edit history data that includes the position of the special effect object in each scene.
PCT/KR2015/000532 2014-02-14 2015-01-19 Procédé d'édition d'image et appareil correspondant WO2015122627A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140017293A KR101528312B1 (ko) 2014-02-14 2014-02-14 영상 편집 방법 및 이를 위한 장치
KR10-2014-0017293 2014-02-14

Publications (1)

Publication Number Publication Date
WO2015122627A1 true WO2015122627A1 (fr) 2015-08-20

Family

ID=53503311

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/000532 WO2015122627A1 (fr) 2014-02-14 2015-01-19 Procédé d'édition d'image et appareil correspondant

Country Status (2)

Country Link
KR (1) KR101528312B1 (fr)
WO (1) WO2015122627A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110290410A (zh) * 2019-07-31 2019-09-27 安徽华米信息科技有限公司 影像位置调节方法、装置、系统及调节信息生成设备
CN111105344A (zh) * 2018-10-26 2020-05-05 北京微播视界科技有限公司 图像处理方法、装置、电子设备及计算机可读存储介质
CN113018867A (zh) * 2021-03-31 2021-06-25 苏州沁游网络科技有限公司 一种特效文件的生成、播放方法、电子设备及存储介质
WO2022262485A1 (fr) * 2021-06-18 2022-12-22 腾讯科技(深圳)有限公司 Procédé et appareil de réglage de position pour commandes de fonctionnement et terminal et support de stockage

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9852768B1 (en) 2016-06-03 2017-12-26 Maverick Co., Ltd. Video editing using mobile terminal and remote computer
US9773524B1 (en) 2016-06-03 2017-09-26 Maverick Co., Ltd. Video editing using mobile terminal and remote computer
KR102005429B1 (ko) * 2016-09-12 2019-07-30 이규상 상호 연관된 광고를 제공하는 장치 및 방법
CN106385591B (zh) * 2016-10-17 2020-05-15 腾讯科技(上海)有限公司 视频处理方法及视频处理装置
CN107743212A (zh) * 2017-09-28 2018-02-27 努比亚技术有限公司 一种视频处理方法、移动终端及计算机可读存储介质
CN111629252B (zh) * 2020-06-10 2022-03-25 北京字节跳动网络技术有限公司 视频处理方法、装置、电子设备及计算机可读存储介质
EP4192023A4 (fr) 2021-01-12 2024-02-14 Samsung Electronics Co Ltd Dispositif électronique, procédé et support de stockage non transitoire pour édition vidéo
KR20220102013A (ko) * 2021-01-12 2022-07-19 삼성전자주식회사 영상 편집을 위한 전자 장치, 방법 및 비 일시적 저장 매체

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10322647A (ja) * 1997-05-14 1998-12-04 Matsushita Electric Ind Co Ltd 動画エディットシステム
JP2006140867A (ja) * 2004-11-15 2006-06-01 Canopus Co Ltd 映像編集装置およびその方法
KR20120050689A (ko) * 2010-11-11 2012-05-21 주식회사 케이티 사용자 제작 콘텐츠를 생성하는 방법 및 장치

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111105344A (zh) * 2018-10-26 2020-05-05 北京微播视界科技有限公司 图像处理方法、装置、电子设备及计算机可读存储介质
CN110290410A (zh) * 2019-07-31 2019-09-27 安徽华米信息科技有限公司 影像位置调节方法、装置、系统及调节信息生成设备
CN110290410B (zh) * 2019-07-31 2021-10-29 合肥华米微电子有限公司 影像位置调节方法、装置、系统及调节信息生成设备
CN113018867A (zh) * 2021-03-31 2021-06-25 苏州沁游网络科技有限公司 一种特效文件的生成、播放方法、电子设备及存储介质
WO2022262485A1 (fr) * 2021-06-18 2022-12-22 腾讯科技(深圳)有限公司 Procédé et appareil de réglage de position pour commandes de fonctionnement et terminal et support de stockage

Also Published As

Publication number Publication date
KR101528312B1 (ko) 2015-06-11

Similar Documents

Publication Publication Date Title
WO2015122627A1 (fr) Procédé d'édition d'image et appareil correspondant
JP4882288B2 (ja) 表示制御装置、システム及び表示制御方法
WO2016048024A1 (fr) Appareil d'affichage et procédé d'affichage correspondant
CN106804000A (zh) 直播回放方法及装置
EP3024223B1 (fr) Terminal de visioconférence, procédé d'accès à des données de flux secondaire et support de stockage informatique
US20240062443A1 (en) Video sharing method and apparatus, device, and medium
CN106375775A (zh) 虚拟礼物展示方法及装置
US20060023949A1 (en) Information-processing apparatus, information-processing method, recording medium, and program
CN103813126A (zh) 进行视频通话时提供用户感兴趣信息的方法及其电子装置
WO2006011399A1 (fr) Dispositif et procede de traitement d’informations, support d’enregistrement et programme
WO2022237571A1 (fr) Procédé et appareil de fusion d'image, dispositif électronique et support de stockage
TW201225647A (en) Remote control of television displays
WO2021083145A1 (fr) Procédé et dispositif de traitement vidéo, terminal et support de stockage
CN110740261A (zh) 视频录制方法、装置、终端及存储介质
WO2015041402A1 (fr) Appareil d'affichage d'image, son procédé de commande, et procédé d'affichage d'image
JP2023538061A (ja) 検索コンテンツのマッチング方法、装置、電子機器および記憶媒体
WO2015122628A1 (fr) Procédé d'ajustement de ligne temporelle pour édition d'image et appareil d'édition d'image
EP4057633A1 (fr) Procédé et appareil de traitement vidéo, et dispositif terminal
CN106954093B (zh) 全景视频处理方法、装置及系统
KR102138835B1 (ko) 정보 노출 방지 영상 제공 장치 및 방법
WO2015190780A1 (fr) Terminal utilisateur et son procédé de commande
JP6355319B2 (ja) 再生装置及びその制御方法、管理装置及びその制御方法、映像再生システム、並びにプログラム及び記憶媒体
KR102417084B1 (ko) 멀티채널 미디어 송수신 방법 및 시스템
JP2012156726A (ja) 情報処理装置、情報処理方法及びプログラム
JP2008090526A (ja) 会議情報保存装置、システム、会議情報表示装置及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15748586

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 07.12.2016)

122 Ep: pct application non-entry in european phase

Ref document number: 15748586

Country of ref document: EP

Kind code of ref document: A1