KR101776674B1 - Apparatus for editing video and the operation method - Google Patents

Apparatus for editing video and the operation method

Info

Publication number
KR101776674B1
Authority
KR
South Korea
Prior art keywords
editing
display area
image
user
edit
Prior art date
Application number
KR1020160013377A
Other languages
Korean (ko)
Other versions
KR20170092260A (en)
Inventor
박영민
Original Assignee
(주)바이널씨
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)바이널씨 filed Critical (주)바이널씨
Priority to KR1020160013377A priority Critical patent/KR101776674B1/en
Publication of KR20170092260A publication Critical patent/KR20170092260A/en
Application granted granted Critical
Publication of KR101776674B1 publication Critical patent/KR101776674B1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • H04N5/232
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272Means for inserting a foreground image in a background image, i.e. inlay, outlay

Abstract

According to an embodiment of the present invention, a portable image editing apparatus and an operation method thereof are provided. The method includes generating a first editing screen, the first editing screen including a first image display area and an editing target display area; displaying, in real time in the first image display area, first image information generated by capturing an external image according to a camera photographing input of a user; and adding the first image information as an editing target by displaying a representative image corresponding to the first image information in the editing target display area when the photographing is completed, wherein the displaying step and the adding step may be repeated at least once according to the camera photographing input of the user.

Description

APPARATUS FOR EDITING VIDEO AND THE OPERATION METHOD

The present invention relates to a portable image editing apparatus and an operation method thereof, and more particularly, to an image editing apparatus and an operation method thereof that can provide a more intuitive and easy-to-use user interface for image editing through a mobile device.

Recently, as mobile terminals such as smartphones have spread throughout the world, software technology, including applications for mobile games and messengers, is changing greatly around such smartphones. Video editing technology is no exception, and mobile video editing applications are being released. Such a video editing application is installed on a mobile terminal such as a smartphone, arranges a plurality of videos to be edited from left to right along a timeline, inserts special effects between the clips, and produces, through this editing process, a final movie file in which the plurality of video clips are merged.

However, because the screen of a smartphone or similar device is small, a conventional mobile video editing application must repeatedly switch between multiple screens in order to provide interfaces for adding a video to be edited, inserting a special effect, and so on, and the user is required to perform many inputs, which can be inconvenient and cumbersome.

Therefore, there is a need for a technique that allows a user to perform editing on moving images more intuitively and easily.

SUMMARY OF THE INVENTION It is an object of the present invention to provide a portable image editing apparatus and a method of operating the same that can provide a user with a more intuitive and easy-to-use user interface for image editing through a mobile device.

The technical problems of the present invention are not limited to the above-mentioned technical problems, and other technical problems which are not mentioned can be understood by those skilled in the art from the following description.

According to an embodiment of the present invention, a method of operating an image editing apparatus is provided. The method includes generating a first editing screen, the first editing screen including a first image display area and an editing target display area; displaying, in real time in the first image display area, first image information generated by capturing an external image according to a camera photographing input of a user; and adding the first image information as an editing target by displaying a representative image corresponding to the first image information in the editing target display area when the photographing is completed, wherein the displaying step and the adding step may be repeated at least once according to the camera photographing input of the user.

According to an embodiment of the present invention, there is provided a computer-readable recording medium on which a program for performing a method of operating an image editing apparatus is recorded.

According to an embodiment of the present invention, an image editing apparatus is provided. The apparatus comprises: a photographing unit for photographing an external image according to a camera photographing input of a user to generate first image information; a control unit that generates a first editing screen including a first image display area and an editing target display area, displays in real time, in the first image display area, the first image information generated by the photographing of the photographing unit, displays a representative image corresponding to the first image information in the editing target display area when the photographing is completed, and adds the first image information as an editing target; and a display unit for displaying the first editing screen generated by the control unit, wherein the control unit repeats the display in the first image display area and the addition as an editing target at least once according to the camera photographing input of the user.

According to the present invention, it is possible to minimize the need for screen switching and user input in the process of editing an image, thereby increasing the convenience of the user.

Further, according to the present invention, convenience in use can be further increased by providing an intuitive user interface capable of simultaneously editing a plurality of special effects on one screen.

BRIEF DESCRIPTION OF THE DRAWINGS A brief description of each drawing is provided for a fuller understanding of the drawings recited in the detailed description of the invention.
FIG. 1 illustrates an image editing apparatus according to an embodiment of the present invention.
FIG. 2 illustrates an operation method of an image editing apparatus according to an embodiment of the present invention.
FIG. 3 illustrates an embodiment of step S210 of FIG. 2.
FIG. 4 illustrates an embodiment of step S220 of FIG. 2.
FIG. 5 illustrates an embodiment of step S230 of FIG. 2.
FIG. 6 illustrates an operation example of an image editing apparatus according to an embodiment of the present invention.
FIG. 7 illustrates an operation example of an image editing apparatus according to an embodiment of the present invention.
FIG. 8 illustrates an operation example of an image editing apparatus according to an embodiment of the present invention.

Hereinafter, embodiments according to the present invention will be described with reference to the accompanying drawings. It should be noted that, in adding reference numerals to the constituent elements of the drawings, the same constituent elements are denoted by the same reference numerals whenever possible, even if they are shown in different drawings. In the following description of the embodiments of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may unnecessarily obscure the subject matter of the present invention. In addition, embodiments of the present invention will be described below, but the technical idea of the present invention is not limited thereto and can be variously modified by those skilled in the art.

Throughout the specification, when a part is referred to as being "connected" to another part, this includes not only being "directly connected" but also being "indirectly connected" with another element interposed therebetween. Throughout the specification, when an element is referred to as "comprising" a component, it means that the element can include other components as well, without excluding them, unless specifically stated otherwise. In describing the components of the embodiments of the present invention, terms such as first, second, A, B, (a), and (b) may be used. These terms are intended only to distinguish one constituent element from another, and they do not limit the nature, sequence, or order of the constituent elements.

FIG. 1 illustrates an image editing apparatus according to an embodiment of the present invention.

Referring to FIG. 1, the image editing apparatus 100 may include a communication unit 110, a photographing unit 120, a display unit 130, an input unit 140, a storage unit 150, and a control unit 160.

The communication unit 110 is provided for direct connection with the outside or connection via a network, and may be a wired and/or wireless communication unit. Specifically, the communication unit 110 may transmit data from the control unit 160, the storage unit 150, and the like by wire or wirelessly, or may receive data from the outside by wire or wirelessly and transfer the data to the control unit 160 or store it in the storage unit 150. The communication unit 110 may use technologies such as LAN, Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), Wireless Broadband Internet (WiBro), Radio Frequency (RF) communication, wireless LAN, Wireless Fidelity (Wi-Fi), Near Field Communication (NFC), Bluetooth, and infrared communication. However, the present invention is not limited thereto, and various wired and wireless communication technologies applicable in the art may be used according to the embodiment to which the present invention is applied.

The photographing unit 120 can photograph the outside and generate image information such as a moving image. The image information generated by the photographing unit 120 may be stored in the storage unit 150 or may be transmitted to the controller 160. Thereafter, the controller 160 may perform a series of processes of adding the transmitted image information as an editing target. The photographing unit 120 may be embodied as a conventional camera module known in the art, but the present invention is not limited thereto. Various constructions may be used according to the embodiment to which the present invention is applied.

The display unit 130 displays images or data provided from the control unit 160 on the screen. For example, the display unit 130 may display a user interface (e.g., a first editing screen, a second editing screen, and a third editing screen) for receiving user input required for the operation of the image editing apparatus 100. The display unit 130 may be formed of a liquid crystal display (LCD), an organic light emitting diode (OLED), an active matrix organic light emitting diode (AMOLED), or the like, and visually provides the user with menus of the apparatus, input data, function setting information, and various other information.

The input unit 140 receives user input for controlling the image editing apparatus 100, generates an input signal, and provides the input signal to the control unit 160. The input unit 140 may be implemented as a keyboard, a mouse, a gyro sensor, a voice recognition sensor, a keypad, a dome switch, a touch pad, a touch screen, a jog wheel, a jog switch, or the like.

The storage unit 150 may store various image information as well as various data associated with the image editing process performed by the image editing apparatus 100. The storage unit 150 may be implemented as a hard disk drive (HDD), a read-only memory (ROM), a random access memory (RAM), an electrically erasable and programmable read-only memory (EEPROM), a flash memory, a Compact Flash (CF) card, a Secure Digital (SD) card, a Smart Media (SM) card, a Multimedia Card (MMC), a Memory Stick, or the like, and may be provided in the image editing apparatus 100 or in a separate apparatus.

The control unit 160 may control the overall operation of the image editing apparatus 100 and the signal flow between its internal components, and may perform data processing. In particular, in the present invention, the control unit 160 may provide image editing functions such as addition of editing targets, editing of playback sections, insertion of special effects, and merging of one or more pieces of image information.

First, the control unit 160 may add at least one piece of image information as an editing target. Specifically, when the user generates image information (i.e., first image information) by photographing the outside through the photographing unit 120, or selects at least one piece of previously stored image information (i.e., second image information), the control unit 160 can add it as an editing target by inserting a representative image of the image information into the editing target display area.

In addition, the control unit 160 can edit the playback section of at least one of the editing targets. For example, when the user selects at least one of the representative images displayed in the editing target display area, the control unit 160 sets an editing section for each of the corresponding editing targets in accordance with user input, and can edit the playback section of each editing target by extracting the frames belonging to the editing section.

In addition, the control unit 160 may perform editing of special effects. Specifically, when a user's request to insert at least one special effect is input, the control unit 160 displays at least one effect editing area in response to the request and performs special-effect editing by setting a part of the playback section of the editing target as an insertion section for each of the special effects.

The control unit 160 can merge the playback sections of the editing targets into one to generate integrated image information. Specifically, the control unit 160 sets the merging order of the editing targets on the basis of the sorting order of the representative images displayed in the editing target display area, merges the corresponding editing targets in accordance with the merging order, and generates the integrated image information.
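
For illustration only, the sketch below models how the editing targets managed by the control unit 160 might be represented and reordered; the names (Frame, EditTarget, EditTargetList) are hypothetical, and the patent does not prescribe any particular data structure.

```kotlin
// Hypothetical, simplified data model for editing targets.
data class Frame(val timestampMs: Long, val pixels: ByteArray = ByteArray(0))

data class EditTarget(
    val frames: List<Frame>,          // frames of the clip (first or second image information)
    val representativeImage: Frame    // image shown in the editing target display area
) {
    val durationMs: Long
        get() = if (frames.isEmpty()) 0L
                else frames.last().timestampMs - frames.first().timestampMs
}

class EditTargetList {
    private val targets = mutableListOf<EditTarget>()

    // Adding an editing target: a representative image is taken from the clip's
    // frames (here simply the first frame) and the target is appended in order.
    fun add(frames: List<Frame>): EditTarget {
        require(frames.isNotEmpty()) { "an editing target needs at least one frame" }
        val target = EditTarget(frames, representativeImage = frames.first())
        targets.add(target)
        return target
    }

    // Reordering the representative images (e.g. by drag and drop) changes the
    // later merging order.
    fun move(from: Int, to: Int) {
        val target = targets.removeAt(from)
        targets.add(to, target)
    }

    fun asList(): List<EditTarget> = targets.toList()
}
```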

The operation of the control unit 160 will be described in more detail with reference to FIGS. 2 to 6.

FIG. 2 illustrates an operation method of an image editing apparatus according to an embodiment of the present invention.

In step S210, the control unit 160 may add at least one piece of image information as an editing target. Here, the image information refers to a moving image composed of a combination of a plurality of frames, and may include at least one piece of first image information generated by photographing the outside and/or second image information previously stored in the storage unit 150. Step S210 may be performed based on user input on the first editing screen generated by the control unit 160 and displayed on the display unit 130.

In step S220, the control unit 160 may edit the playback section of at least one of the editing targets. This editing of the playback section can be performed by setting an editing section according to the user's selection for each of the at least one editing target and extracting the frames belonging to the editing section. Step S220 may be performed based on user input on the second editing screen generated by the control unit 160 and displayed on the display unit 130.

In step S230, the control unit 160 may perform editing for inserting at least one special effect. That is, if the user selects at least one special effect to be inserted into the editing targets, the control unit 160 may set at least a part of the playback section of the editing targets as the insertion section of the special effect based on a predetermined user input. Then, based on the setting of the insertion section, the control unit 160 can process the one or more special effects so that they are inserted into the editing targets and/or the integrated image information in which they are merged. Here, the special effects may include various effects applicable in the art, such as insertion of still and/or moving images, insertion of text, filter effects such as black-and-white or sepia, and insertion of music or voice. Step S230 may be performed based on user input on the third editing screen generated by the control unit 160 and displayed on the display unit 130.

In step S240, the control unit 160 can generate integrated image information by merging the editing targets. Such merging may be performed, after returning to the first editing screen, on the basis of the sorting order of the representative images displayed in the editing target display area. That is, the control unit 160 sets the merging order of the corresponding editing targets based on the sorting order of the representative images and merges the playback sections of the editing targets in accordance with the merging order, thereby generating the integrated image information. The generated integrated image information may be stored in the storage unit 150 together with editing history information (e.g., information on the merged image information, the merging order, and the like). On the other hand, if at least one of the editing targets has had its playback section edited in step S220, step S240 may be performed by merging the edited playback section with the remaining editing targets that have not been edited.
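
Continuing the hypothetical model above, the following sketch shows one way step S240 could merge the editing targets in the sorting order of their representative images and keep simple editing history information alongside the result; the IntegratedImage type and the history format are illustrative assumptions.

```kotlin
// Merged result plus editing history information (both names are assumptions).
data class IntegratedImage(
    val frames: List<Frame>,
    val history: List<String>   // e.g. which clips were merged, in which order
)

fun mergeEditTargets(targets: List<EditTarget>): IntegratedImage {
    val mergedFrames = mutableListOf<Frame>()
    val history = mutableListOf<String>()
    targets.forEachIndexed { index, target ->
        // The playback section of each target (already trimmed in S220, if edited)
        // is appended in the order of the representative images.
        mergedFrames += target.frames
        history += "position $index: ${target.frames.size} frames, ${target.durationMs} ms"
    }
    return IntegratedImage(mergedFrames, history)
}
```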

However, according to an embodiment of the present invention, the operation method 200 may selectively perform at least one of step S220 and step S230.

FIG. 3 illustrates an embodiment of step S210 of FIG. 2.

Referring to FIGS. 2 and 3, step S210 of the operation method 200 may include steps S310 through S350.

In step S310, the control unit 160 generates a first editing screen and displays the first editing screen on the display unit 130. Here, the first editing screen refers to a user interface for receiving user input for adding editing targets and merging editing targets, and may include a first image display area and an editing target display area. The first image display area is an area for reproducing image information; as described below, it can display image information in real time as the image information is generated, or display, in non-real time, image information selected by the user as an editing target. The editing target display area is an area in which representative images corresponding to the editing targets are displayed, and can provide information on the editing targets that have been added and on their merging order. However, the configuration of the first editing screen is exemplary and is not limited thereto.

In step S320, the control unit 160 may determine whether or not the camera shooting input of the user is received. Such a camera shooting input may be performed by, for example, a single touch on a predetermined shooting button included in the first editing screen or a continuous touch for a predetermined time.

When the user's camera photographing input is received, in step S330, the control unit 160 drives the photographing unit 120 in response to the camera photographing input and displays, in real time in the first image display area, the first image information generated by the photographing unit 120.

When the photographing by the photographing unit 120 is completed, in step S340, the control unit 160 displays a representative image corresponding to the generated first image information in the editing target display area, thereby adding the first image information as an editing target. Such a representative image is an image for identifying an editing target, and may be generated based on at least a part of the frames constituting the image information to be edited.

Steps S330 to S340 may be repeated one or more times in accordance with the user's camera photographing input. That is, when the user performs photographing a plurality of times, for each photographing the control unit 160 displays the first image information generated during that photographing in real time through the first image display area, and inserts a representative image corresponding to the first image information into the editing target display area when the photographing is completed.

If there is no camera photographing input by the user, in step S350, the addition of an editing target may be performed on previously stored image information. That is, when the user selects, as editing targets, at least one piece of second image information previously stored in the storage unit 150, the control unit 160 displays representative images corresponding to the second image information in the editing target display area, so that the at least one piece of second image information can be added as editing targets simultaneously or sequentially. According to an embodiment, step S350 may be performed at least once, and although not shown in FIG. 3, after step S350, step S320 may be performed again according to the user's input.
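
As a rough illustration of the flow of FIG. 3 (steps S320 to S350) under the same assumed model, the sketch below adds editing targets either from repeated camera photographing inputs or from previously stored clips; CameraSource and Storage are hypothetical stand-ins for the photographing unit 120 and the storage unit 150.

```kotlin
// Stand-ins for the photographing unit and the storage unit (assumptions).
interface CameraSource { fun capture(): List<Frame> }        // returns frames once shooting completes
interface Storage { fun load(id: String): List<Frame> }

fun addEditingTargets(
    list: EditTargetList,
    camera: CameraSource,
    storage: Storage,
    cameraInputs: Int,          // number of camera photographing inputs by the user
    importedIds: List<String>   // previously stored clips selected by the user
) {
    // S320-S340: each camera photographing input produces a clip that is added
    // as an editing target as soon as shooting completes; this may repeat.
    repeat(cameraInputs) {
        val frames = camera.capture()   // shown live in the first image display area while recording
        list.add(frames)                // representative image appears in the editing target display area
    }
    // S350: previously stored (second) image information selected by the user.
    importedIds.forEach { id -> list.add(storage.load(id)) }
}
```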

On the other hand, the representative images can be displayed and aligned sequentially from left to right in the editing target display area based on the order in which the editing targets were added, and the sorting order of these representative images can be changed according to user input. Thereafter, as described above, the editing targets can be merged into one piece of integrated image information according to the sorting order of the representative images in the editing target display area.

In one embodiment, step S210 may further include displaying an entire timeline in the editing target display area. At this time, the entire timeline can be divided into a plurality of areas based on the order of the representative images displayed in the editing target display area and the playback times of the editing targets corresponding to those representative images. Through this timeline, the user can visually recognize the merging order of the editing targets, the lengths of their playback times, and the like. On the other hand, if editing of a playback section is performed through step S220, the control unit 160 may change the divided areas of the timeline based on the playback time that changes according to the setting of the editing section.
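
A minimal sketch of how such a timeline division might be computed, assuming each divided area's width is proportional to the playback time of the corresponding editing target (the TimelineRegion name and the fraction-based layout are illustrative assumptions):

```kotlin
// One divided area of the entire timeline, expressed as fractions of its width.
data class TimelineRegion(val index: Int, val startFraction: Double, val endFraction: Double)

fun buildTimeline(targets: List<EditTarget>): List<TimelineRegion> {
    val totalMs = targets.sumOf { it.durationMs }.coerceAtLeast(1L)
    var cursor = 0.0
    return targets.mapIndexed { index, target ->
        val width = target.durationMs.toDouble() / totalMs
        val region = TimelineRegion(index, cursor, cursor + width)
        cursor += width
        region
    }
}
```

Calling buildTimeline again after a playback section is edited in step S220 would yield the updated divided areas described above.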

FIG. 4 shows an embodiment of step S220 of FIG. 2.

Referring to FIGS. 2 and 4, step S220 of the operation method 200 may include steps S410 through S430.

In step S410, the control unit 160 generates a second editing screen and displays the second editing screen on the display unit 130. The second editing screen refers to a user interface for receiving user input for editing the playback section of at least one of the editing targets, and may include a second image display area, the editing target display area, and a frame editing area. Here, the second image display area is an area for previewing to the user the video corresponding to the editing section, and the frame editing area is an area for displaying at least some of the plurality of frames constituting the editing target together with a frame editing bar that allows the user to set the editing section. However, the configuration of the second editing screen is exemplary and not restrictive.

In step S420, the control unit 160 may display, in the frame editing area, at least a part of the frames constituting the editing target selected by the user. Specifically, when the user selects at least one editing target by, for example, touching at least one of the representative images displayed in the editing target display area, the control unit 160 may obtain a predetermined number of representative frames from the plurality of frames constituting the editing target and display them in the frame editing area arranged in chronological order. Here, the representative frames can be obtained by extracting two or more frames corresponding to a predetermined time interval, starting from the start frame, from the plurality of frames of the editing target.

In step S430, the control unit 160 may set an editing section based on the representative frames displayed in the frame editing area. Specifically, the control unit 160 may first display the frame editing bar so that it overlaps at least one of the representative frames of the editing target, and then modify at least one of the position and the size of the frame editing bar according to user input. This modification can be performed by laterally moving, enlarging, or reducing the frame editing bar in response to corresponding predetermined user inputs. Then, the control unit 160 may set, as the editing section, the partial playback section identified by the representative frames included between both ends of the modified frame editing bar.

Thereafter, the control unit 160 can complete the editing of the playback section by extracting the frames belonging to the editing section set for each editing target. Through this process, the editing target whose playback section has been changed may be stored in the storage unit 150, and may be merged with other editing targets through step S240 as described above.
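
The sketch below illustrates, under the same assumptions, how representative frames could be sampled at a predetermined time interval and how the span covered by the frame editing bar (expressed here as start and end fractions of the clip) could select the editing section whose frames are extracted; the fraction-based interface is an assumption, not the patent's specification.

```kotlin
// Sample representative frames at a fixed interval, starting from the start frame.
fun representativeFrames(target: EditTarget, intervalMs: Long, count: Int): List<Frame> {
    val startMs = target.frames.first().timestampMs
    return (0 until count).mapNotNull { i ->
        val sampleTime = startMs + i * intervalMs
        target.frames.lastOrNull { it.timestampMs <= sampleTime }  // frame at (or just before) the sample time
    }
}

// Keep only the frames inside the editing section chosen with the frame editing bar.
fun trimToEditSection(target: EditTarget, startFraction: Double, endFraction: Double): EditTarget {
    val startMs = target.frames.first().timestampMs
    val from = startMs + (target.durationMs * startFraction).toLong()
    val to = startMs + (target.durationMs * endFraction).toLong()
    val kept = target.frames.filter { it.timestampMs in from..to }
    return target.copy(frames = kept)
}
```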

On the other hand, if at least one editing object is edited through the step S220, the representative image displayed in the editing object display area may be updated.

As described above, the video editing apparatus 100 can selectively edit the reproduction section on a plurality of editing objects on the same second editing screen without changing the screen, thereby improving convenience for the user.

FIG. 5 shows an embodiment of step S230 of FIG. 2.

Referring to FIGS. 2 and 5, step S230 of the operation method 200 may include steps S510 through S520.

In step S510, the control unit 160 generates a third editing screen and displays the third editing screen on the display unit 130. Step S510 may be performed in response to the user's request to insert at least one special effect. The third editing screen refers to a user interface for receiving user input for editing at least one special effect, and may include a third image display area and at least one effect editing area. Here, the third image display area is an area for reproducing an exemplary image into which each special effect has been inserted, and the effect editing area is an area for setting an insertion section for each special effect according to user input; effect editing areas may be created in a number corresponding to the special effects to be inserted. Further, at least one guide line for identifying each playback section of the editing targets may be displayed in each of the effect editing areas. That is, each effect editing area can be divided by these guide lines into one or more regions corresponding to the respective playback sections of the editing targets. However, the configuration of the third editing screen is exemplary and is not limited thereto.

In step S520, the control unit 160 may set an insertion section for each special effect. Specifically, the control unit 160 first displays an effect insertion bar in each of the effect editing areas, and may modify at least one of the position and the size of the effect insertion bar according to user input. This modification can be performed by laterally moving, enlarging, or reducing the effect insertion bar in response to corresponding predetermined user inputs. During this modification, the control unit 160 may perform automatic correction to match at least one of the two ends of the effect insertion bar to a guide line. That is, when at least one of the two ends of the effect insertion bar comes within a predetermined distance of any of the guide lines, the control unit 160 may further adjust the effect insertion bar so that that end is matched to the adjacent guide line. Through this automatic correction, both ends of the effect insertion bar can easily be matched with the boundaries of the respective playback sections while the bar is being modified to set the insertion section, so that user convenience can be increased.

In this case, the control unit 160 may set at least a part of the playback section of the editing targets as the insertion section of the special effect based on the modified effect insertion bar. That is, for each of the effect editing areas, the control unit 160 identifies the region occupied by the effect insertion bar among the one or more regions divided by the guide lines, and sets the playback section of the editing target corresponding to the identified region as the insertion section of the respective special effect. Then, based on the setting of the insertion sections, the control unit 160 performs processing such that the at least one special effect is inserted into the editing targets and/or the integrated image information in which they are merged.
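
As an illustration of the automatic correction and insertion-section identification described above, the sketch below snaps the ends of an effect insertion bar to nearby guide lines and then identifies the regions it covers; positions are expressed as fractions of the total playback length, and snapTolerance stands in for the "predetermined distance", both of which are assumptions.

```kotlin
import kotlin.math.abs

// An effect insertion bar, expressed as fractions of the total playback length.
data class EffectBar(val start: Double, val end: Double)

// Automatic correction: each end of the bar is matched to the nearest guide line
// when it comes within the predetermined distance (snapTolerance).
fun snapToGuideLines(bar: EffectBar, guideLines: List<Double>, snapTolerance: Double): EffectBar {
    fun snap(position: Double): Double =
        guideLines.minByOrNull { abs(it - position) }
            ?.takeIf { abs(it - position) <= snapTolerance } ?: position
    return EffectBar(snap(bar.start), snap(bar.end))
}

// Identify which regions (divided by the guide lines) the bar occupies; the
// corresponding playback sections become the insertion section of the effect.
fun insertionRegions(bar: EffectBar, guideLines: List<Double>): List<Int> {
    val bounds = listOf(0.0) + guideLines.sorted() + listOf(1.0)
    return bounds.zipWithNext().withIndex()
        .filter { (_, region) -> bar.start < region.second && bar.end > region.first }
        .map { it.index }
}
```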

FIG. 6 shows an operation example of an image editing apparatus according to an embodiment of the present invention.

Referring to FIG. 6, the user can add at least one piece of image information as an editing target through the first editing screen.

That is, for example, the user may drive the photographing unit 120 through an input (i.e., a camera photographing input) to the photographing button 630 to generate at least one piece of image information (i.e., first image information), or may select, through an input to the import button 640, one or more pieces of image information (i.e., second image information) stored in the storage unit 150 as editing targets. When the first image information is generated by the user and/or the second image information is selected, the image editing apparatus 100 can add the one or more pieces of image information as editing targets by displaying representative images corresponding to the first image information and/or the second image information in the editing target display area 620. In particular, when the user generates a plurality of pieces of first image information through sequential shooting, the image editing apparatus 100 may add each piece of generated first image information to the editing targets whenever the corresponding shooting is completed, and the first image information generated in each shooting process may be reproduced to the user through the first image display area 610. As editing targets are added, an entire timeline 650 indicating the order and playback times of the editing targets can be displayed above the editing target display area 620.

On the other hand, in the first editing screen, as described above, the user can change the sorting order of the representative images through a predetermined user input (e.g., drag and drop), thereby changing the merging order of the editing targets.

FIG. 7 illustrates an operation example of an image editing apparatus according to an embodiment of the present invention.

Referring to FIG. 7, the user can select some of the image information to be edited through the second editing screen and edit its playback section. That is, when the user selects at least one of the representative images displayed in the editing target display area 620, at least one representative frame of the editing target corresponding to the selected representative image, and the frame editing bar 730 overlapping the at least one representative frame, are displayed in the frame editing area 720, as shown in FIG. 7.

Thereafter, the user can edit the playback section by applying transformations such as lateral movement, enlargement, and reduction to the frame editing bar 730 through user input and thereby selecting a partial playback section as the editing section. Meanwhile, the video corresponding to the selected editing section can be provided to the user through the second image display area 710, and when the playback time of at least one editing target is changed by editing the playback section, the divided areas of the entire timeline 650 can be updated accordingly.

FIG. 8 illustrates an operation example of an image editing apparatus according to an embodiment of the present invention.

Referring to FIG. 8, the user can perform editing for inserting at least one special effect into the playback sections of the editing targets through the third editing screen. That is, when the user selects one or more special effects, effect editing areas 820 corresponding in number to the special effects to be inserted can be displayed through the third editing screen. At this time, one or more guide lines 830 may be displayed in each effect editing area 820, thereby allowing the user to identify each of the playback sections of the editing targets into which a special effect is to be inserted.

In addition, an effect insertion bar 840 for setting an insertion section for each special effect is provided in each effect editing area 820, and by moving, enlarging, or reducing the effect insertion bar 840 through user input, the user can set the playback section of the editing target corresponding to the area occupied by each effect insertion bar 840 as the insertion section for the corresponding special effect.

Meanwhile, in this case, an exemplary image in which the special effect selected by the user is inserted can be provided to the user through the third image display area 810.

Meanwhile, the various embodiments described herein may be implemented by hardware, middleware, microcode, software, and/or a combination thereof. For example, various embodiments may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof.

Also, for example, various embodiments may be stored or encoded in a computer-readable medium including instructions. The instructions stored or encoded in the computer-readable medium may, when executed, cause a programmable processor or other processor to perform the method. The computer-readable medium includes computer storage media. The storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage media, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to store data in the form of instructions or data structures and that can be accessed by a computer.

Such hardware, software, firmware, and the like may be implemented within the same device or within separate devices to support the various operations and functions described herein. Additionally, the components, units, modules, and the like described in the present invention as "units" may be implemented together or separately as discrete but interoperable logic devices. The description of different features as modules, units, and the like is intended to emphasize different functional aspects and does not necessarily imply that they must be realized by separate hardware or software components. Rather, the functionality associated with one or more modules or units may be performed by separate hardware or software components, or may be integrated within common or separate hardware or software components.

Although operations are shown in the figures in a particular order, it should not be understood that these operations must be performed in the specific order shown or in sequential order, or that all illustrated operations must be performed, to achieve the desired result. In certain environments, multitasking and parallel processing may be advantageous. Moreover, the separation of various components in the above-described embodiments should not be understood as being required in all embodiments, and it should be understood that the described components may generally be integrated together into a single software product or packaged into multiple software products.

As described above, optimal embodiments have been disclosed in the drawings and the specification. Although specific terms have been employed herein, they are used only for the purpose of describing the present invention and are not intended to limit its meaning or the scope of the invention set forth in the claims. Therefore, those skilled in the art will appreciate that various modifications and equivalent embodiments are possible without departing from the scope of the present invention. Accordingly, the true scope of the present invention should be determined by the technical idea of the appended claims.

Claims (19)

A method of operating an image editing apparatus,
Generating a first edit screen, the first edit screen including a first video display area and an edit target display area;
Displaying in real time the first image information generated by capturing an external image according to a user's camera photographing input in the first image display area; And
And adding the first image information as an edit target by displaying a representative image corresponding to the first image information on the edit target display area when the photographing is completed,
Wherein the step of displaying on the first video display area and the step of adding to the editing object are repeated a plurality of times on the first editing screen in accordance with the camera shooting input of the user.
The method according to claim 1,
Further comprising the step of adding the second image information as an editing target by displaying, in the editing target display area, a representative image corresponding to at least one piece of previously stored second image information according to the selection of the user.
3. The method of claim 2,
Further comprising the step of displaying an entire timeline in the editing target display area,
Wherein the entire timeline is divided into a plurality of areas based on the order of the representative images displayed in the editing target display area and the playback times of the editing targets corresponding to the representative images.
The method of claim 3,
Generating a second edit screen, the second edit screen including a second video display area, the edit target display area, and a frame edit area;
Displaying at least a part of frames constituting an editing object selected by the user in the frame editing area;
Further comprising setting an editing section of the editing object based on the user's selection of a frame used for image editing among the displayed frames,
And an image corresponding to the set edit section is displayed in the second video display area.
5. The method of claim 4,
Further comprising changing a divided region of the entire timeline based on a reproduction time that is changed in accordance with the setting of the edit section for the frame to be edited.
6. The method of claim 2,
Creating a third editing screen, the third editing screen including a third video display area and at least one effect editing area corresponding to at least one special effect to be inserted; And
Further comprising setting at least a part of a reproduction section of the editing object as an insertion section of the special effect in accordance with a user's selection of at least a part of the effect editing area,
Wherein at least one guide line for identifying a reproduction section of the editing object is displayed in each of the effect editing areas and an inserted image of a special effect corresponding to the set insertion section is displayed in the third image display area.
7. The method of claim 6,
Displaying an effect insertion bar for setting an insertion interval of the special effect in each of the effect editing areas;
Transforming at least one of the position and the size of each of the effect insertion bars according to the input of the user; And
And setting at least a part of the reproduction section of the editing object as an insertion section of each of the special effects based on the modified effect insertion bar.
8. The method of claim 7,
Further comprising, at the time of the modification, performing automatic correction to match at least one of both ends of each of the effect insertion bars to an adjacent guide line.
9. The method of claim 2,
Further comprising the step of merging the editing object based on a sorting order of representative images displayed in the editing object display area.
10. A computer-readable recording medium having recorded thereon a program for performing the method according to any one of claims 1 to 9.
11. A video editing apparatus comprising:
A photographing unit for photographing an external image according to a user's camera photographing input to generate first image information;
A control unit that generates a first editing screen including a first video display area and an editing target display area, displays in real time, in the first video display area, the first image information generated by the photographing of the photographing unit, displays a representative image corresponding to the first image information in the editing target display area when the photographing is completed, and adds the first image information as an editing target; And
A display unit for displaying the first editing screen generated by the control unit,
Wherein the control unit repeatedly performs the display on the first video display area and the addition on the first editing screen a plurality of times in response to a camera shooting input of the user.
12. The apparatus of claim 11,
Wherein the control unit adds the second image information as an editing target by displaying, in the editing target display area, a representative image corresponding to at least one piece of previously stored second image information according to the selection of the user.
13. The apparatus of claim 12,
Wherein the control unit displays the entire time line in the edit target display area,
Wherein the entire time line is divided into a plurality of regions based on the order of the representative images displayed in the editing object display region and the reproduction time of the editing object corresponding to the representative image.
14. The apparatus of claim 13,
The control unit generates a second editing screen including a second video display area, the editing target display area, and a frame editing area, displays at least a part of the frames constituting the editing target selected by the user in the frame editing area, and sets an editing section of the editing target based on the user's selection of frames used for image editing among the displayed frames,
And an image corresponding to the set edit section is displayed in the second video display area.
15. The apparatus of claim 14,
Wherein the control unit changes the divided area of the entire timeline based on a reproduction time that is changed in accordance with the setting of the edit section for the frame to be edited.
16. The apparatus of claim 12,
Wherein the control unit generates a third editing screen including a third video display area and at least one effect editing area corresponding to at least one special effect to be inserted, and sets at least a part of the playback section of the editing target as an insertion section of the special effect in accordance with the user's selection of at least a part of the effect editing area,
Wherein at least one guide line for identifying a reproduction section of the editing object is displayed in each of the effect editing areas and an inserted image of a special effect corresponding to the set insertion section is displayed in the third image display area.
17. The apparatus of claim 16,
Wherein the control unit displays, in each of the effect editing areas, an effect insertion bar for setting the insertion section of the special effect, modifies at least one of the position and the size of each of the effect insertion bars according to the input of the user, and sets at least a part of the playback section of the editing target as an insertion section of each of the special effects based on the modified effect insertion bar.
18. The apparatus of claim 17,
Wherein the control unit performs automatic correction to match at least one of both ends of each of the effect insertion bars to the adjacent guide line when the deformation occurs.
19. The apparatus of claim 12,
And the control unit merges the editing target based on the sorting order of the representative images displayed in the editing target display area.
KR1020160013377A 2016-02-03 2016-02-03 Apparatus for editing video and the operation method KR101776674B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160013377A KR101776674B1 (en) 2016-02-03 2016-02-03 Apparatus for editing video and the operation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160013377A KR101776674B1 (en) 2016-02-03 2016-02-03 Apparatus for editing video and the operation method

Publications (2)

Publication Number Publication Date
KR20170092260A KR20170092260A (en) 2017-08-11
KR101776674B1 (en) 2017-09-08

Family

ID=59651468

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160013377A KR101776674B1 (en) 2016-02-03 2016-02-03 Apparatus for editing video and the operation method

Country Status (1)

Country Link
KR (1) KR101776674B1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102274723B1 (en) * 2019-01-02 2021-07-08 주식회사 케이티 Device, method and computer program for editing time slice images
CN111757013B (en) * 2020-07-23 2022-04-29 北京字节跳动网络技术有限公司 Video processing method, device, equipment and storage medium
KR102441619B1 (en) * 2021-09-07 2022-09-07 양승철 A method for editing video through recording of edits, and an apparatus and a system therefor
KR102627285B1 (en) * 2022-07-12 2024-01-23 주식회사 래비노 3D image generation system and method for compensating images of moving cameras

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101490506B1 (en) * 2014-07-08 2015-02-10 주식회사 테라클 Method and apparatus for editing moving picture contents


Also Published As

Publication number Publication date
KR20170092260A (en) 2017-08-11

Similar Documents

Publication Publication Date Title
US11622079B2 (en) Systems and methods for previewing newly captured image content and reviewing previously stored image content
CN102754352B (en) Method and apparatus for providing information of multiple applications
US20170024110A1 (en) Video editing on mobile platform
EP3147906A1 (en) Mobile terminal and method for controlling the same
US11317028B2 (en) Capture and display device
KR101776674B1 (en) Apparatus for editing video and the operation method
CN105488145B (en) Display methods, device and the terminal of web page contents
US20150149960A1 (en) Method of generating panorama image, computer-readable storage medium having recorded thereon the method, and panorama image generating device
JP2016521418A (en) Method and apparatus for generating and editing an image with an object inserted
CN110753251A (en) Video switching method and device and electronic equipment
KR20180027917A (en) Display apparatus and control method thereof
JP5174735B2 (en) Operation control device, operation control method, and operation control program
CN112887794A (en) Video editing method and device
US10817167B2 (en) Device, method and computer program product for creating viewable content on an interactive display using gesture inputs indicating desired effects
KR101825598B1 (en) Apparatus and method for providing contents, and computer program recorded on computer readable recording medium for executing the method
JP2018005893A (en) Program, system, and method for recording video images
US20200105302A1 (en) Editing apparatus for controlling representative image to appropriate image, method of controlling the same, and storage medium therefor
KR102368203B1 (en) Electrocnic device for generating video index based on user interface and operating method thereof
KR101610007B1 (en) Method for generating video data
KR101553272B1 (en) Control method for event of multimedia content and building apparatus for multimedia content using timers
TW201513655A (en) Video editing
JP6141495B1 (en) Program, system, and method for recording moving image
CN114745506A (en) Video processing method and electronic equipment
JP2020167625A (en) Image processing device and control method of the same
JP2016165043A (en) Moving imaging taking system, information processing terminal, moving image confirmation method and program

Legal Events

Date Code Title Description
A201 Request for examination
N231 Notification of change of applicant
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant