WO2022098187A1 - Method and apparatus for editing advertisement content - Google Patents
- Publication number: WO2022098187A1
- Application number: PCT/KR2021/016137 (KR2021016137W)
- Authority: WO (WIPO/PCT)
- Prior art keywords: advertisement content, menu, editing, clip, editable
Classifications
- G06Q30/02—Marketing; price estimation or determination; fundraising
- G06Q30/0241—Advertisements
- G06Q30/0276—Advertisement creation
- G06Q30/0244—Optimization (determining effectiveness of advertisements)
- G06Q30/0277—Online advertisement
- G06F3/04817—Interaction techniques based on graphical user interfaces (GUI) using icons
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
Definitions
- The present disclosure relates to a method and apparatus for editing advertisement content, and in particular to a method and apparatus capable of maximizing advertising effect through personalized advertisement editing.
- Portable terminals such as smartphones and tablets have become widespread, and thanks to improvements in their performance and advances in wireless communication technology, users can shoot, edit, and share videos using a portable terminal.
- Since a portable terminal is typically equipped with a camera, user demand for editing images and videos captured with the camera is increasing.
- Conventionally, advertisements are delivered through PC-based web pages, portal sites, banners, links, and/or separate windows.
- A mobile advertisement may be provided to an unspecified audience of mobile terminal owners, or may be selectively provided to a plurality of targeted mobile terminal users associated with the advertised product or service in order to maximize the advertising effect and lead to a purchase.
- Mobile and PC-based advertisement content may be produced in various forms, for example as video in addition to text and images, in order to further enhance the advertising effect.
- the advertisement content may be supplied through various video platforms such as YouTube and VLOG.
- Advertisement content may offer a benefit to users who actively share the advertisement with other users, so that it spreads widely to a large audience. Nevertheless, such advertisement content is distributed only in the form produced by advertisers and producers.
- Advertisement content distributed only in its originally produced form has limited power to induce a strong advertising effect in the users who watch it. Producers can edit the same advertisement content and provide variants to users within a scope that does not impair the advertising concept, but this cannot reflect the tastes of every user interested in the advertised product, and the advertising effect is uncertain relative to the production cost. Accordingly, the advertising field is considering various methods for inducing active sharing by individual users and for letting individual users transform original advertisement content into versions that remain true to the original while appealing to other users. In particular, when advertisement content edited by individual users is processed into a plurality of similar advertisements transformed into various shapes, the advertising effect can be maximized for users of various tastes. Various methods are therefore being attempted for editing advertisement content using the above-mentioned video editing techniques while estimating the advertising effect of the edited content.
- The technical object of the present disclosure is to provide a method and apparatus for editing advertisement content capable of maximizing advertising effect through personalized advertisement editing.
- According to an aspect of the present disclosure, a method of editing advertisement content performed by a computing device including at least one processor includes extracting and presenting an editable element from loaded advertisement content; receiving a selection of the presented editable element by user input; selecting an insertion element for the editable element by user input; and editing the advertisement content based on the selected insertion element.
- the editable element may include at least one of a media object constituting the advertisement content, an editing tool for editing the advertisement content, and category information designated for the concept of the advertisement content.
- the editing tool may include an editing function for imparting an additional effect to the media constituting the advertisement content.
- the category information may include at least one of a media type of the advertisement content allowing the concept to be modified, information related to the media objects, and insertable additional effect information.
- The presenting of the editable element may include presenting the editable element in at least one of the media object, an editing user interface for editing the advertisement content, and a predetermined area of an editing application provided by the computing device.
- The extracting of the editable element may include extracting based on attribute information included in the advertisement content, wherein the attribute information may be recorded so as to distinguish editable elements from non-editable elements in the advertisement content.
- The receiving of the editable element may include receiving an input of the user selecting the editable element, and receiving an insertion activation request according to the user input for the editable element.
- The selecting of the insertion element may include presenting a plurality of candidate items insertable into the editable element, and receiving the candidate item selected by the user's input.
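The extraction, presentation, and insertion steps above can be sketched in ordinary code. The sketch below is illustrative only: the `MediaObject` structure, the `editable` flag, and the candidate lists are hypothetical stand-ins for the attribute information and candidate items the disclosure describes, not an actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class MediaObject:
    """One media object inside the advertisement content (clip, caption, music, ...)."""
    name: str
    editable: bool                                  # attribute information recorded in the content
    value: str = ""
    candidates: list = field(default_factory=list)  # candidate items insertable into this element

def extract_editable(content):
    """Extract the elements whose attribute information marks them as editable."""
    return [obj for obj in content if obj.editable]

def apply_insertion(element, choice_index):
    """Edit the element by replacing its value with the candidate item the user selected."""
    element.value = element.candidates[choice_index]
    return element

# Hypothetical advertisement content: the logo must stay, the background music may change.
content = [
    MediaObject("logo", editable=False, value="brand.png"),
    MediaObject("bgm", editable=True, value="original.mp3",
                candidates=["jazz.mp3", "pop.mp3"]),
]
presented = extract_editable(content)       # presented to the user for selection
edited = apply_insertion(presented[0], 1)   # the user picks the second candidate item
print(edited.value)                         # prints pop.mp3
```

The non-editable logo never reaches the user-facing selection step, which mirrors how the attribute information protects elements tied to the advertising concept.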
- The method may further include sharing the edited advertisement content through a content platform, evaluating the shared edited advertisement content to generate evaluation information, and transmitting the evaluation information to a server related to the provision of the advertisement content.
- the generating of the evaluation information may include generating first evaluation information based on at least one of an insertion element added to the edited advertisement content and an advertisement element maintained in the advertisement content.
- The first evaluation information may be generated based on at least one of a retention rate of the advertisement element, importance information of the advertisement element set in the advertisement content, and information on the degree of deformation of the advertisement content according to the insertion element.
- The first evaluation information may be generated by at least one of the computing device and the server, and calculation reference data used to calculate the degree of deformation of the edited advertisement content, as well as the importance information, may be provided from the server.
- The generating of the evaluation information may further include generating second evaluation information by collecting reaction information on the edited advertisement content from other users of the content platform.
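As a rough sketch, the two kinds of evaluation information described above might be computed as follows. The weighting scheme and the reaction counts are assumptions for illustration; the disclosure specifies only the inputs (retention rate, importance, degree of deformation, and collected reaction information), not a concrete formula.

```python
def first_evaluation(retention_rate, importance, deformation):
    """First evaluation information: rewards keeping the advertisement elements
    (weighted by their importance) and penalizes deforming the original content.
    The 0.5/0.3/0.2 weights are illustrative assumptions, not disclosed values."""
    return 0.5 * retention_rate + 0.3 * importance - 0.2 * deformation

def second_evaluation(reactions):
    """Second evaluation information: aggregate reaction information collected
    from other platform users. Weighting shares above likes is an assumption."""
    return reactions.get("likes", 0) + 2 * reactions.get("shares", 0)

score1 = first_evaluation(retention_rate=0.8, importance=0.9, deformation=0.3)
score2 = second_evaluation({"likes": 120, "shares": 30})
print(round(score1, 2), score2)   # prints 0.61 180
```

Either score could then be transmitted to the server as described, with the server supplying the calculation reference data (here, the weights) instead of hard-coding them on the device.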
- According to another aspect of the present disclosure, a computing device for editing advertisement content includes a communication module; and a processor that transmits and receives data through the communication module and controls the computing device, wherein the processor extracts and presents an editable element from loaded advertisement content, receives a selection of the presented editable element by user input, selects an insertion element for the editable element by user input, and edits the advertisement content based on the selected insertion element.
- a method and apparatus for editing advertisement content capable of maximizing advertisement effects through personalized advertisement editing may be provided.
- FIG. 1 is a diagram illustrating an electronic device to which various embodiments of the present disclosure are applied.
- FIG. 2 is a diagram for explaining a system hierarchical structure of an electronic device to which various embodiments of the present disclosure are applied.
- FIG. 3 is a flowchart illustrating a sequence of a video editing method to which various embodiments of the present disclosure are applied.
- FIG. 4 is a diagram illustrating an editing UI provided by an apparatus for controlling a video editing UI according to various embodiments of the present disclosure.
- FIGS. 5A to 5E are diagrams illustrating a clip editing UI provided in a video editing UI according to various embodiments of the present disclosure.
- FIG. 6 is a flowchart illustrating a method of editing advertisement content according to an embodiment of the present disclosure.
- FIGS. 7A to 7D are diagrams illustrating a process in which advertisement contents are edited by the method of editing advertisement contents according to an embodiment of the present disclosure.
- FIG. 8 is a flowchart of an evaluation process of edited advertisement content according to another embodiment of the present disclosure.
- When a component is said to be "connected", "coupled", or "linked" with another component, this may include not only a direct connection but also an indirect connection in which another component exists in between.
- When a component is said to "include" or "have" another component, this means that further components may also be included rather than excluded, unless otherwise stated.
- Components that are distinguished from each other are so distinguished in order to clearly explain their respective characteristics, and this does not necessarily mean that the components are separated. That is, a plurality of components may be integrated into one hardware or software unit, or one component may be distributed across a plurality of hardware or software units. Accordingly, even if not specifically mentioned, such integrated or distributed embodiments are also included in the scope of the present disclosure.
- components described in various embodiments do not necessarily mean essential components, and some may be optional components. Accordingly, an embodiment composed of a subset of components described in one embodiment is also included in the scope of the present disclosure. In addition, embodiments including other components in addition to components described in various embodiments are also included in the scope of the present disclosure.
- Various embodiments of the present disclosure may be implemented in an electronic device having a display unit, such as a smartphone or tablet, and the video editing device according to an embodiment of the present disclosure may be implemented by an electronic device equipped with a video editing application. Alternatively, it may be implemented by an electronic device having an image processing unit and a control unit capable of processing video and subtitle data.
- an electronic device to which various embodiments of the present disclosure are applied means a portable electronic device.
- FIG. 1 is a diagram illustrating an electronic device to which various embodiments of the present disclosure are applied, and is a block diagram illustrating an electronic device 101 in a network environment 100 .
- the electronic device 101 may be referred to as a computing device, and the electronic device 101 may have a video editing application embedded therein, or the application may be downloaded and installed from the outside.
- In the network environment 100, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., short-range wireless communication), or with the electronic device 104 or the server 108 through a second network 199 (e.g., long-range wireless communication). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108. According to an embodiment, the electronic device 101 includes a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, and an interface 177.
- The electronic device 101 may further include a camera module 180, a power management module 188, and a communication module 190 for transmitting and receiving data through the networks 198 and 199. In some embodiments, at least one of these components (e.g., the display device 160 or the camera module 180) may be omitted.
- The processor 120 may, for example, run software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing and operations.
- the processor 120 may load a command or data received from another component (eg, the communication module 190 ) into the volatile memory 132 for processing, and store the result data in the non-volatile memory 134 .
- The processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and an auxiliary processor 123 that can operate independently of it.
- The auxiliary processor 123 may operate in addition to or in place of the main processor 121, and may be set to use less power than the main processor 121.
- The auxiliary processor 123 may be a processor specialized for a designated function (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor).
- The auxiliary processor 123 may be operated separately from, or embedded in, the main processor 121.
- For example, while the main processor 121 is in an inactive (e.g., sleep) state, the auxiliary processor 123 may, in place of the main processor 121, control at least some functions or states related to at least one component of the electronic device 101 (e.g., the display device 160 or the communication module 190). As another example, while the main processor 121 is in an active state (e.g., executing an application), the auxiliary processor 123 may control at least some of those functions or states together with the main processor 121.
- According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as a part of another functionally related component (e.g., the camera module 180 or the communication module 190).
- The memory 130 may store various data used by at least one component of the electronic device 101 (e.g., the processor 120), for example software (e.g., the program 140) and input data or output data for instructions related thereto. The memory 130 may include a volatile memory 132 or a non-volatile memory 134.
- the program 140 is software stored in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
- The application 146 may include a plurality of software programs for various functions, and may include a content editing application according to the present disclosure.
- The editing application is executed by the processor 120 and may be software that creates a new video or selects and edits an existing video.
- The input device 150 is a device for receiving a command or data to be used by a component of the electronic device 101 (e.g., the processor 120) from outside the electronic device 101 (e.g., from a user), and may include, for example, a microphone, a mouse, or a keyboard.
- the sound output device 155 may be a device for outputting a sound signal to the outside of the electronic device 101 .
- the sound output device 155 may include a speaker used for general purposes such as multimedia playback or recording playback, and a receiver used exclusively for receiving calls.
- the receiver may be formed integrally with or separately from the speaker.
- the display device 160 may be a device for visually providing information to a user of the electronic device 101 .
- the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
- the display device 160 may include a touch circuitry or a pressure sensor capable of measuring the intensity of the pressure applied to the touch.
- The display device 160 may detect the coordinates of a touch input area, the number of touch input areas, a touch input gesture, and the like based on the touch circuit or the pressure sensor, and may provide the detected result to the main processor 121 or the auxiliary processor 123.
- The audio module 170 may bidirectionally convert between sound and an electrical signal. According to an embodiment, the audio module 170 may acquire sound through the input device 150, or may output sound through the sound output device 155 or through an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected to the electronic device 101 by wire or wirelessly.
- the interface 177 may support a designated protocol capable of connecting to an external electronic device (eg, the electronic device 102 ) in a wired or wireless manner.
- the interface 177 may include a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
- The connection terminal 178 is a connector capable of physically connecting the electronic device 101 and an external electronic device (e.g., the electronic device 102), for example an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
- the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, an image sensor, an image signal processor, or a flash.
- the power management module 188 is a module for managing power supplied to the electronic device 101 , and may be configured as, for example, at least a part of a power management integrated circuit (PMIC).
- the battery 189 is a device for supplying power to at least one component of the electronic device 101 , and may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
- The communication module 190 may establish a wired or wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and may support data communication through the established channel.
- the communication module 190 may include one or more communication processors that support wired communication or wireless communication, which are operated independently of the processor 120 (eg, an application processor).
- According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module), and may use the corresponding module to communicate with an external electronic device through the first network 198 (e.g., a short-range communication network such as Bluetooth, Bluetooth Low Energy (BLE), WiFi Direct, or Infrared Data Association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network such as a LAN or WAN).
- Some of these components may be connected to each other through a communication scheme between peripheral devices (e.g., a bus, general purpose input/output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and exchange signals (e.g., commands or data) with each other.
- the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
- Each of the electronic devices 102 and 104 may be the same as or different from the electronic device 101 .
- at least some of the operations executed in the electronic device 101 may be executed in another one or a plurality of external electronic devices.
- According to an embodiment, when the electronic device 101 needs to perform a specific function or service automatically or upon request, the electronic device 101 may, instead of or in addition to executing the function or service itself, request at least some functions related to it from an external electronic device.
- the external electronic device may execute the requested function or additional function, and may transmit the result to the electronic device 101 .
- the electronic device 101 may provide the requested function or service by processing the received result as it is or additionally.
- cloud computing, distributed computing, or client-server computing technology may be used.
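The offloading behavior described above can be sketched as a simple dispatch: execute locally when possible, otherwise request the function from an external device and additionally process the returned result. The executor callables below are hypothetical placeholders for the device and the external server, not part of the disclosure.

```python
def run_function(task, local_executor, remote_executor, offload=False):
    """Execute a function/service on the device itself, or request it from an
    external electronic device and post-process the received result."""
    if offload:
        result = remote_executor(task)   # executed by the external device or server
        return f"processed({result})"    # device additionally processes the result
    return local_executor(task)          # executed by the device itself

local = lambda t: f"local({t})"          # stand-in for on-device execution
remote = lambda t: f"remote({t})"        # stand-in for server-side execution
print(run_function("render", local, remote))                 # prints local(render)
print(run_function("render", local, remote, offload=True))   # prints processed(remote(render))
```

In practice the `offload` decision would depend on device load or capability, which is exactly where cloud, distributed, or client-server computing technology would be applied.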
- FIG. 2 is a diagram for explaining a system hierarchical structure of an electronic device to which various embodiments of the present disclosure are applied.
- The electronic device 200 may be configured to include a hardware layer 210 corresponding to the electronic device 100 of FIG. 1, an operating system (OS) layer 220 that manages the hardware layer 210 as an upper layer of the hardware layer 210, and, as upper layers of the OS layer 220, a framework layer 230 and an application layer 240.
- the OS layer 220 controls the overall operation of the hardware layer 210 and performs a function of managing the hardware layer 210 . That is, the OS layer 220 is a layer in charge of basic functions such as hardware management, memory, and security.
- the OS layer 220 may include a driver for operating or driving a hardware device included in the electronic device, such as a display driver for driving a display device, a camera driver for driving a camera module, and an audio driver for driving an audio module.
- the OS layer 220 may include a library and runtime that a developer can access.
- a framework layer 230 exists as a higher layer than the OS layer 220 , and the framework layer 230 serves to connect the application layer 240 and the OS layer 220 . That is, the framework layer 230 includes a location manager, a notification manager, and a frame buffer for displaying an image on the display unit.
- An application layer 240 implementing various functions of the electronic device 100 is located in an upper layer of the framework layer 230 .
- the application layer 240 may include various application programs such as a call application 241 , a video editing application 242 , a camera application 243 , a browser application 244 , and a gesture application 245 .
- The OS layer 220 may provide a menu or UI for adding or deleting at least one application or application program included in the application layer 240, and through this menu or UI an application or application program may be added or deleted by the user.
- According to an embodiment, the electronic device 100 of FIG. 1 may be connected to the other electronic devices 102 and 104 or the server 108 through communication, and, according to a user's request, data (i.e., at least one application or application program) provided from the other electronic device 102 or 104 or the server 108 may be received and recorded in the memory. The at least one application or application program stored in the memory may be configured and operated in the application layer 240.
- at least one application or application program may be selected by the user using a menu or UI provided by the OS layer 220 , and the selected at least one application or application program may be deleted.
- FIG. 3 is a flowchart illustrating a sequence of a video editing method to which various embodiments of the present disclosure are applied.
- the video editing method may be operated by the aforementioned electronic device (or computing device), and the operation may be started as a video editing application is selected and executed by a user input ( S105).
- the electronic device may output an initial screen of the video editing application to a display device (eg, a display).
- a display device eg, a display
- a menu (or UI) for creating a new image project and an image project selection menu (or UI) for selecting an image project being edited in advance may be provided on the initial screen.
- If the menu (or UI) for creating a new image project is selected, step S115 may be performed, and if the image project selection menu (or UI) is selected, step S125 may be performed (S310).
- the electronic device may provide a menu (or UI) for setting basic information of the new image project, and may set and apply basic information input through the menu (or UI) to the new image project.
- the basic information may include an aspect ratio of a new image project.
- the electronic device may provide a menu (or UI) for selecting an aspect ratio such as 16:9, 9:16, or 1:1, and may set and apply the aspect ratio input through the menu (or UI) to the new image project.
- the electronic device may generate a new image project by reflecting the basic information set in step S115, and store the created new image project in a storage medium (S304).
- the electronic device may provide a menu (or UI) for setting at least one of automatic master volume adjustment, master volume level, audio fade-in default, audio fade-out default, video fade-in default, video fade-out default, image clip default length, layer default length, and image clip pan & zoom default, and a value input through the menu (or UI) may be set as basic information of the new image project.
- the electronic device may automatically set the aspect ratio, automatic master volume adjustment, master volume level, audio fade-in default, audio fade-out default, video fade-in default, video fade-out default, image clip default length, layer default length, and image clip pan & zoom default to predetermined values.
- the electronic device may provide a setting menu (or UI) and, through the setting menu (or UI), receive control values for the aspect ratio, automatic master volume adjustment, master volume level, audio fade-in default, audio fade-out default, video fade-in default, video fade-out default, image clip default length, layer default length, and image clip pan & zoom default, and may set the received values as basic information of the new image project.
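- The basic-information defaults above can be sketched as a simple settings container. This is a hypothetical illustration only; the class name, field names, and default values are assumptions and are not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProjectDefaults:
    """Basic information for a new image project (illustrative values only)."""
    aspect_ratio: str = "16:9"        # eg, "16:9", "9:16", "1:1"
    auto_master_volume: bool = True   # automatic master volume adjustment
    master_volume: float = 1.0        # master volume level, 0.0 .. 1.0
    audio_fade_in: bool = False
    audio_fade_out: bool = False
    video_fade_in: bool = False
    video_fade_out: bool = False
    image_clip_length_s: float = 4.0  # image clip default length (seconds)
    layer_length_s: float = 4.0       # layer default length (seconds)
    pan_zoom_enabled: bool = True     # image clip pan & zoom default

def new_project(overrides: Optional[dict] = None) -> ProjectDefaults:
    """Apply control values received through a setting menu on top of
    predetermined defaults, as described for the new image project."""
    defaults = ProjectDefaults()
    for key, value in (overrides or {}).items():
        setattr(defaults, key, value)
    return defaults
```

Either path described in the text maps onto this sketch: automatic setting uses the predetermined defaults, while the setting menu path passes user-entered control values as `overrides`.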
- the electronic device may provide a project list including an image project stored in the storage medium and provide an environment for selecting at least one image project included in the project list.
- the user may select at least one image project included in the project list, and the electronic device may load the at least one image project selected by the user ( S130 ).
- the electronic device may provide an editing UI.
- the editing UI may include an image display window 401 , a media setting window 402 , a media input window 403 , a clip display window 404 , a clip setting window 405 , and the like.
- the image display window, the media setting window, and the media input window may be displayed on the upper part of the display, and the clip display window and the clip setting window may be displayed on the lower part of the display.
- the media setting window may include an export menu, a capture menu, a setting menu, and the like, and the export menu, capture menu, and setting menu may be provided in the form of an icon or text that can recognize the corresponding menu.
- the media input window may include a media input menu 403a, a layer input menu 403b, an audio input menu 403c, a voice input menu 403d, a shooting menu 403e, and the like, and the media input menu 403a, the layer input menu 403b, the audio input menu 403c, the voice input menu 403d, the shooting menu 403e, etc. may be provided in the form of icons or text that allow the corresponding menus to be recognized.
- each menu may include a sub-menu, and as each menu is selected, the electronic device may configure and display a corresponding sub-menu.
- the media input menu 403a may be connected to the media selection window as a sub-menu, and the media selection window may provide an environment for selecting media stored in the storage medium.
- Media selected through the media selection window may be inserted and displayed in the clip display window.
- the electronic device may check the type of the media selected through the media selection window, set a clip time of the corresponding media in consideration of the checked media type, and insert and display the selected media in the clip display window.
- the type of media may include an image, a video, and the like.
- the electronic device may check the default length setting value of the image clip and set the image clip time according to the default length setting value of the image clip.
- the electronic device may set the time of the moving picture clip according to the length of the corresponding medium.
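- The clip-time rule above (an image clip takes the default length setting, while a video clip keeps the length of its own medium) can be sketched as follows. The function name and the string media types are assumptions for illustration:

```python
def clip_duration(media_type: str, media_length_s: float,
                  default_image_len_s: float) -> float:
    """Set the clip time in consideration of the checked media type:
    images use the image clip default length setting, and videos
    use the length of the corresponding medium."""
    if media_type == "image":
        return default_image_len_s
    if media_type == "video":
        return media_length_s
    raise ValueError(f"unsupported media type: {media_type}")
```

For example, a 12.5-second video yields a 12.5-second clip, while any still image yields a clip of the configured default length.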
- the layer input menu 403b may include a media input menu, an effect input menu, an overlay input menu, a text input menu, a handwriting input menu, and a drawing input menu.
- the media input menu may have the same configuration as the aforementioned media input menu.
- the effect input menu may provide an environment for selecting a blur effect, mosaic effect, noise effect, sandstorm effect, melting point effect, crystal effect, star filter effect, display board effect, haze effect, fisheye lens effect, magnifying glass lens effect, flower twist effect, night vision effect, sketch effect, and the like.
- An effect selected through the effect input menu may be inserted and displayed in the clip display window.
- the electronic device may check the default setting value of the layer length and set the effect clip time according to the default setting value of the layer length.
- the overlay input menu may provide an environment in which stickers and icons of various shapes or shapes can be selected. Stickers, icons, etc. selected through the overlay input menu may be inserted and displayed in the clip display window.
- the electronic device may check the default setting value of the layer length, and set the clip time of the sticker or icon according to the basic setting value of the layer length.
- the text input menu may provide an environment for entering text, eg, a QWERTY keyboard. Text input through the text input menu may be inserted and displayed in the clip display window.
- the electronic device may check the default setting value of the layer length and set the text clip time according to the default setting value of the layer length.
- the drawing input menu may be configured to provide a drawing area to the image display window, and to display a drawing object in a touch input area on the image display window.
- the handwriting input menu may include, as sub-menus, a drawing tool selection menu for selecting a drawing tool, a color selection menu for selecting a drawing color, a thickness setting menu for setting the thickness of the drawing object, a partial erase menu for deleting a created drawing object, a delete-all menu for deleting the entire drawing object, and the like.
- the electronic device may check the default setting value of the layer length and set the drawing object clip time according to the default setting value of the layer length.
- the audio input menu 403c may be connected to the audio selection window as a sub-menu, and the audio selection window may provide an environment for selecting an audio file stored in a storage medium. An audio file selected through the audio selection window may be inserted and displayed in the clip display window.
- the voice input menu 403d may be a menu for recording a sound input through a microphone.
- the electronic device may detect a voice signal input from the microphone by activating a microphone provided in the device.
- the electronic device may display a recording start button and, when the recording start button is input, may start recording of the voice signal.
- the electronic device may visualize and display the voice signal input from the microphone. For example, the electronic device may check the magnitude or frequency characteristic of the voice signal, and display the checked characteristic in the form of a level meter or a graph.
- the shooting menu 403e may be a menu for shooting an image or an image input through a camera module provided in the electronic device.
- the shooting menu 403e may be displayed through an icon that visualizes the camera device.
- the photographing menu 403e may include, as a sub-menu, an image/video photographing selection menu for selecting a camera for capturing a still image or a camcorder for capturing a video. Based on this, when the shooting menu 403e is selected by the user, the electronic device may display the image/video shooting selection menu. In addition, the electronic device may activate a still image capturing mode or a video capturing mode of the camera module according to the selection made through the image/video capturing selection menu.
- the clip display window 404 may include at least one clip line for displaying clips corresponding to media, effects, overlays, texts, drawings, audio, voice signals, etc. input through the media input window.
- the clip line may include a main clip line 404a and a sub clip line 404b.
- the clip line provided at the top of the clip display window may be referred to as the main clip line 404a, and at least one clip line provided below the main clip line 404a may be referred to as a sub clip line 404b.
- the electronic device may fix and display the main clip line 404a at the top of the clip display window, check a drag input in the region where the sub clip line 404b exists, and scroll the sub clip line 404b up and down in correspondence with the direction of the drag input.
- when the direction of the drag input is confirmed in the upper direction, the electronic device moves the sub clip line 404b to the upper region and displays it, and when the direction of the drag input is confirmed in the lower direction, the electronic device moves the sub clip line 404b to the lower region and displays it.
- the electronic device may display the vertical width of the main clip line 404a differently in response to the movement of the sub clip line 404b. For example, when the sub clip line 404b moves upward, the vertical width of the main clip line 404a may be reduced and displayed, and when the sub clip line 404b moves downward, the vertical width of the main clip line 404a may be increased and displayed.
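- The drag behavior above can be sketched as follows. This is a minimal illustration; the function name, sign convention (negative `dy` for an upward drag), and pixel bounds are assumptions, not part of the disclosure:

```python
def on_drag(sub_scroll_px: int, dy: int, main_height_px: int,
            min_main: int = 40, max_main: int = 120) -> tuple:
    """Scroll the sub clip lines with a vertical drag and adjust the main
    clip line's vertical width in the opposite sense: dragging up (dy < 0)
    reveals more sub clip lines and shrinks the main clip line, while
    dragging down enlarges it again."""
    new_scroll = max(0, sub_scroll_px - dy)  # move sub clip lines up/down
    new_main = min(max_main, max(min_main, main_height_px + dy))
    return new_scroll, new_main
```

For example, a 10-pixel upward drag from the initial state scrolls the sub clip lines by 10 pixels while reducing the main clip line's height by the same amount, within the clamped bounds.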
- the clip display window may include a time display line 404c and a reference line (Play Head) 404d indicating the time of the image project.
- the time display line 404c may be displayed on the upper portion of the main clip line 404a described above, and may include a scale or a number in a predetermined unit.
- the reference line 404d may be displayed as a line connected vertically from the time display line 404c to the lower end of the clip display window, and may be displayed in a color (eg, red) that can be easily recognized by the user.
- the reference line 404d may be provided in a fixed form in a predetermined area, and the objects included in the main clip line 404a and the sub clip line 404b provided in the clip display window, as well as the time display line 404c, may be configured to move in the left and right directions.
- the electronic device may move and display, in the left and right directions, the objects included in the main clip line 404a and the sub clip line 404b together with the time display line 404c.
- the electronic device may be configured to display a frame or object corresponding to the reference line 404d on the image display window.
- the electronic device may check a detailed time (eg, in units of 1/1000 second) that the reference line 404d comes into contact with and display the confirmed detailed time on the clip display window together.
- the electronic device may check whether a multi-touch has occurred in the clip display window and, when a multi-touch occurs, change and display the scale or numbers of a predetermined unit included in the time display line 404c in response to the multi-touch. For example, when it is confirmed that the multi-touch gap gradually decreases, the electronic device may decrease the interval between scales or numbers, and when an input with a gradually increasing multi-touch gap is checked, the electronic device may display the scales or numbers with an increased interval.
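- The pinch behavior above can be sketched as a proportional mapping between the multi-touch gap and the spacing of the time display line's scales. The function name, the proportional mapping itself, and the pixel bounds are assumptions for illustration:

```python
def scale_spacing(base_px: float, start_dist_px: float, cur_dist_px: float,
                  min_px: float = 8.0, max_px: float = 200.0) -> float:
    """Spacing between scales/numbers on the time display line under a
    multi-touch pinch: a shrinking gap decreases the spacing, and a
    widening gap increases it, clamped to illustrative bounds."""
    ratio = cur_dist_px / max(start_dist_px, 1.0)
    return min(max_px, max(min_px, base_px * ratio))
```

Halving the finger gap halves the scale spacing (subject to the clamp), and doubling the gap doubles it.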
- the electronic device may configure the clip display window 404 to select a clip displayed on the clip line, and may visualize and display that the clip is selected when the clip is selected. For example, when selection of a clip is detected, the electronic device may provide a clip selector at the boundary of the selected clip, and the clip selector may be displayed in a predetermined color, for example, yellow.
- the electronic device may provide a clip editing UI for editing the selected clip.
- the electronic device may display the clip editing UI in an area where the media input window 403 exists.
- the clip editing UI may be set differently according to the type of the selected clip. Specifically, when the type of clip is a video clip, the electronic device may configure and provide the clip editing UI 500 including a trim/split menu 501, a pan/zoom menu 502, an audio control menu 503, a clip graphic menu 504, a speed control menu 505, a voice modulation menu 512, a vignette control menu 513, an audio extraction menu 514, and the like.
- the clip editing UI for each type of clip may be configured based on the structure of the image editing UI.
- the electronic device may further display the clip editing extension UI 530 in an area where the media setting window exists.
- the clip editing extension UI displayed in the area of the media setting window may also be set differently depending on the type of the selected clip. For example, when the type of clip is a video clip, an image clip, an audio clip, or a voice signal clip, the electronic device may configure and provide the clip editing extension UI 530 including a clip deletion menu, a clip duplication menu, a clip layer duplication menu, and the like, and when the type of clip is an effect clip, a text clip, an overlay clip, or a drawing clip, the electronic device may configure and provide a clip editing extension UI including a clip deletion menu, a clip duplication menu, a bring-to-front menu, a bring-forward menu, a send-backward menu, a send-to-back menu, a horizontal center alignment menu, a vertical center alignment menu, and the like.
- the clip setting window may include a clip enlargement display menu 550 and a clip movement control menu 560 .
- when the clip enlargement display menu 550 is selected, the electronic device may enlarge the clip display window to the entire area of the display.
- the clip movement control menu 560 may move and display the clip so that it aligns with the reference line.
- the clip movement control menu 560 may include a start area movement menu or an end area movement menu, and the start area movement menu or the end area movement menu may preferably be displayed adaptively in consideration of the position of the reference line in contact with the clip.
- the electronic device basically provides a start region movement menu, and when the clip comes into contact with the start position of the reference line, the electronic device may replace the start region movement menu with the end region movement menu and display it.
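- The adaptive menu swap above can be sketched as follows. The function name, the returned menu identifiers, and the tolerance value are assumptions for illustration:

```python
def movement_menu(playhead_s: float, clip_start_s: float,
                  eps: float = 1e-3) -> str:
    """Pick the clip movement control menu adaptively: the start area
    movement menu is provided by default, and once the reference line
    (play head) contacts the clip's start position it is replaced by
    the end area movement menu."""
    if abs(playhead_s - clip_start_s) <= eps:
        return "end_area_movement"
    return "start_area_movement"
```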
- In step S140, the electronic device may check a user input entered through the editing UI, configure an image project corresponding thereto, and store the configured image project in a storage medium.
- the editing UI is configured to include an export menu in the media setting window. When the export menu is selected, the electronic device may configure image data by reflecting the information configured in the image project and store the image data in a storage medium (S150).
- the structure of the editing UI provided by the video editing UI control apparatus may be configured as follows.
- the editing UI may basically include an image display window 401, a media setting window 402, a media input window 403, a clip display window 404, a clip setting window 405, and the like, and at least one clip selected through the media input window 403 may be displayed on the clip display window 404. In addition, clip editing menus 501, 502, ..., 514 may be provided in the area where the media input window 403 exists.
- the clip editing menus 501 , 502 , ... 514 may be adaptively provided according to the structure of the editing UI for each clip type.
- the video clip editing menu may include a trim/split menu, a pan/zoom menu, an audio control menu, a clip graphics menu, a speed control menu, a reverse control menu, a rotate/mirror menu, a filter menu, a brightness/contrast/gamma control menu, an audio EQ control menu, a detailed volume control menu, a voice modulation control menu, a vignetting ON/OFF control menu, an audio extraction menu, and the like.
- the trim/split menu may include, as sub-menus, a play-head left trim menu, a play-head right trim menu, a split-from-playhead menu, a still image split and insert menu, and the like.
- the audio control menu may include, as sub-menus, a master volume control bar, a sound effect volume control bar, an automatic volume ON/OFF menu, a left and right balance adjustment bar, a pitch adjustment bar, and the like.
- the master volume control bar, the sound effect volume control bar, the left and right balance adjustment bar, the pitch adjustment bar, and the like may be set to support the detailed control UI, and these bars may be managed as the main editing UI.
- the UI set as the main editing UI may be configured to display the detailed adjustment UI together.
- when a touch input is generated for more than a predetermined time (eg, 1 second) in an area where a main editing UI set to support the detailed adjustment UI exists, the detailed adjustment UI may be activated as a sub-menu.
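- The long-touch activation rule above can be sketched as a simple timing check. The function name and millisecond parameters are assumptions; the 1-second threshold follows the example given in the text:

```python
def should_open_detail_ui(touch_down_ms: int, touch_up_ms: int,
                          supports_detail_ui: bool,
                          threshold_ms: int = 1000) -> bool:
    """Activate the detailed adjustment UI as a sub-menu when a touch is
    held for more than a predetermined time (eg, 1 second) on a main
    editing UI that is set to support the detailed adjustment UI."""
    return supports_detail_ui and (touch_up_ms - touch_down_ms) >= threshold_ms
```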
- the clip graphic menu may be configured to select at least one graphic to be inserted into a clip.
- the speed control menu may include at least one predetermined speed control button (eg, 1x, 4x, 8x), a speed control bar, a mute ON/OFF menu, a pitch maintenance ON/OFF menu, and the like, and the speed control bar may be managed as the main editing UI.
- the reverse control menu may be configured to perform reverse processing of an image included in a corresponding clip.
- the audio EQ control menu may be configured to select at least one audio EQ to be applied to an image.
- the filter menu may be configured to select at least one image filter to be applied to an image.
- the brightness/contrast/gamma control menu may include a brightness control bar, a contrast control bar, and a gamma control bar as sub-menus to control the brightness/contrast/gamma value of an image, and the brightness control bar, the contrast control bar, the gamma control bar, and the like are managed as the main editing UI and may be set to support the detailed adjustment UI.
- the rotation/mirror menu may include, as sub-menus, a horizontal mirroring menu, a vertical mirroring menu, a counterclockwise rotation menu, a clockwise rotation menu, and the like, and the counterclockwise rotation menu and the clockwise rotation menu are managed as the main editing UI and may be set to support the detailed control UI.
- the detailed volume control menu is a menu for controlling the volume of the audio included in the image and may include, as sub-menus, a control point addition menu, a control point deletion menu, a voice control bar, and the like; the voice control bar is managed as the main editing UI and may be set to support the detailed control UI.
- the audio modulation control menu may be configured to select at least one audio modulation method to be applied to an image.
- the image clip editing menu may include a trim/split menu, a pan/zoom menu, a rotation/mirroring control menu, a clip graphic menu, a filter menu, a brightness/contrast/gamma control menu, a vignetting ON/OFF control menu, and the like, and these menus may be configured similarly to the control menus illustrated in FIG. 6A.
- the effect clip editing menu may include an effect setting menu, a transparency control menu, a trim/split menu, a rotation/mirroring control menu, and the like, and the trim/split menu and the rotation/mirroring control menu may be configured similarly to the video clip editing menu.
- the effect setting menu and the transparency control menu may include an effect setting bar and a transparency control bar as sub-menus, respectively. The effect setting bar and the transparency control bar are managed as the main editing UI and may be set to support the detailed control UI.
- the overlay clip editing menu may include an overlay color setting menu, a transparency control menu, a trim/split menu, a rotation/mirroring control menu, a blend type setting menu, and the like, and the trim/split menu, the rotation/mirroring control menu, etc. may be configured similarly to the video clip editing menu.
- the transparency control menu may include a transparency control bar as a sub-menu, and the transparency control bar is managed as a main editing UI and may be set to support a detailed control UI.
- the text clip editing menu may include a text font setting menu, a text color setting menu, a trim/split menu, a transparency control menu, a rotation/mirror control menu, a text alignment setting menu, a shadow ON/OFF menu, a glow ON/OFF menu, an outline ON/OFF menu, a background color ON/OFF menu, a blend type setting menu, and the like, and the trim/split menu, the transparency control menu, the rotation/mirror control menu, etc. may be configured similarly to the video clip editing menu.
- the text color setting menu may include, as sub-menus, a color adjustment bar (eg, an R/G/B adjustment bar) and a transparency control bar for adjusting transparency, and the color adjustment bar (eg, the R/G/B adjustment bar) or the transparency control bar is managed as the main editing UI and may be set to support the detailed control UI.
- the drawing clip editing menu may include a transparency control menu, a trim/split menu, a rotation/mirror control menu, a blend type setting menu, and the like, and the trim/split menu, the rotation/mirror control menu, etc. may be configured similarly to the video clip editing menu.
- the transparency control menu may include a transparency control bar as a sub-menu, and the transparency control bar is managed as a main editing UI and may be set to support a detailed control UI.
- the audio clip editing menu may include an audio control menu, a voice EQ control menu, a detailed volume control menu, a voice modulation control menu, a ducking ON/OFF control menu, a repeat ON/OFF control menu, a trim/split menu, and the like.
- the audio control menu, the voice EQ control menu, the detailed volume control menu, the voice modulation control menu, the trim/split menu, etc. may be configured similarly to the video clip editing menu.
- FIG. 6 is a flowchart illustrating a method of editing advertisement content according to an embodiment of the present disclosure.
- the operation may be started as the video editing application is selected and executed by a user input.
- the electronic device may output an initial screen of the video editing application to a display device (eg, a display).
- a menu (or UI) for creating a new image project and an image project selection menu (or UI) for selecting an image project being edited in advance may be provided on the initial screen.
- an advertisement content editing process may be performed using a process similar to step S115.
- an advertisement content editing process may proceed using a process similar to step S125.
- a user may load a project related to advertisement content 600 by using an editing application ( S205 ).
- the user may access the advertisement content source using the electronic device 101 to obtain the advertisement content ( 600 in FIG. 7A ).
- Advertisement content sources may include, for example, mobile web pages, banners, links, YouTube, VLOGs, social media, and advertisement content shared by other users and served from these sources.
- a user may create a new video project by loading advertisement content provided from an advertisement content source into a video editing application.
- the user may select and load a project related to advertisement content that has already been saved as a project in the application and is being edited.
- the advertisement content 600 may include at least one of text, audio, image, and video as a media object, and may be original content produced by an advertiser or producer, or modified advertising content edited by other users.
- the media object of the advertisement content 600 may include at least one of various editing elements illustrated in the effect input menu, the overlay input menu, and the drawing input menu.
- the advertisement content 600 may include an object that is allowed to be edited in the corresponding content, i.e., an editable element, as attribute information.
- the element may be attribute data defining at least one of a media object constituting the advertisement content 600 , an editing tool for editing the advertisement content 600 , and category information designated for the concept of the advertisement content 600 .
- the attribute data may be implemented in the form of meta data.
- the elements can be produced as advertisement content by being arranged and combined in time and space.
- each element may be overlapped and arranged in a depth direction in the same time and two-dimensional space, and in this case, depth information between each element may be included.
- the arrangement and combination of the elements described above may be referred to as a relationship of elements of advertisement content in this specification.
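- The attribute information above, recorded as metadata describing each element's spatio-temporal arrangement (including depth for overlapped elements) and editability, might look like the following. This layout is entirely hypothetical; the key names and values are illustrative assumptions, not part of the disclosure:

```python
import json

# Hypothetical metadata for advertisement content attribute information:
# each element records its kind, time span, depth (for elements overlapped
# in the same time and two-dimensional space), and whether edits are allowed.
ad_metadata = json.loads("""
{
  "category": {"media_type": "video", "concept": "seasonal"},
  "elements": [
    {"id": "bg_video",   "kind": "media",  "t": [0.0, 15.0], "depth": 0, "editable": false},
    {"id": "logo_song",  "kind": "media",  "t": [0.0, 15.0], "depth": 1, "editable": false},
    {"id": "caption",    "kind": "media",  "t": [2.0, 6.0],  "depth": 2, "editable": true},
    {"id": "distort_fx", "kind": "effect", "t": [4.0, 8.0],  "depth": 3, "editable": true}
  ]
}
""")
```

The relationship of elements described in the text corresponds here to the combination of time spans and depth values across the element list.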
- the element related to the media object may be attribute data designating at least a part of an image displayed on the advertisement content 600 .
- the media-object-related element may be attribute data designating at least one of the music, sound, image frame, graphic icon, graphic sticker, background image, text, and overlapping layer constituting the illustrated advertisement content 600.
- the editable element may be designated as at least a part of the objects (eg, the aforementioned image, music, and sound) constituting the media object.
- the element related to the editing tool may include an editing function that gives an additional effect to the media constituting the advertisement content 600 .
- the editing tool is provided in the media input window 403 and the clip editing UI 500 and may be editing elements that provide various additional effects to the media.
- the editable element may be designated as at least one of the plurality of editable elements.
- the category information is a major form for determining the concept of the advertisement content 600 , and may include at least one of a media type of the advertisement content 600 , association information between media objects, and additional effect information.
- the media type may be designated as at least one representative type among a plurality of types including video, audio, image, and text.
- An entity related to an advertisement concept or identity may be expressed by combining a plurality of media objects. For example, media objects may be correlated to express an image composition representing a concept, a brand, a logo song, a character, a product, or a specific background.
- the advertisement producer may designate the advertisement concept object as category information after generating the advertisement content, and the category information may include association information between the concept object and the related media object.
- the additional effect information may be an additional effect related to the advertisement concept object, and the additional effect is substantially the same as described above.
- the editable element may include at least one of a media form of the advertisement content 600 allowing concept transformation, association information between media objects, and insertable additional effect information.
- when the advertisement content 600 includes only editable elements as attribute information, the other elements of the advertisement content 600 may be regarded as non-editable elements.
- alternatively, the attribute information may be recorded by designating both an editable element and a non-editable element in the advertisement content 600.
- the editable element and/or the non-editable element may be designated by an advertisement producer and recorded as attribute information.
- user-editable and non-editable elements may be designated by advertisers, producers, previous editors, and the like in view of the advertisement identity, concept, effect, and so on.
- a category governing the entire advertisement format may also be designated by an advertiser, a producer, a previous editor, and the like. The designation of these elements may be labeled on the necessary elements by the producer or the like, and a category may be preset when the producer or the like creates the advertisement content.
- FIGS. 7A to 7D are diagrams illustrating a process in which advertisement contents are edited by the method of editing advertisement contents according to an embodiment of the present disclosure.
- the advertisement content 600 may be provided to the editing UI application and displayed on the image display window 401 .
- the clip display window 404 may provide clip lines 404a and 404b for each media object of the advertisement content 600 .
- a clip related to an original image or moving picture of a person displayed in the advertisement content 600 illustrated in FIG. 7A may be disposed on the main clip line 404a.
- the notification related to the distortion effect of the person in the advertisement content 600 may be displayed as information related to the additional effect on the main clip line 404a, but is not limited thereto.
- the notification may be implemented in a separate clip line or other form in the media input window 403 .
- Clips related to music, sub-images, texts, sounds, stickers, icons, etc. included in the advertisement content 600 may be represented by corresponding sub-clip lines 404b, respectively.
- clip editing menus 501 , 502 , ... 514 may be provided in an area where the media input window 403 exists.
- the clip editing menus 501 , 502 , ... 514 may be adaptively provided according to the structure of the editing UI for each clip type exemplified above. Detailed descriptions related thereto will be omitted.
- the electronic device in which the video editing application is embedded may extract a user-editable element and a non-editable element from the advertisement content 600 ( S210 ).
- the processor 140 may analyze the attribute information of the advertisement content 600 to distinguish an editable element and a non-editable element.
- the processor 140 may present the editable element to at least one of a media object, an editing user interface for editing the advertisement content 600, and a predetermined area of the editing application.
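- The extraction step above, in which attribute information is analyzed to distinguish editable elements from non-editable ones, can be sketched as follows. The function name and metadata layout are illustrative assumptions; elements without an explicit flag default to non-editable, matching the case where only editable elements are recorded:

```python
def split_elements(metadata: dict) -> tuple:
    """Analyze the attribute information of the advertisement content and
    distinguish editable elements from non-editable ones. Elements that
    carry no explicit 'editable' flag are treated as non-editable."""
    editable, locked = [], []
    for element in metadata.get("elements", []):
        target = editable if element.get("editable", False) else locked
        target.append(element["id"])
    return editable, locked
```

The editable list could then drive the visual activation of the corresponding clip lines and editing menus, while the locked list identifies objects protected for the advertisement identity.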
- Taking FIG. 7A as an example, the human face constituting the advertisement content 600, various images around the face, music, special sounds, designed text such as “HAPPY HALLOWEEN”, background music, the distortion effect of the human face, and the like may correspond to editable elements. FIG. 7A illustrates an editable element being presented in the editing user interface.
- the processor 140 may process an editing user interface related to the elements to visually indicate that it is activated.
- the processor 140 may visually indicate that only some of the additional effect functions provided in the sub-menu of the menu are activated.
- the effect input menu in the layer input menu 604c may provide a blur effect, a mosaic effect, a noise effect, a sandstorm effect, a melting effect, a crystal effect, a star filter effect, a display board effect, a haze effect, a fisheye lens effect, a magnifying glass lens effect, a flower twist effect, a night vision effect, a sketch effect, and the like.
- the editable element may be represented by at least some of effects applied to the advertisement content 600 .
- the above example describes activation of the effect function key.
- the effects applied to the advertisement content 600 may be displayed on the sub clip line 404b, and the effect that is an editable element may be visually indicated as active on the sub clip line 404b.
- the processor 140 may visually process the original human face to represent an editable element in the image display window 401 .
- the list of editable media objects may be presented in an editing application other than the image display window 401 and the clip display window 404 .
- the category information may include at least one of a media form of the advertisement content 600 that allows a transformation of the advertisement concept, association information between media objects, and information about insertable additional effects.
- the processor 140 may visually process the media objects to be distinguished from other objects in the image display window 401 so that the media objects related to the allowable media types, related information, and additional effect information may be identified.
- a clip line associated with the media objects may be visually activated in the clip display window 404 .
- an additional effect function allowing concept transformation may be displayed on a submenu of the media input window 403 , or a list of media objects that cannot be transformed due to an advertisement concept may be presented in a separate area.
- the user may select an editable element from the advertisement content 600 (S215).
- the processor 140 may present the editable element and the editable range category through the clip display window 404, and may provide the clip editing UI illustrated in FIG. 5 through the media input window 403 when the user selects a clip.
- in the following, it is assumed that the editable category information of the advertisement content 600 includes all of the video, image, music, and text types, and the process of editing a human face among the editable elements is taken as an example.
- the present embodiment is not limited to this example, and various edits are possible for elements that allow editing in the advertisement content 600 .
- editing may include adding background music, voice, stickers, icons, or text; changing the order of video frames; overlapping video and images in time and in two-dimensional space; category transformation within the allowable range; and the like.
- the processor 140 receives the user input for the selected editable element.
- the processor 140 may display the clip selector 606 in response to receiving the input.
- the processor 140 may present an editable UI to the clip in the media input window 403 .
- the processor 140 may provide an interface 516 for an insertion activation request in the media input window 403; when the user selects it, the processor 140 may activate the request to insert a replacement for the selected human face (S220).
- the interface of the insertion activation request appears as Replace 516 in the media input window 403, and the user selects Replace 516 to activate replacement editing of the advertisement content 600.
- a soft key associated with an activation request may clearly present to the user that the media object is editable.
- the soft key may activate presentation of a candidate item related to the replacement insertion element when there are a plurality of insertion elements to be replaced.
- the processor 140 may present a plurality of candidate items that can be inserted into the editable element according to the insertion activation request, and receive the candidate item 608 selected by the user's input (S225).
- a plurality of images to replace the human face selected as the editable element are presented as the candidate item list 517 .
- the candidate item list may provide an object type button 518 so that candidate items are presented for each media object type.
- if the user wants to select a picture rather than a video as the insertion element, the user can touch the picture interface on the object type button 518 to check the picture candidate items.
- candidate items may be presented for each media object, or categories of various concepts in which media objects are combined may be presented as candidate items.
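The per-type filtering behind the object type button 518 can be sketched as below; the item schema (`id` and `type` keys) is an assumption made for illustration:

```python
def candidates_for_type(candidate_items, object_type):
    """Filter the candidate item list 517 down to one media object type,
    mirroring what selecting the object type button 518 would show."""
    return [item for item in candidate_items if item["type"] == object_type]

# Hypothetical candidate pool mixing videos and pictures.
items = [
    {"id": "clip1", "type": "video"},
    {"id": "pic1", "type": "picture"},
    {"id": "pic2", "type": "picture"},
]
pictures = candidates_for_type(items, "picture")  # only the picture candidates
```

Presenting combined-concept categories, as the text mentions, would amount to grouping such items by a shared concept key rather than by raw type.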
- the processor 140 may determine the selected item as an insertion element and edit the advertisement content 600 based on the insertion element (S230).
- based on the other person's face image selected by the user, the processor 140 may replace the existing human face displayed on the clip display window 404 in the advertisement content 600 with the selected image.
- the distortion effect applied to the existing human face is processed so as to be applied to the selected image 608, and the edited advertisement content 600a may be provided on the image display window 401.
- the edited image 610 in which the selected image and the distortion effect are combined may be generated to be included in the edited advertisement content 600a.
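One way to read this replacement step is that the clip's base media is swapped while its applied effect list is kept and re-rendered over the new media. A minimal sketch under that assumption (the clip dict structure is hypothetical, not the patent's data model):

```python
def replace_base_media(clip, new_media):
    """Swap the clip's underlying media while keeping its effect list,
    so an effect such as "distortion" carries over to the inserted image."""
    edited = dict(clip)  # shallow copy; the original clip is left untouched
    edited["media"] = new_media
    edited["effects"] = list(clip.get("effects", []))  # reuse the same effects
    return edited

original = {"media": "face_original.png", "effects": ["distortion"]}
edited = replace_base_media(original, "face_selected.png")
# `edited` keeps the distortion effect but points at the replacement image,
# matching the combined edited image 610 described above.
```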
- the user may upload the edited advertisement content to the content platform 602 through the export function or project storage mentioned in FIG. 3 to share it with other users, or save the advertisement content whose editing is not yet complete as a project to be re-edited later.
- FIG. 8 is a flowchart of an evaluation process of edited advertisement content according to another embodiment of the present disclosure.
- the advertisement content 600a edited through FIG. 6 may be uploaded through the video editing application itself or another content platform.
- at least one of the application-embedded processor 140 and the server of the content platform may calculate edit evaluation data for the uploaded edited advertisement content 600a (S305).
- the editing evaluation data may be calculated based on at least one of an insertion element added to the edited advertisement content 600a and an advertisement element maintained in the advertisement content 600 .
- the edit evaluation data may be generated based on at least one of the retention rate of the advertisement elements, the importance information of the advertisement elements set in the advertisement content 600, and the deformation degree information of the advertisement content 600 according to the insertion element.
- the calculation reference data used to calculate the deformation degree of the edited advertisement content from the deformation degree information, as well as the importance information, may be provided by the content platform or a server of the advertisement provider.
- an advertisement provider may assign high importance, i.e., a weight, to specific elements among the editable advertisement elements and categories.
- a high-weighted element may be a media object, an additional effect, or a category desired to be exposed in an original state as much as possible.
- the weight may be presented to the user during the editing process of the advertisement content in FIG. 6, and the user may still edit even a high-importance editable element according to the advertisement effect and his or her intention.
- the content platform or the processor 140 may identify at least one of a media object, an additional effect, and a category maintained in the advertisement content 600, and may calculate the ratio and the importance of the maintained elements with respect to the entire advertisement content. In addition, along with the degree of deformation of each edited element, the rate of change in importance according to the degree of deformation may be calculated.
- the advertisement producer may generate importance information by setting weights related to the advertisement concept for each editable element.
- when an element having a high weight is edited, the edit evaluation data may be calculated to be lower than when an element having a low weight is edited.
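A weighted retention score consistent with this description can be sketched as follows; the disclosure does not give an exact formula, so the normalized weighted sum below is an illustrative assumption:

```python
def edit_evaluation_score(weights, retained):
    """Normalized weighted retention: the share of total importance weight
    carried by advertisement elements that survived the edit. Replacing a
    high-weight element therefore lowers the score more than replacing a
    low-weight one."""
    total = sum(weights.values())
    kept = sum(w for element, w in weights.items() if element in retained)
    return kept / total if total else 0.0

# Hypothetical weights: the provider protects the logo and slogan heavily;
# the user replaced only the face.
weights = {"logo": 5.0, "slogan": 3.0, "face": 1.0, "bgm": 1.0}
score = edit_evaluation_score(weights, retained={"logo", "slogan", "bgm"})  # → 0.9
```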
- the deformation degree information may include a separation degree indicating how far the insertion element that replaced the editable element deviates from the original element.
- for an image, the separation degree may be the degree of change with respect to the concept or identity of the original image, determined by machine-learning analysis of the image based on the calculation reference data.
- for music, the separation degree may be the degree of change in mood, tempo, and the like, based on calculation reference data for the original music.
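The separation degree could, for instance, be computed as a distance between machine-learned feature vectors of the original and replacement elements. The disclosure only says the degree is derived against calculation reference data, so cosine distance over embeddings is one plausible, assumed realization:

```python
import math

def separation_degree(vec_original, vec_replacement):
    """Separation as 1 - cosine similarity between feature vectors of the
    original element and its replacement (the embeddings are assumed to come
    from a model trained against the calculation reference data)."""
    dot = sum(a * b for a, b in zip(vec_original, vec_replacement))
    norm = (math.sqrt(sum(a * a for a in vec_original))
            * math.sqrt(sum(b * b for b in vec_replacement)))
    return 1.0 - dot / norm if norm else 1.0

same_concept = separation_degree([1.0, 0.0], [2.0, 0.0])  # same direction → 0.0
unrelated = separation_degree([1.0, 0.0], [0.0, 1.0])     # orthogonal → 1.0
```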
- the processor 140 or the server may generate first evaluation information according to the edit evaluation data (S310).
- the second evaluation information may be generated by collecting reaction information on the edited advertisement content 600a of another user (S315).
- the server of the content platform or the advertisement provider may evaluate the reaction information on the edited advertisement content 600a to derive the second evaluation information. The reactions of other users may be measured, for example, through preference ratings, comments, and the like.
- the first and second evaluations of steps S310 and S315 may be performed in a different order from FIG. 8, for example in reverse order or in the same stage, and the weights applied to the first and second evaluation information may also be set depending on the situation.
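Blending the first (edit-based) and second (reaction-based) evaluation information with situational weights might look like the sketch below; the linear combination and the 50/50 default are assumptions, not specified in the disclosure:

```python
def combined_evaluation(first_score, second_score, w_first=0.5, w_second=0.5):
    """Situational weighting of the edit-based (first) and reaction-based
    (second) evaluation information as a simple linear blend."""
    return w_first * first_score + w_second * second_score

# An advertiser who cares more about brand preservation than user reaction:
score = combined_evaluation(0.8, 0.6, w_first=0.7, w_second=0.3)  # ≈ 0.74
```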
- the first and second evaluation information may be transmitted to the electronic device 101 of the user and the server of the advertisement content provider ( S320 ).
- the advertisement provider may quantitatively analyze the advertisement effect on the original advertisement content 600 and the edited advertisement content 600a.
- the user may be provided with various benefits through advertisers, application operators, advertising agencies, and the like.
- the content is exemplified here as advertisement content, but as long as its properties are not impaired, the content may include not only advertisements but also general content generated by individual users.
- Example methods of the present disclosure are expressed as a series of operations for clarity of description, but this is not intended to limit the order in which the steps are performed, and if necessary, each step may be performed simultaneously or in a different order.
- other steps may be included in addition to the illustrated steps, some of the illustrated steps may be excluded, or additional steps may be included while some steps are excluded.
- various embodiments of the present disclosure may be implemented by hardware, firmware, software, or a combination thereof.
- for implementation by hardware, ASICs (Application-Specific Integrated Circuits), DSPs (Digital Signal Processors), DSPDs (Digital Signal Processing Devices), PLDs (Programmable Logic Devices), FPGAs (Field-Programmable Gate Arrays), general-purpose processors, controllers, microcontrollers, microprocessors, and the like may be used.
- the scope of the present disclosure includes software or machine-executable instructions (e.g., an operating system, an application, firmware, a program, etc.) that cause operations according to the methods of the various embodiments to be executed on a device or computer, as well as non-transitory computer-readable media storing such software or instructions so that they are executable on a device or computer.
Claims (14)
- A method of editing advertisement content performed by a computing device including at least one processor, the method comprising: extracting and presenting an editable element from loaded advertisement content; receiving the presented editable element selected by a user input; selecting an insertion element for the editable element by the user input; and editing the advertisement content based on the selected insertion element.
- The method of claim 1, wherein the editable element includes at least one of a media object constituting the advertisement content, an editing tool for editing the advertisement content, and category information designated for a concept of the advertisement content.
- The method of claim 2, wherein the editing tool includes an editing function for applying an additional effect to media constituting the advertisement content.
- The method of claim 2, wherein the category information includes at least one of a media form of the advertisement content allowing transformation of the concept, association information between the media objects, and insertable additional effect information.
- The method of claim 2, wherein presenting the editable element comprises presenting the editable element in at least one of the media object, an editing user interface for editing the advertisement content, and a predetermined area of an editing application provided by the computing device.
- The method of claim 1, wherein extracting the editable element is performed based on attribute information included in the advertisement content, the attribute information recording a distinction between editable elements and non-editable elements in the advertisement content.
- The method of claim 1, wherein receiving the editable element comprises receiving the input of the user selecting the editable element, and receiving an insertion activation request according to the user input for the editable element.
- The method of claim 1, wherein selecting the insertion element comprises presenting a plurality of candidate items insertable into the editable element, and receiving a candidate item selected by the input of the user.
- The method of claim 1, further comprising: sharing the edited advertisement content through a content platform; generating evaluation information by evaluating the shared edited advertisement content; and transmitting the evaluation information to the computing device and a server related to provision of the advertisement content.
- The method of claim 9, wherein generating the evaluation information comprises generating first evaluation information based on at least one of an insertion element added to the edited advertisement content and an advertisement element maintained in the advertisement content.
- The method of claim 10, wherein the first evaluation information is generated based on at least one of a retention rate of the advertisement element, importance information of the advertisement element set in the advertisement content, and deformation degree information of the advertisement content according to the insertion element.
- The method of claim 11, wherein the first evaluation information is generated by at least one of the computing device and the server, and calculation reference data used to calculate a deformation degree of the edited advertisement content from the deformation degree information and the importance information are provided from the server.
- The method of claim 9, wherein generating the evaluation information further comprises generating second evaluation information by collecting reaction information on the edited advertisement content from other users of the content platform.
- A computing device for editing advertisement content, comprising: a communication module; and a processor communicating with the communication module and controlling the computing device, wherein the processor extracts and presents an editable element from loaded advertisement content, receives the presented editable element selected by a user input, selects a replacement element for the editable element by the user input, and edits the advertisement content based on the selected replacement element.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/251,779 US20240005364A1 (en) | 2020-11-09 | 2021-11-08 | Method and device for editing advertisement content |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2020-0148865 | 2020-11-09 | ||
KR20200148865 | 2020-11-09 | ||
KR1020210152200A KR20220063103A (ko) | 2020-11-09 | 2021-11-08 | 광고 컨텐츠의 편집 방법 및 그 장치 |
KR10-2021-0152200 | 2021-11-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022098187A1 true WO2022098187A1 (ko) | 2022-05-12 |
Family
ID=81458172
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2021/016137 WO2022098187A1 (ko) | 2020-11-09 | 2021-11-08 | 광고 컨텐츠의 편집 방법 및 그 장치 |
Country Status (2)
Country | Link |
---|---|
US (1) | US20240005364A1 (ko) |
WO (1) | WO2022098187A1 (ko) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20040024909A (ko) * | 2002-09-17 | 2004-03-24 | 최상철 | 변환형 동영상 캐릭터 광고 제작 방법 |
US20070146812A1 (en) * | 2005-12-02 | 2007-06-28 | Lawton Scott S | Reader editable advertising |
KR20080087067A (ko) * | 2007-02-08 | 2008-09-30 | 리얼네트웍스아시아퍼시픽 주식회사 | 컨텐츠 편집툴을 이용한 광고용 멀티미디어 컨텐츠제공방법 |
KR20100012702A (ko) * | 2008-07-29 | 2010-02-08 | 엔에이치엔비즈니스플랫폼 주식회사 | 콘텐츠 편집 광고 방법 및 시스템 |
KR20140026671A (ko) * | 2012-08-22 | 2014-03-06 | 에스케이플래닛 주식회사 | 광고 중개 시스템과 방법 및 이를 지원하는 장치 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060015904A1 (en) * | 2000-09-08 | 2006-01-19 | Dwight Marcus | Method and apparatus for creation, distribution, assembly and verification of media |
US8655718B2 (en) * | 2007-12-18 | 2014-02-18 | Yahoo! Inc. | Methods for augmenting user-generated content using a monetizable feature |
US20140067522A1 (en) * | 2012-09-01 | 2014-03-06 | Sokrati Technologies Pvt Ltd | Method and system for managing online paid advertisements |
2021
- 2021-11-08 US US18/251,779 patent/US20240005364A1/en not_active Abandoned
- 2021-11-08 WO PCT/KR2021/016137 patent/WO2022098187A1/ko active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20240005364A1 (en) | 2024-01-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21889652 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18251779 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21889652 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.10.2023) |
|