KR20170032590A - Method and system for applying image effect collectively for images included in a group - Google Patents

Method and system for applying image effect collectively for images included in a group Download PDF

Info

Publication number
KR20170032590A
Authority
KR
South Korea
Prior art keywords
image
input
screen
images
user
Prior art date
Application number
KR1020150130097A
Other languages
Korean (ko)
Other versions
KR101741906B1 (en)
Inventor
남세동
정창영
김영훈
Original Assignee
라인 가부시키가이샤
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 라인 가부시키가이샤 filed Critical 라인 가부시키가이샤
Priority to KR1020150130097A priority Critical patent/KR101741906B1/en
Publication of KR20170032590A publication Critical patent/KR20170032590A/en
Application granted granted Critical
Publication of KR101741906B1 publication Critical patent/KR101741906B1/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20004Adaptive image processing

Abstract

The present invention discloses a method and system for applying an image effect to the images included in one group at one time. An image editing method comprises the following steps of: displaying, on a screen of an electronic device, at least one image of the images included in one group or a thumbnail image of the at least one image; recognizing a user's input with respect to the image or thumbnail image displayed on the screen; randomly selecting one of a plurality of image effects according to the recognized input of the user, or selecting an image effect corresponding to the type of the user's input among the plurality of image effects; and applying the selected image effect at one time to each of the images included in the image group or to each of the thumbnail images of the images.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention [0001] The present invention relates to a method and system for collectively applying image effects to images included in one group.

The following description relates to a method and system for batch application of image effects to images contained in a group.

There are techniques for applying various effects to an image. For example, Korean Patent Laid-Open Publication No. 10-2007-0016295 discloses a digital image color effect changing method, wherein when a color effect menu is selected, an original image and a color effect image are simultaneously displayed. Such prior art discloses the application of an image effect to one image at a time.

However, there is a need to batch-apply the same effect to multiple images at once. In addition, users who are not familiar with image effects often want to apply various effects to their images, yet it is very difficult for these users to find a desired effect and apply it to a desired image. Furthermore, users who are not familiar with image effects may only learn what an effect does after it has been applied to an image. For example, when a blur effect is applied to an image, a user who does not know what the blur effect is may find it difficult to grasp exactly what effect was applied from a single blurred image.

References: PCT/KR2014/010167, US20140019540A1, US20130332543A1, US20130260893

At least one image (or a thumbnail image of at least one image) of images classified into one group such as a folder or an album is displayed on the screen, a user's input on the displayed image (or thumbnail image) is recognized, and an image effect randomly selected according to the input (or an image effect corresponding to the type of the input) is collectively applied to each image (or each thumbnail image) of the group.

Also provided are a method and system that change the image (or thumbnail image) displayed on the screen in accordance with a first input of the user (for example, a first touch input), select an image effect in accordance with a second input of the user (for example, a second touch input), and sequentially apply the selected image effect to the changed image whenever the image displayed on the screen is changed.

An image editing method comprises: displaying, on a screen in an electronic device, at least one image of images included in one image group or a thumbnail image of the at least one image; recognizing a user's input on an image or thumbnail image displayed on the screen; randomly selecting one of a plurality of image effects according to the recognized input of the user, or selecting an image effect corresponding to the type of the user's input among the plurality of image effects; and collectively applying the selected image effect to each of the images included in the image group or to each of the thumbnail images of the images.
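The four steps above can be sketched as follows. This is a minimal illustrative sketch only: the image representation, the effect names, and the function names (`select_effect`, `apply_effect_to_group`) are hypothetical placeholders, not the implementation disclosed by the patent.

```python
import random

# Hypothetical effect table: each effect is a function from image to image.
EFFECTS = {
    "blur": lambda img: f"blur({img})",
    "sepia": lambda img: f"sepia({img})",
    "mono": lambda img: f"mono({img})",
}

def select_effect(user_input, effects=EFFECTS):
    """Selection step: pick an effect at random for a preset gesture,
    or pick the effect corresponding to the type of the input."""
    if user_input == "random-gesture":
        return random.choice(sorted(effects))
    return user_input if user_input in effects else None

def apply_effect_to_group(group, effect_name, effects=EFFECTS):
    """Application step: apply the selected effect collectively to
    every image in the group, with no per-image interaction."""
    effect = effects[effect_name]
    return [effect(img) for img in group]

group = ["img1", "img2", "img3"]     # images in one folder/album
chosen = select_effect("sepia")      # effect chosen by input type
print(apply_effect_to_group(group, chosen))
```

A single recognized input thus drives both the selection and the batch application, which is the point of the method.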

CLAIMS What is claimed is: 1. A computer-implemented electronic device system, comprising: at least one processor configured to execute computer-readable instructions, wherein the at least one processor comprises: a display control unit for controlling the electronic device to display, on a screen, at least one image of images included in one image group or a thumbnail image of the at least one image; an input recognition unit for recognizing a user's input on an image or thumbnail image displayed on the screen; an image effect selection unit for randomly selecting one of a plurality of image effects according to the recognized input of the user, or selecting an image effect corresponding to the type of the user's input among the plurality of image effects; and an image effect applying unit for collectively applying the selected image effect to each of the images included in the image group or to each of the thumbnail images of the images.

At least one image (or a thumbnail image of at least one image) of images classified into one group such as a folder or an album is displayed on the screen, a user's input on the displayed image (or thumbnail image) is recognized, and an image effect randomly selected in accordance with the input (or an image effect corresponding to the type of the input) can be collectively applied to the images (or thumbnail images) of the group.

The image (or thumbnail image) displayed on the screen is changed in accordance with a first input of the user (for example, a first touch input), an image effect is selected in accordance with a second input (for example, a second touch input), and when the image displayed on the screen is changed, the selected image effect can be sequentially applied to the changed image.

FIG. 1 is a diagram illustrating an example of a network environment according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating an internal configuration of an electronic device and a server according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating an example of components that a processor of an electronic device according to an embodiment of the present invention may include.
FIG. 4 is a flowchart illustrating an example of a method that an electronic device according to an embodiment of the present invention can perform.
FIGS. 5 to 7 are diagrams illustrating an example of a process of applying an image effect using a touch input on a touch screen in an embodiment of the present invention.
FIG. 8 is a diagram illustrating an example in which a user's input is recognized using an effect application interface in an embodiment of the present invention.
FIGS. 9 and 10 are diagrams illustrating an example of applying an image effect to thumbnail images in an embodiment of the present invention.

Hereinafter, embodiments will be described in detail with reference to the accompanying drawings.

FIG. 1 is a diagram illustrating an example of a network environment according to an embodiment of the present invention. FIG. 1 shows an example including a plurality of electronic devices 110, 120, 130, 140, a plurality of servers 150, 160, and a network 170. However, the number of electronic devices and the number of servers is not limited to that shown in FIG. 1.

The plurality of electronic devices 110, 120, 130, 140 may be fixed terminals or mobile terminals implemented as computer devices. Examples of the plurality of electronic devices 110, 120, 130, 140 include a smart phone, a mobile phone, a navigation device, a computer, a notebook computer, a digital broadcast terminal, a PDA (Personal Digital Assistant), and a tablet PC. For example, the electronic device 1 110 may communicate with the other electronic devices 120, 130, 140 and/or the servers 150, 160 via the network 170 using a wireless or wired communication scheme.

The communication method is not limited, and may include a communication method using a communication network (for example, a mobile communication network, the wired Internet, the wireless Internet, or a broadcasting network) that the network 170 may include, as well as short-range wireless communication between the devices. For example, the network 170 may include any one or more of networks such as a personal area network (PAN), a local area network (LAN), a campus area network (CAN), a metropolitan area network (MAN), a wide area network (WAN), a broadband network (BBN), and the Internet. The network 170 may also include any one or more of network topologies including a bus network, a star network, a ring network, a mesh network, a star-bus network, and a tree or hierarchical network, but is not limited thereto.

Each of the servers 150 and 160 may be a computer device or a plurality of computer devices that communicate with the plurality of electronic devices 110, 120, 130, 140 through the network 170 to provide commands, code, files, contents, services, and the like.

In one example, the server 160 may provide a file for installation of an application to the electronic device 1 110 connected via the network 170. In this case, the electronic device 1 110 can install the application using the file provided from the server 160. The electronic device 1 110 may also connect to the server 150 under the control of the operating system (OS) and at least one program (for example, a browser or the installed application) and receive services or contents provided by the server 150. For example, when the electronic device 1 110 transmits a service request message to the server 150 via the network 170 under the control of the application, the server 150 may transmit a code corresponding to the service request message to the electronic device 1 110, and the electronic device 1 110 can provide a service or contents to the user by composing and displaying a screen according to the code under the control of the application. As another example, the server 150 may establish a communication session for a messaging service and route message transmission/reception between the plurality of electronic devices 110, 120, 130, 140 through the established communication session.

2 is a block diagram illustrating an internal configuration of an electronic device and a server according to an embodiment of the present invention. In FIG. 2, an internal configuration of the electronic device 1 (110) as an example of one electronic device and the server 150 as an example of one server will be described. Other electronic devices 120, 130, 140 or server 160 may have the same or similar internal configurations.

The electronic device 1 110 and the server 150 may include memories 211 and 221, processors 212 and 222, communication modules 213 and 223, and input/output interfaces 214 and 224. The memories 211 and 221 may be computer-readable recording media and may include permanent mass storage devices such as a random access memory (RAM), a read only memory (ROM), and a disk drive. The memories 211 and 221 may store an operating system and at least one program code (for example, code for a browser installed in the electronic device 1 110 or for the above-described application). These software components may be loaded from a computer-readable recording medium separate from the memories 211 and 221 using a drive mechanism. Such a separate computer-readable recording medium may include a floppy drive, a disk, a tape, a DVD/CD-ROM drive, and a memory card. In other embodiments, the software components may be loaded into the memories 211 and 221 via the communication modules 213 and 223 rather than a computer-readable recording medium. For example, the at least one program may be loaded into the memories 211 and 221 based on a program (for example, the above-described application) installed by files provided via the network 170 by a file distribution system (for example, the above-described server 160) that distributes installation files of developers or applications.

The processors 212 and 222 may be configured to process instructions of a computer program by performing basic arithmetic, logic, and input/output operations. The instructions may be provided to the processors 212 and 222 by the memories 211 and 221 or the communication modules 213 and 223. For example, the processors 212 and 222 may be configured to execute received instructions in accordance with program code stored in a recording device such as the memories 211 and 221.

The communication modules 213 and 223 may provide functions for the electronic device 1 110 and the server 150 to communicate with each other through the network 170, and may provide functions for communicating with other electronic devices (for example, the electronic device 2 120) or other servers (for example, the server 160). For example, a request generated by the processor 212 of the electronic device 1 110 in accordance with program code stored in a recording device such as the memory 211 (for example, a streaming service request for content) may be transmitted to the server 150 via the network 170 under the control of the communication module 213. Conversely, control signals, commands, contents, files, and the like provided under the control of the processor 222 of the server 150 may be received by the electronic device 1 110 through the communication module 223 and the network 170, via the communication module 213 of the electronic device 1 110. For example, control signals and commands of the server 150 received through the communication module 213 may be transmitted to the processor 212 or the memory 211, and contents and files may be stored in a storage medium that the electronic device 1 110 may further include.

The input/output interfaces 214 and 224 may be means for interfacing with an input/output device 215. For example, the input device may include a device such as a keyboard or a mouse, and the output device may include a device such as a display for displaying a communication session of the application. As another example, the input/output interface 214 may be a means for interfacing with a device having integrated functions for input and output, such as a touch screen. More specifically, the processor 212 of the electronic device 1 110 may display a service screen or contents on the display through the input/output interface 214 using data provided by the server 150 or the electronic device 2 120 in processing the commands of the computer program loaded in the memory 211.

Also, in other embodiments, the electronic device 1 110 and the server 150 may include more components than those of FIG. 2. However, most prior art components need not be clearly illustrated. For example, the electronic device 1 110 may be implemented to include at least a portion of the above-described input/output devices 215, or may further include other components such as a transceiver, a Global Positioning System (GPS) module, and a camera.

FIG. 3 is a diagram illustrating an example of components that the processor of an electronic device according to an exemplary embodiment of the present invention may include, and FIG. 4 is a flowchart illustrating an example of an image editing method that the electronic device can perform. As shown in FIG. 3, the processor 212 of the electronic device 1 110 may include a display control unit 310, an input recognition unit 320, an image effect selection unit 330, and an image effect application unit 340. The processor 212 and the components of the processor 212 may control the electronic device 1 110 to perform the steps 410 to 460 included in the image editing method of FIG. 4, and may be implemented to execute the code of the operating system and the code (computer-readable instructions) of the at least one program that the memory 211 includes.

In step 410, the processor 212 may load the program code stored in a file of the application for the image editing method into the memory 211. For example, a program file of the application may be provided by a file distribution server via the network and used to install the application in the electronic device 1 110. When the application installed in the electronic device 1 110 is executed, the processor 212 can load the program code into the memory 211.

At this time, the display control unit 310, the input recognition unit 320, the image effect selection unit 330, and the image effect application unit 340 included in the processor 212 may execute the portions of the program code loaded into the memory 211 that correspond to steps 430 to 460. At this time, the processor 212 and the components of the processor 212 may control the electronic device 1 110 in executing the image editing method. For example, the processor 212 may control the communication module 213 included in the electronic device 1 110 so that the electronic device 1 110 communicates with the server 150 or other electronic devices. As another example, the processor 212 may control the electronic device 1 110 to store the program code in the memory 211 in loading the program code into the memory 211.

At step 420, the processor 212 may load the images included in one of at least one image group stored in a storage into the memory 211. The storage may be a storage included in the electronic device 1 110 or a storage on the web. When the storage on the web is used, the processor 212 may control the electronic device 1 110 to access the storage on the web via the network 170 and download the images. The processor 212 may include a loading unit (not shown) for loading the images stored in the storage into the memory 211 of the electronic device 1 110; in this case, step 420 may be performed by the loading unit. Images can be stored and managed in the form of folders or albums in the storage, and the storage can have one or more folders or albums.

In step 430, the display control unit 310 may control the electronic device 1 110 to display, on the screen, at least one image of the images included in one image group or a thumbnail image of the at least one image. Here, the images included in the one image group may be the images loaded into the memory in step 420. When thumbnail images are used, the processor 212 may first generate thumbnail images of the images before the display control unit 310 controls the electronic device 1 110 to display them on the screen.

In step 440, the input recognition unit 320 may recognize the user's input on the image or thumbnail image displayed on the screen. For example, the input recognition unit 320 may recognize, as the user's input, the user's gesture with respect to the image or thumbnail image displayed on the screen through at least one sensor included in the electronic device 1 110. For example, when the electronic device 1 110 includes a touch screen, the user's input on the image (or thumbnail image) may be the user's touch input on the entire area of the touch screen, or the user's touch input on the area where the image (or thumbnail image) is displayed. As another example, it may be the user's selection input on an effect application interface (for example, an effect application button) that is further displayed on the screen in association with the image (or thumbnail image). Also, the user's input may be recognized using at least one sensor such as a camera, a motion sensor, or an acceleration sensor, or a physical button that the electronic device 1 110 may include.

In step 450, the image effect selection unit 330 may randomly select one of a plurality of image effects according to the recognized input of the user, or select an image effect corresponding to the type of the user's input among the plurality of image effects. In other words, the image effect to be applied to the images may be selected randomly upon recognition of a predetermined input of the user, or may be selected according to the type of the user's input. For example, the image effect selection unit 330 may randomly select one of a plurality of image effects as a preset gesture is recognized, may reselect an image effect for which a previous selection history exists, or may select an image effect corresponding to the recognized gesture from among a plurality of preset image effects.

In step 460, the image effect applying unit 340 may apply the selected image effect collectively to each of the images included in the image group or to each of the thumbnail images of the images. The process of selecting a desired image effect among a plurality of image effects through a separate interface and the process of instructing the application of the selected image effect to a specific image may be omitted. For example, the user does not need to directly select an image effect and does not need to instruct its application; the image effect selected by a simple input (for example, a swipe input on the touch screen) can be applied in batch to all the images (or all the thumbnail images).

In one embodiment, in step 460, the image effect applying unit 340 may apply the selected image effect to all the images of the group irrespective of whether or not they are displayed on the screen.

In another embodiment, in step 460, the image effect applying unit 340 may apply the selected image effect only to the images (or thumbnail images) displayed on the screen among the images (or thumbnail images) of the image group.

In a more specific embodiment, applying image effects directly to images that are not yet displayed to the user can be inefficient because it increases the load on the hardware. Accordingly, the image effect applying unit 340 can apply the image effect only to the images displayed on the screen. For example, the image effect applying unit 340 may apply the image effect to a first image displayed on the screen. Thereafter, when the image displayed on the screen changes from the first image to a second image, the image effect applying unit 340 may apply the image effect to the second image and display it on the screen. In other words, in step 460, the image effect applying unit 340 may apply the selected image effect to another image or another thumbnail image every time the image or thumbnail image displayed on the screen is changed to another image or another thumbnail image included in the image group. Therefore, although the selected image effect has not yet been applied to all the images of the group, since it is applied to every image as the image is displayed, the user perceives it as having been applied to all the images of the group. As described above, according to this embodiment, the load on the hardware can be minimized while the selected image effect appears to be collectively applied to all the images (or all the thumbnail images) of the image group.
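The lazy, display-time application strategy described above can be sketched as follows. The class and method names are hypothetical illustrations: the effect is applied to an image only at the moment it is displayed, the result is cached, and images never shown are never processed.

```python
def make_effect(name):
    """Build a placeholder effect function (hypothetical)."""
    return lambda img: f"{name}({img})"

class LazyEffectViewer:
    """Applies the selected effect per image, on first display only."""

    def __init__(self, group, effect):
        self.group = list(group)   # images in the current folder/album
        self.effect = effect       # effect chosen from the user's input
        self.cache = {}            # index -> already-edited image

    def display(self, index):
        """Apply the effect when an image is first shown, then reuse it."""
        if index not in self.cache:
            self.cache[index] = self.effect(self.group[index])
        return self.cache[index]

viewer = LazyEffectViewer(["img1", "img2", "img3"], make_effect("blur"))
print(viewer.display(0))   # effect applied at display time
print(viewer.display(1))   # applied when the next image is shown
print(len(viewer.cache))   # images never displayed stay untouched
```

From the user's point of view every displayed image carries the effect, so the whole group appears to have been processed, while the hardware only ever works on what is on screen.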

Hereinafter, an example of applying an image effect to images using a touch screen will be described. For example, the electronic device 1 110 described with reference to FIGS. 3 and 4 may include a touch screen. At this time, in step 430, the display control unit 310 may control the electronic device 1 110 to change the image or thumbnail image displayed on the screen to another image or another thumbnail image included in the image group according to a first touch input of the user on the touch screen. In this case, in step 440, the input recognition unit 320 may recognize the user's gesture as the user's input in accordance with a second touch input of the user on the touch screen. Here, the first touch input and the second touch input may be touch-and-drag inputs or swipe inputs in directions different from each other.

FIGS. 5 to 7 illustrate an example of a process of applying an image effect using a touch input on a touch screen in an embodiment of the present invention. FIGS. 5 to 7 illustrate screens of the electronic device 1 110: a first screen 510, a second screen 520, a third screen 610, a fourth screen 620, and a fifth screen 710.

In FIG. 5, the first screen 510 shows image 1 511 displayed. The second screen 520 shows that, as the user generates a touch-and-drag event in the downward direction on the touch screen of the first screen 510, image 1 511 is changed to image 2 521.

In FIG. 6, the third screen 610 shows image 2 521 displayed. The fourth screen 620 shows that, as the user generates a touch-and-drag event in the left or right direction on the touch screen of the third screen 610, image 2 521 is changed to image 2 621 to which an image effect is applied. At this time, the image effect may be selected randomly among a plurality of image effects, or a preset image effect may be selected according to the type of the user's input.

For example, if the user generates a touch-and-drag event in the right direction, one of the plurality of image effects can be selected randomly. If the user again generates a touch-and-drag event in the right direction, another of the plurality of image effects can be selected at random. Such random selection can be performed each time the user's touch-and-drag event in the right direction is generated. Also, if the user generates a touch-and-drag event in the left direction, the image effect selected immediately before the currently selected image effect may be selected again.

As another example, a new random image effect may be selected each time the user generates a touch-and-drag event in the left or right direction.

As another example, different image effects may be selected for the case in which the user generates a touch-and-drag event in the left direction and the case in which the event is generated in the right direction. For example, when the user generates a touch-and-drag event in the left direction, a predetermined first image effect may be selected, and when the user generates a touch-and-drag event in the right direction, a predetermined second image effect may be selected.

The user's input for selecting an image effect can be pre-set in various ways.

In FIG. 7, the fifth screen 710 shows that, when the user generates a touch-and-drag event in the upward direction on the screen displaying image 2 621 to which the image effect is applied, the displayed image is changed not to the original image 1 511 but to image 1 711 to which the image effect is applied.

Thus, the user can collectively apply the same image effect to all the images belonging to the group of the currently displayed image by only one input (in this embodiment, a touch-and-drag input in the left or right direction). As shown in FIGS. 5 to 7, the process of selecting an image effect through a separate user interface and the process of applying the selected image effect to a selected image may be omitted. Therefore, the user can apply an image effect to the images in batch very easily, without having to worry about which image effect should be selected. In addition, the user can intuitively grasp the applied image effect while viewing the images to which the image effect has been applied in batch.

As described above, the batch application of the selected image effect may be prioritized for the image displayed on the screen. For example, after the image effect is applied to image 2 521 displayed on the fourth screen 620 of FIG. 6 according to the user's input, when another image (for example, image 1 511) is to be displayed, the electronic device 1 110 may apply the image effect to that image at display time and display the result (for example, image 1 711 to which the image effect is applied). Likewise, if a new image 3 (not shown) is to be displayed, the electronic device 1 110 may generate and display an image 3 to which the image effect is applied by applying the effect when image 3 is displayed. In this case, the selected image effect has not yet been applied to all the images, but from the user's point of view it may appear that the selected image effect has been applied to all the images of the group.

In the embodiments of FIGS. 5 to 7, images are changed and displayed in accordance with a touch-and-drag event in the upward or downward direction, and an image effect is applied in accordance with a touch-and-drag event in the left or right direction. However, the present invention is not limited thereto. For example, images may be changed and displayed in accordance with a touch-and-drag event in the left or right direction, and an image effect may be applied in accordance with a touch-and-drag event in the upward or downward direction.

Further, the user's input is not limited to a touch gesture. For example, in step 430 described with reference to FIGS. 3 and 4, the display control unit 310 may control the electronic device 1 110 to display an effect application interface on the screen in association with the image or thumbnail image displayed on the screen. In this case, in step 440, the input recognition unit 320 can recognize a command input through the effect application interface as the user's input. For example, the effect application interface may be implemented as an effect application button and displayed on the screen; when the user presses the effect application button (for example, when the user's touch is detected in the display area of the effect application button displayed on the touch screen), the generated selection signal can be recognized as the user's input.

FIG. 8 is a diagram illustrating an example in which a user's input is recognized using an effect application interface in an embodiment of the present invention. In FIG. 8, the sixth screen 810 shows an example in which the effect application button 812 is displayed together with image 3 811. When the user selects (for example, taps) the effect application button 812 and a selection signal (selection instruction) is generated, the electronic device 1 110 may select an image effect and apply the selected image effect in batch to the images of the corresponding group. The seventh screen 820 shows an example in which image 3 811 is changed to image 3 821 to which the image effect is applied as the user selects the effect application button 812. At this time, if the user changes the image displayed on the screen as in FIGS. 5 to 7, the images to which the same image effect has been applied can be displayed on the screen. The method using the user's gesture and the method using the effect application interface according to the embodiments may both be used.

The save button 822 displayed on the seventh screen 820 may be a user interface for storing images to which an image effect is applied. For example, in step 430, the display control unit 310 may control the electronic device 1 (110) to further display, on the screen, a storage interface for storing an image to which the selected image effect is applied. In this case, the image editing method of FIG. 4 may further include a step (not shown) of storing the images of the image group to which the selected image effect is applied according to a command input through the storage interface. For example, the processor 212 of the electronic device 1 (110) may further include a storage control unit (not shown) for controlling the electronic device 1 (110) to store the images of the image group to which the image effect is applied.

At this time, the images may be stored in the storage included in the electronic device 1 (110) or stored in the storage on the web to which the electronic device 1 (110) can connect.
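A minimal sketch of that storage step, assuming a purely hypothetical storage interface (the patent does not specify one); the same `save` call could be backed by the storage included in the device or by storage on the web the device can connect to:

```python
# Illustrative sketch: persist every effect-applied image of a group
# through a storage backend chosen at call time. All names are hypothetical.
class LocalStorage:
    """Stand-in for the storage included in the electronic device."""
    def __init__(self):
        self.files = {}

    def save(self, name, data):
        self.files[name] = data

def store_group(images, storage):
    """Store each (name, data) pair of an image group; return the count."""
    for name, data in images:
        storage.save(name, data)
    return len(images)

storage = LocalStorage()
stored = store_group([("img1.png", b"\x89PNG"), ("img2.png", b"\x89PNG")], storage)
print(stored, sorted(storage.files))
```

A `WebStorage` class exposing the same `save` method could be substituted without changing `store_group`, mirroring the choice between device storage and web storage.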

In the following embodiments, an example of applying an image effect to a thumbnail for an image will be described.

9 and 10 are views showing an example of applying an image effect to a thumbnail image in an embodiment of the present invention.

In FIG. 9, the eighth screen 910 shows an example in which thumbnails of images included in one group are displayed. At this time, the ninth screen 920 shows an example in which the thumbnails are moved as a touch-and-drag event occurs in the upward direction by the user, and other thumbnails that were not displayed on the screen are displayed.

In FIG. 10, the tenth screen 1010 shows an example in which a touch-and-drag event occurs in the left or right direction by the user. In this case, the electronic device 1 (110) recognizes the input of the user according to the touch-and-drag event and can select the image effect in response to the input of the recognized user. Also, the electronic device 1 (110) may apply the selected image effect to each of the thumbnails displayed on the screen.

At this time, when a touch-and-drag event occurs in the upward or downward direction by the user, other thumbnails that are not displayed on the screen may be displayed. In this case, the selected image effect may be applied to the newly displayed thumbnails as well.
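The thumbnail behavior of FIGS. 9 and 10 can be sketched as follows, with all class and method names invented for illustration: once an effect has been selected by a horizontal drag, it covers the whole group, so thumbnails scrolled into view by a later vertical drag are rendered with the same effect.

```python
# Illustrative sketch: a thumbnail strip that remembers the selected
# effect and applies it to thumbnails as they scroll into view.
class ThumbnailStrip:
    def __init__(self, thumbnails, visible=3):
        self.thumbnails = list(thumbnails)
        self.visible = visible      # how many thumbnails fit on screen
        self.offset = 0             # index of the first visible thumbnail
        self.effect = None

    def on_screen(self):
        return self.thumbnails[self.offset:self.offset + self.visible]

    def apply_effect(self, effect):
        # Horizontal drag: remember the effect; it covers the whole group.
        self.effect = effect

    def scroll(self, step):
        # Vertical drag: reveal other thumbnails of the group.
        self.offset = max(0, min(self.offset + step,
                                 len(self.thumbnails) - self.visible))

    def rendered(self):
        # Newly revealed thumbnails get the same selected effect.
        if self.effect is None:
            return self.on_screen()
        return [f"{t}+{self.effect}" for t in self.on_screen()]

strip = ThumbnailStrip(["t1", "t2", "t3", "t4", "t5"])
strip.apply_effect("sepia")
strip.scroll(2)
print(strip.rendered())  # thumbnails scrolled into view carry the effect
```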

In addition, in the case of a thumbnail, an effect application interface and a storage interface (an effect application button 812 and a save button 822 described with reference to FIG. 8) may be provided.

In addition, when the user selects one of the thumbnails displayed on the screen, an image corresponding to the selected thumbnail may be displayed on the screen. In this case, the selected image effect may be applied to the displayed image. The image effect can also remain applied when the screen displaying the image is switched back to the screen on which the thumbnails are displayed.

As described above, according to the embodiments of the present invention, at least one image (or a thumbnail of at least one image) of images classified into one group such as a folder or an album is displayed on the screen, and an image effect randomly selected according to the user's input on the displayed image (or thumbnail), or an image effect corresponding to the type of the input, can be collectively applied to each of the images (or thumbnails) of the group. In addition, the image (or thumbnail) displayed on the screen is changed in accordance with a first input of the user (for example, a first touch input), and the image effect selected in accordance with a second input of the user (for example, a second touch input) is applied; whenever the image displayed on the screen is changed, the selected image effect can be sequentially applied to the changed image.
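The summarized flow (select an effect either at random or according to the type of the input, then apply it collectively to the group) can be sketched as follows; the effect names and the input-to-effect mapping are invented for illustration:

```python
import random

EFFECTS = ["grayscale", "sepia", "blur"]       # hypothetical effect set
EFFECT_BY_INPUT = {"swipe_left": "grayscale",  # hypothetical mapping from
                   "swipe_right": "sepia"}     # input type to effect

def select_effect(input_type, rng=random):
    """Pick the effect mapped to the input type, or one at random."""
    if input_type in EFFECT_BY_INPUT:
        return EFFECT_BY_INPUT[input_type]
    return rng.choice(EFFECTS)

def apply_to_group(images, effect):
    """Collectively apply one selected effect to every image of a group."""
    return [f"{img}+{effect}" for img in images]

group = ["img1", "img2", "img3"]
print(apply_to_group(group, select_effect("swipe_right")))
```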

The apparatus described above may be implemented as a hardware component, a software component, and/or a combination of hardware components and software components. For example, the apparatus and components described in the embodiments may be implemented using one or more processing devices, such as, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may execute an operating system (OS) and one or more software applications running on the operating system. The processing device may also access, store, manipulate, process, and generate data in response to execution of the software. For ease of understanding, the processing device may be described as being used singly, but those skilled in the art will recognize that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may comprise a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.

The software may include a computer program, code, instructions, or a combination of one or more of the foregoing, and may independently or collectively instruct or configure the processing device to operate as desired. The software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device. The software may be distributed over networked computer systems and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments, or may be those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed exemplary embodiments. For example, appropriate results may be achieved even if the described techniques are performed in a different order than the described methods, and/or if components of the described systems, structures, devices, circuits, and the like are combined or coupled in a different form than described, or are replaced or substituted by other components or equivalents.

Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.

Claims (15)

Displaying at least one image of images included in one group of images or a thumbnail image of the at least one image on a screen in an electronic device;
Recognizing a user's input on an image or a thumbnail image displayed on the screen;
Randomly selecting one of a plurality of image effects according to the input of the recognized user, or selecting an image effect corresponding to the type of the recognized user's input from among the plurality of image effects; And
Collectively applying the selected image effect to each of the images included in the image group or each of the thumbnail images of the images
An image editing method comprising the above steps.
The method according to claim 1,
Wherein the collective application comprises:
Wherein, each time the image or thumbnail image displayed on the screen is changed to another image or another thumbnail image included in the image group, the selected image effect is applied to the other image or other thumbnail image and displayed.
The method according to claim 1,
Wherein the recognizing comprises:
Recognizing the gesture of the user for the image or the thumbnail image displayed on the screen as the input of the user through at least one sensor included in the electronic device,
Wherein the selecting comprises:
Randomly selecting one of the plurality of image effects as a preset gesture is recognized, reselecting an image effect for which a previously selected history exists, or selecting an image effect corresponding to the recognized gesture from among the plurality of image effects.
The method of claim 3,
Wherein the displaying comprises:
Changing and displaying, by the electronic device, the image or thumbnail image displayed on the screen to another image or another thumbnail image included in the image group according to a first touch input of the user on a touch screen included as the at least one sensor, and
Wherein the recognizing comprises:
Recognizes the gesture of the user according to a second touch input of the user to the touch screen,
Wherein the first touch input and the second touch input are a touch and drag input or a swipe input in different directions from each other.
The method according to claim 1,
Wherein the displaying comprises:
Further displaying an effect application interface on the screen in association with an image or a thumbnail image displayed on the screen,
Wherein the recognizing comprises:
And recognizing a command input through the effect application interface as an input of the user.
The method according to claim 1,
Wherein the displaying comprises:
Further displaying a storage interface for storing an image to which the selected image effect is applied on the screen,
Storing images of the image group to which the selected image effect is applied according to an instruction input through the storage interface
Further comprising the steps of:
The method according to claim 1,
Wherein the images of the group of images include images stored and managed in the form of a folder or one album in the storage of the electronic device or on the web,
Loading the images stored in the storage into the memory of the electronic device prior to the displaying step or receiving images stored in the storage on the web and loading the images into the memory of the electronic device
Further comprising the steps of:
A computer-readable recording medium having recorded thereon a program for executing the method according to any one of claims 1 to 7.

In a computer-implemented electronic device system,
At least one processor configured to execute computer readable instructions,
Wherein the at least one processor comprises:
A display control unit for controlling the electronic device to display at least one image of images included in one image group or a thumbnail of the at least one image on a screen;
An input recognizing unit recognizing a user's input on an image or a thumbnail image displayed on the screen;
An image effect selection unit that randomly selects one of a plurality of image effects according to the input of the recognized user, or selects an image effect corresponding to the type of the recognized user's input from among the plurality of image effects; And
An image effect applying unit for applying the selected image effect to each of the images included in the image group or to each of the thumbnail images of the images.
The system of claim 9,
Wherein the image effect applying unit comprises:
Wherein, each time the image or thumbnail image displayed on the screen is changed to another image or another thumbnail image included in the image group, the selected image effect is applied to the other image or other thumbnail image.
The system of claim 9,
At least one sensor
Further comprising:
The input recognizing unit recognizes,
Recognizing a gesture of a user for an image or a thumbnail image displayed on the screen as an input of the user through the at least one sensor,
Wherein the image effect selection unit comprises:
Randomly selects one of the plurality of image effects as a preset gesture is recognized, reselects an image effect for which a previously selected history exists, or selects an image effect corresponding to the recognized gesture from among the plurality of image effects.
The system of claim 11,
Wherein the at least one sensor comprises a touch screen,
The display control unit
Controls the electronic device to change and display the image or thumbnail image displayed on the screen to another image or another thumbnail image included in the image group upon recognizing a first touch input of the user on the touch screen,
The input recognizing unit recognizes,
Recognizes the gesture of the user according to a second touch input of the user to the touch screen,
Wherein the first touch input and the second touch input are a touch and drag input or a swipe input in different directions.
The system of claim 9,
The display control unit
Controls the electronic device to further display an effect applying interface on the screen in association with an image or a thumbnail image displayed on the screen,
The input recognizing unit recognizes,
And recognizes a command input through the effect application interface as an input of the user.
The system of claim 9,
The display control unit
Controlling the electronic device to further display a storage interface for storing an image to which the selected image effect is applied on the screen,
Wherein the at least one processor comprises:
A storage control unit for controlling the electronic device to store the images of the image group to which the selected image effect is applied according to a command input through the storage interface.
The system of claim 9,
Wherein the images of the group of images include images stored and managed in the form of a folder or one album in the storage of the electronic device or on the web,
Wherein the at least one processor comprises:
A loading unit for loading the images stored in the storage into the memory of the electronic device prior to the displaying, or for receiving the images stored in the storage on the web and loading them into the memory of the electronic device.
KR1020150130097A 2015-09-15 2015-09-15 Method and system for applying image effect collectively for images included in a group KR101741906B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150130097A KR101741906B1 (en) 2015-09-15 2015-09-15 Method and system for applying image effect collectively for images included in a group


Publications (2)

Publication Number Publication Date
KR20170032590A true KR20170032590A (en) 2017-03-23
KR101741906B1 KR101741906B1 (en) 2017-05-31

Family

ID=58496360

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150130097A KR101741906B1 (en) 2015-09-15 2015-09-15 Method and system for applying image effect collectively for images included in a group

Country Status (1)

Country Link
KR (1) KR101741906B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210040701A (en) 2019-10-04 2021-04-14 삼성전자주식회사 Electronic device for synchronizing modification of screen between screens and method for the same

Also Published As

Publication number Publication date
KR101741906B1 (en) 2017-05-31

Similar Documents

Publication Publication Date Title
KR102549529B1 (en) Method for launching a second application using a first application icon in an electronic device
US20150067555A1 (en) Method for configuring screen and electronic device thereof
US20130024818A1 (en) Apparatus and Method for Handling Tasks Within a Computing Device
KR102080146B1 (en) Operating Method associated with connected Electronic Device with External Display Device and Electronic Device supporting the same
KR101891582B1 (en) Method and system for processing highlight comment in content
KR20170124954A (en) Electronic device and controling method thereof
WO2016007181A1 (en) Peer to peer remote application discovery
KR20140072033A (en) Arranging tiles
US9047469B2 (en) Modes for applications
WO2011123840A2 (en) Interacting with remote applications displayed within a virtual desktop of a tablet computing device
US10979374B2 (en) Method, system, and non-transitory computer readable record medium for sharing information in chatroom using application added to platform in messenger
US10628018B2 (en) Method and user interface (UI) for customized user access to application functionalities
AU2014287956A1 (en) Method for displaying and electronic device thereof
JP2014514668A (en) Multi-input gestures in hierarchical domains
KR102470651B1 (en) Method and system for information providing interface based on new user experience
US20180365198A1 (en) Method and apparatus for providing web browsing interface
KR20200014108A (en) Method, system, and non-transitory computer readable record medium for searching non-text using text in conversation
KR101741906B1 (en) Method and system for applying image effect collectively for images included in a group
US10250943B2 (en) Method, apparatus, and computer readable recording medium for automatic grouping and management of content in real-time
KR101764998B1 (en) Method and system for filtering image
KR102309243B1 (en) Method, system, and computer program for sharing content to chat room in picture-in-picture mode
CN112312058B (en) Interaction method and device and electronic equipment
KR101918705B1 (en) Screen configuration method and screen configuration systema for reducing cognitive load
KR101721333B1 (en) Method and system for proviiding informational data using communication session for transmitting and receiving message
US20180307379A1 (en) Method and apparatus for providing web browsing interface

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
X701 Decision to grant (after re-examination)
GRNT Written decision to grant