KR20170032590A - Method and system for applying image effect collectively for images included in a group - Google Patents
Method and system for applying image effect collectively for images included in a group
- Publication number
- KR20170032590A KR1020150130097A KR20150130097A
- Authority
- KR
- South Korea
- Prior art keywords
- image
- input
- screen
- images
- user
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/97—Determining parameters from multiple pictures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20004—Adaptive image processing
Abstract
Description
The following description relates to a method and system for batch application of image effects to images contained in a group.
There are various techniques for applying effects to an image. For example, Korean Patent Laid-Open Publication No. 10-2007-0016295 discloses a digital image color effect changing method in which, when a color effect menu is selected, an original image and a color effect image are displayed simultaneously. Such prior art discloses the application of image effects to one image at a time.
However, there is a need to batch-apply the same effect to multiple images at once. In addition, users who are not familiar with image effects often want to apply various effects to their images, but it is very difficult for such users to find a desired effect and apply it to a desired image. Moreover, users who are not familiar with image effects may have difficulty understanding what a particular effect does when it is applied to an image. For example, when a blur effect is applied to an image, a user who does not know what the blur effect is may find it difficult to grasp exactly what effect was applied from a single blurred image.
References: PCT/KR2014/010167, US20140019540A1, US20130332543A1, US20130260893
At least one image (or a thumbnail image of the at least one image) of images classified into one group, such as a folder or an album, is displayed on the screen, and an image effect randomly selected according to the user's input on the displayed image or thumbnail image (or an image effect corresponding to the type of the input) can be collectively applied to each of the images or thumbnail images of the group.
Also disclosed are a method and system that change the image or thumbnail image displayed on the screen in accordance with a first input of the user (for example, a first touch input), change the selected image effect in accordance with a second input (for example, a second touch input), and sequentially apply the selected image effect to the changed image whenever the image displayed on the screen is changed.
Displaying at least one image of images included in one image group, or a thumbnail image of the at least one image, on a screen of an electronic device; Recognizing a user's input on an image or a thumbnail image displayed on the screen; Randomly selecting one of a plurality of image effects according to the recognized input of the user, or selecting, among the plurality of image effects, an image effect corresponding to the type of the input; And collectively applying the selected image effect to each of the images included in the image group or to each of the thumbnail images of the images.
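The selection and batch-application steps above could be sketched roughly as follows. This is an illustrative sketch only; the class, method, and effect names are hypothetical assumptions, not taken from the patent:

```python
import random

# Illustrative sketch of the claimed method; names are hypothetical.
class EffectApplier:
    def __init__(self, image_group, effects=("blur", "sepia", "mono")):
        self.image_group = list(image_group)  # images in one folder/album
        self.effects = list(effects)          # the plurality of image effects
        self.selected = None

    def select_effect(self, user_input, mapping=None):
        # Either pick the effect mapped to the type of input, or pick
        # one of the plurality of image effects at random.
        if mapping and user_input in mapping:
            self.selected = mapping[user_input]
        else:
            self.selected = random.choice(self.effects)
        return self.selected

    def apply_collectively(self):
        # Batch-apply the selected effect to every image in the group.
        return [(image, self.selected) for image in self.image_group]
```

Here `apply_collectively` only tags each image with the chosen effect; a real implementation would render the effect pixel-wise on each image or thumbnail.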
CLAIMS What is claimed is: 1. A computer-implemented electronic device system, comprising: at least one processor configured to execute computer-readable instructions, wherein the at least one processor comprises: a display control unit for controlling the electronic device to display at least one image of images included in one image group, or a thumbnail image of the at least one image, on a screen; an input recognizing unit for recognizing a user's input on an image or a thumbnail image displayed on the screen; an image effect selection unit for randomly selecting one of a plurality of image effects according to the recognized input of the user, or selecting, among the plurality of image effects, an image effect corresponding to the type of the input; and an image effect applying unit for collectively applying the selected image effect to each of the images included in the image group or to each of the thumbnail images of the images.
At least one image (or a thumbnail image of the at least one image) of images classified into one group, such as a folder or an album, is displayed on the screen, and an image effect randomly selected according to the user's input on the displayed image or thumbnail image (or an image effect corresponding to the type of the input) can be collectively applied to each of the images or thumbnail images of the group.
The image or thumbnail image displayed on the screen can be changed in accordance with a first input of the user (for example, a first touch input), the selected image effect can be changed in accordance with a second input (for example, a second touch input), and when the image displayed on the screen is changed, the selected image effect can be sequentially applied to the changed image.
FIG. 1 is a diagram illustrating an example of a network environment according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating an internal configuration of an electronic device and a server according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating an example of components that a processor of an electronic device according to an embodiment of the present invention may include.
FIG. 4 is a flowchart illustrating an example of a method that an electronic device according to an embodiment of the present invention can perform.
FIGS. 5 to 7 illustrate an example of a process of applying an image effect using a touch input to a touch screen in an embodiment of the present invention.
FIG. 8 is a diagram illustrating an example in which a user's input is recognized using an effect application interface in an embodiment of the present invention.
FIGS. 9 and 10 are views showing an example of applying an image effect to a thumbnail image in an embodiment of the present invention.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings.
FIG. 1 is a diagram illustrating an example of a network environment according to an embodiment of the present invention. FIG. 1 shows an example in which a plurality of
The plurality of
The communication method is not limited, and may include a communication method using a communication network (for example, a mobile communication network, a wired Internet, a wireless Internet, a broadcasting network) that the
Each of the
In one example, the
2 is a block diagram illustrating an internal configuration of an electronic device and a server according to an embodiment of the present invention. In FIG. 2, an internal configuration of the electronic device 1 (110) as an example of one electronic device and the
The
The
The input /
Also, in other embodiments,
FIG. 3 is a diagram illustrating an example of components that a processor of an electronic device according to an exemplary embodiment of the present invention may include, and FIG. 4 is a flowchart illustrating an example of a method of an electronic device according to an exemplary embodiment of the present invention. Referring to FIG. 3, the
In
At this time, the
At
In
In
In
In
In one embodiment, in
In another embodiment, the image
In a more specific embodiment, applying image effects directly to images that are not yet displayed to the user can be inefficient because it increases the load on the hardware. Accordingly, the selected image effect may first be applied to the image currently displayed on the screen, and then applied to the remaining images of the group as each of them is displayed.
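Under the assumption that rendering the effect is the expensive step, this deferred application could be realized by recording the selected effect and rendering it only when an image is actually displayed. A hypothetical sketch (names are illustrative, not from the patent):

```python
class LazyEffectGroup:
    """Hypothetical sketch: defer effect rendering until display time so
    that undisplayed images add no hardware load."""
    def __init__(self, images):
        self.images = list(images)
        self.effect = None
        self.rendered = {}  # image -> effect actually rendered

    def select(self, effect):
        # Selecting an effect is cheap: no image is processed here.
        self.effect = effect

    def display(self, image):
        # The selected effect is applied only when the image is shown.
        if self.effect is not None and image not in self.rendered:
            self.rendered[image] = self.effect
        return image, self.rendered.get(image)
```

With this design, selecting an effect for a group of hundreds of images costs nothing up front; each image pays the rendering cost only when it scrolls into view.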
Hereinafter, an example of applying an image effect to images using a touch screen will be described. For example, the electronic device 1 (110) illustrated in FIGS. 3 and 4 may include a touch screen. At this time, in
FIGS. 5 to 7 illustrate an example of a process of applying an image effect using a touch input to a touch screen in an embodiment of the present invention. FIGS. 5 to 7 illustrate screens (the
In FIG. 5, the
In FIG. 6, the
For example, if the user generates a touch-and-drag event in the right direction, one of the plurality of image effects can be selected at random. If the user again generates a touch-and-drag event in the right direction, another of the plurality of image effects can be selected at random. Such random selection of an image effect can be performed each time the user generates a touch-and-drag event in the right direction. Also, if the user generates a touch-and-drag event in the left direction, the image effect selected immediately before the currently selected image effect may be selected again.
As another example, a new random image effect may be selected each time the user generates a touch-and-drag event in the left or right direction.
As another example, different image effects may be selected for the case in which the user generates a touch-and-drag event in the left direction and the case in which the user generates one in the right direction. For example, when the user generates a touch-and-drag event in the left direction, a predetermined first image effect may be selected. Also, when the user generates a touch-and-drag event in the right direction, a predetermined second image effect may be selected.
The user's input for selecting an image effect can be pre-set in various ways.
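The rightward/leftward drag behavior described above (a rightward drag picks a new random effect, a leftward drag returns to the previously selected one) might be sketched as follows; the class and method names are hypothetical:

```python
import random

class GestureEffectSelector:
    """Hypothetical sketch of drag-based effect selection with history."""
    def __init__(self, effects):
        self.effects = list(effects)
        self.history = []  # effects selected so far, in order

    def drag_right(self):
        # Each rightward drag selects a new random effect.
        choice = random.choice(self.effects)
        self.history.append(choice)
        return choice

    def drag_left(self):
        # A leftward drag re-selects the effect chosen before the
        # currently selected one, if there is one.
        if len(self.history) >= 2:
            self.history.pop()
        return self.history[-1] if self.history else None
```

Keeping the history as a stack makes the leftward drag behave like an "undo" over the sequence of random selections.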
7, in the
Thus, with only one input (in this embodiment, a touch-and-drag input in the left or right direction), the user can collectively apply the same image effect to all the images belonging to the group of the currently displayed image. Here, as shown in FIGS. 5 to 7, a process of selecting an image effect through a separate user interface and a process of applying the selected image effect to a selected image can be omitted. Therefore, the user can very easily apply an image effect to the images in a batch without having to worry about which image effect should be selected. In addition, the user can intuitively grasp the applied image effects while viewing the images to which the effects have been applied in a batch.
As described above, the batch application of the selected image effect can be performed with priority for the image displayed on the screen. For example, after the image effect is applied to the
In the embodiments of FIGS. 5 to 7, images are changed and displayed in accordance with a touch-and-drag event in the upward or downward direction, and an image effect is applied to the images in accordance with a touch-and-drag event in the left or right direction. However, the present invention is not limited thereto. For example, images may be changed and displayed in accordance with a touch-and-drag event in the left or right direction, and an image effect may be applied in accordance with a touch-and-drag event in the upward or downward direction.
Further, the input of the user is not limited to the touch gesture. For example, in
FIG. 8 is a diagram illustrating an example in which a user's input is recognized using an effect application interface in an embodiment of the present invention. In FIG. 8, the
The
At this time, the images may be stored in the storage included in the electronic device 1 (110) or stored in the storage on the web to which the electronic device 1 (110) can connect.
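Since the images of the group may reside either in local storage or in web storage the device can connect to, a loader might simply branch on the source. A hypothetical sketch (the function name and URL handling are assumptions, not from the patent):

```python
from urllib.request import urlopen

def load_image_bytes(source):
    """Hypothetical loader: fetch image bytes from web storage when the
    source is a URL, otherwise read from local storage."""
    if source.startswith(("http://", "https://")):
        with urlopen(source) as response:  # storage on the web
            return response.read()
    with open(source, "rb") as f:          # local storage
        return f.read()
```

Loading the bytes into memory before the display step, as the description suggests, keeps the display and effect-application logic agnostic of where the images are stored.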
In the following embodiments, an example of applying an image effect to a thumbnail for an image will be described.
9 and 10 are views showing an example of applying an image effect to a thumbnail image in an embodiment of the present invention.
In FIG. 9, the
In FIG. 10, the
At this time, when a touch and drag event occurs in the upper or lower direction by the user, other thumbnails that are not displayed on the screen may be displayed. In this case, the selected image effect may be applied to other thumbnails displayed newly.
In addition, in the case of a thumbnail, an effect application interface and a storage interface (an
In addition, when the user selects one of the thumbnails displayed on the screen, an image corresponding to the selected thumbnail may be displayed on the screen. In this case, the selected image effect may be applied to the image displayed on the screen. The image effect can be applied even when the image displayed on the screen is switched to the screen on which the thumbnail image is displayed.
As described above, according to the embodiments of the present invention, at least one image (or a thumbnail image of the at least one image) of images classified into one group such as a folder or an album is displayed on the screen, and an image effect randomly selected according to the user's input on the displayed image or thumbnail image (or an image effect corresponding to the type of the input) can be collectively applied to each of the images (or thumbnail images) of the group. In addition, the image (or thumbnail image) displayed on the screen is changed in accordance with a first input of the user (for example, a first touch input), the image effect is selected in accordance with a second input, and when the image displayed on the screen is changed, the selected image effect can be sequentially applied to the changed image.
The apparatus described above may be implemented as a hardware component, a software component, and/or a combination of hardware and software components. For example, the apparatus and components described in the embodiments may be implemented using one or more devices capable of executing and responding to instructions, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), or a microprocessor. The processing device may execute an operating system (OS) and one or more software applications running on the operating system. The processing device may also access, store, manipulate, process, and generate data in response to execution of the software. For ease of understanding, the processing device is sometimes described as being used singly, but those skilled in the art will recognize that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may comprise a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.
The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or may command the processing device independently or collectively. The software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device. The software may be distributed over networked computer systems and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.
The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments, or they may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. For example, appropriate results may be achieved even if the described techniques are performed in an order different from the described method, and/or if components of the described systems, structures, devices, circuits, and the like are combined or coupled in a form different from the described method, or are replaced or substituted by other components or equivalents.
Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.
Claims (15)
Recognizing a user's input on an image or a thumbnail image displayed on the screen;
Randomly selecting one of a plurality of image effects according to the recognized input of the user, or selecting, among the plurality of image effects, an image effect corresponding to the type of the input; And
Collectively applying the selected image effect to each of the images included in the image group or each of the thumbnail images of the images
The image editing method comprising:
Wherein the collective application comprises:
Wherein each time an image or a thumbnail image displayed on the screen is changed to another image or another thumbnail image included in the image group, the selected image effect is applied to the other image or other thumbnail image and displayed.
Wherein the recognizing comprises:
Recognizing the gesture of the user for the image or the thumbnail image displayed on the screen as the input of the user through at least one sensor included in the electronic device,
Wherein the selecting comprises:
The method comprising at least one of: randomly selecting one of the plurality of image effects as a preset gesture is recognized; reselecting an image effect for which a previously selected history exists; and selecting an image effect corresponding to the recognized gesture.
Wherein the displaying comprises:
The electronic device changes an image or a thumbnail image displayed on the screen to another image or another thumbnail image included in the image group according to the first touch input of the user on the touch screen included as the at least one sensor, and,
Wherein the recognizing comprises:
Recognizes the gesture of the user according to a second touch input of the user to the touch screen,
Wherein the first touch input and the second touch input are a touch and drag input or a swipe input in different directions from each other.
Wherein the displaying comprises:
Further displaying an effect application interface on the screen in association with an image or a thumbnail image displayed on the screen,
Wherein the recognizing comprises:
And recognizing a command input through the effect application interface as an input of the user.
Wherein the displaying comprises:
Further displaying a storage interface for storing an image to which the selected image effect is applied on the screen,
Storing images of the image group to which the selected image effect is applied according to an instruction input through the storage interface
Further comprising the steps of:
Wherein the images of the group of images include images stored and managed in the form of a folder or one album in the storage of the electronic device or on the web,
Loading the images stored in the storage into the memory of the electronic device prior to the displaying step or receiving images stored in the storage on the web and loading the images into the memory of the electronic device
Further comprising the steps of:
At least one processor configured to execute computer readable instructions,
Wherein the at least one processor comprises:
A display control unit for controlling the electronic device to display at least one image of images included in one image group or a thumbnail of the at least one image on a screen;
An input recognizing unit recognizing a user's input on an image or a thumbnail image displayed on the screen;
An image effect selection unit that randomly selects one of a plurality of image effects according to the recognized input of the user, or selects, among the plurality of image effects, an image effect corresponding to the type of the input; And
An image effect applying unit for applying the selected image effect to each of the images included in the image group or each of the thumbnail images of the images,
Wherein the image effect applying unit comprises:
Wherein each time an image or a thumbnail image displayed on the screen is changed to another image or another thumbnail image included in the image group, the selected image effect is applied to the other image or other thumbnail image.
At least one sensor
Further comprising:
The input recognizing unit recognizes,
Recognizing a gesture of a user for an image or a thumbnail image displayed on the screen as an input of the user through the at least one sensor,
Wherein the image effect selection unit comprises:
Randomly selects one of the plurality of image effects as a preset gesture is recognized, reselects an image effect for which a previously selected history exists, or selects an image effect corresponding to the recognized gesture.
Wherein the at least one sensor comprises a touch screen,
The display control unit
Controls the electronic device to change the image or thumbnail image displayed on the screen to another image or another thumbnail image included in the image group by recognizing a first touch input of the user on the touch screen, and
The input recognizing unit recognizes,
Recognizes the gesture of the user according to a second touch input of the user to the touch screen,
Wherein the first touch input and the second touch input are a touch and drag input or a swipe input in different directions.
The display control unit
Controls the electronic device to further display an effect applying interface on the screen in association with an image or a thumbnail image displayed on the screen,
The input recognizing unit recognizes,
And recognizes a command input through the effect application interface as an input of the user.
The display control unit
Controlling the electronic device to further display a storage interface for storing an image to which the selected image effect is applied on the screen,
Wherein the at least one processor comprises:
A storage controller for controlling the electronic device to store images of the image group to which the selected image effect is applied according to an instruction input through the storage interface,
Wherein the images of the group of images include images stored and managed in the form of a folder or one album in the storage of the electronic device or on the web,
Wherein the at least one processor comprises:
A loading unit for loading images stored in the storage into the memory of the electronic device prior to the displaying step or loading images stored in the storage on the web into the memory of the electronic device,
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150130097A KR101741906B1 (en) | 2015-09-15 | 2015-09-15 | Method and system for applying image effect collectively for images included in a group |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150130097A KR101741906B1 (en) | 2015-09-15 | 2015-09-15 | Method and system for applying image effect collectively for images included in a group |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170032590A (en) | 2017-03-23 |
KR101741906B1 (en) | 2017-05-31 |
Family
ID=58496360
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150130097A KR101741906B1 (en) | 2015-09-15 | 2015-09-15 | Method and system for applying image effect collectively for images included in a group |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101741906B1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20210040701A (en) | 2019-10-04 | 2021-04-14 | 삼성전자주식회사 | Electronic device for synchronizing modification of screen between screens and method for the same |
- 2015-09-15: KR application KR1020150130097A filed; patent KR101741906B1 (en), active, IP Right Grant
Also Published As
Publication number | Publication date |
---|---|
KR101741906B1 (en) | 2017-05-31 |
Similar Documents
Publication | Title
---|---
KR102549529B1 (en) | Method for launching a second application using a first application icon in an electronic device
US20150067555A1 (en) | Method for configuring screen and electronic device thereof
US20130024818A1 (en) | Apparatus and Method for Handling Tasks Within a Computing Device
KR102080146B1 (en) | Operating Method associated with connected Electronic Device with External Display Device and Electronic Device supporting the same
KR101891582B1 (en) | Method and system for processing highlight comment in content
KR20170124954A (en) | Electronic device and controling method thereof
WO2016007181A1 (en) | Peer to peer remote application discovery
KR20140072033A (en) | Arranging tiles
US9047469B2 (en) | Modes for applications
WO2011123840A2 (en) | Interacting with remote applications displayed within a virtual desktop of a tablet computing device
US10979374B2 (en) | Method, system, and non-transitory computer readable record medium for sharing information in chatroom using application added to platform in messenger
US10628018B2 (en) | Method and user interface (UI) for customized user access to application functionalities
AU2014287956A1 (en) | Method for displaying and electronic device thereof
JP2014514668A (en) | Multi-input gestures in hierarchical domains
KR102470651B1 (en) | Method and system for information providing interface based on new user experience
US20180365198A1 (en) | Method and apparatus for providing web browsing interface
KR20200014108A (en) | Method, system, and non-transitory computer readable record medium for searching non-text using text in conversation
KR101741906B1 (en) | Method and system for applying image effect collectively for images included in a group
US10250943B2 (en) | Method, apparatus, and computer readable recording medium for automatic grouping and management of content in real-time
KR101764998B1 (en) | Method and system for filtering image
KR102309243B1 (en) | Method, system, and computer program for sharing content to chat room in picture-in-picture mode
CN112312058B (en) | Interaction method and device and electronic equipment
KR101918705B1 (en) | Screen configuration method and screen configuration systema for reducing cognitive load
KR101721333B1 (en) | Method and system for proviiding informational data using communication session for transmitting and receiving message
US20180307379A1 (en) | Method and apparatus for providing web browsing interface
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
AMND | Amendment | ||
E601 | Decision to refuse application | ||
AMND | Amendment | ||
X701 | Decision to grant (after re-examination) | ||
GRNT | Written decision to grant |