WO2013073109A1 - Image processing device, image processing method and image processing program - Google Patents


Info

Publication number
WO2013073109A1
WO2013073109A1 (PCT/JP2012/006729)
Authority
WO
WIPO (PCT)
Prior art keywords
image processing
image
input
unit
processing
Prior art date
Application number
PCT/JP2012/006729
Other languages
French (fr)
Japanese (ja)
Inventor
勝長 辻
Original Assignee
パナソニック株式会社 (Panasonic Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニック株式会社 (Panasonic Corporation)
Publication of WO2013073109A1 publication Critical patent/WO2013073109A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048: Indexing scheme relating to G06F3/048
    • G06F2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048: Indexing scheme relating to G06F3/048
    • G06F2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention relates to an image processing apparatus, an image processing method, and an image processing program for processing an image.
  • FIG. 13 is a diagram for explaining the procedure of conventional clipping processing and image processing. For example, to obtain a desired image from an image displayed on the screen of the portable information terminal 101, image processing software is used to cut out the image; the user then switches to image processing and selects the part to which an effect is to be applied.
  • In Patent Documents 1 to 3, many functions are realized in a small number of steps by assigning different functions to the finger, the stylus, or both.
  • The present invention has been made in view of the above circumstances, and aims to provide an image processing apparatus, an image processing method, and an image processing program that enable simple and intuitive operations using different input means.
  • An image processing apparatus of the present invention is an image processing apparatus that processes an image, and includes: an operation receiving unit that receives an operation on a predetermined area by an input means; an input means determination unit that determines the input means whose operation has been received by the operation receiving unit; a storage unit that stores processing information, assigned to each input means, for executing different processing with the same function on an image; and a processing unit that executes processing corresponding to the processing information assigned to the input means determined by the input means determination unit, among the processing information stored in the storage unit.
  • An image processing method of the present invention is an image processing method in an image processing apparatus that processes an image, and includes: a step of receiving an operation on a predetermined area by an input means; a step of determining the input means whose operation has been received; and a step of executing processing corresponding to the processing information assigned to the determined input means, among processing information that is assigned to each input means and stored in a storage unit for executing different processing with the same function on an image.
  • the image processing program of the present invention is a program for causing a computer to execute each step of the image processing method.
  • FIG. 1 is a block diagram illustrating a configuration example of an image processing apparatus according to a first embodiment of the present invention.
  • A diagram showing an example of the table in which the various functions stored in the storage unit are registered.
  • A diagram showing the housing of the image processing apparatus.
  • (A), (B) Flowcharts showing an example of the operation procedure of the image processing apparatus.
  • FIGS. 4A to 4C are diagrams for explaining a procedure of conventional clipping processing and image processing;
  • The image processing apparatus of this embodiment is applied to portable information terminals such as smartphones and tablet terminals.
  • FIG. 1 is a block diagram illustrating a configuration example of an image processing apparatus according to the first embodiment of the present invention. Here, a functional configuration example for performing cut-out (clipping) processing and effect processing is shown.
  • the image processing apparatus 1 includes a touch panel 11, a finger / stylus discrimination processing unit 12, a cut-out range drawing processing unit 13, a connection processing unit 14, a storage unit 15, a display 16, and an operation unit 17.
  • the touch panel 11 is provided on the screen of the display 16 and detects a touch position of an input unit such as a finger or a stylus on a predetermined area. Examples of the touch panel include a contact type and a press type.
  • the touch panel 11 is an example of an operation receiving unit that receives an operation on a predetermined area by an input unit.
  • The finger/stylus discrimination processing unit 12 determines whether the input means touching the panel is a finger or a stylus. That is, the finger/stylus discrimination processing unit 12 functions as an input means determination unit that determines the input means whose operation has been received by the touch panel 11.
  • the finger / stylus discrimination processing unit 12 may discriminate the input means based on the contact area with the touch panel 11. Since the contact area of a finger is generally larger than the contact area of a stylus, it is easy to determine whether the input means is a finger or a stylus.
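  This contact-area heuristic can be sketched in a few lines. The sketch below is an illustration, not the patent's implementation; the threshold value is an assumption.

```python
# Hypothetical sketch of the finger/stylus discrimination in the
# finger/stylus discrimination processing unit 12: a finger generally
# touches with a larger contact area than a stylus tip.
# The 20 mm^2 threshold is an assumed value, not one given in the patent.
def classify_input(contact_area_mm2: float, threshold_mm2: float = 20.0) -> str:
    """Return 'finger' for large contact areas, 'stylus' for small ones."""
    return "finger" if contact_area_mm2 >= threshold_mm2 else "stylus"

print(classify_input(45.0))  # fingertip-sized contact -> finger
print(classify_input(3.0))   # stylus-tip-sized contact -> stylus
```

  A real device would derive the contact area from the touch controller's raw readings and may need per-user calibration of the threshold.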
  • the cutout range drawing processing unit 13 performs a drawing process using the range surrounded by the line drawn by the touch operation as the cutout range. That is, the cutout range drawing processing unit 13 performs cutout processing and effect processing. In addition, various other controls and processes are performed.
  • the connection processing unit 14 determines whether to connect the end point and the start point when the touch panel 11 is touched again after being released with a finger or a stylus.
  • The finger/stylus discrimination processing unit 12, the cutout range drawing processing unit 13, and the connection processing unit 14 may be realized by dedicated hardware, or may be realized by a CPU executing a program as in the present embodiment.
  • the storage unit 15 stores the locus of the touch position as coordinate information when the screen of the touch panel 11 is traced. In addition, the storage unit 15 stores a cut-out image.
  • the storage unit 15 stores processing information that is assigned to each input unit using the table 15a and that executes different processing with the same function on the image. Functions corresponding to drawing with a finger or a stylus are registered in advance as the table 15a.
  • FIG. 2 is a diagram showing an example of a table 15a in which various functions stored in the storage unit 15 are registered.
  • a combination of functions realized by a finger or stylus operation is registered so as to be selectable as processing information when the clipping process and the image process are performed.
  • the user can select a combination of functions suitable for the application by pressing the menu key 17a and selecting a desired combination of functions in advance.
  • the range selection function has a combination of “make the cut surface jagged” with a finger and “make the cut surface straight” with a stylus.
  • There is also a combination of a “shredding effect” with a finger and a “scissors effect” with a stylus, and a combination of a “turning effect” with a finger and “normal clipping” with a stylus.
  • There is also a combination of the “lasso tool” with a finger and “normal clipping” with a stylus. Each function in the case of image processing will be described in the second embodiment described later.
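  The table 15a can be pictured as a mapping from a user-selected combination to the function assigned to each input means. The combination names below are hypothetical labels for the pairings listed above:

```python
# Hypothetical rendering of table 15a: each selectable combination maps
# an input means ("finger"/"stylus") to the cut-out function it triggers.
# Keys and value strings are illustrative, not taken from the patent.
TABLE_15A = {
    "cut_surface":   {"finger": "jagged cut surface", "stylus": "straight cut surface"},
    "tear_or_cut":   {"finger": "shredding effect",   "stylus": "scissors effect"},
    "turn_or_clip":  {"finger": "turning effect",     "stylus": "normal clipping"},
    "lasso_or_clip": {"finger": "lasso tool",         "stylus": "normal clipping"},
}

def assigned_function(combination: str, input_means: str) -> str:
    """Look up the function assigned to an input means for the chosen combination."""
    return TABLE_15A[combination][input_means]

print(assigned_function("tear_or_cut", "finger"))   # shredding effect
print(assigned_function("lasso_or_clip", "stylus")) # normal clipping
```

  Selecting a combination via the menu key would then simply switch which entry of the table is consulted.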
  • the display 16 is composed of an LCD or the like, and displays various images and a drawn line representing a cutout range and the like.
  • the operation unit 17 includes various keys.
  • the cut-out range drawing processing unit 13 executes processing corresponding to the processing information assigned to the input means determined by the finger / stylus determination processing unit 12 among the processing information stored in the storage unit 15.
  • the cut-out range drawing processing unit 13 may execute any one of the cut-out process functions stored in the table 15a, or may execute two or more processes. Further, the process to be executed may be determined in advance, or may be set by the user through an input operation.
  • FIG. 3 is a diagram illustrating an example of the front surface of the housing of the image processing apparatus 1.
  • A display 16, an operation unit 17, a microphone 18a, and a speaker 18b are provided on the front surface of the image processing apparatus 1.
  • the touch panel 11 is provided on the screen of the display 16 on which various images are displayed.
  • An operation unit 17 including a touch panel is provided below the screen of the display 16.
  • a menu key 17a, a clear key 17b, and a confirmation key 17c are set to be switchable. For example, it is possible to switch to a lasso key 60 described later.
  • a stylus 21 that can be held and operated by the user is also provided.
  • FIGS. 4A and 4B are flowcharts showing an example of the operation procedure of the image processing apparatus 1.
  • An image processing program in which this operation is described is stored in the ROM in the image processing apparatus 1 and is executed by the CPU in the image processing apparatus 1.
  • FIG. 4A shows an example of the range selection process. This range selection process is executed when the start of the cut-out mode is selected by the menu key 17a. Here, the range is selected using a finger and a stylus (a plurality of input means).
  • the image processing apparatus 1 stands by until touch input by the user is performed on the touch panel 11 (step S1).
  • The finger/stylus discrimination processing unit 12 discriminates whether the touch input is by a finger or a stylus (step S2). Further, in the process of step S2, the cutout range drawing processing unit 13 acquires the coordinates of the touch position and records them in the storage unit 15.
  • the cutout range drawing processing unit 13 determines whether or not this touch input has been released (step S3). If not released, the image processing apparatus 1 returns to the process of step S2.
  • the cutout range drawing processing unit 13 determines whether or not the end of the cutout mode is selected by the menu key 17a (step S4). If the end of the cropping mode is not selected, the image processing apparatus 1 returns to the process of step S1. On the other hand, when the end of the cut-out mode is selected, the image processing apparatus 1 ends this operation.
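  The loop of steps S1 to S3 can be sketched over a stream of touch samples. The event format and the classifier below are assumptions for illustration, not the patent's interfaces:

```python
# Hypothetical sketch of the range-selection loop (steps S1-S3):
# each touch sample carries a position and a contact area; the input
# means is determined per sample (S2) and the coordinates are recorded
# until the touch is released (S3).
def record_range_selection(samples, classify):
    recorded = []
    for s in samples:                # S1: a touch sample arrives
        if s.get("released"):        # S3: released -> wait for the next touch
            continue
        means = classify(s["area"])  # S2: finger or stylus?
        recorded.append((means, s["x"], s["y"]))
    return recorded

samples = [
    {"x": 0, "y": 0, "area": 30.0},   # large area -> finger
    {"x": 1, "y": 2, "area": 30.0},
    {"released": True},               # finger lifted
    {"x": 5, "y": 5, "area": 4.0},    # small area -> stylus
]
classify = lambda a: "finger" if a >= 20.0 else "stylus"
print(record_range_selection(samples, classify))
```

  Each recorded tuple keeps the input means alongside the coordinates, which is what lets the later cut-out step apply a different effect per stroke.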
  • FIG. 4B is a diagram illustrating an example of the cutout process. This cut-out process is executed after the process of FIG.
  • the cut-out range drawing processing unit 13 cuts out an image (work target image) registered in the storage unit 15 based on the selection range obtained from the coordinates recorded in the storage unit 15 (step S11).
  • the image processing apparatus 1 starts processing for each coordinate group of the cut-out image (step S12).
  • the coordinate group is a coordinate group of a trajectory traced by a finger, for example, and is distinguished from a coordinate group of a trajectory traced by a stylus thereafter.
  • the finger / stylus discrimination processing unit 12 discriminates whether or not it is a touch input by a finger (step S13). In the case of touch input with a finger, the cut-out range drawing processing unit 13 performs effect processing for the finger (step S14). In the effect processing for a finger, as will be described later, for example, shredding processing such as tearing paper with a finger is performed.
  • In the case of touch input with a stylus, the cutout range drawing processing unit 13 performs an effect process for the stylus (step S15).
  • In the effect process for the stylus, as will be described later, a straight-line cutting process such as cutting with scissors is performed.
  • In step S16, the image processing apparatus 1 determines whether or not the processing of all coordinate groups has been completed. If not, the image processing apparatus 1 returns to step S12.
  • the cut-out range drawing processing unit 13 stores the cut-out image in the storage unit 15 as an output image (step S17).
  • the stored output image is not only displayed on the screen of the display 16 but can be output as data via an external output interface (not shown).
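  Steps S12 to S17 amount to dispatching a different edge effect per coordinate group depending on the input means that drew it. A sketch with stand-in effect labels (the function and field names are assumptions):

```python
# Hypothetical sketch of steps S12-S17: each coordinate group remembers
# which input means traced it, and the matching edge effect is applied.
def apply_edge_effects(coordinate_groups):
    results = []
    for group in coordinate_groups:               # S12: per coordinate group
        if group["input"] == "finger":            # S13: finger input?
            results.append(("torn edge", group["points"]))     # S14: tearing effect
        else:
            results.append(("straight cut", group["points"]))  # S15: scissors effect
    return results                                # S17: collected as the output image

groups = [
    {"input": "stylus", "points": [(0, 0), (0, 9)]},  # left/upper edge
    {"input": "finger", "points": [(9, 9), (9, 0)]},  # right/lower edge
]
print(apply_edge_effects(groups))
```

  In a real implementation the labels would be replaced by pixel operations on the cut-out bitmap along each group's trajectory.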
  • FIGS. 5A and 5B are diagrams illustrating an example of the cut-out process and the effect process.
  • the user draws a line 31 (an example of the first line) with the stylus 21 from the left side to the upper side of the image 30 displayed on the screen of the display device 16.
  • a line 33 (an example of a second line) is drawn with the finger 25 from the right side to the lower side.
  • As a result, an image 35 of the cutout range (selection range) is obtained.
  • An effect such as cutting with scissors is applied to the portion from the left side to the upper side.
  • An effect of tearing off with a finger is applied to the portion from the right side to the lower side.
  • the stylus 21 corresponds to the first input means
  • the finger corresponds to the second input means.
  • FIG. 6 is a flowchart showing an example of a detailed range selection procedure. An image processing program in which this operation is described is stored in the ROM in the image processing apparatus 1 and is executed by the CPU in the image processing apparatus 1.
  • the image processing apparatus 1 stands by until there is a touch input by the user (step S31).
  • the finger / stylus discrimination processing unit 12 discriminates whether the touch input is by a user's finger or a stylus (step S32). Further, in step S32, the cutout range drawing processing unit 13 acquires the coordinates of the touch position (start point).
  • the cutout range drawing processing unit 13 determines whether or not the current touch input is the first time (step S33). If it is not the first time, the connection processing unit 14 performs processing for connecting the end point of the previous touch input and the start point of the current touch input (step S34). In this way, by connecting the previous end point and the current start point, the selection range can be determined in a closed state, and the selection range becomes clear.
  • The cutout range drawing processing unit 13 acquires the coordinates of the touch input and stores them, including the coordinates of the start point, in the storage unit 15 (step S35).
  • the cutout range drawing processing unit 13 determines whether or not the touch input has been released (step S36). If not released, the image processing apparatus 1 returns to the process of step S35 and performs the same operation. As a result, the coordinate group of the locus of touch input is recorded in the storage unit 15.
  • FIG. 7 is a diagram illustrating an example of the structure of data representing the locus of touch input recorded in the storage unit 15. This data shows the type of finger or stylus, the X and Y coordinates of the start point, the X and Y coordinates of the end point, and the X and Y coordinates of the trajectory that is a coordinate group.
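  The record of FIG. 7 can be modeled as a small data structure; the field names below are assumptions chosen to match the description:

```python
from dataclasses import dataclass, field

# Hypothetical model of one touch-input trajectory record as described
# for FIG. 7: input type, start point, end point, and the coordinate
# group of the traced locus.
@dataclass
class TouchTrajectory:
    input_type: str                              # "finger" or "stylus"
    start: tuple                                 # (x, y) of the start point
    end: tuple                                   # (x, y) of the end point
    points: list = field(default_factory=list)   # [(x, y), ...] along the locus

t = TouchTrajectory("stylus", (0, 0), (4, 4), [(0, 0), (2, 2), (4, 4)])
print(t.input_type, t.start, t.end, len(t.points))
```

  Storing the input type with each trajectory is what allows the connection and effect steps later on to treat stylus strokes and finger strokes differently.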
  • When released in step S36, the cutout range drawing processing unit 13 stores the end point of the current touch input in the storage unit 15 (step S37).
  • The cutout range drawing processing unit 13 determines whether or not the confirmation key 17c is pressed (step S38). If the confirmation key 17c is not pressed, the image processing apparatus 1 returns to the process of step S31. On the other hand, when the confirmation key 17c is pressed, the image processing apparatus 1 stands by until the end of the cut-out mode is selected by the menu key 17a (step S39). When the end of the cut-out mode is selected, the image processing apparatus 1 ends this operation.
  • FIG. 8 is a flowchart showing a second example of the range selection procedure.
  • An image processing program in which this operation is described is stored in the ROM in the image processing apparatus 1 and is executed by the CPU in the image processing apparatus 1.
  • The image processing apparatus 1 stands by until there is a touch input by the user (step S41). If there is a touch input, the finger/stylus discrimination processing unit 12 discriminates whether the touch input is by a user's finger or a stylus (step S42). Further, in step S41, the cutout range drawing processing unit 13 acquires the coordinates of the touch position (start point) and stores them in the storage unit 15.
  • the cut-out range drawing processing unit 13 acquires the coordinates of the touch input and stores them in the storage unit 15 (step S43). Then, the cutout range drawing processing unit 13 determines whether or not the touch input has been released (step S44). When the touch input is not released, the image processing apparatus 1 returns to the process of step S43 and performs the same operation. On the other hand, when released in step S44, the cutout range drawing processing unit 13 stores the coordinates of the end point of the touch input in the storage unit 15 (step S45).
  • The connection processing unit 14 reads out the coordinates of the start and end points of all lines from the storage unit 15 and performs a connection determination (step S46).
  • In the connection determination, it is determined whether the start points and end points recorded so far are within a certain range (no more than a predetermined distance apart). That is, when an end point of one touch-input locus and an end point of another touch-input locus are close to each other, these points are determined to be “connected”. This makes it possible to connect not only to the immediately preceding touch-input locus (line) but also to earlier touch-input lines, so touch input becomes simple and the degree of freedom of range selection increases.
  • The connection processing unit 14 determines whether the result of the determination in step S46 is “connected” (step S47). When it is determined that the points are to be connected, the connection processing unit 14 performs a connection process that connects them (step S48).
  • In step S49, the cutout range drawing processing unit 13 determines whether or not the confirmation key 17c is pressed. If the confirmation key 17c is not pressed, the image processing apparatus 1 returns to the process of step S41. On the other hand, when the confirmation key 17c is pressed, the image processing apparatus 1 stands by until the end of the cut-out mode is selected by the menu key 17a (step S50). When the end of the cut-out mode is selected, the image processing apparatus 1 ends this operation.
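  The connection determination of step S46 reduces to a distance check between stroke endpoints. A sketch under stated assumptions (the 10-pixel threshold and the stroke format are illustrative, not values from the patent):

```python
import math

# Hypothetical sketch of the connection determination (steps S46-S48):
# two endpoints are "to be connected" when they lie within a
# predetermined distance of each other.
def should_connect(p, q, max_dist=10.0):
    return math.dist(p, q) <= max_dist

def connect_endpoints(strokes, max_dist=10.0):
    """Return index pairs of strokes whose end/start points should be joined."""
    joins = []
    for i, a in enumerate(strokes):
        for j, b in enumerate(strokes):
            if i < j and should_connect(a["end"], b["start"], max_dist):
                joins.append((i, j))
    return joins

strokes = [
    {"start": (0, 0),  "end": (50, 3)},   # e.g. drawn with the stylus
    {"start": (52, 6), "end": (0, 4)},    # e.g. drawn with the finger
]
print(connect_endpoints(strokes))  # the (50,3)-(52,6) gap is small enough
```

  Checking every pair of recorded strokes, not just consecutive ones, reflects the point made above: earlier lines can also be connected, which increases the freedom of range selection.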
  • FIG. 9 is a diagram showing an example of the cut-out process and the effect process when the function of the lasso tool is used.
  • The lasso tool has a function of capturing a recognized object: the cutout range drawing processing unit 13 recognizes, for example, a person's face included in the image as an object (object recognition processing by the object recognition unit).
  • the line 51 is drawn with the stylus 21 from the left side to the upper side of the image 50 displayed on the screen of the display device 16 as described above
  • the line 53 is drawn by tracing with the finger 25 from the right side to the lower side.
  • the lasso key 60 is pressed to perform the cutting process.
  • the function of the lasso tool may be automatically executed when the line 53 is drawn by the finger without pressing the lasso key 60.
  • the cut-out image 55 includes a background portion 55a and a human face 55b captured by a lasso tool.
  • The cut-out range drawing processing unit 13 as a processing unit may have a function as an object recognition unit that recognizes an object included in an image. Further, the cut-out range drawing processing unit 13 may recognize, as the selection range, the range surrounded by the first line and a line that connects the start and end of the second line along the recognized object.
  • the first line is, for example, a line input by a stylus as a first input unit.
  • the second line is, for example, a line input with a finger as the second input means.
  • the edge 57 of the image 55 cut out along the face may be a jagged straight line or a smooth curve.
  • In this way, related cut-out processes are assigned, as the same function, to touch input with a finger and touch input with a stylus, and processing is performed so that each input affects the result. That is, different processing is performed with the same function on the image selected using a finger or a stylus.
  • The cutout range drawing processing unit 13 serving as a processing unit may, when cutting out an image as its function, perform different processing for each input means on the image in the selection range surrounded by the lines input by the plural input means. Accordingly, a simple and intuitive operation can be performed using different input means.
  • The cutout range drawing processing unit 13 as a processing unit may perform a straight-line cutting process on the image selected along the first line, and a shredding process on the image selected along the second line. Thereby, range selection, cutout, and effect processing can be performed in one operation with the stylus and the finger, and operability is improved.
  • The cutout range drawing processing unit 13 as the processing unit may, when the distance between an end point of the first line and an end point of the second line is equal to or less than a predetermined distance, connect the end points of the two lines and recognize the range surrounded by the connected first and second lines as the selection range. The connection process is thus executed for each input end point, so even if an end point input with the stylus and an end point input with the finger are slightly separated, the range selection process can be performed appropriately, and operability is improved.
  • FIG. 10 is a block diagram showing a configuration example of the image processing apparatus 1A according to the second embodiment of the present invention. Here, a functional configuration when performing image processing in the image processing apparatus 1A is shown. Further, the same components as those in the first embodiment are denoted by the same reference numerals, and the description thereof is omitted.
  • the image processing range drawing processing unit 23 performs a drawing process using the range surrounded by the touch operation as the image processing range.
  • the above-described table 15a is registered in the storage unit 15A.
  • Other configurations are the same as those in the first embodiment.
  • the storage unit 15A stores information on functions corresponding to drawing with a finger or a stylus in the table 15a in advance.
  • a combination of functions realized by a finger or stylus operation is registered so as to be selectable when performing image processing.
  • the user can select a combination of functions suitable for the application by pressing the menu key 17a and selecting a desired combination of functions in advance.
  • The soft-focus processing functions include a combination of “blurring” with a finger and “sharpening” with a stylus.
  • the brightness adjustment function includes a combination of “darken” with a finger and “brighten” with a stylus.
  • the screen effect function includes a combination of “turn off the glitter effect” with a finger and “apply a glitter effect” with a stylus.
  • The turning-effect function includes a combination of “apply a turning effect” with a finger and “turn off the turning effect” with a stylus.
  • the image processing range drawing processing unit 23 may execute any one of the image processing functions stored in the table 15a, or may execute two or more processes. Further, the process to be executed may be determined in advance, or may be set by the user through an input operation.
  • FIGS. 11A and 11B are flowcharts showing an example of the operation procedure of the image processing apparatus 1A.
  • An image processing program in which this operation is described is stored in the ROM in the image processing apparatus 1A and is executed by the CPU in the image processing apparatus 1A.
  • FIG. 11A is a diagram showing an example of a range selection processing procedure. This range selection process is executed when the start of the image processing range designation mode is selected by the menu key 17a.
  • the image processing apparatus 1A waits until a touch input by the user is performed on the touch panel 11 (step S51).
  • the finger / stylus discrimination processing unit 12 discriminates whether the touch input is a finger or a stylus (step S52). Further, in the process of step S52, the image processing range drawing processing unit 23 acquires the coordinates of the touch position and records them in the storage unit 15A.
  • the image processing range drawing processing unit 23 determines whether or not this touch input has been released (step S53). If not released, the image processing apparatus 1A returns to the process of step S52.
  • the image processing apparatus 1A determines whether or not the end of the image processing range designation mode is selected by the menu key 17a (step S54). If the end of the image processing range designation mode is not selected, the image processing apparatus 1A returns to the process of step S51. On the other hand, when the end of the image processing range designation mode is selected, the image processing apparatus 1A ends this operation.
  • FIG. 11B is a diagram illustrating an example of an image processing procedure. This image processing is executed after the range selection processing is performed.
  • The image processing range drawing processing unit 23 performs image processing on the image (work target image) registered in the storage unit 15A, based on the selection range obtained from the coordinates recorded in the storage unit 15A (step S61). Then, the image processing range drawing processing unit 23 stores the image subjected to the image processing in the storage unit 15A as an output image (step S62).
  • the stored output image is not only displayed on the screen of the display 16 but can be output as data via an external output interface (not shown).
  • FIG. 12 is a diagram for explaining image processing.
  • When a portion to be subjected to image processing is designated with the stylus 21 or the finger 25, image processing is performed on the designated portion.
  • Here, the selection range is designated as a line of predetermined width along the portion traced with the finger 25 or the stylus 21.
  • the portion 71 traced by the stylus 21 is subjected to, for example, image processing for increasing the brightness (represented by dots in the figure).
  • image processing for reducing brightness is performed on the portion 73 traced by the finger 25 (indicated by hatching in the drawing).
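  The brightness adjustment of FIG. 12 can be sketched on a small grayscale grid: stylus input brightens the traced pixels and finger input darkens them. The step size of 30 is an assumed value:

```python
# Hypothetical sketch of the FIG. 12 brightness adjustment: pixels along
# the traced portion are brightened for stylus input (portion 71) and
# darkened for finger input (portion 73), clamped to the 0-255 range.
def adjust_brightness(image, traced, input_type, delta=30):
    sign = 1 if input_type == "stylus" else -1
    for (x, y) in traced:
        image[y][x] = max(0, min(255, image[y][x] + sign * delta))
    return image

img = [[128, 128],
       [128, 128]]
adjust_brightness(img, [(0, 0)], "stylus")   # portion 71: brighten
adjust_brightness(img, [(1, 1)], "finger")   # portion 73: darken
print(img)  # [[158, 128], [128, 98]]
```

  The same structure would carry any of the other table entries (blur/sharpen, glitter on/off) by swapping the per-pixel operation.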
  • In the image processing apparatus 1A of the present embodiment, when image processing is performed, related processes with the same function are assigned to touch input with a finger and touch input with a stylus, and processing is performed so that each input affects the result. Therefore, when performing image processing, a simple and intuitive operation can be performed using different input means. Furthermore, range selection processing and image processing (corresponding to the clipping processing and effect processing in the first embodiment) can be performed with one touch input, and operability is improved.
  • The present invention is not limited to the configuration of the above-described embodiments, and can be applied to any configuration that can achieve the functions recited in the claims or the functions of the configuration of the present embodiments.
  • In the above embodiments, the two input means of a finger and a stylus are shown, but three or more input means may be combined.
  • For example, the two ends of the stylus may have different thicknesses, and different processing may be assigned to each of the three inputs: touch input with the thin end, touch input with the thick end, and touch input with a finger.
  • the identification accuracy of the contact area of the finger may be improved, and the thumb, the index finger, and the little finger may be identified and combined with the stylus to provide four input means.
  • the present invention is not limited to these processes, and can be applied to an input process, a search process, and the like.
  • In the above embodiments, the input means is determined based on the contact area of the finger or stylus on the touch panel. However, the present invention is not limited to this; for example, a pressure-sensitive touch panel may be used, and the stylus and the finger may be distinguished by their pressure distributions.
  • Although the touch panel 11 provided on the screen of the display 16 of the image processing apparatuses 1 and 1A was used as an example of the operation receiving unit, a touch pad provided at a position away from the display 16 may be used instead.
  • The present invention also encompasses an image processing program that realizes the functions of the above embodiments, supplied to an image processing apparatus via a network or various storage media and read and executed by a computer in the image processing apparatus.
  • The present invention is useful in that it enables simple and intuitive operation using different input means when processing an image.
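The three-input variation mentioned above, distinguishing the thin stylus end, the thick stylus end, and a finger, could be realized with two contact-area thresholds, for example. The following Python sketch is illustrative only; the threshold values and the assigned function names are assumptions, not part of the disclosure.

```python
# Illustrative sketch: classifying three input means -- thin stylus end,
# thick stylus end, and finger -- by the contact area reported by the touch
# panel, and assigning a different clipping function to each.
# The thresholds and function names below are assumptions.

THIN_STYLUS_MAX_AREA = 4.0    # mm^2, assumed boundary: thin end / thick end
THICK_STYLUS_MAX_AREA = 20.0  # mm^2, assumed boundary: thick end / finger

FUNCTION_TABLE = {
    "thin_stylus": "straight cut edge",
    "thick_stylus": "normal cutout",
    "finger": "jagged cut edge",
}

def classify_input(contact_area_mm2):
    """Return the input-means label for a reported contact area."""
    if contact_area_mm2 <= THIN_STYLUS_MAX_AREA:
        return "thin_stylus"
    if contact_area_mm2 <= THICK_STYLUS_MAX_AREA:
        return "thick_stylus"
    return "finger"

def function_for(contact_area_mm2):
    """Return the clipping function assigned to the classified input means."""
    return FUNCTION_TABLE[classify_input(contact_area_mm2)]
```

The same structure extends to four input means (thumb, index finger, little finger, stylus) by adding classes and table entries.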

Abstract

An image processing device that processes an image comprises: a control operation acceptance section that accepts control operations on a prescribed region by input means; an input means identification section that identifies the input means whose control operation was accepted by the control operation acceptance section; a storage section that stores processing information, allocated to each input means, for executing different processing with the same function on an image; and a processing section that executes the processing corresponding to the processing information that, among the processing information stored in the storage section, is allocated to the input means identified by the input means identification section.

Description

Image processing apparatus, image processing method, and image processing program
 The present invention relates to an image processing apparatus, an image processing method, and an image processing program for processing an image.
 In portable information terminals such as smartphones and tablet terminals, various processing operations are performed by touching the screen with a finger or a stylus pen (also simply called a stylus).
 FIG. 13 is a diagram for explaining a conventional procedure of clipping processing and image processing. For example, to obtain a desired image from an image displayed on the screen of a portable information terminal 101, image processing software was used to clip the image, after which the user switched to an image editing function and selected the portions to which an effect was to be applied.
 As prior art of this kind, a stylus whose function differs depending on its thickness is known; for example, a thin stylus functions as a pen and a thick stylus functions as an eraser (see Patent Document 1).
 Also known is a mail creation screen on which touching with a finger transitions to a reference screen for the original mail, touching a telephone number with a stylus places a call, and using a finger together with a stylus plays back an attached file (see Patent Document 2).
 Furthermore, assigning different functions to the finger and the stylus in various applications is known (see Patent Document 3).
 Thus, in Patent Documents 1 to 3, many functions are realized with few operations by giving different functions to the finger, the stylus, or the finger and stylus together.
Patent Document 1: Japanese Laid-Open Patent Publication No. Hei 2-299013
Patent Document 2: Japanese Unexamined Patent Publication No. 2008-084119
Patent Document 3: Japanese Unexamined Patent Publication No. 2008-108233
 However, in the above conventional image processing apparatus, when effect processing is performed with a separate tool after image clipping (range selection) with a stylus or a finger, two operations are required, and when effect locations must be specified individually, the operation becomes cumbersome.
 Moreover, in each of Patent Documents 1 to 3, unrelated functions are assigned to each input means, whether finger, stylus, or finger and stylus. The user therefore had to memorize individually what function each input means realizes. Furthermore, no processing was performed in which these input means affect each other.
 The present invention has been made in view of the above circumstances, and an object thereof is to provide an image processing apparatus, an image processing method, and an image processing program that enable simple and intuitive operation using different input means.
 An image processing apparatus of the present invention is an image processing apparatus that processes an image, comprising: an operation acceptance unit that accepts an operation on a predetermined region by an input means; an input means determination unit that determines the input means whose operation was accepted by the operation acceptance unit; a storage unit that stores processing information, assigned to each input means, for executing different processing with the same function on an image; and a processing unit that executes the processing corresponding to the processing information that, among the processing information stored in the storage unit, is assigned to the input means determined by the input means determination unit.
 According to this configuration, different processing with the same function is applied to an image selected using a finger or a stylus. This enables simple and intuitive operation using different input means.
 An image processing method of the present invention is an image processing method in an image processing apparatus that processes an image, comprising the steps of: accepting an operation on a predetermined region by an input means; determining the input means whose operation was accepted; and executing, among processing information stored in a storage unit and assigned to each input means for executing different processing with the same function on an image, the processing corresponding to the processing information assigned to the determined input means.
 According to this method, different processing with the same function is applied to an image selected using a finger or a stylus. This enables simple and intuitive operation using different input means.
 An image processing program of the present invention is a program for causing a computer to execute each step of the above image processing method.
 According to this program, different processing with the same function is applied to an image selected using a finger or a stylus. This enables simple and intuitive operation using different input means.
 According to the present invention, simple and intuitive operation can be performed using different input means.
FIG. 1 is a block diagram showing a configuration example of an image processing apparatus according to a first embodiment of the present invention.
FIG. 2 is a diagram showing an example of a table in which various functions stored in a storage unit are registered in the first embodiment.
FIG. 3 is a diagram showing the front of the housing of the image processing apparatus in the first embodiment.
FIGS. 4(A) and 4(B) are flowcharts showing an example of the operation procedure of the image processing apparatus in the first embodiment.
FIGS. 5(A) and 5(B) are diagrams showing an example of clipping processing and effect processing in the first embodiment.
FIG. 6 is a flowchart showing a first example of a range selection processing procedure in the first embodiment.
FIG. 7 is a diagram showing an example of the structure of data representing a touch-input trajectory recorded in the storage unit in the first embodiment.
FIG. 8 is a flowchart showing a second example of the range selection processing procedure in the first embodiment.
FIG. 9 is a diagram showing an example of clipping processing and effect processing when the lasso tool function is used in the first embodiment.
FIG. 10 is a block diagram showing a configuration example of an image processing apparatus according to a second embodiment of the present invention.
FIGS. 11(A) and 11(B) are flowcharts showing an example of the operation procedure of the image processing apparatus in the second embodiment.
FIG. 12 is a diagram for explaining an example of image processing in the second embodiment.
FIGS. 13(A) to 13(C) are diagrams for explaining a conventional procedure of clipping processing and image processing.
 An image processing apparatus, an image processing method, and an image processing program according to embodiments of the present invention will be described with reference to the drawings. The image processing apparatus of these embodiments is applied to a portable information terminal such as a smartphone or a tablet terminal.
(First embodiment)
 FIG. 1 is a block diagram showing a configuration example of the image processing apparatus according to the first embodiment of the present invention. Here, a functional configuration example for performing clipping (cutout) processing and effect processing is shown.
 The image processing apparatus 1 comprises a touch panel 11, a finger/stylus determination processing unit 12, a cutout range drawing processing unit 13, a connection processing unit 14, a storage unit 15, a display 16, and an operation unit 17.
 The touch panel 11 is provided on the screen of the display 16 and detects the touch position of an input means such as a finger or a stylus on a predetermined region. Contact-type and pressure-type touch panels can be used. The touch panel 11 is an example of an operation acceptance unit that accepts an operation on a predetermined region by an input means.
 The finger/stylus determination processing unit 12 determines which input means, a finger or a stylus, performed the touch. That is, the finger/stylus determination processing unit 12 functions as an input means determination unit that determines the input means whose operation was accepted by the touch panel 11.
 The finger/stylus determination processing unit 12 may determine the input means from the contact area on the touch panel 11. Since the contact area of a finger is generally larger than that of a stylus, it is easy to determine whether the input means is a finger or a stylus.
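As a minimal sketch of this contact-area test: the disclosure only states that a finger's contact patch is generally larger than a stylus tip's, so a single threshold can separate the two. The threshold value and function name below are assumptions for illustration.

```python
# Assumed sketch of the contact-area determination performed by the
# finger/stylus determination processing unit 12: one threshold separates
# the small stylus contact patch from the larger finger contact patch.

STYLUS_MAX_AREA_MM2 = 8.0  # assumed: stylus tips touch with a smaller patch

def determine_input_means(contact_area_mm2: float) -> str:
    """Classify a touch as 'stylus' or 'finger' from its contact area."""
    return "stylus" if contact_area_mm2 <= STYLUS_MAX_AREA_MM2 else "finger"
```

On a pressure-sensitive panel, the same dispatch could instead use the pressure distribution, as noted among the variations.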
 The cutout range drawing processing unit 13 performs drawing processing with the region enclosed by lines drawn by touch operations as the cutout range. That is, the cutout range drawing processing unit 13 performs clipping processing and effect processing, as well as various other control and processing.
 The connection processing unit 14 determines, when the touch panel 11 is touched again after a finger or stylus has been released, whether to connect the previous end point with the new start point. The finger/stylus determination processing unit 12, the cutout range drawing processing unit 13, and the connection processing unit 14 may be realized by dedicated hardware or, as in this embodiment, by a CPU executing a program.
 The storage unit 15 stores various data and also stores the trajectory of the touch position as coordinate information when the screen of the touch panel 11 is traced. The storage unit 15 also stores clipped images.
 The storage unit 15 also uses a table 15a to store processing information, assigned to each input means, for executing different processing with the same function on an image. Functions corresponding to drawing with a finger or a stylus are registered in advance in the table 15a. FIG. 2 shows an example of the table 15a in which various functions stored in the storage unit 15 are registered. In the table 15a, combinations of functions realized by finger and stylus operations during clipping processing and image processing are registered as selectable processing information. For example, by pressing the menu key 17a and selecting a desired combination of functions in advance, the user can choose the combination suited to the intended use.
 Specifically, for clipping processing, the range selection function includes the combination of "jagged cut edge" for the finger and "straight cut edge" for the stylus; the combination of "tearing effect" for the finger and "scissors" for the stylus; the combination of "page-turn effect" for the finger and "normal cutout" for the stylus; and the combination of "lasso tool" for the finger and "normal cutout" for the stylus. The functions for image processing will be described in the second embodiment below.
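The table 15a could be represented, for instance, as a mapping from each selectable combination to the pair of functions assigned to the finger and the stylus. The data structure and key names below are illustrative assumptions; only the function pairs themselves come from the description above.

```python
# Hypothetical rendering of table 15a: each selectable entry pairs a finger
# function with a stylus function for the same range-selection operation.

TABLE_15A = {
    "cut-edge style": {"finger": "jagged cut edge",  "stylus": "straight cut edge"},
    "tear/scissors":  {"finger": "tearing effect",   "stylus": "scissors"},
    "page-turn":      {"finger": "page-turn effect", "stylus": "normal cutout"},
    "lasso":          {"finger": "lasso tool",       "stylus": "normal cutout"},
}

def selected_function(combination: str, input_means: str) -> str:
    """Look up the function assigned to an input means for the chosen entry."""
    return TABLE_15A[combination][input_means]
```

Selecting a combination with the menu key 17a then amounts to choosing which entry subsequent touch inputs are dispatched through.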
 The display 16 is an LCD or the like, and displays various images as well as drawn lines representing the cutout range and so on. The operation unit 17 comprises various keys.
 The cutout range drawing processing unit 13 executes the processing corresponding to the processing information that, among the processing information stored in the storage unit 15, is assigned to the input means determined by the finger/stylus determination processing unit 12. The cutout range drawing processing unit 13 may execute any one of the clipping functions stored in the table 15a, or two or more of them. The processing to be executed may be predetermined or may be set by the user through an input operation.
 FIG. 3 shows an example of the front of the housing of the image processing apparatus 1. The display 16, the operation unit 17, a microphone 18a, and a speaker 18b are provided on the front of the housing of the image processing apparatus 1. As described above, the touch panel 11 is provided on the screen of the display 16, on which various images are displayed. An operation unit 17 consisting of a touch panel is provided below the screen of the display 16. Here, a menu key 17a, a clear key 17b, and a confirmation key 17c are set switchably as the operation unit 17. For example, it is possible to switch to a lasso key 60 described later. A stylus 21 that the user can hold and operate is also provided.
 The operation of the image processing apparatus 1 having the above configuration will be described.
 FIGS. 4(A) and 4(B) are flowcharts showing an example of the operation procedure of the image processing apparatus 1. An image processing program describing this operation is stored in a ROM in the image processing apparatus 1 and is executed by a CPU in the image processing apparatus 1.
 FIG. 4(A) shows an example of range selection processing. This range selection processing is executed when the start of the clipping mode is selected with the menu key 17a. Here, the range is selected using a finger and a stylus (a plurality of input means).
 The image processing apparatus 1 waits until a touch input is performed by the user on the touch panel 11 (step S1). When a touch input is performed, the finger/stylus determination processing unit 12 determines whether the touch input is by a finger or by a stylus (step S2). In step S2, the cutout range drawing processing unit 13 also acquires the coordinates of the touch position and records them in the storage unit 15.
 The cutout range drawing processing unit 13 determines whether the touch input has been released (step S3). If it has not been released, the image processing apparatus 1 returns to step S2.
 If it has been released, the cutout range drawing processing unit 13 determines whether the end of the clipping mode has been selected with the menu key 17a (step S4). If the end of the clipping mode has not been selected, the image processing apparatus 1 returns to step S1. If it has been selected, the image processing apparatus 1 ends this operation.
 FIG. 4(B) shows an example of clipping processing. This clipping processing is executed after the processing of FIG. 4(A).
 The cutout range drawing processing unit 13 clips the image registered in the storage unit 15 (the image being worked on) based on the selection range obtained from the coordinates recorded in the storage unit 15 (step S11). The image processing apparatus 1 starts processing for each coordinate group of the clipped image (step S12). Here, a coordinate group is, for example, the group of coordinates of a trajectory traced by a finger, which is distinguished from the group of coordinates of a trajectory subsequently traced by a stylus.
 The finger/stylus determination processing unit 12 determines whether the touch input was made by a finger (step S13). If it was a finger touch input, the cutout range drawing processing unit 13 performs effect processing for the finger (step S14). In the effect processing for the finger, as described later, a tearing process is performed, for example, as if the paper were torn with a finger.
 If the touch input in step S13 was made by a stylus rather than a finger, the cutout range drawing processing unit 13 performs effect processing for the stylus (step S15). In the effect processing for the stylus, as described later, a straight-line cutting process is performed, for example, as if cut with scissors.
 After step S14 or S15, the cutout range drawing processing unit 13 determines whether all coordinate groups have been processed (step S16). If not all coordinate groups have been processed, the image processing apparatus 1 returns to step S12.
 When all coordinate groups have been processed, the cutout range drawing processing unit 13 stores the clipped image in the storage unit 15 as an output image (step S17). This stored output image is not only displayed on the screen of the display 16 but can also be output as data via an external output interface (not shown).
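Steps S12 to S17 can be sketched as a loop over the coordinate groups, applying the finger or stylus effect to the segment each group traced. The dictionary structure and effect labels below are assumptions for illustration.

```python
# Assumed sketch of steps S12-S17: each coordinate group records which input
# means traced it, and the matching edge effect is applied to that portion
# of the clipped image's outline.

def apply_effects(coordinate_groups):
    """Apply the per-input-means edge effect to each traced segment."""
    processed = []
    for group in coordinate_groups:        # step S12: one group at a time
        if group["input"] == "finger":     # step S13: finger touch input?
            effect = "torn edge"           # step S14: effect for the finger
        else:
            effect = "scissors edge"       # step S15: effect for the stylus
        processed.append({"points": group["points"], "effect": effect})
    return processed                       # step S17: assembled output image data
```

With this shape, a single clipping operation yields different edge styles on the portions drawn with different input means, as in FIG. 5(B).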
 FIGS. 5(A) and 5(B) show an example of clipping processing and effect processing. As shown in FIG. 5(A), the user draws a line 31 (an example of a first line) with the stylus 21 from the left side to the upper side of an image 30 displayed on the screen of the display 16, and draws a line 33 (an example of a second line) with a finger 25 from the right side to the lower side. As a result, an image 35 of the cutout range (selection range) is determined.
 When clipping is performed on this cutout range, as shown in FIG. 5(B), the clipped image 35 has an effect on the portion from the left side to the upper side as if cut with scissors, for example, and an effect on the portion from the right side to the lower side as if torn off with a finger, for example. Here, the stylus 21 corresponds to a first input means, and the finger corresponds to a second input means.
 FIG. 6 is a flowchart showing an example of a detailed range selection procedure. An image processing program describing this operation is stored in the ROM in the image processing apparatus 1 and is executed by the CPU in the image processing apparatus 1.
 First, the image processing apparatus 1 waits until there is a touch input by the user (step S31). When there is a touch input, the finger/stylus determination processing unit 12 determines whether the touch input is by the user's finger or by a stylus (step S32). In step S32, the cutout range drawing processing unit 13 also acquires the coordinates of the touch position (start point).
 The cutout range drawing processing unit 13 determines whether the current touch input is the first one (step S33). If it is not the first, the connection processing unit 14 performs processing to connect the end point of the previous touch input with the start point of the current touch input (step S34). By connecting the previous end point to the current start point in this way, the selection range can be determined as a closed region, making the selection range unambiguous.
 If it is the first touch input in step S33, or after the connecting processing of step S34, the cutout range drawing processing unit 13 acquires the coordinates of the touch input and stores them, including the coordinates of the start point, in the storage unit 15 (step S35).
 The cutout range drawing processing unit 13 determines whether the touch input has been released (step S36). If it has not been released, the image processing apparatus 1 returns to step S35 and repeats the same operation. As a result, the coordinate group of the touch-input trajectory is recorded in the storage unit 15.
 FIG. 7 shows an example of the structure of the data representing the touch-input trajectory recorded in the storage unit 15. This data contains the type of input (finger or stylus), the X and Y coordinates of the start point, the X and Y coordinates of the end point, and the X and Y coordinates of each point of the trajectory (the coordinate group).
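The record of FIG. 7 might be modeled as follows; the field names are assumptions, since the disclosure specifies only what the record contains (input type, start point, end point, and the coordinate group of the trajectory).

```python
# Assumed sketch of the trajectory record of FIG. 7 as stored in the
# storage unit 15: input-means type, start point, end point, and the
# coordinate group of the traced trajectory.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Trajectory:
    input_type: str                        # "finger" or "stylus"
    start: Tuple[int, int]                 # start-point X, Y coordinates
    end: Tuple[int, int]                   # end-point X, Y coordinates
    points: List[Tuple[int, int]] = field(default_factory=list)  # coordinate group
```

One such record per stroke is enough both for the per-input-means effect dispatch and for the endpoint-based connection determination described below.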
 When the touch input is released in step S36, the cutout range drawing processing unit 13 stores the end point of the current touch input in the storage unit 15 (step S37).
 Thereafter, the cutout range drawing processing unit 13 determines whether the confirmation key 17c has been pressed (step S38). If the confirmation key 17c has not been pressed, the image processing apparatus 1 returns to step S31. If it has been pressed, the image processing apparatus 1 waits until the end of the clipping mode is selected with the menu key 17a (step S39). When the end of the clipping mode is selected, the image processing apparatus 1 ends this operation.
 FIG. 8 is a flowchart showing a second example of the range selection procedure. An image processing program describing this operation is stored in the ROM in the image processing apparatus 1 and is executed by the CPU in the image processing apparatus 1.
 First, the image processing apparatus 1 waits until there is a touch input by the user (step S41). When there is a touch input, the finger/stylus determination processing unit 12 determines whether the touch input is by the user's finger or by a stylus (step S42). In this step, the cutout range drawing processing unit 13 also acquires the coordinates of the touch position (start point) and stores them in the storage unit 15.
 The cutout range drawing processing unit 13 acquires the coordinates of the touch input and stores them in the storage unit 15 (step S43). The cutout range drawing processing unit 13 then determines whether the touch input has been released (step S44). If it has not been released, the image processing apparatus 1 returns to step S43 and repeats the same operation. When it is released in step S44, the cutout range drawing processing unit 13 stores the coordinates of the end point of the touch input in the storage unit 15 (step S45).
 Thereafter, the connection processing unit 14 reads the coordinates of the start and end points of all lines from the storage unit 15 and performs connection determination (step S46). In this connection determination, it is judged whether start points and end points recorded so far lie within a certain range (at or below a predetermined distance) of each other. That is, when an end point of one touch-input trajectory is close to an end point of another touch-input trajectory, these points are determined to be "connected". This makes it possible to connect not only to the previous touch-input trajectory (line) but also to earlier touch-input lines, simplifying touch input and increasing the freedom of range selection.
 The connection processing unit 14 determines whether a "connect" decision was made as a result of the determination in step S46 (step S47). If so, the connection processing unit 14 performs connection processing to join the points to be connected (step S48).
 If no "connect" decision was made in step S47, or after the connection processing of step S48, the cutout range drawing processing unit 13 determines whether the confirmation key 17c has been pressed (step S49). If the confirmation key 17c has not been pressed, the image processing apparatus 1 returns to step S41. If it has been pressed, the image processing apparatus 1 waits until the end of the clipping mode is selected with the menu key 17a (step S50). When the end of the clipping mode is selected, the image processing apparatus 1 ends this operation.
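The connection determination of step S46 can be sketched as a pairwise distance test on the recorded endpoints: endpoints of different trajectories within a predetermined distance of each other are judged "connect". The Euclidean metric, the threshold value, and the function names below are assumptions.

```python
# Assumed sketch of the connection determination (step S46): endpoints of
# different touch-input trajectories that lie within a predetermined
# distance of each other are judged to be connected.

import math

CONNECT_DISTANCE = 10.0  # assumed threshold in pixels

def should_connect(p1, p2, threshold=CONNECT_DISTANCE):
    """Judge whether two trajectory endpoints should be joined."""
    return math.dist(p1, p2) <= threshold

def connection_pairs(endpoints):
    """Return index pairs of endpoints judged to be connected."""
    pairs = []
    for i in range(len(endpoints)):
        for j in range(i + 1, len(endpoints)):
            if should_connect(endpoints[i], endpoints[j]):
                pairs.append((i, j))
    return pairs
```

Because every recorded endpoint is tested against every other, a stroke can be joined to any earlier stroke, not only to the immediately preceding one, which is what gives this second example its extra freedom of range selection.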
 FIG. 9 is a diagram showing an example of the cutout process and the effect process when the lasso tool function is used. With the lasso tool, the cutout range drawing processing unit 13 recognizes a person's face or another element contained in the image as an object (object recognition processing by the object recognition unit) and captures the recognized object.
 As described above, after the line 51 is drawn with the stylus 21 from the left side to the upper side of the image 50 displayed on the screen of the display 16, and the line 53 is drawn by tracing with the finger 25 from the right side to the lower side, the lasso key 60 is pressed to perform the cutout process. Alternatively, the lasso tool function may be executed automatically at the moment the line 53 is drawn with the finger, without pressing the lasso key 60. As a result, the cut-out image 55 consists of a background portion 55a and a person's face 55b captured by the lasso tool.
 In this way, the cutout range drawing processing unit 13 serving as a processing unit may have the function of an object recognition unit that recognizes objects contained in an image. The cutout range drawing processing unit 13 may also recognize a selection range enclosed by a first line and by a line that connects the start end and terminal end of a second line along the recognized object. The first line is, for example, a line input with a stylus serving as a first input means; the second line is, for example, a line input with a finger serving as a second input means.
 This makes it possible to cut out an image along a specific target (object) contained in the image. The edge 57 of the image 55 cut out along the face may be a jagged straight line or a smooth curve.
 As described above, according to the image processing apparatus 1 of this embodiment, when an image is cut out and worked on, related cutout processes belonging to the same function are assigned to touch inputs made with a finger and with a stylus, and the processes influence one another. Here, the images selected with the finger and with the stylus undergo different processing within the same function.
 In this way, when the cutout range drawing processing unit 13 serving as a processing unit performs image cutout as its function, it may apply, to the image in the selection range enclosed by lines input by a plurality of input means, processing for working the image that differs for each input means. Consequently, a simple and intuitive operation can be performed using different input means.
 The cutout range drawing processing unit 13 serving as a processing unit may also perform a straight-line cutout process on the image selected along the first line, and a shredding process on the image selected along the second line. This allows the range selection process, cutout process, and effect process to be performed with a single operation of the stylus and finger, improving operability.
 Further, when the distance between an end point of the first line and an end point of the second line is at or below a predetermined distance, the cutout range drawing processing unit 13 serving as a processing unit may connect the end point of the first line and the end point of the second line, and recognize the range enclosed by the connected first and second lines as the selection range. A connection process is thus executed for the input end points, so the range selection can be performed properly even when the end point of the stylus input and the end point of the finger input are slightly apart, improving operability.
(Second Embodiment)
 In the first embodiment, the cutout process and the effect process were described; the second embodiment covers the case where image processing is performed.
 FIG. 10 is a block diagram showing a configuration example of an image processing apparatus 1A according to the second embodiment of the present invention. It shows the functional configuration of the image processing apparatus 1A when image processing is performed. Components identical to those of the first embodiment are given the same reference numerals, and their description is omitted.
 In the image processing apparatus 1A, an image processing range drawing processing unit 23 performs drawing processing that treats the range enclosed by a touch operation as the image processing range. The storage unit 15A stores the image processing range and the finger/stylus determination result, and also holds the table 15a described above. The remaining configuration is the same as in the first embodiment.
 As described above, the storage unit 15A stores in advance, in the table 15a, information on the functions corresponding to drawing with a finger or a stylus. In the table 15a, apart from the clipping process described above, combinations of functions realized by finger and stylus operations during image processing are registered so as to be selectable. For example, by pressing the menu key 17a and selecting a desired combination of functions in advance, the user can choose a combination suited to the intended use.
 Specifically, in the case of image processing, the soft-focus processing function offers the combination of "blur" with the finger and "sharpen" with the stylus. The brightness adjustment function offers "darken" with the finger and "brighten" with the stylus. The screen effect functions offer "remove the glitter effect" with the finger and "apply the glitter effect" with the stylus; "remove the mosaic" with the finger and "apply the mosaic" with the stylus; and "apply the page-turn effect" with the finger and "remove the page-turn effect" with the stylus.
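 The selectable combinations registered in the table 15a amount to a lookup keyed by function and input means. The dictionary below is an illustrative sketch of that table; the key names are assumptions introduced here, not identifiers from the patent.

```python
# Illustrative reconstruction of table 15a: for each function, the same
# operation yields a paired (often opposite) effect depending on the input
# means that made the touch.
TABLE_15A = {
    "soft_focus":     {"finger": "blur",             "stylus": "sharpen"},
    "brightness":     {"finger": "darken",           "stylus": "brighten"},
    "glitter_effect": {"finger": "remove glitter",   "stylus": "apply glitter"},
    "mosaic":         {"finger": "remove mosaic",    "stylus": "apply mosaic"},
    "page_turn":      {"finger": "apply page-turn",  "stylus": "remove page-turn"},
}

def look_up(function, input_means):
    """Return the processing assigned to this input means for this function."""
    return TABLE_15A[function][input_means]

print(look_up("brightness", "stylus"))  # brighten
print(look_up("brightness", "finger"))  # darken
```

 Selecting a row of this table with the menu key 17a corresponds to choosing, in advance, which function the next traced strokes will control.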
 The image processing range drawing processing unit 23 may execute any one of the image processing functions stored in the table 15a, or two or more of them. The processing to be executed may be determined in advance, or may be set by the user through an input operation.
 The operation of the image processing apparatus having the above configuration will now be described.
 FIGS. 11A and 11B are flowcharts showing an example of the operation procedure of the image processing apparatus 1A. The image processing program describing this operation is stored in the ROM of the image processing apparatus 1A and is executed by the CPU of the image processing apparatus 1A.
 FIG. 11A shows an example of the range selection procedure. This range selection process is executed when the start of the image processing range designation mode is selected with the menu key 17a.
 The image processing apparatus 1A waits until the user makes a touch input on the touch panel 11 (step S51). When a touch input is made, the finger/stylus discrimination processing unit 12 determines whether the input was made with a finger or with a stylus (step S52). In the process of step S52, the image processing range drawing processing unit 23 also acquires the coordinates of the touch position and records them in the storage unit 15A.
 The image processing range drawing processing unit 23 then determines whether the touch input has been released (step S53). If it has not been released, the image processing apparatus 1A returns to the process of step S52.
 If it has been released, the image processing apparatus 1A determines whether the end of the image processing range designation mode has been selected with the menu key 17a (step S54). If the end of the mode has not been selected, the image processing apparatus 1A returns to the process of step S51; if it has been selected, the image processing apparatus 1A ends this operation.
 FIG. 11B shows an example of the image processing procedure. This image processing is executed after the range selection process has been performed.
 Based on the selection range obtained from the coordinates recorded in the storage unit 15A, the image processing range drawing processing unit 23 performs image processing on the image registered in the storage unit 15A (the work target image) (step S61). The image processing range drawing processing unit 23 then stores the processed image in the storage unit 15A as an output image (step S62). The stored output image can not only be displayed on the screen of the display 16 but also be output as data via an external output interface (not shown).
 FIG. 12 is a diagram for explaining the image processing. In the displayed image, when portions to be processed are designated with the stylus 21 and the finger 25, image processing is applied to the designated portions. Here, the portion traced with the finger or the stylus 21 is designated as a selection range by a line of predetermined width. The portion 71 traced with the stylus 21 undergoes, for example, image processing that raises the brightness (represented by dots in the figure), while the portion 73 traced with the finger 25 undergoes, for example, image processing that lowers the brightness (represented by hatching in the figure).
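 The per-tool brightness adjustment of FIG. 12 can be sketched as follows. The grayscale image representation, the mask-based handling of the traced region, and the adjustment amounts are all assumptions made here for illustration.

```python
def adjust_brightness(image, selected, delta):
    """Apply a brightness offset, clamped to 0-255, to the pixels inside the
    selection mask. `image` is a list of rows of grayscale values; `selected`
    is a same-shaped mask of booleans marking the traced region."""
    return [
        [min(255, max(0, px + delta)) if sel else px
         for px, sel in zip(row_px, row_sel)]
        for row_px, row_sel in zip(image, selected)
    ]

image = [[100, 100], [100, 100]]
stylus_mask = [[True, False], [False, False]]  # portion 71, traced by stylus
finger_mask = [[False, False], [False, True]]  # portion 73, traced by finger

image = adjust_brightness(image, stylus_mask, +40)  # stylus raises brightness
image = adjust_brightness(image, finger_mask, -40)  # finger lowers brightness
print(image)  # [[140, 100], [100, 60]]
```

 The two masks would in practice be built from the strokes recorded in the storage unit 15A, widened to the predetermined line width described above.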
 Thus, according to the image processing apparatus 1A of this embodiment, when image processing is performed, related processes belonging to the same function are assigned to touch inputs made with a finger and with a stylus, and the processes influence one another. Therefore, a simple and intuitive operation using different input means is possible during image processing as well. Furthermore, a single touch input can carry out both the range selection process and the image processing (corresponding to the cutout process and effect process of the first embodiment), which improves operability.
 Note that the first embodiment and the second embodiment can be combined.
 The present invention is not limited to the configurations of the above embodiments, and any configuration that can achieve the functions set forth in the claims, or the functions of the configurations of these embodiments, is applicable.
 For example, in the above embodiments, the combination of two input means, a finger and a stylus, was shown, but three or more input means may be combined. For example, the two ends of the stylus may be given different thicknesses, and different processing may be assigned to each of three input means: touch input with the thin end, touch input with the thick end, and the finger. Furthermore, by raising the accuracy with which the contact area of a finger is identified, the thumb, index finger, and little finger could be distinguished and, together with the stylus, serve as four input means.
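 Discriminating more than two input means from contact area, as suggested above, amounts to bucketing the measured area into ranges. The sketch below illustrates the idea; the threshold values and category names are invented for illustration and are not taken from the patent.

```python
def classify_input(contact_area_mm2):
    """Classify the input means by touch contact area (assumed thresholds),
    ordered from the smallest contact patch to the largest."""
    if contact_area_mm2 < 3:
        return "stylus (thin end)"
    if contact_area_mm2 < 8:
        return "stylus (thick end)"
    if contact_area_mm2 < 40:
        return "index or little finger"
    return "thumb"

print(classify_input(1.5))   # stylus (thin end)
print(classify_input(25.0))  # index or little finger
```

 With such a classifier, the table 15a could simply gain one column per recognized input means, each holding its own assigned processing.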
 In the above embodiments, the cutout process, the effect process, and image processing were described, but the present invention is not limited to these processes and can also be applied to input processing, search processing, and the like.
 In the above embodiments, the input means was determined from the contact area of the finger or stylus on the touch panel, but this is not limiting; for example, a pressure-sensitive touch panel may be used to distinguish the stylus from the finger by pressure distribution.
 In the above embodiments, the touch panel 11 provided on the screen of the display 16 of the image processing apparatus 1 or 1A was used as an example of the operation receiving unit, but a touch pad provided at a position away from the display 16 may also be used.
 The present invention also covers an image processing program that realizes the functions of the above embodiments, supplied to an image processing apparatus via a network or various storage media and read and executed by a computer in the image processing apparatus.
 Although the present invention has been described in detail and with reference to specific embodiments, it will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention.
 This application is based on Japanese Patent Application No. 2011-248847 filed on November 14, 2011, the contents of which are incorporated herein by reference.
 The present invention is useful in that it enables simple and intuitive operation using different input means when processing an image.
DESCRIPTION OF SYMBOLS
1, 1A Image processing apparatus
11 Touch panel
12 Finger/stylus discrimination processing unit
13 Cutout range drawing processing unit
14 Connection processing unit
15, 15A Storage unit
15a Table
16 Display
17 Operation unit
17a Menu key
17b Clear key
17c Enter key
18a Microphone
18b Speaker
21 Stylus
23 Image processing range drawing processing unit
25 Finger
30, 35 Image
31, 33 Line
50, 55 Image
51, 53 Line
55a Background portion
55b Face
57 Image edge
71, 73 Drawn portion

Claims (8)

  1.  An image processing apparatus for processing an image, comprising:
     an operation receiving unit that receives an operation on a predetermined area by an input means;
     an input means determination unit that determines the input means whose operation has been received by the operation receiving unit;
     a storage unit that stores processing information, assigned to each input means, for executing different processing with the same function on an image; and
     a processing unit that executes, among the processing information stored in the storage unit, the processing corresponding to the processing information assigned to the input means determined by the input means determination unit.
  2.  The image processing apparatus according to claim 1, wherein
     the operation receiving unit receives an operation when the input means contacts the predetermined area, and
     the input means determination unit determines the input means according to the contact area of the input means received by the operation receiving unit.
  3.  The image processing apparatus according to claim 1 or 2, wherein,
     when performing image cutout as the function, the processing unit applies, to an image in a selection range enclosed by lines input by a plurality of the input means, processing for working the image that differs for each input means.
  4.  The image processing apparatus according to claim 3, wherein
     the processing unit performs a straight-line cutout process on an image selected along a first line input by a first input means among the plurality of input means, and performs a shredding process on the image selected along a second line input by a second input means.
  5.  The image processing apparatus according to claim 3, further comprising
     an object recognition unit that recognizes an object included in the image, wherein
     the processing unit recognizes the selection range enclosed by a first line input by a first input means among the plurality of input means and by a line connecting the start end and terminal end of a second line, input by a second input means, along the object recognized by the object recognition unit.
  6.  The image processing apparatus according to claim 4 or 5, wherein,
     when the distance between an end point of the first line input by the first input means and an end point of the second line input by the second input means is at or below a predetermined distance, the processing unit connects the end point of the first line and the end point of the second line, and recognizes the range enclosed by the connected first and second lines as the selection range.
  7.  An image processing method in an image processing apparatus for processing an image, comprising the steps of:
     receiving an operation on a predetermined area by an input means;
     determining the input means whose operation has been received; and
     executing, among processing information stored in a storage unit and assigned to each input means for executing different processing with the same function on an image, the processing corresponding to the processing information assigned to the determined input means.
  8.  An image processing program for causing a computer to execute each step of the image processing method according to claim 7.
PCT/JP2012/006729 2011-11-14 2012-10-19 Image processing device, image processing method and image processing program WO2013073109A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-248847 2011-11-14
JP2011248847A JP2013105324A (en) 2011-11-14 2011-11-14 Image processing system, image processing method and image processing program

Publications (1)

Publication Number Publication Date
WO2013073109A1 true WO2013073109A1 (en) 2013-05-23

Family

ID=48429211

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/006729 WO2013073109A1 (en) 2011-11-14 2012-10-19 Image processing device, image processing method and image processing program

Country Status (2)

Country Link
JP (1) JP2013105324A (en)
WO (1) WO2013073109A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015176483A (en) * 2014-03-17 2015-10-05 富士通株式会社 Image processing program, image processing method, and information processing device
JP6904447B1 (en) 2020-02-20 2021-07-14 株式会社セガ Yugi image shooting equipment and programs

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02299013A (en) 1989-05-15 1990-12-11 Kyocera Corp Electronic system notebook device
JP2008084119A (en) 2006-09-28 2008-04-10 Kyocera Corp Mobile terminal and its control method
JP2008108233A (en) 2006-09-28 2008-05-08 Kyocera Corp Portable terminal and method for controlling the same
JP3143445U (en) * 2007-05-15 2008-07-24 宏達國際電子股▲ふん▼有限公司 Electronic devices that do not interfere with contact movement
JP2010044520A (en) * 2008-08-11 2010-02-25 Sony Ericsson Mobile Communications Ab Input processor, input processing method, input processing program and mobile terminal device
JP2011003074A (en) * 2009-06-19 2011-01-06 Sharp Corp Input method, input device and electric apparatus


Also Published As

Publication number Publication date
JP2013105324A (en) 2013-05-30

Similar Documents

Publication Publication Date Title
US11461004B2 (en) User interface supporting one-handed operation and terminal supporting the same
JP5094158B2 (en) Terminal and control method of terminal with touch screen
JP6728275B2 (en) Virtual computer keyboard
US20130263013A1 (en) Touch-Based Method and Apparatus for Sending Information
CN103631514B (en) The method of operation for touch pen function and the electronic device for supporting this method
JP5204305B2 (en) User interface apparatus and method using pattern recognition in portable terminal
JP2004213269A (en) Character input device
US9386174B2 (en) Image forming apparatus, method for guidance on operation method by image forming apparatus, and system
JP2007272904A (en) Terminal equipment and method for selecting screen display item
US9229615B2 (en) Method and apparatus for displaying additional information items
US20150091804A1 (en) Technique for improving operability in switching character types in software keyboard
US20190220170A1 (en) Method and apparatus for creating group
WO2013073109A1 (en) Image processing device, image processing method and image processing program
JPWO2014045414A1 (en) Character input device, character input method, character input control program
KR102176458B1 (en) Method and apparatus for Performing Box Drawing for Data Labeling
KR20080096732A (en) Touch type information inputting terminal, and method thereof
CN114860149A (en) Content editing control method and device, electronic equipment and storage medium
CN110377219B (en) Interface interaction method and terminal
EP3457269B1 (en) Electronic device and method for one-handed operation
KR20120070133A (en) Apparatus for providing virtual touch interface using camera and method thereof
JP7268449B2 (en) Display control device, display control method, and display control program
JP7472950B2 (en) Touch panel type information terminal device and information input processing method thereof
KR20130094660A (en) Operation method and system for a plurality of touch panel, and portable device supporting the same
US20150235405A1 (en) Display of a data source indicator and a data sink indicator
JP6457170B2 (en) Portable electronic devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12850712

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2012850712

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE