WO2015163118A1 - Character specifying device, and control program
- Publication number
- WO2015163118A1 (PCT/JP2015/060640)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- character
- area
- line
- region
- character string
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
Description
- The present invention relates to a character specifying device for specifying characters included in a display image.
- A technique is known for recognizing characters (character strings) included in an image and executing processing using the recognized characters.
- For example, techniques are known for acquiring information about the recognized characters and displaying an image generated from that information on the image containing the characters, and for translating extracted characters and outputting the translation result.
- Patent Document 1 describes a technique for recognizing characters only in a character string in a region selected by a user with a one-action operation on a touch panel.
- Patent Document 2 (Japanese Patent Application Laid-Open No. 2004-228561) describes a technique for extracting only character string lines from the entire screen and recognizing the character string line closest to the coordinates touched by the user on the touch panel.
- Patent Document 3 listed below describes a technique for cutting out a character string from an image displayed in the vicinity of coordinates touched on a touch panel by a user.
- However, the techniques of Patent Documents 1 to 3 have the following problems.
- In the technique of Patent Document 1, erroneous recognition may occur when the target character string does not fit within the selected range.
- In the technique of Patent Document 2, since character string lines must be extracted from the entire screen, specifying a character string may take time.
- In the technique of Patent Document 3, a character string not intended by the user may be cut out together with the target, so extra steps may be required before the target character string is specified. That is, the techniques of Patent Documents 1 to 3 cannot reliably specify only the characters intended by the user in a short time.
- When a device such as a smartphone displays an image input from a camera for shooting (a so-called through image) and performs character recognition on that through image, the characters shown on the display change each time the user moves the device, so the problems of Patent Documents 1 to 3 become particularly pronounced.
- The present invention has been made in view of the above problems, and its object is to realize a character specifying device or the like that can accurately specify only the target characters in a short time.
- In order to solve the above problem, a character specifying device according to one aspect of the present invention is a character specifying device for specifying a character included in a display image, and includes: a character recognition execution unit that performs character recognition on an enlarged region obtained by enlarging a specific region specified by a user operation on the display image; and a selected character specifying unit that specifies, as the character selected by the user, a character whose degree of overlap between the specific region and the region set for that character included in the enlarged region obtained by the character recognition of the character recognition execution unit exceeds a predetermined amount.
- FIG. 1 is a block diagram showing an example of the main configuration of the smartphone according to Embodiments 1 to 3 of the present invention. The remaining figures show an outline of the processing described below.
- First, the smartphone 1 specifies the total touch area 11, which is an area specified by an input operation, in an image 10 containing a character string (hereinafter referred to as the character string image 10).
- Next, the smartphone 1 specifies the character recognition area 12 obtained by enlarging the total touch area 11.
- The smartphone 1 then performs character recognition on the character recognition area 12.
- The character recognition target may be a single character or a character string composed of a plurality of characters; in the following, a character string is used as the example of the character recognition target.
- Subsequently, the smartphone 1 specifies the extracted line areas, which are the areas of the lines included in the character recognition area 12.
- In the illustrated example, the smartphone 1 specifies the extracted line area 13, the extracted line area 14, and the extracted line area 15.
- Finally, the smartphone 1 specifies the selected character string according to the degree of overlap between each extracted line area and the total touch area 11; in the illustrated example, the selected character string 16 is specified.
- Since the smartphone 1 performs character recognition not on the entire image but on an enlarged area obtained by enlarging the specific area, it can execute character recognition in a short time.
- Further, since the smartphone 1 specifies the character string whose degree of overlap between the specific area and the area set for the character string included in the enlarged area obtained by character recognition exceeds a predetermined amount, it can specify the selected character string with high accuracy.
- Hereinafter, Embodiments 1 to 3 will be described in detail.
- The smartphone 1 is described as an example, but the present invention can be applied to any character specifying device that specifies characters included in a display image.
- Embodiment 1: An embodiment of the present invention will be described below with reference to FIGS. 1 and 3 to 8. First, the main configuration of the smartphone 1 (character specifying device) is described with reference to FIG. 1.
- As shown in FIG. 1, the smartphone 1 includes at least a touch panel 2, a photographing unit 3, a control unit 4, a storage unit 5, and a communication unit 6.
- The touch panel 2 includes an input unit 21 and a display unit 22.
- The input unit 21 is an input device that accepts input operations by the user.
- The touch panel 2 described in the present embodiment is rectangular, and an XY plane coordinate system whose origin is the upper left vertex of the coordinate detection area of the touch panel 2 is set on the touch panel 2, as described later.
- The input unit 21 includes a touch surface that accepts contact (including approach) of an indicator (such as a finger or a pen) and a touch sensor that detects the presence or absence of contact between the indicator and the touch surface and the contact position (coordinates).
- The touch sensor may be realized by any sensor as long as it can detect the presence or absence of contact between the indicator and the touch surface.
- The input unit 21 outputs the detected coordinates and the like to the input specifying unit 41 described later.
- The display unit 22 is a display device that displays information processed by the smartphone 1 as an image in a display area, and is, for example, an LCD (liquid crystal display). Specifically, the display unit 22 displays information processed by a display control unit 46 described later. An example of the image displayed on the display unit 22 is the character string image 10.
- The photographing unit 3 is a photographing device for photographing an object, that is, a so-called camera.
- As the photographing unit 3, an existing camera of the type generally mounted on smartphones can be used.
- The photographing unit 3 is controlled by an image acquisition unit 47, described later, and photographs an object.
- The photographing unit 3 outputs the photographed still image or video to the image acquisition unit 47.
- The control unit 4 performs overall control of each unit included in the smartphone 1.
- The control unit 4 includes an input specifying unit 41, an extraction area specifying unit 42, a character recognition unit 43 (character recognition execution unit), a selected character string specifying unit 44 (selected character specifying unit), a process execution unit 45, a display control unit 46, and an image acquisition unit 47.
- The input specifying unit 41 identifies the input operation accepted by the input unit 21. Specifically, the input specifying unit 41 identifies the input operation based on the coordinates of the operation, the time the indicator is in contact with the input surface, the moving direction of the indicator while in contact, and the like. Examples of input operation types include single tap, flick, and drag, but the types are not limited to these. For example, while an image including a character string (the character string image 10) is displayed on the display unit 22, the input specifying unit 41 identifies a single tap, drag, or flick operation performed by the user on a character string in the character string image, and supplies the coordinates of the operation to the extraction area specifying unit 42. In addition, the input specifying unit 41 identifies an operation for performing shooting (for example, a single tap on a UI component displayed on the display unit 22) and supplies a shooting instruction to the image acquisition unit 47.
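- The way an input operation can be classified from contact coordinates, contact time, and movement, as described above, is sketched below in Python. This is only an illustration of the idea, not the patent's implementation; the `TouchTrace` structure and the thresholds `move_threshold` and `flick_max_duration` are hypothetical values chosen for the example.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TouchTrace:
    points: List[Tuple[float, float]]  # coordinates sampled while the indicator touches the panel
    duration: float                    # contact time in seconds

def classify_gesture(trace: TouchTrace,
                     move_threshold: float = 10.0,     # hypothetical: pixels of travel counted as movement
                     flick_max_duration: float = 0.2,  # hypothetical: short moving contact treated as a flick
                     ) -> str:
    """Rough single tap / flick / drag classification from coordinates, contact time, and movement."""
    x0, y0 = trace.points[0]
    x1, y1 = trace.points[-1]
    moved = abs(x1 - x0) > move_threshold or abs(y1 - y0) > move_threshold
    if not moved:
        return "single tap"
    return "flick" if trace.duration <= flick_max_duration else "drag"
```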
- The extraction area specifying unit 42 specifies the enlarged area, which is an area obtained by enlarging the area specified by the input operation (the specific area).
- The identification of the enlarged area is described with reference to FIGS. 3 and 4. Note that the origin of the coordinates shown in FIG. 3 corresponds to the upper left vertex of the coordinate detection area, which is the area in which the touch sensor of the input unit 21 can detect coordinates.
- In the present embodiment, the coordinate detection area is described as a rectangular area, but the present invention is not limited to this example.
- First, the extraction area specifying unit 42 specifies the total touch area 11 (specific area), which is the area selected by the input operation performed on the touch panel 2.
- FIG. 3A is a diagram illustrating identification of the total touch area when the user performs a single tap operation on the touch panel 2. That is, in the example of FIG. 3A, the user touches the touch panel 2 with the indicator S (the user's finger) and then lifts the indicator S from the touch panel without moving it.
- In this case, when the coordinate C1 of the touched position is supplied from the input specifying unit 41, the extraction area specifying unit 42 identifies the touch area M1, a square area with sides of length L centered on the coordinate C1.
- In the case of a single tap, the touch area M1 itself becomes the total touch area.
- The length L of one side may be set as appropriate; for example, it may be about 1.0 cm, which is roughly the size of the fingertip of an average adult male's index finger. When the user selects a character string using the index finger as the indicator S, it is assumed that the characters being selected are about the size of the fingertip. The length L of one side may also be changed according to the type of input operation or the strength of the press.
- FIG. 3B is a diagram showing identification of the total touch area when the user performs a drag or flick operation on the touch panel 2. That is, in the example of FIG. 3B, the user brings the indicator S into contact with the touch panel 2, moves it while maintaining contact, and then lifts it from the touch panel 2.
- In this case, when the coordinates of the positions of the touch panel 2 touched by the indicator S, that is, the coordinates C1, C2, C3, and C4 in the example of FIG. 3B, are supplied from the input specifying unit 41, the extraction area specifying unit 42 specifies a touch area, a square area with sides of length L centered on each coordinate. That is, in the example of FIG. 3B, the touch area M1, the touch area M2, the touch area M3, and the touch area M4 are specified.
- Next, the extraction area specifying unit 42 determines the minimum and maximum X coordinates and the minimum and maximum Y coordinates among the vertex coordinates of the specified touch areas.
- The extraction area specifying unit 42 then specifies the rectangular area (total touch area 11) whose diagonal is the line segment connecting the coordinates (x_min, y_min), formed by the minimum X and Y values, and the coordinates (x_max, y_max), formed by the maximum X and Y values. That is, the coordinates (x_min, y_min) and (x_max, y_max) are vertices of the total touch area 11.
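- The construction of the total touch area 11 described above (a square of side L around each touched coordinate, then the bounding rectangle of those squares) can be sketched as follows. This is a minimal Python illustration rather than the patent's code; the default `side_l` value is a hypothetical pixel stand-in for the roughly 1.0 cm side length L.

```python
from dataclasses import dataclass
from typing import Iterable, Tuple

@dataclass
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def total_touch_area(touch_coords: Iterable[Tuple[float, float]], side_l: float = 40.0) -> Rect:
    """Bounding rectangle of the squares of side L centred on each touched coordinate."""
    pts = list(touch_coords)
    half = side_l / 2.0
    xs = [x for x, _ in pts]
    ys = [y for _, y in pts]
    return Rect(min(xs) - half, min(ys) - half, max(xs) + half, max(ys) + half)

# Single tap: one coordinate C1, so the touch area M1 itself is the total touch area 11.
# Drag or flick: coordinates C1..C4 give touch areas M1..M4, whose bounding box is the total touch area 11.
area = total_touch_area([(120.0, 300.0), (150.0, 305.0), (185.0, 310.0), (220.0, 312.0)])
```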
- Next, the extraction area specifying unit 42 specifies the character recognition area 12 (enlarged area). Specifically, as illustrated in FIG. 4, the extraction area specifying unit 42 enlarges the total touch area 11 based on a predetermined condition and specifies the enlarged area as the character recognition area 12.
- An example of the predetermined condition is described below. The example shown applies when the character string included in the character string image 10 is written horizontally.
- First, the extraction area specifying unit 42 specifies the coordinate X_min at a position 1L toward the negative X direction from the side connecting the two vertices of the total touch area 11 whose X coordinate is x_min (the left side of the total touch area 11).
- Similarly, it specifies the coordinate X_max at a position 1L toward the positive X direction from the side connecting the two vertices whose X coordinate is x_max (the right side of the total touch area 11).
- It specifies the coordinate Y_min at a position 2L toward the negative Y direction from the side connecting the two vertices of the total touch area 11 whose Y coordinate is y_min (the upper side of the total touch area 11).
- It specifies the coordinate Y_max at a position 2L toward the positive Y direction from the side connecting the two vertices of the total touch area 11 whose Y coordinate is y_max (the lower side of the total touch area 11).
- The extraction area specifying unit 42 then specifies the character recognition area 12, the rectangular area whose diagonal is the line segment connecting the coordinates (X_min, Y_min) and (X_max, Y_max). That is, the coordinates (X_min, Y_min) and (X_max, Y_max) are vertices of the character recognition area 12.
- In other words, the extraction area specifying unit 42 specifies, as the character recognition area 12, the area obtained by expanding the total touch area 11 by 2L in the vertical direction and by 1L in the horizontal direction.
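- Under the horizontal-writing condition just described (2L added above and below, 1L added on the left and right), the enlargement can be sketched as follows, reusing the `Rect` helper from the sketch above. As the text notes, this particular condition is only one example and could be replaced by another enlargement rule.

```python
def character_recognition_area(total_touch: Rect, side_l: float = 40.0) -> Rect:
    """Enlarge the total touch area 11 by 1L on the left and right and 2L above and below (horizontal writing)."""
    return Rect(total_touch.x_min - side_l,
                total_touch.y_min - 2 * side_l,
                total_touch.x_max + side_l,
                total_touch.y_max + 2 * side_l)
```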
- When the character string image 10 is an image obtained by photographing a newspaper or a magazine, large characters are often about two to three times the size of small characters. That is, when the character string is written horizontally and the user has changed the magnification of the character string image 10 so that the small characters match the size of the fingertip, an input operation may then be performed on a large character without changing the magnification. Even in that case, the vertical width must be set so that the entire large character is included in the character recognition area.
- For this reason, the total touch area 11 is expanded by 2L in the vertical direction.
- In addition, the indicator may come into contact with characters adjacent to the word that the user intends to select.
- Therefore, the extraction area specifying unit 42 expands the width by 1L on each of the left and right sides so that the selected characters are recognized accurately.
- Note that the predetermined condition, that is, the way in which the total touch area 11 is enlarged, is not limited to the example described above.
- The extraction area specifying unit 42 supplies the coordinate information of the specified character recognition area 12 to the character recognition unit 43, and supplies the coordinate information of the specified total touch area 11 to the selected character string specifying unit 44.
- The character recognition unit 43 performs character recognition processing on an image containing a character string. Specifically, when the coordinate information of the character recognition area 12 is supplied, the character recognition unit 43 extracts, in line units, the character strings within the character recognition area 12 of the character string image 10 supplied from the image acquisition unit 47. Since an existing optical character recognition technique can be used to extract the character strings, a detailed description is omitted here.
- The character recognition unit 43 supplies each extracted line-unit character string (hereinafter, "extracted line") and the coordinate information of the minimum rectangular area containing the entire extracted line (hereinafter, "extracted line area") to the selected character string specifying unit 44.
- The selected character string specifying unit 44 specifies the character string for which the degree of overlap between the total touch area 11 and the area set for the character string included in the character recognition area 12, obtained by the character recognition of the character recognition unit 43, exceeds a predetermined amount. The identification of this character string is described with reference to FIG. 5. First, when the extracted lines and the coordinate information of the extracted line areas are supplied from the character recognition unit 43, the selected character string specifying unit 44 determines whether the extracted lines are written vertically or horizontally. In the example illustrated in FIG. 5, the extracted lines are determined to be written horizontally.
- Next, from the extracted lines extracted from the character recognition area 12, the selected character string specifying unit 44 specifies the character string specifying lines, that is, the extracted lines whose line area overlaps the specific area by more than a first predetermined amount and whose contained character strings are therefore to be specified.
- Specifically, the selected character string specifying unit 44 first specifies, as the determination line, the extracted line whose predetermined coordinate of the extracted line area (for example, the coordinate of the upper left vertex of the extracted line area) has the smallest Y coordinate among the extracted lines.
- That is, among the extracted lines shown in FIG. 5, the selected character string specifying unit 44 specifies the extracted line having the extracted line area 13 (the topmost extracted line) as the determination line. The selected character string specifying unit 44 then compares the coordinate information of the extracted line area of the determination line with the coordinate information of the total touch area 11 supplied from the extraction area specifying unit 42, and determines whether the extracted line area and the total touch area 11 overlap. As shown in FIG. 5A, the extracted line area 13 and the total touch area 11 do not overlap. In this case, the selected character string specifying unit 44 does not specify the extracted line having the extracted line area 13 as a character string specifying line.
- Next, the selected character string specifying unit 44 specifies, as the determination line, the extracted line having the extracted line area 14, which is the extracted line with the next smallest Y coordinate of the predetermined coordinate. It then determines whether the extracted line area 14 and the total touch area 11 overlap. As shown in FIG. 5A, the extracted line area 14 and the total touch area 11 overlap. In this case, the selected character string specifying unit 44 determines whether either the upper side or the lower side of the total touch area 11 is included in the extracted line area 14; in other words, it determines whether one of the two sides of the boundary of the total touch area 11 that are parallel to the character travel direction overlaps the extracted line area 14.
- Specifically, the selected character string specifying unit 44 determines whether the minimum Y coordinate or the maximum Y coordinate of the total touch area 11 falls within the Y-coordinate range defined by the minimum and maximum Y coordinates of the extracted line area. In the case of FIG. 5A, the minimum Y coordinate of the total touch area 11 falls within the Y-coordinate range of the extracted line area 14. In this case, the selected character string specifying unit 44 specifies the line overlap area, which is the area where the extracted line area and the total touch area 11 overlap.
- It then determines whether the value obtained by subtracting the minimum Y coordinate from the maximum Y coordinate of the line overlap area (that is, the height of the line overlap area) is at least a predetermined ratio (for example, at least half) of the value obtained by subtracting the minimum Y coordinate from the maximum Y coordinate of the extracted line area 14 (that is, the height of the extracted line area 14). As shown in FIG. 5A, the height of the line overlap area is at least half the height of the extracted line area 14. In this case, the selected character string specifying unit 44 specifies the extracted line having the extracted line area 14 as a character string specifying line.
- On the other hand, the extracted line area 15 also overlaps the total touch area 11, and the maximum Y coordinate of the total touch area 11 falls within its Y-coordinate range; however, the height of the corresponding line overlap area is less than half the height of the extracted line area 15.
- Therefore, the selected character string specifying unit 44 does not specify the extracted line having the extracted line area 15 as a character string specifying line.
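- The line-level decision described above (overlap test, edge test, and the "at least half the line height" test) can be sketched as follows, reusing the `Rect` type from the earlier sketch. The one-half ratio is exposed as a parameter because the text presents it only as an example; this is an illustration of the logic, not the patent's implementation.

```python
from typing import Optional

def intersection(a: Rect, b: Rect) -> Optional[Rect]:
    """Overlapping region of two rectangles, or None if they do not overlap."""
    x_min, y_min = max(a.x_min, b.x_min), max(a.y_min, b.y_min)
    x_max, y_max = min(a.x_max, b.x_max), min(a.y_max, b.y_max)
    if x_min >= x_max or y_min >= y_max:
        return None
    return Rect(x_min, y_min, x_max, y_max)

def is_specifying_line(line_area: Rect, touch_area: Rect, ratio: float = 0.5) -> bool:
    """Is a horizontally written extracted line a character string specifying line?"""
    lap = intersection(line_area, touch_area)
    if lap is None:                      # S12: no overlap with the total touch area 11
        return False
    top_inside = line_area.y_min <= touch_area.y_min <= line_area.y_max
    bottom_inside = line_area.y_min <= touch_area.y_max <= line_area.y_max
    if top_inside ^ bottom_inside:       # S13: exactly one horizontal edge of the touch area lies in the line area
        line_height = line_area.y_max - line_area.y_min
        return (lap.y_max - lap.y_min) >= ratio * line_height   # S14: overlap height vs. line height
    return True                          # both or neither edge inside: the line is specified directly
```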
- When the character string specifying line has been specified, the selected character string specifying unit 44 specifies, from the characters included in the character string specifying line, the characters whose character area overlaps the specific area by more than a second predetermined amount, that is, the selected characters chosen by the input operation.
- Specifically, for each character included in the character string specifying line (the extracted line having the extracted line area 14 in the case of FIG. 5), the selected character string specifying unit 44 first specifies the character area, which is the minimum rectangular area containing the entire character.
- Then the selected character string specifying unit 44 specifies, as the determination character, the character whose predetermined coordinate of the character area (for example, the coordinate of the upper left vertex of the character area) has the smallest X coordinate among the characters included in the character string specifying line. That is, among the characters shown in FIG. 5B, it specifies the character having the character area 31 (the leftmost character in the character string specifying line). Next, the selected character string specifying unit 44 compares the coordinate information of the character area of the determination character with the coordinate information of the total touch area 11 and determines whether the character area and the total touch area 11 overlap. As shown in FIG. 5B, the character area 31 and the total touch area 11 overlap.
- In this case, the selected character string specifying unit 44 determines whether either the left side or the right side of the total touch area 11 is included in the character area; in other words, it determines whether one of the two sides of the boundary of the total touch area 11 that are perpendicular to the character travel direction overlaps the character area. Specifically, it determines whether the minimum X coordinate or the maximum X coordinate of the total touch area 11 falls within the X-coordinate range defined by the minimum and maximum X coordinates of the character area. In the case of FIG. 5B, the minimum X coordinate of the total touch area 11 falls within the X-coordinate range of the character area 31.
- In this case, the selected character string specifying unit 44 specifies the character overlap area, which is the area where the character area and the total touch area 11 overlap. It then determines whether the value obtained by subtracting the minimum X coordinate from the maximum X coordinate of the character overlap area (that is, the width of the character overlap area) is at least a predetermined ratio (for example, at least half) of the value obtained by subtracting the minimum X coordinate from the maximum X coordinate of the character area 31 (that is, the width of the character area 31). As shown in FIG. 5B, the width of the character overlap area is at least half the width of the character area 31.
- In this case, the selected character string specifying unit 44 specifies the character having the character area 31 as a selected character, and stores the selected character and its coordinate information in the selected character storage unit 51 described later.
- Next, the selected character string specifying unit 44 specifies, as the determination character, the character having the character area 32, whose predetermined coordinate has the next smallest X coordinate, and performs the same processing as in the example above. For the character having the character area 32, the character area 32 and the total touch area 11 overlap, but neither the left side nor the right side of the total touch area 11 is included in the character area 32. In this case, the selected character string specifying unit 44 specifies the character having the character area 32 as a selected character and stores it with its coordinate information in the selected character storage unit 51. The character having the character area 33 is likewise specified as a selected character.
- Similarly, the selected character string specifying unit 44 specifies the character having the character area 34 as a selected character and stores the selected character and the coordinate information of its character area in the selected character storage unit 51.
- On the other hand, the selected character string specifying unit 44 does not specify the character having the character area 35 as a selected character.
- When the selected character string specifying unit 44 has specified the selected characters in all the character string specifying lines, it reads out the selected characters and their coordinate information stored in the selected character storage unit 51, arranges the selected characters based on the coordinate information of their character areas, and thereby specifies the character string. The specified character string is then supplied to the process execution unit 45.
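- The character-level decision and the assembly of the selected character string can be sketched in the same way, reusing `Rect` and `intersection` from the sketches above. The `ExtractedChar` structure is introduced only for this example, and the one-half ratio is again just the example value from the text.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ExtractedChar:
    text: str
    area: Rect   # minimum rectangle containing the character (the character area)

def is_selected_character(char_area: Rect, touch_area: Rect, ratio: float = 0.5) -> bool:
    """Is a character in a horizontally written character string specifying line a selected character?"""
    lap = intersection(char_area, touch_area)
    if lap is None:                      # S22: no overlap with the total touch area 11
        return False
    left_inside = char_area.x_min <= touch_area.x_min <= char_area.x_max
    right_inside = char_area.x_min <= touch_area.x_max <= char_area.x_max
    if left_inside ^ right_inside:       # S23: exactly one vertical edge of the touch area lies in the character area
        char_width = char_area.x_max - char_area.x_min
        return (lap.x_max - lap.x_min) >= ratio * char_width    # S24: overlap width vs. character width
    return True                          # both or neither edge inside: the character is selected directly

def assemble_selected_string(chars: List[ExtractedChar], touch_area: Rect) -> str:
    """Arrange the selected characters by their coordinates (left to right) to form the character string."""
    picked = [c for c in chars if is_selected_character(c.area, touch_area)]
    picked.sort(key=lambda c: c.area.x_min)
    return "".join(c.text for c in picked)
```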
- The process execution unit 45 executes a predetermined process based on the specified character string. Specifically, when the character string is supplied from the selected character string specifying unit 44, the process execution unit 45 acquires information related to the character string via the communication unit 6 described later. Examples of the acquired information include a search result obtained by searching for the character string with a search engine and images or videos related to the character string. Using the acquired information, the process execution unit 45 creates a link image containing a link for accessing the acquired information. For example, an image indicating the search result for the character string, a thumbnail image of a video related to the character string, or a reduced image of an image related to the character string is created as the link image.
- What is created using the information acquired by the process execution unit 45 is not limited to a link image; it may, for example, be a character string containing a link. The process execution unit 45 supplies the created link image to the display control unit 46.
- The process executed by the process execution unit 45 is not limited to the example described above. That is, it may be any process executed based on the supplied character string; for example, it may be a process of translating the supplied character string into English.
- The display control unit 46 determines the image to be displayed on the display unit 22. Specifically, when the character string image 10 is supplied from the image acquisition unit 47, the display control unit 46 displays the supplied character string image 10 on the display unit 22. Further, when the display control unit 46 acquires a link image from the process execution unit 45, it superimposes the link image on the character string image 10 and displays it on the display unit 22.
- The image acquisition unit 47 operates the photographing unit 3 to acquire the character string image 10. Specifically, when a shooting instruction is supplied from the input specifying unit 41, the image acquisition unit 47 causes the photographing unit 3 to capture an image. It then acquires the captured image (character string image 10) and supplies it to the character recognition unit 43 and the display control unit 46.
- Although the image acquisition unit 47 in this embodiment is configured to acquire the image captured by the photographing unit 3, the present invention is not limited to this configuration.
- For example, the image acquisition unit 47 may read a character string image 10 stored in the storage unit 5 in response to an instruction from the input specifying unit 41, or may acquire the character string image 10 via the communication unit 6. The image to be acquired may also be a moving image, or a so-called through image displayed on the display unit 22 for shooting with the photographing unit 3.
- The storage unit 5 is a storage device that stores various data used by the smartphone 1. As shown in FIG. 1, the storage unit 5 includes a selected character storage unit 51. The selected character storage unit 51 temporarily stores each selected character identified by the selected character string specifying unit 44 in association with the coordinate information of that selected character.
- The communication unit 6 is a communication device for transmitting and receiving information between the smartphone 1 and external devices. Specifically, the communication unit 6 is controlled by the process execution unit 45 and acquires, from the outside, information related to the character string specified by the selected character string specifying unit 44.
- The flow of processing after the character string image 10 is displayed on the display unit 22 is described with reference to FIG. 6.
- When the input specifying unit 41 identifies an input operation input to the input unit 21 (YES in S1), it supplies the coordinates of the identified input operation, that is, the coordinates of the positions of the touch panel 2 touched by the indicator S, to the extraction area specifying unit 42.
- The extraction area specifying unit 42 then specifies the total touch area 11 (S2). Since a specific example has been described above, it is omitted here.
- Next, the extraction area specifying unit 42 specifies the character recognition area 12 (S3). Specifically, the extraction area specifying unit 42 enlarges the total touch area 11 based on the predetermined condition and specifies the enlarged area as the character recognition area 12.
- The extraction area specifying unit 42 supplies the coordinate information of the specified character recognition area 12 to the character recognition unit 43, and supplies the coordinate information of the specified total touch area 11 to the selected character string specifying unit 44. Subsequently, the character recognition unit 43 executes the character recognition process for the character recognition area 12 (S4). Specifically, when the coordinate information of the character recognition area 12 is supplied, the character recognition unit 43 extracts, in line units, the character strings within the character recognition area 12 of the character string image 10 supplied from the image acquisition unit 47 (extracts the extracted lines). The character recognition unit 43 then supplies each extracted line and the coordinate information of its extracted line area, the minimum rectangular area containing the entire extracted line, to the selected character string specifying unit 44.
- Next, the selected character string specifying unit 44 determines whether the supplied extracted lines are written horizontally (S5). The processing flow in the case of vertical writing (NO in S5) is described later. In the case of horizontal writing (YES in S5), the selected character string specifying unit 44 selects the topmost line in the character recognition area 12 as the determination line (S11). Subsequently, the selected character string specifying unit 44 determines whether the determination line area and the total touch area 11 overlap (S12). Here, the determination line area is the extracted line area of the extracted line specified as the determination line.
- If they do not overlap (NO in S12), the selected character string specifying unit 44 determines whether the current determination line is the last line in the character recognition area 12 (S16). Specifically, it determines whether the current determination line is the extracted line whose predetermined coordinate of the extracted line area (for example, the coordinate of the upper left vertex of the extracted line area) has the largest Y coordinate among the extracted lines. If the current determination line is not the last line (NO in S16), the selected character string specifying unit 44 sets the line immediately below the current determination line as the new determination line (S17).
- Specifically, the selected character string specifying unit 44 sets, as the next determination line, the extracted line whose predetermined coordinate of the extracted line area has the next larger Y coordinate than that of the current determination line, and returns to the processing of step S12.
- If they do overlap (YES in S12), the selected character string specifying unit 44 determines whether only one of the upper side and the lower side of the total touch area 11 lies within the determination line area (S13). A specific example of this determination has been described above, so it is omitted here. If only one of the upper and lower sides is within the determination line area (YES in S13), the selected character string specifying unit 44 determines whether the height of the line overlap area is at least half the height of the determination line area (S14). A specific example of this determination has also been described above. If it is less than half (NO in S14), the selected character string specifying unit 44 determines whether the current determination line is the last line in the character recognition area 12 (S16). The subsequent processing has already been described, so it is omitted here.
- If the height is at least half (YES in S14), the selected character string specifying unit 44 specifies the determination line as a character string specifying line and executes the character determination process (S15). If it is not the case that only one of the upper side or the lower side of the total touch area 11 is within the determination line area (NO in S13), the selected character string specifying unit 44 skips step S14, specifies the determination line as a character string specifying line, and executes the character determination process (S15).
- In the character determination process, the selected character string specifying unit 44 first specifies the coordinate information of the character area for each character included in the character string specifying line. Subsequently, the selected character string specifying unit 44 selects the leftmost character in the character string specifying line as the determination character (S21). It then determines whether the determination character area and the total touch area 11 overlap (S22). Here, the determination character area is the character area of the character specified as the determination character. If they do not overlap (NO in S22), the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S26).
- Specifically, the selected character string specifying unit 44 determines whether the current determination character is the character whose predetermined coordinate of the character area (for example, the coordinate of the upper left vertex of the character area) has the largest X coordinate among the characters in the character string specifying line. If the current determination character is not the last character (NO in S26), the selected character string specifying unit 44 sets the character to the right of the current determination character as the new determination character (S27). Specifically, it sets, as the next determination character, the character whose predetermined coordinate of the character area has the next larger X coordinate than that of the current determination character, and returns to the processing of step S22. The case where the current determination character is the last character is described later.
- If they do overlap (YES in S22), the selected character string specifying unit 44 determines whether only one of the left side and the right side of the total touch area 11 lies within the determination character area (S23). A specific example of this determination has been described above, so it is omitted here. If only one of the left and right sides is within the determination character area (YES in S23), the selected character string specifying unit 44 determines whether the width of the character overlap area is at least half the width of the determination character area (S24). A specific example of this determination has also been described above. If it is less than half (NO in S24), the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S26). The subsequent processing has already been described, so it is omitted here.
- If the width is at least half (YES in S24), the selected character string specifying unit 44 specifies the determination character as a selected character and stores the selected character in the selected character storage unit 51 (S25). If it is not the case that only one of the left side or the right side of the total touch area 11 is within the determination character area (NO in S23), the selected character string specifying unit 44 skips step S24, specifies the determination character as a selected character, and stores it in the selected character storage unit 51 (S25).
- After step S25, the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S26). When it determines that it is the last character (YES in S26), the selected character string specifying unit 44 ends the character determination process.
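- Putting the pieces together, the overall horizontal-writing flow (S11 to S27) can be outlined as below, reusing the helpers from the earlier sketches. The `ExtractedLine` structure is introduced only for this example; an actual implementation would obtain the lines, characters, and their rectangles from the character recognition unit.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ExtractedLine:
    area: Rect                    # extracted line area
    chars: List[ExtractedChar]    # characters of the extracted line with their character areas

def specify_selected_string(lines: List[ExtractedLine], touch_area: Rect) -> str:
    """Outline of S11-S27: scan lines top to bottom, then characters left to right."""
    selected: List[ExtractedChar] = []
    for line in sorted(lines, key=lambda l: l.area.y_min):          # S11/S17: topmost line first, then downward
        if not is_specifying_line(line.area, touch_area):           # S12-S14: line-level decision
            continue
        for ch in sorted(line.chars, key=lambda c: c.area.x_min):   # S21/S27: leftmost character first, then rightward
            if is_selected_character(ch.area, touch_area):          # S22-S24: character-level decision
                selected.append(ch)                                  # S25: store the selected character
    return "".join(c.text for c in selected)                         # arrange in reading order
```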
- In the case of vertical writing (NO in S5), the selected character string specifying unit 44 selects the rightmost line in the character recognition area 12 as the determination line (S31). Specifically, the selected character string specifying unit 44 specifies, as the determination line, the extracted line whose predetermined coordinate of the extracted line area (for example, the coordinate of the upper left vertex of the extracted line area) has the largest X coordinate among the extracted lines. Subsequently, the selected character string specifying unit 44 determines whether the determination line area and the total touch area 11 overlap (S32). If they do not overlap (NO in S32), the selected character string specifying unit 44 determines whether the current determination line is the last line in the character recognition area 12 (S36).
- Specifically, it determines whether the current determination line is the extracted line whose predetermined coordinate of the extracted line area (for example, the coordinate of the upper left vertex of the extracted line area) has the smallest X coordinate among the extracted lines. If the current determination line is not the last line (NO in S36), the selected character string specifying unit 44 sets the line one to the left of the current determination line as the new determination line (S37). Specifically, it sets, as the next determination line, the extracted line whose predetermined coordinate of the extracted line area has the next smaller X coordinate than that of the current determination line, and returns to the processing of step S32.
- If they do overlap (YES in S32), the selected character string specifying unit 44 determines whether only one of the left side and the right side of the total touch area 11 lies within the determination line area (S33). Specifically, it determines whether the minimum X coordinate or the maximum X coordinate of the total touch area 11 falls within the X-coordinate range defined by the minimum and maximum X coordinates of the determination line area.
- If only one of the left and right sides is within the determination line area (YES in S33), the selected character string specifying unit 44 determines whether the width of the line overlap area is at least half the width of the determination line area (S34). Specifically, it determines whether the value obtained by subtracting the minimum X coordinate from the maximum X coordinate of the line overlap area (that is, the width of the line overlap area) is at least half the value obtained by subtracting the minimum X coordinate from the maximum X coordinate of the determination line area (that is, the width of the determination line area).
- If it is less than half (NO in S34), the selected character string specifying unit 44 determines whether the current determination line is the last line in the character recognition area 12 (S36). The subsequent processing has already been described, so it is omitted here. Note that "half or more" is only an example, and the determination threshold is not limited to this example.
- If the width is at least half (YES in S34), the selected character string specifying unit 44 specifies the determination line as a character string specifying line and executes the character determination process (S35).
- If it is not the case that only one of the left side or the right side of the total touch area 11 is within the determination line area (NO in S33), the selected character string specifying unit 44 skips step S34, specifies the determination line as a character string specifying line, and executes the character determination process (S35).
- In the character determination process for vertical writing, the selected character string specifying unit 44 first specifies the coordinate information of the character area for each character included in the character string specifying line. Subsequently, the selected character string specifying unit 44 selects the topmost character in the character string specifying line as the determination character (S41). Specifically, the selected character string specifying unit 44 specifies, as the determination character, the character whose predetermined coordinate of the character area (for example, the coordinate of the upper left vertex of the character area) has the smallest Y coordinate among the characters included in the character string specifying line. Subsequently, the selected character string specifying unit 44 determines whether the determination character area and the total touch area 11 overlap (S42).
- If they do not overlap (NO in S42), the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S46). Specifically, it determines whether the current determination character is the character whose predetermined coordinate of the character area (for example, the coordinate of the upper left vertex of the character area) has the largest Y coordinate among the characters in the character string specifying line. If the current determination character is not the last character (NO in S46), the selected character string specifying unit 44 sets the character immediately below the current determination character as the new determination character (S47).
- Specifically, it sets, as the next determination character, the character whose predetermined coordinate of the character area has the next larger Y coordinate than that of the current determination character, and returns to the processing of step S42. The case where the current determination character is the last character is described later.
- If they do overlap (YES in S42), the selected character string specifying unit 44 determines whether only one of the upper side and the lower side of the total touch area 11 lies within the determination character area (S43). Specifically, it determines whether the minimum Y coordinate or the maximum Y coordinate of the total touch area 11 falls within the Y-coordinate range defined by the minimum and maximum Y coordinates of the determination character area.
- If only one of the upper and lower sides is within the determination character area (YES in S43), the selected character string specifying unit 44 determines whether the height of the character overlap area is at least half the height of the determination character area (S44). Specifically, it determines whether the value obtained by subtracting the minimum Y coordinate from the maximum Y coordinate of the character overlap area (that is, the height of the character overlap area) is at least half the value obtained by subtracting the minimum Y coordinate from the maximum Y coordinate of the determination character area (that is, the height of the determination character area).
- If it is less than half (NO in S44), the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S46). The subsequent processing has already been described, so it is omitted here. Note that "half or more" is only an example, and the determination threshold is not limited to this example.
- If the height is at least half (YES in S44), the selected character string specifying unit 44 specifies the determination character as a selected character and stores the selected character in the selected character storage unit 51 (S45). If it is not the case that only one of the upper side or the lower side of the total touch area 11 is within the determination character area (NO in S43), the selected character string specifying unit 44 skips step S44, specifies the determination character as a selected character, and stores it in the selected character storage unit 51 (S45).
- After step S45, the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S46). When it determines that it is the last character (YES in S46), the selected character string specifying unit 44 ends the character determination process.
- The control blocks of the smartphone 1 may be realized by logic circuits (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
- In the latter case, the smartphone 1 includes a CPU that executes the instructions of the program, which is software realizing each function, a ROM (Read Only Memory) or storage device (these are referred to as the "recording medium") on which the program and various data are recorded so as to be readable by a computer (or CPU), a RAM (Random Access Memory) into which the program is expanded, and the like. The object of the present invention is achieved when the computer (or CPU) reads the program from the recording medium and executes it.
- As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
- The program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) capable of transmitting the program.
- Note that the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
- A character specifying device (smartphone 1) according to Aspect 1 of the present invention is a character specifying device for specifying a character included in a display image, and includes: a character recognition execution unit (character recognition unit 43) that performs character recognition on an enlarged region obtained by enlarging a specific region specified in the display image by a user operation on the display image; and a selected character specifying unit (selected character string specifying unit 44) that specifies, as the character selected by the user, a character whose degree of overlap between the specific region and the region set for the character included in the enlarged region obtained by the character recognition of the character recognition execution unit exceeds a predetermined amount.
- According to the above configuration, character recognition is performed not on the entire image but on the enlarged area obtained by enlarging the specific area.
- Therefore, the time required for the character recognition processing can be shortened.
- Moreover, since character recognition is performed on the enlarged region, at least a region that completely contains the character the user wants to select can be made the target of the character recognition processing. Therefore, the character the user wants to select can be recognized in a complete state.
- In addition, a character whose degree of overlap between the region set for the character contained in the enlarged area and the specific area exceeds the predetermined amount is specified as the character selected by the user. Thereby, the selected character can be specified with high accuracy. From the above, it is possible to specify only the target character with high accuracy in a short time.
- In the character specifying device according to Aspect 2 of the present invention, in Aspect 1 above, the selected character specifying unit may specify the characters included in a line area, set for each line formed by the characters included in the enlarged region obtained by the character recognition of the character recognition execution unit, when the degree of overlap between that line area, in which the line exists, and the specific area exceeds a first predetermined amount.
- According to the above configuration, when the degree of overlap between a line area and the specific area exceeds the first predetermined amount, the characters included in that line area are specified. That is, among the lines included in the enlarged area, the lines on which the character specifying process is performed can be narrowed down. Therefore, since characters need not be specified for all the lines included in the enlarged region, the characters can be specified in a short time.
- In the character specifying device according to Aspect 3 of the present invention, in Aspect 2 above, the specific area and the line area are rectangular, and the selected character specifying unit may specify the characters included in the line area when (1) the line area and the specific area overlap and both of the two sides of the boundary of the specific area that are parallel to the character travel direction overlap the line area, or (2) the line area and the specific area overlap and neither of the two sides of the boundary of the specific area that are parallel to the character travel direction overlaps the line area.
- According to the above configuration, the characters included in such a line area are targeted for specification.
- A line with which most of the specific area overlaps is likely to be the line specified by the operation. That is, since a line that is highly likely to be the line specified by the operation is determined to be the line containing the characters to be specified, the characters can be specified with high accuracy.
- the selected character identification unit is configured such that, of the boundary lines of the specific area, only one of the two sides parallel to the traveling direction of the character is the above.
- the length of the side perpendicular to the direction of movement of the character in the line overlap area which is an area where the line area overlaps with the specific area, is the progression of the character in the line area.
- the length is a predetermined ratio or more with respect to the length of the side perpendicular to the direction, the character included in the line area may be specified.
- In this way, a line with a high degree of overlap is targeted for specification.
- Such a line is likely to be the line designated by the operation, so the characters included in a line that is highly likely to have been designated become the specification targets and can be specified with high accuracy.
- In the character identification device according to any one of aspects 2 to 4, for each character included in a line area whose degree of overlap with the specific area exceeds the first predetermined amount, the selected character identification unit may specify that character when the degree of overlap between the character region (the region in which the character exists) and the specific region exceeds a second predetermined amount.
- In that case, a character is specified only when its overlap with the specific area exceeds the second predetermined amount; that is, whether each character is a selected character is determined character by character, so the characters can be specified with high accuracy.
- The character area may be rectangular, and the selected character specifying unit may treat the character for which the character area is set as a character to be specified when (1) the character area and the specific area overlap and both of the two boundary sides of the specific area perpendicular to the character traveling direction overlap the character area, or (2) the character area and the specific area overlap and neither of those two sides overlaps the character area.
- In either of these cases, the character for which the character area is set becomes a character to be specified.
- A character whose character area mostly overlaps the specific area is likely to be a character the user attempted to select.
- Since a character that is highly likely to have been selected by the user becomes the character to be specified, the character can be specified with high accuracy.
- The selected character specifying unit may also treat the character for which the character area is set as a character to be specified when, of the boundary lines of the specific region, only one of the two sides perpendicular to the character traveling direction overlaps the character area and the length of the side parallel to the character traveling direction in the character overlap area (the area where the character area and the specific area overlap) is at least a predetermined ratio of the length of the side parallel to the character traveling direction in the character area.
- In this way, a character with a large degree of overlap is specified.
- Such a character is likely to be a character the user attempted to select.
- Since a character that is highly likely to have been selected by the user becomes the character to be specified, the character can be specified with high accuracy.
- The character identification device may be realized by a computer.
- In that case, a control program that realizes the character identification device on a computer by causing the computer to operate as each unit (software element) included in the character identification device, and a computer-readable recording medium on which the control program is recorded, also fall within the scope of the present invention.
- The present invention can be used in a character identification device for identifying characters included in a display image, and is suitable for smartphones, tablet terminals, digital cameras, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Character Input (AREA)
Abstract
In order to specify only the target characters quickly and accurately, this character specifying device is provided with: a character recognition unit (43) that performs character recognition on an expanded region obtained by expanding a specified region designated by an operation; and a selected character string specification unit (44) that specifies characters for which the degree of overlap between the specified region and the region set for each character included in the expanded region obtained by the character recognition exceeds a prescribed amount.
Description
The present invention relates to a character identification device for identifying a character included in a display image.
In recent years, techniques have been known that recognize characters (character strings) included in an image and execute processing using the recognized characters. Examples of such techniques include acquiring information about the characters and superimposing an image generated from the acquired information on the image containing the characters, and translating the extracted characters and outputting the translation result. With these techniques, it is also possible to execute such processing only for a character selected by the user from among the characters included in the image.
However, when the entire image is targeted for character recognition, these techniques have the problem that, if the image contains many characters, it takes time to recognize and specify the target character (for example, a character string selected by the user). The following techniques have been proposed as solutions to this problem. Patent Document 1 listed below describes a technique for recognizing only the character string in a region selected by the user with a one-action operation on a touch panel. Patent Document 2 listed below describes a technique for extracting only character string lines from the entire screen in advance and recognizing the character string line closest to the coordinates the user touched on the touch panel. Patent Document 3 listed below describes a technique for cutting out a character string from the image displayed in the vicinity of the coordinates the user touched on the touch panel.
However, the techniques of Patent Documents 1 to 3 have the following problems. With the technique of Patent Document 1, erroneous recognition may occur when the target character string does not fit within the selected range. With the technique of Patent Document 2, line extraction must be performed over the entire screen, so specifying a character string may take time. With the technique of Patent Document 3, character strings not intended by the user may be cut out together with the target, and extra steps may be required before the target character string is specified. In short, the techniques of Patent Documents 1 to 3 are not sufficient to specify only the characters intended by the user accurately and in a short time. Moreover, when a device such as a smartphone displays the image input from its camera for taking a picture (a so-called through image) and performs character recognition on that through image, the characters shown on the display change every time the user moves the device, so the problems of Patent Documents 1 to 3 become particularly pronounced.
The present invention has been made in view of the above problems, and an object of the present invention is to realize a character identification device or the like that can accurately identify only a target character in a short time.
In order to solve the above problem, a character identification device according to one aspect of the present invention is a character identification device for identifying a character included in a display image, and includes: a character recognition execution unit that performs character recognition on an enlarged region obtained by enlarging a specific region in the display image specified by a user operation on the display image; and a selected character specifying unit that specifies, as the character selected by the user, a character for which the degree of overlap between the specific region and the region set for that character in the enlarged region obtained by the character recognition exceeds a predetermined amount.
According to one aspect of the present invention, only the target character can be specified accurately and in a short time.
First, an outline of Embodiments 1 to 3 described below is given with reference to FIG. 2. As shown in (a) of FIG. 2, the smartphone 1 first specifies, from an image 10 (hereinafter referred to as the character string image 10), the total touch area 11, which is the area specified by an input operation. In FIG. 2, only a part of the character string image 10 is shown for convenience of explanation. Next, as shown in (b) of FIG. 2, the smartphone 1 specifies a character recognition area 12 obtained by enlarging the total touch area 11, and performs character recognition on this character recognition area 12. The target of character recognition may be a single character or a character string composed of a plurality of characters; in the following description a character string is used as the example. Subsequently, as shown in (c) of FIG. 2, the smartphone 1 specifies extracted line areas, which are the areas of the lines included in the character recognition area 12; in the example of (c) of FIG. 2, it specifies the extracted line area 13, the extracted line area 14, and the extracted line area 15. The smartphone 1 then specifies the selected character string according to the degree of overlap between each extracted line area and the total touch area 11. In the case of FIG. 2, the smartphone 1 specifies the selected character string 16, as shown in (d) of FIG. 2.
As described above, the smartphone 1 performs character recognition not on the entire image but on an enlarged area obtained by enlarging the specific area, so character recognition can be executed in a short time. In addition, because it specifies the character string for which the degree of overlap between the specific area and the area set for the character string in the enlarged area obtained by the character recognition exceeds a predetermined amount, the selected character string can be specified with high accuracy.
Embodiments 1 to 3 are described in detail below. In the following embodiments, the smartphone 1 is used as an example, but the present invention can be applied to any character identification device for identifying characters included in a display image.
[Embodiment 1]
An embodiment of the present invention is described below with reference to FIG. 1 and FIGS. 3 to 8. First, the main configuration of the smartphone 1 (character identification device) is described with reference to FIG. 1. The smartphone 1 includes at least a touch panel 2, a photographing unit 3, a control unit 4, a storage unit 5, and a communication unit 6.
The touch panel 2 includes an input unit 21 and a display unit 22. The input unit 21 is an input device that accepts input operations by the user. The touch panel 2 described in this embodiment is rectangular, and, as described later, an XY plane coordinate system whose origin is the upper left vertex of the coordinate detection area of the touch panel 2 is set on the touch panel 2. The input unit 21 is composed of a touch surface that accepts contact (including approach) of an indicator (such as a finger or a pen) and a touch sensor that detects whether the indicator is in contact with the touch surface and the contact position (coordinates). The touch sensor may be realized by any sensor capable of detecting whether the indicator is in contact with the touch surface. The input unit 21 outputs the detected coordinates and other information to the input specifying unit 41 described later. The display unit 22 is a display device, for example an LCD (liquid crystal display), that displays information processed by the smartphone 1 as an image in its display area. Specifically, the display unit 22 displays information processed by a display control unit 46 described later. An example of an image displayed on the display unit 22 is the character string image 10.
The photographing unit 3 is a photographing device for photographing an object, that is, a camera. An existing camera of the kind generally mounted on a smartphone can be used as the photographing unit 3. The photographing unit 3 is controlled by an image acquisition unit 47 described later, photographs an object, and outputs the captured still image or video to the image acquisition unit 47.
The control unit 4 performs overall control of each unit of the smartphone 1. The control unit 4 includes an input specifying unit 41, an extraction region specifying unit 42, a character recognition unit 43 (character recognition execution unit), a selected character string specifying unit 44 (selected character specifying unit), a process execution unit 45, a display control unit 46, and an image acquisition unit 47.
The input specifying unit 41 specifies the input operation accepted by the input unit 21. Specifically, the input specifying unit 41 identifies the input operation based on its coordinates, the time during which the indicator was in contact with the input surface, the moving direction of the indicator on the input surface, and the like. Types of input operation include, but are not limited to, single tap, flick, and drag. For example, while an image including a character string (the character string image 10) is displayed on the display unit 22, the input specifying unit 41 identifies a single tap, drag, or flick operation by the user on the character string in the character string image and supplies the coordinates of that operation to the extraction region specifying unit 42. The input specifying unit 41 also identifies an operation for performing shooting (for example, a single tap on a UI component displayed on the display unit 22) and supplies a shooting instruction to the image acquisition unit 47.
The extraction region specifying unit 42 specifies an enlarged area, which is an area obtained by enlarging the area specified by the input operation (the specific area). The specification of this enlarged area is described with reference to FIGS. 3 and 4. The origin of the coordinates shown in FIG. 3 corresponds to the upper left vertex of the coordinate detection area, which is the area in which the touch sensor of the input unit 21 can detect coordinates. In this embodiment the coordinate detection area is described as a rectangular area, but it is not limited to this example.
First, the extraction region specifying unit 42 specifies the total touch area 11 (specific area), which is the area selected by the input operation performed on the touch panel 2. FIG. 3(a) illustrates the specification of the total touch area when the user performs a single tap on the touch panel 2; that is, the user brings the indicator S (the user's finger) into contact with the touch panel 2 and then lifts it without moving it. When the coordinate C1 at which the indicator S made contact is supplied from the input specifying unit 41, the extraction region specifying unit 42 specifies a touch area M1, a square area of side length L centered on the coordinate C1. When the user lifts the indicator from the touch panel, the touch area M1 becomes the total touch area. The side length L may be set as appropriate; for example, it may be about 1.0 cm, which is the size of the tip of an average adult male's index finger, on the assumption that a user who selects a character string with the index finger as the indicator S does so with the fingertip roughly matching the character size. The side length L may also be changed according to the type of input operation, the strength of the press, and so on.
FIG. 3(b), on the other hand, illustrates the specification of the total touch area when the user performs a drag or flick operation on the touch panel 2; that is, the user brings the indicator S into contact with the touch panel 2, moves it, and then lifts it. When the coordinates of the positions on the touch panel 2 touched by the indicator S, namely the coordinates C1, C2, C3, and C4 in the example of FIG. 3(b), are supplied from the input specifying unit 41, the extraction region specifying unit 42 specifies a touch area, a square area of side length L, centered on each coordinate: in this example, the touch areas M1, M2, M3, and M4. The extraction region specifying unit 42 then determines the minimum and maximum X coordinates and the minimum and maximum Y coordinates among the vertex coordinates of the specified touch areas, and specifies the rectangular area (total touch area 11) whose diagonal is the line segment connecting the coordinate (x_min, y_min), formed from the minimum X and minimum Y coordinates, and the coordinate (x_max, y_max), formed from the maximum X and maximum Y coordinates. That is, (x_min, y_min) and (x_max, y_max) are vertices of the total touch area 11.
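As an illustration only, the following is a minimal Python sketch of the computation just described, assuming the contact coordinates have already been detected as (x, y) pairs in the touch panel's coordinate system and that the side length L is given in the same units; the names Rect, touch_square, and total_touch_area are illustrative and do not appear in the embodiment.

```python
from dataclasses import dataclass
from typing import Iterable, Tuple

@dataclass
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def touch_square(center: Tuple[float, float], side_l: float) -> Rect:
    """Square touch area of side L centered on one detected contact coordinate."""
    cx, cy = center
    half = side_l / 2.0
    return Rect(cx - half, cy - half, cx + half, cy + half)

def total_touch_area(centers: Iterable[Tuple[float, float]], side_l: float) -> Rect:
    """Total touch area 11: the rectangle whose diagonal joins (x_min, y_min)
    and (x_max, y_max), taken over the vertices of every touch square."""
    squares = [touch_square(c, side_l) for c in centers]
    return Rect(min(s.x_min for s in squares),
                min(s.y_min for s in squares),
                max(s.x_max for s in squares),
                max(s.y_max for s in squares))
```

For a single tap the list contains one coordinate, and the total touch area coincides with the touch area M1.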
Next, the extraction region specifying unit 42 specifies the character recognition area 12 (enlarged area). Specifically, as shown in FIG. 4, the extraction region specifying unit 42 enlarges the total touch area 11 based on a predetermined condition and specifies the enlarged area as the character recognition area 12. An example of the predetermined condition, for the case where the character string in the character string image 10 is written horizontally, is as follows. The extraction region specifying unit 42 determines the coordinate X_min at a distance of 1L in the negative X direction from the side of the total touch area 11 connecting its two vertices whose X coordinate is x_min (the left side of the total touch area 11), and the coordinate X_max at a distance of 1L in the positive X direction from the side connecting the two vertices whose X coordinate is x_max (the right side). It further determines the coordinate Y_min at a distance of 2L in the negative Y direction from the side connecting the two vertices whose Y coordinate is y_min (the upper side), and the coordinate Y_max at a distance of 2L in the positive Y direction from the side connecting the two vertices whose Y coordinate is y_max (the lower side). The extraction region specifying unit 42 then specifies the character recognition area 12 as the rectangular area whose diagonal is the line segment connecting the coordinates (X_min, Y_min) and (X_max, Y_max); that is, (X_min, Y_min) and (X_max, Y_max) are vertices of the character recognition area 12.
In other words, the extraction region specifying unit 42 specifies as the character recognition area 12 an area obtained by widening the total touch area 11 by 2L vertically and by 1L horizontally. When the character string image 10 is an image obtained by photographing a newspaper, magazine, or the like, large characters are often about two to three times the size of small characters. Thus, when the character string is horizontal and the user has changed the magnification of the character string image 10 so that small characters match the size of the fingertip, an input operation may later be performed on a large character without changing the magnification; the vertical margin therefore needs to be set so that the whole of such a large character is included in the character recognition area, which is why the total touch area 11 is widened by 2L vertically in the example above. Also, when the user wants to select a word in the character string image 10, the indicator may touch a character adjacent to that word. In that case, the adjacent character may not be entirely covered by the selection, and unless the left and right margins of the total touch area 11 are widened, the whole of that adjacent character may not become a target of character recognition. The extraction region specifying unit 42 therefore widens the area by 1L on the left and right so that the selected characters can be recognized accurately. The predetermined condition, that is, how the total touch area 11 is enlarged, is not limited to the example above. The extraction region specifying unit 42 supplies the coordinate information of the specified character recognition area 12 to the character recognition unit 43, and supplies the coordinate information of the specified total touch area 11 to the selected character string specifying unit 44.
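The expansion rule described above (1L to the left and right, 2L above and below, for horizontal writing) might be sketched as follows; the Rect type is the same illustrative one as in the previous sketch, and the Y axis is assumed to grow downward, matching the coordinate detection area.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def character_recognition_area(total_touch: Rect, side_l: float) -> Rect:
    """Character recognition area 12 for horizontal writing: widen the total
    touch area 11 by 1L horizontally and 2L vertically."""
    return Rect(total_touch.x_min - 1.0 * side_l,
                total_touch.y_min - 2.0 * side_l,
                total_touch.x_max + 1.0 * side_l,
                total_touch.y_max + 2.0 * side_l)
```

In practice the result would presumably also be clipped to the bounds of the character string image 10, although the embodiment does not state this explicitly.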
The character recognition unit 43 performs character recognition processing on an image containing a character string. Specifically, when supplied with the coordinate information of the character recognition area 12, the character recognition unit 43 extracts, line by line, the character strings within the character recognition area 12 of the character string image 10 supplied from the image acquisition unit 47. Since existing optical character recognition techniques can be used for extracting the character strings, a detailed description is omitted here. The character recognition unit 43 supplies the extracted line-by-line character strings (hereinafter referred to as extracted lines) and the coordinate information of the minimum rectangular areas each containing an entire extracted line (hereinafter referred to as extracted line areas) to the selected character string specifying unit 44.
The selected character string specifying unit 44 specifies the character strings, obtained by the character recognition of the character recognition unit 43, for which the degree of overlap between the area set for the character string included in the character recognition area 12 and the total touch area 11 exceeds a predetermined amount. This specification is described with reference to FIG. 5. First, when supplied with the extracted lines and the coordinate information of the extracted line areas from the character recognition unit 43, the selected character string specifying unit 44 determines whether the extracted lines are written vertically or horizontally; the example shown in FIG. 5 assumes they are determined to be horizontal. Next, from the extracted lines extracted from the character recognition area 12, the selected character string specifying unit 44 specifies the character string specifying lines, that is, the extracted lines whose line area overlaps the specific area by more than a first predetermined amount and whose characters are therefore targets for specification. Specifically, the selected character string specifying unit 44 first selects as the determination line the extracted line whose predetermined coordinate in its extracted line area (for example, the coordinate of the upper left vertex of the extracted line area) has the smallest Y coordinate; among the extracted lines shown in FIG. 5(a), this is the extracted line having the extracted line area 13 (the topmost extracted line). The selected character string specifying unit 44 then compares the coordinate information of the determination line's extracted line area with the coordinate information of the total touch area 11 supplied from the extraction region specifying unit 42 and determines whether the extracted line area and the total touch area 11 overlap. As shown in FIG. 5(a), the extracted line area 13 and the total touch area 11 do not overlap, so the selected character string specifying unit 44 does not specify the extracted line having the extracted line area 13 as a character string specifying line.
Next, the selected character string specifying unit 44 selects as the determination line the extracted line having the extracted line area 14, whose predetermined coordinate has the next smallest Y coordinate, and determines whether the extracted line area 14 and the total touch area 11 overlap. As shown in FIG. 5(a), they do overlap. In this case, the selected character string specifying unit 44 determines whether either the upper side or the lower side of the total touch area 11 is contained in the extracted line area 14; in other words, whether one of the two boundary sides of the total touch area 11 that are parallel to the character traveling direction overlaps the extracted line area 14. Specifically, it determines whether the minimum or maximum Y coordinate of the total touch area 11 falls within the Y coordinate range defined by the minimum and maximum Y coordinates of the extracted line area. In the case of FIG. 5(a), the minimum Y coordinate of the total touch area 11 falls within the Y coordinate range of the extracted line area 14. In this case, the selected character string specifying unit 44 identifies the line overlap area, the area where the extracted line area and the total touch area 11 overlap, and determines whether the value obtained by subtracting the minimum Y coordinate from the maximum Y coordinate of the line overlap area (that is, the height of the line overlap area) is at least a predetermined ratio (for example, at least half) of the value obtained by subtracting the minimum Y coordinate from the maximum Y coordinate of the extracted line area 14 (that is, the height of the extracted line area 14). As shown in FIG. 5(a), the height of the line overlap area is at least half the height of the extracted line area 14, so the selected character string specifying unit 44 specifies the extracted line having the extracted line area 14 as a character string specifying line. The extracted line area 15 also overlaps the total touch area 11, and the maximum Y coordinate of the total touch area 11 falls within its Y coordinate range; however, because the height of the line overlap area is less than half the height of the extracted line area, the selected character string specifying unit 44 does not specify the extracted line having the extracted line area 15 as a character string specifying line.
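The line-level decision just described can be summarized in the following sketch for horizontal writing; the ratio of one half stands in for the "first predetermined amount", and Rect is the same illustrative type as in the earlier sketches.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def is_string_specifying_line(line: Rect, touch: Rect, ratio: float = 0.5) -> bool:
    """Return True when the characters in this extracted line area become
    specification targets (horizontal writing)."""
    # No overlap at all: the line is not a character string specifying line.
    if (line.x_max < touch.x_min or touch.x_max < line.x_min or
            line.y_max < touch.y_min or touch.y_max < line.y_min):
        return False
    # Is exactly one of the total touch area's top/bottom edges (the sides
    # parallel to the character traveling direction) inside the line area?
    top_inside = line.y_min <= touch.y_min <= line.y_max
    bottom_inside = line.y_min <= touch.y_max <= line.y_max
    if top_inside != bottom_inside:
        # The height of the line overlap area must be at least `ratio`
        # of the height of the extracted line area.
        overlap_h = min(line.y_max, touch.y_max) - max(line.y_min, touch.y_min)
        return overlap_h >= ratio * (line.y_max - line.y_min)
    # Both edges inside, or neither edge inside while still overlapping:
    # the line is treated as a character string specifying line directly.
    return True
```

Under these assumptions the sketch reproduces the example of FIG. 5(a): the extracted line area 13 is rejected for lack of overlap, the extracted line area 14 passes the half-height test, and the extracted line area 15 fails it.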
Having specified a character string specifying line, the selected character string specifying unit 44 then specifies, from the characters included in that line, the characters whose character area overlaps the specific area by more than a second predetermined amount, that is, the selected characters chosen by the input operation. Specifically, the selected character string specifying unit 44 first determines, for each character included in the character string specifying line (the extracted line 14 in the example of FIG. 5), the coordinate information of the minimum rectangular area containing the entire character (hereinafter referred to as the character area).
Next, the selected character string specifying unit 44 selects as the determination character the character, among those included in the character string specifying line, whose predetermined coordinate in its character area (for example, the coordinate of the upper left vertex of the character area) has the smallest X coordinate; among the characters shown in FIG. 5(b), this is the character having the character area 31 (the leftmost character in the character string specifying line). The selected character string specifying unit 44 then compares the coordinate information of the determination character's character area with the coordinate information of the total touch area 11 and determines whether the character area and the total touch area 11 overlap. As shown in FIG. 5(b), the character area 31 and the total touch area 11 overlap. In this case, the selected character string specifying unit 44 determines whether either the left side or the right side of the total touch area 11 is contained in the character area; in other words, whether one of the two boundary sides of the total touch area 11 perpendicular to the character traveling direction overlaps the character area. Specifically, it determines whether the minimum or maximum X coordinate of the total touch area 11 falls within the X coordinate range defined by the minimum and maximum X coordinates of the character area. In the case of FIG. 5(b), the minimum X coordinate of the total touch area 11 falls within the X coordinate range of the character area 31. In this case, the selected character string specifying unit 44 identifies the character overlap area, the area where the character area and the total touch area 11 overlap, and determines whether the value obtained by subtracting the minimum X coordinate from the maximum X coordinate of the character overlap area (that is, the width of the character overlap area) is at least a predetermined ratio (for example, at least half) of the value obtained by subtracting the minimum X coordinate from the maximum X coordinate of the character area 31 (that is, the width of the character area 31). As shown in FIG. 5(b), the width of the character overlap area is at least half the width of the character area 31, so the selected character string specifying unit 44 specifies the character having the character area 31 as a selected character and stores the selected character and its coordinate information in the selected character storage unit 51 described later.
Next, the selected character string specifying unit 44 selects as the determination character the character having the character area 32, whose predetermined coordinate has the next smallest X coordinate, and performs the same processing as above. For the character having the character area 32, the character area 32 and the total touch area 11 overlap, and neither the left side nor the right side of the total touch area 11 is contained in the character area 32. In this case, the selected character string specifying unit 44 specifies the character having the character area 32 as a selected character and stores it and its coordinate information in the selected character storage unit 51 described later. The selected character string specifying unit 44 likewise specifies the character having the character area 33 as a selected character. Next, for the character having the character area 34, the character area 34 and the total touch area 11 overlap, the maximum X coordinate of the total touch area 11 falls within the X coordinate range of the character area 34, and the width of the character overlap area is at least half the width of the character area 34; the selected character string specifying unit 44 therefore specifies the character having the character area 34 as a selected character and stores it and the coordinate information of its character area in the selected character storage unit 51 described later. The character having the character area 35, on the other hand, does not overlap the total touch area 11, so the selected character string specifying unit 44 does not specify it as a selected character.
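The corresponding per-character decision can be sketched in the same way; the ratio of one half stands in for the "second predetermined amount".

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def is_selected_char(char_area: Rect, touch: Rect, ratio: float = 0.5) -> bool:
    """Return True when a character in a character string specifying line is a
    selected character (horizontal writing)."""
    # No overlap at all: the character is not a selected character.
    if (char_area.x_max < touch.x_min or touch.x_max < char_area.x_min or
            char_area.y_max < touch.y_min or touch.y_max < char_area.y_min):
        return False
    # Is exactly one of the total touch area's left/right edges (the sides
    # perpendicular to the character traveling direction) inside the character area?
    left_inside = char_area.x_min <= touch.x_min <= char_area.x_max
    right_inside = char_area.x_min <= touch.x_max <= char_area.x_max
    if left_inside != right_inside:
        # The width of the character overlap area must be at least `ratio`
        # of the width of the character area.
        overlap_w = min(char_area.x_max, touch.x_max) - max(char_area.x_min, touch.x_min)
        return overlap_w >= ratio * (char_area.x_max - char_area.x_min)
    # Both edges inside, or neither edge inside while still overlapping:
    # the character is treated as a selected character directly.
    return True
```

Under the same assumptions this matches the example of FIG. 5(b): the characters with character areas 31 to 34 are selected and the character with character area 35 is not.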
Finally, when the selected character string specifying unit 44 has specified the selected characters in all the character string specifying lines, it reads out all the selected characters and their coordinate information stored in the selected character storage unit 51, arranges the selected characters based on the coordinate information of their character areas to determine the character string, and supplies the determined character string to the process execution unit 45.
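The final assembly step might look like the following sketch; the embodiment only states that the selected characters are arranged based on the coordinate information of their character areas, so the concrete ordering key used here (top-to-bottom, then left-to-right, for horizontal writing) is an assumption.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def build_selected_string(stored: List[Tuple[str, Rect]]) -> str:
    """Arrange the stored selected characters by their character-area coordinates
    (assumed order: top-to-bottom by line, then left-to-right) and join them
    into the selected character string."""
    ordered = sorted(stored, key=lambda item: (item[1].y_min, item[1].x_min))
    return "".join(ch for ch, _ in ordered)
```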
The process execution unit 45 executes predetermined processing based on the specified character string. Specifically, when supplied with a character string from the selected character string specifying unit 44, the process execution unit 45 acquires information related to that character string via the communication unit 6 described later. Examples of the acquired information include search results obtained by searching for the character string with a search engine and images or videos related to the character string. Using the acquired information, the process execution unit 45 creates a link image containing a link for accessing that information, for example an image showing the search results for the character string, a thumbnail of a video related to the character string, or a reduced version of an image related to the character string. What the process execution unit 45 creates from the acquired information is not limited to a link image; it may, for example, be a character string containing a link. The process execution unit 45 supplies the created link image to the display control unit 46. The processing executed by the process execution unit 45 is not limited to the above example; it may be any processing executed based on the supplied character string, such as translating the supplied character string into English.
The display control unit 46 determines the image to be displayed on the display unit 22. Specifically, when supplied with the character string image 10 from the image acquisition unit 47, the display control unit 46 displays it on the display unit 22. When the display control unit 46 acquires a link image from the process execution unit 45, it superimposes the link image on the character string image 10 and displays it on the display unit 22.
The image acquisition unit 47 operates the photographing unit 3 to acquire the character string image 10. Specifically, when supplied with a shooting instruction from the input specifying unit 41, the image acquisition unit 47 causes the photographing unit 3 to capture an image, acquires the captured image (character string image 10), and supplies it to the character recognition unit 43 and the display control unit 46. Although the image acquisition unit 47 in this embodiment acquires an image captured by the photographing unit 3, this is not a limitation; for example, it may read out a character string image 10 stored in the storage unit 5 in response to an instruction from the input specifying unit 41, or acquire the character string image 10 via the communication unit 6. The acquired image may also be a moving image, or a so-called through image displayed on the display unit 22 for taking a picture with the photographing unit 3.
The storage unit 5 is a storage device that stores various data used by the smartphone 1. As shown in FIG. 1, the storage unit 5 includes a selected character storage unit 51, which temporarily stores each selected character identified by the selected character string specifying unit 44 in association with the coordinate information of that selected character.
The communication unit 6 is a communication device for transmitting and receiving information between the smartphone 1 and external devices. Specifically, the communication unit 6 is controlled by the process execution unit 45 and acquires, from outside, information related to the character string specified by the selected character string specifying unit 44.
Next, the flow of the character string specifying process for specifying the character string selected by an input operation is described with reference to FIGS. 6 to 8. FIG. 6 shows the flow of processing after the character string image 10 is displayed on the display unit 22.
First, when the input specifying unit 41 identifies an input operation made on the input unit 21 (YES in S1), it supplies the coordinates of the identified input operation, that is, the coordinates of the positions on the touch panel 2 touched by the indicator S, to the extraction region specifying unit 42. The extraction region specifying unit 42 then specifies the total touch area 11 (S2); the specific procedure is as described above and is omitted here. Next, the extraction region specifying unit 42 specifies the character recognition area 12 (S3): it enlarges the total touch area 11 based on the predetermined condition and specifies the enlarged area as the character recognition area 12 (specific examples of the predetermined condition are as described above). The extraction region specifying unit 42 supplies the coordinate information of the specified character recognition area 12 to the character recognition unit 43, and supplies the coordinate information of the specified total touch area 11 to the selected character string specifying unit 44. The character recognition unit 43 then executes character recognition processing on the character recognition area 12 (S4): when supplied with the coordinate information of the character recognition area 12, it extracts, line by line, the character strings within the character recognition area 12 of the character string image 10 supplied from the image acquisition unit 47 (that is, it extracts the extracted lines), and supplies the extracted lines and the coordinate information of the extracted line areas, the minimum rectangular areas each containing an entire extracted line, to the selected character string specifying unit 44.
Next, the selected character string specifying unit 44 determines whether the supplied extracted lines are written horizontally (S5). The processing flow for vertical writing (NO in S5) is described in Embodiment 2 below. For horizontal writing (YES in S5), the selected character string specifying unit 44 selects the topmost line in the character recognition area 12 as the determination line (S11) and determines whether the determination line area and the total touch area 11 overlap (S12). Here, the determination line area is the extracted line area of the extracted line selected as the determination line. If they do not overlap (NO in S12), the selected character string specifying unit 44 determines whether the current determination line is the last line in the character recognition area 12 (S16); specifically, whether the current determination line is the extracted line whose predetermined coordinate in its extracted line area (for example, the coordinate of the upper left vertex of the extracted line area) has the largest Y coordinate. If the current determination line is not the last line (NO in S16), the selected character string specifying unit 44 takes the line immediately below the current determination line as the new determination line (S17); specifically, it takes as the next determination line the extracted line whose predetermined coordinate has the next smallest Y coordinate after that of the current determination line, and returns to step S12. The case where the current determination line is the last line is described later.
If, on the other hand, the determination line area and the total touch area 11 overlap (YES in S12), the selected character string specifying unit 44 determines whether only one of the upper side and the lower side of the total touch area 11 lies within the determination line area (S13); specific examples of this determination are as described above. If only one of the upper and lower sides lies within the determination line area (YES in S13), the selected character string specifying unit 44 determines whether the height of the line overlap area is at least half the height of the determination line area (S14); specific examples are again as described above. If it is less than half (NO in S14), the selected character string specifying unit 44 determines whether the current determination line is the last line in the character recognition area 12 (S16); the subsequent processing has already been described and is omitted here.
これに対して、行重畳領域の高さが判定行領域の高さの半分以上である場合(S14でYES)、選択文字列特定部44は、判定行を文字列特定行であると特定して、文字判定処理を実行する(S15)。また、総タッチ領域11の上辺または下辺のいずれか一方のみが判定行領域内にない場合(S13でNO)、選択文字列特定部44は、ステップS14の処理を省略して、判定行を文字列特定行であると特定する。そして、文字判定処理を実行する(S15)。
On the other hand, when the height of the line overlap area is at least half the height of the determination line area (YES in S14), the selected character string specifying unit 44 identifies the determination line as the character string specifying line and executes the character determination process (S15). When it is not the case that only one of the upper side and the lower side of the total touch area 11 lies within the determination line area (NO in S13), the selected character string specifying unit 44 skips step S14, identifies the determination line as the character string specifying line, and executes the character determination process (S15).
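Steps S13 to S15 amount to the following rule: if exactly one horizontal edge of the total touch area falls inside the determination line area, the line is accepted only when the overlap height reaches the threshold (half, in the example above); otherwise it is accepted outright. A hedged sketch of that rule, assuming the S12 overlap has already been confirmed and using plain (x_min, y_min, x_max, y_max) tuples:

```python
def accept_line(line, touch, threshold=0.5):
    """Return True when the determination line should become the character
    string specifying line (S13-S15, horizontal writing). `line` and `touch`
    are (x_min, y_min, x_max, y_max); the two are assumed to overlap already."""
    top_inside = line[1] <= touch[1] <= line[3]      # upper edge of touch area in line area
    bottom_inside = line[1] <= touch[3] <= line[3]   # lower edge of touch area in line area
    if top_inside != bottom_inside:                  # exactly one edge inside (YES in S13)
        overlap_h = min(line[3], touch[3]) - max(line[1], touch[1])
        line_h = line[3] - line[1]
        return overlap_h >= threshold * line_h       # S14: half or more (example threshold)
    return True                                      # NO in S13: accept without the height test
```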
続いて、文字判定処理について説明する。選択文字列特定部44は、文字列特定行内に含まれる各文字について、文字領域の座標情報を特定する。続いて、選択文字列特定部44は、文字列特定行中の最も左にある文字を判定文字として選択する(S21)。続いて、選択文字列特定部44は、判定文字領域と総タッチ領域11とが重なっているか否かを判定する(S22)。ここで、判定文字領域とは、判定文字と特定された文字の文字領域である。重なっていない場合(S22でNO)、選択文字列特定部44は、現在の判定文字が文字列特定行中の最終文字であるか否かを判定する(S26)。具体的には、選択文字列特定部44は、現在の判定文字が、文字列特定行中の文字のうち、文字領域における所定座標(例えば、文字領域の左上頂点の座標)のX座標が最も大きい文字であるか否かを判定する。現在の判定文字が最終文字でない場合(S26でNO)、選択文字列特定部44は、現在の判定文字の1つ右の文字を判定文字とする(S27)。具体的には、選択文字列特定部44は、文字領域における所定座標のX座標が、現在の判定文字の判定文字領域における所定座標のX座標の次に小さい文字を、次の判定文字とする。そして、選択文字列特定部44は、ステップS22の処理に戻る。なお、現在の判定文字が最終文字である場合については後述する。
Next, the character determination process will be described. The selected character string specifying unit 44 specifies the coordinate information of the character area for each character included in the character string specifying line. It then selects the leftmost character in the character string specifying line as the determination character (S21). Subsequently, the selected character string specifying unit 44 determines whether or not the determination character area and the total touch area 11 overlap (S22). Here, the determination character area is the character area of the character identified as the determination character. If they do not overlap (NO in S22), the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S26). Specifically, it determines whether the current determination character is the character whose predetermined coordinate in the character area (for example, the coordinate of the upper left vertex of the character area) has the largest X coordinate among the characters in the character string specifying line. If the current determination character is not the last character (NO in S26), the selected character string specifying unit 44 sets the character immediately to the right of the current determination character as the new determination character (S27). Specifically, it selects, as the next determination character, the character whose predetermined coordinate has the smallest X coordinate among those larger than that of the current determination character. The selected character string specifying unit 44 then returns to step S22. The case where the current determination character is the last character will be described later.
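Within the character string specifying line, steps S21, S26, and S27 repeat the same scan one level down, walking the character areas from left to right by X coordinate. A small illustrative sketch (the function name and the tuple layout are assumptions):

```python
def scan_chars_left_to_right(char_boxes, touch):
    """Visit character boxes of the specifying line from left to right
    (S21, S26, S27) and yield those that overlap the total touch area
    (YES in S22). Boxes and `touch` are (x_min, y_min, x_max, y_max)."""
    def overlaps(a, b):
        return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

    for box in sorted(char_boxes, key=lambda b: b[0]):   # smallest X first
        if overlaps(box, touch):
            yield box
```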
これに対して、判定文字領域と総タッチ領域11とが重なっている場合(S22でYES)、選択文字列特定部44は、総タッチ領域11の左辺または右辺のいずれか一方のみが判定文字領域内にあるか否かを判定する(S23)。判定の具体例については上述しているため省略する。左辺または右辺のいずれか一方のみが判定文字領域内にある場合(S23でYES)、選択文字列特定部44は、文字重畳領域の幅が判定文字領域の幅の半分以上であるか否かを判定する(S24)。判定の具体例については上述しているため省略する。半分以上でない場合(S24でNO)、選択文字列特定部44は、現在の判定文字が文字列特定行中の最終文字であるか否かを判定する(S26)。以降の処理については既に説明しているためここでの説明を省略する。
On the other hand, when the determination character area and the total touch area 11 overlap (YES in S22), the selected character string specifying unit 44 determines whether only one of the left side and the right side of the total touch area 11 lies within the determination character area (S23). A specific example of this determination has been described above and is omitted here. When only one of the left side and the right side lies within the determination character area (YES in S23), the selected character string specifying unit 44 determines whether the width of the character overlap area is at least half the width of the determination character area (S24). A specific example of this determination has also been described above. If it is less than half (NO in S24), the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S26). The subsequent processing has already been described and is not repeated here.
これに対して、文字重畳領域の幅が判定文字領域の幅の半分以上である場合(S24でYES)、選択文字列特定部44は、判定文字を選択文字であると特定して、当該選択文字を選択文字格納部51に格納する(S25)。また、総タッチ領域11の左辺または右辺のいずれか一方のみが判定文字領域内にない場合(S23でNO)、選択文字列特定部44は、ステップS24の処理を省略して、判定文字を選択文字であると特定する。そして、当該選択文字を選択文字格納部51に格納する(S25)。
On the other hand, when the width of the character overlap area is at least half the width of the determination character area (YES in S24), the selected character string specifying unit 44 identifies the determination character as a selected character and stores it in the selected character storage unit 51 (S25). When it is not the case that only one of the left side and the right side of the total touch area 11 lies within the determination character area (NO in S23), the selected character string specifying unit 44 skips step S24, identifies the determination character as a selected character, and stores it in the selected character storage unit 51 (S25).
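Steps S23 to S25 mirror the line-level rule rotated by 90 degrees: exactly one vertical edge of the total touch area inside the character area triggers the half-width comparison, and otherwise the character is selected directly. A sketch under the same assumptions as above:

```python
def accept_char_horizontal(char_box, touch, threshold=0.5):
    """S23-S25 for horizontal writing: decide whether an overlapping
    character becomes a selected character. Boxes are
    (x_min, y_min, x_max, y_max); the S22 overlap is assumed already checked."""
    left_inside = char_box[0] <= touch[0] <= char_box[2]   # left edge of touch area in char box
    right_inside = char_box[0] <= touch[2] <= char_box[2]  # right edge of touch area in char box
    if left_inside != right_inside:                        # exactly one side inside (YES in S23)
        overlap_w = min(char_box[2], touch[2]) - max(char_box[0], touch[0])
        char_w = char_box[2] - char_box[0]
        return overlap_w >= threshold * char_w             # S24: half or more (example threshold)
    return True                                            # NO in S23: select without the width test
```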
続いて、選択文字列特定部44は、現在の判定文字が文字列特定行中の最終文字であるか否かを判定する(S26)。最終文字であると判定した場合(S26でYES)、選択文字列特定部44は、現在の判定行が文字認識領域12中の最終行であるか否かを判定する(S16)。最終行であると判定した場合(S16でYES)、選択文字列特定部44は選択された文字列を特定する(S18)。そして、特定した文字列を処理実行部45に供給する。以上で、文字列特定処理は終了する。
Subsequently, the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S26). When it is the last character (YES in S26), the selected character string specifying unit 44 determines whether the current determination line is the last line in the character recognition area 12 (S16). When it is the last line (YES in S16), the selected character string specifying unit 44 specifies the selected character string (S18) and supplies the specified character string to the process execution unit 45. The character string specifying process then ends.
〔実施形態2〕
本発明の他の実施形態について、図6、図9、および図10に基づいて説明すれば、以下のとおりである。なお、説明の便宜上、前記実施形態にて説明した部材と同じ機能を有する部材については、同じ符号を付記し、その説明を省略する。本実施形態では、文字認識部43から供給された抽出行を、選択文字列特定部44が縦書きであると判定した場合の文字列特定処理の流れについて説明する。なお、実施形態1で説明した文字列特定処理の流れと共通する処理については、その詳細な説明を省略する。 [Embodiment 2]
The following will describe another embodiment of the present invention with reference to FIG. 6, FIG. 9, and FIG. 10. For convenience of explanation, members having the same functions as those described in the above embodiment are given the same reference numerals, and their descriptions are omitted. In the present embodiment, the flow of the character string specifying process in the case where the selected character string specifying unit 44 determines that the extracted lines supplied from the character recognition unit 43 are written vertically will be described. Detailed descriptions of processes common to the flow of the character string specifying process described in Embodiment 1 are omitted.
供給された抽出行が縦書きである場合(S5でNO)、選択文字列特定部44は文字認識領域12中の最も右にある行を判定行として選択する(S31)。具体的には、選択文字列特定部44は抽出行のうち、抽出行領域における所定座標(例えば、抽出行領域の左上頂点の座標)のX座標が最も大きい抽出行を判定行として特定する。続いて、選択文字列特定部44は、判定行領域と総タッチ領域11とが重なっているか否かを判定する(S32)。重なっていない場合(S32でNO)、選択文字列特定部44は、現在の判定行が文字認識領域12中の最終行であるか否かを判定する(S36)。具体的には、選択文字列特定部44は、現在の判定行が、抽出行のうち、抽出行領域における所定座標(例えば、抽出行領域の左上頂点の座標)のX座標が最も小さい抽出行であるか否かを判定する。現在の判定行が最終行でない場合(S36でNO)、選択文字列特定部44は現在の判定行の1つ左の行を判定行とする(S37)。具体的には、選択文字列特定部44は、抽出行領域における所定座標のX座標が、現在の判定行の判定行領域における所定座標のX座標の次に大きい抽出行を、次の判定行とする。そして、選択文字列特定部44はステップS12の処理に戻る。なお、現在の判定行が最終行である場合については後述する。
When the supplied extracted lines are written vertically (NO in S5), the selected character string specifying unit 44 selects the rightmost line in the character recognition area 12 as the determination line (S31). Specifically, it selects, as the determination line, the extracted line whose predetermined coordinate in the extracted line area (for example, the coordinate of the upper left vertex of the extracted line area) has the largest X coordinate among the extracted lines. Subsequently, the selected character string specifying unit 44 determines whether or not the determination line area and the total touch area 11 overlap (S32). If they do not overlap (NO in S32), the selected character string specifying unit 44 determines whether the current determination line is the last line in the character recognition area 12 (S36). Specifically, it determines whether the current determination line is the extracted line whose predetermined coordinate has the smallest X coordinate among the extracted lines. If the current determination line is not the last line (NO in S36), the selected character string specifying unit 44 sets the line immediately to the left of the current determination line as the new determination line (S37). Specifically, it selects, as the next determination line, the extracted line whose predetermined coordinate has the largest X coordinate among those smaller than that of the current determination line. The selected character string specifying unit 44 then returns to step S12. The case where the current determination line is the last line will be described later.
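For vertical writing, the only change at the line level is the scan order: lines are visited from the rightmost extracted line area (largest X) leftward. A brief sketch, again with hypothetical names and tuple boxes:

```python
def scan_lines_right_to_left(line_rects, touch):
    """Visit extracted-line boxes from right to left for vertical writing
    (S31, S36, S37), yielding those that overlap the total touch area
    (YES in S32). Boxes are (x_min, y_min, x_max, y_max)."""
    def overlaps(a, b):
        return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

    for box in sorted(line_rects, key=lambda b: b[0], reverse=True):  # largest X first
        if overlaps(box, touch):
            yield box
```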
これに対して、判定行領域と総タッチ領域11とが重なっている場合(S32でYES)、選択文字列特定部44は総タッチ領域11の左辺または右辺のいずれか一方のみが判定行領域内にあるか否かを判定する(S33)。具体的には選択文字列特定部44は、総タッチ領域11におけるX座標の最小値およびX座標の最大値の値が、判定行領域におけるX座標の最小値とX座標の最大値とからなるX座標数値範囲内に含まれているか否かを判定する。左辺または右辺のいずれか一方のみが判定行領域内にある場合(S33でYES)、選択文字列特定部44は行重畳領域の幅が判定行領域の幅の半分以上であるか否かを判定する(S34)。具体的には、選択文字列特定部44は、行重畳領域におけるX座標の最大値からX座標の最小値を減算した値(すなわち、行重畳領域の幅の値)が、判定行領域におけるX座標の最大値からX座標の最小値を減算した値(すなわち、判定行領域の幅の値)の半分以上であるか否かを判定する。半分以上でない場合(S34でNO)、選択文字列特定部44は現在の判定行が文字認識領域12中の最終行であるか否かを判定する(S36)。以降の処理については既に説明しているためここでの説明を省略する。なお、半分以上としたのは一例であり、判定の閾値はこの例に限定されない。
On the other hand, when the determination line area and the total touch area 11 overlap (YES in S32), the selected character string specifying unit 44 determines whether only one of the left side and the right side of the total touch area 11 lies within the determination line area (S33). Specifically, the selected character string specifying unit 44 determines whether the minimum X coordinate and the maximum X coordinate of the total touch area 11 are within the X-coordinate range defined by the minimum and maximum X coordinates of the determination line area. When only one of the left side and the right side lies within the determination line area (YES in S33), the selected character string specifying unit 44 determines whether the width of the line overlap area is at least half the width of the determination line area (S34). Specifically, it determines whether the value obtained by subtracting the minimum X coordinate from the maximum X coordinate of the line overlap area (that is, the width of the line overlap area) is at least half the value obtained by subtracting the minimum X coordinate from the maximum X coordinate of the determination line area (that is, the width of the determination line area). If it is less than half (NO in S34), the selected character string specifying unit 44 determines whether the current determination line is the last line in the character recognition area 12 (S36). The subsequent processing has already been described and is not repeated here. The value of half or more is merely an example, and the determination threshold is not limited to this example.
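The paragraph above spells out the acceptance test numerically, so it can be restated almost verbatim in code: compare the touch area's X extremes against the line area's X range (S33), and require the overlap width (maximum X of the line overlap area minus its minimum X) to reach half of the determination line area's width (S34). The sketch below assumes the S32 overlap has already been confirmed; the 0.5 threshold is the example value from the text, and the names are illustrative assumptions.

```python
def accept_line_vertical(line, touch, threshold=0.5):
    """S33-S35 for vertical writing. `line` and `touch` are
    (x_min, y_min, x_max, y_max); the two are assumed to overlap already."""
    left_inside = line[0] <= touch[0] <= line[2]    # minimum X of touch area within line X range
    right_inside = line[0] <= touch[2] <= line[2]   # maximum X of touch area within line X range
    if left_inside != right_inside:                 # only one side inside (YES in S33)
        overlap_w = min(line[2], touch[2]) - max(line[0], touch[0])  # width of line overlap area
        line_w = line[2] - line[0]                                   # width of determination line area
        return overlap_w >= threshold * line_w      # S34: half or more (example threshold)
    return True                                     # NO in S33: accept without the width test
```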
これに対して、行重畳領域の幅が判定行領域の幅の半分以上である場合(S34でYES)、選択文字列特定部44は、判定行を文字列特定行であると特定して、文字判定処理を実行する(S15)。また、総タッチ領域11の左辺または右辺のいずれか一方のみが判定行領域内にない場合(S33でNO)、選択文字列特定部44は、ステップS34の処理を省略して、判定行を文字列特定行であると特定する。そして、文字判定処理を実行する(S35)。
On the other hand, when the width of the line overlap area is at least half the width of the determination line area (YES in S34), the selected character string specifying unit 44 identifies the determination line as the character string specifying line and executes the character determination process (S15). When it is not the case that only one of the left side and the right side of the total touch area 11 lies within the determination line area (NO in S33), the selected character string specifying unit 44 skips step S34, identifies the determination line as the character string specifying line, and executes the character determination process (S35).
続いて、文字判定処理について説明する。選択文字列特定部44は、文字列特定行内に含まれる各文字について、文字領域の座標情報を特定する。続いて、選択文字列特定部44は、文字列特定行中の最も上にある文字を判定文字として選択する(S41)。具体的には、選択文字列特定部44は、文字列特定行内に含まれる各文字のうち、文字領域における所定座標(例えば、文字領域の左上頂点の座標)のY座標が最も小さい文字を判定文字として特定する。続いて、選択文字列特定部44は、判定文字領域と総タッチ領域11とが重なっているか否かを判定する(S42)。重なっていない場合(S42でNO)、選択文字列特定部44は、現在の判定文字が文字列特定行中の最終文字であるか否かを判定する(S46)。具体的には、選択文字列特定部44は、現在の判定文字が、文字列特定行中の文字のうち、文字領域における所定座標(例えば、文字領域の左上頂点の座標)のY座標が最も大きい文字であるか否かを判定する。現在の判定文字が最終文字でない場合(S46でNO)、選択文字列特定部44は、現在の判定文字の1つ下の文字を判定文字とする(S47)。具体的には、選択文字列特定部44は、文字領域における所定座標のY座標が、現在の判定文字の判定文字領域における所定座標のY座標の次に小さい文字を、次の判定文字とする。そして、選択文字列特定部44は、ステップS42の処理に戻る。なお、現在の判定文字が最終文字である場合については後述する。
Next, the character determination process will be described. The selected character string specifying unit 44 specifies the coordinate information of the character area for each character included in the character string specifying line. It then selects the uppermost character in the character string specifying line as the determination character (S41). Specifically, the selected character string specifying unit 44 selects, as the determination character, the character whose predetermined coordinate in the character area (for example, the coordinate of the upper left vertex of the character area) has the smallest Y coordinate among the characters included in the character string specifying line. Subsequently, the selected character string specifying unit 44 determines whether or not the determination character area and the total touch area 11 overlap (S42). If they do not overlap (NO in S42), the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S46). Specifically, it determines whether the current determination character is the character whose predetermined coordinate has the largest Y coordinate among the characters in the character string specifying line. If the current determination character is not the last character (NO in S46), the selected character string specifying unit 44 sets the character immediately below the current determination character as the new determination character (S47). Specifically, it selects, as the next determination character, the character whose predetermined coordinate has the smallest Y coordinate among those larger than that of the current determination character. The selected character string specifying unit 44 then returns to step S42. The case where the current determination character is the last character will be described later.
これに対して、判定文字領域と総タッチ領域11とが重なっている場合(S42でYES)、選択文字列特定部44は、総タッチ領域11の上辺または下辺のいずれか一方のみが判定文字領域内にあるか否かを判定する(S43)。具体的には、選択文字列特定部44は、総タッチ領域11におけるY座標の最小値およびY座標の最大値の値が、判定文字領域におけるY座標の最小値とY座標の最大値とからなるY座標数値範囲内に含まれているか否かを判定する。上辺または下辺のいずれか一方のみが判定文字領域内にある場合(S43でYES)、選択文字列特定部44は、文字重畳領域の高さが判定文字領域の高さの半分以上であるか否かを判定する(S44)。具体的には、選択文字列特定部44は、文字重畳領域におけるY座標の最大値からY座標の最小値を減算した値(すなわち、文字重畳領域の高さの値)が、判定文字領域におけるY座標の最大値からY座標の最小値を減算した値(すなわち、判定文字領域の高さの値)の半分以上であるか否かを判定する。半分以上でない場合(S44でNO)、選択文字列特定部44は、現在の判定文字が文字列特定行中の最終文字であるか否かを判定する(S46)。以降の処理については既に説明しているためここでの説明を省略する。なお、半分以上としたのは一例であり、判定の閾値はこの例に限定されない。
On the other hand, when the determination character area and the total touch area 11 overlap (YES in S42), the selected character string specifying unit 44 determines whether only one of the upper side and the lower side of the total touch area 11 lies within the determination character area (S43). Specifically, the selected character string specifying unit 44 determines whether the minimum Y coordinate and the maximum Y coordinate of the total touch area 11 are within the Y-coordinate range defined by the minimum and maximum Y coordinates of the determination character area. When only one of the upper side and the lower side lies within the determination character area (YES in S43), the selected character string specifying unit 44 determines whether the height of the character overlap area is at least half the height of the determination character area (S44). Specifically, it determines whether the value obtained by subtracting the minimum Y coordinate from the maximum Y coordinate of the character overlap area (that is, the height of the character overlap area) is at least half the value obtained by subtracting the minimum Y coordinate from the maximum Y coordinate of the determination character area (that is, the height of the determination character area). If it is less than half (NO in S44), the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S46). The subsequent processing has already been described and is not repeated here. The value of half or more is merely an example, and the determination threshold is not limited to this example.
これに対して、文字重畳領域の幅が判定文字領域の高さの半分以上である場合(S44でYES)、選択文字列特定部44は、判定文字を選択文字であると特定して、当該選択文字を選択文字格納部51に格納する(S45)。また、総タッチ領域11の上辺または下辺のいずれか一方のみが判定文字領域内にない場合(S43でNO)、選択文字列特定部44は、ステップS44の処理を省略して、判定文字を選択文字であると特定する。そして、当該選択文字を選択文字格納部51に格納する(S45)。
On the other hand, when the height of the character overlap area is at least half the height of the determination character area (YES in S44), the selected character string specifying unit 44 identifies the determination character as a selected character and stores it in the selected character storage unit 51 (S45). When it is not the case that only one of the upper side and the lower side of the total touch area 11 lies within the determination character area (NO in S43), the selected character string specifying unit 44 skips step S44, identifies the determination character as a selected character, and stores it in the selected character storage unit 51 (S45).
続いて、選択文字列特定部44は、現在の判定文字が文字列特定行中の最終文字であるか否かを判定する(S46)。最終文字であると判定した場合(S46でYES)、選択文字列特定部44は、現在の判定行が文字認識領域12中の最終行であるか否かを判定する(S36)。最終行であると判定した場合(S36でYES)、選択文字列特定部44は選択された文字列を特定する(S38)。以上で、文字列特定処理は終了する。
Subsequently, the selected character string specifying unit 44 determines whether the current determination character is the last character in the character string specifying line (S46). When it is the last character (YES in S46), the selected character string specifying unit 44 determines whether the current determination line is the last line in the character recognition area 12 (S36). When it is the last line (YES in S36), the selected character string specifying unit 44 specifies the selected character string (S38). The character string specifying process then ends.
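Taken as a whole, the string specifying pass (S11 to S18 for horizontal writing, S31 to S38 for vertical writing) is a nested filter: keep the lines whose areas sufficiently overlap the total touch area, keep the characters within those lines whose areas sufficiently overlap it, and concatenate what remains. The sketch below shows that structure with the acceptance tests passed in as callables; all names are illustrative assumptions rather than the patent's own API.

```python
def specify_selected_string(lines, touch, line_selected, char_selected):
    """Two-level selection mirroring the flowcharts.

    lines         -- list of (line_box, [(char, char_box), ...]) in reading order
    touch         -- (x_min, y_min, x_max, y_max) of the total touch area
    line_selected -- callable(line_box, touch) -> bool   (S12-S14 / S32-S34)
    char_selected -- callable(char_box, touch) -> bool   (S22-S24 / S42-S44)
    """
    selected = []
    for line_box, chars in lines:
        if not line_selected(line_box, touch):
            continue                      # this line is not a character string specifying line
        for char, char_box in chars:
            if char_selected(char_box, touch):
                selected.append(char)     # S25 / S45: store as a selected character
    return "".join(selected)              # S18 / S38: the specified character string
```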
〔実施形態3〕
スマートフォン1の制御ブロック(特に入力特定部41、抽出領域特定部42、文字認識部43、選択文字列特定部44、処理実行部45、表示制御部46、および画像取得部47)は、集積回路(ICチップ)等に形成された論理回路(ハードウェア)によって実現してもよいし、CPU(Central Processing Unit)を用いてソフトウェアによって実現してもよい。後者の場合、スマートフォン1は、各機能を実現するソフトウェアであるプログラムの命令を実行するCPU、上記プログラムおよび各種データがコンピュータ(またはCPU)で読み取り可能に記録されたROM(Read Only Memory)または記憶装置(これらを「記録媒体」と称する)、上記プログラムを展開するRAM(Random Access Memory)などを備えている。そして、コンピュータ(またはCPU)が上記プログラムを上記記録媒体から読み取って実行することにより、本発明の目的が達成される。上記記録媒体としては、「一時的でない有形の媒体」、例えば、テープ、ディスク、カード、半導体メモリ、プログラマブルな論理回路などを用いることができる。また、上記プログラムは、該プログラムを伝送可能な任意の伝送媒体(通信ネットワークや放送波等)を介して上記コンピュータに供給されてもよい。なお、本発明は、上記プログラムが電子的な伝送によって具現化された、搬送波に埋め込まれたデータ信号の形態でも実現され得る。 [Embodiment 3]
The control blocks of the smartphone 1 (in particular, the input specifying unit 41, the extraction region specifying unit 42, the character recognition unit 43, the selected character string specifying unit 44, the process execution unit 45, the display control unit 46, and the image acquisition unit 47) may be realized by a logic circuit (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit). In the latter case, the smartphone 1 includes a CPU that executes instructions of a program, which is software realizing each function, a ROM (Read Only Memory) or a storage device (referred to as a "recording medium") on which the program and various data are recorded so as to be readable by a computer (or CPU), and a RAM (Random Access Memory) into which the program is loaded. The object of the present invention is achieved when the computer (or CPU) reads the program from the recording medium and executes it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The program may be supplied to the computer via any transmission medium (such as a communication network or a broadcast wave) capable of transmitting the program. The present invention can also be realized in the form of a data signal, embedded in a carrier wave, in which the program is embodied by electronic transmission.
〔まとめ〕
本発明の態様1に係る文字特定装置(スマートフォン1)は、表示画像に含まれる文字を特定するための文字特定装置であって、上記表示画像に対するユーザ操作によって特定された上記表示画像内の特定領域を拡大した拡大領域に対し、文字認識を実行する文字認識実行部(文字認識部43)と、上記文字認識実行部による文字認識によって得られる上記拡大領域に含まれる文字に設定された領域と、上記特定領域との重なり度合いが所定量を超える文字をユーザに選択された文字として特定する選択文字特定部(選択文字列特定部44)と、を備える。 [Summary]
A character specifying device (smartphone 1) according to aspect 1 of the present invention is a character specifying device for specifying characters included in a display image, and includes: a character recognition execution unit (character recognition unit 43) that performs character recognition on an enlarged area obtained by enlarging a specific area in the display image specified by a user operation on the display image; and a selected character specifying unit (selected character string specifying unit 44) that specifies, as a character selected by the user, a character for which the degree of overlap between the area set for that character in the enlarged area obtained by the character recognition and the specific area exceeds a predetermined amount.
上記の構成によれば、画像全体ではなく、特定領域を拡大した拡大領域に対して文字認識を実行する。これにより、画像全体に対し文字認識処理を実行する場合と比較して、文字認識処理に要する時間を短くすることができる。また、拡大領域に対して文字認識を実行するので、少なくともユーザが選択したい文字を完全に含む領域を文字認識処理の対象とすることができる。よって、ユーザが選択したい文字を完全な状態で文字認識することができる。さらに上記の構成によれば、拡大領域に含まれる文字に設定された領域と、特定領域との重なり度合いが所定量を超える文字をユーザに選択された文字として特定する。これにより、選択された文字を精度よく特定することができる。以上より、目的とする文字のみを短時間で精度よく特定することができる。
According to the above configuration, character recognition is performed not on the entire image but on an enlarged area obtained by enlarging the specific area. Compared with performing character recognition on the entire image, this shortens the time required for the character recognition process. In addition, because character recognition is performed on the enlarged area, at least an area that completely contains the characters the user wants to select can be made the target of the character recognition process, so those characters can be recognized in their complete form. Furthermore, according to the above configuration, a character for which the degree of overlap between the area set for that character in the enlarged area and the specific area exceeds a predetermined amount is specified as the character selected by the user, so the selected characters can be specified with high accuracy. As a result, only the target characters can be specified accurately and in a short time.
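At the level of aspect 1, the flow can be pictured as three stages: enlarge only the user-specified region, run character recognition on the enlargement, and keep the recognized characters whose areas overlap the (correspondingly scaled) specific area by more than a threshold. The following sketch assumes a PIL-like image object and a pluggable OCR callable; every name, the margin, the scale factor, and the area-ratio threshold are assumptions made for illustration, not details taken from the specification.

```python
def select_characters(image, touch_rect, ocr, margin=40, scale=2.0, threshold=0.5):
    """Illustrative flow of aspect 1 (all names and parameters are assumptions).

    image      -- full display image (a PIL-like object with crop/resize)
    touch_rect -- (x0, y0, x1, y1) specific area set by the user operation
    ocr        -- callable: image -> list of (char, box), boxes in that image's coordinates
    """
    x0, y0, x1, y1 = touch_rect
    # Recognition area: the touch area widened by a margin so that characters
    # crossing its border are fully contained, then enlarged before recognition.
    rx0, ry0 = max(0, x0 - margin), max(0, y0 - margin)
    rx1, ry1 = min(image.width, x1 + margin), min(image.height, y1 + margin)
    region = image.crop((rx0, ry0, rx1, ry1))
    enlarged = region.resize((int(region.width * scale), int(region.height * scale)))

    # The specific (touch) area expressed in the enlarged image's coordinates.
    touch = ((x0 - rx0) * scale, (y0 - ry0) * scale,
             (x1 - rx0) * scale, (y1 - ry0) * scale)

    selected = []
    for char, box in ocr(enlarged):
        ix0, iy0 = max(box[0], touch[0]), max(box[1], touch[1])
        ix1, iy1 = min(box[2], touch[2]), min(box[3], touch[3])
        inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
        area = (box[2] - box[0]) * (box[3] - box[1])
        if area and inter / area > threshold:   # overlap degree exceeds the predetermined amount
            selected.append(char)
    return "".join(selected)
```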
本発明の態様2に係る文字特定装置は、上記態様1において、上記選択文字特定部は、上記文字認識実行部による文字認識によって得られる上記拡大領域に含まれる文字が形成する行ごとに設定される、上記行が存在する領域である行領域と上記特定領域との重なり度合いが第1の所定量を超えたとき、該行領域に含まれる文字を特定対象としてもよい。
In the character specifying device according to aspect 2 of the present invention, in aspect 1 above, the selected character specifying unit may target for specification the characters included in a line area when the degree of overlap between that line area, which is set for each line formed by the characters included in the enlarged area obtained by the character recognition and which is the area in which the line exists, and the specific area exceeds a first predetermined amount.
上記の構成によれば、行領域と特定領域との重なり度合いが第1の所定量を超えたとき、該行領域に含まれる文字を特定対象とする。すなわち、拡大領域に含まれる行のうち、文字を特定する処理を行う行を絞ることができる。よって、拡大領域に含まれるすべての行について文字の特定を行わなくてよいため、文字の特定を短時間で行うことができる。
According to the above configuration, when the degree of overlap between a line area and the specific area exceeds the first predetermined amount, the characters included in that line area become targets for specification. In other words, among the lines included in the enlarged area, the lines on which character specification is performed can be narrowed down. Since characters need not be specified for every line included in the enlarged area, the specification can be completed in a short time.
本発明の態様3に係る文字特定装置は、上記態様2において、上記特定領域および上記行領域は矩形であり、上記選択文字特定部は、(1)上記行領域と上記特定領域とが重なっており、かつ、上記特定領域の境界線のうち、上記文字の進行方向と平行な2辺がいずれも上記行領域と重なっている、または、(2)上記行領域と上記特定領域とが重なっており、かつ、上記特定領域の境界線のうち、上記文字の進行方向と平行な2辺がいずれも上記行領域と重なっていない場合、該行領域に含まれる文字を特定対象としてもよい。
In the character specifying device according to aspect 3 of the present invention, in aspect 2 above, the specific area and the line area are rectangular, and the selected character specifying unit may target for specification the characters included in a line area when (1) the line area and the specific area overlap and both of the two sides of the boundary of the specific area that are parallel to the advancing direction of the characters overlap the line area, or (2) the line area and the specific area overlap and neither of the two sides of the boundary of the specific area that are parallel to the advancing direction of the characters overlaps the line area.
上記の構成によれば、特定領域の大部分が行領域と重なっているとき、該行領域に含まれる文字を特定対象とする。ここで、特定領域の大部分が重なっている行は、操作によって特定された行である可能性が高い。つまり、操作によって特定された行である可能性が高い行を、特定する文字が含まれる行と決定するので、文字の特定を精度よく行うことができる。
According to the above configuration, when most of the specific area overlaps a line area, the characters included in that line area become targets for specification. A line that most of the specific area overlaps is highly likely to be the line specified by the operation. In other words, a line that is highly likely to have been specified by the operation is determined to be the line containing the characters to be specified, so the characters can be specified with high accuracy.
本発明の態様4に係る文字特定装置は、上記態様3において、上記選択文字特定部は、上記特定領域の境界線のうち、上記文字の進行方向と平行な2辺のいずれか一方のみが上記行領域と重なっており、かつ、上記行領域と上記特定領域とが重なっている領域である行重畳領域における上記文字の進行方向と直角な辺の長さが、上記行領域における上記文字の進行方向と直角な辺の長さに対して所定割合以上の長さであるとき、該行領域に含まれる文字を特定対象としてもよい。
In the character specifying device according to aspect 4 of the present invention, in aspect 3 above, the selected character specifying unit may target for specification the characters included in the line area when only one of the two sides of the boundary of the specific area that are parallel to the advancing direction of the characters overlaps the line area, and the length of the side perpendicular to the advancing direction of the characters in the line overlap area, which is the area where the line area and the specific area overlap, is at least a predetermined proportion of the length of the side perpendicular to the advancing direction of the characters in the line area.
上記の構成によれば、特定領域の境界線のうち文字の進行方向と平行な2辺のいずれか一方のみが行領域と重なっている場合において、重なり度合いが大きい行を特定する。ここで、重なり度合いが大きい行は操作によって特定された行である可能性が高い。つまり、操作によって特定された行である可能性が高い行に含まれる文字を特定対象とする。よって、文字の特定を精度よく行うことができる。
According to the above configuration, when only one of the two sides of the boundary of the specific area that are parallel to the advancing direction of the characters overlaps the line area, a line with a large degree of overlap is selected. A line with a large degree of overlap is highly likely to be the line specified by the operation. In other words, the characters included in a line that is highly likely to have been specified by the operation become the targets for specification, so the characters can be specified with high accuracy.
本発明の態様5に係る文字特定装置は、上記態様2から4のいずれかにおいて、上記選択文字特定部は、上記特定領域との重なり度合いが上記第1の所定量を超えた行領域に含まれる文字ごとに設定される、上記文字が存在する領域である文字領域と上記特定領域との重なり度合いが第2の所定量を超えたとき、該文字を特定する文字としてもよい。
In the character specifying device according to aspect 5 of the present invention, in any one of aspects 2 to 4 above, the selected character specifying unit may treat a character as a character to be specified when the degree of overlap between the character area, which is set for each character included in a line area whose degree of overlap with the specific area exceeds the first predetermined amount and which is the area in which that character exists, and the specific area exceeds a second predetermined amount.
上記の構成によれば、文字と特定領域との重なり度合いが第2の所定量を超えたとき、該文字を特定する文字とする。つまり、選択された文字であるか否かを、一文字ごとに判定して選択された文字を特定する。よって、文字の特定を精度よく行うことができる。
According to the above configuration, when the degree of overlap between the character and the specific area exceeds the second predetermined amount, the character is specified. That is, it is determined for each character whether or not it is a selected character, and the selected character is specified. Therefore, the character can be specified with high accuracy.
本発明の態様6に係る文字特定装置は、上記態様5において、上記文字領域は矩形であり、上記選択文字特定部は、(1)上記文字領域と上記特定領域とが重なっており、かつ、上記特定領域の境界線のうち、上記文字の進行方向と直角な2辺がいずれも上記文字領域と重なっている、または、(2)上記文字領域と上記特定領域とが重なっており、かつ、上記特定領域の境界線のうち、上記文字の進行方向と直角な2辺がいずれも上記文字領域と重なっていない場合、上記文字領域を設定された文字を、特定する文字としてもよい。
In the character specifying device according to aspect 6 of the present invention, in aspect 5 above, the character area is rectangular, and the selected character specifying unit may treat the character for which the character area is set as a character to be specified when (1) the character area and the specific area overlap and both of the two sides of the boundary of the specific area that are perpendicular to the advancing direction of the characters overlap the character area, or (2) the character area and the specific area overlap and neither of the two sides of the boundary of the specific area that are perpendicular to the advancing direction of the characters overlaps the character area.
上記の構成によれば、文字領域の大部分が特定領域と重なっているとき、文字領域を設定された文字を、特定する文字とする。ここで、文字領域の大部分が特定領域と重なっている文字は、ユーザが選択しようとした文字である可能性が高い。つまり、ユーザが選択しようとした文字である可能性が高い文字を、特定する文字とするので、文字の特定を精度よく行うことができる。
According to the above configuration, when most of a character area overlaps the specific area, the character for which that character area is set is treated as a character to be specified. A character whose character area largely overlaps the specific area is highly likely to be a character the user intended to select. In other words, a character that is highly likely to be the one the user intended to select is treated as the character to be specified, so the characters can be specified with high accuracy.
本発明の態様7に係る文字特定装置は、上記態様6において、上記選択文字特定部は、上記特定領域の境界線のうち、上記文字の進行方向と直角な2辺のいずれか一方のみが上記文字領域と重なっており、かつ、上記文字領域と上記特定領域とが重なっている領域である文字重畳領域における上記文字の進行方向と平行な辺の長さが、上記文字領域における上記文字の進行方向と直角な辺の長さに対して所定割合以上の長さであるとき、上記文字領域を設定された文字を、特定する文字としてもよい。
In the character specifying device according to aspect 7 of the present invention, in aspect 6 above, the selected character specifying unit may treat the character for which the character area is set as a character to be specified when only one of the two sides of the boundary of the specific area that are perpendicular to the advancing direction of the characters overlaps the character area, and the length of the side parallel to the advancing direction of the characters in the character overlap area, which is the area where the character area and the specific area overlap, is at least a predetermined proportion of the length of the side perpendicular to the advancing direction of the characters in the character area.
上記の構成によれば、特定領域の境界線のうち文字の進行方向と直角な2辺のいずれか一方のみが文字領域と重なっている場合において、重なり度合いが大きい文字を特定する。ここで、重なり度合いが大きい文字はユーザが選択しようとした文字である可能性が高い。つまり、ユーザが選択しようとした文字である可能性が高い文字を、特定する文字とするので、文字の特定を精度よく行うことができる。
According to the above configuration, when only one of the two sides of the boundary of the specific area that are perpendicular to the advancing direction of the characters overlaps a character area, a character with a large degree of overlap is selected. A character with a large degree of overlap is highly likely to be a character the user intended to select. In other words, a character that is highly likely to be the one the user intended to select is treated as the character to be specified, so the characters can be specified with high accuracy.
本発明の各態様に係る文字特定装置は、コンピュータによって実現してもよく、この場合には、コンピュータを上記文字特定装置が備える各部(ソフトウェア要素)として動作させることにより上記文字特定装置をコンピュータにて実現させる文字特定装置の制御プログラム、およびそれを記録したコンピュータ読み取り可能な記録媒体も、本発明の範疇に入る。
The character specifying device according to each aspect of the present invention may be realized by a computer. In this case, a control program for the character specifying device that causes the computer to realize the character specifying device by operating the computer as each unit (software element) of the character specifying device, and a computer-readable recording medium on which that program is recorded, also fall within the scope of the present invention.
本発明は上述した各実施形態に限定されるものではなく、請求項に示した範囲で種々の変更が可能であり、異なる実施形態にそれぞれ開示された技術的手段を適宜組み合わせて得られる実施形態についても本発明の技術的範囲に含まれる。さらに、各実施形態にそれぞれ開示された技術的手段を組み合わせることにより、新しい技術的特徴を形成することができる。
The present invention is not limited to the embodiments described above, and various modifications are possible within the scope of the claims. Embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, new technical features can be formed by combining the technical means disclosed in the respective embodiments.
本発明は、表示画像に含まれる文字を特定するための文字特定装置に利用することができる。特に、スマートフォン、タブレット端末、デジタルカメラなどに好適である。
The present invention can be used in a character identification device for identifying characters included in a display image. In particular, it is suitable for smartphones, tablet terminals, digital cameras, and the like.
1 スマートフォン(文字特定装置)、43 文字認識部(文字認識実行部)、44 選択文字列特定部(選択文字特定部)
1 Smartphone (character identification device), 43 Character recognition unit (character recognition execution unit), 44 Selected character string identification unit (selected character identification unit)
Claims (8)
- 表示画像に含まれる文字を特定するための文字特定装置であって、
上記表示画像に対するユーザ操作によって特定された上記表示画像内の特定領域を拡大した拡大領域に対し、文字認識を実行する文字認識実行部と、
上記文字認識実行部による文字認識によって得られる上記拡大領域に含まれる文字に設定された領域と、上記特定領域との重なり度合いが所定量を超える文字をユーザに選択された文字として特定する選択文字特定部と、を備えることを特徴とする文字特定装置。
A character identification device for identifying characters included in a display image, comprising:
a character recognition execution unit that performs character recognition on an enlarged area obtained by enlarging a specific area in the display image specified by a user operation on the display image; and
a selected character specifying unit that specifies, as a character selected by the user, a character for which the degree of overlap between the area set for the character included in the enlarged area obtained by the character recognition by the character recognition execution unit and the specific area exceeds a predetermined amount.
- 上記選択文字特定部は、上記文字認識実行部による文字認識によって得られる上記拡大領域に含まれる文字が形成する行ごとに設定される、上記行が存在する領域である行領域と上記特定領域との重なり度合いが第1の所定量を超えたとき、該行領域に含まれる文字を特定対象とすることを特徴とする請求項1に記載の文字特定装置。
The character identification device according to claim 1, wherein the selected character specifying unit targets for identification the characters included in a line area when the degree of overlap between the line area, which is set for each line formed by the characters included in the enlarged area obtained by the character recognition by the character recognition execution unit and is the area in which the line exists, and the specific area exceeds a first predetermined amount.
- 上記特定領域および上記行領域は矩形であり、
上記選択文字特定部は、
(1)上記行領域と上記特定領域とが重なっており、かつ、上記特定領域の境界線のうち、上記文字の進行方向と平行な2辺がいずれも上記行領域と重なっている、または、
(2)上記行領域と上記特定領域とが重なっており、かつ、上記特定領域の境界線のうち、上記文字の進行方向と平行な2辺がいずれも上記行領域と重なっていない場合、該行領域に含まれる文字を特定対象とすることを特徴とする請求項2に記載の文字特定装置。
The character identification device according to claim 2, wherein the specific area and the line area are rectangular, and the selected character specifying unit targets for identification the characters included in the line area when (1) the line area and the specific area overlap and both of the two sides of the boundary of the specific area that are parallel to the advancing direction of the characters overlap the line area, or (2) the line area and the specific area overlap and neither of the two sides of the boundary of the specific area that are parallel to the advancing direction of the characters overlaps the line area.
- 上記選択文字特定部は、上記特定領域の境界線のうち、上記文字の進行方向と平行な2辺のいずれか一方のみが上記行領域と重なっており、かつ、上記行領域と上記特定領域とが重なっている領域である行重畳領域における上記文字の進行方向と直角な辺の長さが、上記行領域における上記文字の進行方向と直角な辺の長さに対して所定割合以上の長さであるとき、該行領域に含まれる文字を特定対象とすることを特徴とする請求項3に記載の文字特定装置。
The character identification device according to claim 3, wherein the selected character specifying unit targets for identification the characters included in the line area when only one of the two sides of the boundary of the specific area that are parallel to the advancing direction of the characters overlaps the line area, and the length of the side perpendicular to the advancing direction of the characters in the line overlap area, which is the area where the line area and the specific area overlap, is at least a predetermined proportion of the length of the side perpendicular to the advancing direction of the characters in the line area.
- 上記選択文字特定部は、上記特定領域との重なり度合いが上記第1の所定量を超えた行領域に含まれる文字ごとに設定される、上記文字が存在する領域である文字領域と上記特定領域との重なり度合いが第2の所定量を超えたとき、該文字を特定する文字とする請求項2から4のいずれか1項に記載の文字特定装置。
The character identification device according to any one of claims 2 to 4, wherein the selected character specifying unit treats a character as a character to be identified when the degree of overlap between the character area, which is set for each character included in a line area whose degree of overlap with the specific area exceeds the first predetermined amount and is the area in which the character exists, and the specific area exceeds a second predetermined amount.
- 上記文字領域は矩形であり、
上記選択文字特定部は、
(1)上記文字領域と上記特定領域とが重なっており、かつ、上記特定領域の境界線のうち、上記文字の進行方向と直角な2辺がいずれも上記文字領域と重なっている、または、
(2)上記文字領域と上記特定領域とが重なっており、かつ、上記特定領域の境界線のうち、上記文字の進行方向と直角な2辺がいずれも上記文字領域と重なっていない場合、上記文字領域を設定された文字を、特定する文字とすることを特徴とする請求項5に記載の文字特定装置。
The character identification device according to claim 5, wherein the character area is rectangular, and the selected character specifying unit treats the character for which the character area is set as a character to be identified when (1) the character area and the specific area overlap and both of the two sides of the boundary of the specific area that are perpendicular to the advancing direction of the characters overlap the character area, or (2) the character area and the specific area overlap and neither of the two sides of the boundary of the specific area that are perpendicular to the advancing direction of the characters overlaps the character area.
- 上記選択文字特定部は、上記特定領域の境界線のうち、上記文字の進行方向と直角な2辺のいずれか一方のみが上記文字領域と重なっており、かつ、上記文字領域と上記特定領域とが重なっている領域である文字重畳領域における上記文字の進行方向と平行な辺の長さが、上記文字領域における上記文字の進行方向と直角な辺の長さに対して所定割合以上の長さであるとき、上記文字領域を設定された文字を、特定する文字とすることを特徴とする請求項6に記載の文字特定装置。
The character identification device according to claim 6, wherein the selected character specifying unit treats the character for which the character area is set as a character to be identified when only one of the two sides of the boundary of the specific area that are perpendicular to the advancing direction of the characters overlaps the character area, and the length of the side parallel to the advancing direction of the characters in the character overlap area, which is the area where the character area and the specific area overlap, is at least a predetermined proportion of the length of the side perpendicular to the advancing direction of the characters in the character area.
- 請求項1から7のいずれか1項に記載の文字特定装置としてコンピュータを機能させるための制御プログラムであって、コンピュータを上記各部として機能させるための制御プログラム。
A control program for causing a computer to function as the character identification device according to any one of claims 1 to 7, the control program causing the computer to function as each of the above units.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016514842A JP6170241B2 (en) | 2014-04-22 | 2015-04-03 | Character identification device and control program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014088333 | 2014-04-22 | ||
JP2014-088333 | 2014-04-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015163118A1 true WO2015163118A1 (en) | 2015-10-29 |
Family
ID=54332286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/060640 WO2015163118A1 (en) | 2014-04-22 | 2015-04-03 | Character specifying device, and control program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6170241B2 (en) |
WO (1) | WO2015163118A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106227398A (en) * | 2016-06-29 | 2016-12-14 | 宇龙计算机通信科技(深圳)有限公司 | A kind of camera image character displaying method and device |
JP2018124918A (en) * | 2017-02-03 | 2018-08-09 | 株式会社東芝 | Image processor, image processing method, and program |
JP2018180872A (en) * | 2017-04-12 | 2018-11-15 | 富士ゼロックス株式会社 | Document processing device and program |
CN111563497A (en) * | 2020-04-30 | 2020-08-21 | 广东小天才科技有限公司 | Frame question method and device based on movement track, electronic equipment and storage medium |
JP7471973B2 (en) | 2020-09-18 | 2024-04-22 | 東芝テック株式会社 | Information processing device and control program thereof |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3460339B2 (en) * | 1994-11-18 | 2003-10-27 | 松下電器産業株式会社 | Object selection device and method |
JP2014102669A (en) * | 2012-11-20 | 2014-06-05 | Toshiba Corp | Information processor, information processing method and program |
- 2015-04-03 WO PCT/JP2015/060640 patent/WO2015163118A1/en active Application Filing
- 2015-04-03 JP JP2016514842A patent/JP6170241B2/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012027721A (en) * | 2010-07-23 | 2012-02-09 | Sony Corp | Information processor, information processing method and information processing program |
JP2012243167A (en) * | 2011-05-20 | 2012-12-10 | Sharp Corp | Display device and display program |
JP2013171365A (en) * | 2012-02-20 | 2013-09-02 | Mitsubishi Electric Corp | Display device |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106227398A (en) * | 2016-06-29 | 2016-12-14 | 宇龙计算机通信科技(深圳)有限公司 | A kind of camera image character displaying method and device |
JP2018124918A (en) * | 2017-02-03 | 2018-08-09 | 株式会社東芝 | Image processor, image processing method, and program |
US10296802B2 (en) | 2017-02-03 | 2019-05-21 | Kabushiki Kaisha Toshiba | Image processing device, image processing method, and computer program product |
JP2018180872A (en) * | 2017-04-12 | 2018-11-15 | 富士ゼロックス株式会社 | Document processing device and program |
CN111563497A (en) * | 2020-04-30 | 2020-08-21 | 广东小天才科技有限公司 | Frame question method and device based on movement track, electronic equipment and storage medium |
CN111563497B (en) * | 2020-04-30 | 2024-04-16 | 广东小天才科技有限公司 | Frame question method and device based on moving track, electronic equipment and storage medium |
JP7471973B2 (en) | 2020-09-18 | 2024-04-22 | 東芝テック株式会社 | Information processing device and control program thereof |
Also Published As
Publication number | Publication date |
---|---|
JP6170241B2 (en) | 2017-07-26 |
JPWO2015163118A1 (en) | 2017-04-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6170241B2 (en) | Character identification device and control program | |
US10291843B2 (en) | Information processing apparatus having camera function and producing guide display to capture character recognizable image, control method thereof, and storage medium | |
US9489715B2 (en) | Image display apparatus and image display method | |
US10031667B2 (en) | Terminal device, display control method, and non-transitory computer-readable recording medium | |
KR102450236B1 (en) | Electronic apparatus, method for controlling thereof and the computer readable recording medium | |
CN106815809B (en) | Picture processing method and device | |
CN110297545B (en) | Gesture control method, gesture control device and system, and storage medium | |
US20160300321A1 (en) | Information processing apparatus, method for controlling information processing apparatus, and storage medium | |
JP6206580B2 (en) | Terminal device, display control method, and program | |
JP6164361B2 (en) | Terminal device, display control method, and program | |
JP6328409B2 (en) | Translation device | |
US10990802B2 (en) | Imaging apparatus providing out focusing and method for controlling the same | |
US9396405B2 (en) | Image processing apparatus, image processing method, and image processing program | |
JP6364182B2 (en) | Character string recognition apparatus and character string recognition method | |
WO2015045679A1 (en) | Information device and control program | |
JP6251075B2 (en) | Translation device | |
JP2015032261A (en) | Display device and control method | |
KR20140134844A (en) | Method and device for photographing based on objects | |
KR20140112919A (en) | Apparatus and method for processing an image | |
WO2015159498A1 (en) | Method and apparatus for displaying additional objects on a graphical user interface based on pinch gesture | |
US20240265729A1 (en) | Information processing apparatus, information processing system, information processing method, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15783839; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2016514842; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 15783839; Country of ref document: EP; Kind code of ref document: A1 |