WO2014103775A1 - Information processing device, information processing method, and program storage medium - Google Patents
- Publication number
- WO2014103775A1 (application PCT/JP2013/083599)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input
- graph
- information processing
- handwriting
- handwritten input
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/206—Drawing of charts or graphs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/32—Digital ink
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/32—Digital ink
- G06V30/36—Matching; Classification
Definitions
- the present technology relates to an information processing apparatus, an information processing method, and a program storage medium, and more particularly to an information processing apparatus, an information processing method, and a program storage medium that can improve work efficiency when creating a graphic.
- as a technique for recognizing handwritten characters, a method is used in which the determination reference character having the highest similarity to a handwritten input character is recognized as the input character.
- for example, Japanese Patent Application Laid-Open No. 2004-133830 (Patent Document 1) proposes improving recognition accuracy by using the number of strokes of the input handwriting together with the stroke-count data of the determination reference characters.
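The recognition idea described above can be sketched as follows: candidate reference characters are first filtered by stroke count, then the one most similar to the handwritten input is chosen. The feature vectors and the squared-distance similarity used here are illustrative assumptions, not the patent's actual method.

```python
# Hypothetical sketch: stroke-count filtering followed by nearest-match
# selection over feature vectors. Feature format is an assumption.

def recognize_character(input_features, input_strokes, references):
    """references: list of (char, stroke_count, feature_vector)."""
    # Keep only reference characters whose stroke count matches the input.
    candidates = [r for r in references if r[1] == input_strokes]
    if not candidates:
        candidates = references  # fall back to all references

    # Pick the candidate with the smallest feature distance (highest similarity).
    def distance(ref):
        return sum((a - b) ** 2 for a, b in zip(input_features, ref[2]))

    return min(candidates, key=distance)[0]

refs = [("一", 1, [1.0, 0.0]), ("二", 2, [1.0, 1.0]), ("三", 3, [1.0, 2.0])]
print(recognize_character([0.9, 1.1], 2, refs))  # → 二
```

With the stroke-count filter, only characters with the matching count are ever compared, which is the accuracy improvement the cited document describes.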
- spreadsheet software and presentation software with functions for creating graphs and tables are widely used.
- with such software, the user generally creates a graph by operating a mouse and keyboard to enter numbers into a pre-prepared frame, instructing the creation of a graph, and then specifying the number area to be reflected in the graph.
- in a tablet terminal device, a user's finger or a pen-shaped input device (hereinafter referred to as a “tablet pen”) is often used as the input method instead of a mouse or keyboard.
- the act of creating a graph involves complicated thinking such as trend analysis, and is often a joint work carried out while consulting with others.
- if the operability of the tablet terminal device or the like is poor and work efficiency decreases, the result of the work can be affected, so there has been a demand for improvement.
- the present technology has been made in view of such a situation, and makes it possible, in an electronic apparatus having a display device integrated with a touch panel, to improve work efficiency when creating graphics such as graphs.
- An information processing apparatus according to one aspect of the present technology includes an acquisition unit that acquires the content of handwriting input on a touch panel, and a display control unit that controls the display of a shaped graphic obtained by shaping a graphic recognized from the handwriting input according to the content of the handwriting input.
- the display control unit displays the shaped figure according to a parameter input by handwriting.
- the display control unit displays a shaped graphic reflecting the calculation result of a predetermined function executed based on the parameter.
- the graphic to be shaped and the parameter are associated with each other according to the line type used for the handwriting input.
- the display control unit displays a complemented figure obtained by complementing the figure recognized from the handwritten input according to the parameter.
- the acquisition unit acquires the content of the handwriting input on the screen for handwriting input on the touch panel.
- when the display control unit is instructed to return the display of the shaped figure to the state before shaping, the display control unit displays the content of the handwritten input before shaping again.
- the information processing apparatus may further include a recognizing unit that recognizes the content of the handwritten input, and a shaping unit that shapes a figure recognized from the handwritten input according to the recognition result of the recognizing unit.
- the information processing apparatus may be an independent apparatus or an internal block constituting one apparatus.
- the information processing method or program storage medium according to one aspect of the present technology is an information processing method or program storage medium corresponding to the information processing apparatus according to one aspect of the present technology.
- in one aspect of the present technology, the content of handwritten input to the touch panel is acquired, and the display of a shaped figure, obtained by shaping a figure recognized from the handwritten input according to the content of the handwritten input, is controlled.
- FIG. 1 is a diagram illustrating an appearance of a tablet terminal device to which the present technology is applied.
- the tablet terminal device 10 has a plate-like housing of a size that is easy to carry, and can therefore be used regardless of location.
- the tablet terminal device 10 has a touch panel, in which a touch sensor and a display unit are integrated, together with processing functions as a computer main body.
- FIG. 2 is a diagram illustrating an internal configuration of a tablet terminal device to which the present technology is applied.
- the tablet terminal device 10 illustrated in FIG. 1 includes a control unit 101, a memory unit 102, an operation unit 103, a touch panel 104, a speaker 105, a recording unit 106, a communication unit 107, and a power supply unit 108.
- the control unit 101 controls the operation of each unit of the tablet terminal device 10.
- the memory unit 102 temporarily stores various data according to the control from the control unit 101.
- the operation unit 103 is a button or the like provided on the tablet terminal device 10 and supplies an operation signal to the control unit 101 in accordance with a user operation.
- the control unit 101 controls the operation of each unit in response to an operation signal from the operation unit 103.
- the touch panel 104 includes a display unit 151 and a touch sensor 152 superimposed on the screen.
- the display unit 151 is configured by a liquid crystal display (LCD: Liquid Crystal Display) or the like, and displays various types of information according to control from the control unit 101.
- the touch sensor 152 detects a handwriting input operation performed on the touch panel 104 using the tablet pen 20 or the like, together with the position on the touch panel 104 where the operation is performed, and supplies the detection signal to the control unit 101.
- the operations detected by the touch sensor 152 include, for example, bringing the tablet pen 20 into contact with the surface of the touch panel 104, moving the tablet pen 20 while it is in contact with the surface of the touch panel 104, and releasing the tablet pen 20 from the surface of the touch panel 104.
- the touch panel 104 can employ various detection methods such as a capacitance method and an electromagnetic induction method.
- the speaker 105 outputs sound in accordance with control from the control unit 101.
- the recording unit 106 includes, for example, an HDD (Hard Disk Drive) or the like.
- the recording unit 106 records various data according to control from the control unit 101.
- the communication unit 107 communicates with various servers (not shown) via a network according to the control from the control unit 101.
- the power supply unit 108 supplies power supplied from a storage battery or an external power supply to each unit including the control unit 101.
- the tablet terminal device 10 is configured as described above.
- FIG. 3 is a diagram illustrating a functional configuration example of the control unit 101 in FIG.
- the control unit 101 includes a handwritten input content acquisition unit 171, a handwriting input recognition unit 172, an arithmetic processing unit 173, a complementing processing unit 174, a shaping conversion processing unit 175, and a display control unit 176.
- the handwritten input content acquisition unit 171 acquires the content of the handwritten input to the touch panel 104 based on the detection signal from the touch sensor 152 and supplies the content to the handwritten input recognition unit 172.
- the handwriting input recognition unit 172 recognizes the content of the handwriting input from the handwritten input content acquisition unit 171, and supplies the recognition result to the arithmetic processing unit 173, the complementing processing unit 174, and the shaping conversion processing unit 175.
- the calculation processing unit 173 performs predetermined calculation processing based on the recognition result from the handwriting input recognition unit 172, and supplies the calculation result to the shaping conversion processing unit 175 or the display control unit 176.
- the complement processing unit 174 performs predetermined complement processing based on the recognition result from the handwriting input recognition unit 172, and supplies the complement result to the shaping conversion processing unit 175.
- the shaping conversion processing unit 175 shapes the graph recognized from the handwriting input based on the recognition result from the handwriting input recognition unit 172, and supplies the graph to the display control unit 176.
- when the shaping conversion processing unit 175 is supplied with the calculation result from the arithmetic processing unit 173 and/or the complementing result from the complementing processing unit 174, it shapes the graph recognized from the handwritten input based on those results.
- the display control unit 176 causes the display unit 151 to display the graph after shaping from the shaping conversion processing unit 175. In addition, the display control unit 176 causes the display unit 151 to display the calculation result from the calculation processing unit 173.
- the control unit 101 is configured as described above.
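The data flow among the functional blocks of FIG. 3 can be sketched as a single driver: acquired handwriting is recognized, optional calculation and completion results feed the shaping step, and the shaped result goes to display. All functions here are stand-ins for units 171 through 176, not real APIs.

```python
# Minimal sketch of the control unit 101 pipeline; every callable is a
# hypothetical stand-in for the corresponding functional block.

def process_handwriting(raw_input, recognize, calculate, complement, shape, display):
    recognition = recognize(raw_input)    # handwriting input recognition unit 172
    calc_result = calculate(recognition)  # arithmetic processing unit 173 (may be None)
    completed = complement(recognition)   # complementing processing unit 174
    shaped = shape(completed, calc_result)  # shaping conversion processing unit 175
    display(shaped)                       # display control unit 176
    return shaped

shown = []
result = process_handwriting(
    "rough pie chart",
    recognize=lambda raw: {"type": "pie", "values": [40, 52]},
    calculate=lambda rec: None,
    complement=lambda rec: {**rec, "values": rec["values"] + [8]},
    shape=lambda rec, calc: rec["values"],
    display=shown.append,
)
print(result)  # → [40, 52, 8]
```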
- FIG. 4 is a diagram for explaining a usage example 1 of the tablet terminal device 10.
- the presentation software is activated and the graph already explained in the presentation is displayed (S11).
- when the user inputs a pie chart by handwriting using the tablet pen 20, a graph g1 corresponding to the handwriting input is displayed on the touch panel 104 (S12).
- this graph g1 is composed of three fan-shaped areas, but because it was input by handwriting with the tablet pen 20, it is not a perfect circle, and the length of each arc is not proportional to the ratio of the input numerical values. Specifically, “40” is input as the ratio for the first fan-shaped area and “52” for the second, while no numerical value is input for the remaining fan-shaped area; from the relationship with the numerical values input for the other areas, however, it can be estimated that “8” should be input as its ratio. In addition, the lengths of the handwritten fan-shaped arcs are not proportional to the ratios of those numerical values.
- a button 211 for instructing the shaping conversion of the graph input by handwriting is displayed on the touch panel 104.
- the user operates the button 211 on the touch panel 104 using the tablet pen 20 (S13).
- when the button 211 is operated, the tablet terminal device 10 performs recognition processing of the handwritten numerical values and graph and arithmetic processing according to the numerical values, and then performs shaping conversion of the handwritten graph g1. As a result, the touch panel 104 displays a graph g2 obtained by shaping and converting the handwritten graph g1 (S14).
- the graph g2 after this shaping conversion is a perfect circle, and the length of each fan-shaped arc is proportional to the ratio of the numerical values input for the area.
- “40” is displayed in the first fan-shaped area and “52” in the second, and “8”, which was not input, is complemented and displayed in the remaining fan-shaped area.
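The completion step described above can be sketched as a small function that infers the one missing ratio so that all sectors sum to the whole; the total of 100 is an assumption matching the 40 + 52 + 8 example.

```python
# Hypothetical sketch of inferring the blank sector of a handwritten pie chart.

def complement_ratios(ratios, total=100):
    """ratios: list of numbers, with None for the sector left blank."""
    missing = [i for i, r in enumerate(ratios) if r is None]
    if len(missing) == 1:
        filled = list(ratios)
        filled[missing[0]] = total - sum(r for r in ratios if r is not None)
        return filled
    return list(ratios)  # zero or several blanks: nothing to infer safely

print(complement_ratios([40, 52, None]))  # → [40, 52, 8]
```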
- FIG. 5 is a diagram for explaining a usage example 2 of the tablet terminal device 10.
- a button 221 for returning the shaped and converted graph to the original handwritten input state is displayed on the touch panel 104.
- the user operates the button 221 on the touch panel 104 using the tablet pen 20 (S23).
- the tablet terminal device 10 displays the graph g1 again instead of the graph g2 (S24). That is, since the data of the handwritten graph g1 is temporarily stored in the memory unit 102, when the button 221 is operated, the data is read out and redisplayed on the touch panel 104.
- to shape and convert the graph again, the button 211 may be operated once more. In this way, the accuracy of graph creation on the tablet terminal device 10 can be increased.
- as described above, the tablet terminal device 10 can display a cleanly shaped pie chart from rough handwriting input and simple operations, and the shaped pie chart can be corrected, so work efficiency when creating graphs can be improved. As a result, communication during a presentation can be facilitated. Further, since a graph can be created with the tablet pen 20 instead of operating a software keyboard and icons, the intuitiveness of the operation can be improved.
- FIG. 8 is a diagram for explaining a usage example 3 of the tablet terminal device 10.
- the presentation software is activated and the graph already explained in the presentation is displayed (S31).
- when the user inputs calculation information by handwriting, the calculation information f1 corresponding to the handwritten input is displayed on the touch panel 104 (S32).
- the calculation information includes, for example, a function such as an operator and its parameters, but it is also possible to specify only the parameters by setting a predetermined function in advance.
- a button 211 for instructing a graph shaping conversion is displayed on the touch panel 104.
- the user operates the button 211 on the touch panel 104 using the tablet pen 20 (S33).
- when the button 211 is operated, the tablet terminal device 10 performs calculation processing according to the handwritten calculation information, and the calculation result is reflected in the graph. Specifically, since “×2” is input by handwriting in the area above the second bar from the left, an operation doubling the value of that bar is performed, and the graph is shaped and converted accordingly. As a result, the graph g7 reflecting the calculation result is displayed on the touch panel 104 (S34).
- FIG. 9 shows a specific example in which “×2” is input by handwriting in a predetermined area above the rightmost of the three bars displayed on the touch panel 104.
- in this case, a calculation doubling the value of the rightmost bar is performed according to the calculation information f2, and the graph g8, in which the value of that bar has increased from “25” to “50”, is displayed.
- in the examples above, the calculation result is reflected in a graph prepared in advance, but the calculation result may also be reflected in a bar graph input by handwriting.
- in FIG. 10, after a graph g9 containing three bars is input by handwriting, “×2” is input by handwriting in a predetermined area corresponding to the rightmost bar. In this case, an operation doubling the value of the rightmost bar is performed, the calculation result is reflected, and the value of the rightmost bar increases from “25” to “50”.
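Applying handwritten calculation information such as “×2” to a bar value, as in the 25 → 50 example above, can be sketched as follows; the annotation grammar (one operator character followed by a number) is an assumption for illustration.

```python
# Hypothetical sketch of applying a handwritten annotation to a bar value.

def apply_annotation(value, annotation):
    op, number = annotation[0], float(annotation[1:])
    if op in ("×", "*"):
        return value * number
    if op in ("÷", "/"):
        return value / number
    if op == "+":
        return value + number
    if op in ("−", "-"):
        return value - number
    raise ValueError(f"unknown operator: {op}")

bars = [10, 30, 25]
bars[2] = apply_annotation(bars[2], "×2")  # annotation written over the rightmost bar
print(bars)  # → [10, 30, 50.0]
```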
- FIG. 11 is a diagram for explaining a usage example 4 of the tablet terminal device 10.
- the presentation software is activated and the graph already explained in the presentation is displayed (S41).
- the user uses the tablet pen 20 to input “×2” by handwriting, with a first line type, in a predetermined area above the second bar from the left among the eight bars displayed on the touch panel 104, and calculation information f4 corresponding to this handwriting input is displayed on the touch panel 104.
- likewise, when “×2” is input by handwriting with a specific line type in a predetermined area of the seventh bar from the left, calculation information f5 corresponding to that handwriting input is displayed on the touch panel 104 (S42).
- a button 211 for instructing a graph shaping conversion is displayed on the touch panel 104.
- the user operates the button 211 on the touch panel 104 using the tablet pen 20 (S43).
- when the button 211 is operated, the tablet terminal device 10 performs calculation processing according to the calculation information that was input by handwriting with the specific line type, among the plural pieces of handwritten calculation information, and reflects the calculation result in the graph. Specifically, when a red line is set as the specific line type, since “×2” is input by handwriting as a red line in the predetermined area of the seventh bar from the left, an operation doubling the value of that bar is performed according to the calculation information f5, and the graph is shaped and converted according to the operation result. As a result, the graph g11 reflecting the calculation result is displayed on the touch panel 104 (S44).
- the specific line type is not limited to color; any attribute that can associate the graph with the calculation information and thereby specify the calculation target may be used, such as the thickness of the line or the kind of line, for example a solid line or a broken line.
- the specific line type is set on a predetermined setting screen and can be changed as appropriate.
- in this way, by inputting operation information (for example, “×2”) with the specific line type (for example, a red line) for the bar for which a calculation result is desired, the graph reflecting the calculation result can be displayed, so work efficiency when creating graphs can be improved.
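The line-type association above can be sketched as a filter: among several handwritten annotations, only those drawn with the configured specific line type (here, red) are treated as calculation information. The annotation record format is an assumption for illustration.

```python
# Hypothetical sketch of selecting calculation information by line type.

def select_calculations(annotations, specific_line_type="red"):
    """annotations: list of dicts with 'line_type', 'target', 'text'."""
    return [a for a in annotations if a["line_type"] == specific_line_type]

annotations = [
    {"line_type": "black", "target": 1, "text": "×2"},  # ignored: wrong line type
    {"line_type": "red",   "target": 6, "text": "×2"},  # applied: specific line type
]
selected = select_calculations(annotations)
print([a["target"] for a in selected])  # → [6]
```

Changing `specific_line_type` on a settings screen, as the text describes, would simply change the filter value.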
- FIG. 12 is a diagram for explaining a usage example 5 of the tablet terminal device 10.
- the presentation software is activated and the graph already explained in the presentation is displayed (S51).
- when the user performs a flick operation on the touch panel 104, a layer 231 for handwriting input is displayed (S52).
- the handwriting input layer 231 is translucent and is superimposed on a part of the presentation graph, so that the graph or the like can be input by handwriting.
- the flick operation refers to an operation of placing a finger on a desired area of the touch panel 104 and sliding it in a predetermined direction; in the example of FIG. 12, the finger is slid leftward on the touch panel 104, so a left flick operation is performed.
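Flick detection as described above can be sketched by comparing the start and end points of the slide and reporting the dominant direction; the threshold and the screen-coordinate convention are assumptions.

```python
# Hypothetical sketch of classifying a flick from its start and end points.

def flick_direction(start, end, threshold=10):
    dx, dy = end[0] - start[0], end[1] - start[1]
    if max(abs(dx), abs(dy)) < threshold:
        return None  # too short a movement to count as a flick
    if abs(dx) >= abs(dy):
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"  # screen y grows downward

print(flick_direction((200, 100), (80, 105)))  # → left
```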
- a button 211 for instructing a graph shaping conversion is displayed on the touch panel 104.
- the user operates the button 211 on the touch panel 104 using the tablet pen 20 (S54).
- when the button 211 is operated, the tablet terminal device 10 performs recognition processing of the handwritten numerical values and graph and arithmetic processing according to the numerical values, and then shapes and converts the handwritten graph g12. As a result, a graph g13 obtained by shaping and converting the handwritten graph g12 is displayed on the layer 231 of the touch panel 104.
- in usage example 5, by inputting a graph by handwriting on the layer 231 displayed on the touch panel 104, the shaped and converted graph can be displayed on the layer 231. That is, from the user's point of view, when creating a new graph, the layer 231 is displayed, and if a rough graph is handwritten on the layer 231, the tablet terminal device 10 displays a cleanly drawn version of it, so work efficiency when creating graphs can be improved. In addition, even when there is no space for displaying a graph on the touch panel 104, handwriting input can be performed on the layer 231, so a graph can reliably be input by hand and a clean graph can be displayed.
- the layer 231 may have a predetermined transmittance as shown in FIG. 12, but it may instead be completely opaque so that the graph displayed in the layer below the layer 231 is not visible; in this case, only the handwritten graph is displayed on the layer 231.
- in the above description, a graph is mainly described as the object of the shaping conversion, but when a calculation formula is input by handwriting, the calculation result may be displayed. For example, when 3 × √2 / 2 is input by handwriting, the calculation result 2.12… is displayed together with the calculation formula.
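Displaying a calculation result next to a recognized formula, using the 3 × √2 / 2 example above, can be sketched as follows. Real handwriting recognition would produce the expression; here it is given directly as a string, and the tiny evaluator supports only the symbols needed for this example.

```python
# Hypothetical sketch of evaluating a recognized handwritten formula.
import math
import re

def evaluate(expression):
    # Rewrite the handwriting symbols into Python syntax, then evaluate.
    text = expression.replace("×", "*").replace("÷", "/")
    text = re.sub(r"√(\d+(?:\.\d+)?)", r"math.sqrt(\1)", text)
    return eval(text, {"math": math, "__builtins__": {}})

result = evaluate("3 × √2 / 2")
print(f"3 × √2 / 2 = {result:.2f}")  # → 3 × √2 / 2 = 2.12
```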
- in the above description, presentation software is described as an example of the graph creation software executed by the tablet terminal device 10, but other software capable of creating graphs, such as spreadsheet software, may also be used.
- in step S111, the handwritten input content acquisition unit 171 determines, based on the detection signal from the touch sensor 152, whether or not handwriting input with the tablet pen 20 on the touch panel 104 has started. As shown in Usage Example 5 in FIG. 12, the handwriting input may be performed on the layer 231 for handwriting input. When handwriting input has started, the process proceeds to step S112.
- step S112 the handwritten input content acquisition unit 171 acquires the content of the handwritten input based on the detection signal from the touch sensor 152.
- step S113 the handwritten input content acquisition unit 171 determines whether or not the handwriting input by the tablet pen 20 on the touch panel 104 has ended based on the detection signal from the touch sensor 152.
- if it is determined in step S113 that the handwriting input has not ended, the process returns to step S112 and the subsequent processing is repeated. That is, the handwritten input content acquisition unit 171 continues to acquire the content of the handwritten input from the start to the end of the handwriting input. When it is determined that the handwriting input has ended, the process proceeds to step S114.
- step S114 the control unit 101 determines based on the detection signal from the touch sensor 152 whether or not the button 211 for instructing the shaping conversion of the handwritten input graph has been operated.
- if it is determined in step S114 that the button 211 has not been operated, further handwriting input may follow, so the process returns to step S111 and the subsequent processing is repeated. Note that the handwritten input data acquired by the handwritten input content acquisition unit 171 is stored in the memory unit 102 in preparation for an instruction to redisplay the handwritten input state by operation of the button 221, as shown in Usage Example 2 of FIG. 5. If it is determined in step S114 that the button 211 has been operated, the process proceeds to step S115.
- although the operation of the button 211 displayed on the touch panel 104 is used here as the determination condition in step S114, the instruction is not limited to operation of the button 211; the shaping conversion of the handwritten graph may be instructed by any predetermined operation on the tablet terminal device 10.
- step S115 the handwriting input recognition unit 172 recognizes the content of the handwriting input acquired by the handwriting input content acquisition unit 171.
- in the recognition processing of the content of the handwritten input, for example, processing for distinguishing characters from graphs, character recognition, recognition of individual figures, recognition of a figure composed of a plurality of figures, and the like are performed.
- in this recognition processing, the figures given as teacher data and the figure handwritten by the user are both mapped into a feature parameter space, and the teacher-data figure closest to the handwritten figure in that space is returned as the recognition result.
- as the feature amounts, the angle of the line at the start point and end point, the distance between the start point and the end point, the total stroke length of the input handwriting, and the like are generally used (see, for example, Non-Patent Document 1 below).
- Non-Patent Document 1: Rubine, D.: “Specifying Gestures by Example”, In Proc. of ACM SIGGRAPH '91 (1991), pp. 329-337.
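The stroke features listed above (angle at the start and end, start-to-end distance, total stroke length) can be sketched as follows, in the spirit of the Rubine feature set; this is a partial illustration, not the full algorithm.

```python
# Hypothetical sketch of a few stroke features used for figure recognition.
import math

def stroke_features(points):
    """points: list of (x, y) samples along one handwritten stroke."""
    x0, y0 = points[0]
    xn, yn = points[-1]
    start_angle = math.atan2(points[1][1] - y0, points[1][0] - x0)
    end_angle = math.atan2(yn - points[-2][1], xn - points[-2][0])
    endpoint_distance = math.hypot(xn - x0, yn - y0)
    total_length = sum(
        math.hypot(b[0] - a[0], b[1] - a[1]) for a, b in zip(points, points[1:])
    )
    return start_angle, end_angle, endpoint_distance, total_length

feats = stroke_features([(0, 0), (3, 4), (6, 8)])
print(feats[2], feats[3])  # endpoint distance and total length (both 10.0 here)
```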
- a method for recognizing the contents of handwritten input for example, a pattern recognition method (for example, see Japanese Patent Publication No. 7-107708) or a method for calculating an angle histogram distribution (for example, see Japanese Patent No. 3046472). These known techniques may be used.
- step S116 the handwriting input recognition unit 172 determines whether or not a graph is included in the contents of the handwriting input based on the recognition result in step S115. If it is determined in step S116 that the handwritten input includes a graph, the process proceeds to step S117.
- in step S117, the arithmetic processing unit 173 determines, based on the recognition result in step S115, whether calculation information is included in the content of the handwritten input. If it is determined in step S117 that calculation information such as a function and its parameters (for example, “×2”) is included, the process proceeds to step S118.
- in step S118, the arithmetic processing unit 173 performs predetermined calculation processing based on the handwritten calculation information. For example, as shown in Usage Example 3 in FIG. 8, when “×2” is input in a predetermined area on the touch panel 104, an operation doubling the value of the target bar is performed.
- step S118 ends, the process proceeds to step S119. If it is determined in step S117 that the calculation information is not included in the contents of the handwritten input, the calculation process in step S118 is skipped, and the process proceeds to step S119.
- In step S119, the complement processing unit 174 determines whether or not the handwritten input graph can be complemented based on the recognition result in step S115. If it is determined in step S119 that the graph can be complemented, the process proceeds to step S120.
- In step S120, the complement processing unit 174 performs predetermined complement processing on the graph input by handwriting. For example, as shown in Usage Example 1 in FIG. 4, when there is an area for which a ratio has not been input among the fan-shaped areas of the pie chart input by handwriting, a numerical value indicating the ratio to be input for that area (e.g., "8") is complemented.
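The complement in this example can be derived from the constraint that pie-chart ratios total 100. The helper below is an illustrative sketch; the sample breakdown in which the other sectors total 92 (so the missing value is 8) is an assumption, not taken from the figure.

```python
def complement_ratio(ratios):
    """Fill in the single missing pie-chart ratio (marked None) so that
    all sector ratios total 100."""
    missing = 100 - sum(r for r in ratios if r is not None)
    return [missing if r is None else r for r in ratios]
```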
- When the complementing process in step S120 ends, the process proceeds to step S121. If it is determined in step S119 that the graph cannot be complemented, the complementing process in step S120 is skipped, and the process proceeds to step S121.
- In step S121, the shaping conversion processing unit 175 determines whether or not the handwritten input graph can be shaped based on the recognition result in step S115. If it is determined in step S121 that the handwritten input graph can be shaped, the process proceeds to step S122.
- In step S122, the shaping conversion processing unit 175 performs predetermined shaping conversion processing on the graph input by handwriting.
- For example, the handwritten pie chart is not a perfect circle, and the length of each arc is not proportional to the input numerical ratios; in this case, the graph is shaped and converted into a pie chart that is a perfect circle and in which the length of each arc is proportional to the numerical ratio input for each area.
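The geometric rule stated here, a perfect circle whose arc lengths are proportional to the input ratios, amounts to assigning each sector a central angle proportional to its ratio. A sketch under that assumption (the function name is illustrative):

```python
def sector_angles(ratios):
    """Return (start, end) central angles in degrees for each pie sector,
    so that arc lengths are proportional to the given ratios on a
    perfect circle."""
    total = sum(ratios)
    angles, start = [], 0.0
    for r in ratios:
        sweep = 360.0 * r / total   # on a circle, arc length ∝ angle
        angles.append((start, start + sweep))
        start += sweep
    return angles
```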
- When the shaping conversion process in step S122 is completed, the process proceeds to step S123. If it is determined in step S121 that the graph cannot be shaped, the shaping conversion process in step S122 is skipped, and the process proceeds to step S123.
- In step S123, the display control unit 176 displays the graph on which the various processes have been performed on the display unit 151. That is, the graph displayed on the display unit 151 is a graph on which at least one of the calculation processing in step S118, the complementing processing in step S120, and the shaping conversion processing in step S122 has been performed.
- For example, the graph after shaping conversion in Usage Example 1 (S14 in FIG. 4), the graph reflecting the calculation result in Usage Example 3 (S34 in FIG. 8), the graph in Usage Example 4 in which the calculation result is reflected only for the calculation information handwritten with the specific line type (S44 in FIG. 11), or the graph after shaping conversion on the handwriting input layer 231 in Usage Example 5 (S55 in FIG. 12) is displayed on the display unit 151.
- When an instruction to return the display to the handwritten input state is given, the display control unit 176 reads the handwritten input data held in the memory unit 102 and causes the display unit 151 to redisplay the graph in its handwritten input state.
- If it is determined in step S116 that the content of the handwritten input does not include a graph, the process proceeds to step S124.
- In step S124, the calculation processing unit 173 determines whether or not calculation information is included in the contents of the handwritten input. If it is determined in step S124 that calculation information is included in the content of the handwritten input, the process proceeds to step S125.
- In step S125, the calculation processing unit 173 performs predetermined calculation processing based on the calculation information input by handwriting.
- the process proceeds to step S123.
- In step S123, the display control unit 176 displays the result of the calculation processing in step S125 on the display unit 151. As a result, a calculation result as shown in FIG. 13 is displayed on the touch panel 104.
- If it is determined in step S124 that calculation information is not included in the contents of the handwritten input, the process returns to step S111, and the subsequent processes are repeated.
- Note that the control unit 101 may cause a predetermined server (not shown) to execute the processing via a network and acquire only the processing result from the server.
- As described above, in the handwriting input shaping conversion process, when handwriting input is performed, at least one of the calculation process in step S118, the complementing process in step S120, and the shaping conversion process in step S122 is performed before the graph is displayed, so the work efficiency when creating a graph can be improved.
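The branching in steps S116 through S123 can be summarized as a small dispatcher. The dictionary keys standing in for the step S115 recognition result are assumptions for illustration, not names from the patent.

```python
def process_handwriting(recognition):
    """Illustrative dispatch mirroring steps S116-S123.

    recognition: dict with keys 'has_graph', 'calc_info',
    'can_complement', and 'can_shape', standing in for the
    recognition result of step S115 (assumed names).
    """
    if not recognition["has_graph"]:               # S116: no graph found
        if recognition["calc_info"] is not None:
            return "show calculation result"       # S125, then S123
        return "wait for more input"               # back to S111
    steps = []
    if recognition["calc_info"] is not None:
        steps.append("calculate")                  # S118
    if recognition["can_complement"]:
        steps.append("complement")                 # S120
    if recognition["can_shape"]:
        steps.append("shape")                      # S122
    steps.append("display")                        # S123
    return steps
```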
- In the above description, a pie chart and a bar chart have been described as examples, but other chart types can also be targeted, such as cylindrical, conical, or pyramidal vertical bar charts, horizontal bar charts, line charts, area charts, scatter charts, candlestick charts, bubble charts, contour graphs, donut charts, and radar charts.
- In the above description, numerical values and operators have been described as examples of parameters for shaping figures such as graphs, but other parameters, such as characters or symbols, or combinations thereof, may also be used.
- the tablet terminal device has been described as an example.
- However, the present technology is not limited thereto and can be applied to electronic devices having a touch panel, such as mobile phones, smartphones, and personal computers.
- the series of processes described above can be executed by hardware or can be executed by software.
- a program constituting the software is installed in the computer.
- Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 15 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
- In the computer 300, a CPU (Central Processing Unit) 311, a ROM (Read Only Memory) 312, and a RAM (Random Access Memory) 313 are connected to one another via a bus 314.
- An input / output interface 315 is further connected to the bus 314.
- An input unit 316, an output unit 317, a recording unit 318, a communication unit 319, and a drive 320 are connected to the input / output interface 315.
- the input unit 316 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 317 includes a display, a speaker, and the like.
- the recording unit 318 includes a hard disk, a nonvolatile memory, and the like.
- the communication unit 319 includes a network interface or the like.
- the drive 320 drives a removable medium 321 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- The CPU 311 loads the program stored in the recording unit 318 into the RAM 313 via the input/output interface 315 and the bus 314 and executes it, whereby the above-described series of processing is performed.
- the program executed by the computer 300 can be provided by being recorded on a removable medium 321 as a package medium, for example.
- the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the recording unit 318 via the input / output interface 315 by attaching the removable medium 321 to the drive 320.
- the program can be received by the communication unit 319 via a wired or wireless transmission medium and installed in the recording unit 318.
- the program can be installed in the ROM 312 or the recording unit 318 in advance.
- The program executed by the computer 300 may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at a necessary timing, such as when a call is made.
- In this specification, the processing steps describing the program that causes the computer 300 to perform various processes do not necessarily have to be processed in time series in the order described in the flowcharts, and include processing executed in parallel or individually (for example, parallel processing or processing by objects).
- the program may be processed by one computer, or may be processed in a distributed manner by a plurality of computers. Furthermore, the program may be transferred to a remote computer and executed.
- In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- the present technology can take a configuration of cloud computing in which one function is shared by a plurality of devices via a network and is jointly processed.
- each step described in the above-described flowchart can be executed by one device or can be shared by a plurality of devices.
- the plurality of processes included in the one step can be executed by being shared by a plurality of apparatuses in addition to being executed by one apparatus.
- Note that the present technology can also be configured as follows.
- (1) An information processing apparatus including: an acquisition unit that acquires the content of handwritten input on a touch panel; and a display control unit that controls display of a shaped graphic obtained by shaping a graphic recognized from the handwritten input according to the content of the handwritten input.
- (2) The information processing apparatus according to (1), in which the display control unit displays the shaped figure according to a parameter input by handwriting.
- (3) The information processing apparatus according to (2), in which the display control unit displays a shaped graphic reflecting a calculation result of a predetermined function executed based on the parameter.
- (4) The information processing apparatus according to (3), in which the graphic to be shaped and the parameter are associated with each other by the line type used for the handwritten input.
- (8) The information processing apparatus according to any one of (1) to (7), further including: a recognition unit that recognizes the content of the handwritten input; and a shaping unit that shapes the figure recognized from the handwritten input according to a recognition result of the recognition unit.
- (9) An information processing method including the steps of: acquiring, by the information processing apparatus, the content of handwritten input on a touch panel; and controlling display of a shaped graphic obtained by shaping a graphic recognized from the handwritten input according to the content of the handwritten input.
- (10) A program storage medium storing a program that causes a computer to function as: an acquisition unit that acquires the content of handwritten input on a touch panel; and a display control unit that controls display of a shaped figure obtained by shaping a figure recognized from the handwritten input according to the content of the handwritten input.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Character Discrimination (AREA)
Abstract
Description
FIG. 4 is a diagram for explaining Usage Example 1 of the tablet terminal device 10.
FIG. 5 is a diagram for explaining Usage Example 2 of the tablet terminal device 10.
FIG. 8 is a diagram for explaining Usage Example 3 of the tablet terminal device 10.
FIG. 11 is a diagram for explaining Usage Example 4 of the tablet terminal device 10.
FIG. 12 is a diagram for explaining Usage Example 5 of the tablet terminal device 10.
In the usage examples described above, graphs were mainly described as the targets of shaping conversion, but as shown in FIG. 13, when a desired arithmetic expression is input by handwriting, its calculation result may be displayed. For example, when 3×√2/2 is input by handwriting, the calculation result, 2.12…, is written out neatly and displayed together with the expression.
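Evaluating a recognized expression such as 3×√2/2 only requires mapping the handwritten symbols to arithmetic operations. The sketch below is illustrative: only the two substitutions needed for this example are handled, and a real implementation would use a proper expression parser rather than `eval`.

```python
import math
import re

def evaluate_recognized(expr):
    """Evaluate a recognized handwritten arithmetic expression.

    Only illustrative substitutions are handled: '×' becomes '*',
    and '√N' becomes math.sqrt(N). eval() is used for brevity in
    this sketch only.
    """
    py = expr.replace("×", "*")
    py = re.sub(r"√(\d+(?:\.\d+)?)", r"math.sqrt(\1)", py)
    return eval(py, {"math": math})
```

For "3×√2/2" this yields about 2.12, matching the displayed result.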
(1)
An information processing apparatus including:
an acquisition unit that acquires the content of handwritten input on a touch panel; and
a display control unit that controls display of a shaped figure obtained by shaping a figure recognized from the handwritten input according to the content of the handwritten input.
(2)
The information processing apparatus according to (1), in which the display control unit displays the shaped figure according to a parameter input by handwriting.
(3)
The information processing apparatus according to (2), in which the display control unit displays a shaped figure reflecting a calculation result of a predetermined function executed based on the parameter.
(4)
The information processing apparatus according to (3), in which the figure to be shaped and the parameter are associated with each other by the line type used for the handwritten input.
(5)
The information processing apparatus according to (2), in which the display control unit displays a complemented figure obtained by complementing the figure recognized from the handwritten input according to the parameter.
(6)
The information processing apparatus according to any one of (1) to (5), in which the acquisition unit acquires the content of the handwritten input on a screen for handwriting input on the touch panel.
(7)
The information processing apparatus according to any one of (1) to (6), in which the display control unit, when an instruction to return the display of the shaped figure to the state before shaping is given, redisplays the content of the handwritten input before shaping.
(8)
The information processing apparatus according to any one of (1) to (7), further including:
a recognition unit that recognizes the content of the handwritten input; and
a shaping unit that shapes the figure recognized from the handwritten input according to the result of recognition by the recognition unit.
(9)
An information processing method in which an information processing apparatus
acquires the content of handwritten input on a touch panel, and
controls display of a shaped figure obtained by shaping a figure recognized from the handwritten input according to the content of the handwritten input.
(10)
A program storage medium storing a program that causes a computer to function as:
an acquisition unit that acquires the content of handwritten input on a touch panel; and
a display control unit that controls display of a shaped figure obtained by shaping a figure recognized from the handwritten input according to the content of the handwritten input.
Claims (10)
- An information processing apparatus including: an acquisition unit that acquires the content of handwritten input on a touch panel; and a display control unit that controls display of a shaped figure obtained by shaping a figure recognized from the handwritten input according to the content of the handwritten input.
- The information processing apparatus according to claim 1, in which the display control unit displays the shaped figure according to a parameter input by handwriting.
- The information processing apparatus according to claim 2, in which the display control unit displays a shaped figure reflecting a calculation result of a predetermined function executed based on the parameter.
- The information processing apparatus according to claim 3, in which the figure to be shaped and the parameter are associated with each other by the line type used for the handwritten input.
- The information processing apparatus according to claim 2, in which the display control unit displays a complemented figure obtained by complementing the figure recognized from the handwritten input according to the parameter.
- The information processing apparatus according to claim 1, in which the acquisition unit acquires the content of the handwritten input on a screen for handwriting input on the touch panel.
- The information processing apparatus according to claim 1, in which the display control unit, when an instruction to return the display of the shaped figure to the state before shaping is given, redisplays the content of the handwritten input before shaping.
- The information processing apparatus according to claim 1, further including: a recognition unit that recognizes the content of the handwritten input; and a shaping unit that shapes the figure recognized from the handwritten input according to the result of recognition by the recognition unit.
- An information processing method in which an information processing apparatus acquires the content of handwritten input on a touch panel, and controls display of a shaped figure obtained by shaping a figure recognized from the handwritten input according to the content of the handwritten input.
- A program storage medium storing a program that causes a computer to function as: an acquisition unit that acquires the content of handwritten input on a touch panel; and a display control unit that controls display of a shaped figure obtained by shaping a figure recognized from the handwritten input according to the content of the handwritten input.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/654,808 US10402085B2 (en) | 2012-12-28 | 2013-12-16 | Display of content based on handwritten input |
JP2014554330A JP6287861B2 (ja) | 2012-12-28 | 2013-12-16 | 情報処理装置、情報処理方法、及び、プログラム記憶媒体 |
EP13868853.6A EP2940559A4 (en) | 2012-12-28 | 2013-12-16 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM MEMORY |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012286723 | 2012-12-28 | ||
JP2012-286723 | 2012-12-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014103775A1 true WO2014103775A1 (ja) | 2014-07-03 |
Family
ID=51020863
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/083599 WO2014103775A1 (ja) | 2012-12-28 | 2013-12-16 | 情報処理装置、情報処理方法、及び、プログラム記憶媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10402085B2 (ja) |
EP (1) | EP2940559A4 (ja) |
JP (1) | JP6287861B2 (ja) |
CN (1) | CN103914174A (ja) |
WO (1) | WO2014103775A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016534464A (ja) * | 2013-08-30 | 2016-11-04 | サムスン エレクトロニクス カンパニー リミテッド | 電子装置におけるチャートを表示する装置及び方法 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150086646A (ko) * | 2014-01-20 | 2015-07-29 | 삼성전자주식회사 | 프리뷰 이미지를 제공하는 화상형성장치, 그 프리뷰 이미지를 디스플레이하는 디스플레이 장치 및 그 방법들 |
JP6511860B2 (ja) * | 2015-02-27 | 2019-05-15 | 富士通株式会社 | 表示制御システム、グラフ表示方法およびグラフ表示プログラム |
CN110956674B (zh) * | 2019-10-23 | 2022-01-21 | 广州视源电子科技股份有限公司 | 图形调整方法、装置、设备及存储介质 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0346472B2 (ja) | 1979-09-14 | 1991-07-16 | Burisutoru Maiyaazu Sukuibu Co | |
JPH06175977A (ja) * | 1992-12-08 | 1994-06-24 | Canon Inc | 電子機器 |
JPH07107708B2 (ja) | 1986-09-26 | 1995-11-15 | 株式会社日立製作所 | パタ−ン認識方法 |
JPH0950490A (ja) | 1995-08-07 | 1997-02-18 | Sony Corp | 手書き文字認識装置 |
JP2003330606A (ja) * | 2002-05-13 | 2003-11-21 | Ricoh Co Ltd | タッチパネル付きディスプレイ装置、タッチパネル付きディスプレイ装置の制御方法およびその方法をコンピュータに実行させるためのプログラム |
JP2012063938A (ja) * | 2010-09-15 | 2012-03-29 | Hitachi Solutions Ltd | 手書き図形認識システム、手書き図形認識方法及びプログラム |
JP2012190118A (ja) * | 2011-03-09 | 2012-10-04 | Seiko Epson Corp | 画像生成装置、プロジェクター、プログラムおよび画像生成方法 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5287417A (en) * | 1992-09-10 | 1994-02-15 | Microsoft Corporation | Method and system for recognizing a graphic object's shape, line style, and fill pattern in a pen environment |
US5465325A (en) * | 1992-11-16 | 1995-11-07 | Apple Computer, Inc. | Method and apparatus for manipulating inked objects |
US7017124B2 (en) * | 2001-02-15 | 2006-03-21 | Denny Jaeger | Method for controlling electronic devices using digital recall tool |
US20080104571A1 (en) * | 2001-02-15 | 2008-05-01 | Denny Jaeger | Graphical object programming methods using graphical directional indicators |
US7324691B2 (en) * | 2003-09-24 | 2008-01-29 | Microsoft Corporation | System and method for shape recognition of hand-drawn objects |
US20100171754A1 (en) * | 2009-01-07 | 2010-07-08 | Microsoft Corporation | Converting digital ink to shapes and text |
US9256360B2 (en) * | 2010-08-25 | 2016-02-09 | Sony Corporation | Single touch process to achieve dual touch user interface |
JP5790070B2 (ja) * | 2010-08-26 | 2015-10-07 | カシオ計算機株式会社 | 表示制御装置およびプログラム |
US8994732B2 (en) * | 2011-03-07 | 2015-03-31 | Microsoft Corporation | Integration of sketch-based interaction and computer data analysis |
-
2013
- 2013-12-16 JP JP2014554330A patent/JP6287861B2/ja active Active
- 2013-12-16 US US14/654,808 patent/US10402085B2/en active Active
- 2013-12-16 WO PCT/JP2013/083599 patent/WO2014103775A1/ja active Application Filing
- 2013-12-16 EP EP13868853.6A patent/EP2940559A4/en not_active Ceased
- 2013-12-19 CN CN201310705720.3A patent/CN103914174A/zh active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0346472B2 (ja) | 1979-09-14 | 1991-07-16 | Burisutoru Maiyaazu Sukuibu Co | |
JPH07107708B2 (ja) | 1986-09-26 | 1995-11-15 | 株式会社日立製作所 | パタ−ン認識方法 |
JPH06175977A (ja) * | 1992-12-08 | 1994-06-24 | Canon Inc | 電子機器 |
JPH0950490A (ja) | 1995-08-07 | 1997-02-18 | Sony Corp | 手書き文字認識装置 |
JP2003330606A (ja) * | 2002-05-13 | 2003-11-21 | Ricoh Co Ltd | タッチパネル付きディスプレイ装置、タッチパネル付きディスプレイ装置の制御方法およびその方法をコンピュータに実行させるためのプログラム |
JP2012063938A (ja) * | 2010-09-15 | 2012-03-29 | Hitachi Solutions Ltd | 手書き図形認識システム、手書き図形認識方法及びプログラム |
JP2012190118A (ja) * | 2011-03-09 | 2012-10-04 | Seiko Epson Corp | 画像生成装置、プロジェクター、プログラムおよび画像生成方法 |
Non-Patent Citations (2)
Title |
---|
RUBINE, D.: "Specifying Gestures by Example", PROC. OF ACM SIGGRAPH '91, 1991, pages 329 - 337 |
See also references of EP2940559A4 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016534464A (ja) * | 2013-08-30 | 2016-11-04 | サムスン エレクトロニクス カンパニー リミテッド | 電子装置におけるチャートを表示する装置及び方法 |
Also Published As
Publication number | Publication date |
---|---|
US20150355835A1 (en) | 2015-12-10 |
JP6287861B2 (ja) | 2018-03-07 |
EP2940559A4 (en) | 2016-08-31 |
JPWO2014103775A1 (ja) | 2017-01-12 |
EP2940559A1 (en) | 2015-11-04 |
CN103914174A (zh) | 2014-07-09 |
US10402085B2 (en) | 2019-09-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2919104B1 (en) | Information processing device, information processing method, and computer-readable recording medium | |
EP3565181A1 (en) | Network topology adaptive data visualization method, device, apparatus and storage medium | |
US11262895B2 (en) | Screen capturing method and apparatus | |
US9524097B2 (en) | Touchscreen gestures for selecting a graphical object | |
US20130016126A1 (en) | Drawing aid system for multi-touch devices | |
US9588673B2 (en) | Method for manipulating a graphical object and an interactive input system employing the same | |
US9870144B2 (en) | Graph display apparatus, graph display method and storage medium | |
CN103955339A (zh) | 一种终端操作方法及终端设备 | |
JP2020522809A (ja) | 仮想基体として使用するための平面及び/又は四分木を検出するための方法及びデバイス | |
US9025878B2 (en) | Electronic apparatus and handwritten document processing method | |
JP6287861B2 (ja) | 情報処理装置、情報処理方法、及び、プログラム記憶媒体 | |
EP2960763A1 (en) | Computerized systems and methods for cascading user interface element animations | |
US11275501B2 (en) | Creating tables using gestures | |
JP6595896B2 (ja) | 電子機器及び表示制御方法 | |
CN102314287A (zh) | 互动显示系统及方法 | |
US9811238B2 (en) | Methods and systems for interacting with a digital marking surface | |
CN107391015B (zh) | 一种智能平板的控制方法、装置、设备及存储介质 | |
JP2016110604A (ja) | ページめくりシステム及びページめくり方法 | |
JP2019067111A (ja) | 表示制御装置及びプログラム | |
US9146666B2 (en) | Touch sensor navigation | |
CN114063845A (zh) | 显示方法、显示装置和电子设备 | |
WO2020132863A1 (zh) | 续笔方法与显示终端 | |
US20160154466A1 (en) | Touch input method and electronic apparatus thereof | |
CN116088743A (zh) | 书写信息计算方法、装置、存储介质及电子设备 | |
JP2023137822A (ja) | 表示装置、清書方法、プログラム、情報共有システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13868853 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014554330 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013868853 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14654808 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |