WO2017072913A1 - Control Method, Electronic Blackboard System, Display Device, and Program - Google Patents
Control Method, Electronic Blackboard System, Display Device, and Program
- Publication number
- WO2017072913A1 (application PCT/JP2015/080563)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input object
- color
- image
- unit
- touch
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
- G06V40/113—Recognition of static hand signs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
Definitions
- the present invention relates to a control method, an electronic blackboard system, a display device, and a program.
- Patent Document 1 discloses an electronic blackboard system having the following functions.
- the electronic blackboard system described in Patent Document 1 has a function of detecting the color of an input object from an image obtained by imaging the input object used for specifying coordinates, and reflecting the detection result in the drawing color on the computer operation screen.
- the electronic blackboard system described in Patent Document 1 aims to automatically set the drawing color to the original color of the input object. Therefore, in the electronic blackboard system described in Patent Document 1, it is not easy, for example, to set the drawing color to a color different from the color of the input object, which may reduce operability.
- the present invention has been made in view of the above circumstances, and an object thereof is to provide a control method, an electronic blackboard system, a display device, and a program that can solve the above-described problems.
- a control method according to an aspect of the present invention includes a touch detection step of detecting a touch by an input object, an image acquisition step of acquiring an image including at least a part of the input object, and a process determination step of determining a process to be executed according to the detection result in the touch detection step and the image acquired in the image acquisition step.
- an electronic blackboard system according to another aspect of the present invention includes a detection unit that detects a touch by an input object,
- an imaging unit that captures an image including at least a part of the input object,
- a control unit that determines a process to be executed according to the detection result by the detection unit and the image captured by the imaging unit, and
- a display unit that displays an image by the process determined by the control unit.
- in a display device according to another aspect of the present invention, a process to be executed is determined according to a detection result by a detection unit that detects a touch by an input object and an image captured by an imaging unit that captures an image including at least a part of the input object.
- the display device includes a display unit that displays an image by the process determined by the control unit.
- a program according to another aspect of the present invention causes a computer to execute a touch detection step of detecting a touch by an input object, an image acquisition step of acquiring an image including at least a part of the input object, and a process determination step of determining a process to be executed according to the detection result in the touch detection step and the image acquired in the image acquisition step.
- since the process to be executed can be determined according to the result of touch detection and the acquired image, operability can be easily improved.
- FIG. 1 is a block diagram showing a configuration example of the first embodiment of the present invention.
- the control device 1 according to the first embodiment illustrated in FIG. 1 includes a touch detection unit 2, an image acquisition unit 3, and a process determination unit 4.
- the control device 1 can be configured using, for example, one or a plurality of computers, computer peripheral devices, and programs executed by the computers.
- the computer may be a terminal such as a personal computer or a smartphone, or may be a built-in computer such as a microcontroller.
- the peripheral devices include, for example, a detection device that detects a touch operation, or alternatively an interface for inputting and outputting signals to and from such a detection device.
- the peripheral devices also include, for example, an imaging device that captures an image, or an interface for inputting and outputting signals to and from the imaging device.
- the peripheral devices further include, for example, a display device that displays an image, or an interface for inputting and outputting signals to and from the display device.
- the display device displays an image by processing determined by the processing determination unit 4 as described later.
- the touch detection unit 2, the image acquisition unit 3, and the process determination unit 4 are each a function realized by executing a predetermined program on the computer in cooperation with the peripheral devices.
- hereinafter, blocks corresponding to such functions, e.g., the touch detection unit 2, the image acquisition unit 3, and the process determination unit 4, are referred to as functional blocks.
- the touch detection unit 2 detects a touch on a detection surface by an input object such as a user's finger, hand, or pen, or receives as input a signal representing the detection result.
- the touch detection unit 2 outputs, to the process determination unit 4, information indicating that the detection surface has been touched and information indicating one or more touched positions on the detection surface.
- the touch detection unit 2 includes, for example, the detection device in a touch panel that combines a display device with a touch-operation detection device. Alternatively, the touch detection unit 2 is, for example, an input interface for the signal output by such a touch-operation detection device.
- the image acquisition unit 3 acquires an image including, as a subject, at least a part of an input object that touches or is about to touch the touch detection unit 2.
- the image including at least a part of the input object is an image including a part of the input object to such an extent that feature data of the input object can be extracted.
- the image acquisition unit 3 includes, for example, an imaging device.
- alternatively, the image acquisition unit 3 is, for example, an input interface for the image data output by an imaging device.
- the process determination unit 4 determines a process to be executed according to the detection result output by the touch detection unit 2 and the image acquired by the image acquisition unit 3.
- the entity that executes the process determined by the process determining unit 4 may be the process determining unit 4, a functional block different from the process determining unit 4, or both.
- the process to be executed is, for example, a drawing process.
- the process determination unit 4 determines the content of the drawing process according to, for example, the detection result by the touch detection unit 2 and the image acquired by the image acquisition unit 3.
- the process determining unit 4 determines a drawing color and a pen shape to be drawn when drawing a character or a line in response to a touch operation.
- alternatively, the process to be executed is, for example, a process that recognizes an operation on the detection surface by the above-described input object as a virtual mouse operation and generates corresponding mouse information.
- in this case, the process determination unit 4 determines, according to the detection result of the touch detection unit 2 and the image acquired by the image acquisition unit 3, the content of the information to be generated, such as the click state of the mouse button and the mouse position.
- the process determined by the process determination unit 4 is not limited to these examples.
- as described above, the control device 1 executes a touch detection step in which the touch detection unit 2 detects a touch by an input object or receives a touch detection result as input.
- the image acquisition unit 3 executes an image acquisition step of acquiring an image including at least a part of the input object.
- the process determination unit 4 executes a process determination step of determining the process to be executed according to the detection result in the touch detection step and the image acquired in the image acquisition step. Therefore, according to this embodiment, the process to be executed can be determined according to the result of touch detection and the acquired image, so that various processes can be handled flexibly and operability can be easily improved.
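- Purely as an illustrative sketch of this three-step flow, and not as part of the disclosure, the functional blocks can be pictured as follows; every class, method, and field name here is hypothetical.

```python
# Hypothetical sketch of the first-embodiment flow: touch detection,
# image acquisition, and process determination. Names are illustrative only.
from dataclasses import dataclass
from typing import Callable, Optional, Tuple


@dataclass
class TouchEvent:
    touched: bool                          # whether the detection surface is touched
    position: Optional[Tuple[int, int]]    # touched position, if any


class ControlDevice:
    """Rough analogue of the control device 1 (touch detection unit 2,
    image acquisition unit 3, process determination unit 4)."""

    def __init__(self,
                 detect_touch: Callable[[], TouchEvent],
                 acquire_image: Callable[[], bytes]):
        self._detect_touch = detect_touch      # stands in for touch detection unit 2
        self._acquire_image = acquire_image    # stands in for image acquisition unit 3

    def determine_process(self) -> Optional[dict]:
        """Stands in for process determination unit 4: decide the process to be
        executed from the touch detection result and the acquired image."""
        event = self._detect_touch()           # touch detection step
        if not event.touched:
            return None
        image = self._acquire_image()          # image acquisition step
        # A real system would recognize the input object's shape/color here;
        # this sketch only packages both inputs for the downstream decision.
        return {"position": event.position, "image": image, "action": "draw"}
```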
- FIG. 2 is a schematic block diagram illustrating a configuration example of the electronic blackboard system 10.
- the electronic blackboard system 10 illustrated in FIG. 2 includes an imaging unit 11, a control unit 12, and a touch panel 13.
- the entire electronic blackboard system 10 may be regarded as the second embodiment of the present invention, or the control unit 12 alone may be regarded as the second embodiment of the present invention.
- a combination of the imaging unit 11 and the control unit 12 or a combination of the control unit 12 and the touch panel 13 may be regarded as the second embodiment of the present invention.
- the display unit 19 that displays an image in accordance with an image signal input from the control unit 12 may be regarded as the second embodiment of the present invention.
- in this case, the display unit 19 can be configured as a display device that includes the display unit 19 but not the imaging unit 11, the control unit 12, or the detection unit 18 described later.
- the imaging unit 11 is a camera attached to the touch panel 13 as shown in FIG. 3, for example.
- the imaging unit 11 captures an area including an operation range for the screen 13 a of the touch panel 13. That is, the imaging unit 11 captures an image including at least a part of an input object that performs an input operation on the detection unit 18.
- the imaging unit 11 continuously captures a moving image, repeatedly captures still images at a fixed interval, or captures a moving image or a still image when a control signal (not shown) is received from the control unit 12.
- the imaging unit 11 may be configured by a plurality of cameras.
- a plurality of cameras can be provided on the upper side and the right side or left side of the display unit 19, respectively.
- the one or more cameras need not be attached to the touch panel 13 as long as they can capture the input object in the region including the operation range.
- the touch panel 13 includes a detection unit 18 and is attached to the display unit 19.
- the touch panel 13 and the display unit 19 may be an integrated device.
- the display unit 19 displays an image according to the image signal input from the control unit 12. For example, the display unit 19 displays an image by the drawing process determined by the control unit 12.
- the display unit 19 is, for example, a liquid crystal display.
- the detection unit 18 detects a touch operation on the display surface of the display unit 19, that is, the screen 13 a of the touch panel 13 by an input object such as a user's finger or pen.
- the detection unit 18 outputs a signal representing the presence or absence of the touch and the touched position to the control unit 12 as a detection result.
- the detection unit 18 is, for example, a touch pad formed in a transparent screen shape on the display surface of the liquid crystal display.
- the control unit 12 is a computer, and includes a CPU (Central Processing Unit), a storage device including volatile and nonvolatile memories, an input / output interface, a communication device, and the like.
- the control unit 12 includes an image recognition processing unit 14, a determination processing unit 15, a drawing processing unit 17, and an association storage unit 16.
- the image recognition processing unit 14, the determination processing unit 15, the drawing processing unit 17, and the association storage unit 16 are the functional blocks described above.
- the image recognition processing unit 14 temporarily stores the image data acquired from the imaging unit 11 in a storage device in the control unit 12. The image recognition processing unit 14 then executes, for example, a process of recognizing the shape and color (that is, the shape and/or color) of the input object from the image captured when the detection unit 18 detects the touch of the input object. For example, the image recognition processing unit 14 compares the shape feature data extracted from the image to be recognized with the feature extraction data of input objects stored in advance in the association storage unit 16, and outputs, as a recognition result, the identification information of the feature extraction data with the highest similarity.
- similarly, the image recognition processing unit 14 compares the pixel values of each color component occupying a certain area in the image to be recognized with the pixel values of each color component stored in advance, and outputs the color with the highest similarity as a recognition result.
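- As a minimal sketch of this kind of matching, the recognition can be thought of as a nearest-match lookup over stored shape features and stored colors; the feature representation, stored values, and distance rule below are assumptions, not taken from the disclosure.

```python
# Hypothetical sketch of the matching performed by the image recognition
# processing unit 14: nearest stored shape feature and nearest stored color.
import math

# Assumed stored data: identification information -> shape feature vector.
STORED_SHAPE_FEATURES = {
    "shape_A_right_index_finger": [0.12, 0.80, 0.33],
    "shape_B_right_middle_finger": [0.45, 0.10, 0.77],
}
# Assumed stored colors: name -> (R, G, B) component values.
STORED_COLORS = {"black": (0, 0, 0), "red": (220, 30, 30), "blue": (30, 30, 220)}


def recognize_shape(feature_vector):
    """Return the identification information of the stored feature extraction
    data most similar to the extracted feature vector."""
    return min(STORED_SHAPE_FEATURES,
               key=lambda k: math.dist(feature_vector, STORED_SHAPE_FEATURES[k]))


def recognize_color(mean_rgb):
    """Return the stored color whose components are closest to the mean pixel
    values of the region occupied by the input object."""
    return min(STORED_COLORS, key=lambda name: math.dist(mean_rgb, STORED_COLORS[name]))


print(recognize_shape([0.10, 0.78, 0.30]))  # -> "shape_A_right_index_finger"
print(recognize_color((210, 40, 35)))       # -> "red"
```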
- the determination processing unit 15 determines the content of the drawing process for the display unit 19 according to the detection result of the detection unit 18 and the recognition result of the image recognition processing unit 14. For example, as illustrated in FIG. 4, when the shape 91 is input to the screen 13a with the input object 31, the determination processing unit 15 controls the drawing processing unit 17 so as to draw the shape 91 with the color set in association with the feature extraction data similar to the feature data of the input object 31.
- the input object 31 is a hand in a state where the screen 13 a is touched with the right index finger.
- the drawing color is black.
- the input to the screen 13a means that the screen 13a is touched with an input object or the input object is moved while touching the screen 13a.
- similarly, when the shape 92 is input with the input object 32, the determination processing unit 15 controls the drawing processing unit 17 so as to draw the shape 92 with the color set in association with the feature extraction data similar to the feature data of the input object 32.
- the input object 32 is a hand in a state where the screen 13a is touched with the middle finger of the right hand.
- the drawing color is red.
- the determination processing unit 15 also sets an association between the recognized shape and color of the input object and the content of the drawing process, according to the detection result of the detection unit 18 and the shape and color of the input object recognized by the image recognition processing unit 14.
- the determination processing unit 15 controls the drawing processing unit 17 to display the color setting menu 20 on the screen 13a, for example, as shown in FIG.
- the color setting menu 20 can be displayed, for example, when a button (not shown) provided on the touch panel 13 is pressed by the user or when the user performs a specific gesture toward the imaging unit 11.
- the color setting menu 20 shown in FIG. 6 includes a black icon 21, a red icon 22, a blue icon 23, a green icon 24, a yellow icon 25, and a white icon 26.
- for example, when the black icon 21 is touched with the input object 31 as illustrated in FIG. 7, the determination processing unit 15 stores, in the association storage unit 16, setting information that associates the feature data of the input object 31 with black.
- similarly, the determination processing unit 15 stores, in the association storage unit 16, setting information that associates the feature data of the input object 32 with red.
- the association storage unit 16 stores information associating information representing the shape and color of the input object with information representing the content of processing.
- FIG. 9 shows an example of the contents stored in the association storage unit 16.
- the table 161 shown in FIG. 9 associates feature extraction data information with display information.
- the feature extraction data information is information representing data obtained by extracting the shape and color features of the input object.
- the display information is information indicating the content of the drawing process; in this example, the display information represents a drawing color or a process of erasing a drawing.
- FIG. 9 also shows, linked by arrows, the input objects 31 to 36 from which the feature extraction data information was generated.
- the input objects 31 to 34 are the right hand 41, and the input objects 35 and 36 are the left hand 42.
- for example, the feature extraction data information of the input object 31, whose shape is that of touching the screen 13a with the right index finger, is the hexadecimal value "0045abd59932f096", and this feature extraction data information is associated with display information representing the drawing color "black". Further, for example, the feature extraction data information of the input object 35, whose shape is that of touching the screen 13a with the left index finger, is associated with display information representing "eraser", a process for erasing the drawing in the touched area.
- the feature extraction data information of the input object 36, whose shape is that of touching the screen 13a with the left hand spread open, is associated with display information representing "erase all", a process of erasing the drawing over the entire area of the screen 13a.
- the association storage unit 16 may store in advance, for example from the product shipment stage, a plurality of sets of typical feature extraction data information and display information associated with each other, before the user performs the above-described setting process.
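- The table 161 can be pictured as a simple mapping from feature extraction data information to display information. In the sketch below, only the key "0045abd59932f096" (input object 31, black) comes from the example above; the other keys and the storage interface are invented placeholders.

```python
# Hypothetical sketch of the association storage unit 16 (table 161):
# feature extraction data information -> display information.
from typing import Optional


class AssociationStorage:
    def __init__(self):
        # Entries that could be present from the product shipment stage.
        self._table = {
            "0045abd59932f096": {"kind": "color", "value": "black"},  # input object 31
            "ffee112233445566": {"kind": "eraser"},      # left index finger (assumed key)
            "aabbccdd00112233": {"kind": "erase_all"},   # open left hand (assumed key)
        }

    def set_display_info(self, feature_info: str, display_info: dict) -> None:
        """Store or overwrite the display information for a recognized shape/color."""
        self._table[feature_info] = display_info

    def get_display_info(self, feature_info: str) -> Optional[dict]:
        return self._table.get(feature_info)


storage = AssociationStorage()
storage.set_display_info("0045abd59932f096", {"kind": "color", "value": "black"})
print(storage.get_display_info("0045abd59932f096"))  # {'kind': 'color', 'value': 'black'}
```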
- the drawing processing unit 17 shown in FIG. 2 generates an image signal to be displayed on the display unit 19 under the control of the determination processing unit 15, and outputs the generated image signal to the display unit 19.
- the drawing processing unit 17 can also generate an image signal by superimposing the image to be drawn on an input video signal under the control of the determination processing unit 15.
- in the operation described below, steps S13 to S16 correspond to the color setting process,
- and steps S17 to S21 correspond to the drawing process. It is also assumed that the feature extraction data of each input object having one of the shapes A to F or the shape Z is already stored in the association storage unit 16.
- the determination processing unit 15 determines whether or not the color setting menu 20 is displayed (step S12).
- the image recognition processing unit 14 acquires an image captured by the imaging unit 11 (step S13), stores the image in a predetermined memory, and executes image recognition processing (step S14).
- the determination processing unit 15 compares the recognition result in the image recognition processing unit 14 with the setting value (that is, feature extraction data information) of the feature extraction data of the input object already stored in the association storage unit 16. (Step S15).
- the determination processing unit 15 can compare the recognition result and the set value, for example, using a table reference method.
- if, for example, the recognition result is most similar to the shape A, the determination processing unit 15 stores, as the display information corresponding to the feature extraction data information of the shape A, the color information designated by the user in the color setting menu 20 (step S16). For example, when the black icon 21 is touched with the input object 31 as illustrated in FIG. 7, the determination processing unit 15 stores black as the display information corresponding to the feature extraction data information of the input object 31, as illustrated in FIG. 9. After step S16, the process returns to step S11.
- in the drawing process, the image recognition processing unit 14 acquires an image captured by the imaging unit 11 (step S17), stores it in a predetermined memory, and executes image recognition (step S18).
- the determination processing unit 15 compares the recognition result in the image recognition processing unit 14 with the set value of the feature extraction data of the input object already stored in the association storage unit 16 (step S19). In step S19, the determination processing unit 15 can compare the recognition result and the set value by, for example, a table reference method.
- if, for example, the recognition result is most similar to the shape A, the determination processing unit 15 reads, from the association storage unit 16, the color information stored as the display information corresponding to the feature extraction data information of the shape A (step S20).
- the determination processing unit 15 controls the drawing processing unit 17 to execute the drawing processing with the color specified by the read color information (step S21). For example, if the shape 91 is input from the input object 31 as illustrated in FIG. 4, the determination processing unit 15 draws the shape 91 in black. After step S21, the process returns to step S11.
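- Read as pseudocode, the branch between the color setting process (steps S13 to S16) and the drawing process (steps S17 to S21) can be summarized as below; every object and method name is a hypothetical stand-in, and only the step numbering follows the description above.

```python
# Hypothetical paraphrase of the color-setting / drawing flow (steps S12-S21).
def on_touch(touch_pos, menu_displayed, imaging_unit, recognizer, storage, drawer, menu):
    if menu_displayed:
        # Color setting process (steps S13-S16), entered when the menu is shown (S12).
        image = imaging_unit.capture()              # S13: acquire the captured image
        feature_info = recognizer.recognize(image)  # S14: image recognition
        # S15: the recognition result is matched against stored feature extraction
        # data (table lookup); here recognize() already returns the best-matching key.
        color = menu.color_at(touch_pos)            # color icon the user touched
        storage.set_display_info(feature_info, {"kind": "color", "value": color})  # S16
    else:
        # Drawing process (steps S17-S21).
        image = imaging_unit.capture()                   # S17
        feature_info = recognizer.recognize(image)       # S18
        info = storage.get_display_info(feature_info)    # S19-S20: read the stored setting
        if info and info.get("kind") == "color":
            drawer.draw(touch_pos, color=info["value"])  # S21: draw in the set color
        elif info and info.get("kind") == "eraser":
            drawer.erase(touch_pos)
```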
- as described above, if the screen 13a is touched with the right index finger, drawing can be performed in black, and if the screen 13a is touched with the right middle finger, drawing can be performed in red. Further, for example, the user can arbitrarily set the correspondence between the shape and color of the input object and the content of the drawing process.
- the detection unit 18 detects a touch by an input object.
- the imaging unit 11 captures an image including at least a part of the input object.
- the control unit 12 determines processing to be executed according to the detection result by the detection unit 18 and the captured image by the imaging unit 11. Further, the display unit 19 displays an image by the process determined by the control unit 12. Therefore, according to the second embodiment, the process to be executed can be determined according to the result of touch detection and the acquired image, so that the operability can be easily improved.
- the second embodiment can be modified as follows, for example.
- the storage content of the association storage unit 16 can be updated. That is, the determination processing unit 15 can rewrite the feature extraction data information of the shape determined to be the most similar according to the recognition result based on the image acquired in step S13.
- the color setting menu 20 may have a hierarchical structure.
- the electronic blackboard system 10 can display, for example, a setting menu 20a for selecting a line shape shown in FIG.
- the setting menu 20a shown in FIG. 11 has an icon 26 for selecting one line and an icon 27 for selecting two lines.
- when the icon 26 is touched, the drawing process draws a line of the selected color as a single line whenever an input operation is performed with the input-object shape used when that color was selected.
- when the icon 27 is touched, the drawing process draws a line of the selected color as two lines whenever an input operation is performed with the input-object shape used when that color was selected.
- the setting menu 20a can be used to associate the content of the drawing process with the shape of the input object.
- the line shape and the input object shape are set using the setting menu 20a.
- for example, the shape of the input object 31 that touched the icon 26 of the setting menu 20a is associated with drawing with a single line.
- the shape of the input object 33 that touched the icon 27 is associated with drawing with two lines.
- in this case, the user can draw a single line by providing input with the shape of the input object 31, and two lines by providing input with the shape of the input object 33.
- the correspondence between the input object and the color can also be set using general-purpose pens of different colors.
- an icon 28 for instructing color identification processing is provided on the setting menu 20b.
- the user prepares one general-purpose pen or general-purpose pens 43 to 45 having different colors.
- the pen 43 is blue
- the pen 44 is red
- the pen 45 is black.
- the shapes of the pens 43 to 45 may be the same or different.
- when the icon 28 is touched, the electronic blackboard system 10 recognizes the shape and color of the pens 43 to 45, and sets blue for drawing with the pen 43, red for drawing with the pen 44, and black for drawing with the pen 45.
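- A rough sketch of this registration step is given below; the idea of storing the pen's own recognized color as its drawing color follows the description above, while the function and method names are assumptions.

```python
# Hypothetical sketch: registering the general-purpose pens 43-45 via icon 28.
# The recognized color of the pen itself becomes the drawing color for that pen.
def register_pen(storage, recognizer, image_of_pen):
    feature_info = recognizer.recognize_shape(image_of_pen)  # identify which pen touched
    pen_color = recognizer.recognize_color(image_of_pen)     # e.g. "blue", "red", "black"
    storage.set_display_info(feature_info, {"kind": "color", "value": pen_color})

# After touching the icon 28 once with each pen, drawing with the pen 43 uses blue,
# the pen 44 red, and the pen 45 black, as described above.
```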
- the content of the processing can also be changed depending on which part of the pen touches. For example, the color can be applied when touching with one end of the pen, and the eraser when touching with the other end.
- the shape of the recognized input object and the color associated therewith can be displayed by the icon 81 or the icon 82 on the screen 13a.
- the icon 81 having a shape touched with the index finger is displayed in black.
- the icon 82 having a shape touched with the middle finger is displayed in red.
- the icon 81 or the icon 82 may be always displayed until another shape or color is recognized, or may be displayed for a certain period of time when the shape or color is changed.
- this allows the user to always check which color can currently be used for drawing. For example, if the color is not the intended one, the user can take an action such as changing the finger shape again.
- when the icon is displayed only for a certain period of time, the icon display is less likely to get in the way.
- the drawing process can distinguish between a case where drawing is performed with the preset color from the start of the touch and a case where the color is changed to the preset color when the touch ends.
- in the former case, the line drawing 92 is drawn with the color corresponding to the shape recognized immediately before the start of the touch.
- in the latter case, the line drawing 93 is first drawn with the previous color or a standard color, and when the touch ends, the line drawing 92 is redrawn with the color corresponding to the shape recognized while the line drawing 93 was being drawn.
- alternatively, the line drawing 92 may be drawn with the color corresponding to the recognition result only when the touch ends, without drawing the line drawing 93.
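- The two timing options can be contrasted in a short, hypothetical sketch; the handler and method names are assumptions, and the redraw-on-release behavior mirrors the description above.

```python
# Hypothetical sketch of applying the set color at touch start vs. at touch end.
class StrokeHandler:
    def __init__(self, recognizer, drawer, apply_at_touch_end=False, default_color="black"):
        self.recognizer = recognizer
        self.drawer = drawer
        self.apply_at_touch_end = apply_at_touch_end
        self.default_color = default_color
        self._points = []
        self._color = default_color

    def on_touch_start(self, pos, image):
        self._points = [pos]
        # Option 1: use the color recognized immediately before the touch starts.
        self._color = (self.default_color if self.apply_at_touch_end
                       else self.recognizer.color_for(image))

    def on_touch_move(self, pos):
        self._points.append(pos)
        # Provisional stroke (line drawing 93 in option 2, final stroke in option 1).
        self.drawer.draw_line(self._points[-2:], color=self._color)

    def on_touch_end(self, image):
        if self.apply_at_touch_end:
            # Option 2: redraw the stroke (line drawing 92) in the color recognized
            # while the provisional stroke was being drawn.
            final_color = self.recognizer.color_for(image)
            self.drawer.redraw(self._points, color=final_color)
```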
- the setting of the correspondence between the shape and color of the input object and the content of the drawing process may be uniform over the entire screen 13a, or the screen 13a may be divided into a plurality of partial areas with the setting changed for each partial area. That is, for example, as shown in FIG. 18, one area 51 covering the entire screen 13a can be used as the input and drawing area; in this case, the drawing process in the area 51 is performed while changing the processing content according to the shape and color of the input object. Alternatively, as shown in FIG. 19, an area 52 covering about half of the screen 13a may be set, and the drawing process may be performed only in the area 52 while changing the processing content according to the shape and color of the input object. In the remaining area 53, input and drawing can be disabled; in this case, no drawing process to be executed in response to input is determined for the area 53.
- alternatively, the screen 13a may be divided into a plurality of areas 54, 55, and 56, and the setting that associates the shape and color of the input object with the content of the drawing process may differ for each area.
- the color setting menus 20c, 20d, and 20e can be displayed separately for each of the areas 54, 55, and 56.
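- A per-area setting of this kind can be modeled as a lookup from the touched position to that area's own association table; the region bounds and names in the sketch below are assumptions.

```python
# Hypothetical sketch of per-area settings (cf. areas 54-56 of FIG. 20).
# Each area keeps its own association table; the touched position selects it.
from typing import Optional

AREAS = [
    {"name": "area_54", "x_range": (0, 640), "table": {}},
    {"name": "area_55", "x_range": (640, 1280), "table": {}},
    {"name": "area_56", "x_range": (1280, 1920), "table": {}},
]


def table_for_position(x: int) -> Optional[dict]:
    """Return the association table of the area containing x, or None for a
    position where no drawing process is to be determined (cf. area 53 of FIG. 19)."""
    for area in AREAS:
        lo, hi = area["x_range"]
        if lo <= x < hi:
            return area["table"]
    return None
```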
- the imaging unit 11 is not limited to the camera, and an infrared sensor or the like can be used, or the camera and the infrared sensor can be used in combination.
- the electronic blackboard system is not limited to the one using a liquid crystal display, and may be one using a projector.
- the input object is not limited to the above, and any object that can identify the shape and color and that does not easily damage the screen 13a may be used.
- the touch panel 13 may be a touch panel included in, for example, a tablet terminal or a smartphone.
- the imaging unit 11 can be configured using, for example, the front camera built into the tablet terminal or smartphone together with an externally attached prism so that the input object can be imaged.
- the control device 1 shown in FIG. 1 corresponds to, for example, the entire electronic blackboard system 10 or the control unit 12 alone shown in FIG. 2.
- the touch detection unit 2 shown in FIG. 1 corresponds to, for example, the detection unit 18, the combination of the detection unit 18 and the determination processing unit 15, or the determination processing unit 15 shown in FIG. 2.
- the image acquisition unit 3 shown in FIG. 1 corresponds to, for example, the imaging unit 11, the combination of the imaging unit 11 and the image recognition processing unit 14, or the image recognition processing unit 14 shown in FIG. 2.
- the process determination unit 4 shown in FIG. 1 corresponds to the determination processing unit 15 shown in FIG. 2.
- FIG. 21 is a schematic block diagram showing a configuration example of the electronic blackboard system 10a.
- the electronic blackboard system 10a illustrated in FIG. 21 includes a camera 100, a CPU 200, a touch panel 300, a personal computer (hereinafter referred to as a PC) 400, and a storage unit 500.
- the camera 100 includes an optical module 101 and a signal processing unit 104.
- the optical module 101 includes an optical system 102 and an image sensor 103.
- the image sensor 103 is a CMOS (complementary metal oxide semiconductor) image sensor, a CCD (charge-coupled device) image sensor, or the like.
- the signal processing unit 104 reads a pixel value from the image sensor 103, performs signal processing on the read pixel value, converts the pixel value into a predetermined format video signal, and outputs the video signal. Further, the signal processing unit 104 controls the optical system 102, controls the image sensor 103, and changes the content of signal processing based on the control signal output from the CPU 200.
- the touch panel 300 includes a liquid crystal display device 301 and a touch sensor 302.
- the liquid crystal display device 301 displays an image based on the image signal output from the PC 400.
- the touch sensor 302 detects a touch operation on the display screen of the liquid crystal display device 301, and outputs a touch detection signal indicating that the touch is detected and screen coordinate data indicating the touched position.
- the CPU 200 includes a camera interface 201 and an arithmetic processing unit 202.
- the camera interface 201 is a circuit for inputting the video signal output from the camera 100 to the arithmetic processing unit 202.
- the arithmetic processing unit 202 inputs a touch detection signal and screen coordinate data from the touch panel 300.
- the arithmetic processing unit 202 outputs a control signal to the camera 100 and controls, for example, imaging timing.
- the arithmetic processing unit 202 outputs a control signal to the PC 400 to instruct an image to be drawn, for example.
- the storage unit 500 stores, for example, a table representing a correspondence relationship between data obtained by extracting the shape and color characteristics of the input object and processing associated with the shape and color.
- the storage unit 500 is, for example, a rewritable nonvolatile memory that is detachably connected to the CPU 200.
- the PC 400 generates an image to be displayed on the touch panel 300 from the control signal input from the CPU 200 and information indicating the video instructed by the user or the operation screen of the application, and outputs the image as a video signal of a predetermined format.
- the operation related to the setting process and the drawing process according to the touch operation of the electronic blackboard system 10a of the third embodiment is the same as the operation of the electronic blackboard system 10 of the second embodiment.
- the camera 100 of the third embodiment corresponds to the imaging unit 11 of the second embodiment.
- a touch panel 300 according to the third embodiment corresponds to the touch panel 13 according to the second embodiment.
- a combination of the CPU 200, the PC 400, and the storage unit 500 according to the third embodiment corresponds to the control unit 12 according to the second embodiment.
- in the third embodiment as well, since the process to be executed can be determined according to the result of touch detection and the acquired image, as in the second embodiment, operability can be easily improved. Further, when the storage unit 500 is detachable, the information indicating the correspondence between the shape and color and the processing can be easily updated.
- the third embodiment also has a simple configuration suitable for, for example, combining the operation screen of an application program on the PC 400 with characters and lines written on the touch panel 300 and displaying the result on the touch panel 300.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Hereinafter, the first embodiment of the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing a configuration example of the first embodiment of the present invention. The control device 1 of the first embodiment shown in FIG. 1 includes a touch detection unit 2, an image acquisition unit 3, and a process determination unit 4.
Next, the second embodiment of the present invention will be described with reference to the drawings. FIG. 2 is a schematic block diagram showing a configuration example of the electronic blackboard system 10. The electronic blackboard system 10 shown in FIG. 2 includes an imaging unit 11, a control unit 12, and a touch panel 13. In the electronic blackboard system 10 shown in FIG. 2, for example, the entire electronic blackboard system 10 may be regarded as the second embodiment of the present invention, or the control unit 12 may be regarded as the second embodiment. Alternatively, a combination of the imaging unit 11 and the control unit 12, or a combination of the control unit 12 and the touch panel 13, may be regarded as the second embodiment. Alternatively, the display unit 19, which displays an image in accordance with an image signal input from the control unit 12, may be regarded as the second embodiment. In this case, the display unit 19 can be configured as a display device that includes the display unit 19 but does not include, for example, the imaging unit 11, the control unit 12, or the detection unit 18 described later.
Next, the third embodiment of the present invention will be described with reference to the drawings. FIG. 21 is a schematic block diagram showing a configuration example of the electronic blackboard system 10a. The electronic blackboard system 10a shown in FIG. 21 includes a camera 100, a CPU 200, a touch panel 300, a personal computer (hereinafter referred to as a PC) 400, and a storage unit 500.
2 Touch detection unit
3 Image acquisition unit
4 Process determination unit
10, 10a Electronic blackboard system
11 Imaging unit
12 Control unit
13 Touch panel
13a Screen
14 Image recognition processing unit
15 Determination processing unit
16 Association storage unit
17 Drawing processing unit
18 Detection unit
19 Display unit
21 to 26 Icon (first icon)
28 Icon (second icon)
81, 82 Icon (third icon)
Claims (20)
1. A control method comprising: a touch detection step of detecting a touch by an input object; an image acquisition step of acquiring an image including at least a part of the input object; and a process determination step of determining a process to be executed according to the detection result in the touch detection step and the image acquired in the image acquisition step.
2. The control method according to claim 1, wherein the process determined in the process determination step is a process associated with a shape and/or color of the input object stored in advance.
3. The control method according to claim 1 or 2, wherein in the process determination step, the shape and/or color of the input object is recognized from the acquired image, and the process is determined according to the recognition result.
4. The control method according to claim 2 or 3, further comprising a setting step of recognizing and storing the shape and/or color of the input object that touched, according to the detection result in the touch detection step and the image acquired in the image acquisition step, and of associating the process determined in the process determination step with the stored shape and/or color of the input object.
5. The control method according to claim 4, wherein the setting step includes: a step of displaying a first icon representing the process; and a step of associating the process represented by the first icon with the shape and/or color of the input object that touched the first icon.
6. The control method according to claim 4 or 5, wherein the setting step includes: a step of displaying a second icon representing color identification processing; and a step of associating the process with the color of the input object that touched the second icon.
7. The control method according to any one of claims 1 to 6, wherein in the process determination step, the process is determined at the start or end of a touch.
8. The control method according to any one of claims 1 to 7, further comprising a display step of displaying a third icon representing the process determined in the process determination step.
9. The control method according to claim 8, wherein in the display step, the third icon is displayed for a certain period of time when the process determined in the process determination step has changed.
10. The control method according to any one of claims 1 to 9, wherein in the process determination step, the process to be executed is determined for each of a plurality of partial areas included in the display surface.
11. The control method according to claim 10, wherein in the process determination step, the process to be executed is not determined for at least one of the plurality of partial areas.
12. An electronic blackboard system comprising: a detection unit that detects a touch by an input object; an imaging unit that captures an image including at least a part of the input object; a control unit that determines a process to be executed according to the detection result by the detection unit and the image captured by the imaging unit; and a display unit that displays an image by the process determined by the control unit.
13. The electronic blackboard system according to claim 12, wherein the process determined by the control unit is a process associated with a shape and/or color of the input object stored in advance.
14. The electronic blackboard system according to claim 12 or 13, wherein the control unit recognizes the shape and/or color of the input object from the captured image and determines the process according to the recognition result.
15. The electronic blackboard system according to claim 13 or 14, wherein the control unit further recognizes the shape and/or color of the input object that touched, according to the detection result of the detection unit and the image captured by the imaging unit, and sets an association between the process to be determined and the recognized shape and/or color of the input object.
16. The electronic blackboard system according to claim 15, wherein, when setting the association, the control unit displays a first icon representing the process and associates the process represented by the first icon with the shape and/or color of the input object that touched the first icon.
17. The electronic blackboard system according to claim 15 or 16, wherein, when setting the association, the control unit displays a second icon representing color identification processing and associates the process to be determined with the color of the input object that touched the second icon.
18. The electronic blackboard system according to any one of claims 12 to 17, wherein the control unit determines the process at the start or end of a touch.
19. A display device comprising a display unit that displays an image by a process determined by a control unit, the control unit determining the process to be executed according to a detection result by a detection unit that detects a touch by an input object and an image captured by an imaging unit that captures an image including at least a part of the input object.
20. A program causing a computer to execute: a touch detection step of detecting a touch by an input object; an image acquisition step of acquiring an image including at least a part of the input object; and a process determination step of determining a process to be executed according to the detection result in the touch detection step and the image acquired in the image acquisition step.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/750,094 US20180239486A1 (en) | 2015-10-29 | 2015-10-29 | Control method, electronic blackboard system, display device, and program |
JP2017547277A JPWO2017072913A1 (ja) | 2015-10-29 | 2015-10-29 | Control method, electronic blackboard system, display device, and program |
PCT/JP2015/080563 WO2017072913A1 (ja) | 2015-10-29 | 2015-10-29 | Control method, electronic blackboard system, display device, and program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/080563 WO2017072913A1 (ja) | 2015-10-29 | 2015-10-29 | Control method, electronic blackboard system, display device, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017072913A1 true WO2017072913A1 (ja) | 2017-05-04 |
Family
ID=58629921
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/080563 WO2017072913A1 (ja) | 2015-10-29 | 2015-10-29 | Control method, electronic blackboard system, display device, and program |
Country Status (3)
Country | Link |
---|---|
US (1) | US20180239486A1 (ja) |
JP (1) | JPWO2017072913A1 (ja) |
WO (1) | WO2017072913A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10761670B2 (en) * | 2018-06-13 | 2020-09-01 | Tactual Labs Co. | Sensing of multiple writing instruments |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0784715A (ja) * | 1993-09-10 | 1995-03-31 | Hitachi Ltd | Information processing apparatus |
JPH09185456A (ja) * | 1995-04-28 | 1997-07-15 | Matsushita Electric Ind Co Ltd | Interface device |
JPH1138949A (ja) * | 1997-07-15 | 1999-02-12 | Sony Corp | Drawing apparatus, drawing method, and recording medium |
WO2011086600A1 (ja) * | 2010-01-15 | 2011-07-21 | Pioneer Corporation | Information processing apparatus and method therefor |
JP2012053584A (ja) * | 2010-08-31 | 2012-03-15 | Sanyo Electric Co Ltd | Information display system and program |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120044140A1 (en) * | 2010-08-19 | 2012-02-23 | Sanyo Electric Co., Ltd. | Information display system and program, and optical input system, projection-type images and display apparatus |
US20140210797A1 (en) * | 2013-01-31 | 2014-07-31 | Research In Motion Limited | Dynamic stylus palette |
US20150058753A1 (en) * | 2013-08-22 | 2015-02-26 | Citrix Systems, Inc. | Sharing electronic drawings in collaborative environments |
2015
- 2015-10-29 WO PCT/JP2015/080563 patent/WO2017072913A1/ja active Application Filing
- 2015-10-29 US US15/750,094 patent/US20180239486A1/en not_active Abandoned
- 2015-10-29 JP JP2017547277A patent/JPWO2017072913A1/ja active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0784715A (ja) * | 1993-09-10 | 1995-03-31 | Hitachi Ltd | Information processing apparatus |
JPH09185456A (ja) * | 1995-04-28 | 1997-07-15 | Matsushita Electric Ind Co Ltd | Interface device |
JPH1138949A (ja) * | 1997-07-15 | 1999-02-12 | Sony Corp | Drawing apparatus, drawing method, and recording medium |
WO2011086600A1 (ja) * | 2010-01-15 | 2011-07-21 | Pioneer Corporation | Information processing apparatus and method therefor |
JP2012053584A (ja) * | 2010-08-31 | 2012-03-15 | Sanyo Electric Co Ltd | Information display system and program |
Also Published As
Publication number | Publication date |
---|---|
JPWO2017072913A1 (ja) | 2018-05-24 |
US20180239486A1 (en) | 2018-08-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11048333B2 (en) | System and method for close-range movement tracking | |
CN105814522B (zh) | 基于运动识别来显示虚拟输入设备的用户界面的设备和方法 | |
KR102339674B1 (ko) | 디스플레이 장치 및 방법 | |
US9910498B2 (en) | System and method for close-range movement tracking | |
KR101514168B1 (ko) | 정보 처리 장치, 정보 처리 방법 및 기록 매체 | |
JP5658552B2 (ja) | 表示制御装置及びその制御方法、プログラム、及び記録媒体 | |
US9519365B2 (en) | Display control apparatus and control method for the same | |
WO2015045676A1 (ja) | 情報処理装置、および制御プログラム | |
US10684772B2 (en) | Document viewing apparatus and program | |
US20160300321A1 (en) | Information processing apparatus, method for controlling information processing apparatus, and storage medium | |
JP5760886B2 (ja) | 画像表示装置、画像表示方法及び画像表示プログラム | |
US20110037731A1 (en) | Electronic device and operating method thereof | |
WO2017072913A1 (ja) | 制御方法、電子黒板システム、表示装置およびプログラム | |
TWI653540B (zh) | 顯示裝置、投影機、及顯示控制方法 | |
US20200257396A1 (en) | Electronic device and control method therefor | |
CN108932054B (zh) | 显示装置、显示方法和非暂时性的记录介质 | |
US10212382B2 (en) | Image processing device, method for controlling image processing device, and computer-readable storage medium storing program | |
JP5994903B2 (ja) | 画像表示装置、画像表示方法及び画像表示プログラム | |
JP2020071544A (ja) | 情報表示装置、電子ペン、表示制御方法、及び表示制御プログラム | |
CN109218599B (zh) | 全景图像的显示方法及其电子装置 | |
JP6409777B2 (ja) | 撮像装置、撮像方法及びプログラム | |
JP6276630B2 (ja) | 情報処理装置、情報処理プログラムおよび情報処理方法 | |
TWI502519B (zh) | 手勢辨識模組及手勢辨識方法 | |
US20220256095A1 (en) | Document image capturing device and control method thereof | |
JP2018097280A (ja) | 表示装置、表示方法及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15907277 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017547277 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15750094 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15907277 Country of ref document: EP Kind code of ref document: A1 |