WO2019220803A1 - Position indicating device and information processing device - Google Patents
Position indicating device and information processing device
- Publication number
- WO2019220803A1 (PCT/JP2019/015042)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pressure
- sensor
- grip force
- communication unit
- position indicating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0317—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
- G06F3/03546—Pens or stylus using a rotatable ball at the tip as position detecting member
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
- G06F3/04146—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position using pressure sensitive conductive elements delivering a boolean signal and located between crossing sensing lines, e.g. located between X and Y sensing line layers
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/04162—Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04105—Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
Definitions
- The present invention relates to a position indicating device and an information processing device, and in particular to a pen-type position indicating device used to indicate both a position on a touch surface and a position in space, and to an information processing device connected to such a position indicating device.
- A pen-type position indicating device (hereinafter referred to as an "electronic pen") used in combination with a tablet computer has attracted attention.
- This type of electronic pen is usually provided with a writing pressure sensor that detects pressure (writing pressure) applied to the pen tip.
- When the computer detects the position of the electronic pen on the touch surface, it receives a pen pressure value from the electronic pen and is configured to control the width and transparency of the line drawn at the detected position according to the received pen pressure value.
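- As a minimal illustrative sketch (not part of the disclosed embodiment), such pressure-dependent control of line width and transparency could look as follows, assuming a normalized pressure value and a linear mapping:

```python
def stroke_style(pressure: float, min_width: float = 0.5, max_width: float = 6.0):
    """Map a normalized pen pressure (0.0 to 1.0) to a line width and opacity.

    The linear mapping and the width range are illustrative assumptions; the
    text only states that width and transparency follow the pressure value.
    """
    p = max(0.0, min(1.0, pressure))
    width = min_width + p * (max_width - min_width)
    alpha = p  # transparent at zero pressure, opaque at full pressure
    return width, alpha
```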
- Patent Document 1 discloses a pen-type input device that does not require a touch surface.
- This pen-type input device has a pressure sensor on the side surface, and is configured to detect a user's grip force.
- When a character or figure is drawn with a pen, features corresponding to that character or figure appear in the change in grip force. The technique of Patent Document 1 recognizes these features as characters or figures, thereby attempting to input characters and figures without detecting the position of the pen tip on a touch surface.
- The inventor of the present application has studied writing characters and drawing pictures on a virtual plane using the above-described electronic pen in a virtual reality space (including VR: Virtual Reality, AR: Augmented Reality, and MR: Mixed Reality).
- In that case, however, since there is no actual touch surface, the writing pressure cannot be detected by the writing pressure sensor described above. Without a pen pressure value, the line width and transparency cannot be controlled according to pressure and a writing feel comparable to that of a conventional pen cannot be produced, so another method of suitably controlling line width and transparency was needed.
- One of the objects of the present invention is therefore to provide a position indicating device and an information processing device capable of suitably controlling line width and transparency even when no actual touch surface exists.
- A position indicating device according to one aspect of the present invention includes a housing, a position indicating unit for indicating a position, a first sensor for detecting a first pressure applied to the position indicating unit, a second sensor for detecting a second pressure applied to the housing, a first communication unit for transmitting the first pressure detected by the first sensor, and a second communication unit for transmitting the second pressure detected by the second sensor.
- A position indicating device according to another aspect of the present invention includes a cylindrical external housing that houses a position indicating unit for indicating a position on the input surface of a planar position sensor, a spatial position detection unit that detects spatial position information indicating the position of the position indicating device in space through interaction with an external device, a pressure sensor that detects a force applied to the external housing, and a processing unit configured to output the spatial position information detected by the spatial position detection unit, planar position information for indicating the position of the position indicating unit within the input surface, and pressure information related to the force detected by the pressure sensor.
- An information processing device according to one aspect of the present invention is an information processing device capable of communicating with a position indicating device having a housing, a position indicating unit that indicates a position, and a pressure sensor that detects a force applied to the housing, and includes a communication unit that receives the pressure detected by the pressure sensor, and a controller that controls generation of a 3D object in a virtual reality space based on the position of the position indicating device in space and the pressure received by the communication unit.
- An information processing device according to another aspect of the present invention is a computer connected to a position indicating device that has a cylindrical external housing housing a position indicating unit for indicating a position on the input surface of a planar position sensor, and a pressure sensor that detects a force applied to the surface of the external housing. The computer is configured to receive, from the position indicating device, spatial position information for indicating the position of the position indicating device in space, planar position information for indicating the position of the position indicating unit within the input surface, and pressure information related to the force detected by the pressure sensor.
- When the spatial position information and the pressure information are received, the computer detects a spatial position indicating the position of the position indicating device in space based on the received spatial position information, and performs 3D drawing based on the detected spatial position and the received pressure information.
- When the planar position information and the pressure information are received, the computer detects a planar position indicating the position of the position indicating unit within the touch surface based on the received planar position information, and performs 2D drawing based on the detected planar position and the received pressure information.
- According to the position indicating device of the present invention, which can transmit the pressure detected by the pressure sensor, and the information processing device of the present invention, which can perform 3D drawing based on that pressure, line width and transparency can be suitably controlled even when no actual touch surface exists.
- FIG. 1 is a diagram showing the configuration of a spatial position indicating system 1 according to an embodiment of the present invention.
- FIG. 2A is a perspective view showing the external appearance of the electronic pen 5, and FIG. 2B is a schematic block diagram showing the functional blocks of the electronic pen 5.
- FIG. 3 is a process flowchart showing processing performed by the processing unit 50 of the electronic pen 5.
- FIG. 4 is a process flowchart showing details of the tablet input process shown in FIG. 3.
- FIG. 5 is a process flowchart showing details of the virtual reality space input process shown in FIG. 3.
- FIG. 6 is a process flowchart showing processing performed by the control unit 2a of the computer 2.
- FIG. 7 is a process flowchart showing details of the correlation acquisition process (step S30) shown in FIG. 6.
- FIGS. 8A and 8B are diagrams explaining the correlation f between writing pressure and grip force.
- FIGS. 9A and 9B are diagrams showing specific examples of the drawing area.
- FIG. 10 is a process flowchart showing details of the tablet drawing process shown in FIG. 6.
- FIG. 11 is a process flowchart showing details of the virtual reality space drawing process shown in FIG. 6.
- FIG. 12 is a diagram explaining the meaning of the initial grip force.
- FIG. 13 is a diagram showing the structure of the grip force sensor 55 according to a first example.
- FIG. 14 is a diagram showing the structure of the grip force sensor 55 according to a second example.
- FIG. 15 is a diagram showing the structure of the grip force sensor 55 according to a third example.
- FIG. 16 is a diagram showing the structure of the grip force sensor 55 according to a fourth example.
- FIG. 1 is a diagram showing a configuration of a spatial position indicating system 1 according to an embodiment of the present invention.
- As shown in FIG. 1, the spatial position indicating system 1 includes a computer 2, a virtual reality display 3, a planar position sensor 4, an electronic pen 5, position detection devices 7a and 7b, and spatial position sensors 8a to 8c.
- the spatial position sensors 8a to 8c are provided in the planar position sensor 4, the virtual reality display 3, and the electronic pen 5, respectively.
- Each device shown in FIG. 1 is in principle arranged within a room. In the spatial position indicating system 1, almost the entire room can be used as a virtual reality space.
- the computer 2 includes a control unit 2a and a memory 2b. Each process performed by the computer 2 described below is realized by the control unit 2a reading and executing a program stored in the memory 2b.
- the computer 2 is connected to each of the virtual reality display 3, the position detecting devices 7a and 7b, and the planar position sensor 4 by wire or wirelessly.
- When wired, it is preferable to use, for example, USB (Universal Serial Bus).
- When wireless, it is preferable to use, for example, a wireless LAN such as Wi-Fi (registered trademark) or short-range wireless communication such as Bluetooth (registered trademark).
- the computer 2 is configured to have a function of displaying a virtual reality space on the virtual reality display 3.
- This virtual reality space may be a VR (Virtual Reality) space, an AR (Augmented Reality) space, or an MR (Mixed Reality) space.
- In the VR space, the user wearing the virtual reality display 3 perceives the virtual reality and is cut off from the real world.
- In the AR or MR space, the user wearing the virtual reality display 3 perceives a space in which virtual reality and the real world are mixed.
- The computer 2 functions as a rendering device that renders various 3D objects in a virtual reality space set with reference to the positions of the position detection devices 7a and 7b, and is configured to update the display of the virtual reality display 3 accordingly. As a result, various 3D objects appear in the virtual reality space displayed on the virtual reality display 3. Rendering by the computer 2 is executed based on 3D object information stored in the memory 2b.
- The 3D object information indicates the shape, position, and orientation of a 3D object in the coordinate system of the virtual reality space set by the computer 2, and is stored in the memory 2b for each 3D object to be rendered.
- The 3D objects rendered by the computer 2 include 3D objects representing things that actually exist, such as the planar position sensor 4 and the electronic pen 5 shown in FIG. 1 (hereinafter, "first 3D objects"), and 3D objects that do not exist in reality, such as a virtual tablet (not shown) (hereinafter, "second 3D objects").
- When rendering these objects, the computer 2 first detects the position and orientation of the spatial position sensor 8b in real space and acquires viewpoint information indicating the user's viewpoint based on the detection result.
- When rendering a first 3D object, the computer 2 further detects the position and orientation in real space of the spatial position sensor attached to the corresponding object (for example, the spatial position sensors 8a and 8c) and stores the detection result in the memory 2b. The first 3D object is then rendered in the virtual reality space based on the stored position and orientation, the viewpoint information described above, and the shape stored for that object. Furthermore, particularly for the electronic pen 5, the computer 2 detects operations performed by the user in the virtual reality space by detecting the position of the spatial position sensor 8c, and based on the result either creates a new second 3D object (that is, newly stores 3D object information in the memory 2b) or moves or updates a second 3D object that it already holds (that is, updates the 3D object information stored in the memory 2b).
- When rendering a second 3D object, the computer 2 renders it in the virtual reality space based on the 3D object information stored in the memory 2b and the viewpoint information described above.
- the virtual reality display 3 is a VR display (head mounted display) used by being worn on a human head.
- Various types of virtual reality displays, such as "transmissive" or "non-transmissive" and "glasses type" or "hat type", are commercially available, and any of them can be used as the virtual reality display 3.
- the virtual reality display 3 is connected to each of the spatial position sensor 8a and the electronic pen 5 (including the spatial position sensor 8c) by wire or wirelessly.
- the spatial position sensors 8a and 8c are configured to notify the virtual reality display 3 of received light level information to be described later through this connection.
- the virtual reality display 3 notifies the computer 2 of the light reception level information notified from each of the spatial position sensors 8a and 8c together with the light reception level information of the spatial position sensor 8b incorporated therein.
- the computer 2 detects the positions and orientations of the spatial position sensors 8a to 8c in the real space based on the light reception level information thus notified.
- the planar position sensor 4 is an apparatus having an input surface 4a and a plurality of electrodes (not shown) arranged so as to cover the entire input surface 4a.
- the input surface 4a is preferably a flat surface, and can be made of a material suitable for sliding the pen tip of the electronic pen 5.
- the plurality of electrodes serve to detect a pen signal (described later) transmitted by the electronic pen 5.
- The pen signal detected by each electrode is supplied to the computer 2, and based on the supplied pen signal the computer 2 detects the indicated position of the electronic pen 5 within the input surface 4a and acquires the various data transmitted by the electronic pen 5.
- the planar position sensor 4 may be built in, for example, a tablet terminal having a display function and a processor. In this case, a part or all of the computer 2 can be configured by the processor of the tablet terminal.
- The spatial position sensor 8a is fixedly installed on the surface of the planar position sensor 4. Therefore, the position and orientation of the spatial position sensor 8a detected by the computer 2 indicate the position and orientation of the input surface 4a in the virtual reality space coordinate system.
- The electronic pen 5 is a pen-shaped position indicating device and has a function as an input device for the planar position sensor 4 (hereinafter, the "tablet input function") and a function as an input device for the computer 2 (hereinafter, the "virtual reality space input function").
- the tablet input function includes a function of designating a position in the input surface 4a of the planar position sensor 4.
- the virtual reality space input function includes a function for indicating a position in the virtual reality space. Details of each function will be described later.
- The position detection devices 7a and 7b are base station devices constituting a position detection system for detecting the positions of the spatial position sensors 8a to 8c, and each is configured to emit a laser signal while changing its direction under the control of the computer 2. Each of the spatial position sensors 8a to 8c is composed of a plurality of light receiving sensors, receives the laser signals emitted by the position detection devices 7a and 7b with each light receiving sensor, and acquires light reception level information including each light reception level. The light reception level information acquired in this way is supplied to the computer 2 via the virtual reality display 3, as described above. Although the position detection devices 7a and 7b in the present embodiment are configured to emit laser signals, this configuration is not limiting; for example, other invisible light sensors, visible light sensors, or a combination thereof may be used.
- FIG. 2A is a perspective view showing the external appearance of the electronic pen 5.
- As shown in FIG. 2A, the electronic pen 5 includes a cylindrical external housing 5a that houses a pen tip 5b (position indicating unit) for indicating a position on the input surface 4a of the planar position sensor 4.
- Although various members constituting the grip force sensor 55 (described later) and various switches are attached to the surface of the actual electronic pen 5, they are omitted from the drawing.
- When performing input using the tablet input function, the user holds the external housing 5a with one hand and brings the pen tip 5b into contact with the input surface 4a of the planar position sensor 4. An input operation with the electronic pen 5 is then performed by moving the pen tip 5b on the input surface 4a while maintaining this contact.
- When performing input using the virtual reality space input function, the user performs an input operation with the electronic pen 5 by holding the external housing 5a with one hand and moving the electronic pen 5 through the air.
- Input by the virtual reality space input function includes input to the virtual tablet described above.
- FIG. 2B is a schematic block diagram showing functional blocks of the electronic pen 5.
- As shown in FIG. 2B, the electronic pen 5 includes a processing unit 50, a planar communication unit 51, a spatial communication unit 52, a spatial position detection unit 53, a writing pressure sensor 54, a grip force sensor 55 (pressure sensor), and a force sense generation unit 56. Since the electronic pen 5 may have only one of the writing pressure sensor 54 and the grip force sensor 55, the following description covers that case as well.
- The processing unit 50 is a processor connected to the other units in the electronic pen 5, controlling them and performing the various processes described below.
- Specifically, the processing unit 50 reads out and executes a program stored in an internal memory (not shown), thereby controlling the other units in the electronic pen 5 and executing the various processes described later.
- the planar communication unit 51 is a functional unit that transmits and receives signals to and from the computer 2 via the planar position sensor 4 under the control of the processing unit 50.
- In this transmission and reception, a plurality of electrodes arranged in the input surface 4a of the planar position sensor 4 and a pen tip electrode (not shown) provided near the pen tip 5b of the electronic pen 5 are used as antennas.
- this transmission / reception includes a case where a signal is unilaterally transmitted from the electronic pen 5 to the planar position sensor 4 and a case where a signal is bidirectionally transmitted / received between the electronic pen 5 and the planar position sensor 4.
- In the following description, a signal transmitted from the planar position sensor 4 toward the electronic pen 5 is referred to as a "beacon signal", and a signal transmitted from the electronic pen 5 toward the planar position sensor 4 is referred to as a "pen signal".
- an electromagnetic induction method or an active electrostatic method can be used as a specific method of signal transmission and reception in this case.
- the beacon signal is a signal transmitted by the computer 2 at a predetermined time interval, for example, and includes a command for controlling the electronic pen 5 from the computer 2.
- The pen signal includes a burst signal, which is an unmodulated carrier wave serving as planar position information for indicating the position of the pen tip 5b within the input surface 4a, and a data signal, which is obtained by modulating the carrier wave with the data whose transmission is requested by a command.
- the space communication unit 52 has a function of transmitting and receiving signals to and from the computer 2 via the virtual reality display 3 under the control of the processing unit 50. As described above, transmission / reception of this signal is realized by wire or wireless.
- Note that the planar position sensor 4 does not intervene in signal transmission and reception between the spatial communication unit 52 and the computer 2.
- The spatial position detection unit 53 is a functional unit constituted by the spatial position sensor 8c shown in FIG. 1, and serves to detect the above-described light reception level information (spatial position information for indicating the position of the electronic pen 5 in space) through interaction with external devices (specifically, the position detection devices 7a and 7b). Specifically, it periodically or continuously performs an operation of detecting the laser signals transmitted by the position detection devices 7a and 7b, generates light reception level information corresponding to the detected laser signals, and supplies it to the processing unit 50.
- the writing pressure sensor 54 is a sensor configured to be able to detect a force (writing pressure) applied to the pen tip 5b, and is configured by, for example, a capacitance sensor (not shown) whose capacitance value changes with writing pressure.
- the processing unit 50 has a function of acquiring the writing pressure detected by the writing pressure sensor 54 and generating writing pressure information related to the acquired writing pressure.
- the writing pressure information is, for example, a digital value obtained by performing analog-digital conversion on writing pressure that is analog information.
- the processing unit 50 has a function of acquiring the grip force detected by the grip force sensor 55 and generating pressure information related to the acquired grip force.
- the pressure information is, for example, a digital value obtained by performing analog-digital conversion on the grip force that is analog information.
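- As an illustrative sketch of this conversion, assuming a 10-bit resolution and a full-scale force that the patent does not specify:

```python
def to_pressure_info(grip_force_newtons: float,
                     full_scale_newtons: float = 20.0,
                     bits: int = 10) -> int:
    """Quantize an analog grip force into a digital pressure-information value.

    The 10-bit resolution and the full-scale force are assumed values; the
    text only says the analog grip force is analog-to-digital converted.
    """
    levels = (1 << bits) - 1
    ratio = max(0.0, min(1.0, grip_force_newtons / full_scale_newtons))
    return round(ratio * levels)
```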
- the force sense generator 56 has a function of generating a force sense according to a control signal supplied from the computer 2.
- the force sense here is, for example, vibration of the external housing 5a.
- For example, when the pen tip 5b reaches the display surface of the virtual tablet, the computer 2 supplies a control signal to the electronic pen 5 via the spatial communication unit 52, thereby causing the force sense generation unit 56 to generate a force sense.
- In this way, the user can obtain the sensation of the pen tip 5b colliding with the surface of a virtual tablet that does not actually exist.
- When performing input using the tablet input function, the processing unit 50 first performs an operation of detecting the beacon signal transmitted by the computer 2, using the planar communication unit 51. When a beacon signal is detected as a result, the processing unit 50 sequentially outputs the above-described burst signal and data signal to the planar communication unit 51 as responses to the beacon signal.
- the data signal output in this way can include the above-described writing pressure information or pressure information.
- the planar communication unit 51 is configured to transmit the burst signal and the data signal thus input to the computer 2 via the planar position sensor 4.
- Based on the reception strength of the burst signal at each of the plurality of electrodes arranged in the input surface 4a, the computer 2 detects the planar position indicating the position of the pen tip 5b within the input surface 4a.
- the data transmitted by the electronic pen 5 is acquired by receiving the data signal using the electrode closest to the detected planar position among the plurality of electrodes arranged in the input surface 4a.
- the computer 2 performs 2D drawing based on the detected plane position and the received data. Details of 2D drawing will be described later.
- the tablet input function is thus realized.
- When performing input using the virtual reality space input function, the processing unit 50 sequentially outputs the light reception level information supplied from the spatial position detection unit 53 to the spatial communication unit 52.
- the processing unit 50 is configured to output the writing pressure information or the pressure information generated as described above to the spatial communication unit 52 together with the output of the light reception level information.
- the spatial communication unit 52 is configured to transmit each piece of information thus input to the computer 2.
- When the computer 2 receives the above information from the spatial communication unit 52, it detects a spatial position indicating the position of the electronic pen 5 in space based on the received light reception level information.
- Information indicating the shape of the electronic pen 5 and the relative positional relationship between the spatial position detection unit 53 and the pen tip 5b may be stored in the computer 2 in advance; the computer 2 may then convert the position obtained directly from the light reception level information into the position of the pen tip 5b based on this information, and detect the converted position as the spatial position.
- the computer 2 performs 3D drawing based on the detected spatial position and the received writing pressure information or pressure information. Details of 3D drawing will also be described later.
- the virtual reality space input function is thus realized.
- FIG. 3 is a process flow diagram showing processing performed by the processing unit 50 of the electronic pen 5.
- FIG. 4 is a process flowchart showing details of the tablet input process (step S1) shown in FIG. 3, and FIG. 5 is a process flowchart showing details of the virtual reality space input process (step S2) shown in FIG. 3.
- Hereinafter, the operation of the electronic pen 5 will be described in detail with reference to FIGS. 3 to 5.
- the processing unit 50 performs a tablet input process (step S1) and a virtual reality space input process (step S2) in a time-sharing manner.
- The processing unit 50 performing the tablet input process first carries out a beacon signal detection operation using the planar communication unit 51 (steps S10 and S11).
- the planar communication unit 51 attempts to detect a beacon signal by demodulating the signal arriving at the pen tip electrode.
- When a beacon signal is detected, the processing unit 50 outputs the burst signal to the planar communication unit 51, thereby causing the planar communication unit 51 to transmit it (step S12).
- the subsequent processing differs depending on whether the electronic pen 5 has the writing pressure sensor 54 or not.
- If the electronic pen 5 has the writing pressure sensor 54, the processing unit 50 acquires the writing pressure from the output of the writing pressure sensor 54 (step S13) and transmits a data signal including writing pressure information related to the acquired writing pressure via the planar communication unit 51 (step S14).
- If the electronic pen 5 does not have the writing pressure sensor 54, the processing unit 50 acquires the grip force from the output of the grip force sensor 55 (step S15) and transmits a data signal including pressure information related to the acquired grip force via the planar communication unit 51 (step S16).
- After the transmission in step S14 or step S16, the processing unit 50 ends the tablet input process and, as can be seen from FIG. 3, starts the next virtual reality space input process (step S2).
- The processing unit 50 performing the virtual reality space input process first carries out a laser signal detection operation using the spatial position detection unit 53 (steps S20 and S21). If no laser signal is detected, the virtual reality space input process ends. If a laser signal is detected, the processing unit 50 acquires the light reception level information corresponding to the laser signal from the spatial position detection unit 53 and transmits it via the spatial communication unit 52 (step S22).
- the subsequent processing differs depending on whether the electronic pen 5 has the writing pressure sensor 54 or not.
- If the electronic pen 5 does not have the writing pressure sensor 54, the processing unit 50 acquires the grip force from the output of the grip force sensor 55 (step S26) and transmits pressure information related to the acquired grip force via the spatial communication unit 52 (step S27).
- If the electronic pen 5 has the writing pressure sensor 54, the processing unit 50 acquires the writing pressure from the output of the writing pressure sensor 54 (step S23) and determines whether the acquired writing pressure exceeds a predetermined value (step S24). This determination checks whether the pen tip 5b is in contact with an actual surface, and is performed so that the writing pressure is not used when there is no contact.
- The actual surface mentioned here corresponds to, for example, the surface of a simple board. By arranging an actual board to match the display position of the virtual tablet, for example, the writing pressure sensor 54 can also be used for input to the virtual tablet.
- If it is determined in step S24 that the writing pressure exceeds the predetermined value, the processing unit 50 transmits writing pressure information related to the acquired writing pressure via the spatial communication unit 52 (step S25). If it does not exceed the predetermined value, the processing unit 50 moves to step S26 and transmits pressure information instead (steps S26 and S27). After the transmission in step S25 or step S27, the processing unit 50 ends the virtual reality space input process and, as can be seen from FIG. 3, starts the next tablet input process (step S1).
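- The two processes of FIGS. 3 to 5 can be summarized in code form. The following is a minimal sketch, assuming a hypothetical `pen` object whose attribute and method names are illustrative and not taken from the patent:

```python
PRESSURE_THRESHOLD = 0.1  # the "predetermined value" of step S24 (assumed)

def pen_main_loop(pen):
    """Time-shared loop of the processing unit 50 (FIG. 3, steps S1 and S2)."""
    while True:
        # Tablet input process (FIG. 4, steps S10-S16)
        if pen.planar_comm.detect_beacon():              # S10, S11
            pen.planar_comm.send_burst()                 # S12
            if pen.has_pen_pressure_sensor:
                p = pen.pen_pressure_sensor.read()       # S13
                pen.planar_comm.send_data(pressure=p)    # S14
            else:
                g = pen.grip_sensor.read()               # S15
                pen.planar_comm.send_data(grip=g)        # S16

        # Virtual reality space input process (FIG. 5, steps S20-S27)
        if pen.space_detector.detect_laser():            # S20, S21
            pen.space_comm.send(pen.space_detector.levels())  # S22
            if pen.has_pen_pressure_sensor:
                p = pen.pen_pressure_sensor.read()       # S23
                if p > PRESSURE_THRESHOLD:               # S24
                    pen.space_comm.send(pressure=p)      # S25
                    continue                             # process ends here
            g = pen.grip_sensor.read()                   # S26
            pen.space_comm.send(grip=g)                  # S27
```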
- FIG. 6 is a processing flowchart showing processing performed by the control unit 2a of the computer 2.
- FIG. 7 is a process flowchart showing details of the correlation acquisition process (step S30) shown in FIG. 6, FIG. 10 is a process flowchart showing details of the tablet drawing process (step S35) shown in FIG. 6, and FIG. 11 is a process flowchart showing details of the virtual reality space drawing process (step S41) shown in FIG. 6.
- Hereinafter, the operation of the computer 2 will be described in detail with reference to these drawings.
- As shown in FIG. 6, the control unit 2a first executes a correlation acquisition process (step S30).
- the correlation acquisition process is a process of acquiring the correlation f between the writing pressure detected by the writing pressure sensor 54 and the grip force detected by the grip force sensor 55.
- Specifically, as shown in FIG. 7, the control unit 2a first causes the electronic pen 5 to repeat, a predetermined number of times, a writing pressure detection operation by the writing pressure sensor 54 and a grip force detection operation by the grip force sensor 55, receiving writing pressure information and pressure information from the electronic pen 5 each time (steps S50 to S52).
- After the predetermined number of repetitions, the control unit 2a acquires the correlation f between writing pressure and grip force based on the plurality of obtained combinations of writing pressure and grip force (step S53), and ends the correlation acquisition process.
- In one example, the correlation is expressed as: writing pressure = f(grip force).
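- The patent does not fix the form of f. As one hedged sketch, a linear least-squares fit over the (grip force, writing pressure) pairs collected in steps S50 to S52 could be used:

```python
def acquire_correlation(samples):
    """Fit the correlation f, writing_pressure = f(grip_force) (step S53).

    `samples` is a list of (grip_force, writing_pressure) pairs collected
    over the repeated detection operations (steps S50 to S52). The linear
    form of f is an assumption made for illustration.
    """
    n = len(samples)
    sx = sum(g for g, _ in samples)
    sy = sum(p for _, p in samples)
    sxx = sum(g * g for g, _ in samples)
    sxy = sum(g * p for g, p in samples)
    denom = n * sxx - sx * sx
    if denom == 0:
        raise ValueError("grip force samples must not all be identical")
    a = (n * sxy - sx * sy) / denom   # slope
    b = (sy - a * sx) / n             # intercept
    return lambda grip: a * grip + b  # the correlation f
```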
- FIGS. 8A and 8B are diagrams for explaining the correlation f between the writing pressure and the grip force.
- In FIGS. 8A and 8B, P denotes the writing pressure, G denotes the grip force, and F denotes the frictional force between the user's hand and the surface of the electronic pen 5.
- the control unit 2a subsequently sets a drawing area in the virtual reality space (step S31).
- the drawing area is an area where 3D drawing with the electronic pen 5 is executed.
- FIGS. 9A and 9B are diagrams showing specific examples of drawing areas.
- FIG. 9A shows an example in which an area within a predetermined distance from the display surface of the virtual tablet B is set as the drawing area A.
- the drawing area A according to this example is an area that enables input to the virtual tablet B.
- In this case, in the virtual reality space drawing process (step S41) described later, the control unit 2a executes 3D drawing after replacing the detected spatial position with the spatial position projected onto the display surface of the virtual tablet B.
- the user can draw a plane figure on the display surface of the virtual tablet B.
- The predetermined distance is preferably a value greater than zero, because it is difficult for the user to keep the electronic pen 5 in contact with a display surface that does not physically exist when attempting to input on the display surface of the virtual tablet B with the electronic pen 5.
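- The projection onto the display surface can be illustrated as follows, modeling the surface as a plane given by a point and a unit normal (a formulation assumed here for illustration):

```python
def project_to_display(pos, plane_point, plane_normal):
    """Project a detected spatial position onto the virtual tablet's display
    surface, as in the replacement performed before 3D drawing.
    """
    # Signed distance from the point to the plane along the normal.
    d = sum((pos[i] - plane_point[i]) * plane_normal[i] for i in range(3))
    # Move the point back along the normal onto the plane.
    return tuple(pos[i] - d * plane_normal[i] for i in range(3))
```

- Whether a position lies inside the drawing area A of FIG. 9A can then be checked by comparing the magnitude of the same signed distance against the predetermined distance.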
- FIG. 9B shows an example in which an arbitrary three-dimensional space is set as the drawing area A.
- In this case, the control unit 2a performs 3D drawing without the replacement described for the example of FIG. 9A. As a result, the user can draw a solid figure in the drawing area A.
- Next, the control unit 2a performs a detection operation for the light reception level information and the burst signal (step S32). Specifically, this includes receiving light reception level information from the electronic pen 5 by wire or wirelessly, and receiving a burst signal from the electronic pen 5 via the planar position sensor 4. As a result of step S32, the control unit 2a proceeds to step S34 when a burst signal is detected (positive determination in step S33), and to step S36 when no burst signal is detected (negative determination in step S33).
- In step S34, the control unit 2a detects the above-described planar position (the position of the pen tip 5b within the input surface 4a) based on the detected burst signal. A tablet drawing process for performing 2D drawing on, for example, the display of the tablet terminal then follows (step S35).
- In the tablet drawing process, as shown in FIG. 10, the control unit 2a first performs a detection operation for the data signal transmitted from the electronic pen 5 via the planar position sensor 4 (step S60), and then determines which of writing pressure information and pressure information is included in the data signal (step S61).
- If it is determined that writing pressure information is included, the control unit 2a further determines whether the writing pressure indicated by the writing pressure information is equal to or less than a predetermined normal ON load (for example, 0) (step S68). If it is, the process ends without performing 2D drawing; this handles the case where the pen tip 5b of the electronic pen 5 is considered not to be in contact with the input surface 4a (the so-called hover state). If it is determined in step S68 that the writing pressure is greater than the normal ON load, the control unit 2a performs 2D drawing on, for example, the display of the tablet terminal serving as the planar position sensor 4, based on the planar position detected in step S34 and the writing pressure indicated by the writing pressure information (step S69).
- the 2D drawing performed in step S69 includes a rendering process and a display process.
- In the rendering process, the control unit 2a places a circle whose radius corresponds to the writing pressure at each of a series of sequentially detected planar positions, and generates two-dimensional curve data (ink data) whose width corresponds to the writing pressure by smoothly connecting the circumferences of these circles.
- the display process is a process of displaying the curve data generated in this way on a display of a tablet terminal that is the planar position sensor 4, for example.
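- A minimal sketch of the rendering step, producing the circle stamps from which the variable-width curve is built (the linear radius scaling is an assumption):

```python
def ink_stamps(positions, pressures, base_radius=2.0):
    """Generate one circle per detected planar position; joining the circle
    outlines yields two-dimensional curve data (ink data) whose width
    follows the writing pressure.
    """
    return [(x, y, base_radius * max(p, 0.0))
            for (x, y), p in zip(positions, pressures)]
```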
- If it is determined in step S61 that pressure information is included, the control unit 2a executes a process of converting the grip force indicated by the pressure information into writing pressure (steps S62 to S67). Specifically, the control unit 2a first determines whether the reset flag A is true or false (step S62).
- The reset flag A indicates whether the electronic pen 5 has just entered the range in which the burst signal reaches the planar position sensor 4; immediately after entering, the determination result in step S62 is false.
- Having determined false in step S62, the control unit 2a further determines whether the grip force indicated by the pressure information is greater than or equal to a predetermined value (step S63). If it is less than the predetermined value, the grip force indicated by the pressure information is set as the initial grip force (step S64); if it is greater than or equal to the predetermined value, the predetermined value is set as the initial grip force (step S65).
- The initial grip force is a variable used so that the grip force at the moment the electronic pen 5 enters the range in which the burst signal reaches the planar position sensor 4 (at pen-down) is treated as zero. Step S65 sets an upper limit on the initial grip force and serves, for example, to prevent the user from being unable to exert sufficient writing pressure (for instance, to widen the line) because the initial grip force was excessively large.
- FIG. 12 is a diagram for explaining the meaning of the initial grip force.
- In FIG. 12, the horizontal axis represents the force applied to the surface of the external housing 5a and the vertical axis represents the grip force detected by the grip force sensor 55.
- As shown, the control unit 2a does not use the grip force detected by the grip force sensor 55 as-is, but uses the value obtained by subtracting the initial grip force from it. In this way, the user can input writing pressure through grip force by increasing or decreasing the grip, relative to the grip force at the time of pen-down.
- After step S64 or step S65, the control unit 2a sets the reset flag A to true (step S66) and then converts the grip force into writing pressure using the correlation f (step S67). Step S67 is also executed when the determination in step S62 is true.
- In step S67, the control unit 2a substitutes into the correlation f, as the grip force, the value obtained by subtracting the initial grip force from the grip force indicated by the pressure information. The user can thereby input writing pressure through grip force, relative to the grip force at the time of pen-down.
- Having obtained the writing pressure in step S67, the control unit 2a executes steps S68 and S69 using that writing pressure, realizing 2D drawing similar to the case where writing pressure information is included in the data signal.
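- The conversion of steps S62 to S67 can be sketched as follows, with the cap of step S65 shown as an assumed value and the per-pen state kept in a dictionary:

```python
def grip_to_pressure(grip, state, f, initial_cap=5.0):
    """Convert a reported grip force into a writing pressure (steps S62-S67).

    `state` carries the reset flag A and the initial grip force across
    calls; `f` is the correlation acquired in step S53. `initial_cap`
    stands in for the predetermined value of steps S63 to S65 (assumed).
    """
    if not state.get("reset_flag"):                     # S62: pen just entered range
        state["initial_grip"] = min(grip, initial_cap)  # S63, S64, S65
        state["reset_flag"] = True                      # S66
    effective = max(grip - state["initial_grip"], 0.0)
    return f(effective)                                 # S67
```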
- With step S69, the tablet drawing process ends and the process returns to step S32 in FIG. 6.
- The control unit 2a, having proceeded to step S36 in FIG. 6, first sets the reset flag A to false (step S36). In this way, the reset flag A is returned to false when the electronic pen 5 leaves the range in which the burst signal reaches the planar position sensor 4.
- Next, the control unit 2a determines whether light reception level information was detected by the detection operation in step S32 (step S37). If so, the control unit 2a detects the above-described spatial position (the position of the electronic pen 5, or of its pen tip 5b, in space) based on the detected light reception level information (step S38). Subsequently, the control unit 2a determines whether the detected spatial position lies within the drawing area set in step S31 (step S39).
- If it determines in step S39 that the position is within the drawing area, the control unit 2a executes the virtual reality space drawing process for performing 3D drawing in the virtual reality space (step S41).
- A process of replacing the detected spatial position with the spatial position projected onto the display surface of the virtual tablet may be inserted between step S39 and step S41 (step S40).
- Step S40 can be executed only when the drawing area containing the detected spatial position is an area set with reference to the display surface of the virtual tablet B, as shown in FIG. 9A. As a result, the user can draw a plane figure on the display surface of the virtual tablet, as described above.
- In the virtual reality space drawing process, as shown in FIG. 11, the control unit 2a first performs an operation of receiving writing pressure information or pressure information (step S70), and then determines which of the two has been received (step S71).
- If it is determined in step S71 that writing pressure information has been received, the control unit 2a further determines whether the writing pressure indicated by the writing pressure information is equal to or less than the predetermined normal ON load (for example, 0) (step S80). If it is, the process ends without performing 3D drawing; this handles the case where the pen tip 5b of the electronic pen 5 is considered not to be in contact with the above-described actual board (for example, one arranged to match the display position of the virtual tablet).
- If it is determined in step S80 that the writing pressure is greater than the normal ON load, the control unit 2a performs 3D drawing in the virtual reality space based on the spatial position detected in step S38 (or the spatial position acquired in step S40) and the writing pressure indicated by the writing pressure information (step S81).
- the 3D drawing performed in step S79 includes a rendering process and a display process.
- In the rendering process, the control unit 2a places a sphere whose radius corresponds to the writing pressure at each of a series of sequentially detected spatial positions, and generates three-dimensional curve data whose cross-sectional diameter corresponds to the writing pressure by smoothly connecting the surfaces of these spheres.
- the display process is a process for displaying the curve data thus generated in the virtual reality space.
- 2D drawing on the display surface may be performed instead of 3D drawing.
- If it is determined in step S71 that pressure information has been received, the control unit 2a executes a process of converting the grip force indicated by the pressure information into writing pressure (steps S72 to S77).
- The details of this process are the same as steps S62 to S67 shown in FIG. 10, and the converted writing pressure is obtained in step S77. However, the reset flag B is used here instead of the reset flag A.
- The reset flag B indicates whether the electronic pen 5 has just entered the drawing area; immediately after entering, the determination result in step S72 is false.
- Having obtained the writing pressure in step S77, the control unit 2a executes steps S78 and S79 using that writing pressure.
- Steps S78 and S79 are the same as steps S80 and S81, except that instead of the normal ON load a different value, preferably a space ON load set larger than the normal ON load, is used; that is, step S78 determines whether the writing pressure obtained from the pressure information is equal to or less than a predetermined space ON load (> normal ON load). Thereby, 3D drawing similar to the case where writing pressure information is received is realized.
- The space ON load is used instead of the normal ON load in step S78 because, when the electronic pen 5 is operated floating in the air, the grip force increases by the amount needed to support the pen's weight, compared with operation in contact with a fixed surface such as the input surface 4a.
- By using a space ON load larger than the normal ON load in step S78, 3D drawing can be performed appropriately despite this increase in grip force.
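- The two thresholds can be expressed as a simple context-dependent test; the space ON load value below is assumed, as the text gives no number:

```python
NORMAL_ON_LOAD = 0.0  # threshold on a real surface (example value from the text)
SPACE_ON_LOAD = 1.5   # larger threshold for mid-air input (assumed value)

def pen_is_down(pressure: float, in_air: bool) -> bool:
    """Decide whether drawing should occur (steps S68, S78, S80).

    A larger threshold is used in the air to absorb the extra grip force
    needed to support the pen's own weight.
    """
    threshold = SPACE_ON_LOAD if in_air else NORMAL_ON_LOAD
    return pressure > threshold
```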
- With step S79, the virtual reality space drawing process ends, the process returns to step S32 in FIG. 6, and the next detection operation for light reception level information and the burst signal is executed. If the control unit 2a determines in step S37 of FIG. 6 that no light reception level information was detected, or determines in step S39 that the spatial position is outside the drawing area, it sets the reset flag B to false (step S42), returns to step S32, and executes the next detection operation. By executing step S42, the reset flag B is returned to false when the electronic pen 5 leaves the drawing area (including when it leaves the virtual reality space).
- As described above, the electronic pen 5 according to the present embodiment is configured to output pressure information relating to the grip force, and the computer 2 is configured to execute 3D drawing and 2D drawing based on that pressure information.
- Because the computer 2 is configured in this way, line width and transparency can be suitably controlled even when there is no actual touch surface, for example as sketched below.
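As one possible (hypothetical) way the computer 2 might turn such pressure information into stroke attributes, a linear mapping from normalized pen pressure to line width and opacity could look like this; the ranges are assumptions, since the patent only states that width and transparency are controlled:

```python
def stroke_attributes(pen_pressure: float,
                      width_range: tuple = (0.5, 8.0),   # px, assumed
                      alpha_range: tuple = (0.2, 1.0)) -> tuple:
    """Map normalized pen pressure (0..1) linearly onto line width and
    opacity; a stronger grip draws a wider, more opaque line."""
    p = max(0.0, min(pen_pressure, 1.0))
    width = width_range[0] + p * (width_range[1] - width_range[0])
    alpha = alpha_range[0] + p * (alpha_range[1] - alpha_range[0])
    return width, alpha
```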
- FIG. 13 is a diagram showing the structure of the grip force sensor 55 according to the first example.
- The grip force sensor 55 according to this example is a touch sensor capable of detecting a pressing force by, for example, a pressure-sensitive method, and is disposed on a side surface of the external housing 5a.
- the processing unit 50 acquires the pressing force detected by the grip force sensor 55 as the grip force.
- FIG. 14 is a diagram showing the structure of the grip force sensor 55 according to the second example.
- the grip force sensor 55 according to the present example is configured by a button mechanism that can detect the pressing amount stepwise or continuously, and is disposed on the side surface of the external housing 5a.
- the processing unit 50 acquires the pressing amount detected by the grip force sensor 55 as the grip force.
- Examples of the button mechanism include ones using an actuator, a Hall element, and a strain gauge.
- FIG. 15 is a diagram showing the structure of the grip force sensor 55 according to the third example.
- The grip force sensor 55 according to the present example also serves as the pen pressure sensor 54, and is configured by a capacitor having a structure in which the dielectric 11 is disposed between the two electrode plates 10 and 12.
- The electrode plate 10 is connected to the end of the core body 13 opposite the end that constitutes the pen tip 5b.
- The electrode plate 12 is connected to a button mechanism 14 disposed on the side surface of the external housing 5a.
- The capacitor according to this example is configured such that the distance between the electrode plates 10 and 12 changes according to the force applied to the pen tip 5b, and as a result the capacitance also changes. Further, as understood from a comparison of FIG. 15A and FIG. 15B, the electrode plate 12 moves in the horizontal direction according to the pressing amount of the button mechanism 14, and as a result the capacitance changes as well.
- The processing unit 50 according to this example treats this capacitor as the pen pressure sensor 54 when acquiring the pen pressure from its capacitance, and as the grip force sensor 55 when acquiring the grip force from its capacitance. According to this example, a single capacitor thus realizes both the grip force sensor 55 and the pen pressure sensor 54.
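A minimal parallel-plate model makes the dual behavior concrete. Only the relation C = εA/d and the two movement directions come from the description; all geometry, the linear force-to-displacement constants, and the sign of the overlap change are assumptions:

```python
EPSILON = 3.5e-11  # permittivity of the dielectric 11 (assumed value)

def capacitance(pen_tip_force: float, button_press: float,
                area0: float = 1.0e-4, gap0: float = 1.0e-3,
                k_gap: float = 2.0e-7, k_area: float = 1.0e-5) -> float:
    """Pen-tip force narrows the plate gap (force on pen tip 5b); pressing
    the button mechanism 14 slides electrode plate 12 horizontally, here
    assumed to reduce the plate overlap. Both change the capacitance."""
    gap = max(gap0 - k_gap * pen_tip_force, 1.0e-5)
    area = max(area0 - k_area * button_press, 0.0)
    return EPSILON * area / gap
```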
- The grip force sensor 55 and the pen pressure sensor 54 can also be realized together by a load cell. Since a load cell can measure the stress in each of the X, Y, and Z directions individually, the pen pressure, which is the force along the pen axis, and the grip force, which is the force perpendicular to the pen axis, can be calculated individually from the measured stresses, as in the sketch below.
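A sketch of that decomposition, assuming the load cell's Z axis is aligned with the pen axis (the axis assignment is an assumption):

```python
import math

def split_load_cell(fx: float, fy: float, fz: float) -> tuple:
    """Split a 3-axis load cell reading into pen pressure (the axial
    component) and grip force (the magnitude of the perpendicular component)."""
    pen_pressure = abs(fz)
    grip_force = math.hypot(fx, fy)
    return pen_pressure, grip_force
```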
- FIG. 16 is a diagram showing the structure of the grip force sensor 55 according to the fourth example.
- the grip force sensor 55 according to this example has a structure in which the pressure-sensitive sensor 15, the substrate 16, and the dome button 17 are stacked, and is disposed on the side surface of the external housing 5a so that the surface on the dome button 17 side is exposed.
- The pressure-sensitive sensor 15 is a sensor configured to sense a pressing force applied to the surface of the external housing 5a.
- The dome button 17 is a button mechanism that is turned on and off by the user.
- FIG. 17 is a process flow diagram illustrating a process performed by the processing unit 50 of the electronic pen 5 when the grip force sensor 55 according to the fourth example is used.
- FIG. 17A is obtained by adding steps S90 to S95 to the processing flowchart shown in FIG.
- FIG. 17B is obtained by adding step S96 to the processing flowchart shown in FIG. 4 or FIG.
- The operation of the electronic pen 5 including the grip force sensor 55 according to the fourth example will now be described with reference to FIG. 17.
- The processing unit 50 first determines whether the dome button 17 is on or off (step S90). If it is determined to be off, the processing unit 50 sets the reset flag C to false (step S95) and starts the tablet input process of step S1.
- The reset flag C is a flag indicating whether or not the dome button 17 has just been pressed; immediately after it is pressed, the determination result in step S91 described later is false.
- If the processing unit 50 determines in step S90 that the button is on, it next determines whether the reset flag C is true or false (step S91).
- If the flag is determined to be true, the processing unit 50 immediately starts the tablet input process of step S1.
- If the flag is determined to be false, the processing unit 50 acquires the grip force from the grip force sensor 55 (step S92) and sets the acquired grip force as the initial grip force (step S93).
- The initial grip force is a variable used to treat the grip force at the moment the dome button 17 is pressed as 0; it is unrelated to the initial grip force used in the processing flow of the computer 2 shown in FIG. 10 or FIG. 11.
- Having executed step S93, the processing unit 50 sets the reset flag C to true (step S94) and starts the tablet input process of step S1.
- In step S96, the processing unit 50 subtracts the initial grip force from each of the grip forces acquired in step S15 of FIG. 4 and in step S26, and uses the result as the grip force. That is, it is not the grip force itself acquired in steps S15 and S26 that is transmitted to the computer 2, but pressure information regarding the grip force obtained by the subtraction in step S96.
- Because the processing unit 50 executes the above processing, the user of the electronic pen 5 according to the present example can input pen pressure through grip force by increasing or decreasing the grip force relative to the grip force at the moment the dome button 17 was intentionally turned on, as the sketch below illustrates.
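The flag-C bookkeeping and the step S96 subtraction together amount to latching a baseline when the dome button 17 is pressed. A minimal sketch (the class shape and sensor API are hypothetical; the step numbers follow the text, and the behavior while the button is off is an assumption):

```python
class GripBaseline:
    """Re-zeroes the reported grip force at the moment the dome button 17
    turns on, per steps S90-S96 of FIG. 17."""

    def __init__(self) -> None:
        self.reset_flag_c = False
        self.initial_grip = 0.0

    def report(self, button_on: bool, raw_grip: float) -> float:
        if not button_on:
            self.reset_flag_c = False        # steps S90/S95: button is off
            return raw_grip                  # off-state behavior is assumed
        if not self.reset_flag_c:
            self.initial_grip = raw_grip     # steps S92/S93: latch baseline
            self.reset_flag_c = True         # step S94
        return raw_grip - self.initial_grip  # step S96: relative grip force
```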
- FIG. 18 is a diagram showing the structure of the grip force sensor 55 according to the fifth example.
- the grip force sensor 55 according to this example is configured by a capacitor having a structure in which the dielectric 19 and the rubber 20 are disposed between the two electrode plates 18 and 21, and is disposed on the side surface of the external housing 5a.
- the processing unit 50 according to this example is configured to acquire the capacitance of the capacitor that is the grip force sensor 55 as the grip force.
- According to this grip force sensor 55, not only the pressing force but also the force in the pen axis direction can be detected as the grip force.
- FIG. 19 is a diagram showing the structure of the grip force sensor 55 according to the sixth example.
- The electronic pen 5 according to the present example has a grip member 22 attached to the external housing 5a, and the grip force sensor 55 according to the present example is built into the grip member 22.
- FIG. 19A is a side view of the electronic pen 5 with the grip member 22 attached, FIG. 19B is a top view of the same, and FIG. 19C shows the electronic pen 5 with the grip member 22 attached in use.
- The grip member 22 includes a cylindrical base 22a fitted to the external housing 5a and a finger rest 22b extending in an arch shape from one end of the base 22a.
- the user uses the electronic pen 5 with the index finger placed on the finger rest 22b.
- FIG. 19 shows an example in which the grip member 22 is separate from the external housing 5a, but these may be integrally formed.
- The grip force sensor 55 according to this example is, for example, a strain gauge embedded in the finger rest 22b, configured to detect the force applied by the user's index finger (the pressing force on the finger rest 22b).
- The processing unit 50 according to this example acquires the force thus detected as the grip force.
- The processing unit 50 can also detect a user action of shaking the electronic pen 5. Combining this with detection of the pressing force on the finger rest 22b by the grip force sensor 55 makes it possible to simulate a tap operation on a touch surface, as sketched below.
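A sketch of that combination (the thresholds and the source of the shake signal are assumptions; the text only states that shake detection and finger-rest pressure are combined):

```python
SHAKE_THRESHOLD = 15.0  # assumed motion-magnitude threshold (m/s^2)
PRESS_THRESHOLD = 0.3   # assumed finger-rest force threshold (N)

def simulated_tap(shake_magnitude: float, finger_rest_force: float) -> bool:
    """Report a simulated touch-surface tap when the pen is shaken while
    the finger rest 22b is pressed."""
    return (shake_magnitude > SHAKE_THRESHOLD
            and finger_rest_force > PRESS_THRESHOLD)
```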
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Priority Applications (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201980029211.9A CN112074802A (zh) | 2018-05-18 | 2019-04-04 | 位置指示装置及信息处理装置 |
| JP2020519507A JP6887060B2 (ja) | 2018-05-18 | 2019-04-04 | 位置指示装置及び情報処理装置 |
| EP19804098.2A EP3796136B1 (en) | 2018-05-18 | 2019-04-04 | Position indication device and information processing device |
| US17/084,444 US12197653B2 (en) | 2018-05-18 | 2020-10-29 | Position indicating device and information processing device |
| US18/977,449 US20250110581A1 (en) | 2018-05-18 | 2024-12-11 | Position indicating device and information processing device |
| US19/097,598 US20250231626A1 (en) | 2018-05-18 | 2025-04-01 | Position indicating device and information processing device |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018096313 | 2018-05-18 | ||
| JP2018-096313 | 2018-05-18 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/084,444 Continuation US12197653B2 (en) | 2018-05-18 | 2020-10-29 | Position indicating device and information processing device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019220803A1 (ja) | 2019-11-21 |
Family
ID=68540195
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/015042 Ceased WO2019220803A1 (ja) | 2018-05-18 | 2019-04-04 | 位置指示装置及び情報処理装置 |
Country Status (5)
| Country | Link |
|---|---|
| US (3) | US12197653B2 (en) |
| EP (1) | EP3796136B1 (en) |
| JP (4) | JP6887060B2 (ja) |
| CN (1) | CN112074802A (zh) |
| WO (1) | WO2019220803A1 (ja) |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11815968B2 (en) * | 2017-12-14 | 2023-11-14 | Societe Bic | Stylus for a touchscreen |
| JP7401427B2 (ja) * | 2018-05-21 | 2023-12-19 | 株式会社ワコム | 位置指示デバイス及び空間位置指示システム |
| KR20220012073A (ko) | 2020-07-22 | 2022-02-03 | 삼성전자주식회사 | 가상 사용자 인터랙션을 수행하기 위한 방법 및 그 장치 |
| CN112835457A (zh) * | 2021-02-06 | 2021-05-25 | 上海萃钛智能科技有限公司 | 一种3d魔术笔及基于该3d魔术笔的显示系统及其使用方法 |
| CN115953924B (zh) * | 2022-12-14 | 2025-07-15 | 立讯精密科技(南京)有限公司 | 一种基于虚拟现实的练字方法、装置、系统及电子设备 |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0867101A (ja) | 1994-08-27 | 1996-03-12 | Dr Ing H C F Porsche Ag | 自動車のための車輪 |
| JP2006293605A (ja) * | 2005-04-08 | 2006-10-26 | Canon Inc | 情報処理方法およびシステム |
| JP2009266097A (ja) * | 2008-04-28 | 2009-11-12 | Toshiba Corp | 入力機器 |
| JP2013242819A (ja) * | 2012-05-23 | 2013-12-05 | Hitachi Consumer Electronics Co Ltd | ペン型入力装置 |
| KR101360980B1 (ko) * | 2013-02-05 | 2014-02-11 | 주식회사 카이언스 | 필기구형 전자 입력장치 |
| JP2018001721A (ja) * | 2016-07-08 | 2018-01-11 | 国立大学法人大阪大学 | 筆記装置及びコンピュータープログラム |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH086710A (ja) | 1994-06-21 | 1996-01-12 | Nippon Telegr & Teleph Corp <Ntt> | ペン型入力装置 |
| JPH10333815A (ja) * | 1997-06-04 | 1998-12-18 | Brother Ind Ltd | 画像認識装置 |
| US7961909B2 (en) * | 2006-03-08 | 2011-06-14 | Electronic Scripting Products, Inc. | Computer interface employing a manipulated object with absolute pose detection component and a display |
| EP1686554A3 (en) | 2005-01-31 | 2008-06-18 | Canon Kabushiki Kaisha | Virtual space generating system, image processing apparatus and information processing method |
| US8077155B2 (en) * | 2006-02-13 | 2011-12-13 | Rehm Peter H | Relative-position, absolute-orientation sketch pad and optical stylus for a personal computer |
| US8988398B2 (en) * | 2011-02-11 | 2015-03-24 | Microsoft Corporation | Multi-touch input device with orientation sensing |
| JP5145443B2 (ja) | 2011-06-23 | 2013-02-20 | 株式会社日立製作所 | 熱アシスト記録用磁気ヘッド及び磁気記録装置 |
| JP2013084096A (ja) | 2011-10-07 | 2013-05-09 | Sharp Corp | 情報処理装置 |
| JP2014062962A (ja) | 2012-09-20 | 2014-04-10 | Sony Corp | 情報処理装置、筆記具、情報処理方法およびプログラム |
| JP6286846B2 (ja) | 2013-03-25 | 2018-03-07 | セイコーエプソン株式会社 | プロジェクター、指示体、インタラクティブシステムおよび制御方法 |
| US9489048B2 (en) * | 2013-12-13 | 2016-11-08 | Immersion Corporation | Systems and methods for optical transmission of haptic display parameters |
| US10372245B2 (en) * | 2015-04-20 | 2019-08-06 | Wacom Co., Ltd. | System and method for bidirectional communication between stylus and stylus sensor controller |
| CN105551339A (zh) * | 2015-12-31 | 2016-05-04 | 英华达(南京)科技有限公司 | 基于虚拟现实技术的书法练习系统及方法 |
| US10203781B2 (en) * | 2016-06-24 | 2019-02-12 | Microsoft Technology Licensing, Llc | Integrated free space and surface input device |
| US10073548B2 (en) * | 2016-11-08 | 2018-09-11 | Wacom Co., Ltd. | Stylus having variable transmit signal strength, and sensor for detecting such stylus |
| JP2018156489A (ja) | 2017-03-17 | 2018-10-04 | 株式会社リコー | 電子ペン、電子ペンの制御方法およびプログラム |
| US10509489B2 (en) * | 2017-09-26 | 2019-12-17 | Yong Bum Kim | Systems and related methods for facilitating pen input in a virtual reality environment |
- 2019
  - 2019-04-04 JP JP2020519507A patent/JP6887060B2/ja active Active
  - 2019-04-04 WO PCT/JP2019/015042 patent/WO2019220803A1/ja not_active Ceased
  - 2019-04-04 CN CN201980029211.9A patent/CN112074802A/zh active Pending
  - 2019-04-04 EP EP19804098.2A patent/EP3796136B1/en active Active
- 2020
  - 2020-10-29 US US17/084,444 patent/US12197653B2/en active Active
- 2021
  - 2021-05-17 JP JP2021083044A patent/JP7373258B2/ja active Active
- 2023
  - 2023-10-19 JP JP2023180098A patent/JP7659608B2/ja active Active
- 2024
  - 2024-12-11 US US18/977,449 patent/US20250110581A1/en active Pending
- 2025
  - 2025-03-28 JP JP2025055065A patent/JP2025092623A/ja active Pending
  - 2025-04-01 US US19/097,598 patent/US20250231626A1/en active Pending
Cited By (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240192788A1 (en) * | 2018-03-23 | 2024-06-13 | Wacom Co., Ltd. | Three-dimensional position indicator and three-dimensional position detection system including grip part and tracker |
| US11294478B2 (en) * | 2018-03-23 | 2022-04-05 | Wacom Co., Ltd. | Three-dimensional position indicator and three-dimensional position detection system |
| US12399580B2 (en) * | 2018-03-23 | 2025-08-26 | Wacom, Co., Ltd. | Three-dimensional position indicator and three-dimensional position detection system including grip part and tracker |
| US11934592B2 (en) | 2018-03-23 | 2024-03-19 | Wacom Co., Ltd. | Three-dimensional position indicator and three-dimensional position detection system including grip part orthogonal to electronic pen casing |
| JP7778096B2 (ja) | 2020-07-01 | 2025-12-01 | 株式会社ワコム | 動的な形状スケッチングのためのシステム及び方法 |
| JP2023531302A (ja) * | 2020-07-01 | 2023-07-21 | 株式会社ワコム | 動的な形状スケッチングのためのシステム及び方法 |
| WO2022003511A1 (en) * | 2020-07-01 | 2022-01-06 | Wacom Co., Ltd. | Systems and methods for dynamic shape sketching |
| US12001629B2 (en) | 2020-07-01 | 2024-06-04 | Wacom Co., Ltd. | Systems and methods for dynamic shape sketching using position indicator and processing device that displays visualization data based on position of position indicator |
| JPWO2022185560A1 (ja) * | 2021-03-02 | 2022-09-09 | | |
| US12105882B2 (en) | 2021-03-02 | 2024-10-01 | Sony Interactive Entertainment Inc. | Force sense presentation device |
| JP7544952B2 (ja) | 2021-03-02 | 2024-09-03 | 株式会社ソニー・インタラクティブエンタテインメント | 力覚提示装置 |
| KR20230138548A (ko) | 2021-04-23 | 2023-10-05 | 가부시키가이샤 와코무 | 컨트롤러 및 컴퓨터 |
| KR20250103779A (ko) | 2021-04-23 | 2025-07-07 | 가부시키가이샤 와코무 | 컨트롤러 및 컴퓨터 |
| WO2022224578A1 (ja) | 2021-04-23 | 2022-10-27 | 株式会社ワコム | コントローラ及びコンピュータ |
| US11797104B2 (en) | 2021-05-06 | 2023-10-24 | Samsung Electronics Co., Ltd. | Electronic device and control method of the same |
| DE112023001330T5 (de) | 2022-04-21 | 2025-01-09 | Wacom Co., Ltd. | Haptischer stift und verfahren zur erzeugung von wellenformen zur haptischen steuerung |
| US12487681B2 (en) | 2022-04-21 | 2025-12-02 | Wacom Co., Ltd. | Haptic pen and haptic control waveform generation method |
| WO2024090304A1 (ja) | 2022-10-24 | 2024-05-02 | ソニーグループ株式会社 | 入力デバイス、制御装置、制御方法、情報処理装置、及び、情報処理方法 |
| WO2024090300A1 (ja) | 2022-10-24 | 2024-05-02 | ソニーグループ株式会社 | 情報処理装置及び情報処理方法 |
| KR20250090310A (ko) | 2022-10-24 | 2025-06-19 | 소니그룹주식회사 | 정보 처리 장치 및 정보 처리 방법 |
| KR20250095638A (ko) | 2022-10-24 | 2025-06-26 | 소니그룹주식회사 | 입력 디바이스, 제어 장치, 제어 방법, 정보 처리 장치 및 정보 처리 방법 |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2019220803A1 (ja) | 2021-06-10 |
| JP2021114346A (ja) | 2021-08-05 |
| EP3796136B1 (en) | 2025-08-06 |
| JP7659608B2 (ja) | 2025-04-09 |
| US12197653B2 (en) | 2025-01-14 |
| JP7373258B2 (ja) | 2023-11-02 |
| US20250231626A1 (en) | 2025-07-17 |
| JP6887060B2 (ja) | 2021-06-16 |
| EP3796136A4 (en) | 2021-07-14 |
| EP3796136A1 (en) | 2021-03-24 |
| US20250110581A1 (en) | 2025-04-03 |
| CN112074802A (zh) | 2020-12-11 |
| US20210048897A1 (en) | 2021-02-18 |
| JP2023174898A (ja) | 2023-12-08 |
| JP2025092623A (ja) | 2025-06-19 |
Similar Documents
| Publication | Title |
|---|---|
| JP7373258B2 (ja) | 位置指示装置、コンピュータ、制御方法 |
| JP7692962B2 (ja) | コンピュータ、レンダリング方法、及びプログラム |
| TW408278B (en) | Input device |
| US20100090949A1 (en) | Method and Apparatus for Input Device |
| WO2012050537A1 (en) | Pencil input peripheral computer controller |
| KR102572675B1 (ko) | 사용자 인터페이스를 적응적으로 구성하기 위한 장치 및 방법 |
| WO2015137014A1 (ja) | 情報入出力装置及び情報入出力方法 |
| KR20100009023A (ko) | 움직임을 인식하는 장치 및 방법 |
| CN109189245B (zh) | 一种智能笔及该智能笔实现鼠标功能的方法 |
| US20150002486A1 (en) | Multifunctional pencil input peripheral computer controller |
| US7356769B2 (en) | Method and apparatus for providing inputs to a communication or computing device |
| KR102824663B1 (ko) | 컨트롤러 및 컴퓨터 |
| EP4362504A1 (en) | Method for providing user interface on basis of locational relationship between electronic devices, and electronic device therefor |
| EP4216040B1 (en) | Computer, method, and program |
| US20150103052A1 (en) | Direction input device and method for operating user interface using same |
| KR20050116041A (ko) | 가속도센서로 구성된 디지털 펜 |
| WO2013032410A1 (en) | Multifunctional pencil input peripheral computer controller |
| KR102180661B1 (ko) | 압력 기반의 사용자 입력 장치와 이를 이용한 3d 무선 프리젠터 |
| WO2023134408A1 (zh) | 一种信息传输方法和装置 |
| KR101213438B1 (ko) | 이미지 구현 장치 및 그의 구현 방법 |
| WO2010090608A2 (en) | Pencil input peripheral computer controller |
| KR20010102683A (ko) | 컴퓨터용 입력장치 |
| HK1197479A (en) | Multifunctional pencil input peripheral computer controller |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19804098; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2020519507; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWE | Wipo information: entry into national phase | Ref document number: 2019804098; Country of ref document: EP |
| | WWG | Wipo information: grant in national office | Ref document number: 2019804098; Country of ref document: EP |