WO2013161261A1 - Display control system and indicating device - Google Patents

Display control system and indicating device

Info

Publication number
WO2013161261A1
WO2013161261A1 (PCT/JP2013/002708)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
display
control system
display unit
display control
Prior art date
Application number
PCT/JP2013/002708
Other languages
French (fr)
Japanese (ja)
Inventor
山田 和宏 (Kazuhiro Yamada)
Original Assignee
パナソニック株式会社 (Panasonic Corporation)
Priority date
Filing date
Publication date
Application filed by Panasonic Corporation (パナソニック株式会社)
Publication of WO2013161261A1 publication Critical patent/WO2013161261A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03542 Light pens for emitting or receiving light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0317 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G06F3/0321 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. pen optically detecting position indicative tags printed on a paper sheet

Definitions

  • This disclosure relates to a display control system capable of handwriting input on a display surface of a digital display.
  • Patent Document 1 discloses a technique for digitizing information entered on paper and transmitting the digitized information to a server or a terminal when characters or the like are entered on the paper using a pen.
  • This disclosure provides a display control system capable of executing handwriting input with high accuracy on the display surface of a display device.
  • The display control system according to the present disclosure includes a display device having a display unit and a pointing device for indicating a position on the display unit, and performs display control according to the position indicated by the pointing device. A position information pattern representing a planar position on the display unit is formed on the display unit of the display device. The pointing device includes an instruction unit for indicating a position on the display unit, and a reading unit that captures an image, including the position information pattern, at the position indicated by the instruction unit. The display control system further includes a specifying unit that specifies a position on the display unit based on the position information pattern included in the image captured by the reading unit, and a posture detection unit that detects distortion of the captured image and detects, from the detected distortion, the posture of the pointing device with respect to the display unit.
  • the display control system according to the present disclosure can realize highly accurate handwriting input without hindering downsizing of the pointing device and cost reduction of the display control system.
  • FIG. 1 is a schematic diagram of a display control system according to an embodiment.
  • FIG. 2 is a block diagram of the display control system according to Embodiment 1.
  • FIG. 3 is a schematic cross-sectional view of a display panel.
  • FIG. 4 is a schematic cross-sectional view of a digital pen.
  • FIG. 5 is a plan view of a color filter.
  • FIGS. 6(A) to 6(D) are diagrams showing examples of dot arrangements corresponding to each code.
  • FIGS. 7(a) and 7(b) are diagrams for explaining the relationship between the inclination of the digital pen and the position information that is read.
  • FIG. 8 shows examples of images captured in the states of FIG. 7.
  • FIG. 9 is a flowchart illustrating the processing flow of the display control system according to Embodiment 1.
  • FIG. 10 is a block diagram of a display control system according to Embodiment 2.
  • FIG. 11 is a flowchart illustrating the processing flow of the display control system according to Embodiment 2.
  • FIG. 12 is a schematic cross-sectional view showing another configuration of the digital pen.
  • As one such display control system, a configuration can be considered in which a position information pattern indicating a planar position is formed on the display unit, the indicated position is detected by optically reading that pattern with a pointing device, and a locus display or the like is performed.
  • In a display control system using such optical reading, the following problems are expected to occur. That is, there is a certain gap, depending on the thickness of the glass substrate or the like, between the surface of the display unit with which the pointing device abuts and the layer on which the position information pattern is formed. For this reason, when using, for example, a pointing device that performs reading at the tip of its instruction unit, the indicated position matches the reading position when the pointing device is held perpendicular to the display unit surface, but when the pointing device is tilted with respect to the display unit surface, the indicated position and the reading position are displaced. As a result, the position indicated by the instruction unit and the position displayed on the display device do not necessarily match, and a shift occurs, which may give the user a sense of discomfort. In addition, since the magnitude of the positional deviation changes depending on the inclination of the pointing device, it is not appropriate to correct the display position by a fixed amount.
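  • The magnitude of this displacement follows from simple trigonometry: it is roughly the gap between the display surface and the pattern layer multiplied by the tangent of the tilt angle. The sketch below only illustrates that relationship; the 1.0 mm gap and 30 degree tilt are assumed example values, not figures from the disclosure.

```python
import math

def reading_offset_mm(gap_mm: float, tilt_deg: float) -> float:
    """Lateral offset between the indicated position and the optically read
    position, for a given surface-to-pattern-layer gap and a pen tilt
    measured from the display normal."""
    return gap_mm * math.tan(math.radians(tilt_deg))

# Hypothetical example: a 1.0 mm gap and a pen tilted 30 degrees
# shift the read position by roughly 0.58 mm.
print(round(reading_offset_mm(1.0, 30.0), 2))  # 0.58
```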
  • In response to this problem, if the actual posture of the pointing device with respect to the display unit, such as its tilt, can be detected, the display position can be corrected according to this posture information. However, adding a separate device to detect the attitude of the pointing device is undesirable in terms of downsizing the pointing device and reducing the cost of the display control system.
  • FIG. 1 is a schematic diagram illustrating an appearance of a display control system 100 according to the embodiment.
  • the display control system 100 includes an optical digital pen (hereinafter simply referred to as “digital pen”) 10 as a pointing device and a display device 20.
  • the display device 20 is a liquid crystal display, for example, and can display various objects on the display unit 21.
  • the display unit 21 is formed with a dot pattern as a position information pattern representing a planar position in the display unit 21.
  • the digital pen 10 optically reads the dot pattern to detect position information of the designated position and transmits the position information to the display device 20.
  • the display device 20 receives the position information as input and performs various processes. That is, the digital pen 10 functions as a reading device and also functions as a data input device to the display control system 100.
  • the display device 20 can display the locus of the tip of the digital pen 10 by continuously displaying points at designated positions on the display unit 21 following the movement of the digital pen 10. That is, it is possible to enter characters, figures, and the like on the display unit 21 using the digital pen 10.
  • the display device 20 follows the movement of the digital pen 10 and continuously erases the display at the designated position on the display unit 21, thereby displaying a portion that matches the locus of the tip of the digital pen 10. Can be erased. That is, the digital pen 10 can be used like an eraser.
  • the display device 20 can use the digital pen 10 as an input device such as a mouse by displaying a specified position on the display unit 21. In this way, in the display control system 100, the position of the digital pen 10 is input as an input to the display device 20 by moving the digital pen 10 on the display unit 21 of the display device 20, and the display device 20 responds to the input. To change the displayed contents.
  • FIG. 2 is a block diagram illustrating a schematic configuration of the display control system 100.
  • the display device 20 includes a receiving unit 22 that receives a signal from the outside, a display-side microcomputer 23 that controls the entire display device 20, and a display panel 24 that displays an image.
  • the display panel 24 of the present embodiment is a liquid crystal panel.
  • the receiving unit 22 receives a signal transmitted from the digital pen 10, which will be described in detail later.
  • the signal received by the receiving unit 22 is sent to the display-side microcomputer 23.
  • the display-side microcomputer 23 is composed of a CPU, a memory, and the like, and a program for operating the CPU is also mounted.
  • the display-side microcomputer 23 is an example of a control unit.
  • the display-side microcomputer 23 controls the display panel 24 based on a signal transmitted from the digital pen 10 and changes the content displayed on the display unit 21.
  • FIG. 3 is a schematic sectional view of the display panel 24.
  • the basic configuration of the display panel 24 is the same as that of a general liquid crystal panel.
  • the display panel 24 includes a pair of glass substrates 25, a polarizing filter 26 provided on the outer surface of each glass substrate 25, a pair of alignment films 27 provided between the pair of glass substrates 25, and a pair A liquid crystal layer 28 provided between the alignment films 27, a transparent electrode 29 provided on each alignment film 27, and a color filter 30 provided between the glass substrate 25 on the surface side and the transparent electrode 29.
  • Various images are displayed on the display unit 21.
  • the dot pattern described above is written on the color filter 30.
  • the dot pattern is an example of a position information pattern.
  • FIG. 4 is a cross-sectional view illustrating a schematic configuration of the digital pen 10.
  • The digital pen 10 includes a cylindrical main body 11, a pen tip 12 as an instruction unit attached to the tip of the main body 11, a pressure sensor 13 that detects pressure acting on the pen tip 12, an irradiation unit 14 that emits infrared light, a reading unit 15 that reads incident infrared light, a control unit 16 that controls the digital pen 10, a transmission unit 17 that outputs a signal to the outside, and a power source 19 that supplies power to each member of the digital pen 10.
  • the main body 11 is formed of a cylinder similar to a general pen.
  • The pen tip portion 12 has a tapered shape, and its tip is rounded so as not to damage the display unit 21. Moreover, the pen tip portion 12 preferably has a shape that allows the user to easily see the image displayed on the display unit 21. In the configuration of FIG. 4, the pen tip portion 12 is formed of a material that transmits infrared light.
  • the pressure sensor 13 is built in the main body 11 and connected to the proximal end of the pen tip 12.
  • the pressure sensor 13 detects the pressure applied to the pen tip unit 12 and transmits the detection result to the control unit 16. Specifically, the pressure sensor 13 detects the pressure applied to the pen tip portion 12 when the user enters characters or the like on the display portion 21 using the digital pen 10. That is, the pressure sensor 13 is used when determining whether or not there is an input intention of the user using the digital pen 10.
  • the irradiation unit 14 is provided at the tip of the main body 11 and in the vicinity of the pen tip 12.
  • The irradiation unit 14 is constituted by, for example, an infrared LED, and is configured to emit infrared light from the tip portion of the digital pen 10.
  • In the configuration of FIG. 4, a plurality of (for example, four) irradiation units 14 are arranged so as to surround the pen tip unit 12. The number of irradiation units 14 can be set as appropriate.
  • Alternatively, the irradiation unit 14 may be formed in a ring shape.
  • the reading unit 15 is provided at the tip of the main body 11 and in the vicinity of the pen tip unit 12.
  • the reading unit 15 includes an objective lens 15a, a lens 15b, and an image sensor 15c built in the tip of the pen tip unit 12, and the objective lens 15a and the lens 15b constitute an optical system.
  • the objective lens 15a and the lens 15b form incident light on the image sensor 15c.
  • Infrared light emitted from the irradiation unit 14 and reflected by the display device 20 is incident on the objective lens 15a.
  • the image sensor 15c is provided on the optical axes of the objective lens 15a and the lens 15b.
  • the imaging element 15 c converts an optical image formed on the imaging surface into an electrical signal and outputs the electrical signal to the control unit 16.
  • the image sensor 15c is configured by, for example, a CCD image sensor or a CMOS image sensor. Although the details will be described later, since the dot pattern is formed of a material that absorbs infrared light, the dot pattern does not return infrared light. As a result, an optical image in which the dot pattern is expressed in black is captured by the image sensor 15c.
  • the control unit 16 includes a specifying unit 16a, a posture detection unit 16b, and a pen-side microcomputer 16c, as shown in FIG.
  • The specifying unit 16a specifies position information of the digital pen 10 on the display unit 21 based on the image signal from the reading unit 15. Specifically, the specifying unit 16a acquires the dot pattern from the image signal acquired by the reading unit 15, and specifies the position of the pen tip unit 12 on the display unit 21 based on the dot pattern. Information on the position of the pen tip unit 12 specified by the specifying unit 16a is sent to the pen-side microcomputer 16c.
  • The posture detection unit 16b detects posture information of the digital pen 10 with respect to the display unit 21 based on the image signal from the reading unit 15. Specifically, the posture detection unit 16b detects image distortion from the image signal acquired by the reading unit 15, based on the arrangement of the dot pattern and the like, and detects, from the detected distortion information, the tilt of the digital pen 10 with respect to the normal direction of the surface of the display unit 21.
  • Information about the posture of the digital pen 10 detected by the posture detection unit 16b is sent to the pen-side microcomputer 16c.
  • the pen side microcomputer 16 c controls the entire digital pen 10.
  • the pen side microcomputer 16c is constituted by a CPU, a memory, and the like, and a program for operating the CPU is also mounted.
  • the transmission unit 17 transmits a signal to the outside. Specifically, the transmission unit 17 wirelessly transmits the position information specified by the specifying unit 16a and the posture information detected by the posture detection unit 16b to the outside. The transmission unit 17 performs near field communication with the reception unit 22 of the display device 20. The transmitter 17 is provided at the end of the main body 11 opposite to the pen tip 12.
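  • As a rough illustration, the information sent by the transmission unit 17 can be thought of as a small report combining the specified position and the detected posture. The field names, types, and units below are assumptions made for illustration only, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class PenReport:
    """Hypothetical payload sent from the digital pen to the display device."""
    x: float          # position coordinate on the display unit (from specifying unit 16a)
    y: float
    tilt_deg: float   # inclination from the display normal (from posture detection unit 16b)
    pen_down: bool    # pressure sensor 13 indicates contact with the display surface

# Example instance of the assumed report format.
report = PenReport(x=1203.5, y=844.0, tilt_deg=22.0, pen_down=True)
```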
  • FIG. 5 is a plan view of the color filter 30.
  • the color filter 30 includes a black matrix 31, a plurality of pixel regions 32 that are partitioned by the black matrix 31 and transmit light of a specific color, and dots 33 provided in the pixel region 32.
  • the pixel area 32 includes a red pixel area 32r that transmits red (R) light, a green pixel area 32g that transmits green (G) light, and a blue pixel area 32b that transmits blue (B) light. Contains.
  • Each pixel region 32 corresponds to a subpixel of the display panel 24. Note that when the colors to be transmitted are not distinguished, they are simply referred to as “pixel region 32”.
  • the black matrix 31 includes a vertical line extending in the longitudinal direction of the pixel region 32 and a horizontal line extending in the short direction of the pixel region 32 and is formed in a lattice shape.
  • the horizontal line is formed thicker than the vertical line.
  • the black matrix 31 and the dots 33 are formed of a material mainly composed of carbon black.
  • the dots 33 are formed in a solid circle.
  • the dots 33 are provided in some pixel regions 32 instead of all the pixel regions 32.
  • a plurality of dots 33 are collected to form a dot pattern. This dot pattern differs depending on the position of the color filter 30.
  • the dot pattern will be described in detail below.
  • A first reference line 34 and a second reference line 35 are defined on the color filter 30. These first and second reference lines 34 and 35 are virtual lines and do not actually exist.
  • the first reference line 34 is a straight line extending in the short direction of the pixel region 32.
  • a plurality of first reference lines 34 are arranged in parallel in the longitudinal direction of the pixel region 32 every two pixel regions 32. Each first reference line 34 is located at the center in the longitudinal direction of the pixel region 32.
  • the second reference line 35 is a straight line extending in the longitudinal direction of the pixel region 32.
  • The second reference lines 35 are provided on the green pixel regions 32g, and a plurality of second reference lines 35 are arranged in parallel in the short direction of the pixel region 32. Each second reference line 35 is located at the center in the lateral direction of the green pixel region 32g.
  • a grid is defined on the color filter 30 by the first reference line 34 and the second reference line 35.
  • FIG. 6 is a diagram showing an arrangement pattern of the dots 33.
  • The dots 33 are arranged at positions shifted from the intersection in one of four mutually orthogonal directions (up, down, left, and right in FIGS. 5 and 6). Specifically, the dots 33 are arranged in one of the ways shown in FIGS. 6(A) to 6(D).
  • In the arrangement of FIG. 6(A), the dot 33 is placed at a position shifted to the right along the first reference line 34 from the intersection of the first reference line 34 and the second reference line 35. In this case the dot 33 lies on the blue pixel region 32b, and the arrangement is digitized as "1".
  • In the arrangement of FIG. 6(B), the dot 33 is placed at a position shifted upward along the second reference line 35 from the intersection. In this case the dot 33 lies on the green pixel region 32g, and the arrangement is digitized as "2".
  • In the arrangement of FIG. 6(C), the dot 33 is placed at a position shifted to the left along the first reference line 34 from the intersection. In this case the dot 33 lies on the red pixel region 32r, and the arrangement is digitized as "3".
  • In the arrangement of FIG. 6(D), the dot 33 is placed at a position shifted downward along the second reference line 35 from the intersection. In this case the dot 33 lies on the green pixel region 32g, and the arrangement is digitized as "4".
  • One dot pattern is formed by the 36 dots 33 included in a unit area.
  • By expressing each of the 36 dots 33 included in the unit area as any one of "1" to "4", a huge number of dot patterns can be formed.
  • the dot patterns in each unit area are all different.
  • each dot pattern represents a position coordinate for each unit area. That is, when the color filter 30 is divided into unit areas of 6 dots ⁇ 6 dots, each dot pattern represents the position coordinates of the unit area.
  • For the patterning of the dot pattern and the method of converting it into coordinates, a publicly known method such as that disclosed in Japanese Patent Application Laid-Open No. 2006-141067 can be used, for example.
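  • The actual patterning and coordinate-conversion method is the publicly known one referenced above; the sketch below is only a simplified illustration of the general idea, assuming that the displacement direction of each dot encodes one of the four values and that the 36 codes of a unit area are looked up in a hypothetical code-to-coordinate table.

```python
# Simplified sketch: classify each dot's offset from its grid intersection into
# one of the four codes described above, then combine the 36 codes of a
# 6 x 6 unit area into a key for a (hypothetical) code-to-coordinate table.

def dot_code(dx: float, dy: float) -> int:
    """Right -> 1, up -> 2, left -> 3, down -> 4 (image y-axis assumed to point up)."""
    if abs(dx) >= abs(dy):
        return 1 if dx > 0 else 3
    return 2 if dy > 0 else 4

def unit_area_key(offsets_6x6):
    """offsets_6x6: 6 rows of 6 (dx, dy) offsets, one per dot in the unit area."""
    codes = [dot_code(dx, dy) for row in offsets_6x6 for dx, dy in row]
    return tuple(codes)  # used to look up the unit area's position coordinates

# code_to_position = {unit_area_key(...): (column, row), ...}
# Such a table would be fixed when the color filter is manufactured; the pen or
# the display device would hold the mapping from keys to position coordinates.
```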
  • FIG. 7 is an enlarged view showing a contact state between the digital pen and the display unit of the display device in the present embodiment.
  • In the state of FIG. 7(a), the orientation of the pen tip portion 12 of the digital pen 10 is perpendicular to the display unit 21, and the imaging direction dc of the reading unit 15 of the digital pen 10 matches the normal direction dn of the surface of the display unit 21.
  • In the state of FIG. 7(b), on the other hand, the pen tip portion 12 of the digital pen 10 is inclined with respect to the display unit 21, and the imaging direction dc of the reading unit 15 of the digital pen 10 deviates from the normal direction dn of the display unit 21. For this reason, in the state of FIG. 7(b), the position A imaged on the color filter 30 on which the dot pattern is formed is shifted from the position B directly below the pen tip portion 12. This is because the glass substrate 25 and the polarizing filter 26 on the surface side of the display unit 21 have a certain thickness.
  • FIG. 8 shows examples of images captured by the reading unit 15: (a) is an image captured in the state of FIG. 7(a), and (b) is an image captured in the state of FIG. 7(b), that is, with the pen tip unit 12 tilted.
  • In the image of FIG. 8(b), trapezoidal distortion has occurred, as can be seen by comparison with FIG. 8(a).
  • The degree of trapezoidal distortion can be obtained, for example, from the pitch of the vertical columns of the dot pattern. That is, in the image of FIG. 8(a), the vertical columns of the dot pattern are arranged substantially in parallel.
  • In the image of FIG. 8(b), by contrast, the vertical columns of the dot pattern are inclined, and the inclination gradually increases toward the edge of the image. That is, the pitch of the vertical columns of the dot pattern is large on the near side and small on the far side. From this difference in pitch, the inclination of the imaging direction dc of the reading unit 15, that is, the inclination of the pen tip unit 12, can be calculated by geometric calculation.
  • the trapezoidal distortion detection method is not limited to the one shown here.
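  • The disclosure leaves the exact geometric calculation open. The following is a minimal sketch of one such calculation under a simplified pinhole-camera model, assuming a known focal length, a dot grid with the same physical pitch in both directions, and that the image pitch of the dot columns can be measured at a near-side and a far-side row of the captured image. None of the numeric values come from the disclosure.

```python
import math

def tilt_from_pitch(s_near_px: float, s_far_px: float,
                    k_rows: int, p_mm: float, f_px: float) -> float:
    """Pen tilt in degrees from the display normal, under a pinhole model.

    s_near_px, s_far_px : image pitch of the dot columns measured at the near
                          and far side of the captured image (pixels)
    k_rows              : number of grid rows between the two measurements
    p_mm                : assumed physical pitch of the dot grid (mm)
    f_px                : assumed focal length of the reading optics (pixels)
    """
    z_near = f_px * p_mm / s_near_px          # depth of the near-side columns
    z_far = f_px * p_mm / s_far_px            # depth of the far-side columns
    sin_theta = (z_far - z_near) / (k_rows * p_mm)
    return math.degrees(math.asin(max(-1.0, min(1.0, sin_theta))))

# Hypothetical numbers: pitch shrinking from 22 px to 20 px across 6 rows of a
# 0.3 mm grid, with a 500 px focal length, corresponds to roughly a 22 degree tilt.
print(round(tilt_from_pitch(22.0, 20.0, 6, 0.3, 500.0), 1))  # 22.3
```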
  • FIG. 9 is a flowchart showing the processing flow of the display control system 100. Below, the case where a user writes characters on the display device 20 using the digital pen 10 is described.
  • the pen side microcomputer 16c of the digital pen 10 monitors whether or not pressure is applied to the pen tip portion 12 in step S11. This pressure is detected by the pressure sensor 13. When the pressure is detected (Yes), the pen-side microcomputer 16c determines that the user is inputting characters on the display unit 21 of the display device 20, and proceeds to step S12. While the pressure is not detected (No), the pen side microcomputer 16c repeats Step S11. Note that when the power source of the digital pen 10 is turned on, the irradiation unit 14 starts irradiation of infrared light.
  • In step S12, the reading unit 15 of the digital pen 10 detects the dot pattern formed on the display unit 21.
  • Infrared light is emitted from the irradiation unit 14; this infrared light is absorbed at least by the dots 33 provided in the color filter 30 of the display device 20, and is reflected by the pixel regions 32 and the like.
  • the reflected infrared light is received by the image sensor 15c through the objective lens 15a and the lens 15b.
  • a dot pattern is imaged by the image sensor 15c.
  • the reading unit 15 optically reads the dot pattern.
  • the image signal acquired by the reading unit 15 is transmitted to the specifying unit 16a and the posture detecting unit 16b.
  • the specifying unit 16a acquires a dot pattern from the image signal, and specifies the position of the pen tip unit 12 on the display unit 21 based on the dot pattern. Specifically, the specifying unit 16a acquires a dot pattern by performing predetermined image processing on the obtained image signal. For example, since the black matrix 31 is formed of carbon black like the dots 33, it absorbs infrared light. Therefore, the black matrix 31 is also included in the image from the reading unit 15 in the same state as the dots 33. Therefore, the image signal from the reading unit 15 is subjected to predetermined image processing so that the dots 33 are easily discriminated from the black matrix 31, and an array of a plurality of dots 33 is acquired from the processed image signal.
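  • The disclosure does not detail this predetermined image processing. As a rough sketch of one possible approach, small isolated dark blobs can be separated from the elongated black-matrix lines by their size after thresholding the infrared image; the thresholds and library calls below are illustrative assumptions only.

```python
import numpy as np
from scipy import ndimage

def extract_dot_centers(ir_image: np.ndarray, dark_threshold: int = 80,
                        max_dot_area: int = 40) -> list[tuple[float, float]]:
    """Return centroids of small dark blobs (candidate dots 33), discarding
    large connected regions such as the grid-like black matrix 31."""
    dark = ir_image < dark_threshold        # dots and black matrix absorb infrared light
    labels, n = ndimage.label(dark)         # connected components of the dark mask
    centers = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if ys.size <= max_dot_area:         # keep only dot-sized blobs
            centers.append((float(xs.mean()), float(ys.mean())))
    return centers
```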
  • the specifying unit 16a determines a unit area of 6 dots ⁇ 6 dots from the acquired arrangement of the dots 33, and specifies the position coordinates (position information) of the unit area from the dot pattern of the unit area.
  • the specifying unit 16a converts the dot pattern into position coordinates by a predetermined calculation corresponding to the dot pattern coding method.
  • the specified position information is transmitted to the pen-side microcomputer 16c.
  • The posture detection unit 16b detects the trapezoidal distortion of the image from the image signal, and detects the inclination of the pen tip unit 12 based on the degree of the trapezoidal distortion. For example, as described above, after rotation correction is performed on the image, the inclination and pitch of the vertical columns of the dot pattern are obtained, and the inclination of the imaging direction of the reading unit 15 is calculated from them by geometric calculation. The calculated inclination is transmitted to the pen-side microcomputer 16c as posture information.
  • In step S15, the pen-side microcomputer 16c transmits the position information and the posture information to the display device 20 via the transmission unit 17.
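  • Taken together, steps S11 to S15 amount to a pen-side loop of roughly the following shape. This is a schematic sketch only; the objects and methods stand in for the pressure sensor 13, reading unit 15, specifying unit 16a, posture detection unit 16b, and transmission unit 17, and are not APIs from the disclosure.

```python
def pen_side_loop(pen):
    """Schematic of the Embodiment 1 pen-side flow (steps S11 to S15)."""
    while True:
        if pen.pressure_sensor.is_pressed():                # S11: user is writing
            image = pen.reading_unit.capture()              # S12: image the dot pattern
            position = pen.specifying_unit.position(image)  # specify coordinates (16a)
            posture = pen.posture_detector.tilt(image)      # detect tilt from distortion (16b)
            pen.transmitter.send(position, posture)         # S15: report to the display device
```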
  • the position information and posture information transmitted from the digital pen 10 are received by the receiving unit 22 of the display device 20.
  • the received position information and posture information are transmitted from the receiving unit 22 to the display-side microcomputer 23.
  • the display-side microcomputer 23 controls the display panel 24 to change the display content of the position corresponding to the position information.
  • Here, since characters are being input, a point is displayed at the position on the display unit 21 corresponding to the position information.
  • At this time, the display-side microcomputer 23 corrects the position corresponding to the position information in consideration of the posture information. For example, in the state of FIG. 7(b), the position information indicates the position A, but the actual position where the pen tip portion 12 is in contact with the display unit 21 is the position B, so a deviation occurs. This deviation is caused by the inclination of the pen tip portion 12. For this reason, the display-side microcomputer 23 takes the posture information, that is, the inclination of the pen tip portion 12, into account and changes the position at which the display content is changed to, for example, the position B, so as not to give the user a sense of incongruity. Note that the corrected position does not necessarily have to be the position B immediately below the pen tip portion 12; it may instead be set to a position between the position A and the position B at which the user is assumed not to feel uncomfortable.
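  • A minimal sketch of this kind of correction is shown below, assuming the surface-to-pattern-layer gap is known as a fixed panel parameter and that the tilt direction in the display plane is available; the blend factor that stops the corrected point between position A and position B is likewise an assumption.

```python
import math

def corrected_position(read_xy, tilt_deg, tilt_dir_deg, gap_mm=1.0, blend=1.0):
    """Shift the optically read position A back toward the point B under the pen tip.

    tilt_dir_deg : direction in the display plane along which the read position
                   is displaced from the pen tip (assumed to be known)
    blend        : 1.0 corrects fully to B; smaller values stop between A and B,
                   as the text above allows
    """
    offset = gap_mm * math.tan(math.radians(tilt_deg)) * blend
    dx = offset * math.cos(math.radians(tilt_dir_deg))
    dy = offset * math.sin(math.radians(tilt_dir_deg))
    x, y = read_xy
    return (x - dx, y - dy)   # move against the displacement, back toward B

# e.g. a pen tilted 30 degrees toward +x over an assumed 1.0 mm gap:
print(corrected_position((10.0, 5.0), 30.0, 0.0))  # approx (9.42, 5.0)
```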
  • In step S17, the pen-side microcomputer 16c determines whether or not the input by the user is continued. While the pressure sensor 13 detects pressure, the pen-side microcomputer 16c determines that the input by the user is continued, and the process returns to step S11.
  • By repeating this flow, points are continuously displayed at the position of the pen tip 12 on the display unit 21, following the movement of the pen tip 12 of the digital pen 10. As a result, characters corresponding to the locus of the pen tip portion 12 of the digital pen 10 are displayed on the display unit 21 of the display device 20.
  • When the pressure is no longer detected, the pen-side microcomputer 16c determines that the input by the user is not continued, and ends the process.
  • As described above, since the display device 20 displays the locus of the tip of the digital pen 10 on the display unit 21, handwriting input to the display unit 21 using the digital pen 10 can be performed.
  • the usage of the display control system 100 is not restricted to this.
  • the digital pen 10 can be used like an eraser to erase characters, figures, etc. displayed on the display unit 21. That is, in the above example, the point is displayed at the position corresponding to the position information on the display unit 21, but the point at the position may be deleted.
  • the digital pen 10 can be used like a mouse to move a pointer displayed on the display unit 21 or to select an icon displayed on the display unit 21.
  • As described above, in the present embodiment, the position of the pen tip unit 12 on the display unit 21 is specified by the specifying unit 16a provided in the digital pen 10, based on the dot pattern included in the image captured by the reading unit 15.
  • In addition, trapezoidal distortion of the image captured by the reading unit 15 is detected by the posture detection unit 16b provided in the digital pen 10, and the tilt of the pen tip unit 12 with respect to the display unit 21 is detected based on the degree of the trapezoidal distortion.
  • the display device 20 determines the change position of the display information on the display unit 21 based on the position information specified by the specifying unit 16a and taking into account the posture information detected by the posture detection unit 16b.
  • As a result, the display position can be corrected so that there is no deviation between the position of the pen tip portion 12 and the position displayed on the display.
  • Furthermore, since the tilt of the pen tip 12 is detected from the image captured by the reading unit 15, there is no need to add a new device to obtain the posture information of the digital pen 10. Therefore, the user's sense of incongruity can be eliminated without increasing the size or cost of the digital pen 10.
  • As described above, in the display control system 100 of the present embodiment, a position on the display unit 21 is indicated by the digital pen 10 to the display device 20 having the display unit 21.
  • The reading unit 15 captures an image including the dot pattern representing a planar position on the display unit 21.
  • The position on the display unit 21 is specified by the specifying unit 16a based on the dot pattern included in the captured image.
  • The posture detection unit 16b detects the distortion of the captured image, and the posture of the digital pen 10 with respect to the display unit 21 is detected from the detected distortion.
  • For this reason, even if, for example, a deviation occurs between the indicated position and the reading position of the dot pattern due to the posture of the digital pen 10, the display position on the display unit 21 can be corrected in consideration of the detected posture information. Therefore, using the posture information detected from the captured image, the deviation between the indicated position and the display position due to the posture of the digital pen 10 can be corrected.
  • In addition, the detected posture information can also be used for setting the manner in which display information is changed on the display unit 21.
  • In the above description, the specifying unit 16a and the posture detection unit 16b are separate blocks, but they may be integrated and perform the processing together.
  • In the above description, the position information and the posture information are transmitted from the digital pen 10, and the display device 20 corrects the position indicated by the position information based on the posture information.
  • Alternatively, the position indicated by the position information may be corrected by the digital pen 10 based on the posture information, and the corrected display position may be transmitted to the display device 20.
  • However, when the type of the display device 20 changes, the distance from the surface of the display unit 21 to the layer on which the dot pattern is formed (here, the color filter 30) may also change, and the way the display position is corrected then needs to be changed accordingly. For this reason, it can be said that the correction process is preferably performed on the display device 20 side.
  • FIG. 10 is a block diagram illustrating a schematic configuration of the display control system 200.
  • the display control system 200 is different from the first embodiment in that the display device 220, not the digital pen 210, identifies the position of the digital pen 210 and detects the posture.
  • the description of the configuration substantially similar to that of the first embodiment may be omitted.
  • the digital pen 210 has a pressure sensor 13, an irradiation unit 14, a reading unit 15, a control unit 216, and a transmission unit 17.
  • the configurations of the pressure sensor 13, the irradiation unit 14, the reading unit 15, and the transmission unit 17 are the same as those in the first embodiment.
  • the control unit 216 includes the pen-side microcomputer 16c and does not include the specifying unit 16a and the posture detection unit 16b of the first embodiment. That is, the control unit 216 outputs the image signal input from the image sensor 15c to the transmission unit 17 without specifying the position information of the digital pen 210 from the image signal. Thus, the image signal picked up by the image pickup device 15c is transmitted from the digital pen 210.
  • The display device 220 includes a receiving unit 22 that receives an external signal, a display-side microcomputer 23 that controls the entire display device 220, a display panel 24 that displays an image, a specifying unit 240 that specifies the position of the digital pen 210, and a posture detection unit 241 that detects the posture of the digital pen 210.
  • the configurations of the receiving unit 22, the display-side microcomputer 23, and the display panel 24 are the same as those in the first embodiment.
  • a dot pattern as shown in FIG. 5 is formed on the display unit 21 of the display panel 24.
  • the receiving unit 22 receives a signal transmitted from the digital pen 210 and transmits the signal to the specifying unit 240.
  • the specifying unit 240 has the same function as the specifying unit 16a of the digital pen 10 in the first embodiment. That is, in this embodiment, since the transmission signal from the digital pen 210 is an image signal acquired by the image sensor 15c, the specifying unit 240 specifies the position of the digital pen 210 from the image signal. That is, the specifying unit 240 acquires a dot pattern from the image signal, and specifies the position coordinates of the pen tip unit 12 on the display unit 21 based on the dot pattern, similarly to the specifying unit 16a. The specifying unit 240 transmits the specified position information to the display-side microcomputer 23.
  • The posture detection unit 241 has the same function as the posture detection unit 16b of the digital pen 10 in the first embodiment. That is, in this embodiment, since the transmission signal from the digital pen 210 is the image signal acquired by the image sensor 15c, the posture detection unit 241 detects the posture of the digital pen 210 from the image signal. That is, like the posture detection unit 16b, the posture detection unit 241 detects image distortion from the image signal based on the arrangement of the dot pattern and the like, and detects, from the detected distortion information, the inclination of the digital pen 210 with respect to the normal direction of the display unit 21. The posture detection unit 241 transmits the detected posture information to the display-side microcomputer 23. The display-side microcomputer 23 controls the display panel 24 to change the display information displayed on the display unit 21 based on the position information and the posture information.
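  • Since the raw image now travels to the display device, the position specification, posture detection, and correction all run on the display side. The sketch below only illustrates that division of work; the object and method names are placeholders, not APIs from the disclosure.

```python
def on_image_received(display, ir_image):
    """Schematic of the Embodiment 2 display-side handling of one image frame."""
    position = display.specifying_unit.position(ir_image)   # specifying unit 240
    posture = display.posture_detector.tilt(ir_image)       # posture detection unit 241
    draw_at = display.correct(position, posture)             # display-side microcomputer 23
    display.panel.update(draw_at)                            # change the display on panel 24
```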
  • FIG. 11 is a flowchart showing the processing flow of the display control system 200. Below, the case where a user writes characters on the display device 220 using the digital pen 210 is described.
  • In step S21, the pen-side microcomputer 16c of the digital pen 210 monitors whether or not pressure is applied to the pen tip portion 12.
  • When the pressure is detected, the pen-side microcomputer 16c determines that the user is inputting characters on the display unit 21 of the display device 220, and proceeds to step S22.
  • In step S22, the reading unit 15 of the digital pen 210 acquires an image of the dot pattern formed on the display unit 21.
  • the image signal acquired by the reading unit 15 is transmitted to the display device 220 via the transmission unit 17 in step S23.
  • the image signal transmitted from the digital pen 210 is received by the receiving unit 22 of the display device 220 in step S24.
  • the received image signal is sent to the specifying unit 240 and the posture detecting unit 241.
  • the specifying unit 240 acquires a dot pattern based on the image signal and specifies the position of the digital pen 210.
  • the position information specified by the specifying unit 240 is sent to the display-side microcomputer 23.
  • The posture detection unit 241 detects the trapezoidal distortion of the image from the image signal, and detects the inclination of the pen tip unit 12 based on the degree of the trapezoidal distortion. For example, as described above, after rotation correction is performed on the image, the inclination and pitch of the vertical columns of the dot pattern are obtained, and the inclination of the imaging direction of the reading unit 15 is calculated from them by geometric calculation. The calculated inclination is sent to the display-side microcomputer 23 as posture information.
  • In step S26, upon receiving the position information and the posture information, the display-side microcomputer 23 controls the display panel 24 to change the display contents at the position corresponding to the position information and the posture information.
  • At this time, the display-side microcomputer 23 corrects the position corresponding to the position information, taking the posture information into consideration, to a position at which it is estimated that the user will not feel uncomfortable.
  • In step S27, the pen-side microcomputer 16c determines whether or not the input by the user is continued. If the input continues (Yes), the process returns to step S21 and the above flow is repeated. If the input is not continued, the process ends. In this way, characters corresponding to the locus of the pen tip 12 of the digital pen 210 are displayed on the display unit 21 of the display device 220.
  • the display control system 200 can detect the position of the digital pen 210 operated by the user with high definition and reflect the position on the display unit 21 with high definition.
  • As described above, also in the display control system 200 of the present embodiment, a position on the display unit 21 is indicated by the digital pen 210 to the display device 220 having the display unit 21.
  • The reading unit 15 captures an image including the dot pattern representing a planar position on the display unit 21.
  • The specifying unit 240 specifies the position on the display unit 21 based on the dot pattern included in the captured image.
  • The posture detection unit 241 detects the distortion of the captured image, and the posture of the digital pen 210 with respect to the display unit 21 is detected from the detected distortion.
  • For this reason, even if a deviation occurs between the indicated position and the reading position of the dot pattern due to the posture of the digital pen 210, the display position on the display unit 21 can be corrected in consideration of the detected posture information. Therefore, using the posture information detected from the captured image, the deviation between the indicated position and the display position due to the posture of the digital pen 210 can be corrected.
  • In addition, the detected posture information can also be used for setting the manner in which display information is changed on the display unit 21.
  • Embodiments 1 and 2 have been described as examples of the technology disclosed in the present application. However, the technology in the present disclosure is not limited to this, and can also be applied to an embodiment in which changes, replacements, additions, omissions, and the like are appropriately performed. In addition, it is possible to combine the components described in the first and second embodiments to form a new embodiment.
  • In Embodiments 1 and 2, the optical digital pen has been described as an example of the pointing device.
  • However, the pointing device only needs to indicate a position on the display unit of the display device and to include an instruction unit for indicating a position on the display unit and a reading unit that captures an image, including the position information pattern, at the position indicated by the instruction unit. Therefore, the pointing device is not limited to an optical digital pen. Further, the configurations of the instruction unit and the reading unit are not limited to those shown in Embodiments 1 and 2.
  • In Embodiments 1 and 2, the dot pattern has been described as an example of the position information pattern.
  • However, the position information pattern only needs to be formed on the display unit of the display device and to represent a planar position on the display unit. Therefore, the position information pattern is not limited to a dot pattern. Further, the way of expressing the position coordinates and the manner of dividing the unit areas are not limited to those shown in Embodiments 1 and 2.
  • In Embodiments 1 and 2, the dots 33 are provided in the color filter 30, but the arrangement is not limited to this.
  • the dots 33 may be provided on the glass substrate 25 or the polarizing filter 26.
  • the display panel 24 may be configured to include a sheet different from the color filter 30, the glass substrate 25, and the polarizing filter 26 in which the dots 33 are formed.
  • In Embodiments 1 and 2, the posture information of the digital pen 10 is used for correcting the display position. That is, the position at which the display information on the display unit is changed is determined based on the position information output from the specifying unit, in consideration of the posture information of the pointing device.
  • However, the use of the posture information is not limited to this. For example, the thickness of the displayed line, the touch of the line, or the like may be switched according to the inclination of the pen tip portion 12. That is, the manner in which the display information on the display unit is changed can be set according to the posture information.
  • In Embodiments 1 and 2, the inclination of the pen tip 12 is obtained from the trapezoidal distortion of the image.
  • In addition, the rotation angle of the pen tip 12 about its axis can also be detected as posture information from the rotation correction amount of the image.
  • For example, a multicolor pen can be realized by changing the displayed color according to the rotation angle of the pen tip unit 12.
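  • As a rough illustration of using posture information to change the display mode rather than the position, a stroke attribute can be derived from the detected tilt and rotation as shown below; the particular widths and colors are assumptions, not values from the disclosure.

```python
def stroke_attributes(tilt_deg: float, rotation_deg: float):
    """Illustrative mapping from pen posture to stroke style."""
    width = 1.0 + 4.0 * min(tilt_deg, 60.0) / 60.0     # more tilt -> broader line
    palette = ["black", "red", "green", "blue"]
    color = palette[int(rotation_deg % 360) // 90]      # quarter turns select a color
    return width, color

print(stroke_attributes(45.0, 200.0))   # (4.0, 'green')
```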
  • In Embodiments 1 and 2, the digital pen 10 having a configuration in which the pen tip and the reading position coincide, as illustrated in FIG. 4, has been described as an example, but the present disclosure is not limited to this.
  • Even when the pen is configured so that the pen tip and the reading position are separated by a predetermined length, if the pen is tilted with respect to the display unit 21, the deviation between the pen tip position and the reading position will differ from that predetermined length. Therefore, the method described above is also effective in such a case.
  • FIG. 12 is a schematic cross-sectional view showing another configuration of the digital pen, in which the same reference numerals as those in FIG. 4 denote the same components. The digital pen 10 shown in FIG. 12 includes a pressure sensor 13, an irradiation unit 14 that emits infrared light, a reading unit 15 that reads incident infrared light and includes a lens 15b and an image sensor 15c, a control unit 16 that controls the digital pen 10, a transmission unit 17 that outputs a signal to the outside, and a power source 19 that supplies power to each member of the digital pen 10.
  • The specifying unit and the posture detection unit are provided in the digital pen 10 in Embodiment 1 and in the display device 220 in Embodiment 2, but the arrangement is not limited to these. Alternatively, they may be provided in a control device separate from the digital pen and the display device.
  • This disclosure is applicable to a display control system that can realize highly accurate handwriting input.
  • the present disclosure can be applied to tablets, smartphones, notebook PCs, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

In this display control system (100), display control is performed in accordance with positions on a display panel (24) indicated by an indicating device (10). Said indicating device (10) is provided with a reading unit (15) that takes an image at an indicated position, said image including a position-information pattern. An identification unit (16a) identifies the position on the basis of the position-information pattern included in the image taken by the reading unit (15). An orientation-detection unit (16b) detects distortion in the taken image, and from said distortion, detects the orientation of the indicating device (10).

Description

Display control system and pointing device
This disclosure relates to a display control system capable of handwriting input on a display surface of a digital display.
Patent Document 1 discloses a technique in which, when characters or the like are written on paper using a pen, the information written on the paper is digitized and the digitized information is transmitted to a server or a terminal.
JP 2007-226577 A
This disclosure provides a display control system capable of executing handwriting input with high accuracy on the display surface of a display device.
The display control system according to the present disclosure includes a display device having a display unit and a pointing device for indicating a position on the display unit, and performs display control according to the position indicated by the pointing device. A position information pattern representing a planar position on the display unit is formed on the display unit of the display device. The pointing device includes an instruction unit for indicating a position on the display unit, and a reading unit that captures an image, including the position information pattern, at the position indicated by the instruction unit. The display control system further includes a specifying unit that specifies a position on the display unit based on the position information pattern included in the image captured by the reading unit, and a posture detection unit that detects distortion of the captured image and detects, from the detected distortion, the posture of the pointing device with respect to the display unit.
The display control system according to the present disclosure can realize highly accurate handwriting input without hindering the downsizing of the pointing device, the cost reduction of the display control system, and the like.
FIG. 1 is a schematic diagram of a display control system according to an embodiment.
FIG. 2 is a block diagram of the display control system according to Embodiment 1.
FIG. 3 is a schematic cross-sectional view of a display panel.
FIG. 4 is a schematic cross-sectional view of a digital pen.
FIG. 5 is a plan view of a color filter.
FIGS. 6(A) to 6(D) are diagrams showing examples of dot arrangements corresponding to each code.
FIGS. 7(a) and 7(b) are diagrams for explaining the relationship between the inclination of the digital pen and the position information that is read.
FIG. 8 shows examples of images captured in the states of FIG. 7.
FIG. 9 is a flowchart illustrating the processing flow of the display control system according to Embodiment 1.
FIG. 10 is a block diagram of a display control system according to Embodiment 2.
FIG. 11 is a flowchart illustrating the processing flow of the display control system according to Embodiment 2.
FIG. 12 is a schematic cross-sectional view showing another configuration of the digital pen.
In recent years, systems capable of handwriting input have been developed in which characters or the like are written on the display surface of a display device using a writing instrument such as a stylus, and the locus of the writing instrument is displayed on the display surface as it is. However, such systems are still under development. In particular, there is still room for development in terms of high-definition handwriting input.
As one such display control system, a configuration can be considered in which a position information pattern indicating a planar position is formed on the display unit, the indicated position is detected by optically reading that pattern with a pointing device, and a locus display or the like is performed.
In a display control system using such optical reading, the following problems are expected to occur. That is, there is a certain gap, depending on the thickness of the glass substrate or the like, between the surface of the display unit with which the pointing device abuts and the layer on which the position information pattern is formed. For this reason, when using, for example, a pointing device that performs reading at the tip of its instruction unit, the indicated position matches the reading position when the pointing device is held perpendicular to the display unit surface, but when the pointing device is tilted with respect to the display unit surface, the indicated position and the reading position are displaced. As a result, the position indicated by the instruction unit and the position displayed on the display device do not necessarily match, and a shift occurs, which may give the user a sense of discomfort. In addition, since the magnitude of the positional deviation changes depending on the inclination of the pointing device, it is not appropriate to correct the display position by a fixed amount.
In response to this problem, if the actual posture of the pointing device with respect to the display unit, such as its tilt, can be detected, the display position can be corrected according to this posture information. However, adding a separate device to detect the attitude of the pointing device is undesirable in terms of downsizing the pointing device and reducing the cost of the display control system.
Hereinafter, embodiments will be described in detail with reference to the drawings as appropriate. However, descriptions that are more detailed than necessary may be omitted. For example, detailed descriptions of matters that are already well known and redundant descriptions of substantially the same configurations may be omitted. This is to avoid making the following description unnecessarily redundant and to facilitate understanding by those skilled in the art.
Note that the inventor(s) provide the accompanying drawings and the following description so that those skilled in the art can fully understand the present disclosure, and do not intend thereby to limit the subject matter described in the claims.
(Embodiment 1)
[1. Overview of display control system]
FIG. 1 is a schematic diagram illustrating the appearance of a display control system 100 according to the embodiment. The display control system 100 includes an optical digital pen (hereinafter simply referred to as a "digital pen") 10 as a pointing device and a display device 20. As will be described in detail later, the display device 20 is, for example, a liquid crystal display and can display various objects on the display unit 21. A dot pattern serving as a position information pattern that represents a planar position on the display unit 21 is formed on the display unit 21. The digital pen 10 optically reads the dot pattern, thereby detecting the position information of the indicated position, and transmits the position information to the display device 20. The display device 20 receives the position information as an input and performs various processes. That is, the digital pen 10 functions as a reading device and also functions as a data input device for the display control system 100.
For example, the display device 20 can display the locus of the tip of the digital pen 10 by continuously displaying points at the indicated positions on the display unit 21, following the movement of the digital pen 10. That is, characters, figures, and the like can be written on the display unit 21 using the digital pen 10. In addition, the display device 20 can erase the portion of the display that matches the locus of the tip of the digital pen 10 by continuously erasing the display at the indicated positions on the display unit 21, following the movement of the digital pen 10. That is, the digital pen 10 can be used like an eraser. Furthermore, by displaying the indicated position on the display unit 21, the display device 20 can use the digital pen 10 as an input device such as a mouse. In this way, in the display control system 100, by moving the digital pen 10 on the display unit 21 of the display device 20, the position of the digital pen 10 is given to the display device 20 as an input, and the display device 20 changes the displayed contents in accordance with the input.
[2. Configuration of display device]
The display device 20 will be described below. FIG. 2 is a block diagram showing a schematic configuration of the display control system 100.
The display device 20 includes a receiving unit 22 that receives signals from the outside, a display-side microcomputer 23 that controls the entire display device 20, and a display panel 24 that displays images. The display panel 24 of the present embodiment is a liquid crystal panel.
The receiving unit 22 receives a signal transmitted from the digital pen 10, which will be described in detail later. The signal received by the receiving unit 22 is sent to the display-side microcomputer 23.
The display-side microcomputer 23 includes a CPU, a memory, and the like, and a program for operating the CPU is also installed on it. The display-side microcomputer 23 is an example of a control unit. For example, the display-side microcomputer 23 controls the display panel 24 based on a signal transmitted from the digital pen 10 and changes the content displayed on the display unit 21.
FIG. 3 is a schematic sectional view of the display panel 24. The basic configuration of the display panel 24 is the same as that of a general liquid crystal panel. Specifically, the display panel 24 includes a pair of glass substrates 25, polarizing filters 26 provided on the outer surface of each glass substrate 25, a pair of alignment films 27 provided between the pair of glass substrates 25, a liquid crystal layer 28 provided between the pair of alignment films 27, transparent electrodes 29 provided on each alignment film 27, and a color filter 30 provided between the front-side glass substrate 25 and the transparent electrode 29. Various images are displayed on the display unit 21. As will be described in detail later, the dot pattern mentioned above is written on the color filter 30. The dot pattern is an example of a position information pattern.
[3. Configuration of digital pen]
Next, the detailed configuration of the digital pen 10 will be described. FIG. 4 is a cross-sectional view showing a schematic configuration of the digital pen 10.
The digital pen 10 includes a cylindrical main body 11, a pen tip 12 attached to the tip of the main body 11 as a pointing unit, a pressure sensor 13 that detects the pressure acting on the pen tip 12, an irradiation unit 14 that emits infrared light, a reading unit 15 that reads incident infrared light, a control unit 16 that controls the digital pen 10, a transmission unit 17 that outputs signals to the outside, and a power source 19 that supplies power to each member of the digital pen 10.
The main body 11 is formed as a cylinder like that of a general pen. The pen tip 12 is tapered, and its tip is rounded so as not to damage the display unit 21. The shape of the pen tip 12 is preferably one that allows the user to easily see the image displayed on the display unit 21. In the configuration of FIG. 4, the pen tip 12 is formed of a material that can transmit infrared light.
The pressure sensor 13 is built into the main body 11 and connected to the proximal end of the pen tip 12. The pressure sensor 13 detects the pressure applied to the pen tip 12 and transmits the detection result to the control unit 16. Specifically, the pressure sensor 13 detects the pressure applied to the pen tip 12 when the user writes characters or the like on the display unit 21 using the digital pen 10. That is, the pressure sensor 13 is used to determine whether the user intends to make an input with the digital pen 10.
The irradiation unit 14 is provided at the tip of the main body 11, in the vicinity of the pen tip 12. The irradiation unit 14 is composed of, for example, infrared LEDs and is configured to emit infrared light from the tip of the main body 11. In the configuration of FIG. 4, a plurality of (for example, four) irradiation units 14 are arranged so as to surround the pen tip 12. The number of irradiation units 14 can be set as appropriate. The irradiation unit 14 may also be formed in a ring shape.
The reading unit 15 is provided at the tip of the main body 11, in the vicinity of the pen tip 12. The reading unit 15 includes an objective lens 15a built into the tip of the pen tip 12, a lens 15b, and an image sensor 15c; the objective lens 15a and the lens 15b constitute an optical system. The objective lens 15a and the lens 15b focus incident light onto the image sensor 15c. Infrared light emitted from the irradiation unit 14 and reflected by the display device 20 enters the objective lens 15a. The image sensor 15c is provided on the optical axis of the objective lens 15a and the lens 15b. The image sensor 15c converts the optical image formed on its imaging surface into an electrical signal and outputs it to the control unit 16. The image sensor 15c is composed of, for example, a CCD image sensor or a CMOS image sensor. As will be described in detail later, the dot pattern is formed of a material that absorbs infrared light, so no infrared light returns from the dot pattern. As a result, an optical image in which the dot pattern appears black is captured by the image sensor 15c.
As shown in FIG. 2, the control unit 16 includes a specifying unit 16a, a posture detection unit 16b, and a pen-side microcomputer 16c. The specifying unit 16a specifies the position information of the digital pen 10 on the display unit 21 based on the image signal from the reading unit 15. Specifically, the specifying unit 16a acquires a dot pattern from the image signal acquired by the reading unit 15 and specifies the position of the pen tip 12 on the display unit 21 based on the dot pattern. The information on the position of the pen tip 12 specified by the specifying unit 16a is sent to the pen-side microcomputer 16c. The posture detection unit 16b detects posture information of the digital pen 10 with respect to the display unit 21 based on the image signal from the reading unit 15. Specifically, the posture detection unit 16b detects image distortion from the image signal acquired by the reading unit 15, based on the arrangement of the dot pattern and the like, and from the detected distortion information detects the tilt of the digital pen 10 with respect to the normal direction of the surface of the display unit 21. The information on the posture of the digital pen 10 detected by the posture detection unit 16b is sent to the pen-side microcomputer 16c. The pen-side microcomputer 16c controls the entire digital pen 10. The pen-side microcomputer 16c includes a CPU, a memory, and the like, and a program for operating the CPU is also installed on it.
The transmission unit 17 transmits signals to the outside. Specifically, the transmission unit 17 wirelessly transmits the position information specified by the specifying unit 16a and the posture information detected by the posture detection unit 16b to the outside. The transmission unit 17 performs short-range wireless communication with the receiving unit 22 of the display device 20. The transmission unit 17 is provided at the end of the main body 11 opposite to the pen tip 12.
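As a rough illustration of the kind of payload the transmission unit 17 might send in one report, the following sketch packs a position and a tilt into a short message. The field layout and struct format are assumptions made only for illustration; the disclosure does not specify any wire format.

```python
import struct

def encode_pen_report(x_mm: float, y_mm: float, tilt_deg: float, pressure: int) -> bytes:
    # Hypothetical little-endian layout: position (2 floats), tilt (1 float), pressure (uint16).
    return struct.pack("<fffH", x_mm, y_mm, tilt_deg, pressure)

def decode_pen_report(payload: bytes) -> dict:
    x_mm, y_mm, tilt_deg, pressure = struct.unpack("<fffH", payload)
    return {"x_mm": x_mm, "y_mm": y_mm, "tilt_deg": tilt_deg, "pressure": pressure}
```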
[4. Detailed structure of color filter]
Next, the detailed structure of the color filter 30 will be described. FIG. 5 is a plan view of the color filter 30.
The color filter 30 includes a black matrix 31, a plurality of pixel regions 32 that are partitioned by the black matrix 31 and transmit light of a specific color, and dots 33 provided in the pixel regions 32. The pixel regions 32 include red pixel regions 32r that transmit red (R) light, green pixel regions 32g that transmit green (G) light, and blue pixel regions 32b that transmit blue (B) light. Each pixel region 32 corresponds to a subpixel of the display panel 24. When the transmitted colors are not distinguished, they are simply referred to as "pixel regions 32". The black matrix 31 includes vertical lines extending in the longitudinal direction of the pixel regions 32 and horizontal lines extending in the transverse direction of the pixel regions 32, and is formed in a lattice shape. The horizontal lines are formed thicker than the vertical lines. The black matrix 31 and the dots 33 are formed of a material whose main component is carbon black. The dots 33 are formed as solid circles. The dots 33 are provided not in all of the pixel regions 32 but in some of them. In the color filter 30, a plurality of dots 33 together form a dot pattern. This dot pattern differs depending on the position on the color filter 30.
The dot pattern will be described in detail below.
First, first reference lines 34 and second reference lines 35 are defined on the color filter 30. These first and second reference lines 34 and 35 are virtual lines and do not actually exist. The first reference lines 34 are straight lines extending in the transverse direction of the pixel regions 32. A plurality of first reference lines 34 are arranged side by side in the longitudinal direction of the pixel regions 32, with two pixel regions 32 between adjacent lines. Each first reference line 34 is located at the longitudinal center of a pixel region 32. The second reference lines 35 are straight lines extending in the longitudinal direction of the pixel regions 32. The second reference lines 35 are provided on the green pixel regions 32g, and a plurality of them are arranged side by side in the transverse direction of the pixel regions 32, with two green pixel regions 32g between adjacent lines. Each second reference line 35 is located at the transverse center of a green pixel region 32g. The first reference lines 34 and the second reference lines 35 define a grid on the color filter 30.
Each dot 33 is arranged near an intersection of a first reference line 34 and a second reference line 35. FIG. 6 shows the arrangement patterns of the dots 33. Each dot 33 is placed at a position shifted from the intersection in one of four mutually orthogonal directions (up, down, left, and right in FIGS. 5 and 6). Specifically, each dot 33 takes one of the arrangements shown in FIGS. 6(A) to 6(D). In the arrangement of FIG. 6(A), the dot 33 is placed at a position shifted to the right along the first reference line 34 from the intersection of the first reference line 34 and the second reference line 35; in this case the dot 33 lies on a blue pixel region 32b, and this arrangement is represented by the numeral "1". In the arrangement of FIG. 6(B), the dot 33 is placed at a position shifted upward along the second reference line 35 from the intersection; in this case the dot 33 lies on a green pixel region 32g, and this arrangement is represented by "2". In the arrangement of FIG. 6(C), the dot 33 is placed at a position shifted to the left along the first reference line 34 from the intersection; in this case the dot 33 lies on a red pixel region 32r, and this arrangement is represented by "3". In the arrangement of FIG. 6(D), the dot 33 is placed at a position shifted downward along the second reference line 35 from the intersection; in this case the dot 33 lies on a green pixel region 32g, and this arrangement is represented by "4".
Then, with 6 dots × 6 dots taken as one unit area, the 36 dots 33 contained in a unit area form one dot pattern. By giving each of the 36 dots 33 in a unit area one of the arrangements "1" to "4", an enormous number of dot patterns can be formed. The dot patterns of the unit areas are all different from one another.
Information is assigned to each of these dot patterns. Specifically, each dot pattern represents the position coordinates of its unit area. That is, when the color filter 30 is divided into unit areas of 6 dots × 6 dots, each dot pattern represents the position coordinates of that unit area. For such dot-pattern encoding and coordinate conversion, a known method such as that disclosed in Japanese Patent Application Laid-Open No. 2006-141067 can be used, for example.
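To make the encoding concrete, the sketch below turns a 6 × 6 block of per-dot displacement codes ("1" to "4", as defined above) into a single integer that could identify a unit area. The base-4 packing and the mapping from that integer to (x, y) coordinates are illustrative assumptions; an actual system would use a coding scheme such as the one referenced above.

```python
# Displacement codes as defined for FIG. 6: 1 = right, 2 = up, 3 = left, 4 = down.
def decode_unit_area(codes):
    """codes: 6x6 nested list of values in {1, 2, 3, 4} read from one unit area."""
    assert len(codes) == 6 and all(len(row) == 6 for row in codes)
    value = 0
    for row in codes:
        for c in row:
            value = value * 4 + (c - 1)   # pack 36 base-4 digits into one integer
    return value

def unit_area_to_coords(value, areas_per_row=1000):
    # Hypothetical mapping: unit areas numbered row-major across the color filter.
    # A real encoding would restrict the usable code space to the panel dimensions.
    return value % areas_per_row, value // areas_per_row
```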
[5. Operation]
Next, the operation of the display control system 100 configured as described above will be described.
FIG. 7 is an enlarged view showing the contact state between the digital pen and the display unit of the display device in the present embodiment. In FIG. 7(a), the pen tip 12 of the digital pen 10 is oriented perpendicular to the display unit 21, and the imaging direction dc of the reading unit 15 of the digital pen 10 coincides with the normal direction dn of the surface of the display unit 21. In contrast, in FIG. 7(b), the pen tip 12 of the digital pen 10 is tilted with respect to the display unit 21, and the imaging direction dc of the reading unit 15 of the digital pen 10 deviates from the normal direction dn of the display unit 21. For this reason, in the state of FIG. 7(b), the position A imaged on the color filter 30 on which the dot pattern is formed is shifted from the position B directly below the pen tip 12. This is because the glass substrate 25 and the polarizing filter 26 on the surface of the display unit 21 have thickness.
Meanwhile, in the state of FIG. 7(b), since the imaging direction dc of the reading unit 15 is tilted with respect to the normal direction dn of the surface of the display unit 21, trapezoidal (keystone) distortion occurs in the image captured by the reading unit 15. Therefore, in the present embodiment, the trapezoidal distortion is detected from the captured image, and the tilt of the digital pen 10 is detected from the degree of that distortion.
FIG. 8 shows examples of images captured by the reading unit 15: (a) is an image captured in the state of FIG. 7(a), and (b) is an image captured in the state of FIG. 7(b), that is, with the pen tip 12 tilted. As can be seen by comparison with FIG. 8(a), trapezoidal distortion occurs in the image of FIG. 8(b). By detecting the degree of this trapezoidal distortion through image processing, it is possible to detect how much the pen tip 12 is tilted with respect to the display unit 21. The degree of trapezoidal distortion can be obtained, for example, from the pitch of the vertical columns of the dot pattern. That is, in the image of FIG. 8(a), the vertical columns of the dot pattern are arranged substantially in parallel. In contrast, in the image of FIG. 8(b), the vertical columns of the dot pattern are inclined, and the inclination gradually increases toward the edge of the image. In other words, the pitch of the vertical columns of the dot pattern is large on the near side and small on the far side. From this difference in pitch, the tilt of the imaging direction dc of the reading unit 15, that is, the tilt of the pen tip 12, can be calculated by geometric calculation.
In practice, the image captured by the reading unit 15 often contains rotation in addition to trapezoidal distortion, so it is preferable to detect the trapezoidal distortion after applying rotation correction to the image. The method of detecting trapezoidal distortion is not limited to the one shown here.
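A minimal sketch of the pitch-based estimate described above is given below. It assumes the vertical columns of dots have already been extracted (after rotation correction) and that the near-side and far-side column pitches can be measured in pixels; the small-angle geometry used is a simplification, not the exact calculation of the embodiment.

```python
import math

def estimate_tilt_deg(pitch_near_px: float, pitch_far_px: float) -> float:
    """Estimate the tilt of the imaging direction from keystone (trapezoidal) distortion.

    With no tilt the near and far column pitches are equal; the larger their ratio,
    the larger the tilt. The formula below is a simplified geometric model.
    """
    if pitch_far_px <= 0:
        raise ValueError("far-side pitch must be positive")
    ratio = pitch_near_px / pitch_far_px          # 1.0 when the pen is perpendicular
    # Simplified model: the excess of the ratio over 1.0 grows roughly with the tilt.
    tilt_rad = math.atan(max(ratio - 1.0, 0.0))
    return math.degrees(tilt_rad)
```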
FIG. 9 is a flowchart showing the processing flow of the display control system 100. The following describes the case where the user writes characters on the display device 20 using the digital pen 10.
First, when the display control system 100 is powered on, in step S11 the pen-side microcomputer 16c of the digital pen 10 monitors whether pressure is acting on the pen tip 12. The pressure is detected by the pressure sensor 13. When pressure is detected (Yes), the pen-side microcomputer 16c determines that the user is making an input on the display unit 21 of the display device 20, and the process proceeds to step S12. While no pressure is detected (No), the pen-side microcomputer 16c repeats step S11. Note that when the digital pen 10 is powered on, the irradiation unit 14 starts emitting infrared light.
In step S12, the reading unit 15 of the digital pen 10 detects the dot pattern formed on the display unit 21. Infrared light is emitted from the irradiation unit 14; this infrared light is absorbed at least by the dots 33 provided on the color filter 30 of the display device 20, while it is reflected in the pixel regions 32 and elsewhere. The reflected infrared light is received by the image sensor 15c through the objective lens 15a and the lens 15b. As a result, the dot pattern is imaged by the image sensor 15c. In this way, the reading unit 15 optically reads the dot pattern. The image signal acquired by the reading unit 15 is sent to the specifying unit 16a and the posture detection unit 16b.
In step S13, the specifying unit 16a acquires the dot pattern from the image signal and specifies the position of the pen tip 12 on the display unit 21 based on the dot pattern. Specifically, the specifying unit 16a acquires the dot pattern by applying predetermined image processing to the obtained image signal. For example, since the black matrix 31, like the dots 33, is formed of carbon black, it absorbs infrared light; the black matrix 31 is therefore included in the image from the reading unit 15 in the same state as the dots 33. Accordingly, predetermined image processing is applied to the image signal from the reading unit 15 to make the dots 33 easy to distinguish from the black matrix 31, and the arrangement of the dots 33 is acquired from the processed image signal. Subsequently, the specifying unit 16a determines a 6-dot × 6-dot unit area from the acquired arrangement of dots 33 and specifies the position coordinates (position information) of that unit area from its dot pattern. The specifying unit 16a converts the dot pattern into position coordinates by a predetermined calculation corresponding to the coding method of the dot pattern. The specified position information is sent to the pen-side microcomputer 16c.
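One way to realize the "predetermined image processing" described for step S13 is to threshold the infrared image and keep only small, roughly circular dark blobs, discarding the long thin lines of the black matrix 31. The sketch below, using OpenCV calls, is only an illustrative assumption; the embodiment does not prescribe a specific algorithm, and the size and aspect-ratio thresholds are arbitrary.

```python
import cv2
import numpy as np

def extract_dot_centers(ir_image: np.ndarray):
    """ir_image: 8-bit grayscale frame from image sensor 15c (dots and matrix appear dark)."""
    # Dark regions (dots 33 and black matrix 31) become white in the binary mask.
    _, mask = cv2.threshold(ir_image, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    centers = []
    for i in range(1, n):  # label 0 is the background
        w = stats[i, cv2.CC_STAT_WIDTH]
        h = stats[i, cv2.CC_STAT_HEIGHT]
        area = stats[i, cv2.CC_STAT_AREA]
        # Keep small, roughly square blobs (dots); reject long thin blobs (matrix lines).
        if area < 200 and 0.5 < w / max(h, 1) < 2.0:
            centers.append(tuple(centroids[i]))
    return centers
```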
In step S14, the posture detection unit 16b detects the trapezoidal distortion of the image from the image signal and detects the tilt of the pen tip 12 based on the degree of that trapezoidal distortion. For example, as described above, after applying rotation correction to the image, the inclination and pitch of the vertical columns of the dot pattern are obtained, and the tilt of the imaging direction of the reading unit 15 is calculated from them by geometric calculation. The calculated tilt is sent to the pen-side microcomputer 16c as posture information.
Subsequently, in step S15, the pen-side microcomputer 16c transmits the position information and the posture information to the display device 20 via the transmission unit 17.
The position information and posture information transmitted from the digital pen 10 are received by the receiving unit 22 of the display device 20. The received position information and posture information are sent from the receiving unit 22 to the display-side microcomputer 23. When the display-side microcomputer 23 receives the position information, in step S16 the display-side microcomputer 23 controls the display panel 24 so as to change the display content at the position corresponding to the position information. In this example, since characters are being input, a point is displayed at the position on the display unit 21 corresponding to the position information.
At this time, the display-side microcomputer 23 corrects the position corresponding to the position information by taking the posture information into account. For example, in the state of FIG. 7(b), the position information indicates position A, but the actual position where the pen tip 12 touches the display unit 21 is position B, so a deviation arises. This deviation is caused by the tilt of the pen tip 12. Therefore, so as not to give the user a sense of incongruity, the display-side microcomputer 23 changes the position at which the display content is changed to, for example, position B, taking into account the posture information, that is, the tilt of the pen tip 12. Note that the corrected position does not necessarily have to be set to position B directly below the pen tip 12; it may, for example, be set to a position between position A and position B at which the user is presumed not to feel any incongruity.
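The correction described here can be approximated from the tilt angle and the thickness of the layers between the display surface and the dot-pattern layer. The sketch below assumes a single effective cover thickness and a simple tangent relation; the default thickness, the lean direction parameter, and the blending factor toward position B are illustrative assumptions only.

```python
import math

def corrected_position(x_read: float, y_read: float, tilt_deg: float,
                       azimuth_deg: float, cover_thickness_mm: float = 1.0,
                       blend: float = 1.0):
    """Shift the read position A toward the pen-tip contact position B.

    tilt_deg: tilt of the pen from the display normal (posture information).
    azimuth_deg: direction in which the pen leans within the display plane.
    blend: 1.0 moves fully to the estimated position B; smaller values stop in between.
    """
    offset = cover_thickness_mm * math.tan(math.radians(tilt_deg))  # A-to-B distance
    dx = offset * math.cos(math.radians(azimuth_deg))
    dy = offset * math.sin(math.radians(azimuth_deg))
    return x_read + blend * dx, y_read + blend * dy
```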
Subsequently, in step S17, the pen-side microcomputer 16c determines whether the user's input is continuing. If the pressure sensor 13 is detecting pressure, the pen-side microcomputer 16c determines that the user's input is continuing, and the process returns to step S11. By repeating this flow, points are displayed continuously at the position of the pen tip 12 on the display unit 21, following the movement of the pen tip 12 of the digital pen 10. Finally, characters corresponding to the locus of the pen tip 12 of the digital pen 10 are displayed on the display unit 21 of the display device 20.
On the other hand, if the pressure sensor 13 is not detecting pressure in step S17, the pen-side microcomputer 16c determines that the user's input is not continuing, and the processing ends.
In this way, the display device 20 displays the locus of the tip of the digital pen 10 on the display unit 21, so that handwriting input to the display unit 21 using the digital pen 10 can be performed.
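Putting steps S11 to S17 together, the pen-side loop can be sketched as below. The helper objects stand in for the components already described (pressure sensor 13, reading unit 15, specifying unit 16a, posture detection unit 16b, transmission unit 17); their method names are hypothetical, not an actual API.

```python
def pen_main_loop(pressure_sensor, reader, specifier, posture_detector, transmitter):
    """Simplified pen-side flow corresponding to steps S11-S17 of FIG. 9."""
    while True:
        if not pressure_sensor.is_pressed():       # S11 / S17: no pressure -> keep waiting
            continue
        image = reader.capture()                   # S12: image the dot pattern
        position = specifier.locate(image)         # S13: dot pattern -> position coordinates
        tilt = posture_detector.tilt_from(image)   # S14: keystone distortion -> tilt
        transmitter.send(position, tilt)           # S15: report to display device 20 (S16 happens there)
```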
Although the above description concerns the case of writing characters, the use of the display control system 100 is not limited to this. Not only characters but also numbers, symbols, figures, and the like can of course be written, and the digital pen 10 can also be used like an eraser to erase characters, figures, and the like displayed on the display unit 21. That is, in the above example a point is displayed at the position on the display unit 21 corresponding to the position information, but the point at that position may instead be erased. Furthermore, the digital pen 10 can be used like a mouse to move a pointer displayed on the display unit 21 or to select an icon displayed on the display unit 21.
[6. Effects of the embodiment]
As described above, according to the present embodiment, the specifying unit 16a provided in the digital pen 10 specifies the position of the pen tip 12 on the display unit 21 based on the dot pattern included in the image captured by the reading unit 15. At the same time, the posture detection unit 16b provided in the digital pen 10 detects the trapezoidal distortion of the image captured by the reading unit 15 and, based on the degree of that trapezoidal distortion, detects the tilt of the pen tip 12 with respect to the display unit 21. The display device 20 determines the position at which the display information on the display unit 21 is changed, based on the position information specified by the specifying unit 16a and taking into account the posture information detected by the posture detection unit 16b.
As a result, even when the digital pen 10 is held at a tilt, the display position can be corrected so that no deviation arises between the position of the pen tip 12 and the display position on the display. Moreover, since the tilt of the pen tip 12 is detected from the image captured by the reading unit 15, there is no need to add a new device in order to obtain the posture information of the digital pen 10. Therefore, the user's sense of incongruity can be eliminated without increasing the size or cost of the digital pen 10.
That is, in the present embodiment, a position on the display unit 21 is indicated by the digital pen 10 to the display device 20 having the display unit 21. At this time, in the digital pen 10, the reading unit 15 captures an image including the dot pattern representing the planar position on the display unit 21. The specifying unit 16a then specifies the position on the display unit 21 based on the dot pattern included in this captured image. The posture detection unit 16b detects the distortion of this captured image and, from the detected distortion, detects the posture of the digital pen 10 with respect to the display unit 21. Therefore, even when a deviation arises between the indicated position and the reading position of the dot pattern due, for example, to the posture of the digital pen 10, the display position on the display unit 21 can be corrected by taking the detected posture information into account. Accordingly, by using the posture information detected from the captured image, the deviation between the indicated position and the display position caused by the posture of the digital pen 10 can be corrected before it becomes noticeable. The detected posture information can also be used to set the form in which the display information on the display unit 21 is changed.
Note that in the position detection of the pen tip 12 by the specifying unit 16a as well, image distortion correction and rotation correction are needed in order to recognize the position information pattern accurately. It is therefore expected that the distortion correction amount of the image is detected within the processing of the specifying unit 16a. In that case, the tilt of the pen tip 12 can be detected from this distortion correction amount. Therefore, although the specifying unit 16a and the posture detection unit 16b are separate blocks in the configuration of FIG. 2, they may be configured to perform their processing as a single unit.
In the above embodiment, the position information and the posture information are transmitted from the digital pen 10, and the display device 20 corrects the position indicated by the position information based on the posture information. Alternatively, the digital pen 10 may correct the position indicated by the position information based on the posture information and transmit the corrected display position to the display device 20. However, if the type of the display device 20 changes, the distance from the surface of the display unit 21 to the layer on which the dot pattern is formed (here, the color filter 30) may change, and the correction of the display position must be changed accordingly. For this reason, it can be said that the correction processing is preferably performed on the display device 20 side.
(Embodiment 2)
Next, a display control system 200 according to Embodiment 2 will be described. FIG. 10 is a block diagram showing a schematic configuration of the display control system 200. The display control system 200 differs from Embodiment 1 in that the specification of the position of the digital pen 210 and the detection of its posture are performed by the display device 220 rather than by the digital pen 210. Hereinafter, duplicate descriptions of configurations substantially the same as those of Embodiment 1 may be omitted.
As shown in FIG. 10, the digital pen 210 includes the pressure sensor 13, the irradiation unit 14, the reading unit 15, a control unit 216, and the transmission unit 17. The configurations of the pressure sensor 13, the irradiation unit 14, the reading unit 15, and the transmission unit 17 are the same as in Embodiment 1. The control unit 216 includes the pen-side microcomputer 16c but does not include the specifying unit 16a or the posture detection unit 16b of Embodiment 1. That is, the control unit 216 outputs the image signal input from the image sensor 15c to the transmission unit 17 without specifying the position information of the digital pen 210 from the image signal. Thus, the digital pen 210 transmits the image signal captured by the image sensor 15c.
As shown in FIG. 10, the display device 220 includes the receiving unit 22 that receives signals from the outside, the display-side microcomputer 23 that controls the entire display device 220, the display panel 24 that displays images, a specifying unit 240 that specifies the position of the digital pen 210, and a posture detection unit 241 that detects the posture of the digital pen 210. The configurations of the receiving unit 22, the display-side microcomputer 23, and the display panel 24 are the same as in Embodiment 1. A dot pattern such as that shown in FIG. 5 is formed on the display unit 21 of the display panel 24. The receiving unit 22 receives the signal transmitted from the digital pen 210 and sends the signal to the specifying unit 240. The specifying unit 240 has the same function as the specifying unit 16a of the digital pen 10 in Embodiment 1. That is, in the present embodiment, since the signal transmitted from the digital pen 210 is the image signal acquired by the image sensor 15c, the specifying unit 240 specifies the position of the digital pen 210 from that image signal. Like the specifying unit 16a, the specifying unit 240 acquires the dot pattern from the image signal and specifies the position coordinates of the pen tip 12 on the display unit 21 based on the dot pattern. The specifying unit 240 sends the specified position information to the display-side microcomputer 23.
The posture detection unit 241 has the same function as the posture detection unit 16b of the digital pen 10 in Embodiment 1. That is, in the present embodiment, since the signal transmitted from the digital pen 210 is the image signal acquired by the image sensor 15c, the posture detection unit 241 detects the posture of the digital pen 210 from that image signal. Like the posture detection unit 16b, the posture detection unit 241 detects image distortion from the image signal based on the arrangement of the dot pattern and the like, and from the detected distortion information detects the tilt of the digital pen 210 with respect to the normal direction of the display unit 21. The posture detection unit 241 sends the detected posture information to the display-side microcomputer 23. The display-side microcomputer 23 controls the display panel 24 so as to change the display information displayed on the display unit 21 based on the position information and the posture information.
Next, the operation of the display control system 200 will be described. FIG. 11 is a flowchart showing the processing flow of the display control system 200. The following describes the case where the user writes characters on the display device 220 using the digital pen 210.
When the display control system 200 is powered on, in step S21 the pen-side microcomputer 16c of the digital pen 210 monitors whether pressure is acting on the pen tip 12. When pressure is detected (Yes), the pen-side microcomputer 16c determines that the user is making an input on the display unit 21 of the display device 220, and the process proceeds to step S22. In step S22, the reading unit 15 of the digital pen 210 acquires an image of the dot pattern formed on the display unit 21. In step S23, the image signal acquired by the reading unit 15 is transmitted to the display device 220 via the transmission unit 17.
In step S24, the image signal transmitted from the digital pen 210 is received by the receiving unit 22 of the display device 220. The received image signal is sent to the specifying unit 240 and the posture detection unit 241. The specifying unit 240 acquires the dot pattern based on the image signal and specifies the position of the digital pen 210. The position information specified by the specifying unit 240 is sent to the display-side microcomputer 23.
In step S25, the posture detection unit 241 detects the trapezoidal distortion of the image from the image signal and detects the tilt of the pen tip 12 based on the degree of that trapezoidal distortion. For example, as described above, after applying rotation correction to the image, the inclination and pitch of the vertical columns of the dot pattern are obtained, and the tilt of the imaging direction of the reading unit 15 is calculated from them by geometric calculation. The calculated tilt is sent to the display-side microcomputer 23 as posture information.
Subsequently, in step S26, upon receiving the position information and the posture information, the display-side microcomputer 23 controls the display panel 24 so as to change the display content at the position corresponding to the position information and the posture information. In this example, since characters are being input, a point is displayed at the position on the display unit 21 corresponding to the position information. At this time, the display-side microcomputer 23 corrects the position corresponding to the position information by taking the posture information into account, for example to a position at which the user is presumed not to feel any incongruity.
Thereafter, in step S27, the pen-side microcomputer 16c determines whether the user's input is continuing. If the input is continuing (Yes), the process returns to step S21 and the above flow is repeated. If the input is not continuing, the processing ends. In this way, characters corresponding to the locus of the pen tip 12 of the digital pen 210 are displayed on the display unit 21 of the display device 220.
By performing such processing, the display control system 200 can detect the position of the digital pen 210 operated by the user with high precision and reflect it on the display unit 21 with high precision.
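In Embodiment 2 the decoding moves to the display side; a sketch of the receive-and-update handling (steps S24 to S26) is given below, again with hypothetical helper objects standing in for the specifying unit 240, the posture detection unit 241, and the display panel 24.

```python
def on_image_received(image, specifier, posture_detector, panel):
    """Display-side handling in Embodiment 2: decode position and tilt, then update the panel."""
    position = specifier.locate(image)            # S24: specifying unit 240
    tilt = posture_detector.tilt_from(image)      # S25: posture detection unit 241
    x, y = correct_for_tilt(position, tilt)       # S26: adjust the position using posture info
    panel.draw_point(x, y)                        # change the display content

def correct_for_tilt(position, tilt):
    # Placeholder for the correction discussed in Embodiment 1
    # (see the corrected_position sketch earlier in this description).
    return position
```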
That is, in the present embodiment, a position on the display unit 21 is indicated by the digital pen 210 to the display device 220 having the display unit 21. At this time, in the digital pen 210, the reading unit 15 captures an image including the dot pattern representing the planar position on the display unit 21. Then, in the display device 220, the specifying unit 240 specifies the position on the display unit 21 based on the dot pattern included in this captured image. The posture detection unit 241 detects the distortion of this captured image and, from the detected distortion, detects the posture of the digital pen 210 with respect to the display unit 21. Therefore, even when a deviation arises between the indicated position and the reading position of the dot pattern due, for example, to the posture of the digital pen 210, the display position on the display unit 21 can be corrected by taking the detected posture information into account. Accordingly, by using the posture information detected from the captured image, the deviation between the indicated position and the display position caused by the posture of the digital pen 210 can be corrected before it becomes noticeable. The detected posture information can also be used to set the form in which the display information on the display unit 21 is changed.
(Other embodiments)
As described above, Embodiments 1 and 2 have been described as examples of the technology disclosed in the present application. However, the technology in the present disclosure is not limited to these embodiments and can also be applied to embodiments in which changes, replacements, additions, omissions, and the like are made as appropriate. It is also possible to combine the components described in Embodiments 1 and 2 above to form new embodiments.
Accordingly, other embodiments will be exemplified below.
In Embodiments 1 and 2, an optical digital pen has been described as an example of the pointing device. The pointing device is for indicating a position on the display unit of the display device, and it suffices that it includes a pointing unit for indicating a position on the display unit and a reading unit that captures an image including the position information pattern at the position indicated by the pointing unit. Therefore, the pointing device is not limited to an optical digital pen. The configurations of the pointing unit and the reading unit are also not limited to those shown in Embodiments 1 and 2.
In Embodiments 1 and 2, a dot pattern has been described as an example of the position information pattern. The position information pattern need only be formed on the display unit of the display device and represent a planar position on the display unit. Therefore, the position information pattern is not limited to a dot pattern. The way of expressing the position coordinates, the form of dividing the unit areas, and the like are also not limited to those shown in Embodiments 1 and 2.
The dots 33 are provided on the color filter 30, but the arrangement is not limited to this. The dots 33 may be provided on the glass substrate 25 or the polarizing filter 26. Furthermore, the display panel 24 may be configured to include a sheet on which the dots 33 are formed, separate from the color filter 30, the glass substrate 25, and the polarizing filter 26.
In each of the above embodiments, the posture information of the digital pen 10 is used for correcting the display position. That is, the position at which the display information on the display unit is changed is determined based on the position information output from the specifying unit, taking the posture information of the pointing device into account. However, the use of the posture information is not limited to this. For example, the thickness of the displayed line, the touch of the line, and the like may be switched according to the tilt of the pen tip 12. That is, it is also possible to set the form in which the display information on the display unit is changed according to the posture information.
In each of the above embodiments, the tilt of the pen tip 12 is obtained from the trapezoidal distortion of the image, but it is also possible, for example, to detect the rotation angle of the pen tip 12 about its axis as posture information from the rotation correction amount of the image. In that case, one conceivable use of the posture information is, for example, to change the displayed color according to the rotation angle of the pen tip 12, thereby realizing a multi-color pen.
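As a toy illustration of using posture information to change the form of the display rather than its position, the sketch below maps tilt to line width and axial rotation to pen color. The thresholds and the color table are arbitrary assumptions, not part of the disclosure.

```python
def stroke_style(tilt_deg: float, rotation_deg: float):
    """Return (line_width, color) derived from posture information; values are illustrative only."""
    # A more tilted pen draws a broader line, similar to shading with a pencil.
    width = 1 + int(tilt_deg // 15)               # 0-14 deg -> 1 px, 15-29 deg -> 2 px, ...
    # Rotation about the pen axis selects one of several colors (multi-color pen idea).
    palette = ["black", "red", "blue", "green"]
    color = palette[int(rotation_deg % 360) // 90]
    return width, color
```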
In each of the above embodiments, the digital pen 10 having a configuration in which the pen tip and the reading position coincide, as shown in FIG. 4, has been described as an example, but the configuration is not limited to this. For example, even with a digital pen in which the pen tip and the reading position are separated by a predetermined length, as shown in FIG. 12, the distance between the pen-tip position and the reading position deviates from the predetermined length when the pen is tilted with respect to the display unit 21. The method described above is therefore effective in that case as well. In FIG. 12, components common to FIG. 4 are given the same reference numerals as in FIG. 4. That is, the digital pen 10 shown in FIG. 12 includes a cylindrical main body 11, a pen tip 12 attached to the tip of the main body 11 as a pointing unit, a pressure sensor 13 that detects pressure acting on the pen tip 12, an irradiation unit 14 that emits infrared light, a reading unit 15 that has a lens 15b and an image sensor 15c and reads incident infrared light, a control unit 16 that controls the digital pen 10, a transmission unit 17 that outputs signals to the outside, and a power source 19 that supplies power to each member of the digital pen 10.
Regarding the specifying unit and the posture detection unit, they are provided in the digital pen 10 in Embodiment 1 and in the display device 220 in Embodiment 2, but the arrangement is not limited to this; they may also be provided as a control device separate from the digital pen and the display device.
As described above, the embodiments have been described as examples of the technology in the present disclosure. For that purpose, the accompanying drawings and the detailed description have been provided.
Accordingly, the components described in the accompanying drawings and the detailed description may include not only components essential for solving the problem but also components that are not essential for solving the problem, provided in order to illustrate the above technology. Therefore, the mere fact that those non-essential components are described in the accompanying drawings or the detailed description should not be taken as an immediate determination that those non-essential components are essential.
In addition, since the above-described embodiments are for illustrating the technology in the present disclosure, various changes, replacements, additions, omissions, and the like can be made within the scope of the claims or their equivalents.
The present disclosure is applicable to display control systems that can realize highly accurate handwriting input. Specifically, the present disclosure is applicable to tablets, smartphones, notebook PCs, and the like.
10, 210  Optical digital pen (pointing device)
12  Pen tip (pointing unit)
15  Reading unit
16a, 240  Specifying unit
16b, 241  Posture detection unit
20, 220  Display device
21  Display unit
24  Display panel
100, 200  Display control system

Claims (6)

  1.  A display control system comprising: a display device having a display unit; and an indicating device for indicating a position on the display unit, the display control system performing display control according to the position indicated by the indicating device, wherein
     the display unit of the display device has a position information pattern formed thereon, the pattern representing planar positions on the display unit,
     the indicating device comprises:
     an indicating unit for indicating a position on the display unit; and
     a reading unit that captures an image including the position information pattern at the position indicated by the indicating unit, and
     the display control system comprises:
     a specifying unit that specifies a position on the display unit based on the position information pattern included in the image captured by the reading unit; and
     an attitude detection unit that detects distortion of the captured image and, from the detected distortion, detects the attitude of the indicating device with respect to the display unit.
  2.  The display control system according to claim 1, wherein
     the display device determines a change position of display information on the display unit based on the position information output from the specifying unit, taking into account the attitude information output from the attitude detection unit.
  3.  The display control system according to claim 1 or 2, wherein
     the display device is capable of setting a change form of the display information on the display unit according to the attitude information.
  4.  The display control system according to any one of claims 1 to 3, wherein
     the attitude detection unit detects trapezoidal distortion of the captured image and, from the degree of the detected trapezoidal distortion, detects the inclination of the indicating device with respect to the display unit.
  5.  The display control system according to any one of claims 1 to 3, wherein
     the indicating device has a pen shape, and
     the attitude detection unit detects a rotation correction amount of the captured image and, from the detected rotation correction amount, detects a rotation angle of the indicating device about its axis.
  6.  An indicating device for indicating a position on a display unit of a display device, the display unit having a position information pattern formed thereon that represents planar positions on the display unit, the indicating device comprising:
     an indicating unit for indicating a position on the display unit;
     a reading unit that captures an image including the position information pattern at the position indicated by the indicating unit;
     a specifying unit that specifies a position on the display unit based on the position information pattern included in the image captured by the reading unit; and
     an attitude detection unit that detects distortion of the captured image and, from the detected distortion, detects the attitude of the indicating device with respect to the display unit.
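 As one way to picture the processing recited in claims 1, 4 and 5, the sketch below estimates the inclination of the pen from the keystone (trapezoidal) distortion of a captured square patch of the position information pattern, and the rotation about the pen axis from the in-plane rotation of that patch. The corner-based toy model, the arccosine mapping from edge-length ratio to tilt, and all names are illustrative assumptions rather than the procedure disclosed in the embodiments.

import math

def attitude_from_corners(corners):
    """Estimate (tilt_deg, roll_deg) from the 4 corners of an imaged square
    patch of the position-information pattern, given in image coordinates as
    [top_left, top_right, bottom_right, bottom_left].

    Toy model: with the pen perpendicular to the display the patch images as a
    square; tilting the pen makes the far edge appear shorter (keystone), and
    rotating the pen about its axis rotates the whole patch in the image.
    """
    tl, tr, br, bl = corners

    def length(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    top, bottom = length(tl, tr), length(bl, br)
    # Keystone ratio of near/far edge lengths -> tilt; 1.0 means no tilt.
    # The arccosine mapping is a simplifying assumption, not a calibrated model.
    ratio = min(top, bottom) / max(top, bottom)
    tilt_deg = math.degrees(math.acos(ratio))

    # In-plane angle of the bottom edge -> rotation about the pen axis.
    roll_deg = math.degrees(math.atan2(br[1] - bl[1], br[0] - bl[0]))
    return tilt_deg, roll_deg

# Example: a patch whose far (top) edge images 10% shorter than the near edge
# and which is rotated 15 degrees in the image plane.
corners = [(-0.45, 1.0), (0.45, 1.0), (0.5, 0.0), (-0.5, 0.0)]
rot = math.radians(15.0)
corners = [(x * math.cos(rot) - y * math.sin(rot),
            x * math.sin(rot) + y * math.cos(rot)) for x, y in corners]
print(attitude_from_corners(corners))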
PCT/JP2013/002708 2012-04-26 2013-04-22 Display control system and indicating device WO2013161261A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-101720 2012-04-26
JP2012101720 2012-04-26

Publications (1)

Publication Number Publication Date
WO2013161261A1 true WO2013161261A1 (en) 2013-10-31

Family

ID=49482604

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/002708 WO2013161261A1 (en) 2012-04-26 2013-04-22 Display control system and indicating device

Country Status (1)

Country Link
WO (1) WO2013161261A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003500777A (en) * 1999-05-28 2003-01-07 アノト・アクティエボラーク Recording information
JP2011103092A (en) * 2009-11-11 2011-05-26 Dainippon Printing Co Ltd Information processing system and display processing program
JP2011253343A (en) * 2010-06-02 2011-12-15 Dainippon Printing Co Ltd Information processing system and display processing program

Similar Documents

Publication Publication Date Title
WO2013035319A1 (en) Display panel, display device, and display control system
KR101152724B1 (en) Mouse provided with a dot pattern reading function
WO2013161246A1 (en) Display control system, display device, and display panel
US20130314313A1 (en) Display with coding pattern
WO2010109715A1 (en) Touch panel input system, and input pen
US20120162061A1 (en) Activation objects for interactive systems
JP2013242821A (en) Picture display device and picture operation method of the same
US9477327B2 (en) Display device and display control system
US20140362054A1 (en) Display control system and reading device
JP5553920B2 (en) Display panel and display device
WO2013161236A1 (en) Optical film, display panel, and display device
WO2011114590A1 (en) Position input device, position input system, position input method, position input program and computer-readable recording medium
WO2013161245A1 (en) Display control system, display device, and display panel
WO2013161261A1 (en) Display control system and indicating device
JP2014041602A (en) Information reading apparatus
JP5420807B1 (en) Display control system, pointing device and display panel
US20160364039A1 (en) Optical film, display panel, and display device
WO2016132732A1 (en) Display panel
WO2011121842A1 (en) Display device with input unit, control method for same, control program and recording medium
JP2010117841A (en) Image detection device, recognition method of input position and program
WO2014017039A1 (en) Information reading device
JP2007279296A (en) Information processing system, client device, and program making computer execute information processing method
US20150261327A1 (en) Display control system
JP2010128566A (en) Image detection device, recognition method for input region, and program
KR101385771B1 (en) Method for Reading Code Indicated on Display Panel, Reader Executing the Method Therein, Electronic Device Equipped with Display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13780809

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13780809

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP