WO2013161261A1 - Display control system and pointing device - Google Patents

Display control system and pointing device

Info

Publication number
WO2013161261A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
display
control system
display unit
display control
Prior art date
Application number
PCT/JP2013/002708
Other languages
English (en)
Japanese (ja)
Inventor
山田 和宏
Original Assignee
Panasonic Corporation (パナソニック株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation (パナソニック株式会社)
Publication of WO2013161261A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03542Light pens for emitting or receiving light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0317Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G06F3/0321Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. pen optically detecting position indicative tags printed on a paper sheet

Definitions

  • This disclosure relates to a display control system capable of handwriting input on a display surface of a digital display.
  • Patent Document 1 discloses a technique for digitizing information entered on paper and transmitting the digitized information to a server or a terminal when characters or the like are entered on the paper using a pen.
  • This disclosure provides a display control system capable of executing handwriting input with high accuracy on the display surface of a display device.
  • the display control system includes a display device having a display unit and an instruction device for indicating a position on the display unit, and performs display control according to the position indicated by the instruction device.
  • A position information pattern representing a planar position on the display unit is formed on the display unit of the display device. The pointing device has an instruction unit for indicating a position on the display unit and a reading unit that captures an image, including the position information pattern, at the position indicated by the instruction unit.
  • The display control system further includes a specifying unit that specifies a position on the display unit based on the position information pattern included in the image captured by the reading unit, and a posture detection unit that detects distortion of the captured image and detects the posture of the pointing device with respect to the display unit from the detected distortion.
  • the display control system according to the present disclosure can realize highly accurate handwriting input without hindering downsizing of the pointing device and cost reduction of the display control system.
  • FIG. 1 is a schematic diagram illustrating the appearance of a display control system according to Embodiment 1.
  • FIG. 2 is a block diagram of the display control system according to Embodiment 1.
  • FIG. 3 is a schematic sectional view of a display panel.
  • FIG. 4 is a schematic sectional view of a digital pen.
  • FIG. 5 is a plan view of a color filter.
  • FIGS. 6(A) to 6(D) are diagrams showing examples of dot arrangements corresponding to each code.
  • FIGS. 7(a) and 7(b) are diagrams for explaining the relationship between the inclination of the digital pen and the position information that is read.
  • FIG. 8 shows examples of images captured in the states of FIG. 7.
  • FIG. 9 is a flowchart illustrating a processing flow of the display control system according to Embodiment 1.
  • FIG. 10 is a block diagram of a display control system according to Embodiment 2.
  • FIG. 11 is a flowchart illustrating a processing flow of the display control system according to Embodiment 2.
  • FIG. 12 is a schematic sectional view showing another configuration of the digital pen.
  • A configuration can be considered in which a position information pattern indicating a planar position is formed on the display unit, the indicated position is detected by optically reading the pattern with a pointing device, and a locus display or the like is performed.
  • In such a configuration, however, the following problem is expected. There is a certain gap, determined by the thickness of the glass substrate and the like, between the surface of the display unit that the pointing device touches and the layer on which the position information pattern is formed. For this reason, with a pointing device that reads at the tip of its instruction unit, the indicated position matches the reading position when the device is held perpendicular to the display surface, but the two positions are displaced when the device is tilted with respect to the display surface. As a result, the position indicated by the instruction unit and the position displayed on the display device do not necessarily match, and the resulting shift may give the user a sense of discomfort. Moreover, since the magnitude of the positional deviation changes with the inclination of the pointing device, correcting the display position by a fixed amount is not appropriate.
  • If the posture of the pointing device is known, the display position can be corrected according to the posture information.
  • However, adding a dedicated device for detecting the attitude of the pointing device is not preferable in terms of downsizing the pointing device and reducing the cost of the display control system.
  • FIG. 1 is a schematic diagram illustrating an appearance of a display control system 100 according to the embodiment.
  • the display control system 100 includes an optical digital pen (hereinafter simply referred to as “digital pen”) 10 as a pointing device and a display device 20.
  • the display device 20 is a liquid crystal display, for example, and can display various objects on the display unit 21.
  • the display unit 21 is formed with a dot pattern as a position information pattern representing a planar position in the display unit 21.
  • the digital pen 10 optically reads the dot pattern to detect position information of the designated position and transmits the position information to the display device 20.
  • the display device 20 receives the position information as input and performs various processes. That is, the digital pen 10 functions as a reading device and also functions as a data input device to the display control system 100.
  • the display device 20 can display the locus of the tip of the digital pen 10 by continuously displaying points at designated positions on the display unit 21 following the movement of the digital pen 10. That is, it is possible to enter characters, figures, and the like on the display unit 21 using the digital pen 10.
  • The display device 20 can also erase the portion of the display that matches the locus of the tip of the digital pen 10, by following the movement of the digital pen 10 and continuously erasing the display at the designated positions on the display unit 21. That is, the digital pen 10 can be used like an eraser.
  • Furthermore, the display device 20 can use the digital pen 10 as an input device such as a mouse, for example by displaying a pointer at the specified position on the display unit 21. In this way, in the display control system 100, moving the digital pen 10 on the display unit 21 of the display device 20 provides the position of the digital pen 10 as an input to the display device 20, and the display device 20 changes the displayed content in response to that input.
  • FIG. 2 is a block diagram illustrating a schematic configuration of the display control system 100.
  • the display device 20 includes a receiving unit 22 that receives a signal from the outside, a display-side microcomputer 23 that controls the entire display device 20, and a display panel 24 that displays an image.
  • the display panel 24 of the present embodiment is a liquid crystal panel.
  • the receiving unit 22 receives a signal transmitted from the digital pen 10, which will be described in detail later.
  • the signal received by the receiving unit 22 is sent to the display-side microcomputer 23.
  • The display-side microcomputer 23 is composed of a CPU, a memory, and the like, and also stores a program for operating the CPU.
  • the display-side microcomputer 23 is an example of a control unit.
  • the display-side microcomputer 23 controls the display panel 24 based on a signal transmitted from the digital pen 10 and changes the content displayed on the display unit 21.
  • FIG. 3 is a schematic sectional view of the display panel 24.
  • the basic configuration of the display panel 24 is the same as that of a general liquid crystal panel.
  • The display panel 24 includes a pair of glass substrates 25, a polarizing filter 26 provided on the outer surface of each glass substrate 25, a pair of alignment films 27 provided between the pair of glass substrates 25, a liquid crystal layer 28 provided between the pair of alignment films 27, a transparent electrode 29 provided on each alignment film 27, and a color filter 30 provided between the front-side glass substrate 25 and the transparent electrode 29.
  • Various images are displayed on the display unit 21.
  • the dot pattern described above is written on the color filter 30.
  • the dot pattern is an example of a position information pattern.
  • FIG. 4 is a cross-sectional view illustrating a schematic configuration of the digital pen 10.
  • The digital pen 10 includes a cylindrical main body 11, a pen tip unit 12 serving as an instruction unit attached to the tip of the main body 11, a pressure sensor 13 that detects pressure acting on the pen tip unit 12, an irradiation unit 14 that emits infrared light, a reading unit 15 that reads incident infrared light, a control unit 16 that controls the digital pen 10, a transmission unit 17 that outputs a signal to the outside, and a power source 19 that supplies power to each member of the digital pen 10.
  • the main body 11 is formed of a cylinder similar to a general pen.
  • The pen tip unit 12 has a tapered shape, and its tip is rounded so as not to damage the display unit 21. It is also preferable that the pen tip unit 12 has a shape that allows the user to easily see the image displayed on the display unit 21. In the configuration of FIG. 4, the pen tip unit 12 is formed of a material that transmits infrared light.
  • the pressure sensor 13 is built in the main body 11 and connected to the proximal end of the pen tip 12.
  • the pressure sensor 13 detects the pressure applied to the pen tip unit 12 and transmits the detection result to the control unit 16. Specifically, the pressure sensor 13 detects the pressure applied to the pen tip portion 12 when the user enters characters or the like on the display portion 21 using the digital pen 10. That is, the pressure sensor 13 is used when determining whether or not there is an input intention of the user using the digital pen 10.
  • the irradiation unit 14 is provided at the tip of the main body 11 and in the vicinity of the pen tip 12.
  • The irradiation unit 14 is composed of, for example, infrared LEDs, and is configured to emit infrared light from the tip side of the main body 11.
  • In the configuration of FIG. 4, a plurality of (for example, four) irradiation units 14 are arranged so as to surround the pen tip unit 12. The number of irradiation units 14 can be set as appropriate.
  • Alternatively, the irradiation unit 14 may be formed in a ring shape.
  • the reading unit 15 is provided at the tip of the main body 11 and in the vicinity of the pen tip unit 12.
  • the reading unit 15 includes an objective lens 15a, a lens 15b, and an image sensor 15c built in the tip of the pen tip unit 12, and the objective lens 15a and the lens 15b constitute an optical system.
  • the objective lens 15a and the lens 15b form incident light on the image sensor 15c.
  • Infrared light emitted from the irradiation unit 14 and reflected by the display device 20 is incident on the objective lens 15a.
  • the image sensor 15c is provided on the optical axes of the objective lens 15a and the lens 15b.
  • The imaging element 15c converts an optical image formed on the imaging surface into an electrical signal and outputs the electrical signal to the control unit 16.
  • the image sensor 15c is configured by, for example, a CCD image sensor or a CMOS image sensor. Although the details will be described later, since the dot pattern is formed of a material that absorbs infrared light, the dot pattern does not return infrared light. As a result, an optical image in which the dot pattern is expressed in black is captured by the image sensor 15c.
  • the control unit 16 includes a specifying unit 16a, a posture detection unit 16b, and a pen-side microcomputer 16c, as shown in FIG.
  • The specifying unit 16a specifies position information of the digital pen 10 on the display unit 21 based on the image signal from the reading unit 15. Specifically, the specifying unit 16a acquires a dot pattern from the image signal acquired by the reading unit 15, and specifies the position of the pen tip unit 12 on the display unit 21 based on the dot pattern. Information regarding the position of the pen tip unit 12 specified by the specifying unit 16a is sent to the pen-side microcomputer 16c.
  • The posture detection unit 16b detects posture information of the digital pen 10 with respect to the display unit 21 based on the image signal from the reading unit 15.
  • Specifically, the posture detection unit 16b detects image distortion from the image signal acquired by the reading unit 15, based on the arrangement of the dot pattern and the like, and from the detected distortion detects the tilt of the digital pen 10 with respect to the normal direction of the surface of the display unit 21.
  • Information about the posture of the digital pen 10 detected by the posture detection unit 16b is sent to the pen-side microcomputer 16c.
  • The pen-side microcomputer 16c controls the entire digital pen 10.
  • The pen-side microcomputer 16c is composed of a CPU, a memory, and the like, and also stores a program for operating the CPU.
  • the transmission unit 17 transmits a signal to the outside. Specifically, the transmission unit 17 wirelessly transmits the position information specified by the specifying unit 16a and the posture information detected by the posture detection unit 16b to the outside. The transmission unit 17 performs near field communication with the reception unit 22 of the display device 20. The transmitter 17 is provided at the end of the main body 11 opposite to the pen tip 12.
  • FIG. 5 is a plan view of the color filter 30.
  • the color filter 30 includes a black matrix 31, a plurality of pixel regions 32 that are partitioned by the black matrix 31 and transmit light of a specific color, and dots 33 provided in the pixel region 32.
  • The pixel regions 32 include red pixel regions 32r that transmit red (R) light, green pixel regions 32g that transmit green (G) light, and blue pixel regions 32b that transmit blue (B) light.
  • Each pixel region 32 corresponds to a subpixel of the display panel 24. Note that when the colors to be transmitted are not distinguished, they are simply referred to as “pixel region 32”.
  • the black matrix 31 includes a vertical line extending in the longitudinal direction of the pixel region 32 and a horizontal line extending in the short direction of the pixel region 32 and is formed in a lattice shape.
  • the horizontal line is formed thicker than the vertical line.
  • the black matrix 31 and the dots 33 are formed of a material mainly composed of carbon black.
  • the dots 33 are formed in a solid circle.
  • the dots 33 are provided in some pixel regions 32 instead of all the pixel regions 32.
  • a plurality of dots 33 are collected to form a dot pattern. This dot pattern differs depending on the position of the color filter 30.
  • the dot pattern will be described in detail below.
  • A first reference line 34 and a second reference line 35 are defined on the color filter 30. These first and second reference lines 34 and 35 are virtual lines and do not actually exist.
  • the first reference line 34 is a straight line extending in the short direction of the pixel region 32.
  • a plurality of first reference lines 34 are arranged in parallel in the longitudinal direction of the pixel region 32 every two pixel regions 32. Each first reference line 34 is located at the center in the longitudinal direction of the pixel region 32.
  • the second reference line 35 is a straight line extending in the longitudinal direction of the pixel region 32.
  • The second reference lines 35 are provided on the green pixel regions 32g, and a plurality of second reference lines 35 are arranged in parallel in the lateral direction of the pixel regions 32. Each second reference line 35 is located at the center in the lateral direction of a green pixel region 32g.
  • a grid is defined on the color filter 30 by the first reference line 34 and the second reference line 35.
  • FIG. 6 is a diagram showing an arrangement pattern of the dots 33.
  • Each dot 33 is arranged at a position shifted from an intersection of the reference lines in one of four mutually orthogonal directions (up, down, left, and right in FIGS. 5 and 6). Specifically, each dot 33 is arranged in one of the ways shown in FIGS. 6(A) to 6(D).
  • In the arrangement of FIG. 6(A), the dot 33 is shifted to the right of the intersection of the first reference line 34 and the second reference line 35, along the first reference line 34. In this case, the dot 33 lies on a blue pixel region 32b, and the arrangement is digitized as "1".
  • In the arrangement of FIG. 6(B), the dot 33 is shifted upward from the intersection, along the second reference line 35. In this case, the dot 33 lies on a green pixel region 32g, and the arrangement is digitized as "2".
  • In the arrangement of FIG. 6(C), the dot 33 is shifted to the left of the intersection, along the first reference line 34. In this case, the dot 33 lies on a red pixel region 32r, and the arrangement is digitized as "3".
  • In the arrangement of FIG. 6(D), the dot 33 is shifted downward from the intersection, along the second reference line 35. In this case, the dot 33 lies on a green pixel region 32g, and the arrangement is digitized as "4".
  • One dot pattern is formed by the 36 dots 33 included in a unit area of 6 dots × 6 dots.
  • Since each of the 36 dots 33 included in the unit area can take any of the codes "1" to "4", a huge number of distinct dot patterns can be formed.
  • The dot patterns of the unit areas are all different from one another.
  • Each dot pattern represents the position coordinates of its unit area. That is, when the color filter 30 is divided into unit areas of 6 dots × 6 dots, the dot pattern of each unit area represents the position coordinates of that unit area.
  • For the dot pattern coding and coordinate conversion methods, a publicly known method such as the one disclosed in Japanese Patent Application Laid-Open No. 2006-141067 can be used.
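  • The publication referenced above defines the actual coding and coordinate-conversion scheme; it is not reproduced here. As an illustration only, the following Python sketch shows one way a 6 dots × 6 dots unit area of codes "1" to "4" could be turned into unit-area coordinates. The split of the 36 codes into x and y halves and the base-4 interpretation are assumptions made for this sketch.

```python
# Hypothetical decoder for a 6x6 unit area of dot codes.  Each code is 1-4
# (shift right/up/left/down).  The mapping of the 36 codes to coordinates
# below is an assumption, not the scheme of the publication cited above.

from typing import List, Tuple

def decode_unit_area(codes: List[List[int]]) -> Tuple[int, int]:
    """Convert a 6x6 grid of dot codes (values 1..4) into unit-area coordinates."""
    if len(codes) != 6 or any(len(row) != 6 for row in codes):
        raise ValueError("expected a 6x6 grid of codes")
    flat = [c - 1 for row in codes for c in row]        # map 1..4 -> 0..3
    if any(not 0 <= c <= 3 for c in flat):
        raise ValueError("codes must be in the range 1..4")
    # Assumed convention: the first 18 codes encode x and the last 18 encode y,
    # each read as a base-4 number.
    x = 0
    for c in flat[:18]:
        x = x * 4 + c
    y = 0
    for c in flat[18:]:
        y = y * 4 + c
    return x, y

# Example: a unit area whose codes are all "1" maps to the origin.
print(decode_unit_area([[1] * 6 for _ in range(6)]))    # -> (0, 0)
```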
  • FIG. 7 is an enlarged view showing a contact state between the digital pen and the display unit of the display device in the present embodiment.
  • In FIG. 7(a), the pen tip unit 12 of the digital pen 10 is perpendicular to the display unit 21, and the imaging direction dc of the reading unit 15 of the digital pen 10 matches the normal direction dn of the surface of the display unit 21.
  • In FIG. 7(b), the pen tip unit 12 of the digital pen 10 is inclined with respect to the display unit 21, and the imaging direction dc of the reading unit 15 of the digital pen 10 deviates from the normal direction dn of the display unit 21. For this reason, in the state of FIG. 7(b), the position A read on the color filter 30, on which the dot pattern is formed, is shifted from the position B directly below the pen tip unit 12. This is because the glass substrate 25 and the polarizing filter 26 on the surface of the display unit 21 have a thickness.
  • FIG. 8 shows examples of images captured by the reading unit 15: (a) is an image captured in the state of FIG. 7(a), and (b) is an image captured in the state of FIG. 7(b), that is, with the pen tip unit 12 tilted.
  • In the image of FIG. 8(b), trapezoidal distortion occurs, as can be seen by comparison with FIG. 8(a).
  • The degree of trapezoidal distortion can be obtained, for example, from the pitch of the vertical arrangement of the dot pattern. In the image of FIG. 8(a), the vertical columns of the dot pattern are arranged substantially in parallel.
  • In the image of FIG. 8(b), on the other hand, the vertical columns of the dot pattern are inclined, and the inclination gradually increases toward the edge of the image. That is, the pitch of the vertical columns of the dot pattern is large on the near side and small on the far side. From this difference in pitch, the inclination of the imaging direction dc of the reading unit 15, that is, the inclination of the pen tip unit 12, can be calculated by geometric calculation.
  • The trapezoidal distortion detection method is not limited to the one shown here.
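  • The text leaves the geometric calculation unspecified. The following Python sketch illustrates one such calculation under a simple pinhole-camera assumption: the apparent horizontal dot pitch is inversely proportional to the depth of the pattern layer, so the ratio of the pitches measured at the near and far edges of the image yields the tilt. The working distance and the physical span between the measured rows are assumed to be known properties of the pen optics and the dot pattern; they are not values given in the text.

```python
import math

def estimate_tilt_deg(pitch_near: float, pitch_far: float,
                      working_distance_mm: float, row_span_mm: float) -> float:
    """Estimate the pen tilt from the keystone in the dot-row pitches.

    pitch_near / pitch_far: apparent horizontal dot pitch (in pixels) measured
    at the near and far edges of the captured image.
    working_distance_mm: assumed lens-to-pattern distance along the optical axis.
    row_span_mm: physical distance on the pattern layer between the two rows
    where the pitches were measured.

    Pinhole model: pitch is proportional to 1/depth, so with the plane tilted
    by theta about the optical-axis intersection,
        pitch_near / pitch_far = (d + (s/2)*sin(theta)) / (d - (s/2)*sin(theta)).
    """
    r = pitch_near / pitch_far
    sin_theta = 2.0 * working_distance_mm * (r - 1.0) / (row_span_mm * (r + 1.0))
    sin_theta = max(-1.0, min(1.0, sin_theta))           # clamp numerical noise
    return math.degrees(math.asin(sin_theta))

# Equal pitches mean no tilt; a larger near pitch means the pen leans that way.
print(estimate_tilt_deg(10.0, 10.0, 5.0, 3.0))           # -> 0.0
print(estimate_tilt_deg(11.0, 9.0, 5.0, 3.0))            # -> roughly 19.5 degrees
```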
  • FIG. 9 is a flowchart showing the processing flow of the display control system 100. Below, the case where a user writes characters on the display device 20 using the digital pen 10 is described.
  • the pen side microcomputer 16c of the digital pen 10 monitors whether or not pressure is applied to the pen tip portion 12 in step S11. This pressure is detected by the pressure sensor 13. When the pressure is detected (Yes), the pen-side microcomputer 16c determines that the user is inputting characters on the display unit 21 of the display device 20, and proceeds to step S12. While the pressure is not detected (No), the pen side microcomputer 16c repeats Step S11. Note that when the power source of the digital pen 10 is turned on, the irradiation unit 14 starts irradiation of infrared light.
  • In step S12, the reading unit 15 of the digital pen 10 detects the dot pattern formed on the display unit 21.
  • Infrared light is emitted from the irradiation unit 14; this light is absorbed by at least the dots 33 provided in the color filter 30 of the display device 20 and is reflected by the pixel regions 32 and the like.
  • the reflected infrared light is received by the image sensor 15c through the objective lens 15a and the lens 15b.
  • a dot pattern is imaged by the image sensor 15c.
  • the reading unit 15 optically reads the dot pattern.
  • the image signal acquired by the reading unit 15 is transmitted to the specifying unit 16a and the posture detecting unit 16b.
  • the specifying unit 16a acquires a dot pattern from the image signal, and specifies the position of the pen tip unit 12 on the display unit 21 based on the dot pattern. Specifically, the specifying unit 16a acquires a dot pattern by performing predetermined image processing on the obtained image signal. For example, since the black matrix 31 is formed of carbon black like the dots 33, it absorbs infrared light. Therefore, the black matrix 31 is also included in the image from the reading unit 15 in the same state as the dots 33. Therefore, the image signal from the reading unit 15 is subjected to predetermined image processing so that the dots 33 are easily discriminated from the black matrix 31, and an array of a plurality of dots 33 is acquired from the processed image signal.
  • the specifying unit 16a determines a unit area of 6 dots ⁇ 6 dots from the acquired arrangement of the dots 33, and specifies the position coordinates (position information) of the unit area from the dot pattern of the unit area.
  • the specifying unit 16a converts the dot pattern into position coordinates by a predetermined calculation corresponding to the dot pattern coding method.
  • the specified position information is transmitted to the pen-side microcomputer 16c.
  • The posture detection unit 16b detects the trapezoidal distortion of the image from the image signal and detects the inclination of the pen tip unit 12 based on the degree of that distortion. For example, as described above, after performing rotation correction on the image, the inclination and pitch of the vertical arrangement of the dot pattern are obtained, and the inclination of the imaging direction of the reading unit 15 is calculated from them by geometric calculation. The calculated inclination is transmitted to the pen-side microcomputer 16c as posture information.
  • In step S15, the pen-side microcomputer 16c transmits the position information and the posture information to the display device 20 via the transmission unit 17.
  • the position information and posture information transmitted from the digital pen 10 are received by the receiving unit 22 of the display device 20.
  • the received position information and posture information are transmitted from the receiving unit 22 to the display-side microcomputer 23.
  • the display-side microcomputer 23 controls the display panel 24 to change the display content of the position corresponding to the position information.
  • Here, since characters are being input, a point is displayed at the position corresponding to the position information on the display unit 21.
  • At this time, the display-side microcomputer 23 corrects the position corresponding to the position information in consideration of the posture information. For example, in the state of FIG. 7(b), the position information indicates position A, but the actual position where the pen tip unit 12 is in contact with the display unit 21 is position B, so a deviation occurs. This deviation is caused by the inclination of the pen tip unit 12. For this reason, the display-side microcomputer 23 changes the position at which the display content is changed to, for example, position B, in consideration of the posture information, that is, the inclination of the pen tip unit 12, so as not to give the user a sense of incongruity. Note that the corrected position does not necessarily have to be position B immediately below the pen tip unit 12; it may instead be set to a position between position A and position B at which the user is assumed not to feel uncomfortable.
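  • As an illustration of this correction, the following sketch shifts the read position A toward an estimated contact position B, assuming the offset equals the thickness of the cover layers above the pattern layer multiplied by the tangent of the detected tilt. The thickness value, the azimuth convention, and the function interface are assumptions made for this sketch, not values or APIs from the text.

```python
import math
from typing import Tuple

def correct_indicated_position(read_xy_mm: Tuple[float, float],
                               tilt_deg: float,
                               tilt_azimuth_deg: float,
                               cover_thickness_mm: float = 1.1) -> Tuple[float, float]:
    """Shift the read position A toward the estimated pen-contact position B.

    Assumes the dot-pattern layer lies cover_thickness_mm below the touch
    surface, so a tilt of tilt_deg displaces the reading by
    cover_thickness_mm * tan(tilt_deg) along the tilt azimuth.  The 1.1 mm
    default is an arbitrary placeholder, not a value from the text.
    """
    offset = cover_thickness_mm * math.tan(math.radians(tilt_deg))
    ax, ay = read_xy_mm
    # Move from A back toward the point directly below the pen tip (B),
    # i.e. against the direction in which the optical axis leans.
    bx = ax - offset * math.cos(math.radians(tilt_azimuth_deg))
    by = ay - offset * math.sin(math.radians(tilt_azimuth_deg))
    return bx, by

# Example: a 30-degree tilt toward +x displaces the reading by ~0.64 mm; undo it.
print(correct_indicated_position((10.0, 5.0), tilt_deg=30.0, tilt_azimuth_deg=0.0))
```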
  • In step S17, the pen-side microcomputer 16c determines whether or not the input by the user is continued.
  • While the pressure sensor 13 detects pressure, the pen-side microcomputer 16c determines that the input by the user is continued and returns to step S11.
  • By repeating this flow, points are continuously displayed on the display unit 21 at the position of the pen tip unit 12, following the movement of the pen tip unit 12 of the digital pen 10.
  • As a result, characters corresponding to the locus of the pen tip unit 12 of the digital pen 10 are displayed on the display unit 21 of the display device 20.
  • When the pressure is no longer detected, the pen-side microcomputer 16c determines that the input by the user is not continued and ends the process.
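  • The following Python sketch mirrors this pen-side flow of FIG. 9 in code form. The sensor, reader, specifier, detector, and transmitter interfaces are hypothetical stand-ins for the units described above, not an API defined by the publication; the initial wait for pen-down in step S11 is simplified.

```python
import time

class DigitalPenLoop:
    """Hypothetical pen-side main loop corresponding to the flow of FIG. 9."""

    def __init__(self, pressure_sensor, reader, specifier, posture_detector, transmitter):
        self.pressure_sensor = pressure_sensor      # plays the role of pressure sensor 13
        self.reader = reader                        # plays the role of reading unit 15
        self.specifier = specifier                  # plays the role of specifying unit 16a
        self.posture_detector = posture_detector    # plays the role of posture detection unit 16b
        self.transmitter = transmitter              # plays the role of transmission unit 17

    def run_once(self) -> bool:
        """One pass of the flow; returns True while the input continues."""
        if not self.pressure_sensor.is_pressed():   # S11 / S17: pen-down check
            return False
        image = self.reader.capture()               # S12: image the dot pattern
        position = self.specifier.specify(image)    # dot pattern -> position information
        posture = self.posture_detector.detect(image)   # image distortion -> tilt
        self.transmitter.send(position, posture)    # S15: notify the display device
        return True

    def run(self, poll_interval_s: float = 0.01) -> None:
        while self.run_once():
            time.sleep(poll_interval_s)
```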
  • Since the display device 20 displays the locus of the tip of the digital pen 10 on the display unit 21 in this way, handwriting input to the display unit 21 using the digital pen 10 can be performed.
  • the usage of the display control system 100 is not restricted to this.
  • the digital pen 10 can be used like an eraser to erase characters, figures, etc. displayed on the display unit 21. That is, in the above example, the point is displayed at the position corresponding to the position information on the display unit 21, but the point at the position may be deleted.
  • the digital pen 10 can be used like a mouse to move a pointer displayed on the display unit 21 or to select an icon displayed on the display unit 21.
  • As described above, in the present embodiment, the position of the pen tip unit 12 on the display unit 21 is specified by the specifying unit 16a provided in the digital pen 10, based on the dot pattern included in the image captured by the reading unit 15.
  • In addition, the trapezoidal distortion of the image captured by the reading unit 15 is detected by the posture detection unit 16b provided in the digital pen 10, and the tilt of the pen tip unit 12 with respect to the display unit 21 is detected based on the degree of that distortion.
  • the display device 20 determines the change position of the display information on the display unit 21 based on the position information specified by the specifying unit 16a and taking into account the posture information detected by the posture detection unit 16b.
  • Thereby, the display position can be corrected so that there is no deviation between the position of the pen tip unit 12 and the display position on the display.
  • Since the tilt of the pen tip unit 12 is detected from the image captured by the reading unit 15, it is not necessary to add a new device to obtain the posture information of the digital pen 10. Therefore, the user's sense of discomfort can be eliminated without increasing the size and cost of the digital pen 10.
  • As described above, in the display control system 100, a position on the display unit 21 of the display device 20 is indicated by the digital pen 10.
  • the reading unit 15 captures an image including a dot pattern representing a planar position on the display unit 21.
  • The position on the display unit 21 is specified by the specifying unit 16a based on the dot pattern included in the captured image.
  • The posture detection unit 16b detects the distortion of the captured image, and the posture of the digital pen 10 with respect to the display unit 21 is detected from the detected distortion.
  • For this reason, even if there is a deviation between the indicated position and the reading position of the dot pattern due to the posture of the digital pen 10, the display position on the display unit 21 can be corrected in consideration of the detected posture information. Therefore, using the posture information detected from the captured image, the deviation between the indicated position and the display position caused by the posture of the digital pen 10 can be corrected.
  • the detected posture information can also be used for setting a display information change mode on the display unit 21.
  • the specifying unit 16a and the posture detection unit 16b are separate blocks, but these may be integrated to perform processing.
  • In the above description, the position information and the posture information are transmitted from the digital pen 10, and the display device 20 corrects the position indicated by the position information based on the posture information.
  • Alternatively, the digital pen 10 may correct the position indicated by the position information based on the posture information and transmit the corrected display position to the display device 20.
  • However, if the type of the display device 20 changes, the distance from the surface of the display unit 21 to the layer on which the dot pattern is formed (here, the color filter 30) may change, and the correction of the display position would need to be changed accordingly. For this reason, it can be said that the correction process is preferably performed on the display device 20 side.
  • FIG. 10 is a block diagram illustrating a schematic configuration of the display control system 200.
  • the display control system 200 is different from the first embodiment in that the display device 220, not the digital pen 210, identifies the position of the digital pen 210 and detects the posture.
  • the description of the configuration substantially similar to that of the first embodiment may be omitted.
  • the digital pen 210 has a pressure sensor 13, an irradiation unit 14, a reading unit 15, a control unit 216, and a transmission unit 17.
  • the configurations of the pressure sensor 13, the irradiation unit 14, the reading unit 15, and the transmission unit 17 are the same as those in the first embodiment.
  • the control unit 216 includes the pen-side microcomputer 16c and does not include the specifying unit 16a and the posture detection unit 16b of the first embodiment. That is, the control unit 216 outputs the image signal input from the image sensor 15c to the transmission unit 17 without specifying the position information of the digital pen 210 from the image signal. Thus, the image signal picked up by the image pickup device 15c is transmitted from the digital pen 210.
  • The display device 220 includes a receiving unit 22 that receives an external signal, a display-side microcomputer 23 that controls the entire display device 220, a display panel 24 that displays an image, a specifying unit 240 that specifies the position of the digital pen 210, and a posture detection unit 241 that detects the posture of the digital pen 210.
  • the configurations of the receiving unit 22, the display-side microcomputer 23, and the display panel 24 are the same as those in the first embodiment.
  • a dot pattern as shown in FIG. 5 is formed on the display unit 21 of the display panel 24.
  • the receiving unit 22 receives a signal transmitted from the digital pen 210 and transmits the signal to the specifying unit 240.
  • the specifying unit 240 has the same function as the specifying unit 16a of the digital pen 10 in the first embodiment. That is, in this embodiment, since the transmission signal from the digital pen 210 is an image signal acquired by the image sensor 15c, the specifying unit 240 specifies the position of the digital pen 210 from the image signal. That is, the specifying unit 240 acquires a dot pattern from the image signal, and specifies the position coordinates of the pen tip unit 12 on the display unit 21 based on the dot pattern, similarly to the specifying unit 16a. The specifying unit 240 transmits the specified position information to the display-side microcomputer 23.
  • The posture detection unit 241 has the same function as the posture detection unit 16b of the digital pen 10 in Embodiment 1. That is, in this embodiment, since the signal transmitted from the digital pen 210 is the image signal acquired by the image sensor 15c, the posture detection unit 241 detects the posture of the digital pen 210 from that image signal. Like the posture detection unit 16b, the posture detection unit 241 detects image distortion from the image signal based on the arrangement of the dot pattern and the like, and detects the inclination of the digital pen 210 with respect to the normal direction of the display unit 21 from the detected distortion information. The posture detection unit 241 transmits the detected posture information to the display-side microcomputer 23. The display-side microcomputer 23 controls the display panel 24 to change the display information displayed on the display unit 21 based on the position information and the posture information.
  • FIG. 11 is a flowchart showing the processing flow of the display control system 200. Below, the case where a user writes characters on the display device 220 using the digital pen 210 is described.
  • In step S21, the pen-side microcomputer 16c of the digital pen 210 monitors whether or not pressure is applied to the pen tip unit 12.
  • When the pressure is detected, the pen-side microcomputer 16c determines that the user is inputting characters on the display unit 21 of the display device 220 and proceeds to step S22.
  • In step S22, the reading unit 15 of the digital pen 210 acquires an image of the dot pattern formed on the display unit 21.
  • the image signal acquired by the reading unit 15 is transmitted to the display device 220 via the transmission unit 17 in step S23.
  • the image signal transmitted from the digital pen 210 is received by the receiving unit 22 of the display device 220 in step S24.
  • the received image signal is sent to the specifying unit 240 and the posture detecting unit 241.
  • the specifying unit 240 acquires a dot pattern based on the image signal and specifies the position of the digital pen 210.
  • the position information specified by the specifying unit 240 is sent to the display-side microcomputer 23.
  • The posture detection unit 241 detects the trapezoidal distortion of the image from the image signal and detects the inclination of the pen tip unit 12 based on the degree of that distortion. For example, as described above, after performing rotation correction on the image, the inclination and pitch of the vertical arrangement of the dot pattern are obtained, and the inclination of the imaging direction of the reading unit 15 is calculated from them by geometric calculation. The calculated inclination is sent to the display-side microcomputer 23 as posture information.
  • In step S26, when receiving the position information and the posture information, the display-side microcomputer 23 controls the display panel 24 to change the display content at the position corresponding to the position information and the posture information.
  • At this time, the display-side microcomputer 23 corrects the position corresponding to the position information, taking the posture information into consideration, for example to a position at which the user is estimated not to feel uncomfortable.
  • In step S27, the pen-side microcomputer 16c determines whether or not the input by the user is continued. If the input continues (Yes), the process returns to step S21 and the above flow is repeated. On the other hand, if the input is not continued, the process is terminated. In this way, characters corresponding to the locus of the pen tip unit 12 of the digital pen 210 are displayed on the display unit 21 of the display device 220.
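  • As a companion to the pen-side sketch above, the following sketch shows a hypothetical display-side handler for this embodiment: the pen sends the raw captured image, and the display device specifies the position, detects the posture, and corrects the drawing position. The interfaces and the cover-thickness value are assumptions for illustration only.

```python
import math

class DisplaySideController:
    """Hypothetical display-side handler for Embodiment 2 (image sent by the pen)."""

    def __init__(self, specifier, posture_detector, panel,
                 cover_thickness_mm: float = 1.1):
        self.specifier = specifier                  # plays the role of specifying unit 240
        self.posture_detector = posture_detector    # plays the role of posture detection unit 241
        self.panel = panel                          # drawing interface to display panel 24
        self.cover_thickness_mm = cover_thickness_mm  # placeholder value, not from the text

    def on_image_received(self, image) -> None:
        x, y = self.specifier.specify(image)                  # dot pattern -> position A
        tilt_deg, azimuth_deg = self.posture_detector.detect(image)
        # Shift A toward the estimated contact point B (same model as the earlier sketch).
        offset = self.cover_thickness_mm * math.tan(math.radians(tilt_deg))
        x -= offset * math.cos(math.radians(azimuth_deg))
        y -= offset * math.sin(math.radians(azimuth_deg))
        self.panel.draw_point((x, y))                          # S26: change the display content
```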
  • Thus, the display control system 200 can detect the position of the digital pen 210 operated by the user with high accuracy and reflect that position on the display unit 21 with high accuracy.
  • As described above, in the display control system 200 as well, a position on the display unit 21 of the display device 220 is indicated by the digital pen 210.
  • the reading unit 15 captures an image including a dot pattern representing a planar position on the display unit 21.
  • the specifying unit 240 specifies the position on the display unit 21 based on the dot pattern included in the captured image.
  • the posture detection unit 241 detects the distortion of the captured image, and the posture of the digital pen 210 with respect to the display unit 21 is detected from the detected distortion.
  • For this reason, even if there is a deviation between the indicated position and the reading position of the dot pattern due to the posture of the digital pen 210, the display position on the display unit 21 can be corrected in consideration of the detected posture information.
  • Therefore, using the posture information detected from the captured image, the deviation between the indicated position and the display position caused by the posture of the digital pen 210 can be corrected.
  • the detected posture information can also be used for setting a display information change mode on the display unit 21.
  • Embodiments 1 and 2 have been described as examples of the technology disclosed in the present application. However, the technology in the present disclosure is not limited to this, and can also be applied to an embodiment in which changes, replacements, additions, omissions, and the like are appropriately performed. In addition, it is possible to combine the components described in the first and second embodiments to form a new embodiment.
  • the optical digital pen has been described as an example of the pointing device.
  • The pointing device only needs to indicate a position on the display unit of the display device, and to include an instruction unit for indicating a position on the display unit and a reading unit that captures an image, including the position information pattern, at the position indicated by the instruction unit. Therefore, the pointing device is not limited to the optical digital pen. Further, the configurations of the instruction unit and the reading unit are not limited to those shown in Embodiments 1 and 2.
  • the dot pattern has been described as an example of the position information pattern.
  • The position information pattern only needs to be formed on the display unit of the display device and to represent a planar position on the display unit. Therefore, the position information pattern is not limited to a dot pattern. Further, the way of expressing the position coordinates and the division into unit areas are not limited to those shown in Embodiments 1 and 2.
  • the dots 33 are provided in the color filter 30, but are not limited thereto.
  • the dots 33 may be provided on the glass substrate 25 or the polarizing filter 26.
  • the display panel 24 may be configured to include a sheet different from the color filter 30, the glass substrate 25, and the polarizing filter 26 in which the dots 33 are formed.
  • the posture information of the digital pen 10 is used for correcting the display position. That is, based on the position information output from the specifying unit, the change position of the display information on the display unit is determined in consideration of the posture information of the pointing device.
  • However, the use of the posture information is not limited to this. For example, the thickness of the displayed line, the stroke style, or the like may be switched according to the inclination of the pen tip unit 12. That is, the display information change mode on the display unit can be set according to the posture information.
  • In the above embodiments, the inclination of the pen tip unit 12 is obtained from the trapezoidal distortion of the image.
  • In addition, the rotation angle around the axis of the pen tip unit 12 can also be detected as posture information from the rotation correction amount of the image.
  • For example, a multicolor pen can be realized by changing the displayed color according to the rotation angle of the pen tip unit 12.
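  • As an illustration of such a change mode, the following sketch maps tilt to line width and axial rotation to pen color. The thresholds, widths, and color wheel are arbitrary choices for this sketch, not values from the text.

```python
def select_line_width_mm(tilt_deg: float) -> float:
    """Thicker strokes for a more tilted pen, like pressing a brush sideways."""
    return 0.3 + 0.02 * max(0.0, min(tilt_deg, 60.0))

def select_color(rotation_deg: float) -> str:
    """Pick a pen color from the rotation angle around the pen axis."""
    colors = ["black", "red", "green", "blue"]
    return colors[int(rotation_deg % 360) // 90]

print(select_line_width_mm(30.0))   # -> 0.9 (mm)
print(select_color(200.0))          # -> "green"
```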
  • the digital pen 10 having a configuration in which the pen tip and the reading position are matched as illustrated in FIG. 4 has been described as an example, but the present invention is not limited to this.
  • Even when the pen is configured so that the pen tip and the reading position are separated by a predetermined length, tilting the pen with respect to the display unit 21 causes the distance between the pen tip position and the reading position to deviate from that predetermined length. Therefore, the method described above is also effective in such a configuration.
  • In FIG. 12, components that are the same as those described above are given the same reference numerals. That is, the digital pen 10 shown in FIG. 12 includes a pressure sensor 13, an irradiation unit 14 that emits infrared light, a reading unit 15 that includes a lens 15b and an image sensor 15c and reads incident infrared light, a control unit 16 that controls the digital pen 10, a transmission unit 17 that outputs a signal to the outside, and a power source 19 that supplies power to each member of the digital pen 10.
  • The specifying unit and the posture detection unit are provided in the digital pen 10 in Embodiment 1 and in the display device 220 in Embodiment 2, but the arrangement is not limited thereto. Alternatively, they may be provided in a control device separate from the digital pen and the display device.
  • This disclosure is applicable to a display control system that can realize highly accurate handwriting input.
  • the present disclosure can be applied to tablets, smartphones, notebook PCs, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

In the display control system (100) according to the invention, display control is executed according to positions on a display panel (24) indicated by a pointing device (10). The pointing device (10) is provided with a reading unit (15) that captures an image, including a position information pattern, at an indicated position. A specifying unit (16) specifies the position based on the position information pattern contained in the image captured by the reading unit (15). A posture detection unit (16b) detects distortion in the captured image and, from that distortion, detects the posture of the pointing device (10).
PCT/JP2013/002708 2012-04-26 2013-04-22 Display control system and pointing device WO2013161261A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-101720 2012-04-26
JP2012101720 2012-04-26

Publications (1)

Publication Number Publication Date
WO2013161261A1 true WO2013161261A1 (fr) 2013-10-31

Family

ID=49482604

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/002708 WO2013161261A1 (fr) 2012-04-26 2013-04-22 Display control system and pointing device

Country Status (1)

Country Link
WO (1) WO2013161261A1 (fr)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003500777A (ja) * 1999-05-28 2003-01-07 アノト・アクティエボラーク 情報の記録
JP2011103092A (ja) * 2009-11-11 2011-05-26 Dainippon Printing Co Ltd 情報処理システム及び表示処理プログラム
JP2011253343A (ja) * 2010-06-02 2011-12-15 Dainippon Printing Co Ltd 情報処理システム及び表示処理プログラム

Similar Documents

Publication Publication Date Title
WO2013035319A1 (fr) Panneau d'affichage, dispositif d'affichage et système de commande d'affichage
KR101152724B1 (ko) 도트 패턴 판독 기능을 구비한 마우스
WO2013161246A1 (fr) Système de commande d'affichage, dispositif d'affichage et panneau d'affichage
JP5808712B2 (ja) 映像表示装置
JP2013532860A (ja) 符号化パターンを備えるディスプレイ
US9477327B2 (en) Display device and display control system
WO2010109715A1 (fr) Système de saisie de panneau tactile et crayon de saisie
US20120162061A1 (en) Activation objects for interactive systems
JP5553920B2 (ja) 表示パネルおよび表示装置
US20140362054A1 (en) Display control system and reading device
WO2013161236A1 (fr) Film optique, panneau d'affichage et dispositif d'affichage
WO2011114590A1 (fr) Dispositif de saisie de la position, système de saisie de la position, procédé de saisie de la position, programme de saisie de la position et support d'enregistrement lisible par ordinateur
JP2014041602A (ja) 情報読取装置
WO2013161245A1 (fr) Système de commande d'affichage, dispositif d'affichage et panneau d'affichage
WO2013161261A1 (fr) Système de commande d'affichage et dispositif indicateur
JP5420807B1 (ja) 表示制御システム、指示装置および表示パネル
WO2016132732A1 (fr) Panneau d'affichage
WO2011121842A1 (fr) Dispositif d'affichage comportant une unité d'entrée, procédé de commande pour celui-ci, programme de commande et support d'enregistrement
US20160349422A1 (en) Optical film, display panel, and display device
JP2010117841A (ja) 像検知装置、入力位置の認識方法、およびプログラム
JP2010055585A (ja) 電子ペン装置
WO2014017039A1 (fr) Dispositif de lecture d'informations
JP2007279296A (ja) 情報処理システム、クライアント装置、情報処理方法をコンピュータに実行させるためのプログラム
US20150261327A1 (en) Display control system
JP2010128566A (ja) 像検知装置、入力領域の認識方法、およびプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13780809

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13780809

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP