JP6342832B2 - Input device - Google Patents


Publication number
JP6342832B2
Authority
JP
Japan
Prior art keywords
position
imaging device
display
setting
display surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2015051074A
Other languages
Japanese (ja)
Other versions
JP2016170710A (en)
Inventor
小川 晃
横川 和征
百南 伸一
Original Assignee
シャープ株式会社 (Sharp Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社 (Sharp Corporation)
Priority to JP2015051074A
Publication of JP2016170710A
Application granted
Publication of JP6342832B2
Application status: Active
Anticipated expiration

Description

  The present invention relates to an input device that can set the setting information of an imaging device used to provide touch-panel-like functionality on a display that is not equipped with a touch panel.

  In recent years, not only portable information terminals such as smartphones, tablets, and personal digital assistants (PDAs) but also large display devices have come to be equipped with a touch panel, so that contact with the display surface is detected and processing is executed accordingly. However, for a large display device not equipped with a touch panel, retrofitting a touch panel is troublesome and difficult.

  Therefore, Patent Document 1 discloses a technique in which an imaging device is provided at a corner of a display unit, and contact with the display unit is detected using an image captured by the imaging device.

JP 2011-103117 A (published on May 26, 2011)

  However, in the conventional technology described above, information such as the focal length of the imaging lens of the imaging device, the size of the image sensor, and the actual dimensions of the imaging device must be known in advance, which is troublesome and complicated. In addition, a special tool called an alignment tool must be used to obtain this information, requiring further laborious work.

  The present invention has been made in view of the above problems, and an object thereof is to realize an input device that can easily set setting information of an imaging device for determining a contact position on a display unit.

  In order to solve the above problem, an input device according to one embodiment of the present invention includes an acquisition unit that acquires position information indicating positions on the display surface of a display device of both the optical axis line of an imaging device, installed so as to be capable of imaging a pointing object in contact with or close to the display surface, and a setting line that passes through the imaging device but differs from the optical axis line; and a setting unit that, based on the position information acquired by the acquisition unit, sets setting information used to specify a contact or proximity position on the display surface from a captured image captured by the imaging device.

  According to one embodiment of the present invention, the setting information of an imaging device can be configured through the simple process of acquiring position information.

FIG. 1 is a block diagram showing the main configuration of a touch position detection device according to Embodiment 1 of the present invention.
FIG. 2 is a diagram showing a touch position detection system including the touch position detection device.
FIG. 3 is a diagram showing an example of an image captured by the imaging device according to Embodiment 1.
FIG. 4 is a flowchart showing the flow of the process of installing the imaging device in Embodiment 1.
FIG. 5 is a flowchart showing the flow of the position adjustment process of the imaging device according to Embodiment 1.
FIG. 6 is a flowchart showing the flow of the process of setting the parameters of the imaging device according to Embodiment 1.
FIG. 7 is a diagram for explaining the flow of screen transitions during installation of the imaging device in Embodiment 1.
FIGS. 8 and 9 are diagrams showing examples of screens during installation of the imaging device.
FIGS. 10(a)-(c) and 11(a)-(c) are diagrams showing examples of screens during installation of the imaging device.
FIG. 12 is a diagram for explaining the method of obtaining the parameters of the imaging device.
FIGS. 13 and 14 are diagrams for explaining the method of obtaining the touch position.
FIG. 15 is a block diagram showing the main configuration of a touch position detection device according to Embodiment 2 of the present invention.

Embodiment 1
A touch position detection system 10 according to the present embodiment includes a touch position detection device (input device) 1, a display device 2, imaging devices 3a and 3b, and an operation device 4. The imaging devices 3a and 3b are provided at corners of the display device 2, which has no touch panel, and a touch position is detected from images of the display device 2 captured by the imaging devices 3a and 3b. In addition, by enabling the setting of the parameters of the imaging devices 3a and 3b (setting information such as the focal length of the imaging lens, the installation position, and the shooting angle) that are necessary for detecting the touch position, even a display device not equipped with a touch panel can easily provide the same functionality as a touch panel.

[Overall Structure]
First, the overall configuration of the touch position detection system 10 will be described with reference to FIG. FIG. 2 is a diagram illustrating the touch position detection system 10. As illustrated in FIG. 2, the touch position detection system 10 includes a touch position detection device 1, a display device 2, imaging devices 3 a and 3 b, and an operation device 4. Note that the imaging devices 3a and 3b are referred to as the imaging device 3 when it is not necessary to distinguish them.

  The touch position detection device 1 detects a touch position (contact position) of an indication object (such as a user's finger or an indication bar) on the display device 2 and executes processing according to the touch position.

  The display device 2 is a display device not equipped with a touch panel, acquires display information from the touch position detection device 1, and displays the display information on the display surface 6. The display device 2 and the touch position detection device 1 are connected by, for example, HDMI (registered trademark) (High-Definition Multimedia Interface) connection.

  The imaging device 3 (3a, 3b) is installed at an upper corner of the display device 2 and images the display device 2. The imaging device 3 transmits the captured image to the touch position detection device 1. An image example 301 captured by the imaging device 3a is shown in FIG. 3. As shown in FIG. 3, in the image example 301, the display surface 6 of the display device 2 appears as a straight line parallel to the longitudinal direction of the display device 2, and the user's finger (pointing object) 100 is touching the display surface (display unit) 6. That is, the imaging device 3 is installed so as to be able to capture an image of a pointing object that is close to or in contact with the display surface 6.

  The imaging device 3 and the touch position detection device 1 are connected by, for example, USB (Universal Serial Bus). In the present embodiment, two imaging devices 3 are described. However, the number of imaging devices 3 is not limited to this, and may be three or more.

  Note that the imaging device 3 is provided in the vicinity of the display surface 6 with the shooting direction being substantially parallel to the display surface 6 of the display device 2.

  The operation device 4 operates a screen displayed on the display device 2 (for example, pushes a button, moves a cursor position, etc.), and is, for example, a mouse, a keyboard, a remote control, or the like.

[Configuration of Touch Position Detection Device 1]
Next, the touch position detection device 1 will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the main configuration of the touch position detection device 1. As illustrated in FIG. 1, the touch position detection device 1 includes a parameter storage unit 11, a parameter setting unit (parameter determination device) 12, a display control unit 13, a touch position deriving unit (coordinate specifying unit) 14, and a touch processing unit 15.

  The parameter storage unit 11 stores parameters of the imaging device 3 for obtaining the touch position. The stored parameters include parameters set by the parameter setting unit 12.

  The parameter setting unit 12 sets the parameters of the imaging device 3 and includes a setting screen presentation unit (position display unit) 21, a position acquisition unit (acquisition unit) 22, and a parameter calculation unit 23. In the present embodiment, the case where the focal length, the position, and the angle of the imaging device 3 are set as its parameters will be described. However, when the focal length is already known (for example, when the focal length is output from an output unit of the imaging device 3 wirelessly or by wire and can be acquired by another route), only the position and the angle of the imaging device 3 may be set and the known focal length used as-is. The position of the imaging device 3 is its position on the XY plane parallel to the display surface 6 of the display device 2, where the upper-left corner of the display surface 6 is the origin, the longitudinal direction is the X direction, and the lateral direction is the Y direction.
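As a rough sketch (the class and field names below are hypothetical, not taken from the patent), the three parameters handled by the parameter setting unit 12 and the coordinate convention just described could be represented as:

```python
from dataclasses import dataclass

@dataclass
class CameraParams:
    """Parameters of one imaging device 3 (illustrative names only)."""
    focal_length: float  # FL, in captured-image pixel units
    position: tuple      # lens center (cx, cy) on the display-surface XY plane;
                         # origin at the upper-left corner, X along the long edge,
                         # Y along the short edge
    angle_deg: float     # CA, angle between the optical axis and the X axis

# Example: a camera mounted at the upper-left corner, aimed 45 degrees into the screen.
left_cam = CameraParams(focal_length=300.0, position=(0.0, 0.0), angle_deg=45.0)
assert 0.0 <= left_cam.angle_deg < 90.0  # CA is constrained to [0 deg, 90 deg) later in the text
```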

  The setting screen presentation unit 21 transmits to the display control unit 13 setting screen information indicating a screen indicating the attachment position of the imaging device 3 and a screen for setting parameters of the imaging device 3.

  The position acquisition unit 22 acquires the position specified by using the operation device 4 from the display control unit 13 as screen position information (position information), and notifies the parameter calculation unit 23 of information indicating the acquired position.

  The parameter calculation unit 23 calculates an optical axis line and a setting line, which will be described later, using the position information notified from the position acquisition unit 22, and calculates parameters of the imaging device 3 using the calculated optical axis line and the setting line. Details of the parameter calculation method will be described later.

  The display control unit 13 transmits display screen information to the display device 2 to display on the display device 2, and includes a cursor control unit 31.

  When a cursor is displayed on the display device 2, the cursor control unit 31 moves the cursor position according to operations of the operation device 4 and notifies the position acquisition unit 22 of the position on the display device 2 specified by those operations as screen position information.

  The touch position deriving unit 14 uses the parameters stored in the parameter storage unit 11 to derive the touch position from the captured image including the pointing object acquired from the imaging device 3. Then, the touch processing unit 15 is notified of touch position information indicating the derived touch position. Details of the derivation of the touch position will be described later.

  The touch processing unit 15 executes processing corresponding to the touch position notified from the touch position deriving unit 14 in the activated software. Then, the display control unit 13 is notified of information indicating a display screen based on the processing result.

[Flow of Installation Processing of Imaging Device 3]
Next, the flow of the installation process of the imaging device 3 will be described with reference to FIGS. 4 to 11. FIGS. 4 to 6 are flowcharts showing the flow of the installation process of the imaging device 3. FIG. 7 is an explanatory diagram of screen transitions in the installation process of the imaging device 3. FIGS. 8 to 11 are diagrams illustrating examples of screens during the installation process of the imaging device 3.

  As shown in FIG. 4, first, the first imaging device 3 is installed on the display device 2 (S1). A screen example 101 in FIG. 8 shows the screen at the time of installation. As shown in the screen example 101, when the imaging device 3 is installed, buttons for designating which imaging device 3 is installed on the display device 2 (a left camera button 801 and a right camera button 802), an end button 803, a cursor 804, and a picture 805 of the display device are displayed, together with a message explaining the operation to the user, such as "Press the button for the position where the camera is installed. When all cameras have been installed, press the end button." Then, for example, when the user installs the imaging device 3 on the left side as viewed facing the display device 2, after attaching the imaging device 3, the user moves the cursor 804 and presses the left camera button 801, whereupon the screen transitions to the screen example 102 shown in FIG. 7.

  Similarly, when the imaging device 3 is installed on the right side of the display device 2, after attaching the imaging device 3, moving the cursor 804 and pressing the right camera button 802 causes a transition to the screen example 102 shown in FIG. 7.

  When the installation work of all the imaging devices 3 is completed, the installation work of the imaging device 3 is finished by moving the cursor 804 and pressing the end button 803. The cursor 804 can be moved using the operation device 4.

  Next, a position adjustment process for the imaging device 3 is executed (S2). The detailed flow of the position adjustment process in step S2 will be described with reference to FIG. 5. When the position adjustment process of the imaging device 3 starts, the setting screen presentation unit 21 displays on the display device 2 a position setting screen on which a guideline 904 indicating the setting position of the imaging device 3 is shown (S21). A screen example 102 in FIG. 9 shows an example of the position setting screen in the position adjustment process. As shown in the screen example 102, a preview area 901 of the captured image, a cursor 902, and a next button 903 are displayed on the position setting screen. In the preview area, the image of the display device 2 currently being captured is displayed continuously. A message explaining the operation is also displayed, such as "Adjust the position of the attached camera so that the entire display surface of the display appears in the captured image and the display surface of the display overlaps the red line." A guideline 904 that divides the preview area in two in the vertical direction is displayed in the preview area (shown as a thick black line in FIG. 9, but displayed as a red line in the embodiment).

Here, the user adjusts the position of the attached camera so as to satisfy all of the following conditions while confirming the displayed captured image.
(1) A portion corresponding to the entire display surface of the display device 2 is shown in the captured image.
(2) The display surface (the straight line) of the display device 2 displayed in the captured image overlaps the displayed guideline 904.

  When the adjustment of the position of the imaging device 3 is completed, the cursor 902 is moved, and the next button 903 is pressed (YES in S12), the screen transitions to the screen example 103-1 as shown in FIG.

  Thereafter, a parameter setting process of the imaging device 3 is executed (S3). A detailed flow of the parameter setting process in step S3 will be described with reference to FIG.

  First, the setting screen presentation unit 21 displays on the display device 2 a parameter first setting screen on which an optical axis point (corresponding position), a point representing the optical axis of the imaging device 3, is displayed (S21). A screen example 103-1 in FIG. 10 shows an example of the parameter first setting screen. As shown in FIG. 10A, the parameter first setting screen displays a preview area 1001 of the captured image and a cursor 1002. In the preview area, the image of the display device 2 being captured is displayed continuously, together with a mark indicating the position corresponding to the optical axis of the imaging device 3 (the optical axis point, shown as a black square in FIG. 10A). The optical axis point is displayed at the position where the optical axis of the imaging device 3 corresponds to the display surface 6 of the display device 2 (the center of the captured image). In addition, a message explaining the operation to the user is displayed, such as "Move the pointing object along the display surface of the display, find a position where the pointing object shown in the captured image overlaps the black square, then move the cursor to that position on the display surface and select it."

  Here, the user moves a finger (pointing object) along the display surface 6 of the display device 2 and finds a position where the finger shown in the captured image overlaps the black square indicating the optical axis point. Then, using the operation device 4, the actual finger position (coordinate information) at the overlapping point is specified (for example, clicked with a mouse). In this way, one position on the display surface 6 corresponding to the optical axis of the imaging device 3 can be acquired. When it is accepted that the finger position has been specified (YES in S22, acquisition step), the setting screen presentation unit 21 displays the parameter second setting screen on the display device 2 (transition to the screen example 103-2 shown in FIG. 7).

  The actual finger position may be specified by placing the cursor 1002 on the finger position, or a detachable marker or the like may be attached to that position on the display surface 6 and the cursor 1002 placed on it. This makes it easy to specify the finger position even when the work is performed by one person.

  FIG. 10B shows a screen example 103-2 of the parameter second setting screen. As shown in FIG. 10B, the parameter second setting screen displays a preview area 1001, a cursor 1002, and a first specific position 1003. In the preview area, as in the screen example 103-1, the image of the display device 2 being captured is displayed continuously, together with the mark indicating the position corresponding to the optical axis of the imaging device 3 (the optical axis point, shown as a black square in FIG. 10B). The first specific position 1003 indicates the position specified in step S22 (screen example 103-1). In addition, a message explaining the operation to the user is displayed, such as "Move the pointing object along the display surface of the display again, find a position where the pointing object shown in the captured image overlaps the black square and differs from the previously selected position, then move the cursor to that position and select it."

  Here, the user moves the finger along the display surface 6 of the display device 2 and searches for a position on the display surface 6 where the finger shown in the captured image overlaps the black square indicating the optical axis point and which differs from the first specific position 1003. Then, the actual finger position (coordinate information) at that point is specified using the operation device 4. When it is accepted that the finger position has been specified (YES in S23, acquisition step), the setting screen presentation unit 21 displays the parameter third setting screen on the display device 2 (transition to the screen example 103-3 shown in FIG. 7).

  FIG. 10C shows a screen example 103-3 of the parameter third setting screen. As shown in FIG. 10C, the parameter third setting screen displays a preview area 1001, a cursor 1002, a first specific position 1003, a second specific position 1004, a redo button 1005, and a next button 1006. In the preview area, as in the screen example 103-1, the image of the display device 2 being captured is displayed continuously, together with the mark indicating the position corresponding to the optical axis of the imaging device 3 (the optical axis point, shown as a black square in FIG. 10C). The second specific position 1004 indicates the position specified in step S23 (screen example 103-2). In addition, a message explaining the operation to the user is displayed, such as "If the selected positions are not correct, press the [Redo] button. If there is no problem with the selected positions, press the [Next] button."

  When the redo button 1005 is pressed (YES in S24), the screen returns to the screen example 103-1 shown in FIG. 7. The redo button 1005 is pressed when there is an error in either the position specified in step S22 (screen example 103-1) or the position specified in step S23 (screen example 103-2), for example when the cursor slipped while specifying the position.

  When the next button 1006 is pressed (NO in S24), the setting screen presentation unit 21 displays on the display device 2 a parameter fourth setting screen on which a setting axis point (corresponding position), a point representing the setting axis of the imaging device 3, is displayed (S25: transition to the screen example 103-4 shown in FIG. 7).

  FIG. 11A shows a screen example 103-4 of the parameter fourth setting screen. As shown in FIG. 11A, the parameter fourth setting screen displays a preview area 1001 of the captured image and a cursor 1002. In the preview area, the image of the display device 2 being captured is displayed continuously, as in the screen example 103-1, together with a mark indicating a position corresponding to the setting axis of the imaging device 3 (the setting axis point, shown as a black square in FIG. 11A; corresponding position), which differs from the optical axis point of the screen example 103-1. The setting axis point is displayed at a position corresponding to the display surface 6 of the display device 2 but different from the optical axis point (offset in the horizontal direction of the captured image). In addition, a message explaining the operation to the user is displayed, such as "Move the pointing object along the display surface of the display, find a position where the pointing object shown in the captured image overlaps the black square, then move the cursor to that position on the display surface and select it."

  Here, the user moves a finger (pointing object) along the display surface 6 of the display device 2 and finds a position where the finger shown in the captured image overlaps the black square indicating the setting axis point. Then, using the operation device 4, the actual finger position at the overlapping point is specified (for example, clicked with a mouse). In this way, one position on the display surface 6 corresponding to the setting axis of the imaging device 3 can be acquired. When it is accepted that the finger position (coordinate information) has been specified (YES in S26, acquisition step), the setting screen presentation unit 21 displays the parameter fifth setting screen on the display device 2 (transition to the screen example 103-5 shown in FIG. 7).

  FIG. 11B shows a screen example 103-5 of the parameter fifth setting screen. As shown in FIG. 11B, the parameter fifth setting screen displays a preview area 1001, a cursor 1002, and a third specific position 1007. In the preview area, as in the screen example 103-4, the image of the display device 2 being captured is displayed continuously, together with the mark indicating the position corresponding to the setting axis of the imaging device 3 (the setting axis point, shown as a black square in FIG. 11B). The third specific position 1007 indicates the position specified in step S26 (screen example 103-4). In addition, a message explaining the operation to the user is displayed, such as "Move the pointing object along the display surface of the display again, find a position where the pointing object shown in the captured image overlaps the black square and differs from the previously selected position, then move the cursor to that position and select it."

  Here, the user moves the finger along the display surface 6 of the display device 2 and searches for a position on the display surface 6 where the finger shown in the captured image overlaps the black square indicating the setting axis point and which differs from the third specific position 1007. Then, the actual finger position at that point is specified using the operation device 4. When it is accepted that the finger position (coordinate information) has been specified (YES in S27, acquisition step), the setting screen presentation unit 21 displays the parameter sixth setting screen on the display device 2 (transition to the screen example 103-6 shown in FIG. 7).

  FIG. 11C shows a screen example 103-6 of the parameter sixth setting screen. As shown in FIG. 11C, the parameter sixth setting screen displays a preview area 1001, a cursor 1002, a third specific position 1007, a fourth specific position 1008, a redo button 1009, and a setting button 1010. In the preview area, as in the screen example 103-4, the image of the display device 2 being captured is displayed continuously, together with the mark indicating the position corresponding to the setting axis of the imaging device 3 (the setting axis point, shown as a black square in FIG. 11C). The fourth specific position 1008 indicates the position specified in step S27 (screen example 103-5). In addition, a message explaining the operation to the user is displayed, such as "If the selected positions are not correct, press the [Redo] button. If there is no problem with the selected positions, press the [Set] button."

  When the redo button 1009 is pressed (YES in S28), the screen returns to the screen example 103-4 shown in FIG. 7. The redo button 1009 is pressed when there is an error in either the position specified in step S26 (screen example 103-4) or the position specified in step S27 (screen example 103-5), for example when the cursor slipped while specifying the position.

  When the setting button 1010 is pressed (NO in S28), the parameter calculation unit 23 executes a parameter calculation process (S29, setting step).

  When the processing of steps S1 to S3 is completed for all the imaging devices 3 (YES in S4), the installation processing of the imaging device 3 is completed.

[Parameter calculation method]
Next, a parameter calculation method by the parameter calculation unit 23 will be described with reference to FIG. FIG. 12 is a diagram for explaining the details of the parameter setting process.

  FIG. 12 shows a state where the imaging devices 3a and 3b are installed on the display device 2, together with the relationship between the imaging device 3 and the captured image. In the illustration of the display device in FIG. 12, the longitudinal direction of the display surface 6 of the display device 2 is the X direction, the short direction is the Y direction, and positions are represented in XY coordinates whose origin is at the upper left of the display surface 6.

Let the coordinates of the specific position 1203 acquired in step S22 (screen example 103-1) be (x1, y1), and the coordinates of the specific position 1206 acquired in step S23 (screen example 103-2) be (x2, y2). Then the straight line L1 (optical axis line) passing through these positions can be expressed by the following equation:

L1: y = m1·x + n1
where m1 = (y2 − y1) / (x2 − x1) and n1 = y1 − m1·x1.

Similarly, let the coordinates of the specific position 1204 acquired in step S26 (screen example 103-4) be (x3, y3), and the coordinates of the specific position 1205 acquired in step S27 (screen example 103-5) be (x4, y4). Then the straight line L2 (setting line) passing through these positions can be expressed by the following equation:

L2: y = m2·x + n2
where m2 = (y4 − y3) / (x4 − x3) and n2 = y3 − m2·x3.
In the relationship between the imaging device 3 and the captured image, the upper left of the captured image is the origin, the rightward lateral direction is the X′ direction, the downward direction is the Y′ direction, and the imaging direction of the imaging device 3 is the Z′ direction. Here, the difference w1 in the X′ direction between the mark indicating the optical axis point (displayed in screen examples 103-1 to 103-3) and the mark indicating the setting axis point (displayed in screen examples 103-4 to 103-6) can be obtained in the X′Y′ coordinate system.

From these, the parameters obtained in the present embodiment (the angle of the imaging device 3, the position of the imaging device 3, and the focal length of the imaging device 3) can be calculated by the following equations:

Angle of the imaging device 3: CA1 = tan⁻¹(|m1|) (where 0° ≤ CA1 < 90°)
Position of the imaging device 3 (lens center position) CP1 = (cx1, cy1), where cx1 = (n2 − n1) / (m1 − m2) and cy1 = m1·cx1 + n1
Focal length: FL1 = |w1·(1 + m1·m2) / (m2 − m1)|
Since the imaging device 3 is not installed along the longitudinal or short direction of the display surface 6 of the display device 2, x1 ≠ x2 and x3 ≠ x4 hold. Also, because of the lens mechanism, the straight lines L1 and L2 are never parallel, so m1 ≠ m2. Further, since L1 and L2 are not orthogonal, m1·m2 ≠ −1.
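The calculation above can be sketched in code as follows (a minimal illustration; the function names and sample coordinates are assumptions, not part of the patent):

```python
import math

def line_through(p, q):
    """Slope and intercept of the line through two display-surface points (x, y)."""
    (x1, y1), (x2, y2) = p, q
    m = (y2 - y1) / (x2 - x1)  # x1 != x2 holds because the camera is not axis-aligned
    n = y1 - m * x1
    return m, n

def camera_params(opt_a, opt_b, set_a, set_b, w1):
    """Recover angle CA1, lens center CP1, and focal length FL1 from the two point
    pairs specified on the optical axis line L1 and the setting line L2, plus the
    X'-direction distance w1 between the two on-screen marks."""
    m1, n1 = line_through(opt_a, opt_b)  # L1: y = m1*x + n1 (optical axis line)
    m2, n2 = line_through(set_a, set_b)  # L2: y = m2*x + n2 (setting line)
    ca1 = math.degrees(math.atan(abs(m1)))     # camera angle, 0 <= CA1 < 90
    cx1 = (n2 - n1) / (m1 - m2)                # L1 and L2 intersect at the lens center
    cy1 = m1 * cx1 + n1
    fl1 = abs(w1 * (1 + m1 * m2) / (m2 - m1))  # focal length in image-pixel units
    return ca1, (cx1, cy1), fl1

# Sample run: both lines pass through the origin, so the recovered lens center is
# (0, 0); L1 has slope 1, giving a 45-degree angle; with w1 = 100, FL1 = 300.
ca1, cp1, fl1 = camera_params((1.0, 1.0), (2.0, 2.0), (1.0, 0.5), (2.0, 1.0), 100.0)
```

The guard conditions in the text (x1 ≠ x2, m1 ≠ m2, and m1·m2 ≠ −1) are exactly what keep the divisions in this sketch well-defined.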

[Touch position detection method]
Next, a touch position detection method by the touch position deriving unit 14 will be described with reference to FIGS. FIG. 13 is a diagram for explaining how to obtain an angle formed by the optical axis line used for detection of the touch position and the touch position candidate line.

  In FIG. 13, as in FIG. 12, the upper left in the captured image is the origin, the right lateral direction is the X ′ direction, the downward direction is the Y ′ direction, and the imaging direction of the imaging device 3 is the Z ′ direction.

  First, the touch position deriving unit 14 analyzes the captured image to extract the pointing object (finger) and obtains the touch position TP′ in the captured image. Since the pointing object can be extracted using conventional techniques, the description thereof is omitted here.

  Next, the difference distance W in the X′-axis direction between the touch position TP′ and the image center is obtained. Then, from the difference distance W and the focal length FL (already calculated as a parameter), the angle YA′ formed by the optical axis OA (the straight line passing through the lens center and the center of the captured image) and the touch position candidate line TC can be obtained using trigonometry. Here, the touch position candidate line TC is the straight line that passes through the lens center and the touch position projected on the imaging plane.

  Note that since the focal length and the difference distance must be expressed in matching units (scales), the focal length is expressed not in real-space units but in the captured-image coordinate system (X′Y′ coordinate system).
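The trigonometric relation used here — the tangent of YA′ is the ratio of the difference distance W to the focal length FL — can be sketched as follows (the function name is illustrative; both arguments must be in the same captured-image units, as noted above):

```python
import math

def angle_from_optical_axis(w, fl):
    # YA' is the angle between the optical axis OA and the touch
    # position candidate line TC: tan(YA') = W / FL, with W and FL
    # both expressed in captured-image (pixel) units.
    return math.degrees(math.atan(w / fl))
```

For instance, a touch whose offset W equals the focal length FL lies 45° off the optical axis.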

  Next, the method for obtaining the touch position TP will be described.

  First, as described above, the angle YA'a formed by the optical axis OAa and the touch position candidate line TCa in the imaging device 3a and the angle YA'b formed by the optical axis OAb and the touch position candidate line TCb in the imaging device 3b are obtained.

  Next, an angle YAa formed by the X axis and the touch position candidate line TCa is obtained from an angle CAa formed by the optical axis OAa and the X axis (an angle of the imaging device 3a, which has been calculated as a parameter) and the angle YA'a.

  Further, an angle YAb formed by the X axis and the touch position candidate line TCb is obtained from an angle CAb formed by the optical axis OAb and the X axis (an angle of the imaging device 3b, which has been calculated as a parameter) and the angle YA'b.

  Next, the equation of the touch position candidate line TCa is obtained from the angle YAa and the position CPa of the imaging device 3a (already calculated as a parameter), and the equation of the touch position candidate line TCb is obtained from the angle YAb and the position CPb of the imaging device 3b (already calculated as a parameter). These line equations can be obtained using only elementary mathematics.

  Finally, the touch position TP is obtained as the intersection of the touch position candidate line TCa and the touch position candidate line TCb. The intersection can likewise be obtained using only elementary mathematics.
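The intersection step can be sketched as follows, assuming each candidate line's angle with the X axis (YAa, YAb) has already been formed from CA and YA′; the sign convention for combining CA and YA′ depends on how each camera is mounted, so the combined angles are taken as inputs here. All names are illustrative.

```python
import math

def touch_position(cpa, yaa_deg, cpb, yab_deg):
    """Intersect the two touch position candidate lines TCa and TCb.

    cpa, cpb: lens center positions CPa, CPb on the display plane;
    yaa_deg, yab_deg: angles YAa, YAb (degrees) that TCa and TCb
    form with the X axis.
    """
    ma = math.tan(math.radians(yaa_deg))
    mb = math.tan(math.radians(yab_deg))
    na = cpa[1] - ma * cpa[0]
    nb = cpb[1] - mb * cpb[0]
    # TCa: y = ma*x + na, TCb: y = mb*x + nb; solve for the intersection.
    x = (nb - na) / (ma - mb)
    y = ma * x + na
    return x, y
```

For example, cameras at (0, 0) and (4, 0) whose candidate lines run at 45° and 135° intersect at (2, 2).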

  As described above, according to the present embodiment, the parameters of the imaging device 3 can be obtained without using special tools, so they can be calculated and set easily. In addition, even if a camera is displaced, anyone can quickly reinstall the imaging device 3 and reset the parameters.

[Embodiment 2]
The following will describe another embodiment of the present invention. For convenience of explanation, members having the same functions as those described in Embodiment 1 are given the same reference numerals, and their descriptions are omitted.

  The present embodiment differs from Embodiment 1 in that the screen displayed on the display device 2 is operated (for example, a button is pressed or the cursor is moved) using the remote controller 5 of the display device 2.

  In the present embodiment, in each process of Embodiment 1, button presses, cursor movement, and position specification are performed by operating the remote controller 5 instead of the operation device 4. More specifically, for example, while the screen example 103-1 in FIG. 10 is displayed on the display device 2, when the button on the remote controller 5 corresponding to cursor movement (cursor movement button) is pressed, the display device 2 recognizes the press and notifies the cursor control unit 31′. The cursor control unit 31′ moves the cursor position according to the signal information indicating that the cursor movement button has been pressed. Similarly, when the button on the remote controller 5 corresponding to position specification (position specifying button) is pressed, the display device 2 recognizes the press and notifies the cursor control unit 31′. Upon receiving the signal information indicating that the position specifying button has been pressed, the cursor control unit 31′ notifies the position acquisition unit 22 of the current cursor position as the specific position.

[Embodiment 3]
The control blocks of the touch position detection device 1 (particularly the parameter setting unit 12, the display control unit 13, the touch position deriving unit 14, and the touch processing unit 15) may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or by software using a CPU (Central Processing Unit).

  In the latter case, the touch position detection device 1 includes a CPU that executes the instructions of a program (software) realizing each function, a ROM (Read Only Memory) or storage device (referred to as a "recording medium") in which the program and various data are recorded so as to be readable by a computer (or CPU), a RAM (Random Access Memory) into which the program is expanded, and the like. The object of the present invention is achieved when a computer (or CPU) reads the program from the recording medium and executes it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) capable of transmitting the program. The present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.

[Summary]
An input device (touch position detection device 1) according to aspect 1 of the present invention includes: an acquisition unit (position acquisition unit 22) that acquires position information indicating the positions, on the display surface (6) of a display device (2), of the optical axis line (L1) of an imaging device (3a, 3b) installed so as to be able to image a pointing object in contact with or close to the display surface, and of a setting line (L2) that passes through the imaging device and is different from the optical axis line; and a setting unit (parameter setting unit 12) that, based on the position information acquired by the acquisition unit, sets setting information used for specifying a contact or proximity position with respect to the display surface from a captured image captured by the imaging device.

  According to the above configuration, the setting information used to obtain a contact or proximity position can be set based on the position information indicating the optical axis line on the display surface and the position information indicating the setting line. Therefore, the setting information of the imaging device for specifying the contact or proximity position can be set by the simple process of acquiring the position information. That is, the setting information of the imaging device for obtaining the contact or proximity position with respect to the display surface can be set easily.

  In the input device according to aspect 2 of the present invention, in aspect 1, the acquisition unit may acquire, as the position information, the coordinates of at least two points on the optical axis line and the coordinates of at least two points on the setting line.

  According to the above configuration, since at least two coordinates on the optical axis line are acquired, the optical axis line can be obtained. Likewise, since at least two coordinates on the setting line are acquired, the setting line can be obtained. The setting information can then be calculated from the optical axis line and the setting line.

  The input device according to aspect 3 of the present invention, in aspect 1 or 2, may further include a position display unit that displays the captured image on the display device and displays, on the captured image being displayed, a corresponding position corresponding to the optical axis line or the setting line, and the acquisition unit may acquire, as the position information, the coordinates of a contact position on the display surface selected by the user so as to overlap the corresponding position displayed by the position display unit.

  According to the above configuration, the corresponding position corresponding to the optical axis line or the setting line is displayed on the captured image, and the coordinates can be obtained by acquiring the position on the display selected by the user so as to overlap the displayed corresponding position. Therefore, the coordinates can be acquired by the simple operation of the user designating a position on the display surface.

  In the input device according to aspect 4 of the present invention, in any one of aspects 1 to 3, the setting information may indicate at least one of the focal length of the imaging device, the center position of the lens of the imaging device in a plane substantially parallel to the display surface, and the shooting direction of the imaging device.

  According to the above configuration, the focal length of the imaging device, the center position of the lens of the imaging device in a plane substantially parallel to the display surface, and the shooting direction of the imaging device can be used as the setting information.

  The input device according to aspect 5 of the present invention, in any one of aspects 1 to 4, may further include a coordinate specifying unit that specifies the coordinates at which the pointing object is in contact with or close to the display surface from images captured by at least two of the imaging devices, using the setting information corresponding to each imaging device.

  According to the above configuration, the contact or proximity position (coordinates) of the pointing object on the display surface can be specified using the set setting information.

  A control method of an input device according to aspect 6 of the present invention includes: an acquisition step (S22, S23, S26, S27) of acquiring position information indicating the positions, on the display surface of a display device, of the optical axis line of an imaging device installed so as to be able to image a pointing object in contact with or close to the display surface, and of a setting line that passes through the imaging device and is different from the optical axis line; and a setting step (S29) of setting, based on the position information acquired in the acquisition step, setting information used for specifying a contact position with respect to the display surface from a captured image. According to this configuration, the same effects as in aspect 1 described above are obtained.

  The parameter determination device according to each aspect of the present invention may be realized by a computer. In this case, a control program that realizes the parameter determination device on a computer by causing the computer to operate as each unit (software element) of the parameter determination device, and a computer-readable recording medium on which the control program is recorded, also fall within the scope of the present invention.

  The present invention is not limited to the above-described embodiments, and various modifications are possible within the scope of the claims; embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, new technical features can be formed by combining the technical means disclosed in each embodiment.

  The present invention can be used in a system that performs the same function as a touch panel in a display device that does not include a touch panel.

1 Touch position detection device (input device)
2 Display device 3a, 3b Imaging device 4 Operating device 6 Display surface 12 Parameter setting unit (setting unit)
13 Display control unit 14 Touch position deriving unit (coordinate specifying unit)
15 Touch processing unit 21 Setting screen presentation unit (position display unit)
22 Position acquisition unit (acquisition unit)
23 Parameter calculation unit 100 User's finger (indicating object)
CP1 Lens center L1 Optical axis L2 Setting line

Claims (5)

  1. An input device comprising:
    an acquisition unit that acquires position information indicating positions, on a display surface of a display device, of an optical axis line of an imaging device installed so as to be able to image a pointing object in contact with or close to the display surface, and of a setting line passing through the imaging device and different from the optical axis line; and
    a setting unit that sets, based on the position information acquired by the acquisition unit, setting information used for specifying a contact or proximity position with respect to the display surface from a captured image captured by the imaging device.
  2.   The input device according to claim 1, wherein the acquisition unit acquires, as the position information, at least two coordinates on the optical axis line and at least two coordinates on the setting line.
  3. The input device according to claim 1 or 2, further comprising a position display unit that displays the captured image on the display device and displays, on the captured image being displayed, a corresponding position corresponding to the optical axis line or the setting line,
    wherein the acquisition unit acquires, as the position information, coordinates of a contact position on the display surface selected by a user so as to overlap the corresponding position displayed by the position display unit.
  4.   The input device according to any one of claims 1 to 3, wherein the setting information indicates at least one of a focal length of the imaging device, a center position of a lens of the imaging device in a plane substantially parallel to the display surface, and a shooting direction of the imaging device.
  5.   The input device according to any one of claims 1 to 4, further comprising a coordinate specifying unit that specifies coordinates at which the pointing object is in contact with or close to the display surface from images captured by at least two of the imaging devices, using the setting information corresponding to each imaging device.
JP2015051074A 2015-03-13 2015-03-13 Input device Active JP6342832B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015051074A JP6342832B2 (en) 2015-03-13 2015-03-13 Input device


Publications (2)

Publication Number Publication Date
JP2016170710A JP2016170710A (en) 2016-09-23
JP6342832B2 true JP6342832B2 (en) 2018-06-13





Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20170925

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20180418

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20180508

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20180517

R150 Certificate of patent or registration of utility model

Ref document number: 6342832

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150