JP4807322B2 - Image processing system, image processing apparatus and method, and program - Google Patents

Info

Publication number: JP4807322B2
Application number: JP2007133758A
Authority: JP (Japan)
Prior art keywords: image, position, light, indicating light, display
Legal status: Expired - Fee Related
Other languages: Japanese (ja)
Other versions: JP2008287625A (en)
Inventors: 浩之 塩谷, 智経 増野
Original assignee: ソニー株式会社 (Sony Corporation)
Application filed by ソニー株式会社; priority to JP2007133758A
Publication of JP2008287625A; application granted; publication of JP4807322B2

Description

  The present invention relates to an image processing system, an image processing apparatus and method, and a program, and more particularly to an image processing system, an image processing apparatus and method, and a program that can easily specify a position in a virtual three-dimensional space.

  Conventionally, a presentation system has been proposed that assumes a virtual three-dimensional space for a two-dimensional screen and can specify a position in the virtual three-dimensional space.

  For example, one such presentation system arranges sound wave sensors at the four corners of a screen on which an image is displayed and receives a sound wave signal from an input pen with those sensors, thereby designating a position in the virtual three-dimensional space set for the two-dimensional image displayed on the screen (see, for example, Patent Document 1). In this presentation system, an envelope signal is detected from the sound wave signal so that the signal can be detected easily, and the detected envelope is used to determine the arrival time of the sound wave signal at each sensor. This reduces the level fluctuation of the sound wave signal and the influence of reflected waves, which were problems in the past, and allows the position in the virtual three-dimensional space to be detected stably.

  Another presentation system uses a magnetic field generation source fixed to the display and a bar-shaped position sensor that detects the magnetism from that source, allowing intuitive operation on an object in the virtual three-dimensional space on the display (see, for example, Patent Document 2).

  Yet another presentation system generates a distance image by a stereo method using a plurality of cameras and uses the generated distance image to detect a predetermined part of the user, such as a fingertip or an arm, thereby specifying an object in the virtual three-dimensional space (see, for example, Patent Document 3). Since this system specifies the object located in the direction indicated by the detected part of the user, the user does not need to use an interface device such as a mouse, and convenience is expected to improve. Furthermore, a position can be designated in real space rather than on the screen, so the system is considered to have wide applicability.

Patent Document 1: JP 2002-351605 A
Patent Document 2: JP 2003-85590 A
Patent Document 3: JP 2004-265222 A

  However, because of their characteristics, the above technologies cannot measure the position of the input pen or of the detected part of the user over a wide range, so they can only display an image on a relatively small display device, such as a CRT (Cathode Ray Tube) display or a PDP (Plasma Display Panel), and specify a position in the virtual three-dimensional space assumed for that image. Moreover, these presentation systems are easily influenced by the surrounding environment, so there are cases where the position in the virtual three-dimensional space cannot be specified reliably.

  For example, the method using sound wave signals can avoid the influence of reflected waves to some extent, but because it relies on sound waves, the response speed for designating a position in the virtual three-dimensional space is limited by the speed of sound. For this reason, the larger the screen, the longer the processing takes, and a delay that cannot be ignored may occur. Moreover, since the sound wave transmitter used as the input pen and the sound wave sensors used as receivers are not very common devices, the cost is high.

  Furthermore, because the method using a magnetic field generation source and a rod-shaped position sensor relies on magnetism, an error may occur in the position measurement when a magnetic object is near the presentation system. In addition, for the position sensor to measure positions over a wide range, a magnetic field strong enough to cover that range is required, so the measurement accuracy of the position sensor may decrease and the cost may increase.

  In the method of detecting a predetermined part of the user, the range of positions that can be specified in the virtual three-dimensional space is limited to the range that the user can indicate with the body. Moreover, since a plurality of cameras must be installed, not only does the cost increase, but the processing time needed to obtain the designated position from the images captured by those cameras also becomes long.

  The present invention has been made in view of such circumstances, and aims to make it possible to specify a position in a virtual three-dimensional space more reliably and at low cost, regardless of the size of the displayed image or the surrounding environment.

  An image processing system according to a first aspect of the present invention includes a display device that displays a display image, an indication device that irradiates the display image with light to indicate the position of a pointer displayed on the display image, an imaging device that captures the area in which the display image is displayed, and an image processing device that newly generates a display image based on the captured image obtained by the imaging device. The indication device includes position indicating light emitting means for emitting position indicating light, which is light for indicating the display position at which the pointer is displayed in the virtual three-dimensional space assumed for the display image displayed by the display device, and direction indicating light emitting means for emitting direction indicating light, which is light of a wavelength different from that of the position indicating light and instructs movement of the pointer in a direction parallel to the optical path of the position indicating light. The imaging device captures, as the captured image, the image of the position indicating light and the image of the direction indicating light irradiated onto the display image. The image processing device includes separating means for separating, from the captured image, a position indicating light image, which is an image of light of the same wavelength as the position indicating light, and a direction indicating light image, which is an image of light of the same wavelength as the direction indicating light; position indicating light detecting means for detecting, from the position indicating light image, the position and shape on the display image of the image of the position indicating light; direction indicating light detecting means for detecting the image of the direction indicating light from the direction indicating light image and outputting, based on the detection result, direction information indicating whether or not movement of the pointer in the direction has been instructed; position specifying means for specifying the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output direction information; and display image generating means for generating a new display image in which the pointer is displayed at the position on the display image corresponding to the display position of the pointer specified by the position specifying means. The display device switches the display from the display image displayed so far to the newly generated display image.

  In the first aspect of the present invention, in an image processing system including a display device that displays a display image, an indication device that irradiates the display image with light to indicate the position of a pointer displayed on the display image, an imaging device that captures the area in which the display image is displayed, and an image processing device that newly generates a display image based on the captured image obtained by the imaging device, the indication device emits position indicating light, which is light for indicating the display position at which the pointer is displayed in the virtual three-dimensional space assumed for the display image displayed by the display device, and also emits direction indicating light, which is light of a wavelength different from that of the position indicating light and instructs movement of the pointer in a direction parallel to the optical path of the position indicating light. The imaging device captures, as the captured image, the image of the position indicating light and the image of the direction indicating light irradiated onto the display image. The image processing device separates, from the captured image, a position indicating light image, which is an image of light of the same wavelength as the position indicating light, and a direction indicating light image, which is an image of light of the same wavelength as the direction indicating light; detects, from the position indicating light image, the position and shape on the display image of the image of the position indicating light; detects the image of the direction indicating light from the direction indicating light image and outputs, based on the detection result, direction information indicating whether or not movement of the pointer in the direction has been instructed; specifies the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output direction information; and generates a new display image in which the pointer is displayed at the position on the display image corresponding to the specified display position of the pointer. The display device then switches the display from the display image displayed so far to the newly generated display image.

  An image processing apparatus according to a second aspect of the present invention is an image processing apparatus that specifies a display position in a virtual three-dimensional space assumed for a display image, the display position being the position at which a pointer is displayed. The apparatus includes: position indicating light detecting means for detecting, from a position indicating light image obtained by imaging the area in which the display image is displayed, the position indicating light image being an image of light of the same wavelength as the position indicating light for indicating the display position at which the pointer is displayed in the virtual three-dimensional space, the position and shape on the display image of the image of the position indicating light irradiated onto the display image; first direction indicating light detecting means for detecting, from a first direction indicating light image obtained by imaging the area in which the display image is displayed, the first direction indicating light image being an image of light of the same wavelength as first direction indicating light that instructs movement of the pointer in a first direction parallel to the optical path of the position indicating light and of a wavelength different from that of the position indicating light, the image of the first direction indicating light irradiated onto the display image, and for outputting, based on the detection result, first direction information indicating whether or not movement of the pointer in the first direction has been instructed; and position specifying means for specifying the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output first direction information.

  The image processing apparatus may further include display image generating means for generating a new display image in which the pointer is displayed at the position on the display image corresponding to the display position of the pointer specified by the position specifying means.

  The position indicating light may have a perfectly circular cross section.

  The display position of the pointer may be a position on a straight line that passes through the position in the virtual three-dimensional space corresponding to the position of the image of the position indicating light and is parallel to the first direction specified from the shape of the image of the position indicating light.

  The position specifying means may store a depth length that determines the length from the position in the virtual three-dimensional space corresponding to the position of the image of the position indicating light to the display position of the pointer, change the depth length based on the direction information, and specify the display position of the pointer based on the changed depth length and on the position and shape of the image of the position indicating light.

  The image processing apparatus may further include second direction indicating light detecting means for detecting, from a second direction indicating light image obtained by imaging the area in which the display image is displayed, the second direction indicating light image being an image of light of the same wavelength as second direction indicating light that instructs movement of the pointer in a second direction opposite to the first direction and of a wavelength different from those of the position indicating light and the first direction indicating light, the image of the second direction indicating light irradiated onto the display image, and for outputting, based on the detection result, second direction information indicating whether or not movement of the pointer in the second direction has been instructed; in that case the position specifying means can change the depth length based on the first direction information and the second direction information.

  The image processing apparatus may further include imaging means for capturing, as the captured image, the display image, the image of the position indicating light irradiated onto the display image, and the image of the first direction indicating light irradiated onto the display image, and separating means for separating the position indicating light image and the first direction indicating light image from the captured image.

  The position indicating light and the first direction indicating light may be invisible light.

  An image processing method or program according to the second aspect of the present invention is an image processing method, or a program, for an image processing apparatus that specifies a display position in a virtual three-dimensional space assumed for a display image, the display position being the position at which a pointer is displayed. The method or program includes: a position indicating light detecting step of detecting, from a position indicating light image obtained by imaging the area in which the display image is displayed, the position indicating light image being an image of light of the same wavelength as the position indicating light for indicating the display position at which the pointer is displayed in the virtual three-dimensional space, the position and shape on the display image of the image of the position indicating light irradiated onto the display image; a direction indicating light detecting step of detecting, from a direction indicating light image obtained by imaging the area in which the display image is displayed, the direction indicating light image being an image of light of the same wavelength as direction indicating light that instructs movement of the pointer in a direction parallel to the optical path of the position indicating light and of a wavelength different from that of the position indicating light, the image of the direction indicating light irradiated onto the display image, and of outputting, based on the detection result, direction information indicating whether or not movement of the pointer in the direction has been instructed; and a position specifying step of specifying the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output direction information.

  In the second aspect of the present invention, the position and shape on the display image of the image of the position indicating light irradiated onto the display image are detected from a position indicating light image obtained by imaging the area in which the display image is displayed, the position indicating light image being an image of light of the same wavelength as the position indicating light for indicating the display position at which a pointer is displayed in the virtual three-dimensional space; the image of the direction indicating light irradiated onto the display image is detected from a direction indicating light image obtained by imaging that area, the direction indicating light image being an image of light of the same wavelength as direction indicating light that instructs movement of the pointer in a direction parallel to the optical path of the position indicating light and of a wavelength different from that of the position indicating light; direction information indicating whether or not movement of the pointer in the direction has been instructed is output based on the detection result; and the display position of the pointer in the virtual three-dimensional space is specified based on the position and shape of the image of the position indicating light and the output direction information.

  According to the first aspect of the present invention, a position in a virtual three-dimensional space can be designated. In particular, according to the first aspect of the present invention, the position in the virtual three-dimensional space can be more reliably specified at low cost regardless of the size of the image to be displayed and the surrounding environment.

  According to the second aspect of the present invention, a position in a virtual three-dimensional space can be designated. In particular, according to the second aspect of the present invention, the position in the virtual three-dimensional space can be more reliably specified at low cost regardless of the size of the image to be displayed and the surrounding environment.

  Embodiments of the present invention will be described below. The correspondence between the constituent elements of the present invention and the embodiments described in the specification or the drawings is exemplified as follows. This description is intended to confirm that embodiments supporting the present invention are described in the specification or the drawings. Therefore, even if an embodiment is described in the specification or the drawings but is not described here as corresponding to a constituent element of the present invention, that does not mean that the embodiment does not correspond to that constituent element. Conversely, even if an embodiment is described here as corresponding to a certain constituent element, that does not mean that the embodiment does not correspond to constituent elements other than that one.

  An image processing system according to the first aspect of the present invention (for example, the presentation system 11 in FIG. 1) includes a display device (for example, the projector 25 in FIG. 1) that displays a display image, an indication device (for example, the laser pointer 22 in FIG. 1) that irradiates the display image with light to indicate the position of a pointer (for example, the three-dimensional pointer 31 in FIG. 1) displayed on the display image, an imaging device (for example, the camera 23 in FIG. 1) that captures the area in which the display image is displayed, and an image processing device (for example, the image processing device 24 in FIG. 1) that newly generates a display image based on the captured image obtained by the imaging device. The indication device includes position indicating light emitting means (for example, the invisible light laser 81 in FIG. 3) for emitting position indicating light, which is light for indicating the display position at which the pointer is displayed in the virtual three-dimensional space assumed for the display image displayed by the display device, and direction indicating light emitting means (for example, the invisible light laser 83 or the invisible light laser 84 in FIG. 3) for emitting direction indicating light, which is light of a wavelength different from that of the position indicating light and instructs movement of the pointer in a direction parallel to the optical path of the position indicating light. The imaging device captures, as the captured image, the image of the position indicating light and the image of the direction indicating light irradiated onto the display image. The image processing device includes separating means (for example, the geometric conversion processing unit 222 in FIG. 10) for separating, from the captured image, a position indicating light image, which is an image of light of the same wavelength as the position indicating light, and a direction indicating light image, which is an image of light of the same wavelength as the direction indicating light; position indicating light detecting means (for example, the position indicating light detection unit 223 in FIG. 10) for detecting, from the position indicating light image, the position and shape on the display image of the image of the position indicating light; direction indicating light detecting means (for example, the forward indicating light detection unit 224 or the reverse indicating light detection unit 225 in FIG. 10) for detecting the image of the direction indicating light from the direction indicating light image and outputting, based on the detection result, direction information indicating whether or not movement of the pointer in the direction has been instructed; position specifying means (for example, the coordinate calculation unit 226 in FIG. 10) for specifying the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output direction information; and display image generating means (for example, the display image processing unit 228 in FIG. 10) for generating a new display image in which the pointer is displayed at the position on the display image corresponding to the display position of the pointer specified by the position specifying means. The display device switches the display from the display image displayed so far to the newly generated display image.

  The image processing apparatus according to the second aspect of the present invention (for example, the image processing device 24 in FIG. 10) is an image processing apparatus that specifies a display position in the virtual three-dimensional space assumed for a display image, the display position being the position at which a pointer (for example, the three-dimensional pointer 31 in FIG. 1) is displayed. The apparatus includes: position indicating light detecting means (for example, the position indicating light detection unit 223 in FIG. 10) for detecting, from a position indicating light image obtained by imaging the area in which the display image is displayed, the position indicating light image being an image of light of the same wavelength as the position indicating light for indicating the display position at which the pointer is displayed in the virtual three-dimensional space, the position and shape on the display image of the image of the position indicating light irradiated onto the display image; first direction indicating light detecting means (for example, the forward indicating light detection unit 224 in FIG. 10) for detecting, from a first direction indicating light image obtained by imaging the area in which the display image is displayed, the first direction indicating light image being an image of light of the same wavelength as first direction indicating light that instructs movement of the pointer in a first direction parallel to the optical path of the position indicating light and of a wavelength different from that of the position indicating light, the image of the first direction indicating light irradiated onto the display image, and for outputting, based on the detection result, first direction information indicating whether or not movement of the pointer in the first direction has been instructed; and position specifying means (for example, the coordinate calculation unit 226 in FIG. 10) for specifying the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output first direction information.

  The image processing apparatus may further include display image generating means (for example, the display image processing unit 228 in FIG. 10) for generating a new display image in which the pointer is displayed at the position on the display image corresponding to the display position of the pointer specified by the position specifying means.

  The position specifying means stores a depth length that determines the length from the position in the virtual three-dimensional space corresponding to the position of the image of the position indicating light to the display position of the pointer, changes the depth length based on the direction information, and specifies the display position of the pointer based on the changed depth length and on the position and shape of the image of the position indicating light (for example, steps S164 to S168 in FIG. 29).

  The image processing apparatus may further include second direction indicating light detecting means (for example, the reverse indicating light detection unit 225 in FIG. 10) for detecting, from a second direction indicating light image obtained by imaging the area in which the display image is displayed, the second direction indicating light image being an image of light of the same wavelength as second direction indicating light that instructs movement of the pointer in a second direction opposite to the first direction and of a wavelength different from those of the position indicating light and the first direction indicating light, the image of the second direction indicating light irradiated onto the display image, and for outputting, based on the detection result, second direction information indicating whether or not movement of the pointer in the second direction has been instructed; in that case the position specifying means changes the depth length based on the first direction information and the second direction information (for example, the processing of step S165 or step S167 in FIG. 29).

  The image processing apparatus may further include imaging means (for example, the camera 23 in FIG. 1) for capturing, as the captured image, the display image, the image of the position indicating light irradiated onto the display image, and the image of the first direction indicating light irradiated onto the display image, and separating means (for example, the geometric conversion processing unit 222 in FIG. 10) for separating the position indicating light image and the first direction indicating light image from the captured image.

  An image processing method or program according to the second aspect of the present invention is an image processing method, or a program, for an image processing apparatus that specifies a display position in the virtual three-dimensional space assumed for a display image, the display position being the position at which a pointer is displayed. The method or program includes: a position indicating light detecting step (for example, step S43 in FIG. 12) of detecting, from a position indicating light image obtained by imaging the area in which the display image is displayed, the position indicating light image being an image of light of the same wavelength as the position indicating light for indicating the display position at which the pointer is displayed in the virtual three-dimensional space, the position and shape on the display image of the image of the position indicating light irradiated onto the display image; a direction indicating light detecting step (for example, step S44 or step S45 in FIG. 12) of detecting, from a direction indicating light image obtained by imaging the area in which the display image is displayed, the direction indicating light image being an image of light of the same wavelength as direction indicating light that instructs movement of the pointer in a direction parallel to the optical path of the position indicating light and of a wavelength different from that of the position indicating light, the image of the direction indicating light irradiated onto the display image, and of outputting, based on the detection result, direction information indicating whether or not movement of the pointer in the direction has been instructed; and a position specifying step (for example, step S46 in FIG. 12) of specifying the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output direction information.

  Embodiments to which the present invention is applied will be described below with reference to the drawings.

  FIG. 1 is a diagram showing a configuration example of an embodiment of a presentation system to which the present invention is applied.

  The presentation system 11 includes a screen 21, a laser pointer 22, a camera 23, an image processing device 24, and a projector 25. A display image projected from the projector 25 is displayed on the screen 21.

  This display image is three-dimensional computer graphics generated by projecting objects onto a two-dimensional plane based on information about those objects in a virtual three-dimensional space. A virtual three-dimensional space is thus assumed for the display image, and a three-dimensional pointer 31 indicating a predetermined position in that virtual three-dimensional space is displayed on the display image.

  A user 32 giving a presentation operates the laser pointer 22 from a position some distance away from the screen 21 and moves the three-dimensional pointer 31 displayed on the display image while presenting. The laser pointer 22 designates the position of the three-dimensional pointer 31 by emitting invisible light, such as infrared light, according to the operation of the user 32.

  The camera 23 is a multispectral camera, and the position of the camera 23 is determined so that the user 32 is not positioned between the camera 23 and the screen 21. That is, the camera 23 is arranged at a position where the entire display image displayed on the screen 21 can be captured.

  The camera 23 captures images of the screen 21 and the surrounding area and supplies the captured image obtained thereby to the image processing device 24. The captured image therefore includes, in addition to the display image projected on the screen 21, the image of the invisible light emitted from the laser pointer 22 and reflected on the screen 21.

  Since the camera 23 captures an image that includes the entire display image, the position on the display image of the invisible light irradiated onto it can be determined even when the camera 23 is not placed directly facing the screen 21. As a result, the display position of the three-dimensional pointer 31 indicated by the user 32 can be specified reliably.

  Based on the captured image supplied from the camera 23, the image processing device 24 obtains the position in the virtual three-dimensional space designated by the user 32 operating the laser pointer 22. The image processing device 24 also records in advance a three-dimensional image, which is three-dimensional computer graphics serving as presentation material, and generates, as a new display image, an image in which the three-dimensional pointer 31 is displayed at the obtained position in the virtual three-dimensional space assumed for the recorded three-dimensional image.

  That is, the virtual three-dimensional space assumed for the three-dimensional image corresponds to the virtual three-dimensional space assumed for the display image, and the display image is the three-dimensional image with the three-dimensional pointer 31 displayed at a predetermined position. The image processing device 24 supplies the generated display image to the projector 25 and controls its display.

  Based on the display image supplied from the image processing device 24, the projector 25 projects light for displaying the display image onto the screen 21 and displays the display image on the screen 21. Therefore, each time a new display image is generated by the image processing device 24, the display image displayed on the screen 21 is updated.

  The laser pointer 22 operated by the user 32 is, for example, a pen-type laser pointer as shown in FIG. 2. A button unit 51 consisting of three buttons is provided on the left side of the laser pointer 22 in the figure, and when the user 32 operates the button unit 51, invisible light is emitted, according to the operation, from a laser exit 52 provided at the left end of the laser pointer 22.

  That is, the laser pointer 22 includes a plurality of invisible light lasers that emit infrared light as invisible light for indicating the display position of the three-dimensional pointer 31. As shown in FIG. 3, which shows the cross section of the laser pointer 22 viewed from the direction of arrow A11, the plurality of invisible light lasers are arranged within that cross section.

  In FIG. 3, the laser pointer 22 has a circular cross section, and an invisible light laser 81 that emits laser light (infrared light) with a perfectly circular beam cross section is disposed approximately at the center of the laser pointer 22. The invisible light laser 81 emits position indicating light, which is laser light for designating the position of the three-dimensional pointer 31 in the virtual three-dimensional space, that is, the display position of the three-dimensional pointer 31 on the display image.

  Above the invisible light laser 81 in the figure, an invisible light laser 82 that emits auxiliary light, laser light of the same wavelength as the position indicating light, is provided. This auxiliary light is a laser beam for identifying from which direction the position indicating light is irradiated when it strikes the display image, and its beam cross section is also a perfect circle.

  The beam diameter of the auxiliary light is smaller than that of the position indicating light. Because the beam diameters of the two lights differ, the size of the image of the position indicating light on the screen 21 differs from the size of the image of the auxiliary light, making it possible to distinguish whether an observed image is the position indicating light image or the auxiliary light image.

  For example, suppose that, as shown in FIG. 4, the position indicating light and the auxiliary light are irradiated onto the screen 21 and the image 111 of the position indicating light and the image 112 of the auxiliary light are observed on the screen 21. In this case, from the position indicating light image 111 on the captured image alone, the image processing device 24 cannot tell whether the position indicating light struck the screen 21 from the direction indicated by arrow A31 or from the direction indicated by arrow A32.

  Therefore, if the user 32 operates the laser pointer 22 so that the position indicating light is always on the vertically lower side of the auxiliary light, that is, so that the button unit 51 always faces the direction opposite to the vertical direction, it becomes possible to know from which direction the position indicating light is emitted.

  In FIG. 4, assuming the downward direction in the figure is the vertical direction, the user 32 operates the laser pointer 22 while always pointing the button unit 51 upward, so that the auxiliary light image 112 always appears above the position indicating light image 111. From the positional relationship between the images 111 and 112, it can then be determined that the position indicating light was emitted from the direction of arrow A31.
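
  The distinction between the two spots and the inference of the irradiation side can be sketched in code. The following Python fragment is only a minimal illustration, not the patent's implementation: it assumes the two spots have already been extracted as boolean masks from the captured image, labels the larger one as the position indicating light, and reads the incidence side off the horizontal offset of the auxiliary spot. The function name and the synthetic masks are hypothetical.

```python
import numpy as np

def classify_spots(blobs):
    """Label the larger of two spot masks as the position indicating light and the
    smaller as the auxiliary light, then report on which side of the position spot
    the auxiliary spot lies.  `blobs` is a list of two boolean arrays of equal shape."""
    areas = [int(b.sum()) for b in blobs]
    pos_idx = int(np.argmax(areas))                # larger spot: position indicating light
    aux_idx = 1 - pos_idx                          # smaller spot: auxiliary light
    centroids = [np.argwhere(b).mean(axis=0) for b in blobs]  # (row, col) of each spot
    dx = centroids[aux_idx][1] - centroids[pos_idx][1]
    side = "left" if dx < 0 else "right"           # horizontal offset hints at the incidence side
    return pos_idx, aux_idx, side

# usage with two synthetic spots
canvas = np.zeros((100, 100), dtype=bool)
big = canvas.copy();   big[60:70, 60:70] = True    # 100-pixel spot (position indicating light)
small = canvas.copy(); small[40:44, 50:54] = True  # 16-pixel spot (auxiliary light)
print(classify_spots([big, small]))                # -> (0, 1, 'left')
```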

  Returning to FIG. 3, to the left of the invisible light laser 81 is provided an invisible light laser 83 that emits backward indicating light, laser light for moving the position in the virtual three-dimensional space at which the three-dimensional pointer 31 is displayed toward the near side, that is, toward the user 32. The backward indicating light is invisible light of a wavelength different from that of the position indicating light and is used to instruct that the position of the three-dimensional pointer 31 be moved toward the user 32 side, parallel to the optical path of the position indicating light.

  To the right of the invisible light laser 81 is provided an invisible light laser 84 that emits forward indicating light, laser light for moving the position in the virtual three-dimensional space at which the three-dimensional pointer 31 is displayed toward the back side, away from the user 32. The forward indicating light is invisible light of a wavelength different from those of the position indicating light and the backward indicating light, and is used to instruct that the position of the three-dimensional pointer 31 be moved to the side opposite the user 32, parallel to the optical path of the position indicating light.

  That is, as shown in FIG. 5, when the display image 141 is displayed in a predetermined area of the screen 21, the virtual three-dimensional space assumed for the display image 141 is a space in a three-dimensional coordinate system whose origin is the reference point O at the lower left of the display image 141 in the figure. In FIG. 5, from the reference point O, the rightward direction is the x-axis direction, the upward direction is the y-axis direction, and the direction orthogonal to the display image 141 and pointing toward the back is the z-axis direction. The display image 141 is therefore an image obtained by viewing this virtual three-dimensional space along the z-axis direction.

  A straight line 142 indicates the locus of the position indicating light emitted from the laser pointer 22, that is, the optical path of the position indicating light, and the regions 143 and 144 on the display image 141 represent the images of the position indicating light and the auxiliary light, respectively. Hereinafter, the regions 143 and 144 are referred to as the position indicating light image 143 and the auxiliary light image 144.

  In FIG. 5, the position indicating light image 143 and the auxiliary light image 144 lie on the display image 141, which lies on the xy plane. The three-dimensional pointer 31 indicates a predetermined position on the object 145 in the virtual three-dimensional space, on the extension of the straight line 142. The object 145 is actually displayed on the display image 141 but, in the virtual three-dimensional space, is located on the z-axis side of the xy plane.

  Here, the image processing device 24 stores a depth length E, a parameter indicating the distance from the position indicating light image 143 to the position indicated by the three-dimensional pointer 31. By changing the value of the depth length E in response to the operation of the user 32, it newly generates a display image in which the three-dimensional pointer 31 has been moved in a direction parallel to the straight line 142.

  For example, when the user 32 operates the laser pointer 22 so that the backward indicating light is emitted together with the position indicating light without moving the laser pointer 22, that is, without changing its position and direction, the value of the depth length E is reduced by a predetermined amount, and the three-dimensional pointer 31 (the position it indicates) moves toward the near side in the figure, in a direction parallel to the straight line 142. That is, the position of the three-dimensional pointer 31 after the movement is the point on the extension of the straight line 142 at which the length from that point to the position indicating light image 143 equals the changed depth length E.

  Conversely, when the user 32 operates the laser pointer 22 so that the forward indicating light is emitted together with the position indicating light without moving the laser pointer 22, the value of the depth length E is increased by a predetermined amount, and the three-dimensional pointer 31 (the position it indicates) moves toward the back side in the figure, in a direction parallel to the straight line 142.

  Therefore, for example, as shown in FIG. 6, suppose that the position indicating light is irradiated at the position of the reference point O on the display image and the position indicated by the three-dimensional pointer 31 is the point P11 in the virtual three-dimensional space. By operating the laser pointer 22 without changing its position or direction, the user 32 can move the three-dimensional pointer 31 to an arbitrary position on the straight line passing through the reference point O and the point P11. For example, the user 32 can emit the backward indicating light to move the three-dimensional pointer 31 from the point P11 to the point P12 on that straight line, or emit the forward indicating light to move it to the point P13.

  In FIG. 6, the rightward direction is the x-axis direction, the upward direction is the y-axis direction (orthogonal to the x-axis direction), and the direction toward the back is the z-axis direction. In the following description, moving the three-dimensional pointer 31 in the direction in which the depth length E decreases is referred to as moving it in the forward direction, and conversely, moving it in the direction in which the depth length E increases is referred to as moving it in the depth direction. The value of the depth length E is an integer of 0 or more; that is, the three-dimensional pointer 31 is not displayed at a position whose z coordinate is negative in the virtual three-dimensional space.
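
  As an illustration of this geometry, the following Python sketch places the pointer on the extension of the incident ray and updates the depth length E from the direction flags. It is a simplified assumption, not the method of the embodiment: the ray direction is taken as already known (here it would come from the elliptical shape of the position indicating light image and the auxiliary light), and the step size of 10 is arbitrary.

```python
import numpy as np

def pointer_position(spot_xy, direction, depth_e):
    """Place the 3-D pointer on the extension of the incident ray.
    spot_xy   -- (x, y) of the position indicating light image on the display plane (z = 0)
    direction -- unit-length (or unnormalized) incidence direction pointing into z > 0
    depth_e   -- stored depth length E"""
    base = np.array([spot_xy[0], spot_xy[1], 0.0])
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    return base + depth_e * d

def step_depth(depth_e, forward_flag, backward_flag, step=10):
    """Forward indicating light increases E, backward indicating light decreases it,
    and E never goes below zero."""
    if forward_flag:
        depth_e += step
    if backward_flag:
        depth_e = max(0, depth_e - step)
    return depth_e

# usage: spot at (320, 240), ray tilted toward +x while going into the screen
p = pointer_position((320, 240), (0.3, 0.0, 1.0), depth_e=50)
print(np.round(p, 1))                 # approx. [334.4, 240.0, 47.9]
print(step_depth(50, True, False))    # -> 60
```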

  In this way, the user 32 operates the button unit 51 provided on the laser pointer 22 to emit invisible light from the laser pointer 22 and indicate the display position of the three-dimensional pointer 31.

  For example, when the user 32 operates the laser pointer 22 to move the three-dimensional pointer 31 without changing the value of the depth length E, as shown in FIG. 7A the position indicating light and the auxiliary light are emitted from the invisible light laser 81 and the invisible light laser 82, while the backward indicating light and the forward indicating light are not emitted from the invisible light laser 83 and the invisible light laser 84. In other words, only the invisible light lasers 81 and 82 are turned on, and the invisible light lasers 83 and 84 are not.

  In FIG. 7A, of the invisible light lasers 81 to 84, those drawn shaded are lit and those drawn without shading are not lit. Similarly, in FIGS. 7B and 7C, described later, the lit invisible light lasers are shaded.

  When the user 32 operates the laser pointer 22 so that the three-dimensional pointer 31 moves in the depth direction, as shown in FIG. 7B the invisible light lasers 81, 82, and 84 are turned on but the invisible light laser 83 is not. That is, the laser pointer 22 emits the position indicating light, the auxiliary light, and the forward indicating light.

  When the user 32 operates the laser pointer 22 so that the three-dimensional pointer 31 moves in the forward direction, as shown in FIG. 7C the invisible light lasers 81, 82, and 83 are turned on but the invisible light laser 84 is not. That is, the laser pointer 22 emits the position indicating light, the auxiliary light, and the backward indicating light.
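
  The three operations of FIGS. 7A to 7C can be summarized as a small lookup table; the dictionary below is only an illustrative encoding of which lasers are lit, with hypothetical key names.

```python
# True = laser lit, False = not lit (per FIGS. 7A to 7C)
LASER_STATES = {
    "move_without_depth_change":   # FIG. 7A
        {"position_81": True, "auxiliary_82": True, "backward_83": False, "forward_84": False},
    "move_in_depth_direction":     # FIG. 7B
        {"position_81": True, "auxiliary_82": True, "backward_83": False, "forward_84": True},
    "move_in_forward_direction":   # FIG. 7C
        {"position_81": True, "auxiliary_82": True, "backward_83": True, "forward_84": False},
}
```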

  As described above, infrared light for indicating the display position of the three-dimensional pointer 31 is emitted from the laser pointer 22 in accordance with the operation of the user 32. In the presentation system 11, the position indicating light, auxiliary light, forward indicating light, and backward indicating light emitted from the laser pointer 22 are all infrared light, which improves safety compared with using a visible light laser as the laser pointer.

  That is, when visible laser light enters a human eye, it not only causes discomfort but may also damage the eye. The laser pointer 22, by contrast, improves safety for human eyes by using infrared light, whose wavelength is longer than that of visible light. In addition, because infrared light is invisible, it causes no discomfort when it enters the eye. Furthermore, since the position indicating light, forward indicating light, and so on irradiated onto the display image are not visible to the human eye, no extra visible light is cast onto the display image, and the display image does not become difficult to see.

  Next, FIG. 8 is a block diagram illustrating a configuration example of the camera 23 of FIG.

  The camera 23 includes a lens 181, a diaphragm 182, a CCD (Charge Coupled Devices) image sensor 183, a correlated double sampling circuit 184, an A / D (Analog / Digital) converter 185, a DSP (Digital Signal Processor) 186, a timing generator 187, A D / A (Digital / Analog) converter 188, a video encoder 189, a display 190, a CODEC (Compression / Decompression) 191, a memory 192, a bus 193, a CPU (Central Processing Unit) 194, and an input device 195 are included.

  The lens 181 collects light from the screen 21 that is the subject and the surrounding area, and makes the collected light enter the CCD image sensor 183 through the diaphragm 182. The diaphragm 182 adjusts the amount of light incident on the CCD image sensor 183 from the lens 181.

  The CCD image sensor 183 photoelectrically converts the light incident from the lens 181 into an electrical signal and generates a captured image, an image of the screen 21 and its surrounding area. Each pixel of the captured image contains color data (a pixel value) of only one of the following: R (red), G (green), or B (blue), the three primary colors of light; the color of the wavelength of the position indicating light; the color of the wavelength of the forward indicating light; or the color of the wavelength of the backward indicating light. The CCD image sensor 183 supplies the generated captured image to the correlated double sampling circuit 184.

  The correlated double sampling circuit 184 removes noise from the captured image by sampling the captured image from the CCD image sensor 183, and supplies the captured image from which the noise has been removed to the A / D converter 185. The A / D converter 185 converts the captured image from the correlated double sampling circuit 184, more specifically, the image signal of the captured image from an analog signal to a digital signal and supplies the converted signal to the DSP 186.

  The DSP 186 includes, for example, a signal processing processor and an image RAM (Random Access Memory). The DSP 186 performs pre-programmed image processing such as gradation compression processing as necessary on the captured image supplied from the A / D converter 185 and stored in the image RAM.

  As necessary, the DSP 186 also performs a color synchronization process on the captured image from the A / D converter 185 so that each pixel of the captured image has color data for each of R (red), G (green), B (blue), the wavelength of the position indicating light, the wavelength of the forward indicating light, and the wavelength of the backward indicating light. The captured image thus processed is supplied to the image processing device 24 of FIG. 1 via the bus 193, or to the D / A converter 188 or the CODEC 191.

  Further, a DSP 186, a timing generator 187, a CODEC 191, a memory 192, a CPU 194, and an input device 195 are connected to the bus 193.

  The timing generator 187 controls the operation timing of each of the components from the CCD image sensor 183 to the DSP 186. The D / A converter 188 converts the captured image supplied from the DSP 186 from a digital signal to an analog signal and supplies it to the video encoder 189. The video encoder 189 converts the captured image from the D / A converter 188 into an ordinary video signal of R, G, B or the like and supplies it to the display 190. The display 190 includes, for example, an LCD (Liquid Crystal Display) that functions as a finder or a video monitor, and displays the captured image supplied from the video encoder 189.

  The CODEC 191 encodes the captured image from the DSP 186 and records the encoded image in the memory 192, or reads and decodes the captured image recorded in the memory 192 and supplies the decoded image to the DSP 186. The memory 192 includes, for example, a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, and the like, and records the captured image supplied from the CODEC 191 and supplies the recorded captured image to the CODEC 191.

  The CPU 194 controls the operation of the entire camera 23 according to an operation signal from the input device 195 and the like. The input device 195 includes buttons and switches, for example, and supplies an operation signal according to the operation of the user 32 to the CPU 194.

  The CCD image sensor 183 has a different spectral sensitivity for each pixel. That is, on the light receiving surface of the CCD image sensor 183, an on-chip color filter is arranged for each pixel that transmits only light of one of R (red), G (green), B (blue), the wavelength of the position indicating light, the wavelength of the forward indicating light, or the wavelength of the backward indicating light, and the light entering the CCD image sensor 183 passes through this on-chip color filter before striking the photoelectric conversion elements.

  For example, the on-chip color filter shown in FIG. 9 is arranged on the light receiving surface of the CCD image sensor 183. In FIG. 9, one square represents one pixel, and the characters R, G, B, L1, L2, and L3 in the squares denote, respectively, the R filter, the G filter, the B filter, the filter for the wavelength of the position indicating light, the filter for the wavelength of the forward indicating light, and the filter for the wavelength of the backward indicating light, each of which transmits only light of the corresponding color.

  In FIG. 9, the filters are arranged in a mosaic pattern. That is, the G filter and the filter for the wavelength of the position indicating light are arranged every other pixel in the vertical and horizontal directions in the figure, and the R filter, the B filter, the filter for the wavelength of the forward indicating light, and the filter for the wavelength of the backward indicating light are each arranged every three pixels in the vertical and horizontal directions.

  FIG. 10 is a block diagram showing a configuration of the image processing device 24 of FIG.

  The image processing device 24 includes a color synchronization processing unit 221, a geometric conversion processing unit 222, a position indicating light detection unit 223, a forward indicating light detection unit 224, a reverse indicating light detection unit 225, a coordinate calculation unit 226, a three-dimensional image recording unit 227, and a display image processing unit 228.

  The color synchronization processing unit 221 is supplied with a captured image, more precisely the image signal of the captured image, from the DSP 186 of the camera 23 via the bus 193. In the captured image supplied to the color synchronization processing unit 221, each pixel has a pixel value of only one of R (red), G (green), B (blue), the color of the wavelength of the position indicating light, the color of the wavelength of the forward indicating light, or the color of the wavelength of the backward indicating light.

  The color synchronization processing unit 221 performs a color synchronization process that interpolates the pixel values of each color on the captured image supplied from the camera 23, and supplies the processed captured image to the geometric conversion processing unit 222. As a result of the color synchronization process, every pixel of the captured image has pixel values for each of R (red), G (green), B (blue), the color of the wavelength of the position indicating light, the color of the wavelength of the forward indicating light, and the color of the wavelength of the backward indicating light.
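
  A minimal sketch of such a color synchronization step is given below, assuming the raw mosaic and a map of which filter covers each pixel are available; the nearest-neighbour interpolation and the channel names are simplifying assumptions, not the method of the embodiment.

```python
import numpy as np

CHANNELS = ("R", "G", "B", "L1", "L2", "L3")   # L1/L2/L3: position, forward, backward light wavelengths

def color_synchronize(raw, filter_map):
    """Fill in, for every pixel, the channels it did not sample.
    raw        -- 2-D mosaic from the sensor (one value per pixel)
    filter_map -- same-shape array of channel names saying which filter covers each pixel
    Returns an (H, W, 6) array; missing samples are copied from the nearest sampled pixel."""
    h, w = raw.shape
    out = np.zeros((h, w, len(CHANNELS)), dtype=float)
    yy, xx = np.mgrid[0:h, 0:w]
    for c, name in enumerate(CHANNELS):
        mask = (filter_map == name)
        if not mask.any():
            continue
        sy, sx = yy[mask], xx[mask]            # pixels that actually sampled this channel
        d2 = (yy[..., None] - sy) ** 2 + (xx[..., None] - sx) ** 2
        out[..., c] = raw[mask][np.argmin(d2, axis=-1)]
    return out

# usage on a tiny 2x2 mosaic
raw = np.array([[10., 20.], [30., 40.]])
fmap = np.array([["G", "L1"], ["R", "B"]])
print(color_synchronize(raw, fmap).shape)      # -> (2, 2, 6)
```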

  The geometric conversion processing unit 222 performs a geometric conversion process on the captured image supplied from the color synchronization processing unit 221 and extracts, from the converted captured image, the image of the display image area on the captured image. The display image displayed on the screen 21 is originally rectangular, but because the camera 23 is not necessarily placed directly facing the screen 21, the area of the display image on the captured image may not be rectangular. The geometric conversion process performed by the geometric conversion processing unit 222 therefore converts the region of the display image on the captured image into a rectangular region.
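
  One common way to realize such a geometric conversion is a perspective (homography) warp. The OpenCV-based sketch below is an illustrative assumption, not the unit's actual implementation, and it presumes the four corners of the display image in the captured frame are already known (for example, from a calibration step).

```python
import cv2
import numpy as np

def rectify_display_region(captured, corners, out_w=640, out_h=480):
    """Warp the (generally non-rectangular) display image region of the captured
    frame into an upright out_w x out_h rectangle.
    corners -- four (x, y) points in the captured frame, ordered
               top-left, top-right, bottom-right, bottom-left."""
    src = np.float32(corners)
    dst = np.float32([[0, 0], [out_w - 1, 0], [out_w - 1, out_h - 1], [0, out_h - 1]])
    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(captured, homography, (out_w, out_h))
```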

  Further, the geometric transformation processing unit 222 separates the extracted image into a position instruction light image, a forward instruction light image, and a reverse instruction light image, and supplies the separated position instruction light image, forward instruction light image, and reverse instruction light image to the position indicating light detection unit 223, the forward instruction light detection unit 224, and the reverse instruction light detection unit 225, respectively.

  Here, the position indicating light image refers to an image of light having the same wavelength as the position indicating light, that is, an image in which each pixel has only a pixel value of the color of the wavelength of the position indicating light. The forward instruction light image is an image of light having the same wavelength as the forward indicating light, that is, an image in which each pixel has only a pixel value of the color of the wavelength of the forward indicating light. The reverse instruction light image is an image of light having the same wavelength as the reverse indicating light, that is, an image in which each pixel has only a pixel value of the color of the wavelength of the reverse indicating light.

  The position indicating light detection unit 223 detects the image of the position indicating light from the position indicating light image supplied from the geometric transformation processing unit 222, and obtains the number K of pixels constituting that image, the coordinates (xi, yi) of the center of gravity of the image of the position indicating light, the coordinates (xf, yf) of an intersection of the contour of the image of the position indicating light and the major axis of the contour, and the coordinates (xn, yn) of an intersection of the contour of the image of the position indicating light and the minor axis of the contour.

  That is, since the cross section of the position indicating light is a perfect circle, the contour of the image of the position indicating light on the screen 21 is a perfect circle or an ellipse. Therefore, the coordinates (xf, yf) are the coordinates of an intersection of the major axis and the circle forming the contour, and the coordinates (xn, yn) are the coordinates of an intersection of that circle and a straight line orthogonal to the major axis. Hereinafter, the coordinates (xi, yi), the coordinates (xf, yf), and the coordinates (xn, yn) are also referred to as the barycentric coordinates, the long axis intersection coordinates, and the short axis intersection coordinates, respectively.

  The position indication light detection unit 223 supplies the number K of pixels and the barycentric coordinates (xi, yi) obtained in this way to the forward indication light detection unit 224 and the reverse indication light detection unit 225. Further, the position indicating light detection unit 223 supplies the barycentric coordinates (xi, yi), the long axis intersection coordinates (xf, yf), and the short axis intersection coordinates (xn, yn) as coordinate information to the coordinate calculation unit 226.

  The forward instruction light detection unit 224 detects the image of the forward instruction light from the forward instruction light image supplied from the geometric transformation processing unit 222, using the number K of pixels and the barycentric coordinates supplied from the position indicating light detection unit 223. Then, the forward instruction light detection unit 224 supplies a forward flag, which is information indicating the detection result, to the coordinate calculation unit 226. Here, the forward flag is information indicating whether or not the user 32 has instructed to move the three-dimensional pointer 31 in the depth direction.

  The reverse instruction light detection unit 225 detects the image of the reverse instruction light from the reverse instruction light image supplied from the geometric transformation processing unit 222, using the number K of pixels and the barycentric coordinates supplied from the position indicating light detection unit 223. Then, the reverse instruction light detection unit 225 supplies a reverse flag, which is information indicating the detection result, to the coordinate calculation unit 226. Here, the reverse flag is information indicating whether or not the user 32 has instructed to move the three-dimensional pointer 31 in the front direction.

  The coordinate calculation unit 226 stores the depth length E. Based on the stored depth length E, the coordinate information from the position indicating light detection unit 223, the forward flag from the forward instruction light detection unit 224, and the reverse flag from the reverse instruction light detection unit 225, the coordinate calculation unit 226 calculates the position at which the three-dimensional pointer 31 is to be displayed, and supplies information indicating that position to the display image processing unit 228. The three-dimensional image recording unit 227 records a three-dimensional image, and supplies the recorded three-dimensional image to the display image processing unit 228 as necessary.

  The display image processing unit 228 generates a display image from the information, supplied from the coordinate calculation unit 226, indicating the position at which the three-dimensional pointer 31 is to be displayed and from the three-dimensional image supplied from the three-dimensional image recording unit 227, and supplies the generated display image to the projector 25.

  By the way, when the user 32 operates the image processing device 24 and instructs it to display a display image on the screen 21 in order to give a presentation, the display image processing unit 228 supplies the three-dimensional image designated by the user 32 to the projector 25 as the display image and instructs the projector 25 to display it. Then, under the control of the display image processing unit 228 of the image processing device 24, the projector 25 projects light for displaying the supplied display image onto the screen 21, thereby displaying the display image on the screen 21.

  When the display image is displayed on the screen 21, the user 32 performs a presentation while operating the laser pointer 22 and moving the three-dimensional pointer 31.

  When a display image is displayed on the screen 21, the presentation system 11 starts display processing, which is processing for displaying a display image that reflects the operation of the laser pointer 22 by the user 32. Hereinafter, the display processing by the presentation system 11 will be described with reference to the flowchart of FIG. 11.

  In step S11, the camera 23 images the screen 21 and its surrounding area, and supplies the captured image obtained as a result to the color synchronization processing unit 221 of the image processing device 24. That is, the lens 181 condenses incident light and makes it incident on the CCD image sensor 183, and the CCD image sensor 183 generates a captured image by photoelectrically converting the light incident from the lens 181 and supplies the generated captured image to the correlated double sampling circuit 184. The correlated double sampling circuit 184 removes noise from the captured image by sampling the captured image from the CCD image sensor 183, and supplies the captured image from which noise has been removed to the A/D converter 185. The A/D converter 185 converts the captured image from the correlated double sampling circuit 184 from an analog signal to a digital signal and supplies the converted signal to the DSP 186.

  The DSP 186 performs image processing, such as gradation compression processing, on the captured image supplied from the A/D converter 185 as necessary, and supplies the captured image subjected to the image processing to the color synchronization processing unit 221 of the image processing device 24 via the bus 193.

  In step S12, the image processing device 24 performs generation processing. Although details of the generation processing will be described later, in this processing the image processing device 24 uses the captured image supplied from the camera 23 to specify the display position at which the three-dimensional pointer 31 is to be displayed, and generates a display image in which the three-dimensional pointer 31 is displayed at that position. When the display image is generated, the image processing device 24 supplies the generated display image to the projector 25.

  In step S13, the image processing device 24 controls the projector 25 to display the display image on the screen 21. Based on the display image supplied from the image processing device 24, the projector 25 projects light for displaying the display image onto the screen 21, thereby displaying the display image on the screen 21. As a result, the display on the screen 21 is switched from the display image displayed so far to a new display image reflecting the operation of the user 32.

  In step S14, the image processing device 24 determines whether or not to end the process of displaying the display image. For example, when the user 32 operates the image processing device 24 and gives an instruction to end the display of the display image, the image processing device 24 determines to end the processing.

  If it is determined in step S14 that the process is not terminated, the process returns to step S11, and the above-described process is repeated. That is, a display image to be newly displayed is generated by the image processing device 24, and the generated display image is displayed on the screen 21 by the projector 25.

  On the other hand, when it is determined in step S14 that the process is to be ended, each device of the presentation system 11 ends the process being performed, and the display process is ended.

  In this way, the presentation system 11 captures the screen 21 and the surrounding area, and generates a display image reflecting the operation by the user 32 from the captured image obtained as a result. Then, the newly generated display image is displayed on the screen 21.
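
  As a rough sketch of the loop of steps S11 to S14, assuming hypothetical helper callables standing in for the camera 23, the generation processing of the image processing device 24, the projector 25, and the end check:

from typing import Callable

def display_process(capture_image: Callable[[], object],
                    generate_display_image: Callable[[object], object],
                    project_display_image: Callable[[object], None],
                    end_requested: Callable[[], bool]) -> None:
    """Sketch of steps S11 to S14: capture, generate, display, repeat until ended."""
    while True:
        captured = capture_image()                         # S11: camera 23 images the screen 21
        display_image = generate_display_image(captured)   # S12: generation processing
        project_display_image(display_image)               # S13: projector 25 shows the result
        if end_requested():                                # S14: user 32 instructed to end?
            break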

  Next, generation processing that is processing corresponding to the processing in step S12 in FIG. 11 will be described with reference to the flowchart in FIG.

  In step S41, the color synchronization processing unit 221 performs the color synchronization processing on the captured image supplied from the camera 23, and supplies the captured image subjected to the color synchronization processing to the geometric conversion processing unit 222.

  The color synchronization processing unit 221 performs the color synchronization processing by interpolating the pixel value of each color using the pixel values of the surrounding pixels of the captured image. As a result, a captured image is obtained in which each pixel has pixel values of R (red), G (green), B (blue), the color of the wavelength of the position indicating light, the color of the wavelength of the forward indicating light, and the color of the wavelength of the reverse indicating light. This color synchronization processing is also called demosaic processing.
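
  As an illustration only, the sketch below performs a very simple color synchronization by nearest-neighbour filling of each of the six bands; the interpolation actually used is described in the following paragraphs, and the array names and channel numbering here are assumptions.

import numpy as np
from scipy.ndimage import distance_transform_edt

def demosaic_nearest(mosaic: np.ndarray, channel_map: np.ndarray, n_channels: int = 6) -> np.ndarray:
    """Very simple color synchronization: for each of the six bands (R, G, B,
    position-light, forward-light, reverse-light wavelengths), fill the pixels
    that did not sample that band with the value of the nearest pixel that did.
    `mosaic` is the raw single-plane image and `channel_map[i, j]` says which
    band pixel (i, j) sampled. Nearest-neighbour filling here only stands in
    for the more elaborate interpolation described in the text."""
    h, w = mosaic.shape
    out = np.zeros((h, w, n_channels), dtype=mosaic.dtype)
    for c in range(n_channels):
        sampled = channel_map == c
        if not sampled.any():
            continue  # this band was not sampled anywhere; leave it as zeros
        # indices of the nearest sampled pixel for every position
        _, (iy, ix) = distance_transform_edt(~sampled, return_indices=True)
        out[..., c] = mosaic[iy, ix]
    return out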

  More specifically, for example, the color synchronization processing unit 221 performs the color synchronization processing using the technique described in Japanese Patent Laid-Open No. 2003-230159. That is, the color synchronization processing unit 221 generates a color difference signal from the R, G, and B pixel values of the captured image, and generates, from the captured image and the color difference signal, a luminance image in which every pixel has a luminance component. Then, based on the color difference signal, the color synchronization processing unit 221 converts the luminance image into a captured image in which each pixel has R, G, and B pixel values, and corrects the R and B pixel values based on the G pixel values.

  Further, the color synchronization processing unit 221 obtains, for each pixel, the pixel value of the color of the wavelength of the position indicating light, the pixel value of the color of the wavelength of the forward indicating light, and the pixel value of the color of the wavelength of the reverse indicating light, using the R, G, and B pixel values of each pixel and the pixel values of the colors of the wavelengths of the position indicating light, the forward indicating light, and the reverse indicating light of the predetermined pixels. As a result, the captured image becomes an image in which each pixel has pixel values of R, G, B, the color of the wavelength of the position indicating light, the color of the wavelength of the forward indicating light, and the color of the wavelength of the reverse indicating light.

  Alternatively, for example, the color synchronization processing unit 221 may perform the color synchronization processing using the technique described in Japanese Patent Publication No. 61-502424. In that case, the color synchronization processing unit 221 obtains the value of the hue component of each pixel by obtaining hue component values from the R, G, and B pixel values of the pixels of the captured image and performing linear interpolation. Then, the color synchronization processing unit 221 obtains the R, G, and B pixel values again from the hue component value of each pixel.

  Further, the color synchronization processing unit 221 obtains, for each pixel, the pixel value of the color of the wavelength of the position indicating light, the pixel value of the color of the wavelength of the forward indicating light, and the pixel value of the color of the wavelength of the reverse indicating light, using the R, G, and B pixel values of each pixel and the pixel values of the colors of the wavelengths of the position indicating light, the forward indicating light, and the reverse indicating light of the predetermined pixels.

  When the captured image subjected to the color synchronization processing is supplied from the color synchronization processing unit 221 to the geometric conversion processing unit 222, in step S42 the geometric conversion processing unit 222 performs geometric conversion processing on the captured image supplied from the color synchronization processing unit 221.

  For example, as shown in FIG. 13, the projector 25 is placed at a position facing the front of the screen 21, that is, at a position from which it can project the light for displaying the display image onto the screen 21 from the front.

  On the other hand, the camera 23 is arranged at a position from which an image of almost the entire screen 21 can be captured. In FIG. 13, the camera 23 is disposed at the lower right position in the drawing with respect to the screen 21. Therefore, the rectangular display image displayed on the screen 21 has a trapezoidal shape on the captured image, for example, as shown in FIG. 14.

  The captured image 251 shown in FIG. 14 includes the display image 252 displayed on the screen 21. The display image 252 is a trapezoidal region having the four points 253-1 to 253-4 as its vertices. That is, since the camera 23 images the screen 21 from an oblique direction, the display image, which is rectangular on the screen 21, appears as a trapezoid on the captured image 251 of FIG. 14.

  Therefore, the geometric transformation processing unit 222 first detects the area of the display image 252 from the captured image 251 by performing edge detection on the captured image 251 of FIG. 14. The geometric transformation processing unit 222 then uses the points 253-1 to 253-4, which are the four vertices of the detected display image 252, as feature points, calculates parameters for transforming the trapezoidal region having the points 253-1 to 253-4 as vertices into a rectangular region, and performs geometric transformation processing on the captured image based on the calculated parameters. Further, the geometric transformation processing unit 222 extracts, as the extracted image, the rectangular area having the points 253-1 to 253-4 as vertices from the captured image 251 subjected to the geometric transformation processing.
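
  A minimal sketch of this geometric conversion, assuming the four detected feature points are available and using OpenCV's perspective transform as one possible implementation (the output size is an arbitrary choice, not taken from the text):

import cv2
import numpy as np

def rectify_display_area(captured: np.ndarray, corners: np.ndarray,
                         out_w: int = 1024, out_h: int = 768) -> np.ndarray:
    """Map the trapezoidal display-image region of the captured image, given by
    its four corners in the order top-left, top-right, bottom-right, bottom-left,
    onto an out_w x out_h rectangle (the extracted image)."""
    src = corners.astype(np.float32)
    dst = np.array([[0, 0], [out_w - 1, 0],
                    [out_w - 1, out_h - 1], [0, out_h - 1]], dtype=np.float32)
    # parameters of the geometric transformation from the trapezoid to the rectangle
    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(captured, homography, (out_w, out_h))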

  Then, the geometric transformation processing unit 222 separates the extracted image extracted from the captured image into the position instruction light image, the forward instruction light image, and the reverse instruction light image, and supplies the separated position instruction light image, forward instruction light image, and reverse instruction light image to the position indicating light detection unit 223, the forward instruction light detection unit 224, and the reverse instruction light detection unit 225, respectively.

  For example, assume that the display image shown in FIG. 15A is displayed on the screen 21 when the camera 23 captures a captured image. In the display image shown in FIG. 15A, three objects are displayed, and a three-dimensional pointer 31 is displayed on the leftmost object in the figure. Further, for example, a position indication light image shown in FIG. 15B is obtained from a captured image obtained by capturing when the display image is displayed.

  In the position indicating light image shown in FIG. 15B, an elliptical image 281 of the position indicating light and an elliptical image 282 of the auxiliary light appear on the left side in the drawing. In this position indicating light image, since the auxiliary light image 282 is located at the upper left of the position indicating light image 281 in the figure, it can be seen that the position indicating light is irradiated onto the display image on the screen 21 from the lower right direction.

  Furthermore, when the display image shown in FIG. 15A is displayed on the screen 21 and the user 32 instructs to move the three-dimensional pointer 31 in the depth direction, the laser pointer 22 irradiates the display image with the forward instruction light. Therefore, for example, a forward instruction light image shown in FIG. 15C is obtained from the captured image.

  In FIG. 15C, parts corresponding to those in FIG. 15B are denoted by the same reference numerals, and description thereof is omitted. Further, since the wavelength of the position indicating light and the auxiliary light are different from the wavelength of the forward indicating light, the position indicating light image 281 and the auxiliary light image 282 are not actually displayed in the forward indicating light image.

  In FIG. 15C, the image 291 of the forward instruction light appears on the right side of the position indicating light image 281 in the figure. That is, since the image 291 of the forward instruction light is detected from the forward instruction light image, it is determined that movement of the three-dimensional pointer 31 in the depth direction has been instructed.

  When the display image shown in FIG. 15A is displayed on the screen 21 and the user 32 instructs to move the three-dimensional pointer 31 in the forward direction, the display image is irradiated with the backward instruction light from the laser pointer 22. Therefore, for example, a reverse instruction light image shown in FIG. 15D is obtained from the captured image.

  In FIG. 15D, parts corresponding to those in FIG. 15B are denoted by the same reference numerals, and description thereof is omitted. Further, since the wavelength of the position indicating light and the auxiliary light is different from the wavelength of the reverse indicating light, the position indicating light image 281 and the auxiliary light image 282 are not actually displayed in the reverse indicating light image.

  In FIG. 15D, the image 301 of the reverse instruction light appears on the left side of the position indicating light image 281. That is, since the image 301 of the reverse instruction light is detected from the reverse instruction light image, it is determined that movement of the three-dimensional pointer 31 in the front direction has been instructed.

  Thus, in the geometric transformation processing unit 222, each of the position instruction light image, the forward instruction light image, and the reverse instruction light image is separated from the extracted image.

  Returning to the description of the flowchart in FIG. 12, when the position instruction light image, the forward instruction light image, and the reverse instruction light image are obtained from the captured image in step S42, the position indicating light detection unit 223 performs center-of-gravity position detection processing in step S43. Although details of the center-of-gravity position detection processing will be described later, in this processing the position indicating light detection unit 223 obtains, from the position indicating light image, the number K of pixels, the barycentric coordinates, the long axis intersection coordinates, and the short axis intersection coordinates. Then, the position indicating light detection unit 223 supplies the number K of pixels and the barycentric coordinates to the forward instruction light detection unit 224 and the reverse instruction light detection unit 225, and supplies the barycentric coordinates, the long axis intersection coordinates, and the short axis intersection coordinates to the coordinate calculation unit 226.

  In step S44, the forward instruction light detection unit 224 performs forward instruction light detection processing. That is, the forward instruction light detection unit 224 detects an image of the forward instruction light from the forward instruction light image supplied from the geometric transformation processing unit 222 and supplies a forward flag as a detection result to the coordinate calculation unit 226.

  In step S45, the reverse instruction light detection unit 225 performs a reverse instruction light detection process. That is, the reverse instruction light detection unit 225 detects an image of the reverse instruction light from the reverse instruction light image supplied from the geometric transformation processing unit 222 and supplies a reverse flag as a detection result to the coordinate calculation unit 226. The details of the forward instruction light detection process and the reverse instruction light detection process will be described later.

  In step S46, the coordinate calculation unit 226 performs three-dimensional pointer position calculation processing. Although details of this processing will be described later, in the three-dimensional pointer position calculation processing the coordinate calculation unit 226 obtains the display position of the three-dimensional pointer 31 and supplies information indicating that display position to the display image processing unit 228. Here, as the display position of the three-dimensional pointer 31, for example, coordinates indicating a position in the virtual three-dimensional space assumed for the display image are obtained, and information indicating those coordinates in the virtual three-dimensional space is supplied from the coordinate calculation unit 226 to the display image processing unit 228.

  In step S47, the display image processing unit 228 generates a display image based on the three-dimensional image recorded in the three-dimensional image recording unit 227 and the information indicating the display position of the three-dimensional pointer 31 supplied from the coordinate calculation unit 226.

  That is, the display image processing unit 228 generates, as the display image, an image in which the three-dimensional pointer 31 is displayed at the position on the three-dimensional image corresponding to the position in the virtual three-dimensional space specified by the information supplied from the coordinate calculation unit 226. Here, the position on the three-dimensional image at which the three-dimensional pointer 31 is displayed is the position, on the three-dimensional image displayed on the two-dimensional plane, that corresponds to the display position of the three-dimensional pointer 31 in the virtual three-dimensional space.

  When the display image is generated, the display image processing unit 228 supplies the generated display image to the projector 25 and instructs the projector 25 to display it. When the display image has been supplied to the projector 25, the process proceeds to step S13 in FIG. 11, and the generation processing ends.

  In this way, the image processing device 24 separates the position instruction light image, the forward instruction light image, and the reverse instruction light image from the captured image. Then, the image processing device 24 specifies the display position of the three-dimensional pointer 31 using the position instruction light image, the forward instruction light image, and the reverse instruction light image, and generates a display image in which the three-dimensional pointer 31 is displayed at that display position.

  That is, the image processing device 24 specifies the direction in which the three-dimensional pointer 31 is to be moved by detecting the image of the forward instruction light and the image of the reverse instruction light from the captured image, and further specifies the position at which the three-dimensional pointer 31 is to be displayed from the specified direction and from the position and shape of the image of the position indicating light.

  As described above, the display position of the three-dimensional pointer 31 is obtained using the position instruction light image, the forward instruction light image, and the reverse instruction light image, and the display image is generated, so that the display position of the three-dimensional pointer 31 can be specified. As a result, simply by operating the laser pointer 22 to irradiate the display image with invisible light, the user 32 can easily specify a position in the virtual three-dimensional space assumed for the image displayed on the two-dimensional plane.

  Conventionally, a magnetic sensor, a magnetic marker, or an ultrasonic sensor has been used to designate a position in the virtual three-dimensional space assumed for a display image. However, these sensors and markers are expensive. In addition, the range over which the sensor performs sensing must be changed depending on the size of the display image, so the magnetic sensor or ultrasonic sensor used in the presentation system had to be selected according to the size of the display image to be displayed.

  In contrast, in the presentation system 11, the invisible light emitted from the laser pointer 22 is imaged, and the display position of the three-dimensional pointer 31 is specified using the captured image obtained as a result. Therefore, a position in the virtual three-dimensional space can be specified easily and reliably, without being affected by the environment and irrespective of the size of the display image.

  In addition, the presentation system 11 can use existing devices as the screen 21 and the projector 25. The multi-spectral camera serving as the camera 23 can be created easily and inexpensively simply by replacing the on-chip color filter arranged on the light receiving surface of the image sensor of a general color camera with the on-chip color filter shown in FIG. 9, and can therefore be expected to be obtained at roughly the same cost as a normal color camera owing to the effect of mass production.

  Furthermore, although the laser pointer 22 is a laser pointer that emits invisible light, laser pointers for invisible light are already sold as products, and their price is not particularly high compared with laser pointers for visible light.

  Therefore, by using an existing apparatus, the presentation system 11 can be realized at a low cost without reducing versatility.

  Next, the center-of-gravity position detection process, which is a process corresponding to the process of step S43 of FIG. 12, will be described with reference to the flowchart of FIG. 16.

  When the position indicating light image is supplied from the geometric transformation processing unit 222 to the position indicating light detection unit 223, in step S71 the position indicating light detection unit 223 detects, from among the pixels of the position indicating light image, pixels whose pixel values are equal to or greater than a predetermined threshold th1.

  In step S72, the position indicating light detection unit 223 performs clustering on the detected pixels of the position indicating light image. That is, the position indicating light detection unit 223 sets, as one cluster, a region on the position indicating light image composed of a plurality of mutually adjacent pixels whose pixel values are equal to or greater than the threshold th1, that is, a set of such pixels.

  As a result, for example, the region of the position indicating light image 281 and the region of the auxiliary light image 282 shown in FIG. 15B are detected as clusters from the position indicating light image. That is, an area detected as a cluster from the position indicating light image is an area of the screen 21 irradiated with infrared light of the same wavelength as the position indicating light and the auxiliary light, and since most of the light projected from the projector 25 is visible light, it is normally expected that two clusters are detected from the position indicating light image.

  Further, when the screen 21 is irradiated with infrared light as noise due to some influence, three or more clusters may be detected from the position indicating light image. However, it is rare for the screen 21 to be irradiated, as noise, with infrared light having a beam diameter larger than that of the position indicating light or the auxiliary light. Therefore, the largest of the detected clusters is predicted to be the image of the position indicating light irradiated onto the screen 21 from the laser pointer 22, and the second largest cluster is predicted to be the image of the auxiliary light.

  In step S73, the position indicating light detection unit 223 obtains the position of the center of gravity of the second largest cluster, that is, the cluster composed of the second largest number of pixels, among the clusters obtained by the clustering.

  For example, the position indicating light detection unit 223 obtains the coordinates (xj, yj) of the position of the center of gravity of the second largest cluster in an xy coordinate system whose reference point is the position on the position indicating light image corresponding to the reference point O of the virtual three-dimensional space assumed for the display image.

  Here, since the reference point O of the virtual three-dimensional space assumed for the display image is, for example, the lower left vertex of the display image 141 in FIG. 5, the lower left vertex of the position indicating light image shown in FIG. 15B is set as the reference point of the xy coordinate system. Further, the x-axis direction and the y-axis direction of the xy coordinate system for the position indicating light image are the same as the x-axis direction and the y-axis direction of the virtual three-dimensional space assumed for the display image. Accordingly, in FIG. 15B, the right direction is the x-axis direction and the upward direction is the y-axis direction.

  In step S74, the position indicating light detection unit 223 obtains the position of the center of gravity of the largest cluster among the clusters obtained by performing clustering, and the number of pixels constituting the cluster.

  For example, as shown in FIG. 17, it is assumed that the cluster C1 and the cluster C2 are detected from the position indicating light image as a result of the clustering performed on the position indicating light image. Here, since the beam diameter of the position indicating light is larger than that of the auxiliary light, the image of the position indicating light is predicted to be larger than the image of the auxiliary light. Therefore, the position indicating light detection unit 223 regards the larger cluster C1 of the two clusters detected from the position indicating light image as the image of the position indicating light, and the smaller cluster C2 as the image of the auxiliary light.

  Then, the position indicating light detection unit 223 first obtains the coordinates of the position of the center of gravity of the smaller cluster C2 as the barycentric coordinates of the image of the auxiliary light in the position indicating light image. For example, the position indicating light detection unit 223 obtains the center of gravity of the coordinates of the pixels constituting the cluster C2 as the coordinates of the center of gravity of the cluster C2. As a result, the coordinates (xj, yj) of the point 331 are obtained as the coordinates of the center of gravity of the cluster C2.

  Further, the position indicating light detection unit 223 obtains the coordinates of the position of the center of gravity of the larger cluster C1 as the barycentric coordinates of the image of the position indicating light in the position indicating light image. As a result, the coordinates (xi, yi) of the point 332 are obtained as the coordinates of the center of gravity of the cluster C1. The coordinates (xi, yi) of the point 332 indicate the position of the center of gravity, on the display image, of the image of the position indicating light irradiated onto the display image. Further, the position indicating light detection unit 223 obtains the number of pixels constituting the cluster C1, that is, the number of pixels in the area of the cluster C1, as the number K of pixels of the image of the position indicating light.
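
  A sketch of steps S71 to S74, assuming SciPy's connected-component labelling as the clustering step and a NumPy array for the position indicating light image (image rows grow downward, so the coordinate convention here is only illustrative):

import numpy as np
from scipy import ndimage

def detect_clusters(position_light_image: np.ndarray, th1: float):
    """Steps S71-S74 sketch: threshold the position indicating light image,
    group adjacent bright pixels into clusters, and return
    (K, (xi, yi), (xj, yj)) -- the pixel count and centroid of the largest
    cluster (position indicating light) and the centroid of the second
    largest cluster (auxiliary light). Returns None if fewer than two
    clusters are found."""
    mask = position_light_image >= th1                  # S71: pixels at or above th1
    labels, n = ndimage.label(mask)                      # S72: clustering of adjacent pixels
    if n < 2:
        return None
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    order = np.argsort(sizes)[::-1]                      # clusters sorted by size, largest first
    largest, second = order[0] + 1, order[1] + 1
    # centroids: ndimage.center_of_mass returns (row, col) = (y, x)
    yi, xi = ndimage.center_of_mass(mask, labels, largest)   # S74: position indicating light
    yj, xj = ndimage.center_of_mass(mask, labels, second)    # S73: auxiliary light
    K = int(sizes[order[0]])                             # S74: pixel count of the largest cluster
    return K, (xi, yi), (xj, yj)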

  Returning to the description of the flowchart of FIG. 16, the position indicating light detection unit 223 supplies the barycentric coordinates (xi, yi) of the image of the position indicating light and the number K of pixels of that image to the forward instruction light detection unit 224 and the reverse instruction light detection unit 225.

  In step S75, the position indicating light detection unit 223 performs edge detection processing on the largest cluster among the clusters detected from the position indicating light image.

  For example, the position indicating light detection unit 223 performs the edge detection processing on the cluster C1 in FIG. 17 and detects the contour of the cluster C1. As a result, for example, the ellipse 341 shown in FIG. 18 is detected as the contour of the cluster C1. This ellipse 341 is the contour of the image of the position indicating light irradiated onto the screen 21. In other words, the shape, on the display image, of the image of the position indicating light irradiated onto the display image is detected by the edge detection processing. Hereinafter, the ellipse 341 is also referred to as the contour 341 of the cluster C1.

  When the contour of the largest cluster is detected, in step S76 the position indicating light detection unit 223 obtains the long axis intersection coordinates (xf, yf) for the contour of the cluster detected by the edge detection processing. For example, the position indicating light detection unit 223 detects the points on the contour 341 of the cluster C1 in FIG. 18 that are farthest from the barycentric point 332 of the cluster C1, that is, the points at the maximum distance from the point 332. In the example of FIG. 18, since the contour 341 is an ellipse, two points are detected as the points at the maximum distance from the point 332.

  Then, the position indicating light detection unit 223 obtains, as the long axis intersection coordinates (xf, yf), the coordinates of the point 342 that, of the two points farthest from the point 332, is located on the point 331 side, that is, on the side of the center of gravity of the cluster C2.

  When the contour 341 is a perfect circle, the distances from the point 332 to all points on the contour 341 are equal; however, since the long axis intersection point is the point on the contour 341 that is farthest from the barycentric point 332 and lies on the point 331 side, in this case the coordinates of the intersection of the contour 341 and the straight line connecting the point 331 and the point 332 are taken as the long axis intersection coordinates.

  In step S77, the position indicating light detection unit 223 obtains a straight line that passes through the barycentric point of the largest cluster and is orthogonal to the straight line connecting that point and the point of the long axis intersection coordinates, and obtains, as the short axis intersection coordinates, the coordinates of the point on that straight line located in a predetermined direction with respect to the barycentric point of the cluster. Then, the position indicating light detection unit 223 supplies the obtained barycentric coordinates (xi, yi), long axis intersection coordinates (xf, yf), and short axis intersection coordinates (xn, yn) to the coordinate calculation unit 226 as the coordinate information, the process proceeds to step S44 in FIG. 12, and the center-of-gravity position detection process ends.

  For example, as shown in FIG. 19, the position indicating light detection unit 223 obtains the straight line M1 connecting (passing through) the barycentric point 332 of the cluster C1 and the point 342 of the long axis intersection coordinates. The position indicating light detection unit 223 then obtains the straight line N1 that passes through the center of gravity 332 of the cluster C1 and is orthogonal to the straight line M1, and obtains the coordinates of the position of the point 351 as the short axis intersection coordinates (xn, yn). That is, since the minor axis of the elliptical contour 341 is orthogonal to the major axis, the straight line N1, which passes through the point 332 and is orthogonal to the straight line M1 obtained by extending the major axis, can be said to be a straight line obtained by extending the minor axis. Therefore, the position indicating light detection unit 223 takes the coordinates of the intersection of the straight line N1 and the contour 341 as the short axis intersection coordinates.
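
  A sketch of steps S75 to S77, assuming the contour of the largest cluster has already been extracted as an array of (x, y) points; it picks the major axis endpoint on the auxiliary-light side and a minor axis intersection in the direction orthogonal to the major axis:

import numpy as np

def axis_intersections(contour: np.ndarray, centroid: np.ndarray, aux_centroid: np.ndarray):
    """Sketch of steps S75-S77. `contour` is an (N, 2) float array of (x, y)
    points on the outline of the image of the position indicating light (the
    ellipse 341), `centroid` is (xi, yi) and `aux_centroid` is (xj, yj).
    Returns the long axis intersection (xf, yf) on the auxiliary-light side
    and the short axis intersection (xn, yn)."""
    rel = contour - centroid
    dist = np.linalg.norm(rel, axis=1)
    # the two candidate endpoints of the major axis are the contour points
    # farthest from the centroid in opposite directions
    axis_dir = rel[np.argmax(dist)] / dist.max()
    cand_plus = contour[np.argmax(rel @ axis_dir)]
    cand_minus = contour[np.argmin(rel @ axis_dir)]
    # S76: keep the endpoint on the auxiliary-light (point 331) side
    if np.linalg.norm(cand_plus - aux_centroid) <= np.linalg.norm(cand_minus - aux_centroid):
        far = cand_plus
    else:
        far = cand_minus
    major_dir = (far - centroid) / np.linalg.norm(far - centroid)
    # S77: short axis intersection = contour point in the direction orthogonal to the major axis
    minor_dir = np.array([-major_dir[1], major_dir[0]])
    near = contour[np.argmax(rel @ minor_dir)]
    return far, near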

  Since the position and shape of the image of the position indicating light can be known from the barycentric coordinates, the long axis intersection coordinates, and the short axis intersection coordinates obtained as described above, it can be said that the position indicating light detection unit 223 detects the position and shape of the image of the position indicating light from the position indicating light image.

  In this way, the position indicating light detection unit 223 detects, from the position indicating light image, the position and shape of the image of the position indicating light on the display image. By detecting the position and shape of the image of the position indicating light in this way, the position at which the three-dimensional pointer 31 should be displayed on the display image can be specified. In particular, since the position indicating light has a wavelength different from that of other light, that is, the visible light for displaying the display image, the forward indicating light, and the reverse indicating light, images of those lights are not detected from the position indicating light image, and only the image of the position indicating light is detected. As a result, the image of the position indicating light can be easily detected from the position indicating light image.

  Next, with reference to the flowchart of FIG. 20, the forward instruction light detection process, which is a process corresponding to the process of step S44 of FIG. 12, will be described.

  In step S101, the forward instruction light detection unit 224 detects, from among the pixels within a circular area of a predetermined radius r centered on the point of the barycentric coordinates (xi, yi) supplied from the position indicating light detection unit 223 in the forward instruction light image supplied from the geometric transformation processing unit 222, pixels whose pixel values are equal to or greater than a predetermined threshold th2.

  Here, the coordinate system in the forward instruction light image has a position corresponding to the reference point O in the virtual three-dimensional space assumed for the display image as a reference point, and the x-axis direction and y in the coordinate system in the forward instruction light image. The axial direction is the same as the x-axis direction and the y-axis direction of the virtual three-dimensional space of the display image.

  In step S102, the forward instruction light detection unit 224 performs clustering on the detected pixels of the forward instruction light image. That is, the forward instruction light detection unit 224 sets a region on the forward instruction light image that is adjacent to each other and includes a plurality of pixels having a pixel value equal to or greater than the threshold th2 as one cluster.

  In the laser pointer 22, the invisible light laser 84 that emits the forward instruction light is provided in the vicinity of the invisible light laser 81 that emits the position indicating light. Therefore, on the display image, the image of the forward instruction light is predicted to be at a position close to the image of the position indicating light. Accordingly, an appropriate value of the radius r with which the image of the forward instruction light can be reliably detected around the image of the position indicating light is determined, and, as shown in FIG. 21, by limiting the detection of pixels whose pixel values are equal to or greater than the threshold th2, that is, the detection of the image of the forward instruction light, to the area within the circle 381 of radius r centered on the barycentric point 332 of the image of the position indicating light, the image of the forward instruction light can be detected more efficiently and reliably.

  In FIG. 21, the same reference numerals are given to the portions corresponding to those in FIG. 19, and the description thereof is omitted. Further, the contour 341 of the image of the position instruction light is not actually displayed in the forward instruction light image.

  In FIG. 21, as a result of the clustering of pixels whose pixel values are equal to or greater than the threshold th2, an elliptical cluster C21 is detected at the upper right of the point 332 in the figure. Since the cluster C21 is almost the same size as the contour 341, that is, the image of the position indicating light, and no other cluster is detected within the circle 381, the cluster C21 is predicted to be the image of the forward instruction light.

  Returning to the description of the flowchart of FIG. 20, in step S103, the forward instruction light detection unit 224 obtains the number of pixels R1 of the largest cluster among the clusters detected as a result of clustering. That is, the number of pixels constituting the largest cluster is set as the pixel number R1.

  In step S104, the forward instruction light detection unit 224 determines whether or not the absolute value of the difference between the obtained pixel count R1 and the pixel count K supplied from the position indicating light detection unit 223 is smaller than a predetermined threshold th3.

  That is, the forward instruction light detection unit 224 determines whether or not |R1 − K| < th3. Here, |R1 − K| represents the absolute value of the difference between the pixel count R1 and the pixel count K.

  If it is determined in step S104 that |R1 − K| < th3, the size of the cluster detected from the forward instruction light image and the size of the image of the position indicating light are substantially the same, so the detected cluster is regarded as the image of the forward instruction light, and the process proceeds to step S105. In step S105, the forward instruction light detection unit 224 sets the forward flag to “true”, and the process proceeds to step S107. Here, a forward flag of “true” indicates that the image of the forward instruction light has been detected from the forward instruction light image and that movement of the three-dimensional pointer 31 in the depth direction has been instructed.

  On the other hand, if it is determined in step S104 that |R1 − K| < th3 does not hold, the cluster detected from the forward instruction light image has a size significantly different from that of the image of the position indicating light, or no cluster has been detected from the forward instruction light image, so it is determined that the image of the forward instruction light has not been detected from the forward instruction light image, and the process proceeds to step S106.

  In step S106, the forward instruction light detection unit 224 sets the forward flag to “false”, and the process proceeds to step S107. Here, a forward flag of “false” indicates that the image of the forward instruction light has not been detected from the forward instruction light image and that movement of the three-dimensional pointer 31 in the depth direction has not been instructed.

  In step S107, the forward instruction light detection unit 224 outputs the forward flag set to either “true” or “false”, the process proceeds to step S45 in FIG. 12, and the forward instruction light detection process ends. As a result, the forward flag is supplied from the forward instruction light detection unit 224 to the coordinate calculation unit 226.
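
  A sketch of steps S101 to S107 under the same assumptions as the earlier sketches (NumPy image, SciPy labelling); r, th2, and th3 are the radius and thresholds referred to above, whose concrete values are not given in the text:

import numpy as np
from scipy import ndimage

def detect_forward_light(forward_image: np.ndarray, centroid_xy: tuple, K: int,
                         r: float, th2: float, th3: int) -> bool:
    """Steps S101-S107 sketch: returns the forward flag, True when an image of
    the forward instruction light about as large as the image of the position
    indicating light is found near the barycentric coordinates (xi, yi)."""
    h, w = forward_image.shape
    xi, yi = centroid_xy
    ys, xs = np.mgrid[0:h, 0:w]
    in_circle = (xs - xi) ** 2 + (ys - yi) ** 2 <= r ** 2   # S101: circle 381 of radius r
    mask = in_circle & (forward_image >= th2)               # pixels at or above th2 inside it
    labels, n = ndimage.label(mask)                          # S102: clustering
    if n == 0:
        return False                                         # S106: no cluster -> flag "false"
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    R1 = int(sizes.max())                                    # S103: pixel count of largest cluster
    return abs(R1 - K) < th3                                 # S104/S105: |R1 - K| < th3 -> "true"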

  In this way, the forward instruction light detection unit 224 detects the image of the forward instruction light from the forward instruction light image and, according to the detection result, outputs a forward flag indicating whether or not movement of the three-dimensional pointer 31 in the depth direction has been instructed.

  By detecting the image of the forward instruction light in this way, it is possible to easily know whether or not movement of the three-dimensional pointer 31 in the depth direction has been instructed. In particular, since the forward instruction light has a wavelength different from that of other light such as the visible light for displaying the display image, the position indicating light, and the reverse indicating light, images of those lights are not detected from the forward instruction light image, and only the image of the forward instruction light is detected. As a result, the image of the forward instruction light can be easily detected from the forward instruction light image.

  When the forward flag is output from the forward instruction light detection unit 224, a reverse instruction light detection process that is a process corresponding to the process of step S45 of FIG. 12 is performed. Hereinafter, the reverse instruction light detection process will be described with reference to the flowchart of FIG. Note that each of the processes in steps S131 to S133 is the same as the processes in steps S101 to S103 in FIG. 20, and a description thereof will be omitted as appropriate.

  That is, in the processing from step S131 to step S133, the reverse instruction light detection unit 225 detects, for example as shown in FIG. 23, pixels whose pixel values are equal to or greater than the threshold th2 within the circle 401 of radius r centered on the barycentric point 332 of the image of the position indicating light, and performs clustering on them. In FIG. 23, parts corresponding to those in FIG. 19 are denoted by the same reference numerals, and description thereof is omitted. Further, the contour 341 of the image of the position indicating light is not actually displayed in the reverse instruction light image.

  In FIG. 23, as a result of the clustering of pixels whose pixel values are equal to or greater than the threshold th2, an elliptical cluster C31 is detected at the lower left of the point 332 in the drawing. Since the cluster C31 is almost the same size as the contour 341, that is, the image of the position indicating light, and no other cluster is detected within the circle 401, the cluster C31 is predicted to be the image of the reverse instruction light. Further, the reverse instruction light detection unit 225 obtains the pixel count R2 of the largest cluster among the detected clusters.

  Returning to the description of the flowchart of FIG. 22, in step S134 the reverse instruction light detection unit 225 determines whether or not the absolute value of the difference between the obtained pixel count R2 and the pixel count K supplied from the position indicating light detection unit 223 is smaller than the predetermined threshold th3, that is, whether or not |R2 − K| < th3.

  If it is determined in step S134 that |R2 − K| < th3, the detected cluster is regarded as the image of the reverse instruction light, and the process proceeds to step S135. In step S135, the reverse instruction light detection unit 225 sets the reverse flag to “true”, and the process proceeds to step S137. Here, a reverse flag of “true” indicates that the image of the reverse instruction light has been detected from the reverse instruction light image and that movement of the three-dimensional pointer 31 in the front direction has been instructed.

  On the other hand, if it is determined in step S134 that |R2 − K| < th3 does not hold, it is determined that the image of the reverse instruction light has not been detected from the reverse instruction light image, and the process proceeds to step S136. In step S136, the reverse instruction light detection unit 225 sets the reverse flag to “false”, and the process proceeds to step S137. Here, a reverse flag of “false” indicates that movement of the three-dimensional pointer 31 in the front direction has not been instructed.

  In step S137, the reverse indicator light detection unit 225 outputs a reverse flag, the process proceeds to step S46 in FIG. 12, and the reverse indicator light detection process ends. As a result, the reverse flag is supplied from the reverse instruction light detection unit 225 to the coordinate calculation unit 226.

  In this manner, the reverse instruction light detection unit 225 detects the image of the reverse instruction light from the reverse instruction light image and, according to the detection result, outputs a reverse flag indicating whether or not movement of the three-dimensional pointer 31 in the front direction has been instructed.

  By detecting the image of the reverse instruction light in this way, it is possible to easily know whether or not movement of the three-dimensional pointer 31 in the front direction has been instructed. In particular, since the reverse instruction light has a wavelength different from that of other light such as the visible light for displaying the display image, the position indicating light, and the forward indicating light, images of those lights are not detected from the reverse instruction light image, and the image of the reverse instruction light can be easily detected.

  When the reverse flag is output from the reverse instruction light detection unit 225, a three-dimensional pointer position calculation process corresponding to the process of step S46 in FIG. 12 is performed. Hereinafter, this three-dimensional pointer position calculation processing will be described with reference to FIGS.

  In the three-dimensional pointer position calculation process, the coordinate calculation unit 226 obtains the coordinates, in the virtual three-dimensional space, of the display position at which the three-dimensional pointer 31 is to be displayed, using the coordinate information, the forward flag, and the reverse flag supplied from the position indicating light detection unit 223, the forward instruction light detection unit 224, and the reverse instruction light detection unit 225.

  First, the coordinate calculation unit 226 calculates the elevation angle α and the rotation angle β of the position indicating light in the virtual three-dimensional space. Here, the rotation angle β refers to the rotation angle of the position indicating light with respect to the xy plane, as shown in FIG. For example, as shown in FIG. 25, a center of gravity point 332 of the position indicating light image and a center of gravity point 331 of the auxiliary light image are detected from the position indicating light image. Note that, in FIG. 25, the same reference numerals are given to the portions corresponding to those in FIG. 19, and description thereof will be omitted as appropriate.

  If the position indicating light had no spread in the spatial direction, that is, if the beam diameter of the position indicating light were as close to 0 as possible, the position indicating light would travel on a plane that includes the points 331 and 332 and is perpendicular to the xy plane. Therefore, the angle formed by that plane and the yz plane is defined as the rotation angle β of the position indicating light.

  Therefore, if the vertical straight line 431 in the figure is a straight line parallel to the y axis of the virtual three-dimensional space and the horizontal straight line 432 is a straight line parallel to the x axis of the virtual three-dimensional space, the rotation angle β of the position indicating light is the angle formed by the straight line 431 and the major axis of the contour 341, which is an ellipse, of the image of the position indicating light.

  The coordinate calculation unit 226 obtains the length L of the major axis and the length D of the minor axis of the ellipse forming the contour 341 from the coordinates (xi, yi) of the point 332, the coordinates (xf, yf) of the point 342, and the coordinates (xn, yn) of the point 351 supplied as the coordinate information. Then, the coordinate calculation unit 226 obtains the rotation angle β from the relationship between the major axis of the contour 341 and the y axis. Here, the length L of the major axis is twice the length (distance) from the point 332 to the point 342, and the length D of the minor axis is twice the length (distance) from the point 332 to the point 351.

  Specifically, the coordinate calculation unit 226 calculates cosβ by calculating the following equation (1) from the coordinates (xi, yi) of the point 332 and the coordinates (xf, yf) of the point 342.

  Here, in equation (1), |yf − yi| represents the absolute value of the difference between yf and yi. When cos β is obtained in this way, the coordinate calculation unit 226 further obtains the rotation angle β by transforming equation (1) as shown in the following equation (2).

  In this way, the coordinate calculation unit 226 calculates the rotation angle β from the coordinates (xi, yi) of the point 332 and the coordinates (xf, yf) of the point 342. In other words, the coordinate calculation unit 226 determines the rotation angle β from the positional relationship between the image of the position indicating light and the image of the auxiliary light.
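
  Equations (1) and (2) themselves are not reproduced in this text; under the geometric description above (β is the angle between the major axis of the contour 341 and a straight line parallel to the y axis), one consistent reconstruction is cos β = |yf − yi| / √((xf − xi)² + (yf − yi)²), which the sketch below implements. Treat it as a reconstruction, not the literal equations.

import math

def rotation_angle_beta(xi: float, yi: float, xf: float, yf: float) -> float:
    """Rotation angle beta (radians) of the position indicating light,
    reconstructed from the geometric description: the angle between the y axis
    and the major axis running from the centroid (xi, yi) to the long axis
    intersection (xf, yf)."""
    half_major = math.hypot(xf - xi, yf - yi)   # L / 2
    cos_beta = abs(yf - yi) / half_major        # reconstruction of equation (1)
    return math.acos(cos_beta)                  # reconstruction of equation (2)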

  Further, the elevation angle α of the position indicating light in the virtual three-dimensional space is defined as the angle formed by the z axis and a straight line that is parallel to the optical path of the position indicating light and lies in a plane that includes the point 332 and the point 342 and is orthogonal to the xy plane. That is, the elevation angle α is the elevation angle of the position indicating light with respect to the z axis.

  For example, as shown in FIG. 26, it is assumed that the position indicating light is incident on the screen 21 that is the display surface of the display image so that the rotation angle β is zero. In FIG. 26, the vertical direction is the y-axis direction of the virtual three-dimensional space, the horizontal direction is the z-axis direction of the virtual three-dimensional space, and the display surface of the screen 21 is viewed in the x-axis direction.

  In FIG. 26, the position indicating light is incident on the display surface of the screen 21 in the direction indicated by the arrow 61, that is, from the lower right to the upper left in the drawing. Here, the straight line 461 in the vertical direction represents the display surface of the screen 21, and the straight line 462 and the straight line 463 represent position indication light.

  In other words, since the beam diameter of the position indicating light is not actually zero, the position indicating light, if it were transmitted through the display surface, would pass through the region between the straight line 462 and the straight line 463. Therefore, the distance between the straight line 462 and the straight line 463 is the beam diameter of the position indicating light.

  Further, if the straight line 464 is a straight line parallel to the z-axis of the virtual three-dimensional space, the elevation angle α of the position indicating light is the angle formed by the position indicating light and the z-axis, so the angle formed by the straight line 464 and the straight line 462 (or the straight line 463) is the elevation angle α.

  Here, the minor axis length of the ellipse that is the image of the position indicating light on the display surface, that is, the minor axis length D of the ellipse that is the contour of the image of the position indicating light in the position indicating light image, is always equal to the beam diameter of the position indicating light. Further, the major axis length of the ellipse that is the image of the position indicating light on the display surface, that is, the major axis length L of the ellipse that is the contour of the image of the position indicating light in the position indicating light image, is the length (distance) from the intersection of the straight line 462 with the straight line 461 to the intersection of the straight line 463 with the straight line 461.

  Furthermore, since the position indicating light is inclined by the elevation angle α with respect to the straight line 464, the major axis length L of the elliptical image of the position indicating light and the beam diameter of the position indicating light, that is, the minor axis length D of the image of the position indicating light, satisfy the relationship cos α = D / L. Therefore, the coordinate calculation unit 226 can obtain the elevation angle α of the position indicating light from the relationship between the major axis length L and the minor axis length D of the ellipse that is the contour of the image of the position indicating light in the position indicating light image. In other words, the coordinate calculation unit 226 determines the elevation angle α from the shape of the image of the position indicating light, that is, from the degree of distortion of the image of the position indicating light.

  More specifically, the coordinate calculation unit 226 obtains the major axis length L and the minor axis length D from the barycentric coordinates (xi, yi), the major axis intersection coordinates (xf, yf), and the minor axis intersection coordinates (xn, yn) of the position indicating light supplied as coordinate information, and then obtains cos α by calculating equation (3) from the obtained major axis length L and minor axis length D.

  Further, the coordinate calculation unit 226 obtains the elevation angle α by transforming Equation (3) as shown in the following Equation (4).
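  The bodies of equations (3) and (4) are likewise missing from this text; given that cos α = D / L, with L twice the distance from the point 332 to the point 342 and D twice the distance from the point 332 to the point 351, a plausible reconstruction is:

    cos α = D / L = √((xn − xi)² + (yn − yi)²) / √((xf − xi)² + (yf − yi)²)   … (3)
    α = cos⁻¹(D / L)   … (4)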

  In this way, the coordinate calculation unit 226 obtains the elevation angle α from the barycentric coordinates (xi, yi), the long axis intersection coordinates (xf, yf), and the short axis intersection coordinates (xn, yn) as coordinate information.

  As described above, when the rotation angle β and the elevation angle α of the position indicating light have been obtained, the coordinate calculation unit 226 obtains the coordinates of the display position of the three-dimensional pointer 31 from the barycentric coordinates (xi, yi), the rotation angle β, the elevation angle α, and the stored depth length E.

  For example, suppose that the position of the center of gravity of the image of the position indicating light, that is, the position specified by the barycentric coordinates (xi, yi), coincides with the reference point O of the coordinate system of the virtual three-dimensional space assumed for the display image. Then, as shown in FIG. 27, the point G indicating the display position of the three-dimensional pointer 31 is obtained from the depth length E, the rotation angle β, and the elevation angle α. In FIG. 27, the right direction is the x-axis direction, the upward direction is the y-axis direction, and the depth direction is the z-axis direction.

  In FIG. 27, the point H is the foot of the perpendicular drawn from the point G to the xz plane, the point I is the foot of the perpendicular drawn from the point G to the z-axis, and the point J is the foot of the perpendicular drawn from the point G to the yz plane. Here, since the straight line OG connecting the reference point O and the point G is the straight line obtained by extending the optical path of the position indicating light, the angle formed by the straight line OG and the z-axis is the elevation angle α of the position indicating light. That is, ∠GOI is the elevation angle α.

  Further, the plane that passes through the position of the center of gravity of the image of the position indicating light, that is, the reference point O, and the position of the center of gravity of the image of the auxiliary light, and that is orthogonal to the xy plane, is a plane containing the reference point O, the point G, and the point I, so the angle formed by this plane and the yz plane is the rotation angle β. That is, ∠GIJ, the angle formed by the straight line GI connecting the point G and the point I and the straight line JI connecting the point J and the point I, is the rotation angle β. The plane containing the reference point O, the point G, and the point I is the plane obtained by rotating the yz plane by the angle β about the z-axis.

  Here, the x coordinate, the y coordinate, and the z coordinate of the point G are equal to the x coordinate of the point H, the y coordinate of the point J, and the z coordinate of the point I, respectively. The x coordinate of the point H, the y coordinate of the point J, and the z coordinate of the point I are, respectively, the distance from the point H to the point I, the distance from the point J to the point I, and the distance from the point I to the reference point O. Therefore, the coordinates of the point G can be obtained by obtaining these distances (lengths).

  Therefore, decomposing the straight line OG (vector OG) within the plane GOI containing the point G, the reference point O, and the point I, the vector OG can be expressed as the sum of the vector OI and the vector IG. Since the length of the straight line OG is the depth length E and ∠GOI is the elevation angle α, the distance (length) from the point I to the reference point O is Ecosα, and the distance (length) from the point G to the point I is Esinα. Therefore, the z coordinate of the point I, that is, the z coordinate of the point G, is Ecosα.

  Further, the x coordinate of the point H and the y coordinate of the point J can be obtained from the length from the point G to the point I, that is, the length Esinα of the straight line GI, and the rotation angle β of the position indicating light. That is, as shown in FIG. 28, the point G, the point H, the point I, and the point J are located on one plane parallel to the xy plane. In FIG. 28, portions corresponding to those in FIG. 27 are denoted by the same reference numerals, and description thereof is omitted.

  In FIG. 28, the length of the straight line GI is Esinα and ∠GIJ is the rotation angle β, so the length from the point J to the point I, that is, the length of the straight line JI, is Esinαcosβ. Also, since ∠GIJ is β and ∠HIJ is 90 degrees, ∠GIH is 90 degrees − β, and hence ∠HGI in the right triangle GHI is β. Therefore, the length from the point H to the point I, that is, the length of the straight line HI, is Esinαsinβ.

  Returning to the description of FIG. 27, since the length of the straight line HI is Esinαsinβ and the straight line HI is parallel to the x-axis, the x coordinate of the point G is equal to the length of the straight line HI, that is, Esinαsinβ. Also, since the length of the straight line JI is Esinαcosβ and the straight line JI is parallel to the y-axis, the y coordinate of the point G is equal to the length of the straight line JI, that is, Esinαcosβ. In this way, the x coordinate, y coordinate, and z coordinate of the point G are obtained, and the coordinates of the point G are (Esinαsinβ, Esinαcosβ, Ecosα).

  In the example of FIG. 27, the case where the position of the center of gravity of the image of the position indicating light coincides with the reference point O has been considered, but in general the center of gravity is not at the same position as the reference point O. Therefore, if the coordinates of the center of gravity of the image of the position indicating light are (xi, yi, 0) and the point G is translated by the position of the center of gravity of the position indicating light, the coordinates of the point G indicating the position where the three-dimensional pointer 31 is to be displayed are (xi + Esinαsinβ, yi + Esinαcosβ, Ecosα).

  As described above, the display position of the three-dimensional pointer 31 can be expressed using the barycentric coordinates (xi, yi, 0) of the position indicating light, the rotation angle β of the position indicating light, the elevation angle α of the position indicating light, and the depth length E. Further, by changing the value of E in the coordinates of the point G, the display position of the three-dimensional pointer 31 can easily be moved in the depth direction or the front direction.
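  As an illustration only, the following sketch computes the display position of the three-dimensional pointer from the quantities derived above; the function name and data types are introduced here for the example and are not part of the described apparatus, and the depth length E is assumed to be expressed in the same units as the virtual three-dimensional space.

    import math

    def pointer_display_position(xi, yi, alpha, beta, depth_e):
        """Return the coordinates (x, y, z) of the point G at which the
        three-dimensional pointer 31 should be displayed.

        (xi, yi) -- barycentric coordinates of the image of the position indicating light
        alpha    -- elevation angle of the position indicating light (radians)
        beta     -- rotation angle of the position indicating light (radians)
        depth_e  -- stored depth length E
        """
        x = xi + depth_e * math.sin(alpha) * math.sin(beta)
        y = yi + depth_e * math.sin(alpha) * math.cos(beta)
        z = depth_e * math.cos(alpha)
        return x, y, z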

  Next, the flow of the three-dimensional pointer position calculation process described above will be described with reference to the flowchart of FIG. 29. This three-dimensional pointer position calculation process corresponds to the process in step S46 of FIG. 12, and is started after the detection process for the reverse instruction light in step S45 of FIG. 12.

  In step S161, the coordinate calculation unit 226 uses the center-of-gravity coordinates (xi, yi), the major axis intersection coordinates (xf, yf), and the minor axis intersection coordinates (xn, yn) supplied as coordinate information from the position indicating light detection unit 223 to determine the elevation angle α and the rotation angle β. That is, the coordinate calculation unit 226 calculates the elevation angle α by calculating the above-described equation (4), and calculates the rotation angle β by calculating equation (2).

  In step S162, the coordinate calculation unit 226 calculates sin α and cos α using the calculated elevation angle α. In step S163, the coordinate calculation unit 226 calculates sin β and cos β using the calculated rotation angle β.

  In step S164, the coordinate calculation unit 226 determines whether or not to move the three-dimensional pointer 31 in the front direction. For example, when the forward flag supplied from the forward instruction light detection unit 224 is set to “false”, the backward flag supplied from the reverse instruction light detection unit 225 is set to “true”, and the stored depth length E is 1 or more, the coordinate calculation unit 226 determines that the three-dimensional pointer 31 is to be moved in the front direction.

  That is, when the backward flag is set to “true”, it indicates that the user has instructed movement of the three-dimensional pointer 31 in the front direction. If the depth length E is 0, the three-dimensional pointer 31 cannot be moved any further toward the front. Therefore, when the forward flag is set to “false”, the backward flag is set to “true”, and the depth length E is 1 or more, the three-dimensional pointer 31 is moved in the front direction.

  If it is determined in step S164 that the three-dimensional pointer 31 is to be moved in the front direction, the process proceeds to step S165. In step S165, the coordinate calculation unit 226 decreases the value of the stored depth length E by 1, and the process proceeds to step S168.

  On the other hand, when it is determined in step S164 not to move in the forward direction, in step S166, the coordinate calculation unit 226 determines whether or not to move the three-dimensional pointer 31 in the depth direction.

  For example, when the forward flag supplied from the forward instruction light detection unit 224 is set to “true” and the backward flag supplied from the reverse instruction light detection unit 225 is set to “false”, the coordinate calculation unit 226 determines that the three-dimensional pointer 31 is to be moved in the depth direction.

  That is, when the forward flag is set to “true”, it indicates that the user has instructed the movement of the three-dimensional pointer 31 in the depth direction. Therefore, when the forward flag is set to “true” and the backward flag is set to “false”, the three-dimensional pointer 31 is moved in the depth direction.

  If it is determined in step S166 that the three-dimensional pointer 31 is moved in the depth direction, the process proceeds to step S167. In step S167, the coordinate calculation unit 226 increases the value of the stored depth length E by 1, and the process proceeds to step S168.

  On the other hand, if it is determined in step S166 that the pointer is not to be moved in the depth direction, either the user has not instructed movement of the three-dimensional pointer 31 or the three-dimensional pointer 31 cannot be moved any further in the front direction, so the value of the depth length E is not changed and the process proceeds to step S168.

  If the depth length E has been changed in step S165 or step S167, or if it is determined in step S166 that the pointer is not to be moved in the depth direction, in step S168 the coordinate calculation unit 226 obtains the coordinates of the display position where the three-dimensional pointer 31 is to be displayed from the barycentric coordinates (xi, yi, 0), the stored depth length E, and the obtained sinα, cosα, sinβ, and cosβ. Then, the coordinate calculation unit 226 supplies the obtained coordinates to the display image processing unit 228, the process proceeds to step S47 in FIG. 12, and the three-dimensional pointer position calculation process ends.

  That is, the coordinate calculation unit 226 obtains the coordinates of the display position of the three-dimensional pointer 31 by substituting the barycentric coordinates xi and yi, the value of the depth length E, and the calculated values of sinα, cosα, sinβ, and cosβ into the expression (xi + Esinαsinβ, yi + Esinαcosβ, Ecosα) for the coordinates of the display position.
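  The flow of steps S161 through S168 can be summarized by the following sketch, which assumes the elevation and rotation angles have already been computed (for example, by the reconstructed equations (2) and (4) above) and reuses the hypothetical pointer_display_position helper from the previous sketch; the flag handling mirrors the description, but the exact interfaces are assumptions.

    def update_pointer_position(xi, yi, alpha, beta, forward_flag, backward_flag, depth_e):
        """Update the depth length E according to the forward/backward flags
        (steps S164 to S167) and return it together with the display position
        of the three-dimensional pointer (step S168)."""
        if not forward_flag and backward_flag and depth_e >= 1:
            depth_e -= 1   # move the pointer in the front direction (step S165)
        elif forward_flag and not backward_flag:
            depth_e += 1   # move the pointer in the depth direction (step S167)
        # otherwise no movement was instructed, or the pointer is already at the
        # display surface, so E is left unchanged
        return depth_e, pointer_display_position(xi, yi, alpha, beta, depth_e)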

  The coordinates obtained in this way are supplied to the display image processing unit 228, and in step S47 of FIG. 12 the display image processing unit 228 generates a display image, that is, an image in which the three-dimensional pointer 31 is displayed at the position on the three-dimensional image corresponding to the obtained display position.

  In this way, the coordinate calculation unit 226 calculates the elevation angle α and the rotation angle β of the position indicating light, and obtains the coordinates of the display position of the three-dimensional pointer 31 from these angles, the position of the center of gravity of the position indicating light, and the stored depth length E.

  Thus, by obtaining the elevation angle α and the rotation angle β and then obtaining the coordinates of the display position of the three-dimensional pointer 31 from these angles, the position of the center of gravity of the position indicating light, and the stored depth length E, the position of the movement destination of the three-dimensional pointer 31 can be obtained easily. Accordingly, the user 32 can move the three-dimensional pointer 31 in the depth direction or the front direction simply by operating the laser pointer 22 to emit the forward instruction light or the reverse instruction light.

  In particular, since the position indicating light, the forward indicating light, and the reverse indicating light are invisible light, they do not affect the displayed display image even when they are irradiated onto the screen 21. In other words, an observer viewing the display image does not feel uncomfortable, because the images of the position indicating light and the forward indicating light irradiated onto the display image are not visible. In addition, since the position where the three-dimensional pointer 31 should be displayed can be specified simply by capturing the position indicating light and the forward indicating light with the camera 23, the display position of the three-dimensional pointer 31 can be specified easily and with high accuracy even when the display image is displayed on a large display surface such as a conference room screen.

  In the above description, the on-chip color filter shown in FIG. 9 is arranged on the light receiving surface of the CCD image sensor 183 of the camera 23; however, the on-chip color filter shown in FIG. 30 may be arranged instead.

  In FIG. 30, one square represents one pixel, and the letters W and L1 to L3 in the squares represent, respectively, a visible light filter, a filter for the wavelength of the position indicating light, a filter for the wavelength of the forward indicating light, and a filter for the wavelength of the backward indicating light, each of which transmits only visible light, the position indicating light, the forward indicating light, or the backward indicating light.

  In FIG. 30, the respective filters are arranged in a mosaic pattern. That is, the visible light filter, the filter for the wavelength of the position indicating light, the filter for the wavelength of the forward indicating light, and the filter for the wavelength of the backward indicating light are each arranged every other pixel in the vertical and horizontal directions in the figure.

  As described above, when the on-chip color filter shown in FIG. 30 is arranged on the light receiving surface of the CCD image sensor 183, the interval between the filters for each wavelength, that is, the sampling interval of the light of each color, is shortened, so that the image of the position indicating light, the image of the forward indicating light, and the image of the reverse indicating light can be detected with higher accuracy.
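  For illustration, a sketch of such a mosaic arrangement follows; since FIG. 30 itself is not reproduced here, the exact 2×2 repeating unit (W, L1 / L2, L3) is an assumption, and only the every-other-pixel property described above is taken from the text.

    def mosaic_filter(row, col):
        """Return which filter ('W', 'L1', 'L2' or 'L3') covers the pixel at
        (row, col) in an assumed 2x2 repeating mosaic, so that each filter
        appears every other pixel both vertically and horizontally."""
        unit = [['W', 'L1'],
                ['L2', 'L3']]
        return unit[row % 2][col % 2]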

  Further, the CCD image sensor 183 may be constituted by two image sensors. In such a case, for example, the light collected by the lens 181 is split into two beams by a prism or the like and made to enter each of the image sensors constituting the CCD image sensor 183, so that the same image is captured by each of the image sensors.

  Further, when the CCD image sensor 183 includes two image sensors, the on-chip color filters shown in FIG. 31, for example, are arranged on the light receiving surfaces of these image sensors.

  Here, FIG. 31A shows an on-chip color filter arranged on the light receiving surface of one of the image sensors constituting the CCD image sensor 183, and FIG. 31B shows an on-chip color filter arranged on the light receiving surface of the other image sensor. Furthermore, FIG. 31C shows another example of the on-chip color filter arranged on the light receiving surface of the other image sensor. In FIGS. 31A to 31C, one square represents one pixel.

  In FIG. 31A, the letters L1 to L3 in the squares each representing one pixel represent, respectively, a filter for the wavelength of the position indicating light, a filter for the wavelength of the forward indicating light, and a filter for the wavelength of the backward indicating light, each of which transmits only the position indicating light, the forward indicating light, or the backward indicating light.

  In FIG. 31A, the filter for the wavelength of the position indicating light, the filter for the wavelength of the forward indicating light, and the filter for the wavelength of the backward indicating light are each arranged every other pixel in the vertical and horizontal directions in the drawing. Therefore, the position indicating light image, the forward indicating light image, and the reverse indicating light image are separated from the captured image captured by the image sensor on which the on-chip color filter shown in FIG. 31A is arranged.

  Further, as shown in FIG. 31B, the on-chip color filter arranged on the light receiving surface of the other image sensor is composed of R filters, G filters, and B filters arranged in an array called a Bayer array.

  In FIG. 31B, the letters R, G, and B in the squares each representing one pixel represent, respectively, an R filter, a G filter, and a B filter, each of which transmits only light of the R (red), G (green), or B (blue) color. Therefore, a visible light image can be obtained from the captured image captured by the image sensor having this on-chip color filter on its light receiving surface.

  Furthermore, the on-chip color filter arranged on the light receiving surface of the other image sensor may be the filter shown in FIG. 31C. In FIG. 31C, the letter W in a square representing one pixel represents a visible light filter that transmits only visible light. A visible light image can also be obtained from the captured image captured by the image sensor having this on-chip color filter on its light receiving surface.

  As described above, when the CCD image sensor 183 is configured by two image sensors, the sampling interval of the light of each color is shortened, so that the image of the position indicating light, the image of the forward indicating light, and the image of the reverse indicating light can be detected with high accuracy. Further, when the on-chip color filters shown in FIG. 31 are used, the cost can be reduced because an existing CCD image sensor can be used.

  In the above description, the display image is displayed using the screen 21; however, a display device such as an LCD or a CRT display may be connected to the image processing device 24 and the display image may be displayed on that display device. In such a case, the user 32 irradiates the display screen of the display device displaying the display image with the position indicating light or the forward indicating light from the laser pointer 22, and the camera 23 images the display screen of the display device and its surrounding area.

  Furthermore, although it has been described that the camera 23 is connected to the image processing device 24, the camera 23 may be provided in the image processing device 24.

  The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the program constituting the software is installed from a program recording medium into a computer incorporated in dedicated hardware, or into, for example, a general-purpose personal computer capable of executing various functions by installing various programs.

  FIG. 32 is a block diagram illustrating an example of a hardware configuration of a computer that executes the series of processes described above according to a program.

  In the computer, a CPU 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are connected to each other via a bus 504.

  An input/output interface 505 is further connected to the bus 504. Connected to the input/output interface 505 are an input unit 506 including a keyboard, a mouse, and a microphone, an output unit 507 including a display and a speaker, a recording unit 508 including a hard disk and a non-volatile memory, a communication unit 509 including a network interface, and a drive 510 that drives a removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.

  In the computer configured as described above, the CPU 501 loads, for example, the program recorded in the recording unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executes the program, whereby the series of processes described above is performed.

  The program executed by the computer (CPU 501) is recorded on a removable medium 511, which is a package medium including, for example, a magnetic disk (including a flexible disk), an optical disk (such as a CD-ROM (Compact Disc-Read Only Memory) or a DVD (Digital Versatile Disc)), a magneto-optical disk, or a semiconductor memory, or is provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.

  The program can be installed in the recording unit 508 via the input/output interface 505 by mounting the removable medium 511 on the drive 510. Further, the program can be received by the communication unit 509 via a wired or wireless transmission medium and installed in the recording unit 508. In addition, the program can be installed in the ROM 502 or the recording unit 508 in advance.

  The program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at a necessary timing, such as when a call is made.

  The embodiment of the present invention is not limited to the above-described embodiment, and various modifications can be made without departing from the gist of the present invention.

FIG. 1 is a diagram showing a configuration example of an embodiment of a presentation system to which the present invention is applied.
FIG. 2 is a diagram showing the external appearance of a laser pointer.
FIG. 3 is a diagram showing a cross section of the laser pointer.
FIG. 4 is a diagram explaining the direction in which the position indicating light is irradiated.
FIG. 5 is a diagram explaining the virtual three-dimensional space.
FIG. 6 is a diagram explaining movement of the three-dimensional pointer in the depth direction and toward the near side.
FIG. 7 is a diagram explaining operations corresponding to operations of the user of the laser pointer.
FIG. 8 is a block diagram showing a configuration example of a camera.
FIG. 9 is a diagram showing an example of the arrangement of an on-chip color filter.
FIG. 10 is a block diagram showing a configuration example of an image processing apparatus.
FIG. 11 is a flowchart explaining a display process.
FIG. 12 is a flowchart explaining a generation process.
FIG. 13 is a diagram showing the arrangement position of the camera with respect to a screen.
FIG. 14 is a diagram showing an example of a captured image.
FIG. 15 is a diagram showing various images separated from a captured image.
FIG. 16 is a flowchart explaining a center-of-gravity position detection process.
FIG. 17 is a diagram explaining clustering in a position indicating light image.
FIG. 18 is a diagram explaining edge detection of a cluster.
FIG. 19 is a diagram explaining major axis intersection coordinates and minor axis intersection coordinates.
FIG. 20 is a flowchart explaining a forward instruction light detection process.
FIG. 21 is a diagram explaining the detection range of the forward instruction light.
FIG. 22 is a flowchart explaining a reverse instruction light detection process.
FIG. 23 is a diagram explaining the detection range of the reverse instruction light.
FIG. 24 is a diagram explaining the rotation angle β.
FIG. 25 is a diagram explaining the rotation angle β.
FIG. 26 is a diagram explaining the elevation angle α.
FIG. 27 is a diagram explaining the display position of the three-dimensional pointer.
FIG. 28 is a diagram explaining the display position of the three-dimensional pointer.
FIG. 29 is a flowchart explaining a three-dimensional pointer position calculation process.
FIG. 30 is a diagram showing another example of the arrangement of the on-chip color filter.
FIG. 31 is a diagram showing another example of the arrangement of the on-chip color filter.
FIG. 32 is a block diagram showing a configuration example of a computer.

Explanation of symbols

  11 presentation system, 21 screen, 22 laser pointer, 23 camera, 24 image processing device, 25 projector, 81 invisible light laser, 83 invisible light laser, 84 invisible light laser, 183 CCD image sensor, 186 DSP, 221 color synchronization processing , 222 geometric transformation processing unit, 223 position indication light detection unit, 224 forward indication light detection unit, 225 reverse indication light detection unit, 226 coordinate calculation unit, 228 display image processing unit

Claims (11)

  1. An image processing system including a display device that displays a display image, an instruction device that irradiates the display image with light to indicate the position of a pointer displayed on the display image, an imaging device that captures an area where the display image is displayed, and an image processing apparatus that newly generates a display image based on a captured image obtained by imaging by the imaging device, wherein
    the instruction device includes:
    position indicating light emitting means for emitting position indicating light used to indicate a display position, in a virtual three-dimensional space assumed for the display image displayed by the display device, at which the pointer is to be displayed; and
    direction indicating light emitting means for emitting direction indicating light, which is light of a wavelength different from that of the position indicating light, for instructing movement of the pointer in a direction parallel to the optical path of the position indicating light,
    The imaging device captures the image of the position indicating light and the image of the direction indicating light irradiated on the display image as the captured image,
    The image processing apparatus includes:
    Separating means for separating, from the captured image, a position indicating light image that is an image of light having the same wavelength as the position indicating light and a direction indicating light image that is an image of light having the same wavelength as the direction indicating light;
    Position indication light detection means for detecting the position and shape of the image of the position indication light on the display image from the position indication light image;
    Direction indicating light detection means for detecting an image of the direction indicating light from the direction indicating light image and outputting, based on the detection result, direction information indicating whether or not movement of the pointer in the direction has been instructed;
    Position specifying means for specifying the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output direction information;
    Display image generating means for generating a new display image in which the pointer is displayed at a position on the display image corresponding to the display position of the pointer specified by the position specifying means;
    The display device is configured to switch the display of the display image that has been displayed so far to the generated new display image.
  2. An image processing apparatus that specifies a display position at which a pointer is to be displayed, the display position being a position in a virtual three-dimensional space assumed for a display image, the image processing apparatus comprising:
    position indicating light detection means for detecting, from a position indicating light image obtained by imaging an area where the display image is displayed, the position indicating light image being an image of light having the same wavelength as position indicating light for indicating the display position at which the pointer is to be displayed in the virtual three-dimensional space, the position and shape on the display image of the image of the position indicating light irradiated onto the display image;
    first direction indicating light detection means for detecting, from a first direction indicating light image obtained by imaging the area where the display image is displayed, the first direction indicating light image being an image of light having the same wavelength as first direction indicating light for instructing movement of the pointer in a first direction parallel to the optical path of the position indicating light and a wavelength different from that of the position indicating light, an image of the first direction indicating light irradiated onto the display image, and for outputting, based on the detection result, first direction information indicating whether or not movement of the pointer in the first direction has been instructed; and
    position specifying means for specifying the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output first direction information.
  3. The image processing apparatus further comprising display image generating means for generating a new display image in which the pointer is displayed at a position on the display image corresponding to the display position of the pointer specified by the position specifying means.
  4. The image processing apparatus according to claim 3, wherein the position indicating light has a perfect circular cross section.
  5. The image processing apparatus according to claim 4, wherein the display position of the pointer is a position on a straight line that passes through the position in the virtual three-dimensional space corresponding to the position of the image of the position indicating light and that is parallel to the first direction specified from the shape of the image of the position indicating light.
  6. The image processing apparatus, wherein the position specifying means stores a depth length that determines the length from the position in the virtual three-dimensional space corresponding to the position of the image of the position indicating light to the display position of the pointer, and specifies the display position of the pointer based on the depth length after it has been changed and on the position and shape of the image of the position indicating light.
  7. The image processing apparatus according to claim 6, further comprising second direction indicating light detection means for detecting, from a second direction indicating light image obtained by imaging the area where the display image is displayed, the second direction indicating light image being an image of light having the same wavelength as second direction indicating light for instructing movement of the pointer in a second direction opposite to the first direction and a wavelength different from those of the position indicating light and the first direction indicating light, an image of the second direction indicating light irradiated onto the display image, and for outputting, based on the detection result, second direction information indicating whether or not movement of the pointer in the second direction has been instructed,
    wherein the position specifying means changes the depth length based on the first direction information and the second direction information.
  8. The image processing apparatus according to claim 3, further comprising:
    imaging means for capturing, as the captured image, the display image, the image of the position indicating light irradiated onto the display image, and the image of the first direction indicating light irradiated onto the display image; and
    separating means for separating the position indicating light image and the first direction indicating light image from the captured image.
  9. The image processing apparatus according to claim 3, wherein the position indicating light and the first direction indicating light are invisible light.
  10. An image processing method of an image processing apparatus that specifies a display position at which a pointer is to be displayed, the display position being a position in a virtual three-dimensional space assumed for a display image, the method comprising:
    a position indicating light detection step of detecting, from a position indicating light image obtained by imaging an area where the display image is displayed, the position indicating light image being an image of light having the same wavelength as position indicating light for indicating the display position at which the pointer is to be displayed in the virtual three-dimensional space, the position and shape on the display image of the image of the position indicating light irradiated onto the display image;
    a direction indicating light detection step of detecting, from a direction indicating light image obtained by imaging the area where the display image is displayed, the direction indicating light image being an image of light having the same wavelength as direction indicating light for instructing movement of the pointer in a direction parallel to the optical path of the position indicating light and a wavelength different from that of the position indicating light, the image of the direction indicating light irradiated onto the display image, and of outputting, based on the detection result, direction information indicating whether or not movement of the pointer in the direction has been instructed; and
    a position specifying step of specifying the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output direction information.
  11. A program for causing a computer to execute image processing that specifies a display position at which a pointer is to be displayed, the display position being a position in a virtual three-dimensional space assumed for a display image, the processing including:
    a position indicating light detection step of detecting, from a position indicating light image obtained by imaging an area where the display image is displayed, the position indicating light image being an image of light having the same wavelength as position indicating light for indicating the display position at which the pointer is to be displayed in the virtual three-dimensional space, the position and shape on the display image of the image of the position indicating light irradiated onto the display image;
    a direction indicating light detection step of detecting, from a direction indicating light image obtained by imaging the area where the display image is displayed, the direction indicating light image being an image of light having the same wavelength as direction indicating light for instructing movement of the pointer in a direction parallel to the optical path of the position indicating light and a wavelength different from that of the position indicating light, the image of the direction indicating light irradiated onto the display image, and of outputting, based on the detection result, direction information indicating whether or not movement of the pointer in the direction has been instructed; and
    a position specifying step of specifying the display position of the pointer in the virtual three-dimensional space based on the position and shape of the image of the position indicating light and the output direction information.
JP2007133758A 2007-05-21 2007-05-21 Image processing system, image processing apparatus and method, and program Expired - Fee Related JP4807322B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007133758A JP4807322B2 (en) 2007-05-21 2007-05-21 Image processing system, image processing apparatus and method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007133758A JP4807322B2 (en) 2007-05-21 2007-05-21 Image processing system, image processing apparatus and method, and program

Publications (2)

Publication Number Publication Date
JP2008287625A JP2008287625A (en) 2008-11-27
JP4807322B2 true JP4807322B2 (en) 2011-11-02

Family

ID=40147269

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007133758A Expired - Fee Related JP4807322B2 (en) 2007-05-21 2007-05-21 Image processing system, image processing apparatus and method, and program

Country Status (1)

Country Link
JP (1) JP4807322B2 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5118663B2 (en) * 2009-03-31 2013-01-16 Kddi株式会社 Information terminal equipment
JP6251963B2 (en) 2012-03-01 2017-12-27 日産自動車株式会社 Camera apparatus and image processing method
JP6251962B2 (en) * 2012-03-01 2017-12-27 日産自動車株式会社 Camera apparatus and image processing method
JP6112618B2 (en) * 2014-07-17 2017-04-12 Necプラットフォームズ株式会社 Information processing system
ES2574617B1 (en) * 2014-11-18 2017-04-05 Ignacio FERRERO PERIS Method, system and computer product to interact with a laser pointer
CN108037870A (en) * 2017-11-03 2018-05-15 福建天晴数码有限公司 A kind of method and terminal of the three-dimensional scenic object pickup based on touch-screen

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2790965B2 (en) * 1992-08-19 1998-08-27 富士通株式会社 Optical pointing system
JP2000276297A (en) * 1999-03-25 2000-10-06 Seiko Epson Corp Device and method for detecting pointing position, presentation system, and information storage medium
JP2001290600A (en) * 2000-04-06 2001-10-19 Casio Comput Co Ltd Coordinate input system
JP2003036142A (en) * 2001-07-24 2003-02-07 Hitachi Ltd System and method for presentation
JP2004178469A (en) * 2002-11-29 2004-06-24 Hitachi Ltd Remote control system
JP4644800B2 (en) * 2005-01-07 2011-03-02 国立大学法人電気通信大学 3D position input device
JP4577612B2 (en) * 2005-03-30 2010-11-10 カシオ計算機株式会社 Information display system and information display method
JP2007086962A (en) * 2005-09-21 2007-04-05 Fuji Xerox Co Ltd Screen position indicating system and screen position indicating method

Also Published As

Publication number Publication date
JP2008287625A (en) 2008-11-27


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20100427

TRDD Decision of grant or rejection written
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110714

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20110719

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20110801

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140826

Year of fee payment: 3

LAPS Cancellation because of no payment of annual fees