WO2009131131A1 - Display device and display method - Google Patents

Display device and display method

Info

Publication number
WO2009131131A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
light
image
intensity
displaying
Prior art date
Application number
PCT/JP2009/057939
Other languages
French (fr)
Japanese (ja)
Inventor
Takuji Kawamura (卓司 川村)
Hayato Utsunomiya (速人 宇都宮)
Original Assignee
Sharp Kabushiki Kaisha (シャープ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2008116231A external-priority patent/JP2009265439A/en
Priority claimed from JP2008116232A external-priority patent/JP2009266036A/en
Priority claimed from JP2008116233A external-priority patent/JP2009266037A/en
Priority claimed from JP2008116234A external-priority patent/JP2009265440A/en
Priority claimed from JP2008116229A external-priority patent/JP2009265438A/en
Priority claimed from JP2008116230A external-priority patent/JP5308705B2/en
Application filed by Sharp Kabushiki Kaisha (シャープ株式会社)
Publication of WO2009131131A1 publication Critical patent/WO2009131131A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/08Cursor circuits

Definitions

  • the present invention relates to a display device and a display method capable of specifying the position of light irradiated on a screen by an indication device such as a laser pointer.
  • the present invention relates to a display device and a display method that use light emitted to a screen by a pointing device such as a laser pointer.
  • a presenter giving a presentation on a large screen irradiates the screen with light from a laser pointer or the like to indicate a position in the displayed image, and a viewer watching the screen recognizes the position the presenter wants to point to from the position of the light irradiated onto the screen.
  • a pointing device described in Japanese Patent Application Laid-Open No. 2001-236181 captures, with a CCD camera, a screen on which a pointer is indicated by a laser pointer, recognizes the sides of the screen in the image picked up by the CCD camera, and specifies the coordinate position of the pointer from the recognized positions of the sides of the screen.
  • an instruction information input device described in Japanese Patent Application Laid-Open No. 9-62444 captures, with a video camera, a projection screen on which an image is displayed by a projector, and specifies the position of the light projected by a laser pointer from the video signal shot by the video camera and the video signal being displayed. Furthermore, when the color or brightness of the light projected by the laser pointer changes, the instruction information input device can perform an input operation similar to that of a mouse using the laser pointer.
  • the viewing angle control display device described in JP-A-6-230896 displays different screens according to the angle between the viewer and the screen; for each light emitting element constituting the screen, a light receiving element is provided that selectively receives only light arriving from an angle at which the screen formed by that light emitting element can be seen.
  • the viewing angle control display device receives light emitted from a laser pointer or the like with these light receiving elements.
  • because light from a laser pointer can be absorbed by a liquid crystal display, it may be difficult for a person looking at the screen to distinguish the position irradiated with the light.
  • a light receiving element is provided for each light emitting element, and the light receiving element at the position irradiated with light by the laser pointer receives light.
  • the instruction information input device described in Japanese Patent Laid-Open No. 9-62444 can perform an input operation similar to that of a mouse by changing the color or brightness of the light projected by the laser pointer; however, it does not disclose how the mouse-like operation is actually carried out.
  • the present invention is a display device comprising: display means including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light at a plurality of positions; and a control unit that causes the display means to display at least an image indicating the position at which light is detected, in accordance with the intensity of light detected by the detection unit.
  • because the control unit displays on the display means at least an image indicating the light detection position according to the intensity of light detected by the detection unit, the position of the light irradiated onto the screen by a light-emitting pointing device can be clearly indicated.
  • the present invention is also a display method for displaying an image on a display device that includes a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light for each position, the method comprising a first control step of displaying an image indicating the detection position in accordance with the intensity of light detected by the detection unit.
  • in the first control step, an image indicating the detection position is displayed according to the intensity of light detected by the detection unit; therefore, with this display method, the position of the light irradiated onto the screen by a light-emitting pointing device can be clearly indicated.
  • the present invention also provides a display device comprising: display means including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light at a plurality of positions; discriminating means for discriminating a pattern in which the intensity of light detected by the detection unit changes; and control means for displaying a symbol image of a first display form indicating the position where the light is detected by the detection unit when the pattern discriminated by the discriminating means is a first pattern, and for displaying a symbol image of a second display form, different from the first display form and indicating that position, when the discriminated pattern is a second pattern different from the first pattern.
  • when an image is displayed on the display means, the discriminating means discriminates the pattern in which the intensity of the detected light changes; if the discriminated pattern is the first pattern, the control means displays the symbol image of the first display form indicating the position where the light is detected, and if the discriminated pattern is the second pattern, the symbol image of the second display form indicating that position is displayed.
  • the present invention is also a display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light for each position, the method comprising: a determination step of determining a pattern in which the intensity of light detected by the detection unit changes; and a second control step of displaying a symbol image of a first display form indicating the position where the light is detected by the detection unit when the pattern determined in the determination step is a first pattern, and of displaying a symbol image of a second display form, different from the first display form and indicating that position, when the determined pattern is a second pattern different from the first pattern.
  • in the determination step, the pattern in which the intensity of the detected light changes is determined; when the determined pattern is the first pattern, the symbol image of the first display form indicating the position where the light is detected is displayed, and when the determined pattern is the second pattern, the symbol image of the second display form indicating that position is displayed.
  • the present invention also provides a display device comprising: display means including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light at a plurality of positions; range determining means for determining a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity; position specifying means for specifying, for each range determined by the range determining means, the position at the center of that range; and control means for displaying an image indicating the position specified by the position specifying means.
  • the range determining means determines a range in which the detected light intensity is equal to or greater than the predetermined reference intensity, the position specifying means specifies the position at the center of each determined range, and the control means displays an image indicating the specified position; therefore, even with a pointing device whose light spreads on the screen, such as a flashlight, the position on the screen that the operator is pointing to can be clearly indicated.
  • the present invention is also a display method for displaying an image on a display device including a plurality of pixels that display an image and a detection unit that detects the intensity of received light at a plurality of positions, the method comprising: a range determining step of determining a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity; a first position specifying step of specifying, for each range determined in the range determining step, the position at the center of that range; and a third control step of displaying an image indicating the position specified in the first position specifying step.
  • in the range determining step, a range in which the detected light intensity is equal to or greater than the predetermined reference intensity is determined; in the first position specifying step, the position at the center of each determined range is specified; and in the third control step, an image indicating the specified position is displayed. Therefore, even with a pointing device whose light spreads on the screen, such as a flashlight, the position on the screen indicated by the operator can be clearly indicated.
  • the present invention also provides a display device comprising: display means including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light at a plurality of positions; range determining means for determining a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity; position specifying means for specifying, for each determined range, the position within that range at which the light intensity is detected most strongly; and control means for displaying an image indicating the position specified by the position specifying means.
  • the range determining means determines a range in which the detected light intensity is equal to or greater than the predetermined reference intensity, the position specifying means specifies, for each determined range, the position with the strongest light intensity, and the control means displays an image indicating the specified position; therefore, even with a pointing device whose light spreads on the screen, such as a flashlight, the position on the screen that the operator is pointing to can be clearly indicated.
  • the present invention is also a display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light at a plurality of positions, the method comprising: a range determining step of determining a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity; a second position specifying step of specifying, for each determined range, the position within that range at which the light intensity is strongest; and a fourth control step of displaying an image indicating the position specified in the second position specifying step.
  • in the range determining step, a range in which the detected light intensity is equal to or greater than the predetermined reference intensity is determined; in the second position specifying step, the position with the strongest light intensity is specified for each determined range; and in the fourth control step, an image indicating the specified position is displayed.
  • the present invention also provides a display device comprising: display means including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light at a plurality of positions; discriminating means for discriminating a pattern in which the intensity of light detected by the detection unit changes; and control means for performing a first control on the display image when the pattern discriminated by the discriminating means is a first pattern, and for performing a second control on the display image when the discriminated pattern is a second pattern different from the first pattern.
  • when an image is displayed on the display means, the discriminating means discriminates the pattern in which the detected light intensity changes; the first control is performed on the display image when the discriminated pattern is the first pattern, and the second control is performed when it is the second pattern. Therefore, control on the display image can be performed according to the pattern in which the light intensity changes, for example according to the light irradiated onto the screen by a light-emitting pointing device.
  • the present invention is also a display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light for each position, the method comprising a determination step of determining a pattern in which the intensity of light detected by the detection unit changes, a first control performed on the display image when the pattern determined in the determination step is a first pattern, and a second control performed on the display image when the determined pattern is a second pattern different from the first pattern.
  • therefore, by applying the display method according to the present invention, control on the display image can be performed according to the pattern in which the light intensity changes, that is, according to the light irradiated onto the screen by a light-emitting pointing device.
  • the present invention also provides a display device comprising: display means including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light at a plurality of positions; shape discriminating means for discriminating a shape formed by reference positions at which the light intensity detected by the detection unit is equal to or higher than a predetermined reference intensity; and control means for performing predetermined control on the display image in accordance with the shape discriminated by the shape discriminating means.
  • when an image is displayed on the display means, the shape discriminating means discriminates the shape formed by the positions at which the detected light intensity is equal to or higher than the reference intensity.
  • the present invention is also a display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light for each position, the method comprising a shape determination step of determining a shape formed by reference positions at which the detected light intensity is equal to or greater than a predetermined reference intensity.
  • predetermined control is performed on the display image in accordance with the shape determined in the shape determination step; therefore, by applying the display method according to the present invention, control can be performed according to the light irradiated onto the screen by a light-emitting pointing device, for example according to the shape the light forms on the screen.
  • a diagram showing the structure of a display device according to the third embodiment of the present invention; a diagram showing an example of the shape of the light irradiated by the light emission instruction device.
  • FIG. 1 is a diagram showing a configuration of a display device 1 according to the first embodiment of the present invention.
  • the display device 1 includes a display device 10 and a light emission instruction device 30.
  • the display method according to the present invention is performed by the display device 10.
  • the display device 10 includes N video input terminals 11, a display/display processing temporary storage unit 12, a photosensor built-in display device 13, a display processing unit 14, a central processing unit 15, an internal storage unit 16, a remote control light receiving unit 17, a remote control processing unit 18, a light receiving location determination unit 19, an instruction pointer display data creation unit 20, and a light reception waveform determination unit 21.
  • the video input terminals 11 receive video signals output from a video generation device described later (see, for example, element 50 in FIG. 21 and element 50A in FIG. 27), such as a personal computer (hereinafter "PC"); N such terminals are provided, where N is a natural number. Video signals input from the video input terminals 11 are sent to the display/display processing temporary storage unit 12.
  • the display/display processing temporary storage unit 12 is a writable and readable storage device composed of a semiconductor memory, a hard disk device, or the like, and stores image information representing the video signals received from the video input terminals 11 as well as image information on which the central processing unit 15 has performed display processing.
  • the image information stored in the display/display processing temporary storage unit 12 can be read by the display processing unit 14 and the central processing unit 15.
  • the photosensor built-in display device 13 serving as the display means is a liquid crystal display in which a photosensor serving as a detection unit for detecting the intensity and color of received light is built in for each of the plurality of pixels constituting the display image. Furthermore, the photosensor built-in display device 13 has a photosensor for each of the red, green, and blue sub-pixels constituting each pixel, so the color of the received light can be detected for each pixel.
  • the photosensor built-in display device 13 displays the image information received from the display processing unit 14, and sends the light intensity (hereinafter "light reception intensity") and color (hereinafter "light reception color") detected by the photosensor of each pixel to the central processing unit 15.
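The passage above describes a per-pixel readout of received-light intensity and color. The following is a minimal Python sketch of how such a readout might be represented; the names PixelSample, read_frame, and panel.sample are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class PixelSample:
    r: int      # gradation detected by the red sub-pixel sensor (0-255)
    g: int      # gradation detected by the green sub-pixel sensor (0-255)
    b: int      # gradation detected by the blue sub-pixel sensor (0-255)
    value: int  # received-light intensity of the pixel (0-255)

def read_frame(panel, width, height):
    """Collect one sample per pixel, as the panel reports them to the central processing unit."""
    return [[panel.sample(x, y) for x in range(width)] for y in range(height)]
```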
  • the display processing unit 14 converts the image information read from the display/display processing temporary storage unit 12 into the format instructed by the central processing unit 15, for example a format that the photosensor built-in display device 13 can display, and sends it to the photosensor built-in display device 13 for display.
  • the central processing unit 15 includes a central processing unit (hereinafter "CPU") and a memory that stores a control program for controlling the display device 10; by executing the control program stored in the memory, the CPU controls the photosensor built-in display device 13, the display processing unit 14, and the remote control processing unit 18.
  • the central processing unit 15 functions as the discriminating means, the shape discriminating means, the range determining means, the position specifying means, and the control means.
  • instead of a CPU and memory, the central processing unit 15 may be implemented by an LSI (Large Scale Integration) device such as a programmable FPGA (Field Programmable Gate Array), or by an ASIC (Application Specific Integrated Circuit), an integrated circuit designed and manufactured for a specific application.
  • the internal storage unit 16 is configured by a storage device such as a writable and readable semiconductor memory, and stores control information 161 used when the central processing unit 15 executes a control program.
  • the control information 161 includes, for example, a light reception color representing the color of the received light, a light reception intensity representing the intensity of the received light, a light reception count representing the number of light receiving locations, a pointer shape representing the shape of the instruction pointer, a pointer display color representing the color of the instruction pointer, a pointer size representing the size of the instruction pointer, a pointer display position representing the center position of the instruction pointer display, and a pointer display reference representing the reference for the display position of the instruction pointer.
  • the light reception count is the number of light receiving locations; if the irradiated light spots do not overlap, it matches the number of light emission instruction devices 30 that are irradiating light.
  • the light receiving location is a pixel group including at least one pixel irradiated with light by the light emission instruction device 30.
  • the instruction pointer, which is a symbol image, is an image for clearly indicating the position irradiated with light by the light emission instruction device 30, and it indicates the position of the representative point of the light receiving location.
  • the pointer display reference indicates whether the representative point is the center position of the light receiving location or the brightest position of the light receiving location.
  • the remote control light receiving unit 17 receives light from a remote controller (not shown) that transmits information for operating the display device 10 to the display device 10, converts the received light into an electrical signal, and sends the electrical signal to the remote control processing unit 18.
  • the remote control processing unit 18 converts the electrical signal received from the remote control light receiving unit 17 into information instructed from the remote control and sends the information to the central processing unit 15.
  • the light receiving location determination unit 19, the instruction pointer display data creation unit 20, and the light reception waveform determination unit 21 are functions realized by the central processing unit 15 executing a control program.
  • the light receiving location determination unit 19 determines the location of the light applied to the display device 13 with a built-in optical sensor.
  • the light receiving location determination unit 19 determines, as a light receiving location receiving light from the light emission instruction device 30, a pixel whose light reception intensity reported by the photosensors of the photosensor built-in display device 13 is equal to or higher than a predetermined reference intensity, for example 128 or more on the 256-level scale from 0 to 255, or a range of such pixels that are adjacent to one another; a sketch of this grouping appears below.
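As a hedged illustration of the grouping just described, the sketch below collects adjacent above-threshold pixels into light receiving locations; it assumes a 2-D intensity grid and 4-connected adjacency, both of which are assumptions rather than details taken from the patent.

```python
REFERENCE_INTENSITY = 128  # example threshold from the text: 128 of 256 levels

def find_light_receiving_locations(intensity, width, height):
    """Group adjacent pixels whose received intensity meets the reference into locations."""
    visited = [[False] * width for _ in range(height)]
    locations = []
    for y in range(height):
        for x in range(width):
            if visited[y][x] or intensity[y][x] < REFERENCE_INTENSITY:
                continue
            stack, location = [(x, y)], []      # flood-fill one light receiving location
            visited[y][x] = True
            while stack:
                cx, cy = stack.pop()
                location.append((cx, cy))
                for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                    if (0 <= nx < width and 0 <= ny < height
                            and not visited[ny][nx]
                            and intensity[ny][nx] >= REFERENCE_INTENSITY):
                        visited[ny][nx] = True
                        stack.append((nx, ny))
            locations.append(location)
    return locations
```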
  • the instruction pointer display data creation unit 20 determines the position where the instruction pointer is displayed for the light receiving location determined to be receiving light by the light receiving location determination unit 19 and creates the instruction pointer data representing the instruction pointer.
  • the received light waveform determination unit 21 determines an emission pattern of light emitted from the light emission instruction device 30.
  • the light reception waveform determination unit 21 determines that the emission pattern is a continuous emission pattern when the light reception intensity reported by the photosensors of the photosensor built-in display device 13 remains equal to or greater than the predetermined reference intensity throughout a predetermined period, for example a period of 2 milliseconds, and determines that it is a pulsed emission pattern when a state equal to or above the reference intensity and a state below the reference intensity alternate within that period.
  • the pulsed emission pattern can serve as a plurality of distinct emission patterns by changing and combining the frequency, pulse width, pulse interval, and so on of the repeated pulses.
  • the continuous emission pattern is the first pattern
  • the pulsed emission pattern is the second pattern.
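A minimal sketch of this waveform determination follows; it assumes the intensity at one light receiving location has been sampled into a list covering the 2 ms observation period, and the sampling itself is not specified by the patent.

```python
REFERENCE_INTENSITY = 128

def classify_emission_pattern(samples):
    """Return 'continuous' (first pattern) if every sample stays at or above the reference,
    'pulsed' (second pattern) if the window contains both above- and below-reference states,
    and None if no light is detected at all."""
    above = [s >= REFERENCE_INTENSITY for s in samples]
    if samples and all(above):
        return "continuous"   # first pattern
    if any(above):
        return "pulsed"       # second pattern
    return None
```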
  • the light emission instruction device 30 is an instruction device that indicates a position by irradiating a screen with light such as a laser pointer, an LED (Light Emitting Diode), or a flashlight.
  • the light emission instructing device 30a includes a light emitting unit 31 that emits light.
  • the light emission instruction device 30b includes, in addition to the light emitting unit 31, an intermittent driving unit 32 that emits light in pulses, and the light emission instruction device 30c includes, in addition to the light emitting unit 31 and the intermittent driving unit 32, a button 33 for switching between pulsed and continuous light emission.
  • FIG. 2 is a diagram illustrating an example of the screen 40 irradiated with light from the light emission instruction device 30.
  • each pixel 42 includes a red sub-pixel 421, a green sub-pixel 422, and a blue sub-pixel 423, and each pixel 42 is provided with a photosensor 46.
  • the optical sensor 46 includes three optical sensors 461 to 463 that detect color and an optical sensor (not shown) that detects the intensity of light.
  • the optical sensor 461 detects red
  • the optical sensor 462 detects green
  • the optical sensor 463 detects blue.
  • 3A to 3D are diagrams illustrating examples of instruction pointers displayed at positions where light is irradiated.
  • FIG. 3A shows an enlarged screen 41a displaying an instruction pointer 45a in which the position indicated by the irradiated laser beam 39 is shown in red.
  • FIG. 3B shows an enlarged screen 41b displaying an instruction pointer 45b in which the position indicated by the irradiated laser beam 39 is represented by 3 × 3 white pixels.
  • FIG. 3C shows an enlarged screen 41c displaying an instruction pointer 45c in which the position indicated by the irradiated laser beam 39 is represented by 5 × 5 green pixels.
  • the instruction pointer 45c is an instruction pointer that is larger than the instruction pointer 45b illustrated in FIG. 3B.
  • FIG. 3D shows an enlarged screen 41d that displays an instruction pointer 45d that indicates the position indicated by the laser beam 39 irradiated with light in the shape of a red arrow. The tip of the arrow indicates the position of the pixel at the center of the pixel being irradiated with light, for example.
  • 4A to 4C are diagrams showing examples of instruction pointers when the irradiated light has a spread.
  • FIG. 4A shows that light from the flashlight is radiated to a wide area 44e.
  • the intensity of light emitted from the flashlight is stronger toward the center of the region 44e.
  • the instruction pointer display data creation unit 20 determines the position of the pixel at the center of the area 44e or the position of the pixel having the strongest light reception among the pixels included in the area 44e as the position for displaying the instruction pointer.
  • FIG. 4B shows an enlarged screen 41f displaying a red instruction pointer 45f at the position determined by the instruction pointer display data creation unit 20, and FIG. 4C shows an enlarged screen 41g displaying a red arrow-shaped instruction pointer 45g at the position determined by the instruction pointer display data creation unit 20.
  • FIG. 5A and 5B are diagrams showing examples of a screen 40h irradiated with light from two types of light emission instruction devices 30a and 30b having different colors of light to be irradiated.
  • the light emission instruction devices 30a and 30b are both configured by a laser pointer, the light emission instruction device 30a emits a red laser beam, and the light emission instruction device 30b emits a yellow laser beam.
  • FIG. 5A shows that the screen 40h is irradiated with a red laser beam from the light emission instruction device 30a and a yellow laser beam from the light emission instruction device 30b.
  • the enlarged screen 41h shows the pixels 45h1 and 45h2 irradiated with the respective laser beams.
  • FIG. 6 is a diagram illustrating an example of a screen 40k irradiated with light from two types of light emission instruction devices 30c and 30d having different emission patterns.
  • the light emission instruction devices 30c and 30d are both configured by a laser pointer.
  • the light emission instruction device 30c irradiates a laser beam with a continuous emission pattern
  • the light emission instruction device 30d irradiates the laser beam with a pulsed emission pattern.
  • FIG. 6 shows the screen 40k irradiated with the laser beam from the light emission instruction device 30c and the laser beam from the light emission instruction device 30d; the enlarged screen 41k shows the pixels 45k1 and 45k2 irradiated with the respective laser beams.
  • because the light reception intensity reported by the photosensor of the pixel 45k1 remains equal to or higher than the predetermined reference intensity for the predetermined period, the light reception waveform determination unit 21 determines that it is a continuous emission pattern.
  • the variables used in the flowcharts of FIGS. 7 to 12 include the number of received light Lcount, the pointer size Lsize, the pointer display color Lcolor, the pointer shape Lpattern, the pointer display reference Lbase, the received light color PixelColor, the received light intensity PixelValue, and the coordinate data of the received light location.
  • the number of received light Lcount is the number of light receiving portions that receive light.
  • the variable Lcount is an integer from 0 to LMAX, and LMAX is the maximum number of received light colors, for example, “3”.
  • the pointer size Lsize is a value indicating the size of the instruction pointer: "0" indicates a large size, "1" an intermediate size, and "2" a small size.
  • the pointer display color Lcolor is a value indicating the color of the instruction pointer: "0" indicates red, "1" indicates green, and "2" indicates blue.
  • the pointer shape Lpattern is a value representing the shape of the instruction pointer: "0" represents a quadrangle, "1" a circle, and "2" an arrow.
  • the pointer size Lsize, the pointer display color Lcolor, and the pointer shape Lpattern can be set for each of K light reception colors, for example three; when they need to be distinguished, they are written Lsize[K], Lcolor[K], and Lpattern[K].
  • the data of the pointer size Lsize, the pointer display color Lcolor, and the pointer shape Lpattern are also referred to as basic coordinate data.
  • the pointer size Lsize, the pointer display color Lcolor, and the pointer shape Lpattern are display forms determined in advance.
  • the pointer display reference Lbase is used when the irradiated light has a spread, as with light emitted by a flashlight or the like, that is, when the irradiated portion consists of a plurality of adjacent pixels; it indicates whether the instruction pointer is displayed with reference to the position of the center of the irradiated area (hereinafter "center reference") or to the position of the pixel at which the irradiated light is strongest (hereinafter "brightness reference"). A value of "0" indicates the center reference and a value of "1" indicates the brightness reference.
  • the light reception color PixelColor and the light reception intensity PixelValue are pixel data representing the light reception color and the light reception intensity of the received light for each pixel.
  • the light receiving color PixelColor is composed of three elements corresponding to the red sub-pixel, the green sub-pixel, and the blue sub-pixel, and is expressed by 256 levels of gradation from “0” to “255”, respectively.
  • the pixel data of each pixel is composed of four elements: the gradation of the red sub-pixel, the gradation of the green sub-pixel, the gradation of the blue sub-pixel, and the intensity of light reception.
  • the gradation of the red sub-pixel, the gradation of the green sub-pixel, and the gradation of the blue sub-pixel are the pixel data of the light reception color PixelColor, and the light reception intensity is the pixel data of the light reception intensity PixelValue. The screen of the photosensor built-in display device 13 consists of M pixels in the horizontal direction and N pixels in the vertical direction, and the position of each pixel is expressed in XY coordinates with the upper left pixel, as seen facing the screen, as the origin. For example, the pixel data of the pixel at XY coordinates (0, 0) is (255, 255, 255, 255), and the pixel data of the pixel at (0, 1) is (255, 250, 255).
  • the pixel data of the pixel at XY coordinates (100, 200) is (120, 100, 200, 180), the pixel data of the pixel at (100, 201) is (122, 98, 210, 190), the pixel data of the pixel at (M-1, N-2) is (0, 0, 0, 0), and the pixel data of the pixel at (M-1, N-1) is (0, 0, 0, 0).
  • the coordinate data (hereinafter also referred to as “light reception data”) Lpos of the light receiving location represents the coordinates of each pixel included in the light receiving location for each light receiving location.
  • the instruction pointer data PointInf is data representing the XY coordinates, color, size, and shape for displaying the instruction pointer for each light receiving location.
  • the setting of the variables of the pointer size Lsize, the pointer display color Lcolor, the pointer shape Lpattern, and the pointer display reference Lbase will be described in detail in the setting process shown in FIG.
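To keep the variable descriptions above concrete, the following sketch collects them into Python data structures; the container names (PointerSettings, PointerData) and the use of dataclasses are illustrative assumptions, while the field names and value meanings mirror the text.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class PointerSettings:          # one entry per light reception color K
    Lsize: int = 0              # 0 = large, 1 = intermediate, 2 = small
    Lcolor: int = 0             # 0 = red, 1 = green, 2 = blue, 3 = use the received color (step D5)
    Lpattern: int = 0           # 0 = quadrangle, 1 = circle, 2 = arrow

@dataclass
class PointerData:              # one entry of the instruction pointer data PointInf
    coords: List[Tuple[int, int]] = field(default_factory=list)  # pixels forming the pointer
    color: Tuple[int, int, int] = (255, 0, 0)                    # display color

Lbase: int = 0      # pointer display reference: 0 = center reference, 1 = brightness reference
Lcount: int = 0     # number of light receiving locations currently detected
Lpos: List[List[Tuple[int, int]]] = []   # per location, the coordinates of its pixels
```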
  • FIG. 7 is a flowchart illustrating an example of the first display process performed by the central processing unit 15.
  • the first display process is a process of displaying an instruction pointer having the same color as the color of the received light, that is, the received light color, on the pixel that has received the light.
  • after the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display/display processing temporary storage unit 12 on the photosensor built-in display device (hereinafter also "panel") 13, the process proceeds to step A1.
  • in step A1, the data detected by each photosensor, specifically the light reception color and light reception intensity of each pixel, are obtained from the panel 13 and set in the light reception color PixelColor and the light reception intensity PixelValue of each pixel.
  • in step A2, the variables X and Y are each initialized to "0".
  • in step A3, it is determined whether or not the value of the variable Y is less than the vertical resolution of the panel 13, that is, the number of pixels in the vertical direction of the screen. If the value of Y is less than the vertical resolution of the panel 13, the process proceeds to step A4; if the value of Y is equal to or greater than the vertical resolution of the panel 13, the first display process is terminated. In step A4, the variable X is initialized to "0". In step A5, it is determined whether or not the value of the variable X is less than the horizontal resolution of the panel 13, that is, the number of pixels in the horizontal direction of the screen.
  • in step A6, the PixelValue of the pixel at coordinates (X, Y), that is, its light reception intensity, is acquired.
  • in step A7, it is determined whether or not the acquired PixelValue is greater than a predetermined value, for example the above-described predetermined reference intensity, that is, whether or not the pixel is irradiated with light. If the value of PixelValue is greater than the predetermined value, the process proceeds to step A8; otherwise, the process proceeds to step A9.
  • in step A8, the pixel at coordinates (X, Y) is displayed in the PixelColor of coordinates (X, Y), that is, in the received light color.
  • specifically, the central processing unit 15 replaces the color of the pixel at coordinates (X, Y) in the image information stored in the display/display processing temporary storage unit 12 with the PixelColor of coordinates (X, Y), that is, with the received light color.
  • the display processing unit 14 reads the image information in which the color of the pixel at the coordinates (X, Y) is rewritten from the display / display processing temporary storage unit 12, converts the image information into a predetermined format, and displays the information on the panel 13.
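The loop just described (steps A1 through A8) can be summarized in a short sketch; it assumes the PixelSample-like readout from the earlier sketch and an image buffer indexed as image[y][x], both of which are illustrative assumptions.

```python
REFERENCE_INTENSITY = 128

def first_display_process(frame, image, width, height):
    """Repaint every pixel whose received intensity exceeds the reference in its own
    received color, so the irradiated spot becomes clearly visible on screen."""
    for y in range(height):                          # vertical scan (step A3)
        for x in range(width):                       # horizontal scan (step A5)
            sample = frame[y][x]                     # get PixelValue and PixelColor (step A6)
            if sample.value > REFERENCE_INTENSITY:   # is the pixel irradiated? (step A7)
                image[y][x] = (sample.r, sample.g, sample.b)   # display in received color (step A8)
    return image
```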
  • FIG. 8 is a flowchart illustrating an example of the second display process performed by the central processing unit 15.
  • the second display process is a process of displaying an instruction pointer having a preset color and shape at the position of the light receiving portion that has received the light. After the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display / display processing temporary storage unit 12 on the panel 13, the process proceeds to step B1.
  • in step B1, the data detected by each photosensor, specifically the light reception color and light reception intensity of each pixel, are acquired from the panel 13 and set in the light reception color PixelColor and the light reception intensity PixelValue of each pixel.
  • in step B2, a light receiving location search process is performed.
  • in step B3, an instruction pointer data creation process is performed.
  • in step B4, an instruction pointer display process is performed, and the process returns to step B1.
  • FIG. 9 is a flowchart illustrating an example of the light receiving location search process called from the second display process. The light receiving location search process is executed by the light receiving location determination unit 19; when it is called from the second display process shown in FIG. 8, the process proceeds to step C1.
  • in step C1, the variables X and Y and the light reception count Lcount are each initialized to "0".
  • in step C2, it is determined whether or not the value of the variable Y is less than the vertical resolution of the panel 13, that is, the number of pixels in the vertical direction of the screen. If the value of Y is less than the vertical resolution of the panel 13, the process proceeds to step C3; if the value of Y is equal to or greater than the vertical resolution of the panel 13, the light receiving location search process is terminated.
  • in step C3, the variable X is initialized to "0".
  • in step C4, it is determined whether or not the value of the variable X is less than the horizontal resolution of the panel 13, that is, the number of pixels in the horizontal direction of the screen.
  • in step C5, the PixelValue of the pixel at coordinates (X, Y), that is, its light reception intensity, is acquired.
  • in step C6, it is determined whether or not the acquired PixelValue is greater than a predetermined value, namely the predetermined reference intensity, that is, whether or not the pixel is irradiated with light. If the value of PixelValue is greater than the predetermined value, the process proceeds to step C7; otherwise, the process proceeds to step C11.
  • in step C7, the variable N, which is a light reception data counter, is initialized to "0".
  • in step C8, it is determined whether or not the value of the variable N is less than the value of Lcount, that is, the number of light receiving locations already detected. If the value of the variable N is less than the value of Lcount, the process proceeds to step C9; if it is equal to or greater than the value of Lcount, the process proceeds to step C14.
  • in step C9, it is determined whether or not the Nth light reception data Lpos contains the coordinates of a pixel adjacent to the one detected as being irradiated with light (hereinafter "adjacent coordinates").
  • in step C10, the coordinates (X, Y) are added to the Nth light reception data Lpos.
  • in step C11, "1" is added to the variable X, and the process returns to step C4.
  • in step C12, "1" is added to the variable N, and the process returns to step C8.
  • in step C13, "1" is added to the variable Y, and the process returns to step C2.
  • in step C14, "1" is added to the variable Lcount, that is, the light reception count is incremented, and the process proceeds to step C10.
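Steps C1 through C14 amount to an incremental grouping of irradiated pixels into the light reception data Lpos. A hedged sketch follows; the 8-neighbourhood used for "adjacent coordinates" and the helper names are assumptions made for illustration.

```python
REFERENCE_INTENSITY = 128

def adjacent(coord, location):
    """True if `coord` touches any pixel already stored in `location` (8-neighbourhood assumed)."""
    x, y = coord
    return any(abs(x - px) <= 1 and abs(y - py) <= 1 for px, py in location)

def search_light_receiving_locations(intensity, width, height):
    Lpos, Lcount = [], 0                                 # step C1: initialise
    for y in range(height):                              # step C2: vertical scan
        for x in range(width):                           # step C4: horizontal scan
            if intensity[y][x] <= REFERENCE_INTENSITY:   # step C6: not irradiated
                continue
            for n in range(Lcount):                      # steps C7-C9: look for an adjacent entry
                if adjacent((x, y), Lpos[n]):
                    Lpos[n].append((x, y))               # step C10: extend the existing entry
                    break
            else:                                        # step C14: open a new light receiving location
                Lpos.append([(x, y)])
                Lcount += 1
    return Lpos
```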
  • FIG. 10 is a flowchart illustrating an example of the instruction pointer data creation process called from the second display process.
  • the instruction pointer data creation process is executed by the instruction pointer display data creation unit 20; when it is called from the second display process shown in FIG. 8, the process proceeds to step D1.
  • in step D1, the variable N is initialized to "0".
  • in step D2, it is determined whether or not the variable N is less than the light reception count Lcount. If the variable N is less than the light reception count Lcount, the process proceeds to step D3; if it is equal to or greater than the light reception count Lcount, the process proceeds to step D13.
  • in step D3, the light reception color of the Nth light receiving location is determined from the coordinate data of the light receiving location, that is, the light reception data Lpos, and from the light reception color PixelColor of each pixel's data.
  • in step D4, the basic coordinate data of the pointer size Lsize and the pointer shape Lpattern corresponding to the light reception color determined in step D3 is copied to the coordinate data of the instruction pointer data PointInf.
  • the basic coordinate data is expressed as the XY coordinates of each pixel of the instruction pointer when the reference position is set to XY coordinates (0, 0).
  • in step D5, it is determined whether or not the value of the pointer display color Lcolor corresponding to the light reception color determined in step D3 is "3", that is, whether the pointer is to be displayed in the received color.
  • in step D6, the color of the pointer display color Lcolor is set in the color data of the instruction pointer data PointInf.
  • the color of the pointer display color Lcolor is, for example, one of red, green, and blue.
  • in step D7, it is determined whether or not the value of the pointer display reference Lbase is "0", that is, whether or not the display reference is the center reference. If the value of the pointer display reference Lbase is "0", the process proceeds to step D8; if it is not "0", the process proceeds to step D12.
  • in step D8, the average of the XY coordinates of the pixels included in the Nth light receiving location is obtained; the obtained average X coordinate is set in the variable Xb and the obtained average Y coordinate is set in the variable Yb.
  • in step D9, the value of the variable Xb is added to the X coordinate, and the value of the variable Yb to the Y coordinate, of the basic coordinates copied to the coordinate data of the instruction pointer data PointInf in step D4.
  • in step D10, "1" is added to the variable N, and the process returns to step D2.
  • in step D11, the color data corresponding to the coordinates of the instruction pointer data PointInf is extracted from the light reception color PixelColor of each pixel's data and set in the color data of the instruction pointer data PointInf, and the process proceeds to step D7.
  • in step D12, the pixel having the largest light reception intensity PixelValue is searched for among the pixels included in the Nth light receiving location; the X coordinate of the found pixel is set in the variable Xb and its Y coordinate in the variable Yb.
  • in step D13, data whose coordinate values are negative or exceed the resolution of the panel 13, that is, its number of pixels, is removed from the instruction pointer data PointInf.
  • in step D14, the instruction pointer data PointInf is sorted, that is, rearranged in coordinate order, and the instruction pointer data creation process is terminated.
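Steps D1 through D14 can be summarized in the following sketch. It assumes that basic_coords[k] holds the pointer shape for light reception color k as offsets from (0, 0), that settings[k] is a PointerSettings-like object, and that the PALETTE table maps color codes to RGB triples; all of these names are illustrative, and step D11 is simplified to a palette lookup.

```python
PALETTE = {0: (255, 0, 0), 1: (0, 255, 0), 2: (0, 0, 255)}   # red, green, blue

def create_pointer_data(Lpos, received_colors, basic_coords, settings, intensity,
                        Lbase, width, height):
    PointInf = []
    for n, location in enumerate(Lpos):                        # steps D1/D2/D10: loop over locations
        k = received_colors[n]                                 # step D3: light reception color
        coords = list(basic_coords[k])                         # step D4: copy basic coordinate data
        if settings[k].Lcolor == 3:                            # step D5: "received color" option
            color = PALETTE.get(k, (255, 255, 255))            # step D11 (simplified)
        else:
            color = PALETTE[settings[k].Lcolor]                # step D6: preset pointer color
        if Lbase == 0:                                         # step D7: center reference?
            Xb = sum(x for x, _ in location) // len(location)  # step D8: average coordinates
            Yb = sum(y for _, y in location) // len(location)
        else:                                                  # step D12: brightness reference
            Xb, Yb = max(location, key=lambda c: intensity[c[1]][c[0]])
        coords = [(x + Xb, y + Yb) for x, y in coords]         # step D9: translate to (Xb, Yb)
        coords = [(x, y) for x, y in coords                    # step D13: drop out-of-panel pixels
                  if 0 <= x < width and 0 <= y < height]
        PointInf.append({"coords": sorted(coords), "color": color})   # step D14: sort by coordinate
    return PointInf
```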
  • FIG. 11 is a flowchart illustrating an example of an instruction pointer display process called from the second display process.
  • in step E1, the variables X, Y, and Z are each initialized to "0".
  • in step E2, it is determined whether or not the value of the variable Y is less than the vertical resolution of the panel 13. If the value of Y is less than the vertical resolution of the panel 13, the process proceeds to step E3.
  • in step E3, the variable X is initialized to "0".
  • in step E4, it is determined whether or not the value of the variable X is less than the horizontal resolution of the panel 13. If the value of the variable X is less than the horizontal resolution of the panel 13, the process proceeds to step E5; if it is equal to or greater than the horizontal resolution of the panel 13, the process proceeds to step E9. In step E5, it is determined whether or not the coordinates (X, Y) and the coordinates of the Zth PointInf data are the same.
  • if the coordinates (X, Y) and the coordinates of the Zth PointInf data are the same, the process proceeds to step E6; if they are not the same, the process proceeds to step E8. In step E6, the pixel at coordinates (X, Y) is displayed in the color of the Zth PointInf data. Specifically, the central processing unit 15 sets the color of the pixel at coordinates (X, Y) in the image information stored in the display/display processing temporary storage unit 12 to the color of the Zth PointInf data.
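As a hedged sketch of the pointer display loop (steps E1 through E6), the code below repaints every screen pixel whose coordinates appear in an entry of PointInf in that entry's color; a dictionary lookup stands in for the flowchart's nested coordinate comparison.

```python
def display_pointers(image, PointInf):
    """Overlay every instruction pointer in PointInf onto the image buffer."""
    painted = {}
    for entry in PointInf:
        for coord in entry["coords"]:
            painted[coord] = entry["color"]
    for (x, y), color in painted.items():
        image[y][x] = color          # corresponds to step E6
    return image
```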
  • FIG. 12 is a flowchart illustrating an example of the third display process performed by the central processing unit 15.
  • the third display process is a process of displaying an instruction pointer of a different color and shape for each light emission instruction device when light emission instruction devices 30 having different emission patterns, for example a continuous emission pattern and a pulsed emission pattern, are used.
  • after the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display/display processing temporary storage unit 12 on the panel 13, the process proceeds to step F1.
  • Steps F1 to F4 correspond to steps B1 to B4 shown in FIG. 8, respectively, and description thereof is omitted to avoid duplication.
  • FIG. 13 is a flowchart illustrating an example of a light receiving location search process called from the third display process.
  • FIG. 14 is a flowchart illustrating an example of instruction pointer data creation processing called from the third display processing.
  • FIG. 15 is a flowchart illustrating an example of the instruction pointer display process called from the third display process. The light receiving location search process shown in FIG. 13 is the same as the light receiving location search process shown in FIG. 9, and the instruction pointer display process shown in FIG. 15 is the same as the instruction pointer display process shown in FIG. 11; their descriptions are omitted to avoid duplication.
  • the instruction pointer data creation process shown in FIG. 14 is executed by the instruction pointer display data creation unit 20; when it is called from the third display process shown in FIG. 12, the process proceeds to step H1.
  • steps H1 and H2 are the same as steps D1 and D2 shown in FIG. 10, steps H6 to H9 are the same as steps D7 to D10 shown in FIG. 10, and steps H12 to H14 are the same as steps D12 to D14 shown in FIG. 10, respectively; their descriptions are omitted to avoid duplication.
  • in step H3, it is determined, based on the determination result of the light reception waveform determination unit 21, whether or not the emission pattern is pulsed. If the emission pattern is pulsed, the process proceeds to step H4; if it is not pulsed, that is, if it is a continuous emission pattern, the process proceeds to step H10.
  • in step H4, the reference coordinate data whose shape is a quadrangle and whose size is "intermediate" is copied to the coordinate data of the instruction pointer data PointInf.
  • in step H5, blue is set in the color data of the instruction pointer data PointInf.
  • in step H10, the reference coordinate data whose shape is a circle and whose size is "large" is copied to the coordinate data of the instruction pointer data PointInf.
  • in step H11, red is set in the color data of the instruction pointer data PointInf, and the process proceeds to step H6.
  • red is the first color, blue is the second color, the large circular shape is the first shape, and the intermediate quadrangular shape is the second shape.
  • Step H3 shown in FIG. 14 is a determination step.
  • Steps H1, H2 and steps H4 to H14 shown in FIG. 14 and steps J1 to J9 shown in FIG. 15 are second control steps.
  • Steps G1 to G14 shown in FIG. 13 are range determination steps.
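The pattern-dependent pointer selection of steps H3 through H11 boils down to a small lookup, sketched below; the concrete pixel offsets for the "large circle" and "intermediate quadrangle" shapes are illustrative stand-ins for the reference coordinate data.

```python
INTERMEDIATE_SQUARE = [(dx, dy) for dx in range(-1, 2) for dy in range(-1, 2)]   # second shape
LARGE_CIRCLE = [(dx, dy) for dx in range(-3, 4) for dy in range(-3, 4)
                if dx * dx + dy * dy <= 9]                                       # first shape

def select_pointer(pattern):
    """Choose the color and shape for one light receiving location from its emission
    pattern ('pulsed' or 'continuous', e.g. as returned by the waveform sketch above)."""
    if pattern == "pulsed":                       # step H3: pulsed emission pattern
        return (0, 0, 255), INTERMEDIATE_SQUARE   # steps H4/H5: blue, intermediate quadrangle
    return (255, 0, 0), LARGE_CIRCLE              # steps H10/H11: red, large circle
```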
  • FIG. 16 is a flowchart illustrating an example of a fourth display process performed by the central processing unit 15.
  • the fourth display process is a process that, when light emission instruction devices 30 having different emission patterns, for example a continuous emission pattern and a pulsed emission pattern, are used, displays an instruction pointer of a different color and shape for each light emission instruction device and, for one of the light emission instruction devices, performs control related to screen control.
  • the control related to screen control is, for example, control for turning a page on the screen; specifically, it is performed by sending a page break command for turning the page on the screen to the PC or the like that inputs the video signal to the display device 10. After the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display/display processing temporary storage unit 12 on the panel 13, the process proceeds to step K1. Steps K1 to K4 correspond to steps B1 to B4 shown in FIG. 8, respectively, and their description is omitted to avoid duplication.
  • FIG. 17 is a flowchart illustrating an example of a light receiving location search process called from the fourth display process.
  • FIG. 18 is a flowchart illustrating an example of instruction pointer data creation processing called from the fourth display processing.
  • FIG. 19 is a flowchart illustrating an example of an instruction pointer display process called from the fourth display process.
  • the light receiving location search process shown in FIG. 17 is the same as the light receiving location search process shown in FIG. 9, and the instruction pointer display process shown in FIG. 19 is the same as the instruction pointer display process shown in FIG. 11; their descriptions are omitted to avoid duplication.
  • the instruction pointer data creation process shown in FIG. 18 is executed by the instruction pointer display data creation part 20, and when the instruction pointer display data creation part 20 is called from the fourth display process shown in FIG. 16, the process proceeds to step M1. .
  • Steps M1 to M5 are the same as steps H1 to H5 shown in FIG. 14, respectively.
  • Steps M9 to M12 are the same as steps H8 to H11 shown in FIG. 14, respectively, and steps M14 and M15 are the same as steps H13 and H14 shown in FIG. 14, respectively; their description is omitted to avoid duplication.
  • In step M6, the average value of the XY coordinates of the pixels included in the Nth light receiving location is obtained; the obtained average value of the X coordinate is set in the variable Xb, and the obtained average value of the Y coordinate is set in the variable Yb.
  • In step M7, the XY coordinates (Xc, Yc) of the average value converted to the resolution of the input signal are calculated from the XY coordinates (Xb, Yb) of the average value.
  • In step M8, a page break command for turning the page on the screen is transmitted, together with the XY coordinates (Xc, Yc), to the PC or the like that inputs the video to the video input terminal 11 (a sketch of this coordinate averaging and conversion appears after this list segment).
  • In step M13, the average value of the XY coordinates of the pixels included in the Nth light receiving location is obtained; the obtained average value of the X coordinate is set in the variable Xb, the obtained average value of the Y coordinate is set in the variable Yb, and the process proceeds to step M9.
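A minimal sketch of the coordinate averaging in step M6 and the resolution conversion in step M7. The panel and input-signal resolutions used in the example, and the function names, are assumptions for illustration only.

```python
from typing import List, Tuple

def average_coordinates(pixels: List[Tuple[int, int]]) -> Tuple[float, float]:
    """Step M6: average the XY coordinates of the pixels in one light
    receiving location, giving (Xb, Yb)."""
    xb = sum(x for x, _ in pixels) / len(pixels)
    yb = sum(y for _, y in pixels) / len(pixels)
    return xb, yb

def to_input_resolution(xb: float, yb: float,
                        panel_res: Tuple[int, int],
                        input_res: Tuple[int, int]) -> Tuple[int, int]:
    """Step M7: convert panel coordinates (Xb, Yb) to the resolution of the
    input video signal, giving (Xc, Yc)."""
    xc = round(xb * input_res[0] / panel_res[0])
    yc = round(yb * input_res[1] / panel_res[1])
    return xc, yc

if __name__ == "__main__":
    # Assumed resolutions: a 1920x1080 panel and a 1024x768 input signal.
    pixels = [(960, 540), (962, 541), (961, 539)]
    xb, yb = average_coordinates(pixels)
    xc, yc = to_input_resolution(xb, yb, (1920, 1080), (1024, 768))
    print(f"page break at input coordinates ({xc}, {yc})")  # step M8 payload
```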
  • FIG. 20 is a flowchart illustrating an example of the setting process performed by the central processing unit 15. Before a video signal from a PC or the like is displayed on the panel 13, the user can set, in advance and for each received light color, the variables of the pointer size Lsize, the pointer display color Lcolor, the pointer shape Lpattern, and the pointer display reference Lbase in response to a setting screen (not shown) displayed on the panel 13.
  • The central processing unit 15 generates image information of a setting screen for setting the variables of the pointer size Lsize, the pointer display color Lcolor, the pointer shape Lpattern, and the pointer display reference Lbase, and stores it in the display/display processing temporary storage unit 12.
  • The display processing unit 14 displays the image information stored in the display/display processing temporary storage unit 12 on the panel 13.
  • the setting screen is a screen on which, for example, the contents to be set for each variable can be selected by a remote controller.
  • After displaying the setting screen on the panel 13, the central processing unit 15 proceeds to step P1. In step P1, it is determined whether the pointer display reference selected by the remote controller is the "center reference" or the "brightness reference".
  • If the selected pointer display reference is the "center reference", the process proceeds to step P2; if the selected pointer display reference is the "brightness reference", the process proceeds to step P14.
  • In step P2, "0" is substituted for the variable D.
  • In step P3, the value of the variable D is substituted for the display reference Lbase.
  • In step P4, "0", that is, the value indicating red, is substituted for the variable K for selecting the received light color.
  • In step P5, it is determined whether or not the value of the variable K is less than the number of light receiving colors that can be set, for example the value "3" representing the three colors red, blue, and yellow. If the value of the variable K is less than "3", the process proceeds to step P6.
  • In step P6, the light reception color being set, that is, the color determined by the value of the variable K, is first displayed on the setting screen.
  • Next, it is determined whether the pointer shape selected by the remote controller is "square", "circle", or "arrow". If the selected pointer shape is "square", the process proceeds to step P15; if the selected pointer shape is "circle", the process proceeds to step P7; and if the selected pointer shape is "arrow", the process proceeds to step P16.
  • In step P7, "1" is substituted for the variable A.
  • In step P8, it is determined whether the size of the pointer selected by the remote controller is "large", "medium", or "small". If the size of the selected pointer is "large", the process proceeds to step P17; if it is "medium", the process proceeds to step P9; and if it is "small", the process proceeds to step P18.
  • In step P9, "1" is substituted for the variable B.
  • In step P10, it is determined whether the pointer color selected by the remote controller is "red", "blue", "green", or "light receiving color". If the selected pointer color is "red", the process proceeds to step P19.
  • In step P11, "1" is substituted for the variable C.
  • In step P12, the set contents are stored. Specifically, the value of the variable B is assigned to the pointer size Lsize[K], the value of the variable C is assigned to the pointer color Lcolor[K], and the value of the variable A is assigned to the pointer shape Lpattern[K].
  • In step P13, "1" is added to the variable K, and the process returns to step P5 (a sketch of this setting loop appears after this list segment).
  • In step P14, "1" is substituted for the variable D, and the process proceeds to step P3.
  • In step P15, "0" is substituted for the variable A, and the process proceeds to step P8.
  • In step P16, "2" is substituted for the variable A, and the process proceeds to step P8.
  • In step P17, "0" is substituted for the variable B, and the process proceeds to step P10.
  • In step P18, "2" is substituted for the variable B, and the process proceeds to step P10.
  • In step P19, "0" is substituted for the variable C, and the process proceeds to step P12.
  • In step P20, "2" is substituted for the variable C, and the process proceeds to step P12.
  • In step P21, "3" is substituted for the variable C, and the process proceeds to step P12.
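A minimal sketch of the per-received-color setting loop (steps P4 to P13). The numeric codes follow the values the flowchart assigns (shape: 0 square, 1 circle, 2 arrow; size: 0 large, 1 medium, 2 small; color: 0 red, 1 blue); which selections correspond to the color codes "2" and "3" of steps P20 and P21 is inferred here (green and "light receiving color"), and the selection source stands in for the remote controller.

```python
SHAPE_CODES = {"square": 0, "circle": 1, "arrow": 2}     # steps P15 / P7 / P16
SIZE_CODES = {"large": 0, "medium": 1, "small": 2}       # steps P17 / P9 / P18
COLOR_CODES = {"red": 0, "blue": 1, "green": 2,          # steps P19 / P11 / P20 (assumed)
               "light receiving color": 3}               # step P21 (cf. Lcolor == "3")

NUM_RECEIVED_COLORS = 3  # red, blue, yellow (step P5)

def setting_process(selections):
    """Steps P4-P13: for each settable received light color K, store the
    selected size, color and shape codes.  `selections[K]` stands in for the
    values chosen on the remote controller for received color K."""
    Lsize, Lcolor, Lpattern = {}, {}, {}
    for k in range(NUM_RECEIVED_COLORS):      # steps P4, P5, P13
        shape, size, color = selections[k]
        Lpattern[k] = SHAPE_CODES[shape]      # variable A
        Lsize[k] = SIZE_CODES[size]           # variable B
        Lcolor[k] = COLOR_CODES[color]        # variable C
    return Lsize, Lcolor, Lpattern            # step P12 stores these values

if __name__ == "__main__":
    demo = {0: ("circle", "medium", "red"),
            1: ("square", "large", "light receiving color"),
            2: ("arrow", "small", "blue")}
    print(setting_process(demo))
```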
  • As described above, when an image is displayed on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the central processing unit 15 causes an image indicating the light detection position, for example an instruction pointer, to be displayed on the photosensor built-in display device 13 according to the intensity of light detected by the photosensors. Therefore, the position of the light irradiated on the screen by a pointing device that emits light can be clearly indicated.
  • Since the image displayed on the photosensor built-in display device 13 is a predetermined color, for example red, green, or blue, the position of the light irradiated on the screen by the pointing device can be made clearer by using a color that is easy for a person viewing the screen to recognize. Furthermore, since the color of the light received by the photosensor is detected and the image displayed on the photosensor built-in display device 13 is given the color detected by the photosensor, the position can be indicated even more clearly by the color irradiated by the pointing device.
  • Since the image displayed on the photosensor built-in display device 13 has a predetermined display form, for example a quadrangle, a circle, or an arrow, the position of the light irradiated on the screen by the pointing device can be shown more clearly by using a shape that is easy for a person viewing the screen to understand.
  • The central processing unit 15 determines whether or not the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, and when the central processing unit 15 determines that the intensity of the detected light is equal to or greater than the predetermined reference intensity, the image is displayed on the photosensor built-in display device 13.
  • Therefore, the position irradiated with light having the predetermined reference intensity or higher can be indicated on the photosensor built-in display device 13. Further, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light for each position, an image indicating the detection position, for example an instruction pointer, is displayed according to the intensity of light detected by the photosensor in steps D1 to D14 shown in FIG. 10 and steps E1 to E9 shown in FIG. 11. Therefore, with the display method according to the present invention, the position of the light irradiated on the screen by a pointing device that emits light can be clearly indicated.
  • When an image is displayed on the photosensor built-in display device 13, the central processing unit 15 determines the pattern in which the intensity of the light detected by the photosensor changes. When the pattern determined by the central processing unit 15 is a continuous emission pattern, a symbol image of a first display form indicating the position where light is detected by the photosensor, for example a red, large-sized circular instruction pointer, is displayed; when the pattern determined by the central processing unit 15 is a pulsed emission pattern different from the continuous emission pattern, a symbol image of a second display form, which differs from the first display form and indicates the position, for example a blue, medium-sized quadrangular instruction pointer, is displayed. Therefore, the positions on the screen indicated by a plurality of, for example two, pointing devices that emit light whose intensity changes in different patterns can be clearly indicated.
  • Since the first display form is red and the second display form is blue, which differs from red, the positions on the screen designated by a plurality of, for example two, pointing devices that emit light whose intensity changes in different patterns can be clearly distinguished by different colors.
  • Since the first display form is a large round shape and the second display form is a quadrangular shape whose size differs from that of the large round shape, the positions on the screen indicated by a plurality of, for example two, pointing devices that emit light whose intensity changes in different patterns can be clearly distinguished by different shapes.
  • When the pattern determined in step H3 shown in FIG. 14 is a predetermined continuous emission pattern, a symbol image of the first display form indicating the position where light is detected by the photosensor, for example a red, large-sized circular instruction pointer, is displayed; when the pattern determined in step H3 shown in FIG. 14 is a pulsed emission pattern different from the continuous emission pattern, a symbol image of the second display form indicating the position, which differs from the first display form, for example a blue, medium-sized quadrangular instruction pointer, is displayed.
  • Therefore, the positions on the screen indicated by a plurality of, for example two, pointing devices that emit light whose intensity changes in different patterns can be distinguished.
  • The central processing unit 15 determines light receiving locations, that is, ranges in which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, and, for each range it has determined, specifies the position that is the center of that range. The central processing unit 15 then displays an image indicating the specified position, for example an instruction pointer. Therefore, even when a pointing device whose light spreads on the screen, such as a flashlight, is used, the position on the screen that the operator points to with the pointing device can be clearly indicated. Furthermore, since the image indicating the specified position has a predetermined display form, the position on the screen instructed by the operator using the pointing device can be clearly indicated by that display form. Furthermore, since the predetermined display form is a predetermined color, for example red, green, or blue, the position on the screen instructed by the pointing device can be made clearer by using a color that is easy for the viewer to see.
  • In addition, the position can be clearly indicated by the color irradiated by the pointing device.
  • Since the predetermined display form is a predetermined shape, for example a quadrangle, a circle, or an arrow, the position on the screen instructed by the pointing device can be shown more clearly by using a shape that is easy for the viewer to see.
  • In displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, light receiving locations, that is, ranges in which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, are determined in the range determination steps (steps G1 to G14 shown in FIG. 13).
  • In steps G1 to G14 shown in FIG. 13 and steps H7 and H8 shown in FIG. 14, the center position of each range is identified for each of the determined ranges.
  • In steps H1 to H6, H9 to H11, H13, and H14 shown in FIG. 14 and steps J1 to J9 shown in FIG. 15, an image indicating the position identified in steps G1 to G14 shown in FIG. 13 and steps H7 and H8 shown in FIG. 14, for example an instruction pointer, is displayed. Therefore, with the display method according to the present invention, the position on the screen indicated by the operator using the pointing device can be clearly indicated even when a pointing device whose light spreads on the screen, such as a flashlight, is used.
  • When displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the central processing unit 15 determines light receiving locations, that is, ranges in which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, specifies, for each of the determined ranges, the position at which the intensity of the light within the range is strongest, and displays an image indicating the specified position, for example an instruction pointer.
  • Since the image indicating the specified position has a predetermined display form, the position on the screen instructed by the operator using the pointing device can be clearly indicated by that display form.
  • Since the predetermined display form is a predetermined color, for example red, green, or blue, the position on the screen instructed by the pointing device can be made clearer by using a color that is easy for the viewer to see.
  • In addition, the position can be clearly indicated by the color irradiated by the pointing device.
  • Since the predetermined display form is a predetermined shape, for example a quadrangle, a circle, or an arrow, the position on the screen instructed by the pointing device can be shown more clearly by using a shape that is easy for the viewer to see.
  • In displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, light receiving locations, that is, ranges in which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, are determined in the range determination steps (steps G1 to G14 shown in FIG. 13).
  • In steps G1 to G14 shown in FIG. 13 and steps H8 and H12 shown in FIG. 14, the position at which the intensity of the light within the range is strongest is identified for each of the determined ranges.
  • In steps H1 to H6, H9 to H11, H13, and H14 shown in FIG. 14 and steps J1 to J9 shown in FIG. 15, an image indicating the position identified in steps G1 to G14 shown in FIG. 13 and steps H8 and H12 shown in FIG. 14, for example an instruction pointer, is displayed (a sketch of the center-based and brightness-based position identification appears after this list segment).
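A minimal sketch of the two ways a determined light receiving location is summarized into a single display position, as described above: the center reference averages the location's coordinates, while the brightness reference picks the pixel with the strongest received intensity. The input data layout is an assumption for illustration.

```python
from typing import Dict, List, Tuple

Coord = Tuple[int, int]

def center_position(location: List[Coord]) -> Coord:
    """Center reference: the position at the center of the range
    (average of the XY coordinates of its pixels)."""
    return (round(sum(x for x, _ in location) / len(location)),
            round(sum(y for _, y in location) / len(location)))

def brightest_position(location: List[Coord],
                       intensity: Dict[Coord, int]) -> Coord:
    """Brightness reference: the position within the range where the
    detected light intensity is strongest."""
    return max(location, key=lambda p: intensity[p])

if __name__ == "__main__":
    location = [(10, 10), (11, 10), (10, 11), (11, 11)]
    intensity = {(10, 10): 180, (11, 10): 240, (10, 11): 200, (11, 11): 190}
    print(center_position(location))                # center of the range
    print(brightest_position(location, intensity))  # -> (11, 10)
```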
  • FIG. 21 is a diagram showing a configuration of a display device 1A according to the second embodiment of the present invention.
  • the display device 1A includes a display device 10A, a light emission instruction device 30, and a video generation device 50.
  • the display method according to the present invention is processed by the display device 10A and the video generation device 50.
  • The display device 10A includes N video input terminals 11, a display/display processing temporary storage unit 12, a photosensor built-in display device 13, a display processing unit 14, a central processing unit 15, an internal storage unit 16, a remote control light receiving unit 17, a remote control processing unit 18, a light receiving location determination unit 19, an instruction pointer display data creation unit 20, a light reception waveform determination unit 21, and a serial communication processing unit 22, and is connected to the video generation device 50.
  • The central processing unit 15 includes a central processing unit (hereinafter referred to as "CPU") and a memory that stores a control program for controlling the display device 10A; the CPU executes the control program stored in the memory, thereby controlling the photosensor built-in display device 13, the display processing unit 14, the remote control processing unit 18, and the serial communication processing unit 22.
  • the serial communication processing unit 22 is connected to the video generation device serial communication processing unit 55 of the video generation device 50 through a serial interface, and transmits and receives information to and from the video generation device serial communication processing unit 55.
  • the serial communication processing unit 22 transmits a command for controlling the display image instructed from the central processing unit 15 to the video generation device 50.
  • The commands for controlling the display image include, for example, a page break command for turning the page of the displayed image information, a horizontal scroll command for moving the displayed image information in the X-axis direction, a vertical scroll command for moving the displayed image information in the Y-axis direction, an enlargement command for enlarging the displayed image information, and a reduction command for reducing the displayed image information (a sketch of such a command set is given below).
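A minimal sketch of how the display-image control commands exchanged over the serial interface could be represented. The enumeration values and the one-byte framing are assumptions, since the excerpt only names the commands.

```python
from enum import Enum, auto

class DisplayCommand(Enum):
    """Commands sent from the display device to the video generation device."""
    PAGE_BREAK = auto()         # turn the page of the displayed image information
    HORIZONTAL_SCROLL = auto()  # move the displayed image information along X
    VERTICAL_SCROLL = auto()    # move the displayed image information along Y
    ENLARGE = auto()            # enlarge the displayed image information
    REDUCE = auto()             # reduce the displayed image information

def encode_command(cmd: DisplayCommand) -> bytes:
    """Illustrative one-byte framing for the serial link."""
    return bytes([cmd.value])

if __name__ == "__main__":
    print(encode_command(DisplayCommand.PAGE_BREAK))
```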
  • the light emission instruction device 30 is an instruction device that indicates a position by irradiating a screen with light such as a laser pointer, an LED (Light Emitting Diode), or a flashlight.
  • the light emission instructing device 30 includes a light emitting unit 31 that emits light, an intermittent drive unit 32 that emits the light emitting unit 31 in a pulsed manner, and a button 33 that switches whether to emit light in a pulsed manner or continuously.
  • the video generation device 50 is configured by a PC or the like, and includes a display data storage unit 51, a video signal generation unit 52, a video generation device central processing unit 53, a video generation device internal storage unit 54, and a video generation device serial communication processing unit 55.
  • the display data storage unit 51 is a storage device including a semiconductor memory or a hard disk device that stores image information to be displayed on the display device 10A.
  • the video signal generation unit 52 reads image information instructed by the video generation device central processing unit 53 from the image information stored in the display data storage unit 51 from the display data storage unit 51, and reads the read image information. Convert to video signal and output.
  • the video signal output by the video signal generation unit 52 is input to any one of the video input terminals 11 of the display device 10A.
  • the video generation device central processing unit 53 includes a CPU and a memory that stores a video generation device control program for controlling the video generation device 50, and the CPU executes the video generation device control program stored in the memory.
  • the video signal generator 52 and the video generator serial communication processor 55 are controlled.
  • The video generation device central processing unit 53 may be configured by a programmable LSI (Large Scale Integration) such as an FPGA (Field Programmable Gate Array) instead of the CPU and the memory.
  • the video generation device internal storage unit 54 is configured by a storage device such as a writable and readable semiconductor memory, and stores control information 541 used when the video generation device central processing unit 53 executes the video generation device control program. .
  • Control information 541 includes information such as a display area.
  • The display area is information representing the area of the portion, within the entire image information for one page, that is to be displayed on the display device 10A.
  • It is expressed by the XY coordinates (Ppos_xs, Ppos_ys) of the pixel at the upper left position toward the screen and the XY coordinates (Ppos_xe, Ppos_ye) of the pixel at the lower right position.
  • The XY coordinates are coordinates whose origin is, for example, the position of the upper left pixel of the entire image for one page to be displayed.
  • the video generation device serial communication processing unit 55 is connected to the serial communication processing unit 22 of the display device through a serial interface, and transmits and receives information to and from the serial communication processing unit 22.
  • the video generation device serial communication processing unit 55 receives commands such as a page break command, a horizontal scroll command, a vertical scroll command, an enlargement command, and a reduction command, which control the display image transmitted from the display device 10A.
  • the received command is sent to the video generation device central processing unit 53.
  • the central processing unit 15 and the video generation device central processing unit 53 are control means.
  • the variables used in the flowcharts of FIGS. 22 to 25 include variables of the light reception number Lcount, the light reception color PixelColor, the light reception intensity PixelValue, the light reception point coordinate data Lpos, and the instruction pointer data PointInf. The definition of these variables is as described above, and a description thereof will be omitted.
  • the fifth display process is a process of displaying an instruction pointer having a preset color and shape at the position of the light receiving portion that has received the light.
  • the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display / display processing temporary storage unit 12 on the panel 13, the process proceeds to step Q1.
  • In step Q1, the data detected by each photosensor of the panel 13, specifically the light reception color and light reception intensity data, is acquired for each pixel and set in the light reception color PixelColor and the light reception intensity PixelValue for each pixel.
  • In step Q2, a light receiving location search process is performed.
  • In step Q3, an instruction pointer data creation process is performed.
  • FIG. 23 is a flowchart illustrating an example of a light receiving location search process called from the fifth display process.
  • FIG. 24 is a flowchart illustrating an example of instruction pointer data creation processing called from the fifth display processing.
  • FIG. 25 is a flowchart illustrating an example of an instruction pointer display process called from the fifth display process.
  • The light receiving location search processes R1 to R14 shown in FIG. 23 are the same as the light receiving location search processes C1 to C14 shown in FIG. 9; the instruction pointer data creation processes S1 to S15 are shown in FIG. 24 (a sketch of a threshold-based light receiving location search appears after this list item).
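A minimal sketch of a threshold-based light receiving location search of the kind the R and C steps perform: pixels whose received intensity reaches the reference intensity are gathered into locations Lpos, a pixel adjacent to an existing location joining it and other pixels starting a new location, and the number of locations plays the role of Lcount. The adjacency rule and data layout are illustrative assumptions, not the flowchart's exact search.

```python
from typing import Dict, List, Tuple

Coord = Tuple[int, int]

def search_light_receiving_locations(pixel_value: Dict[Coord, int],
                                     reference: int) -> List[List[Coord]]:
    """Scan all pixels and group those with PixelValue >= reference into
    light receiving locations (Lpos); len(result) plays the role of Lcount."""
    lpos: List[List[Coord]] = []
    for (x, y), value in sorted(pixel_value.items()):
        if value < reference:
            continue
        for location in lpos:
            # Join a location that already contains a neighbouring pixel.
            if any(abs(x - px) <= 1 and abs(y - py) <= 1 for px, py in location):
                location.append((x, y))
                break
        else:
            lpos.append([(x, y)])   # start a new light receiving location
    return lpos

if __name__ == "__main__":
    values = {(5, 5): 210, (6, 5): 220, (5, 6): 190, (30, 12): 250, (0, 0): 10}
    locations = search_light_receiving_locations(values, reference=150)
    print(len(locations), locations)   # Lcount = 2
```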
  • FIG. 26 is a flowchart illustrating an example of command processing performed by the video generation device central processing unit 53.
  • The command processing is processing for executing a command for controlling the display image, for example a page break command, received from the display device 10A.
  • the video generation device central processing unit 53 instructs the video signal generation unit 52 to output the image information stored in the display data storage unit 51 to the display device 10A, the process proceeds to step U1.
  • In step U1, it is determined whether a page break command has been received.
  • If a page break command is received, the process proceeds to step U2; if no page break command is received, the process returns to step U1.
  • In step U2, the page break command is executed, and the process returns to step U1. Specifically, the video signal of the image information of the page following the page currently being output is output, and the process returns to step U1.
  • Steps S1, S2, S4 to S15 shown in FIG. 24, steps T1 to T9 shown in FIG. 25, and steps U1 and U2 shown in FIG. 26 are fifth control steps.
  • When an image is displayed on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the central processing unit 15 determines the pattern in which the intensity of light detected by the photosensor changes, that is, the emission pattern. When the determined pattern is a continuous emission pattern, the central processing unit 15 and the video generation device central processing unit 53 perform a first control on the display image; when the determined pattern is a pulsed emission pattern different from the continuous emission pattern, they perform a second control on the display image.
  • Therefore, the display image can be controlled according to the light emitted to the screen by the light emission instruction device 30, for example according to the emission pattern in which the intensity of the light changes.
  • The first control is control for displaying an image showing the position of the light detected by the photosensor, and the second control is control for operating the display image.
  • Accordingly, when the emission pattern is continuous, an instruction pointer is displayed, and when the emission pattern is pulsed, an operation on the displayed image information, for example turning a page, can be performed (a sketch of this dispatch appears after this list segment).
  • In step S3 shown in FIG. 24, the pattern in which the intensity of light detected by the photosensor changes, that is, the emission pattern, is determined. When the pattern determined in step S3 shown in FIG. 24 is a continuous emission pattern, the first control is performed on the display image in steps S2 and S9 to S15 shown in FIG. 24 and steps T1 to T9 shown in FIG. 25; when the pattern determined in step S3 shown in FIG. 24 is a pulsed emission pattern different from the continuous emission pattern, the second control is performed on the display image in steps S2, S4 to S10, and S14 shown in FIG. 24.
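A minimal sketch of the dispatch described above for the fifth display process: a continuous emission pattern leads to pointer display (first control), while a pulsed pattern leads to a page-break command being sent together with the location's coordinates (second control). `display_pointer` and `send_page_break` are hypothetical stand-ins for the corresponding steps, and the centroid calculation mirrors the coordinate averaging used elsewhere in the flowcharts.

```python
from typing import Callable, List, Tuple

def handle_light_event(is_pulsed: bool,
                       location_pixels: List[Tuple[int, int]],
                       display_pointer: Callable[[int, int], None],
                       send_page_break: Callable[[int, int], None]) -> None:
    """Dispatch one detected light receiving location according to the
    determined emission pattern (cf. step S3 in FIG. 24)."""
    # Centroid of the location, as in the coordinate averaging steps.
    xb = sum(x for x, _ in location_pixels) / len(location_pixels)
    yb = sum(y for _, y in location_pixels) / len(location_pixels)
    if is_pulsed:
        # Second control: operate on the display image (e.g. turn the page).
        send_page_break(round(xb), round(yb))
    else:
        # First control: show an instruction pointer at the detected position.
        display_pointer(round(xb), round(yb))

if __name__ == "__main__":
    handle_light_event(True, [(5, 5), (6, 5)],
                       display_pointer=lambda x, y: print("pointer at", x, y),
                       send_page_break=lambda x, y: print("page break at", x, y))
```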
  • FIG. 27 is a diagram showing a configuration of a display device 1B according to the third embodiment of the present invention.
  • the display device 1B includes a display device 10B, a light emission instruction device 30A, and a video generation device 50A.
  • the display method according to the present invention is processed by the display device 10B and the video generation device 50A.
  • connection relationship between the display device 10B and the video generation device 50A according to the present embodiment is similar to the connection relationship between the display device 10A and the video generation device 50 according to the above-described embodiment.
  • The display device 10B includes N video input terminals 11, a display/display processing temporary storage unit 12, a photosensor built-in display device 13, a display processing unit 14, a central processing unit 15, an internal storage unit 16, a remote control light receiving unit 17, a remote control processing unit 18, a light receiving location determination unit 19, an instruction pointer display data creation unit 20, a light receiving unit shape determination unit 21A, and a serial communication processing unit 22.
  • the light receiving location determination unit 19, the instruction pointer display data creation unit 20, and the light receiving unit shape determination unit 21A are functions realized by the central processing unit 15 executing a control program.
  • the instruction pointer display data creation unit 20 determines the position at which the instruction pointer is displayed for the light receiving part determined to receive light by the light receiving part determination unit 19 and creates instruction pointer data representing the instruction pointer. At the same time, a command corresponding to the shape determined by the light receiving portion shape determination unit 21A is transmitted to the video generation device 50.
  • the light receiving unit shape determining unit 21A which is a shape determining unit, is called from the instruction pointer display data creating unit 20 and determines the shape of the light irradiated on the screen by the light emission instruction device 30.
  • the light receiving portion shape determination unit 21A determines a shape formed by pixels whose received light intensity received from the optical sensor of the optical sensor built-in display device 13 is equal to or higher than a predetermined reference intensity.
  • the light emission instruction device 30A is an instruction device that indicates a position by irradiating the screen with light such as a laser pointer.
  • the light emission instructing device 30A includes a light emitting unit 31A that emits light, for example, a laser beam, and a light emission shape switching unit 32A that switches the shape of light formed on the screen by irradiating light.
  • the video generation device 50A is configured by a PC or the like, and includes a display data storage unit 51, a video signal generation unit 52, a video generation device central processing unit 53, a video generation device internal storage unit 54, a video generation device serial communication processing unit 55, A scroll processing unit 56 and an enlargement processing unit 57 are included.
  • the scroll processing unit 56 and the enlargement processing unit 57 are functions realized by the video generation device central processing unit 53A executing the video generation device control program.
  • the scroll processing unit 56 performs a process of horizontally scrolling or vertically scrolling image information to be displayed on the screen of the display device 10B among the image information for one page.
  • the horizontal scroll is a process of moving the image information to be displayed among the image information for one page in the X-axis direction of the XY coordinates
  • the vertical scroll is a process of moving in the Y-axis direction.
  • the enlargement processing unit 57 performs a process of enlarging image information to be displayed on the screen of the display device 10B among the image information for one page.
  • the central processing unit 15 and the video generation device central processing unit 53A are control means.
  • FIGS. 28A to 28D are diagrams showing examples of the shape of light emitted by the light emission instruction device 30.
  • FIG. 28A is an example of the case where the shape of the light emitted by the light emission instruction device 30 is the circle shape 61.
  • the circle shape 61 is a predetermined shape that is not irradiated with light at the center, for example, a circular shape.
  • FIG. 28B is an example when the shape of the light emitted by the light emission instruction device 30 is a dot shape 62.
  • the dot shape 62 is a predetermined shape in which light is irradiated to the central portion, for example, a circular shape.
  • FIG. 28C is an example in which the shape of the light emitted by the light emission instruction device 30 is a horizontally long shape 63.
  • the horizontally long shape 63 is a shape of a horizontal line segment whose inclination is equal to or less than a predetermined angle.
  • FIG. 28D is an example when the shape of the light emitted by the light emission instruction device 30 is the vertically long shape 64.
  • the vertically long shape 64 is a shape of a vertical line segment whose inclination is equal to or less than a predetermined angle.
  • the horizontally long shape is also referred to as a horizontal line shape
  • the vertically long shape is also referred to as a vertical line shape.
  • FIG. 29 is a diagram illustrating an example of a display image in which the panel 13 is irradiated with the light having the dot shape 62.
  • The image information 70 is image information for one page stored in the display data storage unit 51A, and the image information in the area 71 of the image information for one page is displayed on the panel 13.
  • the screen 72 is a screen of the panel 13 and displays image information in the area 71 among the image information 70 for one page stored in the display data storage unit 51A.
  • the screen 74 is a partial screen obtained by enlarging a peripheral portion of the screen 72 irradiated with the laser beam from the light emission instruction device 30, and is irradiated with light having a dot shape 62.
  • FIG. 30 is a diagram illustrating an example of a display image displayed when the light shape is the horizontally long shape 63.
  • the display device 1B moves the region of the image information to be displayed to the right of the region of the displayed image information when the light of the horizontally long shape 63 is irradiated on the screen 75 by the light emission instruction device 30.
  • FIG. 30 is a state in which the area of the image information to be displayed is horizontally scrolled from the area 71, that is, moved to the right, and the image information of the area 71a is displayed on the screen 75.
  • FIG. 31 is a diagram illustrating an example of a display image displayed when the light shape is the vertically long shape 64.
  • When the light of the vertically long shape 64 is irradiated on the screen by the light emission instruction device 30, the display device 1B moves the region of the image information to be displayed downward from the region of the image information being displayed.
  • That is, the image information displayed on the screen of the panel 13 is scrolled vertically.
  • FIG. 31 is a state in which the area of the image information to be displayed is vertically scrolled from the area 71a, that is, moved downward, and the image information of the area 71b is displayed on the screen 76.
  • FIG. 32 is a diagram illustrating an example of a display image displayed when the light shape is the circle shape 61.
  • the display device 1B expands the image information displayed on the screen 77 around the pixel irradiated with the circle shape 61.
  • the image information is displayed on the screen of the panel 13.
  • While the light of the circle shape 61 is irradiated on the screen 77 by the light emission instruction device 30, the display device 1B gradually enlarges the image information displayed on the screen 77 and displays it on the screen 77. At this time, the area of the image information displayed on the screen of the panel 13 changes from the area 71b to the area 71c.
  • The variables used in the flowcharts of FIGS. 33 to 37 include the light reception number Lcount, the light reception color PixelColor, the light reception intensity PixelValue, the light reception location coordinate data Lpos, and the instruction pointer data PointInf. The definition of these variables is as described above, and a description thereof is omitted.
  • FIG. 33 is a flowchart illustrating an example of a sixth display process performed by the central processing unit 15.
  • the sixth display process is a process of displaying an instruction pointer of a preset color and shape at the position of the light receiving point that has received the light.
  • the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display / display processing temporary storage unit 12 on the panel 13, the process proceeds to step V1.
  • Steps V1 to V4 correspond to steps Q1 to Q4 shown in FIG. 22, respectively, and description thereof is omitted to avoid duplication.
  • FIG. 34 is a flowchart illustrating an example of a light receiving location search process called from the sixth display process.
  • the light receiving location searching process is executed by the light receiving location determining unit 19, and when the light receiving location determining unit 19 is called from the sixth display process shown in FIG. 33, the process proceeds to step W1.
  • the received light spot search processes W1 to W4 and W6 to W14 shown in FIG. 34 are the same as the received light spot search processes R1 to R4 and R6 to R14 shown in FIG. 23, and a description thereof will be omitted to avoid duplication.
  • step W5 the PixelValue, that is, the intensity of light reception and the PixelColor, that is, the light reception color of the pixel at the coordinates (X, Y) are acquired.
  • FIG. 35 is a flowchart illustrating an example of instruction pointer data creation processing called from the sixth display processing.
  • the instruction pointer data generation process is executed by the instruction pointer display data generation unit 20, and when the instruction pointer display data generation unit 20 is called from the sixth display process shown in FIG. 33, the process proceeds to step AA1.
  • In step AA1, initialization is performed by substituting "0" into the variable N.
  • Step AA2 it is determined whether or not the variable N is less than the number of received light Lcount. If the variable N is less than the light reception number Lcount, the process proceeds to step AA3. If the variable N is greater than or equal to the light reception number Lcount, the instruction pointer data creation process is terminated.
  • step AA3 the coordinates of the Nth received light data Lpos are copied to the coordinate data of the Nth instruction pointer data PointInf.
  • step AA4 the color data of each coordinate of the Nth instruction pointer data PointInf is extracted from the data of the received light color PixelColor and copied.
  • step AA5 the light receiving portion shape determining unit 21A is called to determine the shape of the irradiated portion, that is, the light receiving portion.
  • In step AA6, it is determined what shape the light receiving unit shape determination unit 21A has determined. If the determined shape is the dot shape, specifically if the variable A is "0", the process proceeds to step AA8.
  • If the determined shape is the circle shape, specifically if the variable A is "3", the process proceeds to step AA7; if the shape is the horizontal line shape, specifically if the variable A is "1", the process proceeds to step AA9; and if the shape is the vertical line shape, specifically if the variable A is "2", the process proceeds to step AA10 (a sketch of this shape-to-command dispatch appears after this list segment).
  • In step AA10, an "enlargement" command is transmitted to the video generation device 50 by the serial communication processing unit 22.
  • step AA8 “1” is added to the variable N, and the process returns to step AA2.
  • step AA9 a “horizontal scroll” command is transmitted to the video generation device 50A by the serial communication processing unit 22, and the process proceeds to step AA8.
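A minimal sketch of the shape-to-command mapping that steps AA6 to AA10 implement together with FIGS. 29 to 32: a dot only yields a pointer, a circle triggers enlargement, a horizontal line triggers horizontal scrolling, and a vertical line triggers vertical scrolling. The shape codes follow the variable A values used in the shape determination process, and `send_command` is a stand-in for the serial communication processing unit 22.

```python
from typing import Callable

# Variable A values assigned by the shape determination process (FIG. 37).
DOT, HORIZONTAL_LINE, VERTICAL_LINE, CIRCLE = 0, 1, 2, 3

SHAPE_TO_COMMAND = {
    CIRCLE: "enlargement",
    HORIZONTAL_LINE: "horizontal scroll",
    VERTICAL_LINE: "vertical scroll",
    # DOT: no command; only the instruction pointer is displayed.
}

def dispatch_shape(shape_code: int, send_command: Callable[[str], None]) -> None:
    """Send the display-control command associated with the detected light
    shape, if any (cf. steps AA6 to AA10)."""
    command = SHAPE_TO_COMMAND.get(shape_code)
    if command is not None:
        send_command(command)

if __name__ == "__main__":
    dispatch_shape(CIRCLE, send_command=print)   # -> enlargement
    dispatch_shape(DOT, send_command=print)      # -> nothing sent
```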
  • FIG. 36 is a flowchart illustrating an example of an instruction pointer display process called from the sixth display process. After executing step V4 of the display process shown in FIG. 33, the central processing unit 15 proceeds to step BB1.
  • the instruction pointer display processes BB1 to BB9 shown in FIG. 36 are the same as the instruction pointer display processes E1 to E9 shown in FIG. 11, and a description thereof is omitted to avoid duplication.
  • FIG. 37 is a flowchart illustrating an example of a shape determination process called from the instruction pointer data creation process.
  • step CC1 the X coordinate of the pixel of interest is set to the minimum X coordinate Xmin among the X coordinates of the light receiving location.
  • step CC2 it is determined whether or not the value of the X coordinate of the pixel of interest is equal to or less than the value of the maximum X coordinate Xmax among the X coordinates of the light receiving location. If the X coordinate value of the pixel of interest is less than or equal to the X coordinate Xmax, the process proceeds to step CC3.
  • step CC3 it is determined whether or not the inclination in the X-axis direction is within an allowable range. Specifically, it is determined whether or not the range of the Y coordinate of the coordinate X is less than 5, that is, whether or not the difference between the Y coordinate of the pixel of interest and the Y coordinate of the pixel of the coordinate Xmin is less than 5. . If the Y coordinate range of the coordinate X is less than 5, the process proceeds to step CC4. If the Y coordinate range of the coordinate X is 5 or more, the process proceeds to step CC5.
  • step CC4 “1” is added to the variable X, and the process returns to step CC2.
  • step CC5 the Y coordinate of the pixel of interest is set to the minimum Y coordinate Ymin among the Y coordinates of the light receiving location.
  • step CC6 it is determined whether or not the Y coordinate value of the pixel of interest is less than or equal to the maximum Y coordinate Ymax value among the Y coordinates of the light receiving points. If the Y coordinate value of the pixel of interest is equal to or less than the Y coordinate Ymax value, the process proceeds to step CC7. If the Y coordinate value of the pixel of interest is greater than the Y coordinate Ymax value, the process proceeds to step CC13.
  • step CC7 it is determined whether the inclination in the Y-axis direction is within an allowable range. Specifically, it is determined whether or not the range of the X coordinate of the coordinate Y is less than 5, that is, whether or not the difference between the X coordinate of the pixel of interest and the X coordinate of the pixel of the coordinate Ymin is less than 5. . If the X coordinate range of the coordinate Y is less than 5, the process proceeds to Step CC8, and if the X coordinate range of the coordinate Y is 5 or more, the process proceeds to Step CC9. In step CC8, “1” is added to the variable Y, and the process returns to step CC6.
  • In step CC10, it is determined whether or not the central portion is irradiated with light. Specifically, it is determined whether or not, among the pixels included in the light receiving location data Lpos, there are two or more pixels whose coordinates fall within the range whose upper left position is determined by the coordinates (Xm-2, Ym-2) and whose lower right position is determined by the coordinates (Xm+2, Ym+2).
  • step CC11 a value “0” representing the dot shape is substituted for variable A, and the shape determination process is terminated.
  • step CC12 the value “1” representing the horizontal line shape is substituted for the variable A, and the shape determination process ends.
  • step CC13 the value “2” representing the vertical line shape is substituted for the variable A, and the shape determination process ends.
  • step CC14 the value “3” representing the circle shape is substituted for the variable A, and the shape determination process ends.
  • Steps CC1 to CC14 shown in FIG. 37 are shape determination steps (a sketch of this classification logic is given below).
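A minimal sketch of the classification in steps CC1 to CC14: a location whose Y spread stays under 5 pixels is treated as a horizontal line, one whose X spread stays under 5 pixels as a vertical line, and otherwise the location is a dot if the central area (Xm±2, Ym±2) contains at least two lit pixels and a circle if it does not. The spread checks here approximate the flowchart's comparison against the pixel at the minimum X (or Y) coordinate, and (Xm, Ym) is assumed to be the midpoint of the location's bounding box, which the excerpt does not state explicitly.

```python
from typing import List, Tuple

Coord = Tuple[int, int]
DOT, HORIZONTAL_LINE, VERTICAL_LINE, CIRCLE = 0, 1, 2, 3  # variable A values
SLOPE_TOLERANCE = 5    # "less than 5" in steps CC3 and CC7
CENTER_HALF_WIDTH = 2  # (Xm-2, Ym-2) .. (Xm+2, Ym+2) in step CC10

def determine_shape(pixels: List[Coord]) -> int:
    """Classify one light receiving location (cf. steps CC1 to CC14)."""
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]

    # Steps CC1-CC4: horizontal line if the Y spread stays within tolerance.
    if max(ys) - min(ys) < SLOPE_TOLERANCE:
        return HORIZONTAL_LINE
    # Steps CC5-CC8: vertical line if the X spread stays within tolerance.
    if max(xs) - min(xs) < SLOPE_TOLERANCE:
        return VERTICAL_LINE

    # Step CC10: dot if the central area contains at least two lit pixels,
    # otherwise circle.  (Xm, Ym) is taken here as the bounding-box midpoint.
    xm = (min(xs) + max(xs)) // 2
    ym = (min(ys) + max(ys)) // 2
    in_center = [p for p in pixels
                 if abs(p[0] - xm) <= CENTER_HALF_WIDTH
                 and abs(p[1] - ym) <= CENTER_HALF_WIDTH]
    return DOT if len(in_center) >= 2 else CIRCLE

if __name__ == "__main__":
    line = [(x, 20) for x in range(10, 40)]
    ring = [(20 + dx, 20 + dy) for dx in range(-8, 9) for dy in range(-8, 9)
            if 6 <= (dx * dx + dy * dy) ** 0.5 <= 8]
    print(determine_shape(line), determine_shape(ring))  # 1 (horizontal line), 3 (circle)
```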
  • In step DD1, it is determined whether a command has been received. If no command is received, the process proceeds to step DD5; if the received command is a horizontal scroll command, the process proceeds to step DD8; if the received command is a vertical scroll command, the process proceeds to step DD2; and if the received command is an enlargement command, the process proceeds to step DD13.
  • step DD2 it is determined whether or not the display area is the lowermost end of the display data. Specifically, it is determined whether or not the display area includes the lowermost pixel among pixels of pixel information for one page. If the display area is the lowermost end of the display data, the process proceeds to step DD7. If the display area is not the lowermost end of the display data, the process proceeds to step DD3. In step DD3, it is determined whether or not the difference between the lowermost end of the display area and the lowermost end of the display data is one pixel (hereinafter also referred to as “dot”).
  • If the difference between the lowermost end of the display area and the lowermost end of the display data is 1 dot, the process proceeds to step DD6; if the difference is not 1 dot, the process proceeds to step DD4.
  • step DD4 “2” is added to the Y coordinate of the display area. Specifically, “2” is added to the Y coordinates Ppos_ys and Ppos_ye.
  • In step DD5, a video signal for the display area, that is, the area determined by the XY coordinates (Ppos_xs, Ppos_ys) of the pixel at the upper left position and the XY coordinates (Ppos_xe, Ppos_ye) of the pixel at the lower right position, is generated and output by the video signal generation unit 52A, and the process returns to step DD1.
  • step DD6 the display area is set to the lowermost end, and the process proceeds to step DD5. That is, without changing the X coordinate, the lowermost pixel of the display area is set to the lowermost pixel of the image information for one page, and the process proceeds to step DD5.
  • step DD7 the display area is set to the uppermost end, and the process proceeds to step DD5.
  • step DD8 it is determined whether or not the display area is the rightmost end of the display data. Specifically, it is determined whether or not the display area includes the rightmost pixel among pixels of pixel information for one page. If the display area is the rightmost edge of the display data, the process proceeds to step DD12. If the display area is not the rightmost edge of the display data, the process proceeds to step DD9.
  • step DD9 it is determined whether or not the difference between the rightmost edge of the display area and the rightmost edge of the display data is 1 dot. If the difference between the rightmost edge of the display area and the rightmost edge of the display data is 1 dot, the process proceeds to step DD11. If the difference between the rightmost edge of the display area and the rightmost edge of the display data is not 1 dot, the process proceeds to step DD10. .
  • step DD10 "2" is added to the X coordinate of the display area. Specifically, 2 is added to the X coordinates Ppos_xs and Ppos_xe, and the process proceeds to step DD5. In step DD11, the display area is set to the rightmost end, and the process proceeds to step DD5.
  • step DD12 the display area is set to the leftmost end, and the process proceeds to step DD5. That is, without changing the Y coordinate, the leftmost pixel of the display area is set to the leftmost pixel of the image information for one page, and the process proceeds to step DD5.
  • In step DD13, it is determined whether or not the width of the display area, that is, the number of pixels in the X-axis direction, is within 10 dots. If the width of the display area is within 10 dots, the process proceeds to step DD16; if it is not within 10 dots, the process proceeds to step DD14.
  • In step DD14, it is determined whether or not the height of the display area, that is, the number of pixels in the Y-axis direction, is within 10 dots. If the height of the display area is within 10 dots, the process proceeds to step DD16; if it is not within 10 dots, the process proceeds to step DD15. In step DD15, "2" is added to each of the X coordinate and Y coordinate of the upper left pixel of the display area, "2" is subtracted from each of the X coordinate and Y coordinate of the lower right pixel, and the process proceeds to step DD5 (a sketch of this scroll and enlargement handling appears after this list segment).
  • step DD16 the display area is set to a standard size, that is, a non-enlarged size, and the process proceeds to step DD5.
  • the video generation device central processing unit 53A executes steps DD2 to DD7 and steps DD8 to DD12 by the scroll processing unit 56, and executes steps DD13 to DD16 by the enlargement processing unit 57.
  • Steps V1 to V4 shown in FIG. 33 and steps DD1 to DD16 shown in FIG. 38 are sixth control steps.
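A minimal sketch of the display-area updates in steps DD2 to DD16, treating the display area as the upper-left and lower-right pixel coordinates (Ppos_xs, Ppos_ys)-(Ppos_xe, Ppos_ye). The snap-to-bottom and wrap-to-top handling mirrors the behaviour described above; the full-page size is an assumed parameter, and the helper names are illustrative.

```python
from dataclasses import dataclass

SCROLL_STEP = 2      # "2" added to the coordinates in steps DD4 and DD10
MIN_AREA_DOTS = 10   # threshold used in steps DD13 and DD14

@dataclass
class DisplayArea:
    xs: int  # Ppos_xs
    ys: int  # Ppos_ys
    xe: int  # Ppos_xe
    ye: int  # Ppos_ye

def vertical_scroll(area: DisplayArea, page_height: int) -> None:
    """Steps DD2-DD7: scroll down by 2 dots, snapping to the bottom when only
    1 dot remains and returning to the top once the bottom has been reached."""
    height = area.ye - area.ys
    if area.ye >= page_height - 1:        # already at the lowermost end (DD2)
        area.ys, area.ye = 0, height      # step DD7: set the area to the top
    elif page_height - 1 - area.ye == 1:  # 1 dot remaining (step DD3)
        area.ye = page_height - 1         # step DD6: snap to the bottom
        area.ys = area.ye - height
    else:
        area.ys += SCROLL_STEP            # step DD4
        area.ye += SCROLL_STEP

def enlarge(area: DisplayArea, standard: DisplayArea) -> None:
    """Steps DD13-DD16: shrink the display area by 2 dots on each side (which
    enlarges the displayed content) until it is within 10 dots, then reset it
    to the standard, non-enlarged size."""
    if (area.xe - area.xs <= MIN_AREA_DOTS) or (area.ye - area.ys <= MIN_AREA_DOTS):
        area.xs, area.ys = standard.xs, standard.ys   # step DD16
        area.xe, area.ye = standard.xe, standard.ye
    else:
        area.xs += SCROLL_STEP; area.ys += SCROLL_STEP  # step DD15
        area.xe -= SCROLL_STEP; area.ye -= SCROLL_STEP

if __name__ == "__main__":
    area = DisplayArea(0, 0, 99, 59)
    vertical_scroll(area, page_height=200)
    enlarge(area, standard=DisplayArea(0, 0, 99, 59))
    print(area)
```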
  • Although the process for reducing the image information is not described in the above flowchart, the same kind of process can be performed by assigning a shape other than the dot shape, the horizontally long shape, the vertically long shape, and the circle shape to the reduction command.
  • the enlarged image information or the reduced image information may be restored by assigning another shape to a command for restoring the enlarged image information or the reduced image information.
  • As described above, the photosensor built-in display device 13 includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at each of a plurality of positions.
  • The light receiving unit shape determination unit 21A determines the shape formed by the positions at which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, and the central processing unit 15 and the video generation device central processing unit 53A perform predetermined control on the display image in accordance with the shape determined by the light receiving unit shape determination unit 21A.
  • Since the predetermined control is control for scrolling the display image, image information that does not fit on the screen can be scrolled into view depending on the shape of the light applied to the screen.
  • Since the predetermined control is control for enlarging or reducing the display image, or for returning an enlarged or reduced display image to its original size and displaying it, the image information can be enlarged, reduced, or displayed in its original size depending on the shape of the light applied to the screen.
  • In steps CC1 to CC14 shown in FIG. 37, the shape formed by the positions at which the intensity of the light detected by the photosensor is equal to or greater than the predetermined reference intensity is determined.
  • In steps V1 to V4 shown in FIG. 33 and steps DD1 to DD16 shown in FIG. 38, predetermined control is performed on the display image in accordance with the shape determined in steps CC1 to CC14 shown in FIG. 37.
  • When the display method according to the present invention is applied, the display image can be controlled according to the light irradiated on the screen by the pointing device, for example according to the shape formed on the screen by the light. Furthermore, since the predetermined control is control for scrolling the display image, image information that does not fit on the screen can be scrolled into view. Further, since the predetermined control is control for enlarging or reducing the display image, or for returning an enlarged or reduced display image to its original size and displaying it, the image information can be displayed enlarged or reduced.
  • the present invention can be implemented in various other forms without departing from the spirit or main features thereof.

Abstract

Provided are a display device and a display method. A light reception position judgment unit (19) adds a value of a coordinate (X, Y) of the pixel having a light reception intensity judged to be greater than a predetermined value, to the N-th light reception data Lpos. An instruction pointer display data creation unit (20) judges the light reception color of the N-th light reception position according to the light reception data Lpos and the light reception color of each pixel data, and copies the basic coordinate data on the pointer size and the pointer shape corresponding to the judged light reception color to the coordinate data on the instruction pointer data. If the Lcolor corresponding to the light reception color is “3”, the light reception color of each pixel data is set as the color data on the instruction pointer data. If the Lcolor is smaller than “3”, the value of Lcolor is set to the color data on the instruction pointer data. A CPU (15) displays an instruction pointer of the color, the size, and the shape indicated by the instruction pointer, at the position indicated by the coordinate data.

Description

表示装置および表示方法 (Display device and display method)
 本発明は、レーザポインタなどの指示装置によって画面に照射された光の位置を特定することができる表示装置および表示方法に関する。
 本発明は、レーザポインタなどの指示装置によって画面に照射された光を利用する表示装置および表示方法に関する。
The present invention relates to a display device and a display method capable of specifying the position of light irradiated on a screen by an indication device such as a laser pointer.
The present invention relates to a display device and a display method that use light emitted to a screen by a pointing device such as a laser pointer.
 大型の画面を用いてプレゼンテーションを行う発表者は、レーザポインタなどの光を照射して、画面に表示される映像内の位置を指し示し、画面を見る人は、画面に照射されたレーザポインタなどの光の位置によって、発表者が指し示したい位置を認識する。
 特開2001-236181号公報に記載されているポインティングデバイスは、レーザポインタによって指示ポインタが指示されている画面をCCDカメラで撮像し、CCDカメラによって撮像された画像の中の画面の辺を認識し、認識した画面の辺の位置から指示ポインタの座標位置を特定するものである。したがって、画面に対して斜めの位置から指示してもレーザポインタによって指示される位置を特定することができる。
 特開平9-62444号公報に記載されている指示情報入力装置は、プロジェクタによって画像が表示された投影スクリーンをビデオカメラによって撮影し、ビデオカメラによって撮影された映像信号と、表示している映像信号との差分を抽出することによって、レーザポインタによって投射されている光の位置を特定する。さらに、指示情報入力装置は、レーザポインタによって投射されている光の色あるいは輝度が変化したとき、マウスと同様な入力操作をレーザポインタを用いて行うことが可能となる。
 特開平6-230896号公報に記載されている視野角制御表示装置は、見る人と画面との角度に応じて異なる画面を表示装置に表示するものであり、画面を構成する各発光素子に対応して、その発光素子による画面を見ることのできる角度内からきた光のみを選択受光する受光素子を備える。視野角制御表示装置は、この受光素子によって、レーザポインタなどから照射される光を受光する。
 しかしながら、液晶ディスプレイなど表示装置自体が発光する画面にレーザポインタなどで光を照射した場合、画面を見る人は、レーザポインタからの光が液晶ディスプレイに吸収されて、光が照射されている位置を判別し難い場合がある。特開平6-230896号公報に記載されている視野角制御表示装置は、発光素子ごとに受光素子が設けられており、レーザポインタによって光が照射された位置の受光素子が光を受光するので、光が照射された位置を正確に把握することはできるが、画面を見る人に光が照射されている位置をわかりやすくすることはできない。特開平9-62444号公報に記載されている指示情報入力装置は、レーザポインタによって投射されている光の色あるいは輝度の変化によって、マウスと同様な入力操作をレーザポインタを用いて行うことが可能となると記載されているが、具体的にどのようにマウスと同様の操作を行うかについては開示されていない。
A presenter who makes a presentation using a large screen irradiates light such as a laser pointer to indicate the position in the image displayed on the screen, and a viewer who views the screen uses a laser pointer or the like irradiated on the screen. The position that the presenter wants to point to is recognized by the position of the light.
A pointing device described in Japanese Patent Application Laid-Open No. 2001-236181 takes a screen on which a pointing pointer is pointed by a laser pointer by a CCD camera, and recognizes a side of the screen in the image picked up by the CCD camera. The coordinate position of the pointing pointer is specified from the recognized position of the side of the screen. Therefore, the position pointed to by the laser pointer can be specified even if it is pointed from a position oblique to the screen.
An instruction information input device described in Japanese Patent Application Laid-Open No. 9-62444 captures, with a video camera, a projection screen on which an image is displayed by a projector, and specifies the position of the light projected by a laser pointer by extracting the difference between the video signal captured by the video camera and the video signal being displayed. Furthermore, when the color or brightness of the light projected by the laser pointer changes, the instruction information input device can perform an input operation similar to that of a mouse using the laser pointer.
The viewing angle control display device described in JP-A-6-230896 displays different screens on the display device according to the angle between the viewer and the screen, and includes, for each light emitting element constituting the screen, a light receiving element that selectively receives only light arriving from within the angle at which the screen formed by that light emitting element can be seen. The viewing angle control display device receives light emitted from a laser pointer or the like with these light receiving elements.
However, when light from a laser pointer or the like is applied to a screen that emits light by itself, such as a liquid crystal display, the light from the laser pointer is absorbed by the liquid crystal display, and it may be difficult for a person viewing the screen to tell where the light is being applied. In the viewing angle control display device described in JP-A-6-230896, a light receiving element is provided for each light emitting element, and the light receiving element at the position irradiated by the laser pointer receives the light, so the irradiated position can be grasped accurately; however, the device cannot make the irradiated position easier for a person viewing the screen to recognize. The instruction information input device described in JP-A-9-62444 is described as being able to perform an input operation similar to that of a mouse using the laser pointer by changing the color or brightness of the projected light, but it does not disclose specifically how an operation similar to that of a mouse is performed.
 本発明の目的は、光を照射する指示装置によって画面に照射された光の位置を明示することができる表示装置および表示方法を提供することである。
 本発明の他の目的は、光を照射する指示装置によって画面に照射された光に応じた制御を行うことができる表示装置および表示方法を提供することである。
 本発明は、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示手段と、
 前記検出部によって検出された光の強さに応じて、少なくとも光の検出位置を示す画像を前記表示手段に表示させる制御手段とを含むことを特徴とする表示装置である。
 本発明によれば、画像を表示する複数の画素と、受光する光の強さを画素ごとに検出する検出部とを備える表示手段に画像を表示するにあたって、制御手段によって、前記検出部によって検出された光の強さに応じて、少なくとも光の検出位置を示す画像が前記表示手段に表示される。
 したがって、光を照射する指示装置によって画面に照射された光の位置を明示することができる。
 また本発明は、画像を表示する複数の画素と、受光する光の強さを位置ごとに検出する検出部とを備える表示装置に画像を表示する表示方法であって、
 前記検出部によって検出された光の強さに応じて、検出位置を示す画像を表示させる第1の制御ステップとを含むことを特徴とする表示方法である。
 本発明によれば、画像を表示する複数の画素と、受光する光の強さを位置ごとに検出する検出部とを備える表示装置に画像を表示するにあたって、第1の制御ステップでは、前記検出部によって検出された光の強さに応じて、検出位置を示す画像を表示する。
 したがって、本発明に係る表示方法を提供すれば、光を照射する指示装置によって画面に照射された光の位置を明示することができる。
 本発明は、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示手段と、
 前記検出部によって検出された光の強さが変化するパターンを判別する判別手段と、
 前記判別手段によって判別されたパターンが第1のパターンであるとき、前記検出部によって光が検出された位置を示す第1の表示形態のシンボル画像を表示させ、前記判別手段によって判別されたパターンが前記第1のパターンとは異なる第2のパターンであるとき、前記第1の表示形態とは異なり、かつ前記位置を示す第2の表示形態のシンボル画像を表示させる制御手段とを含むことを特徴とする表示装置である。
 本発明によれば、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示手段に画像を表示するにあたって、判別手段によって、前記検出部によって検出された光の強さが変化するパターンが判別され、制御手段によって、前記判別手段によって判別されたパターンが第1のパターンであるとき、前記検出部によって光が検出された位置を示す第1の表示形態のシンボル画像が表示され、前記判別手段によって判別されたパターンが前記第1のパターンとは異なる第2のパターンであるとき、前記第1の表示形態とは異なり、かつ前記位置を示す第2の表示形態のシンボル画像が表示される。
 したがって、光の強さが変化するパターンが異なる光を照射する複数の指示装置によって指示された画面上の位置をそれぞれ明示することができる。
 また本発明は、画像を表示する複数の画素と、受光する光の強さを位置ごとに検出する検出部とを備える表示装置に画像を表示する表示方法であって、
 前記検出部によって検出された光の強さが変化するパターンを判別する判別ステップと、
 前記判別ステップで判別されたパターンが第1のパターンであるとき、前記検出部によって光が検出された位置を示す第1の表示形態のシンボル画像を表示させ、前記判別ステップで判別されたパターンが前記第1のパターンとは異なる第2のパターンであるとき、前記第1の表示形態とは異なり、かつ前記位置を示す第2の表示形態のシンボル画像を表示する第2の制御ステップとを含むことを特徴とする表示方法である。
 本発明によれば、画像を表示する複数の画素と、受光する光の強さを位置ごとに検出する検出部とを備える表示装置に画像を表示するにあたって、判別ステップでは、前記検出部によって検出された光の強さが変化するパターンを判別する。
 そして、第2の制御ステップでは、前記判別ステップで判別されたパターンが第1のパターンであるとき、前記検出部によって光が検出された位置を示す第1の表示形態のシンボル画像を表示させ、前記判別ステップで判別されたパターンが前記第1のパターンとは異なる第2のパターンであるとき、前記第1の表示形態とは異なり、かつ前記位置を示す第2の表示形態のシンボル画像を表示する。
 したがって、本発明に係る表示方法を提供すれば、光の強さが変化するパターンが異なる光を照射する複数の指示装置によって指示された画面上の位置をそれぞれ明示することができる。
 本発明は、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示手段と、
 前記検出部によって検出された光の強さが予め定める基準強度以上である範囲を決定する範囲決定手段と、
 前記範囲決定手段によって決定された範囲ごとに各範囲の中心となる位置を特定する位置特定手段と、
 前記位置特定手段によって特定された位置を示す画像を表示させる制御手段とを含むことを特徴とする表示装置である。
 本発明によれば、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示手段に画像を表示するにあたって、範囲決定手段によって、前記検出部によって検出された光の強さが予め定める基準強度以上である範囲が決定され、位置特定手段によって、前記範囲決定手段によって決定された範囲ごとに各範囲の中心となる位置が特定され、制御手段によって、前記位置特定手段によって特定された位置を示す画像が表示される。
 したがって、懐中電灯などのように画面に照射した光が広がる指示装置を用いる場合にも、操作者が指示装置によって指示する画面上の位置を明示することができる。
 本発明は、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示装置に画像を表示する表示方法であって、
 前記検出部によって検出された光の強さが予め定める基準強度以上である範囲を決定する範囲決定ステップと、
 前記範囲決定ステップで決定された範囲ごとに各範囲の中心にある位置を特定する第1の位置特定ステップと、
 前記第1の位置特定ステップで特定された位置を示す画像を表示する第3の制御ステップとを含むことを特徴とする表示方法である。
 本発明によれば、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示装置に画像を表示するにあたって、範囲決定ステップでは、前記検出部によって検出された光の強さが予め定める基準強度以上である範囲を決定する。第1の位置特定ステップでは、前記範囲決定ステップで決定された範囲ごとに各範囲の中心にある位置を特定する。そして、第3の制御ステップでは、前記第1の位置特定ステップで特定された位置を示す画像を表示する。
 したがって、本発明に係る表示方法を提供すれば、懐中電灯などのように画面に照射した光が広がる指示装置を用いる場合にも、操作者が指示装置によって指示する画面上の位置を明示することができる。
 本発明は、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示手段と、
 前記検出部によって検出された光の強さが予め定める基準強度以上である範囲を決定する範囲決定手段と、
　前記範囲決定手段によって決定された範囲内のうち、光の強さが最も強く検出された位置を前記決定された範囲ごとに特定する位置特定手段と、
 前記位置特定手段によって特定された位置を示す画像を表示させる制御手段とを含むことを特徴とする表示装置である。
 本発明によれば、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示手段に画像を表示するにあたって、範囲決定手段によって、前記検出部によって検出された光の強さが予め定める基準強度以上である範囲が決定され、位置特定手段によって、前記範囲決定手段によって決定された範囲内のうち、光の強さが最も強い位置が前記決定された範囲ごとに特定され、制御手段によって、前記位置特定手段によって特定された位置を示す画像が表示される。
 したがって、懐中電灯などのように画面に照射した光が広がる指示装置を用いる場合にも、操作者が指示装置によって指示する画面上の位置を明示することができる。
 また本発明は、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示装置に画像を表示する表示方法であって、
 前記検出部によって検出された光の強さが予め定める基準強度以上である範囲を決定する範囲決定ステップと、
 前記範囲決定ステップで決定された範囲内のうち、光の強さが最も強い位置を前記決定された範囲ごとに特定する第2の位置特定ステップと、
 前記第2の位置特定ステップで特定された位置を示す画像を表示する第4の制御ステップとを含むことを特徴とする表示方法である。
 本発明によれば、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示装置に画像を表示するにあたって、範囲決定ステップでは、前記検出部によって検出された光の強さが予め定める基準強度以上である範囲を決定する。第2の位置特定ステップでは、前記範囲決定ステップで決定された範囲内のうち、光の強さが最も強い位置を前記決定された範囲ごとに特定する。そして、第4の制御ステップでは、前記第2の位置特定ステップで特定された位置を示す画像を表示する。
 したがって、本発明に係る表示方法を提供すれば、懐中電灯などのように画面に照射した光が広がる指示装置を用いる場合にも、操作者が指示装置によって指示する画面上の位置を明示することができる。
 本発明は、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示手段と、
 前記検出部によって検出された光の強さが変化するパターンを判別する判別手段と、
 前記判別手段によって判別されたパターンが第1のパターンであるとき、表示画像に対して第1の制御を行い、前記判別手段によって判別されたパターンが前記第1のパターンとは異なる第2のパターンであるとき、表示画像に対して第2の制御を行う制御手段とを含むことを特徴とする表示装置である。
　本発明によれば、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示手段に画像を表示するにあたって、判別手段によって、前記検出部によって検出された光の強さが変化するパターンが判別され、制御手段によって、前記判別手段によって判別されたパターンが第1のパターンであるとき、表示画像に対して第1の制御が行われ、前記判別手段によって判別されたパターンが前記第1のパターンとは異なる第2のパターンであるとき、表示画像に対して第2の制御が行われる。
 したがって、光を照射する指示装置によって画面に照射された光に応じた制御、たとえば光の強さが変化するパターンに応じて表示画像に対する制御を行うことができる。
 また本発明は、画像を表示する複数の画素と、受光する光の強さを位置ごとに検出する検出部とを備える表示装置に画像を表示する表示方法であって、
 前記検出部によって検出された光の強さが変化するパターンを判別する判別ステップと、
 前記判別ステップで判別されたパターンが第1のパターンであるとき、表示画像に対して第1の制御を行い、前記判別ステップで判別されたパターンが前記第1のパターンとは異なる第2のパターンであるとき、表示画像に対して第2の制御を行う第5の制御ステップとを含むことを特徴とする表示方法である。
 本発明によれば、画像を表示する複数の画素と、受光する光の強さを位置ごとに検出する検出部とを備える表示装置に画像を表示するにあたって、判別ステップでは、前記検出部によって検出された光の強さが変化するパターンを判別する。そして、第5の制御ステップでは、前記判別ステップで判別されたパターンが第1のパターンであるとき、表示画像に対して第1の制御を行い、前記判別手段によって判別されたパターンが前記第1のパターンとは異なる第2のパターンであるとき、表示画像に対して第2の制御を行う。
 したがって、本発明に係る表示方法を適用すれば、光を照射する指示装置によって画面に照射された光に応じた制御、たとえば光の強さが変化するパターンに応じて表示画像に対する制御を行うことができる。
 本発明は、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とを備える表示手段と、
 前記検出部によって検出された光の強さが予め定める基準強さ以上として検出された基準位置によって形成される形状を判別する形状判別手段と、
 前記形状判別手段によって判別された形状に応じて、前記表示画像に対して予め定める制御を行う制御手段とを含むことを特徴とする表示装置である。
 本発明によれば、表示手段によって、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する検出部とが備えられ、形状判別手段によって、前記検出部によって検出された光の強さが予め定める基準強さ以上として検出された基準位置によって形成される形状が判別され、制御手段によって、前記形状判別手段によって判別された形状に応じて、前記表示画像に対して予め定める制御が行われる。
 したがって、光を照射する指示装置によって画面に照射された光、たとえば光によって画面に形成される形状に応じた制御を行うことができる。
 また本発明は、画像を表示する複数の画素と、受光する光の強さを位置ごとに検出する検出部とを備える表示装置に画像を表示する表示方法であって、
 前記検出部によって検出された光の強さが予め定める基準強さ以上として検出された基準位置によって形成される形状を判別する形状判別ステップと、
 前記形状判別ステップで判別された形状に応じて、前記表示画像に対して予め定める制御を行う第6の制御ステップとを含むことを特徴とする表示方法である。
 本発明によれば、画像を表示する複数の画素と、受光する光の強さを位置ごとに検出する検出部とを備える表示装置に画像を表示するにあたって、形状判別ステップでは、前記検出部によって検出された光の強さが予め定める基準強さ以上として検出された基準位置によって形成される形状を判別する。第6の制御ステップでは、前記形状判別ステップで判別された形状に応じて、前記表示画像に対して予め定める制御を行う。
 したがって、本発明に係る表示方法を適用すれば、光を照射する指示装置によって画面に照射された光、たとえば光によって画面に形成される形状に応じた制御を行うことができる。
An object of the present invention is to provide a display device and a display method that can clearly indicate the position of light irradiated on a screen by an indication device that emits light.
Another object of the present invention is to provide a display device and a display method capable of performing control in accordance with light irradiated on a screen by an instruction device that emits light.
The present invention provides a display means comprising a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at a plurality of positions,
The display device includes a control unit that causes the display unit to display at least an image indicating a light detection position in accordance with the intensity of light detected by the detection unit.
According to the present invention, when an image is displayed on a display means including a plurality of pixels that display an image and a detection unit that detects the intensity of received light for each pixel, the control means causes the display means to display at least an image indicating the light detection position in accordance with the intensity of light detected by the detection unit.
Therefore, the position of the light irradiated on the screen can be clearly indicated by the pointing device that emits light.
Further, the present invention is a display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light for each position.
And a first control step of displaying an image indicating a detection position in accordance with the intensity of light detected by the detection unit.
According to the present invention, when an image is displayed on a display device including a plurality of pixels that display an image and a detection unit that detects the intensity of received light for each position, an image indicating the detection position is displayed in the first control step in accordance with the intensity of light detected by the detection unit.
Therefore, if the display method according to the present invention is provided, the position of the light irradiated on the screen can be clearly indicated by the pointing device that emits light.
The present invention provides a display means comprising a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at a plurality of positions,
Discriminating means for discriminating a pattern in which the intensity of light detected by the detection unit changes;
Control means for displaying, when the pattern discriminated by the discriminating means is the first pattern, a symbol image of a first display form indicating the position where light is detected by the detection unit, and for displaying, when the pattern discriminated by the discriminating means is a second pattern different from the first pattern, a symbol image of a second display form that is different from the first display form and that indicates the position. This is a display device characterized by including the above.
According to the present invention, when an image is displayed on a display means including a plurality of pixels that display an image and a detection unit that detects the intensity of received light at each of a plurality of positions, the discriminating means discriminates a pattern in which the intensity of light detected by the detection unit changes. When the pattern discriminated by the discriminating means is the first pattern, the control means displays a symbol image of a first display form indicating the position where light is detected by the detection unit, and when the pattern discriminated by the discriminating means is a second pattern different from the first pattern, the control means displays a symbol image of a second display form that is different from the first display form and that indicates the position.
Therefore, it is possible to clearly indicate each position on the screen instructed by a plurality of indicating devices that emit light having different patterns of varying light intensity.
Further, the present invention is a display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light for each position.
A determination step of determining a pattern in which the intensity of light detected by the detection unit changes;
A second control step of displaying, when the pattern determined in the determination step is the first pattern, a symbol image of a first display form indicating the position where light is detected by the detection unit, and of displaying, when the pattern determined in the determination step is a second pattern different from the first pattern, a symbol image of a second display form that is different from the first display form and that indicates the position. This is a display method characterized by including the above.
According to the present invention, when an image is displayed on a display device including a plurality of pixels that display an image and a detection unit that detects the intensity of received light for each position, a pattern in which the intensity of light detected by the detection unit changes is determined in the determination step.
In the second control step, when the pattern determined in the determination step is the first pattern, a symbol image of the first display form indicating the position where light is detected by the detection unit is displayed, and when the pattern determined in the determination step is a second pattern different from the first pattern, a symbol image of the second display form that is different from the first display form and that indicates the position is displayed.
Therefore, by providing the display method according to the present invention, it is possible to clearly indicate the positions on the screen instructed by a plurality of instruction devices that emit light having different light intensity patterns.
The present invention provides a display means comprising a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at a plurality of positions,
Range determining means for determining a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity;
Position specifying means for specifying a position that is the center of each range for each range determined by the range determining means;
And a control unit that displays an image indicating the position specified by the position specifying unit.
According to the present invention, when an image is displayed on a display means including a plurality of pixels that display an image and a detection unit that detects the intensity of received light at each of a plurality of positions, the range determining means determines a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity, the position specifying means specifies, for each range determined by the range determining means, the position at the center of that range, and the control means displays an image indicating the position specified by the position specifying means.
Therefore, even when using a pointing device that spreads light applied to the screen, such as a flashlight, it is possible to clearly indicate the position on the screen that the operator points with the pointing device.
The present invention is a display method for displaying an image on a display device including a plurality of pixels that display an image and a detection unit that detects the intensity of received light at a plurality of positions,
A range determining step for determining a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity;
A first position specifying step for specifying a position at the center of each range for each range determined in the range determining step;
And a third control step for displaying an image indicating the position specified in the first position specifying step.
According to the present invention, when an image is displayed on a display device including a plurality of pixels that display an image and a detection unit that detects the intensity of received light at each of a plurality of positions, a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity is determined in the range determining step. In the first position specifying step, the position at the center of each range is specified for each range determined in the range determining step. In the third control step, an image indicating the position specified in the first position specifying step is displayed.
Therefore, if the display method according to the present invention is provided, the position on the screen indicated by the operator with the pointing device can be clearly indicated even when a pointing device whose light spreads on the screen, such as a flashlight, is used.
The present invention provides a display means comprising a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at a plurality of positions,
Range determining means for determining a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity;
Position specifying means for specifying, for each of the determined ranges, a position where the intensity of light is detected most strongly within the range determined by the range determining means;
And a control unit that displays an image indicating the position specified by the position specifying unit.
According to the present invention, when an image is displayed on a display means including a plurality of pixels that display an image and a detection unit that detects the intensity of received light at each of a plurality of positions, the range determining means determines a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity, the position specifying means specifies, for each determined range, the position at which the light intensity is strongest within the range determined by the range determining means, and the control means displays an image indicating the position specified by the position specifying means.
Therefore, even when using a pointing device that spreads light applied to the screen, such as a flashlight, it is possible to clearly indicate the position on the screen that the operator points with the pointing device.
Further, the present invention is a display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at a plurality of positions,
A range determining step for determining a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity;
A second position specifying step for specifying, for each of the determined ranges, a position having the strongest light intensity within the range determined in the range determining step;
And a fourth control step for displaying an image indicating the position specified in the second position specifying step.
According to the present invention, when an image is displayed on a display device including a plurality of pixels that display an image and a detection unit that detects the intensity of received light at each of a plurality of positions, a range in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity is determined in the range determining step. In the second position specifying step, the position at which the light intensity is strongest is specified, for each determined range, within the range determined in the range determining step. In the fourth control step, an image indicating the position specified in the second position specifying step is displayed.
Therefore, if the display method according to the present invention is provided, the position on the screen indicated by the operator with the pointing device can be clearly indicated even when a pointing device whose light spreads on the screen, such as a flashlight, is used.
The present invention provides a display means comprising a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at a plurality of positions,
Discriminating means for discriminating a pattern in which the intensity of light detected by the detection unit changes;
Control means for performing first control on the display image when the pattern discriminated by the discriminating means is the first pattern, and for performing second control on the display image when the pattern discriminated by the discriminating means is a second pattern different from the first pattern. This is a display device characterized by including the above.
According to the present invention, when an image is displayed on a display means including a plurality of pixels that display an image and a detection unit that detects the intensity of received light at each of a plurality of positions, the discriminating means discriminates a pattern in which the intensity of light detected by the detection unit changes. When the pattern discriminated by the discriminating means is the first pattern, the first control is performed on the display image, and when the pattern discriminated by the discriminating means is a second pattern different from the first pattern, the second control is performed on the display image.
Therefore, it is possible to perform control in accordance with the light irradiated on the screen by the pointing device that emits light, for example, control of the display image in accordance with the pattern in which the light intensity changes.
Further, the present invention is a display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light for each position.
A determination step of determining a pattern in which the intensity of light detected by the detection unit changes;
A fifth control step of performing first control on the display image when the pattern determined in the determination step is the first pattern, and of performing second control on the display image when the pattern determined in the determination step is a second pattern different from the first pattern. This is a display method characterized by including the above.
According to the present invention, when an image is displayed on a display device including a plurality of pixels that display an image and a detection unit that detects the intensity of received light for each position, a pattern in which the intensity of light detected by the detection unit changes is determined in the determination step. In the fifth control step, when the pattern determined in the determination step is the first pattern, the first control is performed on the display image, and when the determined pattern is a second pattern different from the first pattern, the second control is performed on the display image.
Therefore, if the display method according to the present invention is applied, it is possible to perform control in accordance with the light irradiated on the screen by the pointing device that emits light, for example, control of the display image in accordance with the pattern in which the light intensity changes.
The present invention provides a display means comprising a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at a plurality of positions,
A shape discriminating means for discriminating a shape formed by a reference position detected as a light intensity detected by the detection unit equal to or higher than a predetermined reference intensity;
And a control unit configured to perform predetermined control on the display image in accordance with the shape determined by the shape determination unit.
According to the present invention, the display means includes a plurality of pixels that display an image and a detection unit that detects the intensity of received light at each of a plurality of positions, the shape discriminating means discriminates the shape formed by the reference positions at which the intensity of light detected by the detection unit is detected to be equal to or greater than a predetermined reference intensity, and the control means performs predetermined control on the display image in accordance with the shape discriminated by the shape discriminating means.
Therefore, it is possible to perform control in accordance with the light irradiated on the screen by the pointing device that emits light, for example, the shape formed on the screen by the light.
Further, the present invention is a display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit that detects the intensity of received light for each position.
A shape discriminating step for discriminating a shape formed by a reference position detected as a light intensity detected by the detection unit equal to or greater than a predetermined reference intensity;
And a sixth control step for performing a predetermined control on the display image in accordance with the shape determined in the shape determination step.
According to the present invention, when an image is displayed on a display device including a plurality of pixels that display an image and a detection unit that detects the intensity of received light for each position, the shape formed by the reference positions at which the detected light intensity is equal to or greater than a predetermined reference intensity is determined in the shape determination step. In the sixth control step, predetermined control is performed on the display image in accordance with the shape determined in the shape determination step.
Therefore, by applying the display method according to the present invention, it is possible to perform control according to the light irradiated on the screen by the pointing device that emits light, for example, the shape formed on the screen by the light.
 本発明の目的、特色、および利点は、下記の詳細な説明と図面とからより明確になるであろう。
本発明の第1の実施形態である表示装置の構成を示す図である。 発光指示装置の光が照射された画面の例を示す図である。 光が照射された位置に表示される指示ポインタの例を示す図である。 光が照射された位置に表示される指示ポインタの例を示す図である。 光が照射された位置に表示される指示ポインタの例を示す図である。 光が照射された位置に表示される指示ポインタの例を示す図である。 照射された光に広がりがある場合の指示ポインタの例を示す図である。 照射された光に広がりがある場合の指示ポインタの例を示す図である。 照射された光に広がりがある場合の指示ポインタの例を示す図である。 照射する光の色が異なる2種類の発光指示装置の光が照射された画面の例を示す図である。 照射する光の色が異なる2種類の発光指示装置の光が照射された画面の例を示す図である。 出射パターンが異なる2種類の発光指示装置の光が照射された画面の例を示す図である。 中央演算部が行う第1の表示処理の一例を示すフローチャートである。 中央演算部が行う第2の表示処理の一例を示すフローチャートである。 第2の表示処理から呼び出される受光箇所検索処理の一例を示すフローチャートである。 第2の表示処理から呼び出される指示ポインタデータ作成処理の一例を示すフローチャートである。 第2の表示処理から呼び出される指示ポインタ表示処理の一例を示すフローチャートである。 中央演算部が行う第3の表示処理の一例を示すフローチャートである。 第3の表示処理から呼び出される受光箇所検索処理の一例を示すフローチャートである。 第3の表示処理から呼び出される指示ポインタデータ作成処理の一例を示すフローチャートである。 第3の表示処理から呼び出される指示ポインタ表示処理の一例を示すフローチャートである。 中央演算部が行う第4の表示処理の一例を示すフローチャートである。 第4の表示処理から呼び出される受光箇所検索処理の一例を示すフローチャートである。 第4の表示処理から呼び出される指示ポインタデータ作成処理の一例を示すフローチャートである。 第4の表示処理から呼び出される指示ポインタ表示処理の一例を示すフローチャートである。 中央演算部が行う設定処理の一例を示すフローチャートである。 本発明の第2の実施形態である表示装置の構成を示す図である。 中央演算部が行う第5の表示処理の一例を示すフローチャートである。 第5の表示処理から呼び出される受光箇所検索処理の一例を示すフローチャートである。 第5の表示処理から呼び出される指示ポインタデータ作成処理の一例を示すフローチャートである。 第5の表示処理から呼び出される指示ポインタ表示処理の一例を示すフローチャートである。 映像生成装置中央演算部が行うコマンド処理の一例を示すフローチャートである。 本発明の第3の実施形態である表示装置の構成を示す図である。 発光指示装置によって照射される光の形状の例を示す図である。 発光指示装置によって照射される光の形状の例を示す図である。 発光指示装置によって照射される光の形状の例を示す図である。 発光指示装置によって照射される光の形状の例を示す図である。 ドット形状の光がパネルに照射されている表示画像の一例を示す図である。 光の形状が横長形状のときに表示される表示画像の一例を示す図である。 光の形状が縦長形状のときに表示される表示画像の一例を示す図である。 光の形状がサークル形状のときに表示される表示画像の一例を示す図である。 中央演算部が行う第6の表示処理の一例を示すフローチャートである。 第6の表示処理から呼び出される受光箇所検索処理の一例を示すフローチャートである。 第6の表示処理から呼び出される指示ポインタデータ作成処理の一例を示すフローチャートである。 第6の表示処理から呼び出される指示ポインタ表示処理の一例を示すフローチャートである。 指示ポインタデータ作成処理から呼び出される形状判定処理の一例を示すフローチャートである。 映像生成装置中央演算部が行うコマンド処理の一例を示すフローチャートである。
Objects, features, and advantages of the present invention will become more apparent from the following detailed description and drawings.
A diagram showing the configuration of a display device according to a first embodiment of the present invention.
A diagram showing an example of a screen irradiated with light from a light emission instruction device.
A diagram showing an example of an instruction pointer displayed at a position irradiated with light.
A diagram showing an example of an instruction pointer displayed at a position irradiated with light.
A diagram showing an example of an instruction pointer displayed at a position irradiated with light.
A diagram showing an example of an instruction pointer displayed at a position irradiated with light.
A diagram showing an example of an instruction pointer when the irradiated light has spread.
A diagram showing an example of an instruction pointer when the irradiated light has spread.
A diagram showing an example of an instruction pointer when the irradiated light has spread.
A diagram showing an example of a screen irradiated with light from two types of light emission instruction devices that emit light of different colors.
A diagram showing an example of a screen irradiated with light from two types of light emission instruction devices that emit light of different colors.
A diagram showing an example of a screen irradiated with light from two types of light emission instruction devices with different emission patterns.
A flowchart showing an example of the first display process performed by the central processing unit.
A flowchart showing an example of the second display process performed by the central processing unit.
A flowchart showing an example of the light reception location search process called from the second display process.
A flowchart showing an example of the instruction pointer data creation process called from the second display process.
A flowchart showing an example of the instruction pointer display process called from the second display process.
A flowchart showing an example of the third display process performed by the central processing unit.
A flowchart showing an example of the light reception location search process called from the third display process.
A flowchart showing an example of the instruction pointer data creation process called from the third display process.
A flowchart showing an example of the instruction pointer display process called from the third display process.
A flowchart showing an example of the fourth display process performed by the central processing unit.
A flowchart showing an example of the light reception location search process called from the fourth display process.
A flowchart showing an example of the instruction pointer data creation process called from the fourth display process.
A flowchart showing an example of the instruction pointer display process called from the fourth display process.
A flowchart showing an example of the setting process performed by the central processing unit.
A diagram showing the configuration of a display device according to a second embodiment of the present invention.
A flowchart showing an example of the fifth display process performed by the central processing unit.
A flowchart showing an example of the light reception location search process called from the fifth display process.
A flowchart showing an example of the instruction pointer data creation process called from the fifth display process.
A flowchart showing an example of the instruction pointer display process called from the fifth display process.
A flowchart showing an example of the command processing performed by the central processing unit of the video generation device.
A diagram showing the configuration of a display device according to a third embodiment of the present invention.
A diagram showing an example of the shape of light emitted by the light emission instruction device.
A diagram showing an example of the shape of light emitted by the light emission instruction device.
A diagram showing an example of the shape of light emitted by the light emission instruction device.
A diagram showing an example of the shape of light emitted by the light emission instruction device.
A diagram showing an example of a display image in which dot-shaped light is applied to the panel.
A diagram showing an example of a display image displayed when the shape of the light is horizontally elongated.
A diagram showing an example of a display image displayed when the shape of the light is vertically elongated.
A diagram showing an example of a display image displayed when the shape of the light is a circle.
A flowchart showing an example of the sixth display process performed by the central processing unit.
A flowchart showing an example of the light reception location search process called from the sixth display process.
A flowchart showing an example of the instruction pointer data creation process called from the sixth display process.
A flowchart showing an example of the instruction pointer display process called from the sixth display process.
A flowchart showing an example of the shape determination process called from the instruction pointer data creation process.
A flowchart showing an example of the command processing performed by the central processing unit of the video generation device.
 以下図面を参考にして本発明の好適な実施形態を詳細に説明する。
 図1は、本発明の第1の実施形態である表示装置1の構成を示す図である。表示装置1は、ディスプレイ装置10および発光指示装置30を含む。本発明に係る表示方法は、ディスプレイ装置10によって処理される。
 ディスプレイ装置10は、N個の映像入力端子11、表示用・表示処理用一時記憶部12、光センサ内蔵表示用デバイス13、表示処理部14、中央演算部15、内部記憶部16、リモートコントロール(以下「リモコン」という)受光部17、リモコン処理部18、受光箇所判定部19、指示ポインタ表示データ作成部20および受光波形判定部21を含む。
 映像入力端子11は、パーソナルコンピュータ(以下「PC」という)などによって構成される後述の映像生成装置(たとえば図21の要素50および図27の要素50A参照)から出力される映像信号が入力される端子であり、N個の入力端子が設けられている。Nは、自然数である。各映像入力端子11から入力された映像信号は、表示用・表示処理用一時記憶部12に送られる。表示用・表示処理用一時記憶部12は、半導体メモリあるいはハードディスク装置などによって構成される書き込みおよび読み出し可能な記憶装置であり、映像入力端子11から受け取る映像信号を表す画像情報、および中央演算部15によって表示処理および表示用加工処理が行われる画像情報を記憶する。表示用・表示処理用一時記憶部12に記憶される画像情報は、表示処理部14および中央演算部15から読み出し可能である。
 表示手段である光センサ内蔵表示用デバイス13は、表示画像を構成する複数の画素、および受光する光の強さおよび色を検出する検出部である光センサを画素ごとに内蔵する液晶ディスプレイである。さらに、光センサ内蔵表示用デバイス13は、各画素を構成する赤、緑および青のサブピクセルごとに光センサが設けられ、受光した光の色も画素ごとに検出することができる。光センサ内蔵表示用デバイス13は、表示処理部14から受け取る画像情報を表示し、各光センサが画素ごとに検出する光の強さ(以下「受光の強さ」という)および色(以下「受光色」という)を中央演算部15に送る。
 表示処理部14は、表示用・表示処理用一時記憶部12から読み出した画像情報を、中央演算部15から指示される形式、たとえば光センサ内蔵表示用デバイス13が表示可能な形式に変換して光センサ内蔵表示用デバイス13に送り表示させる。
 中央演算部15は、中央処理装置(以下「CPU」という)およびディスプレイ装置10を制御するための制御プログラムを記憶するメモリを含み、CPUがメモリに記憶される制御プログラムを実行することによって、光センサ内蔵表示用デバイス13、表示処理部14およびリモコン処理部18を制御する。中央演算部15は、判別手段、判断手段、範囲決定手段、位置特定手段および制御手段である。中央演算部15は、CPUおよびメモリの代わりに、プログラミングすることができるLSI(Large Scale Integration)であるFPGA(Field Programmable Gate Array)、特定の用途のために設計、製造される集積回路であるASIC(Application Specific Integrated 
Circuit)、あるいはその他の演算機能を有する回路によって構成してもよい。
 内部記憶部16は、たとえば書き込みおよび読み出し可能な半導体メモリなどの記憶装置によって構成され、中央演算部15が制御プログラムを実行するときに用いる制御情報161を記憶する。制御情報161は、たとえば受光した光の色を表す受光色、受光した光の強さを表す受光の強さ、受光した光の数を表す受光数、指示ポインタの形状を表すポインタ形状、指示ポインタの色を表すポインタ表示色、指示ポインタの大きさを表すポインタサイズ、指示ポインタ表示の中心位置を表すポインタ表示位置、および指示ポインタの表示位置の基準を表すポインタ表示基準などの情報を含む。
 受光数は、光を受光している受光箇所の数であり、光が重なっていなければ、光を照射している発光指示装置30の数に一致する数である。受光箇所は、発光指示装置30によって光が照射されている少なくとも1つの画素からなる画素群である。シンボル画像である指示ポインタは、発光指示装置30によって光が照射されている位置を明示するための画像であり、受光箇所の代表点の位置を示す。ポインタ表示基準は、受光箇所の中心位置を基準とするか、受光箇所の最も明るい位置を基準とするかを示す。すなわち、代表点を受光箇所の中心位置とするか、受光箇所の最も明るい位置とするかの基準である。
 リモコン受光部17は、ディスプレイ装置10を操作するための情報をディスプレイ装置10に送信する図示しないリモコンからの光を受光し、受光した光を電気信号に変換して、リモコン処理部18に送る。リモコン処理部18は、リモコン受光部17から受け取る電気信号をリモコンから指示された情報に変換して、中央演算部15に送る。
 受光箇所判定部19、指示ポインタ表示データ作成部20および受光波形判定部21は、中央演算部15が制御プログラムを実行することによって実現される機能である。受光箇所判定部19は、光センサ内蔵表示用デバイス13に照射されている光の箇所を判定する。具体的には、受光箇所判定部19は、光センサ内蔵表示用デバイス13の光センサから受け取る受光の強さが、予め定める基準強さ以上、たとえば0~255の256段階で128以上の画素またはそれらの複数の画素が隣接している範囲を、発光指示装置30から照射されている光を受光している受光箇所と判定する。
 指示ポインタ表示データ作成部20は、受光箇所判定部19によって光を受光していると判定された受光箇所について、指示ポインタを表示する位置を決定するとともに、指示ポインタを表す指示ポインタデータを作成する。受光波形判定部21は、発光指示装置30から出射される光の出射パターンを判定する。具体的には、受光波形判定部21は、光センサ内蔵表示用デバイス13の光センサから受け取る受光の強さが、予め定める期間、たとえば2ミリ秒の期間予め定める基準強度以上であるとき、連続する出射パターンであると判定し、予め定める基準強度以上である状態と予め定める基準強度未満である状態とを予め定める期間に繰り返すとき、パルス状の出射パターンであると判定する。パルス状の出射パターンは、繰り返されるパルスの周波数、パルス幅、パルス間隔などを変化させて組み合わせることによって、複数の出射パターンとして用いることも可能である。連続する出射パターンは、第1のパターンであり、パルス状の出射パターンは、第2のパターンである。
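As a purely illustrative aid, the emission-pattern judgment described above (continuous if the received intensity stays at or above the reference level throughout a predetermined period such as 2 ms, pulsed if it alternates above and below it) might be sketched as follows in Python. The sampling model, the 0-255 scale and the threshold value of 128 are assumptions for the sketch, not details taken from this description.

```python
# Minimal sketch of the emission-pattern judgment performed by the light
# reception waveform judgment unit 21. Assumption: intensity samples are
# 0-255 values taken at a fixed rate over the judgment window (e.g. 2 ms).

CONTINUOUS = "continuous"   # first pattern
PULSED = "pulsed"           # second pattern
NO_LIGHT = "none"

def classify_emission(samples, reference=128):
    """Classify one sensor's intensity samples over the judgment window."""
    above = [s >= reference for s in samples]
    if all(above):
        return CONTINUOUS       # stayed at or above the reference intensity
    if any(above):
        return PULSED           # alternated above and below the reference
    return NO_LIGHT

print(classify_emission([200] * 8))                          # continuous
print(classify_emission([200, 0, 210, 0, 190, 0, 205, 0]))   # pulsed
```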
 発光指示装置30は、レーザポインタ、LED(Light Emitting Diode)あるいは懐中電灯など光を画面に照射して位置を指示する指示装置である。発光指示装置30aは、光を発光する発光部31を含み、発光指示装置30bは、発光部31の他に、発光部31をパルス状に発光させる間欠駆動部32を含み、発光指示装置30cは、発光部31および間欠駆動部32の他に、発光をパルス状に照射するか連続して照射するかを切り換えるボタン33を含む。
 図2は、発光指示装置30の光が照射された画面40の例を示す図である。図2に示す発光指示装置30は、レーザポインタなど焦点の広がりが小さい指示装置であり、発光指示装置30からのレーザ光線39が画面40のうち、人物画像が表示されている領域41に照射されている。画面40の右側に引き出した画面41は、画素42の1つ1つがわかるように拡大した画面であり、画面41には、人物の表示画像43およびレーザ光線39が照射されているレーザポインタ照射箇所44が示されている。
 各画素42は、赤色のサブピクセル421、緑色のサブピクセル422および青色のサブピクセル423によって構成され、さらに各画素42には光センサ46が設けられる。光センサ46は、色を検出する3つの光センサ461~463および光の強さを検出する図示しない光センサを含む。光センサ461は赤色を検出し、光センサ462は緑色を検出し、光センサ463は青色を検出する。
 図3A~図3Dは、光が照射された位置に表示される指示ポインタの例を示す図である。図3Aは、レーザ光線39が光を照射して指し示している位置を赤色で表した指示ポインタ45aを表示する拡大画面41aを示す。図3Bは、レーザ光線39が光を照射して指し示している位置を3×3の白色の画素で表した指示ポインタ45bを表示する拡大画面41bを示す。図3Cは、レーザ光線39が光を照射して指し示している位置を5×5の緑色の画素で表した指示ポインタ45cを表示する拡大画面41cを示す。指示ポインタ45cは、図3Bに示した指示ポインタ45bよりも大きさを大きくした指示ポインタである。図3Dは、レーザ光線39が光を照射して指し示している位置を赤色の矢印の形状で示す指示ポインタ45dを表示する拡大画面41dを示す。矢印の先端は、たとえば光が照射されている画素の中心の画素の位置を示す。
 図4A~図4Cは、照射された光に広がりがある場合の指示ポインタの例を示す図である。図4Aは、懐中電灯からの光が広がりのある領域44eに照射されていることを示している。図4Aでは、懐中電灯によって照射されている光は、領域44eの中心ほど光の強さが強い。指示ポインタ表示データ作成部20は、領域44eの中心の画素の位置、または領域44eに含まれる画素のうち最も受光の強さが強い画素の位置を、指示ポインタを表示する位置と決定する。図4Bは、指示ポインタ表示データ作成部20が決定した位置に赤色の指示ポインタ45fを表示した拡大画面41fを示し、図4Cは、指示ポインタ表示データ作成部20が決定した位置に赤色の矢印の形状の指示ポインタ45gを表示した拡大画面41gを示す。
 図5Aおよび図5Bは、照射する光の色が異なる2種類の発光指示装置30a,30bの光が照射された画面40hの例を示す図である。発光指示装置30a,30bは、いずれもレーザポインタによって構成され、発光指示装置30aは赤色のレーザ光線を照射し、発光指示装置30bは黄色のレーザ光線を照射する。図5Aは、発光指示装置30aからの赤色のレーザ光線および発光指示装置30bからの黄色のレーザ光線が画面40hに照射されていることを示している。拡大画面41hには、それぞれのレーザ光線が照射されている画素45h1,45h2が示されている。図5Bは、発光指示装置30aからの赤色のレーザ光線が照射されている位置に、3×3画素からなる四角形の赤い指示ポインタ45j1が表示され、発光指示装置30bからの黄色のレーザ光線が照射されている位置に、黄色の矢印の形状の指示ポインタ45j2が表示されていることを示している。
 図6は、出射パターンが異なる2種類の発光指示装置30c,30dの光が照射された画面40kの例を示す図である。発光指示装置30c,30dは、いずれもレーザポインタによって構成され、発光指示装置30cはレーザ光線を連続する出射パターンで照射し、発光指示装置30dはレーザ光線をパルス状の出射パターンで照射する。図6は、発光指示装置30cからのレーザ光線および発光指示装置30dからのレーザ光線が画面40kに照射されていることを示しており、拡大画面41kには、それぞれのレーザ光線が照射されている画素45k1,45k2が示されている。
 図6に示した例では、受光波形判定部21は、画素45k1の光センサから受け取る受光の強さが、予め定める期間予め定める基準強度以上であるので、連続する出射パターンであると判定し、画素45k2の光センサから受け取る受光の強さが、予め定める基準強度以上である状態と予め定める基準強度未満である状態とを予め定める期間に繰り返すので、パルス状の出射パターンであると判定する。
　図7~図20のフローチャートで用いる変数には、受光数Lcount、ポインタサイズLsize、ポインタ表示色Lcolor、ポインタ形状Lpattern、ポインタ表示基準Lbase、受光色PixelColor、受光の強さPixelValue、受光箇所の座標データLpos、および指示ポインタデータPointInfの変数がある。
 受光数Lcountは、光を受光している受光箇所の数であり、光が照射されている箇所が隣接する複数の画素から構成される場合は、その1つのまとまりを1箇所として数える。変数Lcountは、0からLMAXまでの整数であり、LMAXは、受光色の数の最大数、たとえば「3」である。
 ポインタサイズLsizeは、指示ポインタのサイズを表す値であり、値が「0」であると、大きいサイズを表し、値が「1」であると、中間のサイズを表し、値が「2」であると、小さいサイズを表す。ポインタ表示色Lcolorは、指示ポインタの色を表す値であり、値が「0」であると、赤色を表し、値が「1」であると、緑色を表し、値が「2」であると、青色を表し、値が「3」であると、受光した色を表す。ポインタ形状Lpatternは、指示ポインタの形状を表す値であり、値が「0」であると、四角形を表し、値が「1」であると、丸を表し、値が「2」であると、矢印を表す。
 ポインタサイズLsize、ポインタ表示色Lcolorおよびポインタ形状Lpatternは、それぞれK個、たとえば3個、すなわち受光色ごとに設定可能であり、それぞれを区別するときは、Lsize[K]、Lcolor[K]およびLpattern[K]と表す。以下、ポインタサイズLsize、ポインタ表示色Lcolorおよびポインタ形状Lpatternのデータを基本座標データともいう。ポインタサイズLsize、ポインタ表示色Lcolorおよびポインタ形状Lpatternは、予め定める表示形態である。
 ポインタ表示基準Lbaseは、懐中電灯などによって照射された光のように、照射された光に広がりがある場合、つまり光が照射されている箇所が隣接する複数の画素から構成される場合に、指示ポインタを表示する位置を、光が照射されている箇所の中心の位置を基準(以下「中央基準」という)にするか、照射されている光の強さが最も強い画素の位置を基準(以下「明るさ基準」という)にするかを示す変数であり、値が「0」であると、中央基準にすることを表し、値が「1」であると、明るさ基準にすることを表す。
 受光色PixelColorおよび受光の強さPixelValueは、受光した光についての受光色および受光の強さをそれぞれ画素ごとに表す画素データである。受光色PixelColorは、赤色のサブピクセル、緑色のサブピクセルおよび青色のサブピクセルに対応する3つの要素から構成され、それぞれ「0」~「255」の256段階の階調で表わされる。各画素の画素データは、赤色のサブピクセルの階調、緑色のサブピクセルの階調、青色のサブピクセルの階調および受光の強さの4つの要素から構成される。4つの要素のうち、赤色のサブピクセルの階調、緑色のサブピクセルの階調および青色のサブピクセルの階調は、受光色PixelColorの画素データであり、受光の強さは、受光の強さPixelValueの画素データである。
 光センサ内蔵表示用デバイス13の画面は、横M個および縦N個の画素から構成され、画面に向かって左上の画素を原点とするXY座標で各画素の位置を表す。たとえばXY座標が(0,0)の画素の画素データは、(255,255,255,255)であり、XY座標が(0,1)の画素の画素データは、(255,250,255,252)であり、XY座標が(100,200)の画素の画素データは、(120,100,200,180)であり、XY座標が(100,201)の画素の画素データは、(122,98,210,190)であり、XY座標が(M-1,N-2)の画素の画素データは、(0,0,0,0)であり、XY座標が(M-1,N-1)の画素の画素データは、(0,0,0,0)である。各画素の受光色および受光の強さを表すときは、各画素のXY座標を用いて、それぞれPixelColor[x,y]およびPixelValue[x,y]と表す。
　受光箇所の座標データ(以下「受光データ」ともいう)Lposは、受光箇所に含まれる各画素の座標を、受光箇所ごとに表す。指示ポインタデータPointInfは、指示ポインタを表示するXY座標、色、大きさおよび形状を、受光箇所ごとに表すデータである。ポインタサイズLsize、ポインタ表示色Lcolor、ポインタ形状Lpatternおよびポインタ表示基準Lbaseの変数の設定は、図20に示す設定処理で詳述する。
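The variables defined above can be pictured, for illustration only, as the following Python containers. The panel resolution and the choice of lists as containers are assumptions; the value ranges and the meaning of each setting follow the text.

```python
# Illustrative containers for the variables used in the flowcharts.

M, N = 1920, 1080        # assumed panel resolution (the text only calls it M x N)
LMAX = 3                 # maximum number of received-light colours

# Per-pixel data: (R, G, B) gradations 0-255 plus the received intensity 0-255.
PixelColor = [[(0, 0, 0) for _ in range(N)] for _ in range(M)]   # PixelColor[x][y]
PixelValue = [[0 for _ in range(N)] for _ in range(M)]           # PixelValue[x][y]

# Per-location data.
Lcount = 0                               # number of light-reception locations
Lpos = [[] for _ in range(LMAX)]         # Lpos[n]: list of (x, y) in location n
PointInf = [[] for _ in range(LMAX)]     # PointInf[n]: coordinates, colour, size, shape

# Per-received-colour pointer settings (index K).
Lsize = [1, 1, 1]       # 0: large, 1: medium, 2: small
Lcolor = [0, 0, 0]      # 0: red, 1: green, 2: blue, 3: use the received colour
Lpattern = [0, 0, 0]    # 0: square, 1: circle, 2: arrow
Lbase = 0               # 0: centre reference, 1: brightness reference
```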
 図7は、中央演算部15が行う第1の表示処理の一例を示すフローチャートである。第1の表示処理は、光を受光した画素に、受光した光の色つまり受光色と同じ色の指示ポインタを表示する処理である。中央演算部15が表示用・表示処理用一時記憶部12に記憶される画像情報を光センサ内蔵表示用デバイス(以下「パネル」ともいう)13に表示するように表示処理部14に指示した後、ステップA1に移る。
 ステップA1では、パネル13から各光センサが検出した画素ごとのデータ、具体的には受光色および受光の強さのデータを取得し、それぞれ画素ごとに受光色PixelColorおよび受光の強さPixelValueにセットする。ステップA2では、変数X,Yに「0」をそれぞれ代入して初期化する。変数X,Yの値は、自然数である。ステップA3では、変数Yの値がパネル13の垂直解像度、すなわち画面の縦方向の画素数未満であるか否かを判定する。Yの値がパネル13の垂直解像度未満であると、ステップA4に進み、Yの値がパネル13の垂直解像度以上であると、第1の表示処理を終了する。
 ステップA4では、変数Xに「0」を代入して初期化する。ステップA5では、変数Xの値がパネル13の水平解像度、すなわち画面の横方向の画素数未満であるか否かを判定する。変数Xの値がパネル13の水平解像度未満であると、ステップA6に進み、変数Xの値がパネル13の水平解像度以上であると、ステップA10に進む。ステップA6では、座標(X,Y)の画素のPixelValueつまり受光の強さを取得する。ステップA7では、取得したPixelValueの値が、所定の値、たとえば上述した予め定める基準強さより大きいか否か、すなわち光が照射されているか否かを判定する。PixelValueの値が所定の値より大きいと、ステップA8に進み、PixelValueの値が所定の値より大きくないと、ステップA9に進む。
 ステップA8では、座標(X,Y)の画素を、座標(X,Y)のPixelColorつまり受光色で表示する。具体的には、中央演算部15は、表示用・表示処理用一時記憶部12に記憶される画像情報のうち、座標(X,Y)の画素の色を、座標(X,Y)のPixelColorつまり受光色の色に置き換える。表示処理部14は、座標(X,Y)の画素の色が書き換えられた画像情報を表示用・表示処理用一時記憶部12から読み出し、所定の形式に変換してパネル13に表示させる。ステップA9では、変数Xに「1」を加算して、ステップA5に戻る。ステップA10では、変数Yに「1」を加算して、ステップA3に戻る。
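A compact sketch of the per-pixel loop in steps A1-A10 (repaint every pixel whose received intensity exceeds the predetermined value with the colour detected at that pixel) is shown below, assuming the same 0-255 scale; the frame buffer interface and the threshold value are stand-ins, not the device's actual API.

```python
# Sketch of the first display process (steps A1-A10).

THRESHOLD = 128   # assumed "predetermined value" on the 0-255 scale

def first_display_process(pixel_color, pixel_value, framebuffer):
    width = len(framebuffer)         # horizontal resolution (A5)
    height = len(framebuffer[0])     # vertical resolution (A3)
    for y in range(height):                           # A3, A10
        for x in range(width):                        # A5, A9
            if pixel_value[x][y] > THRESHOLD:         # A6, A7
                framebuffer[x][y] = pixel_color[x][y] # A8: repaint in the received colour
```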
 図8は、中央演算部15が行う第2の表示処理の一例を示すフローチャートである。第2の表示処理は、光を受光した受光箇所の位置に、予め設定されている色および形状の指示ポインタを表示する処理である。中央演算部15が表示用・表示処理用一時記憶部12に記憶される画像情報をパネル13に表示するように表示処理部14に指示した後、ステップB1に移る。
 ステップB1では、パネル13から各光センサが検出した画素ごとのデータ、具体的には受光色および受光の強さのデータを取得し、それぞれ画素ごとに受光色PixelColorおよび受光の強さPixelValueにセットする。ステップB2では、受光箇所検索処理を行う。ステップB3では、指示ポインタデータ作成処理を行う。ステップB4では、指示ポインタ表示処理を行って、ステップB1に戻る。
 図9は、第2の表示処理から呼び出される受光箇所検索処理の一例を示すフローチャートである。受光箇所検索処理は、受光箇所判定部19によって実行され、受光箇所判定部19は、図8に示した第2の表示処理から呼び出されると、ステップC1に移る。
 ステップC1では、変数X,Yおよび受光数の変数Lcountに「0」をそれぞれ代入して初期化する。ステップC2では、変数Yの値がパネル13の垂直解像度、すなわち画面の縦方向の画素数未満であるか否かを判定する。Yの値がパネル13の垂直解像度未満であると、ステップC3に進み、Yの値がパネル13の垂直解像度以上であると、受光箇所検索処理を終了する。ステップC3では、変数Xに「0」を代入して初期化する。ステップC4では、変数Xの値がパネル13の水平解像度、すなわち画面の横方向の画素数未満であるか否かを判定する。変数Xの値がパネル13の水平解像度未満であると、ステップC5に進み、変数Xの値がパネル13の水平解像度以上であると、ステップC13に進む。
 ステップC5では、座標(X,Y)の画素のPixelValueつまり受光の強さを取得する。ステップC6では、取得したPixelValueの値が、所定の値、つまり前記予め定める基準強さより大きいか否か、すなわち光が照射されているか否かを判定する。PixelValueの値が所定の値より大きいと、ステップC7に進み、PixelValueの値が所定の値より大きくないと、ステップC11に進む。ステップC7では、受光データカウンタである変数Nに「0」を代入して初期化する。
 ステップC8では、変数Nの値がLcountの値つまりすでに検出されている受光数未満であるか否かを判定する。変数Nの値がLcountの値未満であると、ステップC9に進み、変数Nの値がLcountの値以上であると、ステップC14に進む。ステップC9では、N番目の受光データLposに、すでに光が照射されているとして検出された隣接する画素の座標(以下「隣接座標」という)があるか否かを判定する。N番目の受光データLposに隣接座標があると、ステップC10に進み、N番目の受光データLposに隣接座標がないと、ステップC12に進む。
 ステップC10では、座標(X,Y)の値を、N番目の受光データLposに追加する。ステップC11では、変数Xに「1」を加算して、ステップC4に戻る。ステップC12では、変数Nに「1」を加算して、ステップC8に戻る。ステップC13では、変数Yに「1」を加算して、ステップC2に戻る。ステップC14では、変数Lcountに「1」を加算、すなわち受光数をカウントアップして、ステップC10に進む。
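The grouping logic of steps C1-C14 can be sketched as follows: lit pixels are gathered into reception locations, and a pixel joins an existing location when it is adjacent to a pixel already recorded there. The 8-neighbourhood adjacency test and the threshold value are assumptions; the text does not define them precisely.

```python
# Sketch of the light-reception location search (steps C1-C14).

THRESHOLD = 128   # assumed reference intensity

def is_adjacent(x, y, group):
    return any(abs(x - gx) <= 1 and abs(y - gy) <= 1 for gx, gy in group)

def search_reception_locations(pixel_value):
    lpos = []                                        # lpos[n]: pixels of location n
    width, height = len(pixel_value), len(pixel_value[0])
    for y in range(height):                          # C2, C13
        for x in range(width):                       # C4, C11
            if pixel_value[x][y] <= THRESHOLD:       # C5, C6
                continue
            for group in lpos:                       # C7-C9, C12
                if is_adjacent(x, y, group):
                    group.append((x, y))             # C10
                    break
            else:
                lpos.append([(x, y)])                # C14: new location, then C10
    return lpos                                      # Lcount corresponds to len(lpos)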
 図10は、第2の表示処理から呼び出される指示ポインタデータ作成処理の一例を示すフローチャートである。指示ポインタデータ作成処理は、指示ポインタ表示データ作成部20によって実行され、指示ポインタ表示データ作成部20は、図8に示した第2の表示処理から呼び出されると、ステップD1に移る。
　ステップD1では、変数Nに「0」を代入して初期化する。ステップD2では、変数Nが受光数Lcount未満であるか否かを判定する。変数Nが受光数Lcount未満であると、ステップD3に進み、変数Nが受光数Lcount以上であると、ステップD13に進む。ステップD3では、N番目の受光箇所の受光色を、受光箇所の座標データつまり受光データLposと、各画素データの受光色PixelColorとに基づいて判定する。ステップD4では、ステップD3で判定した受光色に対応したポインタサイズLsizeおよびポインタ形状Lpatternの基本座標データを、指示ポインタデータPointInfの座標データにコピーする。基本座標データは、基準位置をXY座標(0,0)とした場合の指示ポインタの各画素のXY座標によって表される。
 ステップD5では、ステップD3で判定した受光色に対応したポインタ表示色Lcolorの値が「3」つまり受光した色であるか否かを判定する。ポインタ表示色Lcolorの値が「3」であると、ステップD11に進み、ポインタ表示色Lcolorの値が「3」でないと、ステップD6に進む。ステップD6では、指示ポインタデータPointInfの色データに、ポインタ表示色Lcolorの色を設定する。ポインタ表示色Lcolorの色は、たとえば赤色、緑色および青色のうちのいずれか1つの値である。
 ステップD7では、ポインタ表示基準Lbaseの値が「0」であるか否か、つまり表示基準が中央部分つまり中央基準か否かを判定する。ポインタ表示基準Lbaseの値が「0」であると、ステップD8に進み、ポインタ表示基準Lbaseの値が「0」でないと、ステップD12に進む。ステップD8では、N番目の受光箇所に含まれる画素のXY座標の平均値を求め、求めたX座標の平均値を変数Xbにセットし、求めたY座標の平均値を変数Ybにセットする。
 ステップD9では、ステップD4で指示ポインタデータPointInfの座標データにコピーした基本座標のX座標に変数Xbの値を加え、Y座標に変数Ybの値を加える。
　ステップD10では、変数Nに「1」を加算して、ステップD2に戻る。ステップD11では、指示ポインタデータPointInfの座標に対応する色データを、各画素データの受光色PixelColorから取り出して、指示ポインタデータPointInfの色データに設定し、ステップD7に進む。
 ステップD12では、N番目の受光箇所に含まれる画素のうち、受光の強さPixelValueの値が最も大きい画素を検索し、検索した画素のデータのX座標を変数Xbにセットし、Y座標を変数Ybにセットする。ステップD13では、指示ポインタデータPointInfのうち、座標の値が負、またはパネル13の解像度つまり画素数を超える値のデータを取り除く。ステップD14では、指示ポインタデータPointInfを座標順にソートつまり並べ替えて、指示ポインタデータ作成処理を終了する。
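The overall flow of steps D1-D14 is sketched below. The received-colour judgment (D3) and the stored base pointer shapes (D4) are simplified stand-ins, and the returned record format is illustrative only; only the sequence of decisions follows the text.

```python
# Sketch of the instruction pointer data creation process (steps D1-D14).

def base_shape(pattern, size):
    """Stand-in for the stored basic coordinate data: a filled square whose
    half-width depends on the size setting (0: large, 1: medium, 2: small)."""
    half = {0: 2, 1: 1, 2: 0}[size]
    return [(dx, dy) for dy in range(-half, half + 1) for dx in range(-half, half + 1)]

def create_pointer_data(lpos, pixel_color, pixel_value,
                        lsize, lcolor, lpattern, lbase, panel_w, panel_h):
    point_inf = []
    for group in lpos:                                            # D1, D2, D10
        k = 0                                                     # D3: judged colour index (stubbed)
        if lbase == 0:                                            # D7, D8: centre reference
            xb = round(sum(x for x, _ in group) / len(group))
            yb = round(sum(y for _, y in group) / len(group))
        else:                                                     # D12: brightest pixel
            xb, yb = max(group, key=lambda p: pixel_value[p[0]][p[1]])
        if lcolor[k] == 3:                                        # D5, D11: use the received colour
            paint = pixel_color[xb][yb]
        else:                                                     # D6: preset colour code
            paint = lcolor[k]
        pixels = [(xb + dx, yb + dy)
                  for dx, dy in base_shape(lpattern[k], lsize[k])]              # D4, D9
        pixels = [(x, y) for x, y in pixels
                  if 0 <= x < panel_w and 0 <= y < panel_h]                     # D13
        point_inf.append({"pixels": sorted(pixels), "colour": paint})           # D14
    return point_inf
```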
 図11は、第2の表示処理から呼び出される指示ポインタ表示処理の一例を示すフローチャートである。中央演算部15は、図8に示した第2の表示処理のステップB4を実行すると、ステップE1に移る。
 ステップE1では、変数X,Y,Zに「0」をそれぞれ代入して初期化する。ステップE2では、変数Yの値がパネル13の垂直解像度未満であるか否かを判定する。Yの値がパネル13の垂直解像度未満であると、ステップE3に進み、Yの値がパネル13の垂直解像度以上であると、指示ポインタ表示処理を終了する。ステップE3では、変数Xに「0」を代入して初期化する。ステップE4では、変数Xの値がパネル13の水平解像度未満であるか否かを判定する。変数Xの値がパネル13の水平解像度未満であると、ステップE5に進み、変数Xの値がパネル13の水平解像度以上であると、ステップE9に進む。
 ステップE5では、座標(X,Y)とZ番目のPointInfのデータの座標とが同じであるか否かを判定する。座標(X,Y)とZ番目のPointInfのデータの座標とが同じであると、ステップE6に進み、座標(X,Y)とZ番目のPointInfのデータの座標とが同じでないと、ステップE8に進む。
 ステップE6では、座標(X,Y)の画素を、Z番目のPointInfのデータの色で表示する。具体的には、中央演算部15は、表示用・表示処理用一時記憶部12に記憶される画像情報のうち、座標(X,Y)の画素の色を、Z番目のPointInfのデータの色に置き換える。表示処理部14は、座標(X,Y)の画素の色が書き換えられた画像情報を表示用・表示処理用一時記憶部12から読み出し、所定の形式に変換してパネル13に表示させる。ステップE7では、変数Zに「1」を加算する。ステップE8では、変数Xに「1」を加算して、ステップE4に戻る。ステップE9では、変数Yに「1」を加算して、ステップE2に戻る。図10に示したステップD1~D14および図11に示したステップE1~E9は、第1の制御ステップである。
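The overlay loop of steps E1-E9 reduces to repainting every coordinate recorded in the pointer data with that pointer's colour. The sketch below assumes the record format of the previous sketch; it is not the device's actual drawing routine.

```python
# Sketch of the instruction pointer display process (steps E1-E9).

def display_pointers(framebuffer, point_inf):
    width, height = len(framebuffer), len(framebuffer[0])
    for entry in point_inf:
        for x, y in entry["pixels"]:
            if 0 <= x < width and 0 <= y < height:    # defensive; D13 already clipped
                framebuffer[x][y] = entry["colour"]   # E5, E6
```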
 図12は、中央演算部15が行う第3の表示処理の一例を示すフローチャートである。第3の表示処理は、複数の出射パターン、たとえば連続する出射パターンおよびパルス状の出射パターンの発光指示装置30が用いられる場合に、発光指示装置ごとに異なる色および形状の指示ポインタを表示する処理である。中央演算部15が表示用・表示処理用一時記憶部12に記憶される画像情報をパネル13に表示するように表示処理部14に指示した後、ステップF1に移る。ステップF1~F4は、それぞれ図8に示したステップB1~B4に対応し、重複を避けるために説明を省略する。
 図13は、第3の表示処理から呼び出される受光箇所検索処理の一例を示すフローチャートである。図14は、第3の表示処理から呼び出される指示ポインタデータ作成処理の一例を示すフローチャートである。図15は、第3の表示処理から呼び出される指示ポインタ表示処理の一例を示すフローチャートである。図13に示す受光箇所検索処理は、図9に示した受光箇所検索処理と同じであり、図15に示す指示ポインタ表示処理は、図11に示した指示ポインタ表示処理と同じであり、重複を避けるために説明は省略する。
 図14に示す指示ポインタデータ作成処理は、指示ポインタ表示データ作成部20によって実行され、指示ポインタ表示データ作成部20は、図12に示した第3の表示処理から呼び出されると、ステップH1に移る。ステップH1,H2は、それぞれ図10に示したステップD1,D2と同じであり、ステップH6~H9は、それぞれ図10に示したステップD7~D10と同じであり、ステップH12~H14は、それぞれ図10に示したステップD12~D14と同じであり、重複を避けるために説明は省略する。
 ステップH3では、受光波形判定部21の判定結果に基づいて、出射パターンがパルス状であるか否かを判定する。出射パターンがパルス状であると、ステップH4に進み、出射パターンがパルス状でないと、つまり連続状の出射パターンであると、ステップH10に進む。ステップH4では、形状が四角形、大きさが「中間」の基準座標データを、指示ポインタデータPointInfの座標データにコピーする。ステップH5では、指示ポインタデータPointInfの色データに、青色を設定する。
 ステップH10では、形状が丸、大きさが「大きい」の基準座標データを、指示ポインタデータPointInfの座標データにコピーする。ステップH11では、指示ポインタデータPointInfの色データに、赤色を設定し、ステップH6に進む。
 赤色は、第1の色であり、青色は、第2の色であり、大きさが大きい丸の形状は、第1の形状であり、大きさが中間の四角形の形状は、第2の形状である。図14に示したステップH3は、判別ステップである。図14に示したステップH1,H2およびステップH4~14ならびに図15に示したステップJ1~J9は、第2の制御ステップである。
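The branch in steps H3-H11 can be summarised, purely as an illustration, as a mapping from the detected emission pattern to a pointer style; the string labels below are hypothetical names, not values used by the device.

```python
# Sketch of steps H3-H11: pulsed emission -> medium blue square,
# continuous emission -> large red circle.

def pointer_style_for(emission_pattern):
    if emission_pattern == "pulsed":               # H3 -> H4, H5
        return ("square", "medium", "blue")
    return ("circle", "large", "red")              # H10, H11

print(pointer_style_for("pulsed"))       # ('square', 'medium', 'blue')
print(pointer_style_for("continuous"))   # ('circle', 'large', 'red')
```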
 図13に示したステップG1~G14は、範囲決定ステップである。
 図14に示したステップH7,H8は、第1の位置特定ステップである。
 図14に示したステップH8,H12は、第2の位置特定ステップである。
 図14に示したステップH1~H6,H9~H11,H13,H14および図15に示したステップJ1~J9は、第3および第4の制御ステップである。
 図16は、中央演算部15が行う第4の表示処理の一例を示すフローチャートである。第4の表示処理は、複数の出射パターン、たとえば連続する出射パターンおよびパルス状の出射パターンの発光指示装置30が用いられる場合に、発光指示装置ごとに異なる色および形状の指示ポインタを表示するとともに、いずれか1つの発光指示装置については、画面制御に係る制御を行う処理である。画面制御に係る制御は、たとえば画面のページをめくる制御であり、具体的には、画面のページをめくる改頁コマンドを、映像信号をディスプレイ装置10に入力しているPCなどに送信することによって行う。中央演算部15が表示用・表示処理用一時記憶部12に記憶される画像情報をパネル13に表示するように表示処理部14に指示した後、ステップK1に移る。ステップK1~K4は、それぞれ図8に示したステップB1~B4に対応し、重複を避けるために説明を省略する。
　図17は、第4の表示処理から呼び出される受光箇所検索処理の一例を示すフローチャートである。図18は、第4の表示処理から呼び出される指示ポインタデータ作成処理の一例を示すフローチャートである。図19は、第4の表示処理から呼び出される指示ポインタ表示処理の一例を示すフローチャートである。図17に示す受光箇所検索処理は、図9に示した受光箇所検索処理と同じであり、図19に示す指示ポインタ表示処理は、図11に示した指示ポインタ表示処理と同じであり、重複を避けるために説明は省略する。
 図18に示す指示ポインタデータ作成処理は、指示ポインタ表示データ作成部20によって実行され、指示ポインタ表示データ作成部20は、図16に示した第4の表示処理から呼び出されると、ステップM1に移る。ステップM1~M5は、それぞれ図14に示したステップH1~H5と同じであり、ステップM9~M12は、それぞれ図14に示したステップH8~H11と同じであり、ステップM14,M15は、それぞれ図14に示したステップH13,H14と同じであり、重複を避けるために説明は省略する。
 ステップM6では、N番目の受光箇所に含まれる画素のXY座標の平均値を求め、求めたX座標の平均値を変数Xbにセットし、求めたY座標の平均値を変数Ybにセットする。ステップM7では、平均値のXY座標(Xb,Yb)から、入力信号の解像度に換算した平均値のXY座標(Xc,Yc)を算出する。具体的には、座標Xcの値は、式(1)によって算出し、座標Ycの値は、式(2)によって算出する。
  Xc=Xb×(入力信号水平解像度/パネル水平解像度) …(1)
  Yc=Yb×(入力信号垂直解像度/パネル垂直解像度) …(2)
 ステップM8では、画面のページをめくる改頁コマンドを、XY座標(Xc,Yc)とともに、映像入力端子11に映像を入力するPCなどに送信する。ステップM13では、N番目の受光箇所に含まれる画素のXY座標の平均値を求め、求めたX座標の平均値を変数Xbにセットし、求めたY座標の平均値を変数Ybにセットし、ステップM9に進む。
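A worked example of equations (1) and (2) is given below: the panel coordinate of the pointer is scaled into the input signal's coordinate space before the page-turn command is sent. The resolutions used are example values only, not figures from the text.

```python
# Worked sketch of equations (1) and (2).

def to_input_coords(xb, yb, panel_res, input_res):
    panel_w, panel_h = panel_res
    input_w, input_h = input_res
    xc = xb * (input_w / panel_w)    # equation (1)
    yc = yb * (input_h / panel_h)    # equation (2)
    return xc, yc

# e.g. a pointer centred at (960, 540) on a 1920x1080 panel fed a 1024x768 signal
print(to_input_coords(960, 540, (1920, 1080), (1024, 768)))   # (512.0, 384.0)
```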
 図20は、中央演算部15が行う設定処理の一例を示すフローチャートである。利用者は、PCなどからの映像信号をパネル13に表示させる前に、パネル13に表示される図示しない設定画面に応答して、ポインタサイズLsize、ポインタ表示色Lcolor、ポインタ形状Lpatternおよびポインタ表示基準Lbaseの変数を、受光色ごとに予め設定することができる。
 中央演算部15は、ポインタサイズLsize、ポインタ表示色Lcolor、ポインタ形状Lpatternおよびポインタ表示基準Lbaseの変数を設定するための設定画面の画像情報を生成して、表示用・表示処理用一時記憶部12に記憶させ、表示用・表示処理用一時記憶部12に記憶された画像情報を表示処理部14によってパネル13に表示させる。設定画面は、たとえばそれぞれの変数に設定する内容をリモコンによって選択することができる画面である。中央演算部15は、設定画面をパネル13に表示させた後、ステップP1に移る。
 ステップP1では、リモコンによって選択されたポインタ表示基準が「中央基準」であるのか「明るさ基準」であるのかを判定する。選択されたポインタ表示基準が「中央基準」であると、ステップP2に進み、選択されたポインタ表示基準が「明るさ基準」であると、ステップP14に進む。ステップP2では、変数Dに「0」を代入する。ステップP3では、表示基準Lbaseに変数Dの値を代入する。ステップP4では、受光色を選択するための変数Kに「0」つまり赤色を示す値を代入する。
 ステップP5では、変数Kの値が設定可能な受光色の数、たとえば赤色、青色および黄色の3色を表す値「3」未満であるか否かを判定する。変数Kの値が「3」未満であると、ステップP6に進み、変数Kの値が「3」以上であると、設定処理を終了する。変数Kは、赤色が「0」、青色が「1」および黄色が「2」である。設定を行っている受光色を設定画面に表示する。ステップP6では、まず、設定を行っている受光色、つまり変数Kの値によって決まる色が何かを設定画面に表示する。次に、リモコンによって選択されたポインタ形状が「四角形」であるのか「丸」であるのか「矢印」であるのかを判定する。選択されたポインタ形状が「四角形」であると、ステップP15に進み、選択されたポインタ形状が「丸」であると、ステップP7に進み、選択されたポインタ形状が「矢印」であると、ステップP16に進む。ステップP7では、変数Aに「1」を代入する。
 ステップP8では、リモコンによって選択されたポインタの大きさが「大」であるのか「中」であるのか「小」であるのかを判定する。選択されたポインタの大きさが「大」であると、ステップP17に進み、選択されたポインタの大きさが「中」であると、ステップP9に進み、選択されたポインタの大きさが「小」であると、ステップP18に進む。ステップP9では、変数Bに「1」を代入する。
 ステップP10では、リモコンによって選択されたポインタ色が「赤色」であるのか「青色」であるのか「緑色」であるのか「受光色」であるのかを判定する。選択されたポインタ色が「赤色」であると、ステップP19に進み、選択されたポインタ色が「緑色」であると、ステップP11に進み、選択されたポインタ色が「青色」であると、ステップP20に進み、選択されたポインタ色が「受光色」であると、ステップP21に進む。ステップP11では、変数Cに「1」を代入する。
 ステップP12では、設定した内容を記憶する。具体的には、変数Bの値をポインタの大きさLsize[K]に代入し、変数Cの値をポインタ色Lcolor[K]に代入し、変数Aの値をポインタ形状Lpattern[K]に代入する。ステップP13では、変数Kに「1」を加算して、ステップP5に戻る。ステップP14では、変数Dに「1」を代入して、ステップP3に進む。ステップP15では、変数Aに「0」を代入して、ステップP8に進む。ステップP16では、変数Aに「2」を代入して、ステップP8に進む。
 ステップP17では、変数Bに「0」を代入して、ステップP10に進む。ステップP18では、変数Bに「2」を代入して、ステップP10に進む。ステップP19では、変数Cに「0」を代入して、ステップP12に進む。ステップP20では、変数Cに「2」を代入して、ステップP12に進む。ステップP21では、変数Cに「3」を代入して、ステップP12に進む。
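The setting process of steps P1-P21 amounts to translating the on-screen selections into the stored variables. The sketch below mirrors the value assignments described above; the selection source and function names are stand-ins.

```python
# Sketch of the setting process (steps P1-P21).

SHAPE = {"square": 0, "circle": 1, "arrow": 2}              # Lpattern (P15, P7, P16)
SIZE = {"large": 0, "medium": 1, "small": 2}                # Lsize (P17, P9, P18)
COLOUR = {"red": 0, "green": 1, "blue": 2, "received": 3}   # Lcolor (P19, P11, P20, P21)
BASE = {"centre": 0, "brightness": 1}                       # Lbase (P2, P14)

def apply_settings(base_choice, per_colour_choices):
    """per_colour_choices[k] = (shape, size, colour) chosen for received colour k."""
    lbase = BASE[base_choice]                               # P1-P3, P14
    lpattern, lsize, lcolor = [], [], []
    for shape, size, colour in per_colour_choices:          # P5, P13
        lpattern.append(SHAPE[shape])                       # P6, P7, P15, P16
        lsize.append(SIZE[size])                            # P8, P9, P17, P18
        lcolor.append(COLOUR[colour])                       # P10, P11, P19-P21
    return lbase, lsize, lcolor, lpattern                   # P12: store the settings

print(apply_settings("centre", [("arrow", "large", "red"),
                                ("square", "medium", "received"),
                                ("circle", "small", "blue")]))
```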
 このように、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する光センサとを備える光センサ内蔵表示用デバイス13に画像を表示するにあたって、中央演算部15によって、光センサによって検出された光の強さに応じて、光の検出位置を示す画像、たとえば指示ポインタが光センサ内蔵表示用デバイス13に表示される。
 したがって、光を照射する指示装置によって画面に照射された光の位置を明示することができる。
 さらに、光センサ内蔵表示用デバイス13に表示させる画像は、予め定める色、たとえば赤色、緑色または青色であるので、画面を見る人がわかり易い色を用いることによって、光を照射する指示装置によって画面に照射された光の位置をより明瞭にすることができる。
 さらに、光センサによって、受光する光の色が検出され、光センサ内蔵表示用デバイス13に表示させる画像は、光センサが検出した色であるので、光を照射する指示装置によって照射している色をより明瞭にすることができる。
 さらに、光センサ内蔵表示用デバイス13に表示させる画像は、予め定める表示形態、たとえば四角形、丸または矢印の形状であるので、画面を見る人がわかり易い形状を用いることによって、光を照射する指示装置によって画面に照射された光の位置をより明確に示すことができる。
　さらに、中央演算部15によって、光センサによって検出された光の強さが予め定める基準強さ以上であるか否かが判断され、中央演算部15によって、光センサによって検出された光の強さが予め定める基準強さ以上であると判断されたとき、前記画像が光センサ内蔵表示用デバイス13に表示される。
 したがって、予め定める基準強さ以上の光が照射された位置を光センサ内蔵表示用デバイス13に表示することができる。
 さらに、画像を表示する複数の画素と、受光する光の強さを位置ごとに検出する光センサとを備える光センサ内蔵表示用デバイス13に画像を表示するにあたって、図10に示したステップD1~D14および図11に示したステップE1~E9では、光センサによって検出された光の強さに応じて、検出位置を示す画像、たとえば指示ポインタを表示する。
 したがって、本発明に係る表示方法を提供すれば、光を照射する指示装置によって画面に照射された光の位置を明示することができる。
 また、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する光センサとを備える光センサ内蔵表示用デバイス13に画像を表示するにあたって、中央演算部15によって、光センサによって検出された光の強さが変化するパターンが判別され、中央演算部15によって、中央演算部15によって判別されたパターンが連続する出射パターンであるとき、光センサによって光が検出された位置を示す第1の表示形態のシンボル画像、たとえば赤色または大きさが大きい丸の形状の指示ポインタが表示され、中央演算部15によって判別されたパターンが連続する出射パターンとは異なるパルス状の出射パターンであるとき、前記第1の表示形態とは異なり、かつ前記位置を示す第2の表示形態のシンボル画像、たとえば青色または大きさが中間の四角形の形状の指示ポインタが表示される。
 したがって、光の強さが変化するパターンが異なる光を照射する複数、たとえば2種類の指示装置によって指示された画面上の位置をそれぞれ明示することができる。
 さらに、前記第1の表示形態は、赤色であり、前記第2の表示形態は、赤色とは異なる青色であるので、光の強さが変化するパターンが異なる光を照射する複数、たとえば2種類の指示装置によって指示された画面上の位置をそれぞれ異なる色によって明示することができる。
 さらに、前記第1の表示形態は、大きさが大きい丸の形状であり、前記第2の表示形態は、大きさが大きい丸の形状とは異なる大きさが中間の四角形の形状であるので、光の強さが変化するパターンが異なる光を照射する複数、たとえば2種類の指示装置によって指示された画面上の位置をそれぞれ異なる形状によって明示することができる。
 さらに、画像を表示する複数の画素と、受光する光の強さを位置ごとに検出する光センサとを備える光センサ内蔵表示用デバイス13に画像を表示するにあたって、図14に示したステップH3では、光センサによって検出された光の強さが変化するパターンを判別する。
 そして、図14に示したステップH1,H2およびステップH4~14ならびに図15に示したステップJ1~J9は、図14に示したステップH3で判別されたパターンが予め定める連続する出射パターンであるとき、光センサによって光が検出された位置を示す第1の表示形態のシンボル画像、たとえば赤色または大きさが大きい丸の形状の指示ポインタを表示させ、図14に示したステップH3で判別されたパターンが連続する出射パターンとは異なるパルス状の出射パターンであるとき、前記第1の表示形態とは異なり、かつ前記位置を示す第2の表示形態のシンボル画像、たとえば青色または大きさが中間の四角形の形状の指示ポインタを表示する。
 したがって、本発明に係る表示方法を提供すれば、光の強さが変化するパターンが異なる光を照射する複数、たとえば2種類の指示装置によって指示された画面上の位置をそれぞれ明示することができる。
 さらに、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する光センサとを備える光センサ内蔵表示用デバイス13に画像を表示するにあたって、中央演算部15によって、光センサによって検出された光の強さが予め定める基準強度以上である範囲である受光箇所が決定され、中央演算部15によって、中央演算部15によって決定された範囲ごとに各範囲の中心となる位置が特定され、中央演算部15によって、中央演算部15によって特定された位置を示す画像、たとえば指示ポインタが表示される。
 したがって、懐中電灯などのように画面に照射した光が広がる指示装置を用いる場合にも、操作者が指示装置によって指示する画面上の位置を明示することができる。
 さらに、前記特定された位置を示す画像は、予め定める表示形態であるので、操作者が指示装置によって指示する画面上の位置を特定の表示形態によって明示することができる。
 さらに、前記予め定める表示形態は、予め定める色、たとえば赤色、緑色または青色であるので、画面を見る人がわかり易い色を用いることによって、指示装置によって指示された画面上の位置をより明瞭にすることができる。
 さらに、光センサによって、受光する光の色が検出され、前記予め定める表示形態は、光センサが検出した色であるので、指示装置によって照射している色でより明瞭に位置を明示することができる。
　さらに、前記予め定める表示形態は、予め定める形状、たとえば四角形、丸または矢印の形状であるので、画面を見る人がわかり易い形状を用いることによって、指示装置によって指示された画面上の位置をより明確に示すことができる。
 さらに、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する光センサとを備える光センサ内蔵表示用デバイス13に画像を表示するにあたって、図13に示したステップG1~G14および図14に示したステップH7、H8では、光センサによって検出された光の強さが予め定める基準強度以上である範囲である受光箇所を決定する。図13に示したステップG1~G14および図14に示したステップH7,H8では、図13に示したステップG1~G14および図14に示したステップH7,H8で決定された範囲ごとに各範囲の中心にある位置を特定する。そして、図14に示したステップH1~H6,H9~H11,H13,H14および図15に示したステップJ1~J9では、図13に示したステップG1~G14および図14に示したステップH7,H8で特定された位置を示す画像、たとえば指示ポインタを表示する。
 したがって、本発明に係る表示方法を提供すれば、懐中電灯などのように画面に照射した光が広がる指示装置を用いる場合にも、操作者が指示装置によって指示する画面上の位置を明示することができる。
 さらに、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する光センサとを備える光センサ内蔵表示用デバイス13に画像を表示するにあたって、中央演算部15によって、光センサによって検出された光の強さが予め定める基準強度以上である範囲である受光箇所が決定され、中央演算部15によって、中央演算部15によって決定された範囲内のうち、光の強さが最も強く検出した位置が前記決定された範囲ごとに特定され、中央演算部15によって、中央演算部15によって特定された位置を示す画像、たとえば指示ポインタが表示される。
 したがって、懐中電灯などのように画面に照射した光が広がる指示装置を用いる場合にも、操作者が指示装置によって指示する画面上の位置を明示することができる。
 さらに、前記特定された位置を示す画像は、予め定める表示形態であるので、操作者が指示装置によって指示する画面上の位置を特定の表示形態によって明示することができる。
 さらに、前記予め定める表示形態は、予め定める色、たとえば赤色、緑色または青色であるので、画面を見る人がわかり易い色を用いることによって、指示装置によって指示された画面上の位置をより明瞭にすることができる。
 さらに、光センサによって、受光する光の色が検出され、前記予め定める表示形態は、光センサが検出した色であるので、指示装置によって照射している色でより明瞭に位置を明示することができる。
　さらに、前記予め定める表示形態は、予め定める形状、たとえば四角形、丸または矢印の形状であるので、画面を見る人がわかり易い形状を用いることによって、指示装置によって指示された画面上の位置をより明確に示すことができる。
 さらに、画像を表示する複数の画素と、受光する光の強さを複数の位置ごとに検出する光センサとを備える光センサ内蔵表示用デバイス13に画像を表示するにあたって、図13に示したステップG1~G14および図14に示したステップH8,H12では、光センサによって検出された光の強さが予め定める基準強度以上である範囲である受光箇所を決定する。図13に示したステップG1~G14および図14に示したステップH8,H12では、図13に示したステップG1~G14および図14に示したステップH8,H12で決定された範囲内のうち、光の強さが最も強い位置を前記決定された範囲ごとに特定する。そして、図14に示したステップH1~H6,H9~H11,H13,H14および図15に示したステップJ1~J9では、図13に示したステップG1~G14および図14に示したステップH8,H12で特定された位置を示す画像、たとえば指示ポインタを表示する。
 したがって、本発明に係る表示方法を提供すれば、懐中電灯などのように画面に照射した光が広がる指示装置を用いる場合にも、操作者が指示装置によって指示する画面上の位置を明示することができる。
FIG. 21 is a diagram showing the configuration of a display device 1A according to a second embodiment of the present invention. The display device 1A includes a display device 10A, a light emission instruction device 30 and a video generation device 50. The display method according to the present invention is processed by the display device 10A and the video generation device 50. In this embodiment, parts identical to those of the above-described embodiment are given the same reference numerals and their description is omitted.
The display device 10A includes N video input terminals 11, a display/display processing temporary storage unit 12, a photosensor built-in display device 13, a display processing unit 14, a central processing unit 15, an internal storage unit 16, a remote control (hereinafter "remote control") light receiving unit 17, a remote control processing unit 18, a light receiving location determination unit 19, an instruction pointer display data creation unit 20, a light reception waveform determination unit 21 and a serial communication processing unit 22, and is connected to the video generation device 50.
The central processing unit 15 includes a central processing unit (hereinafter "CPU") and a memory that stores a control program for controlling the display device 10A; by executing the control program stored in the memory, the CPU controls the photosensor built-in display device 13, the display processing unit 14, the remote control processing unit 18 and the serial communication processing unit 22.
The serial communication processing unit 22 is connected to the video generation device serial communication processing unit 55 of the video generation device 50 by a serial interface and exchanges information with it. The serial communication processing unit 22 transmits to the video generation device 50 commands, instructed by the central processing unit 15, that control the displayed image. The commands that control the displayed image are, for example, a page break command that turns the page of the displayed image information, a horizontal scroll command that moves the displayed image information in the X-axis direction, a vertical scroll command that moves the displayed image information in the Y-axis direction, an enlargement command that enlarges the displayed image information, and a reduction command that reduces the displayed image information.
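As a rough illustration only, the following Python sketch shows one way the command set just described could be represented and serialized for the serial link; the command names, the byte format and the send_command helper are assumptions made for illustration and are not part of the disclosure.

from enum import Enum

class DisplayCommand(Enum):
    PAGE_BREAK = "PAGE_BREAK"   # turn to the next page of image information
    SCROLL_X = "SCROLL_X"       # move the displayed region in the X-axis direction
    SCROLL_Y = "SCROLL_Y"       # move the displayed region in the Y-axis direction
    ZOOM_IN = "ZOOM_IN"         # enlarge the displayed image information
    ZOOM_OUT = "ZOOM_OUT"       # reduce the displayed image information

def send_command(write, command, x=None, y=None):
    """Serialize a command (optionally with XY coordinates) and hand the bytes
    to a write function, e.g. the write() of an opened serial port."""
    payload = command.value
    if x is not None and y is not None:
        payload += f" {x} {y}"
    write((payload + "\n").encode("ascii"))

# Example: collect the bytes instead of writing to real hardware.
sent = []
send_command(sent.append, DisplayCommand.PAGE_BREAK, x=120, y=80)
print(sent)   # [b'PAGE_BREAK 120 80\n']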
The light emission instruction device 30 is a pointing device, such as a laser pointer, an LED (Light Emitting Diode) or a flashlight, that indicates a position by irradiating the screen with light. The light emission instruction device 30 includes a light emitting unit 31 that emits light, an intermittent driving unit 32 that causes the light emitting unit 31 to emit light in pulses, and a button 33 that switches between pulsed and continuous emission.
The video generation device 50 is constituted by a PC or the like and includes a display data storage unit 51, a video signal generation unit 52, a video generation device central processing unit 53, a video generation device internal storage unit 54 and a video generation device serial communication processing unit 55.
The display data storage unit 51 is a storage device, constituted by a semiconductor memory, a hard disk device or the like, that stores image information to be displayed on the display device 10A. The video signal generation unit 52 reads from the display data storage unit 51 the image information designated by the video generation device central processing unit 53, converts the read image information into a video signal and outputs it. The video signal output by the video signal generation unit 52 is input to one of the video input terminals 11 of the display device 10A.
The video generation device central processing unit 53 includes a CPU and a memory that stores a video generation device control program for controlling the video generation device 50; by executing the video generation device control program stored in the memory, the CPU controls the video signal generation unit 52 and the video generation device serial communication processing unit 55. Instead of the CPU and memory, the video generation device central processing unit 53 may be constituted by an FPGA (Field Programmable Gate Array), which is a programmable LSI (Large Scale Integration), by an ASIC (Application Specific Integrated Circuit), which is an integrated circuit designed and manufactured for a specific application, or by another circuit having arithmetic functions.
The video generation device internal storage unit 54 is constituted by a storage device such as a writable and readable semiconductor memory, and stores control information 541 used when the video generation device central processing unit 53 executes the video generation device control program. The control information 541 includes, for example, information such as the display region. The display region is information representing the portion of one page of image information to be displayed on the display device 10A; it is expressed, for example, by the XY coordinates (Ppos_xs, Ppos_ys) of the pixel at the upper-left position of that region as seen facing the screen and the XY coordinates (Ppos_xe, Ppos_ye) of the pixel at its lower-right position. The XY coordinates take as their origin, for example, the position of the upper-left pixel of the whole one-page image as displayed.
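For illustration, a minimal Python sketch of the display-region record held in the control information 541 follows; the field names mirror the coordinates Ppos_xs, Ppos_ys, Ppos_xe and Ppos_ye named above, while the class itself and its helper properties are assumptions.

from dataclasses import dataclass

@dataclass
class DisplayRegion:
    ppos_xs: int  # X of the upper-left pixel of the region shown on the display
    ppos_ys: int  # Y of the upper-left pixel
    ppos_xe: int  # X of the lower-right pixel
    ppos_ye: int  # Y of the lower-right pixel

    @property
    def width(self) -> int:
        return self.ppos_xe - self.ppos_xs + 1

    @property
    def height(self) -> int:
        return self.ppos_ye - self.ppos_ys + 1

region = DisplayRegion(0, 0, 1023, 767)
print(region.width, region.height)   # 1024 768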
The video generation device serial communication processing unit 55 is connected to the serial communication processing unit 22 of the display device by a serial interface and exchanges information with it. The video generation device serial communication processing unit 55 receives the commands that control the displayed image transmitted from the display device 10A, for example the page break, horizontal scroll, vertical scroll, enlargement and reduction commands, and passes the received commands to the video generation device central processing unit 53. The central processing unit 15 and the video generation device central processing unit 53 constitute the control means.
The variables used in the flowcharts of FIGS. 22 to 25 are the received light number Lcount, the received light color PixelColor, the received light intensity PixelValue, the coordinate data Lpos of the light receiving locations, and the instruction pointer data PointInf. These variables are defined as described above, and their description is omitted.
FIG. 22 is a flowchart illustrating an example of the fifth display process performed by the central processing unit 15. The fifth display process displays an instruction pointer of a preset color and shape at the position of each light receiving location that has received light. After the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display/display processing temporary storage unit 12 on the panel 13, the process proceeds to step Q1.
In step Q1, the data for each pixel detected by each photosensor, specifically the received light color and received light intensity, is acquired from the panel 13 and set for each pixel in the received light color PixelColor and the received light intensity PixelValue. In step Q2, the light receiving location search process is performed. In step Q3, the instruction pointer data creation process is performed. In step Q4, the instruction pointer display process is performed, and the process returns to step Q1.
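A minimal Python sketch of the loop formed by steps Q1 to Q4 follows; the helper callables are hypothetical stand-ins for the sub-processes named above.

def fifth_display_process(read_sensors, find_locations, build_pointers,
                          draw_pointers, iterations=2):
    for _ in range(iterations):                              # the process loops indefinitely in the text
        pixel_color, pixel_value = read_sensors()            # step Q1: per-pixel sensor data
        locations = find_locations(pixel_value)              # step Q2: light receiving location search
        pointers = build_pointers(locations, pixel_color)    # step Q3: instruction pointer data creation
        draw_pointers(pointers)                               # step Q4: instruction pointer display

# Example run with trivial stand-ins for the four sub-processes.
fifth_display_process(
    read_sensors=lambda: ([[(255, 0, 0)]], [[200]]),
    find_locations=lambda values: [[(0, 0)]],
    build_pointers=lambda locs, colors: [{"pos": locs[0][0], "color": "red"}],
    draw_pointers=print,
)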
FIG. 23 is a flowchart illustrating an example of the light receiving location search process called from the fifth display process, FIG. 24 is a flowchart illustrating an example of the instruction pointer data creation process called from the fifth display process, and FIG. 25 is a flowchart illustrating an example of the instruction pointer display process called from the fifth display process. The light receiving location search process R1 to R14 shown in FIG. 23 is the same as the light receiving location search process C1 to C14 shown in FIG. 9, the instruction pointer data creation process S1 to S15 shown in FIG. 24 is the same as the instruction pointer data creation process M1 to M15 shown in FIG. 18, and the instruction pointer display process T1 to T9 shown in FIG. 25 is the same as the instruction pointer display process E1 to E9 shown in FIG. 11; their description is omitted to avoid duplication. Step S3 shown in FIG. 24 is the determination step.
FIG. 26 is a flowchart illustrating an example of the command processing performed by the video generation device central processing unit 53. The command processing executes a command, for example a page break command, that controls the displayed image and is received from the display device 10A. After the video generation device central processing unit 53 instructs the video signal generation unit 52 to output the image information stored in the display data storage unit 51 to the display device 10A, the process proceeds to step U1.
In step U1, it is determined whether or not a page break command has been received. If a page break command has been received, the process proceeds to step U2; if not, the process returns to step U1. In step U2, the page break command is executed and the process returns to step U1; specifically, the video signal of the page of image information following the page currently being output is output, and the process returns to step U1. Steps S1, S2 and S4 to S15 shown in FIG. 24, steps T1 to T9 shown in FIG. 25 and steps U1 and U2 shown in FIG. 26 are the fifth control step.
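A minimal Python sketch of the polling loop formed by steps U1 and U2 follows; the queue standing in for the serial link, the command string and the page list are assumptions.

from queue import Queue, Empty

def command_process(commands: Queue, pages, start=0, max_polls=10):
    page = start
    for _ in range(max_polls):                 # the process polls indefinitely in the text
        try:
            cmd = commands.get(timeout=0.01)   # step U1: has a command been received?
        except Empty:
            continue                           # no command received: poll again
        if cmd == "PAGE_BREAK":                # step U2: execute the page break command
            page = (page + 1) % len(pages)     # output the next page of image information
            print("now outputting page", page)
    return page

q = Queue()
q.put("PAGE_BREAK")
command_process(q, pages=["page0", "page1", "page2"])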
In this way, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying the image and photosensors that detect the intensity of received light at each of a plurality of positions, the central processing unit 15 determines the pattern in which the intensity of the light detected by the photosensors changes, that is, the emission pattern; when the determined pattern is a continuous emission pattern, the central processing unit 15 and the video generation device central processing unit 53 perform first control on the displayed image, and when the determined pattern is a pulsed emission pattern different from the continuous emission pattern, they perform second control on the displayed image.
Therefore, control corresponding to the light with which the screen is irradiated by the light emission instruction device 30, for example control of the displayed image corresponding to the emission pattern in which the light intensity changes, can be performed.
Furthermore, since the first control is control that displays an image indicating the position of the light detected by the photosensors and the second control is control that operates on the displayed image, the instruction pointer can be displayed when the emission pattern is continuous, and the displayed image information can be operated on, for example by changing the page, when the emission pattern is pulsed.
Furthermore, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying the image and photosensors that detect the intensity of received light at each position, step S3 shown in FIG. 24 determines the pattern in which the intensity of the light detected by the photosensors changes, that is, the emission pattern. When the pattern determined in step S3 of FIG. 24 is a continuous emission pattern, steps S2 and S9 to S15 shown in FIG. 24 and steps T1 to T9 shown in FIG. 25 perform the first control on the displayed image; when the pattern determined in step S3 of FIG. 24 is a pulsed emission pattern different from the continuous pattern, steps S2, S4 to S10, S14 and S15 shown in FIG. 24 and steps U1 and U2 shown in FIG. 26 perform the second control on the displayed image.
Therefore, by applying the display method according to the present invention, control corresponding to the light with which the screen is irradiated by the light emission instruction device 30, for example control of the displayed image corresponding to the emission pattern in which the light intensity changes, can be performed.
FIG. 27 is a diagram showing the configuration of a display device 1B according to a third embodiment of the present invention. The display device 1B includes a display device 10B, a light emission instruction device 30A and a video generation device 50A. The display method according to the present invention is processed by the display device 10B and the video generation device 50A. In this embodiment, parts identical to those of the above-described embodiments are given the same reference numerals and their description is omitted. The connection between the display device 10B and the video generation device 50A of this embodiment is similar to the connection between the display device 10A and the video generation device 50 of the above-described embodiment.
The display device 10B includes N video input terminals 11, a display/display processing temporary storage unit 12, a photosensor built-in display device 13, a display processing unit 14, a central processing unit 15, an internal storage unit 16, a remote control (hereinafter "remote control") light receiving unit 17, a remote control processing unit 18, a light receiving location determination unit 19, an instruction pointer display data creation unit 20, a light receiving portion shape determination unit 21A and a serial communication processing unit 22, and is connected to the video generation device 50.
The light receiving location determination unit 19, the instruction pointer display data creation unit 20 and the light receiving portion shape determination unit 21A are functions realized by the central processing unit 15 executing the control program.
The instruction pointer display data creation unit 20 determines, for each light receiving location determined by the light receiving location determination unit 19 to be receiving light, the position at which an instruction pointer is to be displayed, creates instruction pointer data representing the instruction pointer, and transmits to the video generation device 50 a command corresponding to the shape determined by the light receiving portion shape determination unit 21A. The light receiving portion shape determination unit 21A, which is the shape determination means, is called from the instruction pointer display data creation unit 20 and determines the shape of the light with which the screen is irradiated by the light emission instruction device 30. Specifically, the light receiving portion shape determination unit 21A determines the shape formed by the pixels for which the received light intensity obtained from the photosensors of the photosensor built-in display device 13 is at or above the predetermined reference intensity.
The light emission instruction device 30A is a pointing device, such as a laser pointer, that indicates a position by irradiating the screen with light. The light emission instruction device 30A includes a light emitting unit 31A that emits light, for example a laser beam, and an emission shape switching unit 32A that switches the shape that the emitted light forms on the screen.
The video generation device 50A is constituted by a PC or the like and includes a display data storage unit 51, a video signal generation unit 52, a video generation device central processing unit 53, a video generation device internal storage unit 54, a video generation device serial communication processing unit 55, a scroll processing unit 56 and an enlargement processing unit 57.
The scroll processing unit 56 and the enlargement processing unit 57 are functions realized by the video generation device central processing unit 53A executing the video generation device control program. The scroll processing unit 56 horizontally or vertically scrolls the image information, out of one page of image information, that is displayed on the screen of the display device 10B. Horizontal scrolling moves the displayed image information in the X-axis direction of the XY coordinates, and vertical scrolling moves it in the Y-axis direction. The enlargement processing unit 57 enlarges the image information, out of one page of image information, that is displayed on the screen of the display device 10B. The central processing unit 15 and the video generation device central processing unit 53A constitute the control means.
FIGS. 28A to 28D are diagrams showing examples of the shape of the light emitted by the light emission instruction device 30. FIG. 28A shows an example in which the light emitted by the light emission instruction device 30 has a circle shape 61; the circle shape 61 is a predetermined shape, for example circular, whose central portion is not irradiated with light. FIG. 28B shows an example in which the emitted light has a dot shape 62; the dot shape 62 is a predetermined shape, for example circular, whose central portion is irradiated with light. FIG. 28C shows an example in which the emitted light has a horizontally elongated shape 63; the horizontally elongated shape 63 is the shape of a horizontal line segment whose inclination is at or below a predetermined angle. FIG. 28D shows an example in which the emitted light has a vertically elongated shape 64; the vertically elongated shape 64 is the shape of a vertical line segment whose inclination is at or below a predetermined angle. Hereinafter, the horizontally elongated shape is also called the horizontal line shape and the vertically elongated shape the vertical line shape.
FIG. 29 is a diagram showing an example of a displayed image in which light of the dot shape 62 is irradiated onto the panel 13. The image information 70 is one page of image information stored in the display data storage unit 51A, and of that page the image information in a region 71 is displayed on the panel 13. A screen 72 is the screen of the panel 13, on which the image information in the region 71 of the one page of image information 70 stored in the display data storage unit 51A is displayed. A screen 74 is a partial screen enlarging the part of the screen 72 around the point irradiated with the laser beam from the light emission instruction device 30, and is irradiated with light of the dot shape 62.
FIG. 30 is a diagram showing an example of the image displayed when the light has the horizontally elongated shape 63. When light of the horizontally elongated shape 63 is irradiated onto a screen 75 by the light emission instruction device 30, the display device 1B horizontally scrolls the image information displayed on the screen of the panel 13 by moving the region of image information to be displayed to the right of the region currently displayed. In the example shown in FIG. 30, the region of image information to be displayed has been horizontally scrolled from the region 71, that is, moved to the right, and the image information of a region 71a is displayed on the screen 75.
FIG. 31 is a diagram showing an example of the image displayed when the light has the vertically elongated shape 64. When light of the vertically elongated shape 64 is irradiated onto a screen 76 by the light emission instruction device 30, the display device 1B vertically scrolls the image information displayed on the screen of the panel 13 by moving the region of image information to be displayed downward from the region currently displayed. In the example shown in FIG. 31, the region of image information to be displayed has been vertically scrolled from the region 71a, that is, moved downward, and the image information of a region 71b is displayed on the screen 76.
FIG. 32 is a diagram showing an example of the image displayed when the light has the circle shape 61. When light of the circle shape 61 is irradiated onto a screen 77 by the light emission instruction device 30, the display device 1B displays on the screen of the panel 13 image information obtained by enlarging the image information displayed on the screen 77 about the pixels irradiated with the circle shape 61. While light of the circle shape 61 is being irradiated onto the screen 77, the display device 1B gradually enlarges the image information displayed on the screen 77. At this point, the region of image information displayed on the screen of the panel 13 has changed from the region 71b to a region 71c.
The variables used in the flowcharts of FIGS. 33 to 37 are the received light number Lcount, the received light color PixelColor, the received light intensity PixelValue, the coordinate data Lpos of the light receiving locations, and the instruction pointer data PointInf. These variables are defined as described above, and their description is omitted.
FIG. 33 is a flowchart illustrating an example of the sixth display process performed by the central processing unit 15. The sixth display process displays an instruction pointer of a preset color and shape at the position of each light receiving location that has received light. After the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display/display processing temporary storage unit 12 on the panel 13, the process proceeds to step V1. Steps V1 to V4 correspond to steps Q1 to Q4 shown in FIG. 22, respectively, and their description is omitted to avoid duplication.
FIG. 34 is a flowchart illustrating an example of the light receiving location search process called from the sixth display process. The light receiving location search process is executed by the light receiving location determination unit 19; when called from the sixth display process shown in FIG. 33, the light receiving location determination unit 19 proceeds to step W1. Steps W1 to W4 and W6 to W14 of the light receiving location search process shown in FIG. 34 are the same as steps R1 to R4 and R6 to R14 of the light receiving location search process shown in FIG. 23, and their description is omitted to avoid duplication.
In step W5, the PixelValue, that is, the received light intensity, and the PixelColor, that is, the received light color, of the pixel at the coordinates (X, Y) are acquired.
FIG. 35 is a flowchart illustrating an example of the instruction pointer data creation process called from the sixth display process. The instruction pointer data creation process is executed by the instruction pointer display data creation unit 20; when called from the sixth display process shown in FIG. 33, the instruction pointer display data creation unit 20 proceeds to step AA1.
In step AA1, the variable N is initialized by substituting "0" into it. In step AA2, it is determined whether or not the variable N is less than the received light number Lcount. If the variable N is less than the received light number Lcount, the process proceeds to step AA3; if the variable N is equal to or greater than the received light number Lcount, the instruction pointer data creation process ends.
In step AA3, the coordinates of the Nth received light data Lpos are copied to the coordinate data of the Nth instruction pointer data PointInf. In step AA4, the color data for each coordinate of the Nth instruction pointer data PointInf is taken from the received light color PixelColor data and copied. In step AA5, the light receiving portion shape determination unit 21A is called and the shape of the irradiated location, that is, the light receiving location, is determined. In step AA6, the shape determined by the light receiving portion shape determination unit 21A is examined. If the determined shape is the dot shape, specifically if the variable A is "0", the process proceeds to step AA8; if it is the circle shape, specifically if the variable A is "3", the process proceeds to step AA7; if it is the horizontal line shape, specifically if the variable A is "1", the process proceeds to step AA9; and if it is the vertical line shape, specifically if the variable A is "2", the process proceeds to step AA10.
In step AA7, an "enlarge" command is transmitted to the video generation device 50 by the serial communication processing unit 22. In step AA8, "1" is added to the variable N and the process returns to step AA2. In step AA9, a "horizontal scroll" command is transmitted to the video generation device 50A by the serial communication processing unit 22 and the process proceeds to step AA8. In step AA10, a "vertical scroll" command is transmitted to the video generation device 50A by the serial communication processing unit 22 and the process proceeds to step AA8.
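A minimal Python sketch of the dispatch performed in steps AA6 to AA10 follows; the shape codes follow the values of the variable A given above (0 dot, 1 horizontal line, 2 vertical line, 3 circle), while the command strings and the send callable are assumptions.

SHAPE_TO_COMMAND = {
    1: "SCROLL_X",   # horizontal line shape -> horizontal scroll (step AA9)
    2: "SCROLL_Y",   # vertical line shape   -> vertical scroll   (step AA10)
    3: "ZOOM_IN",    # circle shape          -> enlarge           (step AA7)
}

def dispatch_shape_command(shape_code, send):
    """Send the command matching the determined shape; the dot shape (0) sends nothing."""
    command = SHAPE_TO_COMMAND.get(shape_code)
    if command is not None:
        send(command)

sent = []
dispatch_shape_command(3, sent.append)
print(sent)   # ['ZOOM_IN']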
FIG. 36 is a flowchart illustrating an example of the instruction pointer display process called from the sixth display process. When executing step V4 of the display process shown in FIG. 33, the central processing unit 15 proceeds to step BB1. Steps BB1 to BB9 of the instruction pointer display process shown in FIG. 36 are the same as steps E1 to E9 of the instruction pointer display process shown in FIG. 11, and their description is omitted to avoid duplication.
FIG. 37 is a flowchart illustrating an example of the shape determination process called from the instruction pointer data creation process. The shape determination process is executed by the light receiving portion shape determination unit 21A; when called from the instruction pointer data creation process shown in FIG. 35, the process proceeds to step CC1.
In step CC1, the X coordinate of the pixel of interest is set to the smallest X coordinate Xmin among the X coordinates of the light receiving location. In step CC2, it is determined whether or not the X coordinate of the pixel of interest is equal to or less than the largest X coordinate Xmax among the X coordinates of the light receiving location. If it is equal to or less than Xmax, the process proceeds to step CC3; if it is greater than Xmax, the process proceeds to step CC12. In step CC3, it is determined whether or not the inclination in the X-axis direction is within the allowable range; specifically, it is determined whether or not the range of Y coordinates at coordinate X is less than 5, that is, whether or not the difference between the Y coordinate of the pixel of interest and the Y coordinate of the pixel at coordinate Xmin is less than 5. If the range of Y coordinates at coordinate X is less than 5, the process proceeds to step CC4; if it is 5 or more, the process proceeds to step CC5. In step CC4, "1" is added to the variable X and the process returns to step CC2.
In step CC5, the Y coordinate of the pixel of interest is set to the smallest Y coordinate Ymin among the Y coordinates of the light receiving location. In step CC6, it is determined whether or not the Y coordinate of the pixel of interest is equal to or less than the largest Y coordinate Ymax among the Y coordinates of the light receiving location. If it is equal to or less than Ymax, the process proceeds to step CC7; if it is greater than Ymax, the process proceeds to step CC13. In step CC7, it is determined whether or not the inclination in the Y-axis direction is within the allowable range; specifically, it is determined whether or not the range of X coordinates at coordinate Y is less than 5, that is, whether or not the difference between the X coordinate of the pixel of interest and the X coordinate of the pixel at coordinate Ymin is less than 5. If the range of X coordinates at coordinate Y is less than 5, the process proceeds to step CC8; if it is 5 or more, the process proceeds to step CC9. In step CC8, "1" is added to the variable Y and the process returns to step CC6.
In step CC9, the center coordinates (Xm, Ym) of the light receiving location are obtained; specifically, they are obtained by the equations Xm = (Xmin + Xmax)/2 and Ym = (Ymin + Ymax)/2. In step CC10, it is determined whether or not the central portion is irradiated with light; specifically, it is determined whether or not two or more of the pixels included in the light receiving location data Lpos have XY coordinates within the range whose upper-left position is the coordinates (Xm-2, Ym-2) and whose lower-right position is the coordinates (Xm+2, Ym+2). If there are two or more such pixels, the process proceeds to step CC11; if not, the process proceeds to step CC14.
In step CC11, the value "0", representing the dot shape, is substituted into the variable A and the shape determination process ends. In step CC12, the value "1", representing the horizontal line shape, is substituted into the variable A and the shape determination process ends. In step CC13, the value "2", representing the vertical line shape, is substituted into the variable A and the shape determination process ends. In step CC14, the value "3", representing the circle shape, is substituted into the variable A and the shape determination process ends. Steps CC1 to CC14 shown in FIG. 37 are the shape determination step.
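A minimal Python sketch of the shape determination in steps CC1 to CC14 follows; it simplifies the per-column scan to an overall spread test but keeps the thresholds from the text (a spread of less than 5 pixels for a line, and at least two lit pixels inside the 5-by-5 window around the center for the dot shape). The function name and the input format are assumptions.

def classify_shape(points):
    """points: (x, y) coordinates of pixels whose received intensity is at or
    above the reference intensity; returns the name of the determined shape."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    xmin, xmax, ymin, ymax = min(xs), max(xs), min(ys), max(ys)

    if ymax - ymin < 5:                       # steps CC1-CC4, CC12: flat in Y -> horizontal line
        return "horizontal_line"
    if xmax - xmin < 5:                       # steps CC5-CC8, CC13: flat in X -> vertical line
        return "vertical_line"

    xm, ym = (xmin + xmax) // 2, (ymin + ymax) // 2          # step CC9: center coordinates
    centre_hits = sum(1 for x, y in points
                      if xm - 2 <= x <= xm + 2 and ym - 2 <= y <= ym + 2)
    return "dot" if centre_hits >= 2 else "circle"           # steps CC10, CC11, CC14

ring = [(10, 0), (0, 10), (20, 10), (10, 20)]   # lit pixels around an empty center
print(classify_shape(ring))                     # circle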
FIG. 38 is a flowchart illustrating an example of the command processing performed by the video generation device central processing unit 53A. The command processing executes commands that control the displayed image and are received from the display device 10B, for example horizontal scroll, vertical scroll, enlargement and reduction commands. After the video generation device central processing unit 53A instructs the video signal generation unit 52A to output the image information stored in the display data storage unit 51A to the display device 10B, the process proceeds to step DD1.
In step DD1, it is determined whether or not a command has been received. If no command has been received, the process proceeds to step DD5. If a command has been received and the received command is a horizontal scroll command, the process proceeds to step DD8; if it is a vertical scroll command, the process proceeds to step DD2; if it is an enlargement command, the process proceeds to step DD13. In step DD2, it is determined whether or not the display region is at the bottom edge of the display data; specifically, it is determined whether or not the display region includes the bottommost pixel of the one page of pixel information. If the display region is at the bottom edge of the display data, the process proceeds to step DD7; if not, the process proceeds to step DD3.
In step DD3, it is determined whether or not the difference between the bottom edge of the display region and the bottom edge of the display data is one pixel (hereinafter also "dot"). If the difference is one dot, the process proceeds to step DD6; if not, the process proceeds to step DD4. In step DD4, "2" is added to the Y coordinates of the display region; specifically, "2" is added to the Y coordinates Ppos_ys and Ppos_ye.
In step DD5, the video signal of the display region, that is, the region determined by the XY coordinates (Ppos_xs, Ppos_ys) of the pixel at the upper-left position and the XY coordinates (Ppos_xe, Ppos_ye) of the pixel at the lower-right position, is generated and output by the video signal generation unit 52A, and the process returns to step DD1. In step DD6, the display region is set to the bottom edge and the process proceeds to step DD5; that is, without changing the X coordinates, the bottommost pixel of the display region is made the bottommost pixel of the one page of image information, and the process proceeds to step DD5. In step DD7, the display region is set to the top edge and the process proceeds to step DD5; that is, without changing the X coordinates, the topmost pixel of the display region is made the topmost pixel of the one page of image information, and the process proceeds to step DD5.
In step DD8, it is determined whether or not the display region is at the right edge of the display data; specifically, it is determined whether or not the display region includes the rightmost pixel of the one page of pixel information. If the display region is at the right edge of the display data, the process proceeds to step DD12; if not, the process proceeds to step DD9. In step DD9, it is determined whether or not the difference between the right edge of the display region and the right edge of the display data is one dot. If the difference is one dot, the process proceeds to step DD11; if not, the process proceeds to step DD10.
In step DD10, "2" is added to the X coordinates of the display region; specifically, 2 is added to the X coordinates Ppos_xs and Ppos_xe, and the process proceeds to step DD5. In step DD11, the display region is set to the right edge and the process proceeds to step DD5; that is, without changing the Y coordinates, the rightmost pixel of the display region is made the rightmost pixel of the one page of image information, and the process proceeds to step DD5. In step DD12, the display region is set to the left edge and the process proceeds to step DD5; that is, without changing the Y coordinates, the leftmost pixel of the display region is made the leftmost pixel of the one page of image information, and the process proceeds to step DD5.
In step DD13, it is determined whether or not the width of the display region, that is, the number of pixels in the X-axis direction, is within 10 dots. If the width is within 10 dots, the process proceeds to step DD16; if not, the process proceeds to step DD14. In step DD14, it is determined whether or not the height of the display region, that is, the number of pixels in the Y-axis direction, is within 10 dots. If the height is within 10 dots, the process proceeds to step DD16; if not, the process proceeds to step DD15.
In step DD15, "2" is added to each of the X and Y coordinates of the upper-left pixel of the display region and "2" is subtracted from each of the X and Y coordinates of the lower-right pixel, and the process proceeds to step DD5. In step DD16, the display region is set to the standard size, that is, the non-enlarged size, and the process proceeds to step DD5.
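A minimal Python sketch of the vertical-scroll branch (steps DD2 to DD7) follows as one example of how the display region is updated; the step of 2 pixels, the clamp to the bottom edge and the wrap back to the top follow the text, while the function name and the page height used in the example are assumptions.

def scroll_down(ys, ye, page_height, step=2):
    """Update the display region's Y coordinates (Ppos_ys, Ppos_ye) for one vertical scroll command."""
    height = ye - ys + 1
    if ye == page_height - 1:                 # step DD2: region already includes the bottommost pixel
        return 0, height - 1                  # step DD7: jump back to the top edge
    if page_height - 1 - ye <= step - 1:      # step DD3: only one dot remains below the region
        return page_height - height, page_height - 1   # step DD6: clamp to the bottom edge
    return ys + step, ye + step               # step DD4: move the region down by the step

ys, ye = 0, 599
for _ in range(3):
    ys, ye = scroll_down(ys, ye, page_height=1200)
print(ys, ye)   # 6 605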
The video generation device central processing unit 53A executes steps DD2 to DD7 and steps DD8 to DD12 by means of the scroll processing unit 56, and executes steps DD13 to DD16 by means of the enlargement processing unit 57. Steps V1 to V4 shown in FIG. 33 and steps DD1 to DD16 shown in FIG. 38 are the sixth control step.
The flowcharts described above do not include processing that reduces the image information, but such processing can be performed in the same way by assigning a shape other than the dot, horizontally elongated, vertically elongated and circle shapes to the reduction command. Furthermore, by assigning yet another shape to a command that restores enlarged or reduced image information, the enlarged or reduced image information may be restored to its original size.
In this way, the photosensor built-in display device 13 includes a plurality of pixels for displaying an image and photosensors that detect the intensity of received light at each of a plurality of positions; the light receiving portion shape determination unit 21A determines the shape formed by the reference positions at which the intensity of the light detected by the photosensors is detected as being at or above the predetermined reference intensity; and the central processing unit 15 and the video generation device central processing unit 53A perform predetermined control on the displayed image in accordance with the shape determined by the light receiving portion shape determination unit 21A.
Therefore, control of the displayed image can be performed in accordance with the light with which the screen is irradiated by the pointing device, for example in accordance with the shape that the light forms on the screen.
Furthermore, since the predetermined control is control that scrolls the displayed image, image information that does not fit on the screen can be scrolled into view according to the shape of the light irradiated onto the screen.
Furthermore, since the predetermined control is control that enlarges or reduces the displayed image, or that restores an enlarged or reduced displayed image, the image information can be enlarged, reduced or displayed at its original size according to the shape of the light irradiated onto the screen.
Furthermore, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying the image and photosensors that detect the intensity of received light at each position, steps CC1 to CC14 shown in FIG. 37 determine the shape formed by the reference positions at which the intensity of the light detected by the photosensors is detected as being at or above the predetermined reference intensity, and steps V1 to V4 shown in FIG. 33 and steps DD1 to DD16 shown in FIG. 38 perform predetermined control on the displayed image in accordance with the shape determined in steps CC1 to CC14.
Therefore, by applying the display method according to the present invention, control of the displayed image can be performed in accordance with the light with which the screen is irradiated by the pointing device, for example in accordance with the shape that the light forms on the screen.
Furthermore, since the predetermined control is control that scrolls the displayed image, applying the display method according to the present invention makes it possible to scroll into view image information that does not fit on the screen.
Furthermore, since the predetermined control is control that enlarges or reduces the displayed image, or that restores an enlarged or reduced displayed image, applying the display method according to the present invention makes it possible to display the image information enlarged or reduced.
The present invention can be embodied in various other forms without departing from its spirit or principal features. The embodiments described above are therefore merely illustrative in all respects; the scope of the present invention is defined by the claims and is in no way restricted by the body of the specification. Furthermore, all variations and modifications falling within the scope of the claims are within the scope of the present invention.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the drawings.
FIG. 1 is a diagram showing a configuration of a display device 1 according to the first embodiment of the present invention. The display device 1 includes a display device 10 and a light emission instruction device 30. The display method according to the present invention is processed by the display device 10.
The display device 10 includes N video input terminals 11, a display/display processing temporary storage unit 12, a photosensor built-in display device 13, a display processing unit 14, a central processing unit 15, an internal storage unit 16, a remote control (hereinafter "remote control") light receiving unit 17, a remote control processing unit 18, a light receiving location determination unit 19, an instruction pointer display data creation unit 20, and a light reception waveform determination unit 21.
The video input terminals 11 receive video signals output from a video generation device (see, for example, the element 50 in FIG. 21 and the element 50A in FIG. 27), described later, which is constituted by a personal computer (hereinafter "PC") or the like; N input terminals are provided, where N is a natural number. Video signals input from the video input terminals 11 are sent to the display/display processing temporary storage unit 12. The display/display processing temporary storage unit 12 is a writable and readable storage device composed of a semiconductor memory, a hard disk device, or the like, and stores image information representing the video signals received from the video input terminals 11 and image information on which the central processing unit 15 performs display processing. The image information stored in the display/display processing temporary storage unit 12 can be read by the display processing unit 14 and the central processing unit 15.
The photosensor built-in display device 13, which serves as the display means, is a liquid crystal display in which a plurality of pixels constituting the display image and photosensors, serving as the detection unit that detects the intensity and color of received light, are built in for each pixel. Furthermore, the photosensor built-in display device 13 is provided with a photosensor for each of the red, green and blue sub-pixels constituting each pixel, so that the color of the received light can be detected for each pixel. The photosensor built-in display device 13 displays the image information received from the display processing unit 14, and sends the light intensity (hereinafter "received light intensity") and color (hereinafter "received light color") detected by each photosensor for each pixel to the central processing unit 15.
The display processing unit 14 converts the image information read from the display/display processing temporary storage unit 12 into a format instructed by the central processing unit 15, for example a format that can be displayed by the photosensor built-in display device 13, and sends it to the photosensor built-in display device 13 to be displayed.
The central processing unit 15 includes a central processing unit (hereinafter "CPU") and a memory that stores a control program for controlling the display device 10; by executing the control program stored in the memory, the CPU controls the photosensor built-in display device 13, the display processing unit 14, and the remote control processing unit 18. The central processing unit 15 serves as the determination unit, the discrimination unit, the range determination unit, the position specifying unit, and the control unit. Instead of the CPU and memory, the central processing unit 15 may be constituted by an FPGA (Field Programmable Gate Array), which is a programmable LSI (Large Scale Integration), by an ASIC (Application Specific Integrated Circuit), which is an integrated circuit designed and manufactured for a specific application, or by another circuit having arithmetic functions.
The internal storage unit 16 is constituted by a storage device such as a writable and readable semiconductor memory, and stores control information 161 used when the central processing unit 15 executes the control program. The control information 161 includes information such as, for example, a received light color representing the color of the received light, a received light intensity representing the intensity of the received light, a received light number representing the number of light receiving locations, a pointer shape representing the shape of the instruction pointer, a pointer display color representing the color of the instruction pointer, a pointer size representing the size of the instruction pointer, a pointer display position representing the center position at which the instruction pointer is displayed, and a pointer display reference representing the reference for the display position of the instruction pointer.
The number of received light is the number of light receiving portions that receive light, and is the number that matches the number of light emission instruction devices 30 that are irradiating light if the light is not overlapping. The light receiving location is a pixel group including at least one pixel irradiated with light by the light emission instruction device 30. The instruction pointer which is a symbol image is an image for clearly indicating the position where light is emitted by the light emission instruction device 30 and indicates the position of the representative point of the light receiving location. The pointer display reference indicates whether the center position of the light receiving location is the reference or the brightest position of the light receiving location is the reference. That is, it is a reference whether the representative point is the center position of the light receiving location or the brightest position of the light receiving location.
The remote control light receiving unit 17 receives light from a remote controller (not shown) that transmits information for operating the display device 10 to the display device 10, converts the received light into an electrical signal, and sends the electrical signal to the remote control processing unit 18. The remote control processing unit 18 converts the electrical signal received from the remote control light receiving unit 17 into information instructed from the remote control and sends the information to the central processing unit 15.
The light receiving location determination unit 19, the instruction pointer display data creation unit 20, and the light reception waveform determination unit 21 are functions realized by the central processing unit 15 executing the control program. The light receiving location determination unit 19 determines the location of the light irradiated onto the photosensor built-in display device 13. Specifically, the light receiving location determination unit 19 determines a pixel for which the received light intensity obtained from the photosensors of the photosensor built-in display device 13 is at or above a predetermined reference intensity, for example 128 or more on a 256-level scale from 0 to 255, or a range in which a plurality of such pixels are adjacent to each other, as a light receiving location receiving the light emitted by the light emission instruction device 30.
The instruction pointer display data creation unit 20 determines, for each light receiving location determined by the light receiving location determination unit 19 to be receiving light, the position at which the instruction pointer is to be displayed, and creates instruction pointer data representing the instruction pointer. The light reception waveform determination unit 21 determines the emission pattern of the light emitted by the light emission instruction device 30. Specifically, the light reception waveform determination unit 21 determines that the pattern is a continuous emission pattern when the received light intensity obtained from the photosensors of the photosensor built-in display device 13 remains at or above the predetermined reference intensity for a predetermined period, for example a period of 2 milliseconds, and determines that it is a pulsed emission pattern when a state at or above the predetermined reference intensity and a state below it are repeated within the predetermined period. Pulsed emission patterns can be used as a plurality of distinct emission patterns by varying and combining the frequency, pulse width, pulse interval and the like of the repeated pulses. The continuous emission pattern is the first pattern, and the pulsed emission pattern is the second pattern.
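As an illustration of the waveform determination just described, the following Python sketch classifies a run of intensity samples covering the reference period; the sampling scheme, the threshold value and the function name are assumptions.

def classify_emission(samples, reference_intensity=128):
    """Label a run of per-frame intensity samples as continuous, pulsed, or no light."""
    above = [s >= reference_intensity for s in samples]
    if all(above):
        return "continuous"                 # first pattern: stays at or above the reference
    if any(above):
        return "pulsed"                     # second pattern: alternates above and below
    return "none"                           # never reaches the reference: no irradiation

print(classify_emission([200, 210, 205, 198]))        # continuous
print(classify_emission([200, 10, 190, 5, 210, 0]))   # pulsed
print(classify_emission([3, 5, 2]))                   # none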
The light emission instruction device 30 is a pointing device, such as a laser pointer, an LED (Light Emitting Diode) or a flashlight, that indicates a position by irradiating the screen with light. A light emission instruction device 30a includes a light emitting unit 31 that emits light; a light emission instruction device 30b includes, in addition to the light emitting unit 31, an intermittent driving unit 32 that emits the light in pulses; and a light emission instruction device 30c includes, in addition to the light emitting unit 31 and the intermittent driving unit 32, a button 33 for switching between pulsed and continuous emission.
FIG. 2 is a diagram illustrating an example of a screen 40 irradiated with light from the light emission instruction device 30. The light emission instruction device 30 shown in FIG. 2 is a pointing device with a small focal spread, such as a laser pointer, and a laser beam 39 from the light emission instruction device 30 is irradiated onto a region 41 of the screen 40 in which an image of a person is displayed. A screen 41 drawn to the right of the screen 40 is an enlargement in which the individual pixels 42 can be seen; it shows a person display image 43 and a laser pointer irradiation spot 44 irradiated with the laser beam 39.
Each pixel 42 includes a red sub-pixel 421, a green sub-pixel 422, and a blue sub-pixel 423, and each pixel 42 is provided with a photosensor 46. The optical sensor 46 includes three optical sensors 461 to 463 that detect color and an optical sensor (not shown) that detects the intensity of light. The optical sensor 461 detects red, the optical sensor 462 detects green, and the optical sensor 463 detects blue.
3A to 3D are diagrams illustrating examples of instruction pointers displayed at positions where light is irradiated. FIG. 3A shows an enlarged screen 41a that displays an instruction pointer 45a in which the position indicated by the laser beam 39 irradiating light is indicated in red. FIG. 3B shows an enlarged screen 41b displaying an instruction pointer 45b in which the position indicated by the laser beam 39 irradiated with light is represented by 3 × 3 white pixels. FIG. 3C shows an enlarged screen 41c that displays an instruction pointer 45c in which a position indicated by the laser beam 39 irradiated with light is represented by a 5 × 5 green pixel. The instruction pointer 45c is an instruction pointer that is larger than the instruction pointer 45b illustrated in FIG. 3B. FIG. 3D shows an enlarged screen 41d that displays an instruction pointer 45d that indicates the position indicated by the laser beam 39 irradiated with light in the shape of a red arrow. The tip of the arrow indicates the position of the pixel at the center of the pixel being irradiated with light, for example.
4A to 4C are diagrams showing examples of instruction pointers when the irradiated light has a spread. FIG. 4A shows that light from the flashlight is radiated to a wide area 44e. In FIG. 4A, the intensity of light emitted from the flashlight is stronger toward the center of the region 44e. The instruction pointer display data creation unit 20 determines the position of the pixel at the center of the area 44e or the position of the pixel having the strongest light reception among the pixels included in the area 44e as the position for displaying the instruction pointer. FIG. 4B shows an enlarged screen 41f displaying a red instruction pointer 45f at a position determined by the instruction pointer display data creation unit 20, and FIG. 4C shows a red arrow at a position determined by the instruction pointer display data creation unit 20. An enlarged screen 41g displaying a shape instruction pointer 45g is shown.
5A and 5B are diagrams showing examples of a screen 40h irradiated with light from two types of light emission instruction devices 30a and 30b having different colors of light to be irradiated. The light emission instruction devices 30a and 30b are both configured by a laser pointer, the light emission instruction device 30a emits a red laser beam, and the light emission instruction device 30b emits a yellow laser beam. FIG. 5A shows that the screen 40h is irradiated with a red laser beam from the light emission instruction device 30a and a yellow laser beam from the light emission instruction device 30b. The enlarged screen 41h shows the pixels 45h1 and 45h2 irradiated with the respective laser beams. In FIG. 5B, a square red instruction pointer 45j1 composed of 3 × 3 pixels is displayed at the position where the red laser beam from the light emission instruction device 30a is irradiated, and the yellow laser beam from the light emission instruction device 30b is irradiated. This indicates that a pointer 45j2 in the shape of a yellow arrow is displayed at the position where it is displayed.
FIG. 6 is a diagram illustrating an example of a screen 40k irradiated with light from two types of light emission instruction devices 30c and 30d having different emission patterns. The light emission instruction devices 30c and 30d are both configured by a laser pointer. The light emission instruction device 30c irradiates a laser beam with a continuous emission pattern, and the light emission instruction device 30d irradiates the laser beam with a pulsed emission pattern. FIG. 6 shows that the laser beam from the light emission instructing device 30c and the laser beam from the light emission instructing device 30d are irradiated on the screen 40k, and the enlarged screen 41k is irradiated with the respective laser beams. Pixels 45k1 and 45k2 are shown.
In the example illustrated in FIG. 6, the received light waveform determination unit 21 determines that the received light intensity received from the optical sensor of the pixel 45k1 is equal to or higher than a predetermined reference intensity for a predetermined period, and thus is a continuous emission pattern. Since the state where the intensity of light received from the photosensor of the pixel 45k2 is equal to or higher than the predetermined reference intensity and the state where it is lower than the predetermined reference intensity are repeated in a predetermined period, it is determined to be a pulsed emission pattern.
The variables used in the flowcharts of FIGS. 7 to 12 include the number of received light Lcount, the pointer size Lsize, the pointer display color Lcolor, the pointer shape Lpattern, the pointer display reference Lbase, the received light color PixelColor, the received light intensity PixelValue, and the coordinate data of the received light location. There are variables of Lpos and instruction pointer data PointInf.
The number of received light Lcount is the number of light receiving portions that receive light. When the portion irradiated with light is composed of a plurality of adjacent pixels, one unit is counted as one location. The variable Lcount is an integer from 0 to LMAX, and LMAX is the maximum number of received light colors, for example, “3”.
The pointer size Lsize is a value indicating the size of the instruction pointer: the value "0" represents a large size, the value "1" an intermediate size, and the value "2" a small size. The pointer display color Lcolor is a value indicating the color of the instruction pointer: the value "0" represents red, the value "1" green, the value "2" blue, and the value "3" the received light color. The pointer shape Lpattern is a value representing the shape of the instruction pointer: the value "0" represents a square, the value "1" a circle, and the value "2" an arrow.
The pointer size Lsize, the pointer display color Lcolor and the pointer shape Lpattern can be set for each of K, for example three, received light colors; when they are distinguished, they are written Lsize[K], Lcolor[K] and Lpattern[K]. Hereinafter, the data of the pointer size Lsize, the pointer display color Lcolor and the pointer shape Lpattern are also referred to as basic coordinate data. The pointer size Lsize, the pointer display color Lcolor and the pointer shape Lpattern are predetermined display forms.
The pointer display reference Lbase is used when the irradiated light has a spread, as with light emitted by a flashlight or the like, that is, when the irradiated portion consists of a plurality of adjacent pixels. It indicates whether the position at which the instruction pointer is displayed is based on the position of the center of the irradiated area (hereinafter "center reference") or on the position of the pixel at which the irradiated light is strongest (hereinafter "brightness reference"): the value "0" indicates the center reference, and the value "1" indicates the brightness reference.
The light reception color PixelColor and the light reception intensity PixelValue are pixel data representing the light reception color and the light reception intensity of the received light for each pixel. The light receiving color PixelColor is composed of three elements corresponding to the red sub-pixel, the green sub-pixel, and the blue sub-pixel, and is expressed by 256 levels of gradation from “0” to “255”, respectively. The pixel data of each pixel is composed of four elements: the gradation of the red sub-pixel, the gradation of the green sub-pixel, the gradation of the blue sub-pixel, and the intensity of light reception. Among the four elements, the gradation of the red subpixel, the gradation of the green subpixel, and the gradation of the blue subpixel are pixel data of the light reception color PixelColor, and the light reception intensity is the light reception intensity. PixelValue pixel data.
The screen of the photosensor built-in display device 13 is composed of M pixels in the horizontal direction and N pixels in the vertical direction, and the position of each pixel is represented by XY coordinates whose origin is the upper-left pixel as seen facing the screen. For example, the pixel data of the pixel at XY coordinates (0, 0) is (255, 255, 255, 255), the pixel data of the pixel at (0, 1) is (255, 250, 255, 252), the pixel data of the pixel at (100, 200) is (120, 100, 200, 180), the pixel data of the pixel at (100, 201) is (122, 98, 210, 190), the pixel data of the pixel at (M-1, N-2) is (0, 0, 0, 0), and the pixel data of the pixel at (M-1, N-1) is (0, 0, 0, 0). When the received light color and received light intensity of an individual pixel are referred to, they are written PixelColor[x, y] and PixelValue[x, y], respectively, using the XY coordinates of the pixel.
The coordinate data (hereinafter also referred to as “light reception data”) Lpos of the light receiving location represents the coordinates of each pixel included in the light receiving location for each light receiving location. The instruction pointer data PointInf is data representing the XY coordinates, color, size, and shape for displaying the instruction pointer for each light receiving location. The setting of the variables of the pointer size Lsize, the pointer display color Lcolor, the pointer shape Lpattern, and the pointer display reference Lbase will be described in detail in the setting process shown in FIG.
FIG. 7 is a flowchart illustrating an example of the first display process performed by the central processing unit 15. The first display process displays, on each pixel that has received light, an instruction pointer of the same color as the received light, that is, the received light color. After the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display/display processing temporary storage unit 12 on the photosensor built-in display device (hereinafter also "panel") 13, the process proceeds to step A1.
In step A1, the data for each pixel detected by each photosensor, specifically the received light color and received light intensity, is acquired from the panel 13 and set for each pixel in the received light color PixelColor and the received light intensity PixelValue. In step A2, the variables X and Y are each initialized by substituting "0" into them; the values of the variables X and Y are natural numbers. In step A3, it is determined whether or not the value of the variable Y is less than the vertical resolution of the panel 13, that is, the number of pixels in the vertical direction of the screen. If the value of Y is less than the vertical resolution of the panel 13, the process proceeds to step A4; if it is equal to or greater than the vertical resolution, the first display process ends.
In step A4, “0” is substituted into the variable X for initialization. In step A5, it is determined whether or not the value of the variable X is less than the horizontal resolution of the panel 13, that is, the number of pixels in the horizontal direction of the screen. If the value of the variable X is less than the horizontal resolution of the panel 13, the process proceeds to step A6, and if the value of the variable X is equal to or greater than the horizontal resolution of the panel 13, the process proceeds to step A10. In step A6, the PixelValue of the pixel at the coordinate (X, Y), that is, the intensity of light reception is acquired. In step A7, it is determined whether or not the acquired value of PixelValue is greater than a predetermined value, for example, the above-described predetermined reference intensity, that is, whether or not light is irradiated. If the value of PixelValue is greater than the predetermined value, the process proceeds to step A8, and if the value of PixelValue is not greater than the predetermined value, the process proceeds to step A9.
In step A8, the pixel at the coordinate (X, Y) is displayed in the PixelColor of the coordinate (X, Y), that is, the received light color. Specifically, the central processing unit 15 sets the pixel color at the coordinates (X, Y) to the PixelColor at the coordinates (X, Y) in the image information stored in the display / display processing temporary storage unit 12. That is, it is replaced with the color of the received light color. The display processing unit 14 reads the image information in which the color of the pixel at the coordinates (X, Y) is rewritten from the display / display processing temporary storage unit 12, converts the image information into a predetermined format, and displays the information on the panel 13. In step A9, “1” is added to the variable X, and the process returns to step A5. In Step A10, “1” is added to the variable Y, and the process returns to Step A3.
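A minimal Python sketch of the scan performed in steps A1 to A10 follows; nested lists stand in for the stored image information and the per-pixel sensor data, and the names are assumptions.

def first_display_process(image, pixel_value, pixel_color, reference=128):
    """Repaint every pixel whose received intensity exceeds the reference value with its received color."""
    height, width = len(image), len(image[0])
    for y in range(height):                      # steps A3, A10: scan rows
        for x in range(width):                   # steps A5, A9: scan columns
            if pixel_value[y][x] > reference:    # steps A6, A7: is the pixel irradiated?
                image[y][x] = pixel_color[y][x]  # step A8: display it in the received light color
    return image

img   = [[(0, 0, 0)] * 3 for _ in range(2)]
value = [[0, 200, 0], [0, 0, 0]]
color = [[(0, 0, 0), (255, 0, 0), (0, 0, 0)], [(0, 0, 0)] * 3]
print(first_display_process(img, value, color))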
FIG. 8 is a flowchart illustrating an example of the second display process performed by the central processing unit 15. The second display process is a process of displaying an instruction pointer having a preset color and shape at the position of the light receiving portion that has received the light. After the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display / display processing temporary storage unit 12 on the panel 13, the process proceeds to step B1.
In step B1, the data for each pixel detected by each photosensor, specifically the received light color and received light intensity, is acquired from the panel 13 and set for each pixel in the received light color PixelColor and the received light intensity PixelValue. In step B2, the light receiving location search process is performed. In step B3, the instruction pointer data creation process is performed. In step B4, the instruction pointer display process is performed, and the process returns to step B1.
FIG. 9 is a flowchart illustrating an example of a light receiving location search process called from the second display process. The light receiving location searching process is executed by the light receiving location determining section 19, and when the light receiving location determining section 19 is called from the second display process shown in FIG. 8, the process proceeds to step C1.
In step C1, initialization is performed by substituting “0” into the variables X and Y and the variable Lcount of the number of received light. In step C2, it is determined whether or not the value of the variable Y is less than the vertical resolution of the panel 13, that is, the number of pixels in the vertical direction of the screen. If the value of Y is less than the vertical resolution of the panel 13, the process proceeds to step C3, and if the value of Y is equal to or greater than the vertical resolution of the panel 13, the received light spot search process is terminated. In step C3, initialization is performed by substituting “0” into the variable X. In step C4, it is determined whether or not the value of the variable X is less than the horizontal resolution of the panel 13, that is, the number of pixels in the horizontal direction of the screen. If the value of the variable X is less than the horizontal resolution of the panel 13, the process proceeds to step C5. If the value of the variable X is equal to or greater than the horizontal resolution of the panel 13, the process proceeds to step C13.
In Step C5, PixelValue of the pixel at the coordinate (X, Y), that is, the intensity of light reception is acquired. In step C6, it is determined whether or not the acquired PixelValue value is greater than a predetermined value, that is, the predetermined reference intensity, that is, whether or not light is irradiated. If the value of PixelValue is greater than the predetermined value, the process proceeds to step C7, and if the value of PixelValue is not greater than the predetermined value, the process proceeds to step C11. In Step C7, initialization is performed by substituting “0” into a variable N which is a light reception data counter.
In step C8, it is determined whether or not the value of the variable N is less than the value of Lcount, that is, the number of received light that has already been detected. If the value of variable N is less than the value of Lcount, the process proceeds to step C9, and if the value of variable N is equal to or greater than the value of Lcount, the process proceeds to step C14. In step C9, it is determined whether or not the Nth received light data Lpos has the coordinates of adjacent pixels detected as having been irradiated with light (hereinafter referred to as “adjacent coordinates”). If there are adjacent coordinates in the Nth received light data Lpos, the process proceeds to step C10, and if there are no adjacent coordinates in the Nth received light data Lpos, the process proceeds to step C12.
In step C10, the value of the coordinate (X, Y) is added to the Nth received light data Lpos. In Step C11, “1” is added to the variable X, and the process returns to Step C4. In Step C12, “1” is added to the variable N, and the process returns to Step C8. In Step C13, “1” is added to the variable Y, and the process returns to Step C2. In Step C14, “1” is added to the variable Lcount, that is, the number of received light is counted up, and the process proceeds to Step C10.
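A minimal Python sketch of the light receiving location search in steps C1 to C14 follows; it groups pixels above the reference intensity into locations, opening a new location whenever a pixel is not adjacent to one already found. Treating adjacency as a Chebyshev distance of 1, and the names used, are assumptions.

def find_light_locations(pixel_value, reference=128):
    """Return a list of light receiving locations; each location is a list of (x, y) coordinates (Lpos)."""
    locations = []
    height, width = len(pixel_value), len(pixel_value[0])
    for y in range(height):
        for x in range(width):
            if pixel_value[y][x] <= reference:        # steps C5, C6: skip unlit pixels
                continue
            for loc in locations:                     # steps C8, C9: adjacent to an existing location?
                if any(abs(x - px) <= 1 and abs(y - py) <= 1 for px, py in loc):
                    loc.append((x, y))                # step C10: add the coordinates to that location
                    break
            else:
                locations.append([(x, y)])            # step C14: count up Lcount, open a new location
    return locations

value = [[0, 200, 210, 0],
         [0, 0, 0, 0],
         [0, 0, 0, 180]]
print(find_light_locations(value))   # [[(1, 0), (2, 0)], [(3, 2)]]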
FIG. 10 is a flowchart illustrating an example of instruction pointer data creation processing called from the second display processing. The instruction pointer data generation process is executed by the instruction pointer display data generation unit 20, and when the instruction pointer display data generation unit 20 is called from the second display process shown in FIG. 8, the process proceeds to step D1.
In step D1, the variable N is initialized by substituting "0" into it. In step D2, it is determined whether or not the variable N is less than the received light number Lcount. If the variable N is less than the received light number Lcount, the process proceeds to step D3; if the variable N is equal to or greater than the received light number Lcount, the process proceeds to step D13. In step D3, the received light color of the Nth light receiving location is determined based on the coordinate data of the light receiving location, that is, the received light data Lpos, and the received light color PixelColor of each pixel's data. In step D4, the basic coordinate data of the pointer size Lsize and the pointer shape Lpattern corresponding to the received light color determined in step D3 is copied to the coordinate data of the instruction pointer data PointInf. The basic coordinate data represents the XY coordinates of each pixel of the instruction pointer when the reference position is taken as the XY coordinates (0, 0).
In step D5, it is determined whether or not the value of the pointer display color Lcolor corresponding to the light reception color determined in step D3 is “3”, that is, the received color. If the value of the pointer display color Lcolor is “3”, the process proceeds to step D11. If the value of the pointer display color Lcolor is not “3”, the process proceeds to step D6. In step D6, the color of the pointer display color Lcolor is set in the color data of the instruction pointer data PointInf. The color of the pointer display color Lcolor is, for example, one of red, green, and blue.
In step D7, it is determined whether or not the value of the pointer display reference Lbase is “0”, that is, whether or not the display reference is the central portion, that is, the central reference. If the value of the pointer display reference Lbase is “0”, the process proceeds to step D8. If the value of the pointer display reference Lbase is not “0”, the process proceeds to step D12. In step D8, the average value of the XY coordinates of the pixels included in the Nth light receiving location is obtained, the obtained average value of the X coordinate is set to the variable Xb, and the obtained average value of the Y coordinate is set to the variable Yb.
In step D9, the value of the variable Xb is added to the X coordinate of the basic coordinate copied to the coordinate data of the pointing pointer data PointInf in step D4, and the value of the variable Yb is added to the Y coordinate.
In Step D10, “1” is added to the variable N, and the process returns to Step D2. In step D11, the color data corresponding to the coordinates of the instruction pointer data PointInf is extracted from the received color PixelColor of each pixel data, set to the color data of the instruction pointer data PointInf, and the process proceeds to step D7.
In step D12, the pixel having the largest received light intensity PixelValue among the pixels included in the Nth light receiving location is searched for, the X coordinate of the found pixel is set in the variable Xb, and its Y coordinate is set in the variable Yb. In step D13, data whose coordinate values are negative or exceed the resolution of the panel 13, that is, its number of pixels, is removed from the instruction pointer data PointInf. In step D14, the instruction pointer data PointInf is sorted, that is, rearranged in coordinate order, and the instruction pointer data creation process ends.
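A minimal Python sketch of how the pointer position is chosen for one light receiving location follows: the center reference averages the coordinates of the location's pixels (step D8) and the brightness reference picks the pixel with the largest received intensity (step D12). The names and the sample data are assumptions.

def pointer_position(location, pixel_value, use_center_reference=True):
    """location: list of (x, y) pixels in one light receiving location; returns (Xb, Yb)."""
    if use_center_reference:                              # Lbase == 0: center reference
        xb = sum(x for x, _ in location) // len(location)
        yb = sum(y for _, y in location) // len(location)
        return xb, yb
    # Lbase == 1: brightness reference, pick the pixel with the strongest received intensity
    return max(location, key=lambda p: pixel_value[p[1]][p[0]])

value = [[0, 50, 0], [40, 80, 255], [0, 30, 0]]
loc = [(1, 0), (0, 1), (1, 1), (2, 1), (1, 2)]
print(pointer_position(loc, value, use_center_reference=True))    # (1, 1)
print(pointer_position(loc, value, use_center_reference=False))   # (2, 1)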
FIG. 11 is a flowchart illustrating an example of an instruction pointer display process called from the second display process. When executing step B4 of the second display process shown in FIG. 8, the central processing unit 15 proceeds to step E1.
In step E1, initialization is performed by substituting “0” into variables X, Y, and Z, respectively. In step E2, it is determined whether or not the value of the variable Y is less than the vertical resolution of the panel 13. If the value of Y is less than the vertical resolution of the panel 13, the process proceeds to step E3. If the value of Y is equal to or greater than the vertical resolution of the panel 13, the instruction pointer display process is terminated. In step E3, initialization is performed by substituting “0” into the variable X. In step E4, it is determined whether or not the value of the variable X is less than the horizontal resolution of the panel 13. If the value of the variable X is less than the horizontal resolution of the panel 13, the process proceeds to step E5. If the value of the variable X is equal to or greater than the horizontal resolution of the panel 13, the process proceeds to step E9.
In step E5, it is determined whether or not the coordinates (X, Y) are the same as the coordinates of the Zth PointInf data. If the coordinates (X, Y) and the coordinates of the Zth PointInf data are the same, the process proceeds to step E6; if they are not the same, the process proceeds to step E8.
In step E6, the pixel at the coordinate (X, Y) is displayed in the color of the Zth PointInf data. Specifically, the central processing unit 15 sets the color of the pixel at the coordinate (X, Y) in the image information stored in the display / display processing temporary storage unit 12 to the color of the Zth PointInf data. Replace with The display processing unit 14 reads the image information in which the color of the pixel at the coordinates (X, Y) is rewritten from the display / display processing temporary storage unit 12, converts the image information into a predetermined format, and displays the information on the panel 13. In Step E7, “1” is added to the variable Z. In Step E8, “1” is added to the variable X, and the process returns to Step E4. In Step E9, “1” is added to the variable Y, and the process returns to Step E2. Steps D1 to D14 shown in FIG. 10 and steps E1 to E9 shown in FIG. 11 are first control steps.
FIG. 12 is a flowchart illustrating an example of the third display process performed by the central processing unit 15. The third display process displays an instruction pointer of a different color and shape for each light emission instruction device when light emission instruction devices 30 with a plurality of emission patterns, for example a continuous emission pattern and a pulsed emission pattern, are used. After the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display/display processing temporary storage unit 12 on the panel 13, the process proceeds to step F1. Steps F1 to F4 correspond to steps B1 to B4 shown in FIG. 8, respectively, and their description is omitted to avoid duplication.
FIG. 13 is a flowchart illustrating an example of the light receiving location search process called from the third display process, FIG. 14 is a flowchart illustrating an example of the instruction pointer data creation process called from the third display process, and FIG. 15 is a flowchart illustrating an example of the instruction pointer display process called from the third display process. The light receiving location search process shown in FIG. 13 is the same as the light receiving location search process shown in FIG. 9, and the instruction pointer display process shown in FIG. 15 is the same as the instruction pointer display process shown in FIG. 11; their description is omitted to avoid duplication.
The instruction pointer data creation process shown in FIG. 14 is executed by the instruction pointer display data creation unit 20; when called from the third display process shown in FIG. 12, the instruction pointer display data creation unit 20 proceeds to step H1. Steps H1 and H2 are the same as steps D1 and D2 shown in FIG. 10, steps H6 to H9 are the same as steps D7 to D10 shown in FIG. 10, respectively, and steps H12 to H14 are the same as steps D12 to D14 shown in FIG. 10, respectively; their description is omitted to avoid duplication.
In step H3, it is determined based on the determination result of the received light waveform determination unit 21 whether or not the emission pattern is pulsed. If the emission pattern is pulsed, the process proceeds to step H4, and if the emission pattern is not pulsed, that is, if it is a continuous emission pattern, the process proceeds to step H10. In step H4, the reference coordinate data whose shape is a quadrangle and whose size is “intermediate” is copied to the coordinate data of the instruction pointer data PointInf. In step H5, blue is set in the color data of the instruction pointer data PointInf.
In step H10, the reference coordinate data having a round shape and a size of “large” is copied to the coordinate data of the pointing pointer data PointInf. In step H11, red is set in the color data of the instruction pointer data PointInf, and the process proceeds to step H6.
Red is the first color, blue is the second color, the large circular shape is the first shape, and the middle rectangular shape is the second shape. It is. Step H3 shown in FIG. 14 is a determination step. Steps H1, H2 and steps H4 to H14 shown in FIG. 14 and steps J1 to J9 shown in FIG. 15 are second control steps.
Steps G1 to G14 shown in FIG. 13 are range determination steps.
Steps H7 and H8 shown in FIG. 14 are first position specifying steps.
Steps H8 and H12 shown in FIG. 14 are second position specifying steps.
Steps H1 to H6, H9 to H11, H13, and H14 shown in FIG. 14 and steps J1 to J9 shown in FIG. 15 are the third and fourth control steps.
FIG. 16 is a flowchart illustrating an example of the fourth display process performed by the central processing unit 15. In the fourth display process, when light emission instruction devices 30 with a plurality of emission patterns, for example a continuous emission pattern and a pulsed emission pattern, are used, an instruction pointer of a different color and shape is displayed for each light emission instruction device, and for one of the light emission instruction devices control related to screen control is performed. The control related to screen control is, for example, control that turns a page on the screen; specifically, it is performed by sending a page break command for turning the page on the screen to the PC or the like that inputs the video signal to the display device 10. After the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display/display processing temporary storage unit 12 on the panel 13, the process proceeds to step K1. Steps K1 to K4 correspond to steps B1 to B4 shown in FIG. 8, respectively, and their description is omitted to avoid duplication.
FIG. 17 is a flowchart illustrating an example of the light receiving location search process called from the fourth display process, FIG. 18 is a flowchart illustrating an example of the instruction pointer data creation process called from the fourth display process, and FIG. 19 is a flowchart illustrating an example of the instruction pointer display process called from the fourth display process. The light receiving location search process shown in FIG. 17 is the same as the light receiving location search process shown in FIG. 9, and the instruction pointer display process shown in FIG. 19 is the same as the instruction pointer display process shown in FIG. 11; their description is omitted to avoid duplication.
The instruction pointer data creation process shown in FIG. 18 is executed by the instruction pointer display data creation part 20, and when the instruction pointer display data creation part 20 is called from the fourth display process shown in FIG. 16, the process proceeds to step M1. . Steps M1 to M5 are the same as Steps H1 to H5 shown in FIG. 14, respectively, Steps M9 to M12 are the same as Steps H8 to H11 shown in FIG. 14, respectively, and Steps M14 and M15 are shown in FIG. Steps H13 and H14 shown in FIG. 14 are the same, and a description thereof is omitted to avoid duplication.
In step M6, the average value of the XY coordinates of the pixels included in the Nth light receiving location is obtained, the obtained average value of the X coordinate is set to the variable Xb, and the obtained average value of the Y coordinate is set to the variable Yb. In step M7, the XY coordinates (Xc, Yc) of the average value converted to the resolution of the input signal are calculated from the XY coordinates (Xb, Yb) of the average value. Specifically, the value of the coordinate Xc is calculated by the equation (1), and the value of the coordinate Yc is calculated by the equation (2).
Xc = Xb × (input signal horizontal resolution / panel horizontal resolution) (1)
Yc = Yb × (input signal vertical resolution / panel vertical resolution) (2)
In step M8, a page break command for turning the page on the screen is transmitted to the PC or the like that inputs the video to the video input terminal 11 together with the XY coordinates (Xc, Yc). In step M13, the average value of the XY coordinates of the pixels included in the Nth light receiving portion is obtained, the obtained average value of the X coordinate is set in the variable Xb, the obtained average value of the Y coordinate is set in the variable Yb, Proceed to step M9.
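A minimal worked example of equations (1) and (2) in Python follows; the panel and input-signal resolutions used are assumptions.

def to_input_resolution(xb, yb, panel=(1920, 1080), input_signal=(1024, 768)):
    """Convert averaged panel coordinates (Xb, Yb) to the input-signal resolution."""
    xc = xb * (input_signal[0] / panel[0])   # equation (1)
    yc = yb * (input_signal[1] / panel[1])   # equation (2)
    return round(xc), round(yc)

print(to_input_resolution(960, 540))   # (512, 384)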
FIG. 20 is a flowchart illustrating an example of the setting process performed by the central processing unit 15. Before a video signal from a PC or the like is displayed on the panel 13, the user can set the variables of the pointer size Lsize, the pointer display color Lcolor, the pointer shape Lpattern and the pointer display reference Lbase in advance for each received light color in response to a setting screen (not shown) displayed on the panel 13.
The central processing unit 15 generates image information for the setting screen used to set the variables of the pointer size Lsize, the pointer display color Lcolor, the pointer shape Lpattern and the pointer display reference Lbase, and stores it in the display/display processing temporary storage unit 12; the display processing unit 14 displays the image information stored in the display/display processing temporary storage unit 12 on the panel 13. The setting screen is a screen on which, for example, the content to be set for each variable can be selected with a remote controller. After displaying the setting screen on the panel 13, the central processing unit 15 proceeds to step P1.
In step P1, it is determined whether the pointer display reference selected by the remote controller is the “center reference” or the “brightness reference”. If the selected pointer display reference is the “center reference”, the process proceeds to step P2; if it is the “brightness reference”, the process proceeds to step P14. In step P2, “0” is substituted for the variable D. In step P3, the value of the variable D is substituted for the display reference Lbase. In step P4, “0”, that is, a value indicating red, is substituted for the variable K that selects the received light color.
In step P5, it is determined whether or not the value of the variable K is less than the number of received light colors that can be set, for example, the value “3” representing the three colors red, blue, and yellow. If the value of the variable K is less than “3”, the process proceeds to step P6; if it is “3” or more, the setting process is terminated. The variable K is “0” for red, “1” for blue, and “2” for yellow. In step P6, first, the received light color currently being set, that is, the color determined by the value of the variable K, is displayed on the setting screen. Next, it is determined whether the pointer shape selected by the remote controller is “square”, “circle”, or “arrow”. If the selected pointer shape is “square”, the process proceeds to step P15; if it is “circle”, the process proceeds to step P7; and if it is “arrow”, the process proceeds to step P16. In step P7, “1” is substituted for the variable A.
In step P8, it is determined whether the pointer size selected by the remote controller is “large”, “medium”, or “small”. If the selected size is “large”, the process proceeds to step P17; if it is “medium”, the process proceeds to step P9; and if it is “small”, the process proceeds to step P18. In step P9, “1” is substituted for the variable B.
In step P10, it is determined whether the pointer color selected by the remote controller is “red”, “blue”, “green”, or “light receiving color”. If the selected pointer color is “red”, the process proceeds to step P19; if it is “green”, the process proceeds to step P11; if it is “blue”, the process proceeds to step P20; and if it is the “light receiving color”, the process proceeds to step P21. In step P11, “1” is substituted for the variable C.
In step P12, the set contents are stored. Specifically, the value of the variable B is assigned to the pointer size Lsize[K], the value of the variable C is assigned to the pointer color Lcolor[K], and the value of the variable A is assigned to the pointer shape Lpattern[K]. In step P13, “1” is added to the variable K, and the process returns to step P5. In step P14, “1” is substituted for the variable D, and the process proceeds to step P3. In step P15, “0” is substituted for the variable A, and the process proceeds to step P8. In step P16, “2” is substituted for the variable A, and the process proceeds to step P8.
In Step P17, “0” is substituted into the variable B, and the process proceeds to Step P10. In Step P18, “2” is substituted into the variable B, and the process proceeds to Step P10. In Step P19, “0” is substituted for the variable C, and the process proceeds to Step P12. In Step P20, “2” is substituted into the variable C, and the process proceeds to Step P12. In Step P21, “3” is substituted for the variable C, and the process proceeds to Step P12.
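The setting process of FIG. 20 amounts to mapping the remote-controller selections onto the numeric codes held in the variables A to D and storing them for each received light color. The following Python sketch summarizes that mapping under the assumption that the encodings read off steps P1 to P21 above are complete; all names are illustrative, not from the patent.

```python
# Minimal sketch: store per received-color pointer settings into Lsize, Lcolor,
# Lpattern and the shared display reference Lbase, using the step P1-P21 encodings.

SHAPE = {"square": 0, "circle": 1, "arrow": 2}                           # variable A
SIZE = {"large": 0, "medium": 1, "small": 2}                             # variable B
COLOR = {"red": 0, "green": 1, "blue": 2, "light receiving color": 3}    # variable C
BASE = {"center reference": 0, "brightness reference": 1}                # variable D
RECEIVED_COLORS = ("red", "blue", "yellow")                              # variable K = 0, 1, 2

def store_settings(base_choice, per_color_choices):
    """per_color_choices: {received color: (shape, size, pointer color)}."""
    Lbase = BASE[base_choice]
    Lsize, Lcolor, Lpattern = {}, {}, {}
    for k, name in enumerate(RECEIVED_COLORS):          # loop of steps P5 to P13
        shape, size, color = per_color_choices[name]
        Lpattern[k] = SHAPE[shape]
        Lsize[k] = SIZE[size]
        Lcolor[k] = COLOR[color]
    return Lbase, Lsize, Lcolor, Lpattern

print(store_settings("center reference",
                     {"red": ("circle", "medium", "green"),
                      "blue": ("square", "large", "red"),
                      "yellow": ("arrow", "small", "light receiving color")}))
```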
As described above, when an image is displayed on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the central processing unit 15 causes the photosensor built-in display device 13 to display an image indicating the light detection position, for example an instruction pointer, according to the intensity of light detected by the photosensors.
Therefore, the position of the light irradiated on the screen can be clearly indicated by the pointing device that emits light.
Furthermore, since the image displayed on the photosensor built-in display device 13 is of a predetermined color, for example red, green, or blue, using a color that is easy for a person viewing the screen to recognize makes the position of the light irradiated on the screen by the light-emitting pointing device even clearer.
Furthermore, since the color of the light received by the photosensor is detected and the image displayed on the photosensor built-in display device 13 takes the color detected by the photosensor, the position can be indicated even more clearly in the color irradiated by the light-emitting pointing device.
Furthermore, since the image displayed on the photosensor built-in display device 13 has a predetermined display form, for example a quadrangle, a circle, or an arrow, using a shape that is easy for a person viewing the screen to recognize makes the position of the light irradiated on the screen by the light-emitting pointing device even clearer.
Further, the central processing unit 15 determines whether or not the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, and displays the image on the photosensor built-in display device 13 when it determines that the detected light intensity is equal to or greater than the predetermined reference intensity.
Therefore, the position irradiated with light having a predetermined reference intensity or higher can be displayed on the display device 13 with a built-in optical sensor.
Further, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light for each position, an image indicating the detection position, for example an instruction pointer, is displayed in steps D1 to D14 described above and steps E1 to E9 shown in FIG. 11 according to the intensity of light detected by the photosensors.
Therefore, by applying the display method according to the present invention, the position of the light irradiated on the screen by the light-emitting pointing device can be clearly indicated.
Further, when an image is displayed on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the central processing unit 15 discriminates the pattern in which the intensity of light detected by the photosensor changes. When the pattern discriminated by the central processing unit 15 is a continuous emission pattern, a symbol image of a first display form indicating the position at which the light was detected by the photosensor, for example a red or large-sized circular instruction pointer, is displayed; when the discriminated pattern is a pulsed emission pattern different from the continuous emission pattern, a symbol image of a second display form, different from the first display form and indicating the position, for example a blue or medium-sized square instruction pointer, is displayed.
Therefore, it is possible to clearly indicate the positions on the screen instructed by a plurality of, for example, two types of instruction devices that emit light having different patterns of varying light intensity.
Furthermore, since the first display form is red and the second display form is blue, a color different from red, the positions on the screen indicated by a plurality of, for example two, types of pointing devices that emit light whose intensity varies in different patterns can be clearly distinguished by different colors.
Further, since the first display form is a large-sized round shape and the second display form is a square shape of a different size, the positions on the screen indicated by a plurality of, for example two, types of pointing devices that emit light whose intensity varies in different patterns can be clearly distinguished by different shapes.
Furthermore, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light for each position, the pattern in which the intensity of light detected by the photosensor changes is discriminated in step H3 shown in FIG. 14.
In steps H1, H2 and H4 to H14 shown in FIG. 14 and steps J1 to J9 shown in FIG. 15, when the pattern discriminated in step H3 shown in FIG. 14 is a predetermined continuous emission pattern, a symbol image of the first display form indicating the position at which the light was detected by the photosensor, for example a red or large-sized circular instruction pointer, is displayed; when the pattern discriminated in step H3 shown in FIG. 14 is a pulsed emission pattern different from the continuous emission pattern, a symbol image of the second display form, different from the first display form and indicating the position, for example a blue or medium-sized square instruction pointer, is displayed.
Therefore, by applying the display method according to the present invention, the positions on the screen indicated by a plurality of, for example two, types of pointing devices that emit light whose intensity varies in different patterns can be specified.
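One plausible way to discriminate a continuous emission pattern from a pulsed one, as the above description requires, is to sample the received intensity at a light receiving location over several frames. The Python sketch below is only an assumed illustration, not the patent's discrimination logic; the reference intensity and the frame handling are invented for the example.

```python
# Assumed sketch: classify an emission pattern from per-frame intensity samples.

def classify_emission(intensities, reference=128):
    """intensities: received intensity at one location over consecutive frames."""
    lit = [v >= reference for v in intensities]
    if all(lit):
        return "continuous"   # -> first display form (e.g. red, large circle)
    if any(lit):
        return "pulsed"       # -> second display form (e.g. blue, medium square)
    return "none"

print(classify_emission([200, 210, 205, 198]))   # continuous
print(classify_emission([200, 0, 210, 0]))       # pulsed
```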
Furthermore, when an image is displayed on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the central processing unit 15 determines the light receiving locations, that is, the ranges in which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, specifies, for each determined range, the position that is the center of that range, and displays an image indicating the specified position, for example an instruction pointer.
Therefore, even when using a pointing device that spreads light applied to the screen, such as a flashlight, it is possible to clearly indicate the position on the screen that the operator points with the pointing device.
Furthermore, since the image indicating the specified position is in a predetermined display form, the position on the screen instructed by the operator using the pointing device can be clearly indicated by the specific display form.
Furthermore, since the predetermined display form is of a predetermined color, for example red, green, or blue, the position on the screen indicated by the pointing device can be made even clearer by using a color that is easy for the viewer to see.
Furthermore, since the color of light received by the photosensor is detected and the predetermined display form takes the color detected by the photosensor, the position can be clearly indicated in the color irradiated by the pointing device.
Furthermore, since the predetermined display form is a predetermined shape, for example a quadrangle, a circle, or an arrow, the position on the screen indicated by the pointing device can be shown even more clearly by using a shape that is easy for the viewer to see.
Furthermore, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the light receiving locations, that is, the ranges in which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, are determined in steps G1 to G14 shown in FIG. 13 and steps H7 and H8 shown in FIG. 14. In the same steps, the center position of each determined range is identified for each range. In steps H1 to H6, H9 to H11, H13, and H14 shown in FIG. 14 and steps J1 to J9 shown in FIG. 15, an image indicating the position identified in steps G1 to G14 shown in FIG. 13 and steps H7 and H8 shown in FIG. 14, for example an instruction pointer, is displayed.
Therefore, by applying the display method according to the present invention, the position on the screen indicated by the operator with the pointing device can be clearly shown even when a pointing device whose light spreads on the screen, such as a flashlight, is used.
Furthermore, when an image is displayed on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the central processing unit 15 determines the light receiving locations, that is, the ranges in which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, identifies, for each determined range, the position at which the light intensity within that range is strongest, and displays an image indicating the identified position, for example an instruction pointer.
Therefore, even when using a pointing device that spreads light applied to the screen, such as a flashlight, it is possible to clearly indicate the position on the screen that the operator points with the pointing device.
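For this “brightness reference” variant, the identified position is simply the pixel of strongest received intensity within each light receiving location. A minimal Python sketch, with an assumed data layout, is shown below.

```python
# Minimal sketch: pick the position of strongest received intensity inside one
# light receiving location (an alternative to the centroid of the range).

def brightest_position(pixels, intensity):
    """pixels: (x, y) coordinates of one light receiving location;
    intensity: dict mapping (x, y) -> received intensity."""
    return max(pixels, key=lambda p: intensity[p])

spot = [(10, 10), (10, 11), (11, 10)]
print(brightest_position(spot, {(10, 10): 90, (10, 11): 240, (11, 10): 130}))
```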
Furthermore, since the image indicating the specified position is in a predetermined display form, the position on the screen instructed by the operator using the pointing device can be clearly indicated by the specific display form.
Furthermore, since the predetermined display form is of a predetermined color, for example red, green, or blue, the position on the screen indicated by the pointing device can be made even clearer by using a color that is easy for the viewer to see.
Furthermore, since the color of light received by the photosensor is detected and the predetermined display form takes the color detected by the photosensor, the position can be clearly indicated in the color irradiated by the pointing device.
Furthermore, since the predetermined display form is a predetermined shape, for example a quadrangle, a circle, or an arrow, the position on the screen indicated by the pointing device can be shown even more clearly by using a shape that is easy for the viewer to see.
Furthermore, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the light receiving locations, that is, the ranges in which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, are determined in steps G1 to G14 shown in FIG. 13 and steps H8 and H12 shown in FIG. 14. In the same steps, the position with the strongest light intensity within each determined range is identified for each range. In steps H1 to H6, H9 to H11, H13, and H14 shown in FIG. 14 and steps J1 to J9 shown in FIG. 15, an image indicating the position identified in steps G1 to G14 shown in FIG. 13 and steps H8 and H12 shown in FIG. 14, for example an instruction pointer, is displayed.
Therefore, by applying the display method according to the present invention, the position on the screen indicated by the operator with the pointing device can be clearly shown even when a pointing device whose light spreads on the screen, such as a flashlight, is used.
FIG. 21 is a diagram showing a configuration of a display device 1A according to the second embodiment of the present invention. The display device 1A includes a display device 10A, a light emission instruction device 30, and a video generation device 50. The display method according to the present invention is processed by the display device 10A and the video generation device 50. In the present embodiment, the same reference numerals are given to the same parts as those in the above-described embodiment, and the description thereof is omitted.
The display device 10A includes N video input terminals 11, a display / display processing temporary storage unit 12, a photosensor built-in display device 13, a display processing unit 14, a central processing unit 15, an internal storage unit 16, a remote control (hereinafter referred to as “remote control”) light receiving unit 17, a remote control processing unit 18, a light receiving location determination unit 19, an instruction pointer display data creation unit 20, a light reception waveform determination unit 21, and a serial communication processing unit 22, and is connected to the video generation device 50.
The central processing unit 15 includes a central processing unit (hereinafter referred to as “CPU”) and a memory that stores a control program for controlling the display device 10A; by executing the control program stored in the memory, the CPU controls the photosensor built-in display device 13, the display processing unit 14, the remote control processing unit 18, and the serial communication processing unit 22.
The serial communication processing unit 22 is connected to the video generation device serial communication processing unit 55 of the video generation device 50 through a serial interface, and transmits and receives information to and from the video generation device serial communication processing unit 55. The serial communication processing unit 22 transmits commands for controlling the display image, instructed by the central processing unit 15, to the video generation device 50. The commands for controlling the display image are, for example, a page break command for turning the page of the displayed image information, a horizontal scroll command for moving the displayed image information in the X-axis direction, a vertical scroll command for moving the displayed image information in the Y-axis direction, an enlargement command for enlarging the displayed image information, and a reduction command for reducing the displayed image information.
The light emission instruction device 30 is a pointing device that indicates a position by irradiating the screen with light, such as a laser pointer, an LED (Light Emitting Diode), or a flashlight. The light emission instruction device 30 includes a light emitting unit 31 that emits light, an intermittent drive unit 32 that causes the light emitting unit 31 to emit light in a pulsed manner, and a button 33 that switches between pulsed and continuous emission.
The video generation device 50 is configured by a PC or the like, and includes a display data storage unit 51, a video signal generation unit 52, a video generation device central processing unit 53, a video generation device internal storage unit 54, and a video generation device serial communication processing unit 55. Including.
The display data storage unit 51 is a storage device, including a semiconductor memory or a hard disk device, that stores the image information to be displayed on the display device 10A. The video signal generation unit 52 reads, from the display data storage unit 51, the image information designated by the video generation device central processing unit 53 among the image information stored in the display data storage unit 51, converts the read image information into a video signal, and outputs it. The video signal output by the video signal generation unit 52 is input to any one of the video input terminals 11 of the display device 10A.
The video generation device central processing unit 53 includes a CPU and a memory that stores a video generation device control program for controlling the video generation device 50; by executing the video generation device control program stored in the memory, the CPU controls the video signal generation unit 52 and the video generation device serial communication processing unit 55. Instead of the CPU and the memory, the video generation device central processing unit 53 may be an FPGA (Field Programmable Gate Array), which is a programmable LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), which is an integrated circuit designed and manufactured for a specific application, or another circuit having an arithmetic function.
The video generation device internal storage unit 54 is configured by a storage device such as a writable and readable semiconductor memory, and stores control information 541 used when the video generation device central processing unit 53 executes the video generation device control program. The control information 541 includes information such as a display area. The display area is information representing the portion to be displayed on the display device 10A within the entire image information for one page; for example, it is expressed by the XY coordinates (Ppos_xs, Ppos_ys) of the pixel at the upper left position of the area, as seen facing the screen, and the XY coordinates (Ppos_xe, Ppos_ye) of the pixel at the lower right position. The XY coordinates take as their origin, for example, the position of the upper left pixel of the entire one-page image to be displayed, as seen facing the image.
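The display area can be pictured as a small record holding the two corner coordinates named above. The Python sketch below is an illustrative representation only; the class and method names are not from the patent.

```python
# Illustrative sketch of the display area held in the control information 541:
# the rectangle of the one-page image currently shown, given by its upper-left
# and lower-right pixel coordinates.

from dataclasses import dataclass

@dataclass
class DisplayArea:
    ppos_xs: int   # X of the upper-left pixel
    ppos_ys: int   # Y of the upper-left pixel
    ppos_xe: int   # X of the lower-right pixel
    ppos_ye: int   # Y of the lower-right pixel

    def width(self) -> int:
        return self.ppos_xe - self.ppos_xs + 1

    def height(self) -> int:
        return self.ppos_ye - self.ppos_ys + 1

area = DisplayArea(0, 0, 639, 479)
print(area.width(), area.height())   # 640 480
```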
The video generation device serial communication processing unit 55 is connected to the serial communication processing unit 22 of the display device through a serial interface, and transmits and receives information to and from the serial communication processing unit 22. The video generation device serial communication processing unit 55 receives commands such as a page break command, a horizontal scroll command, a vertical scroll command, an enlargement command, and a reduction command, which control the display image transmitted from the display device 10A. The received command is sent to the video generation device central processing unit 53. The central processing unit 15 and the video generation device central processing unit 53 are control means.
The variables used in the flowcharts of FIGS. 22 to 25 include variables of the light reception number Lcount, the light reception color PixelColor, the light reception intensity PixelValue, the light reception point coordinate data Lpos, and the instruction pointer data PointInf. The definition of these variables is as described above, and a description thereof will be omitted.
FIG. 22 is a flowchart illustrating an example of the fifth display process performed by the central processing unit 15. The fifth display process is a process of displaying an instruction pointer having a preset color and shape at the position of the light receiving portion that has received the light. After the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display / display processing temporary storage unit 12 on the panel 13, the process proceeds to step Q1.
In step Q1, the data detected by each photosensor of the panel 13 for each pixel, specifically the received light color and the received light intensity, are acquired and set to the received light color PixelColor and the received light intensity PixelValue for each pixel. In step Q2, a light receiving location search process is performed. In step Q3, an instruction pointer data creation process is performed. In step Q4, an instruction pointer display process is performed, and the process returns to step Q1.
FIG. 23 is a flowchart illustrating an example of the light receiving location search process called from the fifth display process. FIG. 24 is a flowchart illustrating an example of the instruction pointer data creation process called from the fifth display process. FIG. 25 is a flowchart illustrating an example of the instruction pointer display process called from the fifth display process. Steps R1 to R14 of the light receiving location search process shown in FIG. 23 are the same as steps C1 to C14 of the light receiving location search process shown in FIG. 9, steps S1 to S15 of the instruction pointer data creation process shown in FIG. 24 are the same as steps M1 to M15 of the instruction pointer data creation process shown in FIG. 18, and steps T1 to T9 of the instruction pointer display process shown in FIG. 25 are the same as steps E1 to E9 of the instruction pointer display process shown in FIG. 11. Step S3 shown in FIG. 24 is a determination step.
FIG. 26 is a flowchart illustrating an example of the command processing performed by the video generation device central processing unit 53. The command processing executes a command for controlling the display image received from the display device 10A, for example a page break command. After the video generation device central processing unit 53 instructs the video signal generation unit 52 to output the image information stored in the display data storage unit 51 to the display device 10A, the process proceeds to step U1.
In step U1, it is determined whether a page break command has been received. If a page break command has been received, the process proceeds to step U2; if not, the process returns to step U1. In step U2, the page break command is executed, and the process returns to step U1. Specifically, the video signal of the image information of the page following the page currently being output is output, and the process returns to step U1. Steps S1, S2, and S4 to S15 shown in FIG. 24, steps T1 to T9 shown in FIG. 25, and steps U1 and U2 shown in FIG. 26 are fifth control steps.
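Read as pseudocode, the command processing of FIG. 26 is a simple loop that waits for a page break command and advances the output page. The Python sketch below assumes a finite iterable of received commands and a list of pre-rendered pages, both of which are illustrative simplifications, not the patent's implementation.

```python
# Assumed sketch of the page break command loop on the video generation side.

def command_loop(commands, pages, output):
    """commands: iterable of command strings received over the serial interface;
    pages: per-page image information; output: callable that emits a page."""
    current = 0
    output(pages[current])                 # initial output instructed before step U1
    for cmd in commands:                   # step U1: examine each received command
        if cmd == "page break" and current + 1 < len(pages):
            current += 1                   # step U2: output the next page's video signal
            output(pages[current])

command_loop(["page break", "other", "page break"],
             ["page 1", "page 2", "page 3"], print)
```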
As described above, when an image is displayed on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the central processing unit 15 discriminates the pattern in which the intensity of light detected by the photosensor changes, that is, the emission pattern. When the pattern discriminated by the central processing unit 15 is a continuous emission pattern, the central processing unit 15 and the video generation device central processing unit 53 perform a first control on the display image; when the discriminated pattern is a pulsed emission pattern different from the continuous emission pattern, they perform a second control on the display image.
Therefore, it is possible to control the display image according to the light emitted to the screen by the light emission instruction device 30 that emits light, for example, according to the emission pattern in which the intensity of the light changes.
Further, since the first control is a control that displays an image indicating the position of the light detected by the photosensor and the second control is a control that operates on the display image, an instruction pointer is displayed when the emission pattern is continuous, and an operation on the displayed image information, for example turning the page, is performed when the emission pattern is pulsed.
Furthermore, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light for each position, the pattern in which the intensity of light detected by the photosensor changes, that is, the emission pattern, is discriminated in step S3 shown in FIG. 24. When the pattern discriminated in step S3 shown in FIG. 24 is a continuous emission pattern, the first control is performed on the display image in steps S2 and S9 to S15 shown in FIG. 24 and steps T1 to T9 shown in FIG. 25; when the pattern discriminated in step S3 shown in FIG. 24 is a pulsed emission pattern different from the continuous emission pattern, the second control is performed on the display image in steps S2, S4 to S10, S14, and S15 shown in FIG. 24 and steps U1 and U2 shown in FIG. 26.
Therefore, by applying the display method according to the present invention, control corresponding to the light irradiated on the screen by the light emission instruction device 30, for example control of the display image according to the emission pattern in which the intensity of the light changes, can be performed.
FIG. 27 is a diagram showing a configuration of a display device 1B according to the third embodiment of the present invention. The display device 1B includes a display device 10B, a light emission instruction device 30A, and a video generation device 50A. The display method according to the present invention is processed by the display device 10B and the video generation device 50A. In the present embodiment, the same reference numerals are given to the same parts as those in the above-described embodiment, and the description thereof is omitted. The connection relationship between the display device 10B and the video generation device 50A according to the present embodiment is similar to the connection relationship between the display device 10A and the video generation device 50 according to the above-described embodiment.
The display device 10B includes N video input terminals 11, a display / display processing temporary storage unit 12, a photosensor built-in display device 13, a display processing unit 14, a central processing unit 15, an internal storage unit 16, a remote control (hereinafter referred to as “remote control”) light receiving unit 17, a remote control processing unit 18, a light receiving location determination unit 19, an instruction pointer display data creation unit 20, a light receiving unit shape determination unit 21A, and a serial communication processing unit 22, and is connected to the video generation device 50A.
The light receiving location determination unit 19, the instruction pointer display data creation unit 20, and the light receiving unit shape determination unit 21A are functions realized by the central processing unit 15 executing a control program.
The instruction pointer display data creation unit 20 determines the position at which the instruction pointer is displayed for the light receiving part determined to receive light by the light receiving part determination unit 19 and creates instruction pointer data representing the instruction pointer. At the same time, a command corresponding to the shape determined by the light receiving portion shape determination unit 21A is transmitted to the video generation device 50. The light receiving unit shape determining unit 21A, which is a shape determining unit, is called from the instruction pointer display data creating unit 20 and determines the shape of the light irradiated on the screen by the light emission instruction device 30. Specifically, the light receiving portion shape determination unit 21A determines a shape formed by pixels whose received light intensity received from the optical sensor of the optical sensor built-in display device 13 is equal to or higher than a predetermined reference intensity.
The light emission instruction device 30A is an instruction device that indicates a position by irradiating the screen with light such as a laser pointer. The light emission instructing device 30A includes a light emitting unit 31A that emits light, for example, a laser beam, and a light emission shape switching unit 32A that switches the shape of light formed on the screen by irradiating light.
The video generation device 50A is configured by a PC or the like, and includes a display data storage unit 51, a video signal generation unit 52, a video generation device central processing unit 53, a video generation device internal storage unit 54, a video generation device serial communication processing unit 55, A scroll processing unit 56 and an enlargement processing unit 57 are included.
The scroll processing unit 56 and the enlargement processing unit 57 are functions realized by the video generation device central processing unit 53A executing the video generation device control program. The scroll processing unit 56 performs a process of horizontally scrolling or vertically scrolling image information to be displayed on the screen of the display device 10B among the image information for one page. The horizontal scroll is a process of moving the image information to be displayed among the image information for one page in the X-axis direction of the XY coordinates, and the vertical scroll is a process of moving in the Y-axis direction. The enlargement processing unit 57 performs a process of enlarging image information to be displayed on the screen of the display device 10B among the image information for one page. The central processing unit 15 and the video generation device central processing unit 53A are control means.
FIGS. 28A to 28D are diagrams showing examples of the shapes of the light emitted by the light emission instruction device 30. FIG. 28A shows an example in which the shape of the light emitted by the light emission instruction device 30 is a circle shape 61. The circle shape 61 is a predetermined shape, for example a circular shape, whose central portion is not irradiated with light. FIG. 28B shows an example in which the shape of the light emitted by the light emission instruction device 30 is a dot shape 62. The dot shape 62 is a predetermined shape, for example a circular shape, whose central portion is irradiated with light. FIG. 28C shows an example in which the shape of the light emitted by the light emission instruction device 30 is a horizontally long shape 63. The horizontally long shape 63 is the shape of a horizontal line segment whose inclination is equal to or less than a predetermined angle. FIG. 28D shows an example in which the shape of the light emitted by the light emission instruction device 30 is a vertically long shape 64. The vertically long shape 64 is the shape of a vertical line segment whose inclination is equal to or less than a predetermined angle. Hereinafter, the horizontally long shape is also referred to as a horizontal line shape, and the vertically long shape is also referred to as a vertical line shape.
FIG. 29 is a diagram illustrating an example of a display image in which the panel 13 is irradiated with light having the dot shape 62. The image information 70 is the image information for one page stored in the display data storage unit 51A, and the image information in the area 71 of that one-page image information is displayed on the panel 13. The screen 72 is the screen of the panel 13 and displays the image information in the area 71 of the one-page image information 70 stored in the display data storage unit 51A. The screen 74 is a partial screen obtained by enlarging the peripheral portion of the screen 72 irradiated with the laser beam from the light emission instruction device 30, and is irradiated with light having the dot shape 62.
FIG. 30 is a diagram illustrating an example of the display image displayed when the light shape is the horizontally long shape 63. When the light emission instruction device 30 irradiates the screen 75 with light of the horizontally long shape 63, the display device 1B moves the region of the image information to be displayed to the right of the region of the image information currently displayed, that is, it horizontally scrolls the image information displayed on the screen of the panel 13. The example shown in FIG. 30 is a state in which the region of the image information to be displayed has been horizontally scrolled from the area 71, that is, moved to the right, and the image information of the area 71a is displayed on the screen 75.
FIG. 31 is a diagram illustrating an example of the display image displayed when the light shape is the vertically long shape 64. When the light emission instruction device 30 irradiates the screen 76 with light of the vertically long shape 64, the display device 1B moves the region of the image information to be displayed downward from the region of the image information currently displayed, that is, it vertically scrolls the image information displayed on the screen of the panel 13. The example shown in FIG. 31 is a state in which the region of the image information to be displayed has been vertically scrolled from the area 71a, that is, moved downward, and the image information of the area 71b is displayed on the screen 76.
FIG. 32 is a diagram illustrating an example of the display image displayed when the light shape is the circle shape 61. When the light emission instruction device 30 irradiates the screen 77 with light of the circle shape 61, the display device 1B enlarges the image information displayed on the screen 77 around the pixel irradiated with the circle shape 61 and displays the enlarged image information on the screen of the panel 13. While the light of the circle shape 61 is irradiated on the screen 77 by the light emission instruction device 30, the display device 1B gradually enlarges the image information displayed on the screen 77 and displays it on the screen 77. At this time, the area of the image information displayed on the screen of the panel 13 changes from the area 71b to the area 71c.
The variables used in the flowcharts of FIGS. 33 to 37 include variables of the light reception number Lcount, the light reception color PixelColor, the light reception intensity PixelValue, the light reception point coordinate data Lpos, and the instruction pointer data PointInf. The definition of these variables is as described above, and a description thereof will be omitted.
FIG. 33 is a flowchart illustrating an example of a sixth display process performed by the central processing unit 15. The sixth display process is a process of displaying an instruction pointer of a preset color and shape at the position of the light receiving point that has received the light. After the central processing unit 15 instructs the display processing unit 14 to display the image information stored in the display / display processing temporary storage unit 12 on the panel 13, the process proceeds to step V1. Steps V1 to V4 correspond to steps Q1 to Q4 shown in FIG. 22, respectively, and description thereof is omitted to avoid duplication.
FIG. 34 is a flowchart illustrating an example of a light receiving location search process called from the sixth display process. The light receiving location searching process is executed by the light receiving location determining unit 19, and when the light receiving location determining unit 19 is called from the sixth display process shown in FIG. 33, the process proceeds to step W1. The received light spot search processes W1 to W4 and W6 to W14 shown in FIG. 34 are the same as the received light spot search processes R1 to R4 and R6 to R14 shown in FIG. 23, and a description thereof will be omitted to avoid duplication.
In step W5, the PixelValue, that is, the received light intensity, and the PixelColor, that is, the received light color, of the pixel at the coordinates (X, Y) are acquired.
FIG. 35 is a flowchart illustrating an example of instruction pointer data creation processing called from the sixth display processing. The instruction pointer data generation process is executed by the instruction pointer display data generation unit 20, and when the instruction pointer display data generation unit 20 is called from the sixth display process shown in FIG. 33, the process proceeds to step AA1.
In step AA1, initialization is performed by substituting “0” into the variable N. In Step AA2, it is determined whether or not the variable N is less than the number of received light Lcount. If the variable N is less than the light reception number Lcount, the process proceeds to step AA3. If the variable N is greater than or equal to the light reception number Lcount, the instruction pointer data creation process is terminated.
In step AA3, the coordinates of the Nth light receiving location data Lpos are copied to the coordinate data of the Nth instruction pointer data PointInf. In step AA4, the color data for each coordinate of the Nth instruction pointer data PointInf is extracted from the received light color PixelColor data and copied. In step AA5, the light receiving portion shape determination unit 21A is called to determine the shape of the irradiated portion, that is, the light receiving location. In step AA6, the shape determined by the light receiving portion shape determination unit 21A is examined. If the determined shape is the dot shape, specifically if the variable A is “0”, the process proceeds to step AA8; if it is the circle shape, specifically if the variable A is “3”, the process proceeds to step AA7; if it is the horizontal line shape, specifically if the variable A is “1”, the process proceeds to step AA9; and if it is the vertical line shape, specifically if the variable A is “2”, the process proceeds to step AA10.
In step AA7, an “enlargement” command is transmitted to the video generation device 50A by the serial communication processing unit 22. In step AA8, “1” is added to the variable N, and the process returns to step AA2. In step AA9, a “horizontal scroll” command is transmitted to the video generation device 50A by the serial communication processing unit 22, and the process proceeds to step AA8. In step AA10, a “vertical scroll” command is transmitted to the video generation device 50A by the serial communication processing unit 22, and the process proceeds to step AA8.
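Steps AA6 to AA10 are essentially a lookup from the shape code produced by the shape determination process to the serial command to transmit. The Python sketch below shows that dispatch, using the encoding from steps CC11 to CC14 (0 = dot, 1 = horizontal line, 2 = vertical line, 3 = circle); send_command is an illustrative stand-in for the serial communication processing unit 22.

```python
# Minimal sketch of the dispatch in steps AA6 to AA10.

SHAPE_TO_COMMAND = {
    1: "horizontal scroll",   # horizontally long shape 63
    2: "vertical scroll",     # vertically long shape 64
    3: "enlargement",         # circle shape 61
}

def dispatch(shape_code, send_command):
    cmd = SHAPE_TO_COMMAND.get(shape_code)   # the dot shape (0) sends no command
    if cmd is not None:
        send_command(cmd)

dispatch(3, print)   # prints "enlargement"
```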
FIG. 36 is a flowchart illustrating an example of an instruction pointer display process called from the sixth display process. After executing step V4 of the display process shown in FIG. 33, the central processing unit 15 proceeds to step BB1. The instruction pointer display processes BB1 to BB9 shown in FIG. 36 are the same as the instruction pointer display processes E1 to E9 shown in FIG. 11, and a description thereof is omitted to avoid duplication.
FIG. 37 is a flowchart illustrating an example of a shape determination process called from the instruction pointer data creation process. The shape determination process is executed by the light receiving portion shape determination unit 21A, and when called from the instruction pointer data creation process shown in FIG. 35, the process proceeds to step CC1.
In step CC1, the X coordinate of the pixel of interest is set to the minimum X coordinate Xmin among the X coordinates of the light receiving location. In step CC2, it is determined whether or not the X coordinate of the pixel of interest is equal to or less than the maximum X coordinate Xmax among the X coordinates of the light receiving location. If the X coordinate of the pixel of interest is equal to or less than Xmax, the process proceeds to step CC3; if it is greater than Xmax, the process proceeds to step CC12. In step CC3, it is determined whether or not the inclination in the X-axis direction is within an allowable range. Specifically, it is determined whether or not the range of Y coordinates at the coordinate X is less than 5, that is, whether or not the difference between the Y coordinate of the pixel of interest and the Y coordinate of the pixel at the coordinate Xmin is less than 5. If the Y coordinate range at the coordinate X is less than 5, the process proceeds to step CC4; if it is 5 or more, the process proceeds to step CC5. In step CC4, “1” is added to the variable X, and the process returns to step CC2.
In step CC5, the Y coordinate of the pixel of interest is set to the minimum Y coordinate Ymin among the Y coordinates of the light receiving location. In step CC6, it is determined whether or not the Y coordinate of the pixel of interest is equal to or less than the maximum Y coordinate Ymax among the Y coordinates of the light receiving location. If the Y coordinate of the pixel of interest is equal to or less than Ymax, the process proceeds to step CC7; if it is greater than Ymax, the process proceeds to step CC13. In step CC7, it is determined whether or not the inclination in the Y-axis direction is within an allowable range. Specifically, it is determined whether or not the range of X coordinates at the coordinate Y is less than 5, that is, whether or not the difference between the X coordinate of the pixel of interest and the X coordinate of the pixel at the coordinate Ymin is less than 5. If the X coordinate range at the coordinate Y is less than 5, the process proceeds to step CC8; if it is 5 or more, the process proceeds to step CC9. In step CC8, “1” is added to the variable Y, and the process returns to step CC6.
In step CC9, the center coordinates (Xm, Ym) of the light receiving location are obtained. Specifically, they are obtained by the equations Xm = (Xmin + Xmax) / 2 and Ym = (Ymin + Ymax) / 2. In step CC10, it is determined whether or not the central portion is irradiated with light. Specifically, it is determined whether or not two or more of the pixels included in the light receiving location data Lpos have coordinates within the range whose upper left position is the coordinates (Xm-2, Ym-2) and whose lower right position is the coordinates (Xm+2, Ym+2). If there are two or more such pixels, the process proceeds to step CC11; if not, the process proceeds to step CC14.
In step CC11, a value “0” representing the dot shape is substituted for variable A, and the shape determination process is terminated. In step CC12, the value “1” representing the horizontal line shape is substituted for the variable A, and the shape determination process ends. In step CC13, the value “2” representing the vertical line shape is substituted for the variable A, and the shape determination process ends. In step CC14, the value “3” representing the circle shape is substituted for the variable A, and the shape determination process ends. Steps CC1 to CC14 shown in FIG. 37 are shape determination steps.
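A compact reading of the shape determination of FIG. 37 is given in the Python sketch below. It keeps the 5-pixel slope tolerance and the 2-pixel center window of steps CC1 to CC14 but simplifies the pixel-by-pixel scan into whole-set checks; the data layout and the examples are invented for illustration.

```python
# Simplified sketch of the shape determination: classify a light receiving
# location as horizontal line (1), vertical line (2), dot (0), or circle (3).

def determine_shape(pixels):
    """pixels: set of (x, y) coordinates whose intensity met the reference."""
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    xmin, xmax, ymin, ymax = min(xs), max(xs), min(ys), max(ys)

    # Steps CC1-CC4/CC12: nearly constant Y across the X extent -> horizontal line
    if all(abs(y - ymin) < 5 for _, y in pixels):
        return 1
    # Steps CC5-CC8/CC13: nearly constant X across the Y extent -> vertical line
    if all(abs(x - xmin) < 5 for x, _ in pixels):
        return 2
    # Steps CC9-CC11/CC14: lit center -> dot, otherwise -> circle (ring)
    xm, ym = (xmin + xmax) // 2, (ymin + ymax) // 2
    center = [p for p in pixels
              if xm - 2 <= p[0] <= xm + 2 and ym - 2 <= p[1] <= ym + 2]
    return 0 if len(center) >= 2 else 3

ring = {(x, y) for x in range(20) for y in range(20)
        if 64 <= (x - 10) ** 2 + (y - 10) ** 2 <= 100}
print(determine_shape(ring))   # 3 (circle shape)
disc = {(x, y) for x in range(20) for y in range(20)
        if (x - 10) ** 2 + (y - 10) ** 2 <= 36}
print(determine_shape(disc))   # 0 (dot shape)
```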
FIG. 38 is a flowchart illustrating an example of the command processing performed by the video generation device central processing unit 53A. The command processing executes a command for controlling the display image received from the display device 10B, such as a horizontal scroll command, a vertical scroll command, an enlargement command, or a reduction command. After the video generation device central processing unit 53A instructs the video signal generation unit 52A to output the image information stored in the display data storage unit 51A to the display device 10B, the process proceeds to step DD1.
In step DD1, it is determined whether a command has been received. If no command has been received, the process proceeds to step DD5. If the received command is a horizontal scroll command, the process proceeds to step DD8; if it is a vertical scroll command, the process proceeds to step DD2; and if it is an enlargement command, the process proceeds to step DD13. In step DD2, it is determined whether or not the display area is at the lowermost end of the display data. Specifically, it is determined whether or not the display area includes the lowermost pixel of the image information for one page. If the display area is at the lowermost end of the display data, the process proceeds to step DD7; if not, the process proceeds to step DD3.
In step DD3, it is determined whether or not the difference between the lowermost end of the display area and the lowermost end of the display data is one pixel (hereinafter also referred to as “dot”). If the difference between the bottom end of the display area and the bottom end of the display data is 1 dot, the process proceeds to step DD6. If the difference between the bottom end of the display area and the bottom end of the display data is not 1 dot, the process proceeds to step DD4. . In step DD4, “2” is added to the Y coordinate of the display area. Specifically, “2” is added to the Y coordinates Ppos_ys and Ppos_ye.
In step DD5, a video signal is generated in the display area, that is, the area determined by the XY coordinates (Ppos_xs, Ppos_ys) of the pixel located at the upper left position and the XY coordinates (Ppos_xe, Ppos_ye) of the pixel located at the lower right position. The data is generated and output by the unit 52A, and the process returns to step DD1. In step DD6, the display area is set to the lowermost end, and the process proceeds to step DD5. That is, without changing the X coordinate, the lowermost pixel of the display area is set to the lowermost pixel of the image information for one page, and the process proceeds to step DD5. In step DD7, the display area is set to the uppermost end, and the process proceeds to step DD5. In other words, the X coordinate is not changed, and the uppermost pixel of the display area becomes the uppermost pixel of the image information for one page, and the process proceeds to step DD5.
In step DD8, it is determined whether or not the display area is the rightmost end of the display data. Specifically, it is determined whether or not the display area includes the rightmost pixel among pixels of pixel information for one page. If the display area is the rightmost edge of the display data, the process proceeds to step DD12. If the display area is not the rightmost edge of the display data, the process proceeds to step DD9. In step DD9, it is determined whether or not the difference between the rightmost edge of the display area and the rightmost edge of the display data is 1 dot. If the difference between the rightmost edge of the display area and the rightmost edge of the display data is 1 dot, the process proceeds to step DD11. If the difference between the rightmost edge of the display area and the rightmost edge of the display data is not 1 dot, the process proceeds to step DD10. .
In step DD10, "2" is added to the X coordinate of the display area. Specifically, 2 is added to the X coordinates Ppos_xs and Ppos_xe, and the process proceeds to step DD5. In step DD11, the display area is set to the rightmost end, and the process proceeds to step DD5. That is, without changing the Y coordinate, the rightmost pixel of the display area is set to the rightmost pixel of the image information for one page, and the process proceeds to step DD5. In step DD12, the display area is set to the leftmost end, and the process proceeds to step DD5. That is, without changing the Y coordinate, the leftmost pixel of the display area is set to the leftmost pixel of the image information for one page, and the process proceeds to step DD5.
In step DD13, it is determined whether or not the width of the display area, that is, the number of pixels in the X-axis direction is within 10 dots. If the width of the display area is within 10 dots, the process proceeds to step DD16. If the width of the display area is not within 10 dots, the process proceeds to step DD14. In step DD14, it is determined whether or not the height of the display area, that is, the number of pixels in the Y-axis direction is within 10 dots. If the height of the display area is within 10 dots, the process proceeds to step DD16. If the height of the display area is not within 10 dots, the process proceeds to step DD15.
In step DD15, “2” is added to each of the X coordinate and Y coordinate of the upper left pixel of the display area, “2” is subtracted from each of the X coordinate and Y coordinate of the lower right pixel, and the process proceeds to step DD5. In step DD16, the display area is set to the standard size, that is, the non-enlarged size, and the process proceeds to step DD5.
The video generation device central processing unit 53A executes steps DD2 to DD7 and steps DD8 to DD12 by the scroll processing unit 56, and executes steps DD13 to DD16 by the enlargement processing unit 57. Steps V1 to V4 shown in FIG. 33 and steps DD1 to DD16 shown in FIG. 38 are sixth control steps.
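The vertical scroll and enlargement branches of FIG. 38 reduce to small arithmetic updates of the display area. The Python sketch below follows the 2-dot scroll step and the 10-dot minimum size from steps DD2 to DD16, but simplifies the wrap-around handling; the function names and tuple layout are illustrative.

```python
# Simplified sketch of the display-area updates of FIG. 38, with the area given
# as a (Ppos_xs, Ppos_ys, Ppos_xe, Ppos_ye) tuple.

def vertical_scroll(area, page_height):
    xs, ys, xe, ye = area
    if ye >= page_height - 1:                # step DD2: already at the lowermost end
        return xs, 0, xe, ye - ys            # step DD7: move the area to the top
    if page_height - 1 - ye == 1:            # step DD3: one dot short of the bottom
        shift = 1                            # step DD6: snap to the lowermost end
    else:
        shift = 2                            # step DD4: scroll down by two dots
    return xs, ys + shift, xe, ye + shift

def enlarge(area, standard):
    xs, ys, xe, ye = area
    if xe - xs + 1 <= 10 or ye - ys + 1 <= 10:   # steps DD13/DD14: minimum size reached
        return standard                           # step DD16: back to the standard size
    return xs + 2, ys + 2, xe - 2, ye - 2         # step DD15: shrink the area -> zoom in

area = (0, 0, 639, 479)
print(vertical_scroll(area, 1024))           # (0, 2, 639, 481)
print(enlarge(area, (0, 0, 639, 479)))       # (2, 2, 637, 477)
```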
Although the process for reducing the image information is not described in the above-described flowchart, the same process can be performed by assigning a shape other than the dot shape, the horizontally long shape, the vertically long shape, and the circle shape to the reduction command. Furthermore, the enlarged image information or the reduced image information may be restored by assigning another shape to a command for restoring the enlarged image information or the reduced image information.
As described above, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light at a plurality of positions, the light receiving portion shape determination unit 21A determines the shape formed by the positions at which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity, and the central processing unit 15 and the video generation device central processing unit 53A perform predetermined control on the display image in accordance with the shape determined by the light receiving portion shape determination unit 21A.
Therefore, it is possible to control the display image in accordance with the light irradiated on the screen by the pointing device that emits light, for example, the shape formed on the screen by the light.
Further, since the predetermined control is a control for scrolling and displaying the display image, image information that does not enter the screen can be scrolled and displayed depending on the shape of the light applied to the screen.
Furthermore, since the predetermined control is control for enlarging or reducing the display image, or for returning the enlarged or reduced display image to its original size and displaying it, the image information can be enlarged, reduced, or displayed at its original size depending on the shape of the light irradiated on the screen.
Further, in displaying an image on the photosensor built-in display device 13, which includes a plurality of pixels for displaying an image and photosensors for detecting the intensity of received light for each position, the shape formed by the positions at which the intensity of light detected by the photosensor is equal to or greater than a predetermined reference intensity is determined in steps CC1 to CC14 shown in FIG. 37. In steps V1 to V4 shown in FIG. 33 and steps DD1 to DD16 shown in FIG. 38, predetermined control is performed on the display image in accordance with the shape determined in steps CC1 to CC14 shown in FIG. 37.
Therefore, when the display method according to the present invention is applied, it is possible to control the display image according to the light irradiated on the screen by the pointing device that emits light, for example, the shape formed on the screen by the light.
Furthermore, since the predetermined control is control for scrolling and displaying the display image, image information that does not enter the screen can be scrolled and displayed by applying the display method according to the present invention.
Further, since the predetermined control is control for enlarging or reducing the display image, or for returning the enlarged or reduced display image to its original size and displaying it, applying the display method according to the present invention allows the image information to be displayed enlarged, reduced, or at its original size.
The present invention can be implemented in various other forms without departing from its spirit or main features. The above-described embodiments are therefore merely examples in all respects; the scope of the present invention is defined by the claims and is not restricted by the text of the specification. All modifications and changes that fall within the scope of the claims are within the scope of the present invention.

Claims (27)

1. A display device comprising: display means including a plurality of pixels for displaying an image, and a detection unit for detecting the intensity of received light at each of a plurality of positions; and control means for causing the display means to display, in accordance with the intensity of light detected by the detection unit, at least an image indicating the position at which the light is detected.
2. The display device according to claim 1, wherein the image displayed on the display means is of a predetermined color.
3. The display device according to claim 1, wherein the detection unit is capable of detecting the color of the received light, and the image displayed on the display means is of the color detected by the detection unit.
4. The display device according to any one of claims 1 to 3, wherein the image displayed on the display means is in a predetermined display form.
5. The display device according to any one of claims 1 to 4, further comprising determination means for determining whether the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity, wherein the control means causes the display means to display the image when the determination means determines that the intensity of light detected by the detection unit is equal to or greater than the predetermined reference intensity.
6. A display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at each position, the method comprising a first control step of displaying an image indicating a detection position in accordance with the intensity of light detected by the detection unit.
7. A display device comprising: display means including a plurality of pixels for displaying an image, and a detection unit for detecting the intensity of received light at each of a plurality of positions; discrimination means for discriminating a pattern in which the intensity of light detected by the detection unit changes; and control means for displaying a symbol image in a first display form indicating the position at which light is detected by the detection unit when the pattern discriminated by the discrimination means is a first pattern, and for displaying a symbol image in a second display form, different from the first display form and indicating the position, when the pattern discriminated by the discrimination means is a second pattern different from the first pattern.
8. The display device according to claim 7, wherein the first display form is a first color, and the second display form is a second color different from the first color.
9. The display device according to claim 7, wherein the first display form is a first shape, and the second display form is a second shape different from the first shape.
10. A display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at each position, the method comprising: a discrimination step of discriminating a pattern in which the intensity of light detected by the detection unit changes; and a second control step of displaying a symbol image in a first display form indicating the position at which light is detected by the detection unit when the pattern discriminated in the discrimination step is a first pattern, and displaying a symbol image in a second display form, different from the first display form and indicating the position, when the pattern discriminated in the discrimination step is a second pattern different from the first pattern.
11. A display device comprising: display means including a plurality of pixels for displaying an image, and a detection unit for detecting the intensity of received light at each of a plurality of positions; range determination means for determining ranges in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity; position specifying means for specifying, for each range determined by the range determination means, the position at the center of that range; and control means for displaying an image indicating the position specified by the position specifying means.
12. A display device comprising: display means including a plurality of pixels for displaying an image, and a detection unit for detecting the intensity of received light at each of a plurality of positions; range determination means for determining ranges in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity; position specifying means for specifying, for each range determined by the range determination means, the position within that range at which the light intensity is detected to be strongest; and control means for displaying an image indicating the position specified by the position specifying means.
13. The display device according to claim 11 or 12, wherein the image indicating the specified position is in a predetermined display form.
14. The display device according to claim 13, wherein the predetermined display form is a predetermined color.
15. The display device according to claim 13, wherein the detection unit is capable of detecting the color of the received light, and the predetermined display form is the color detected by the detection unit.
16. The display device according to claim 13, wherein the predetermined display form is a predetermined shape.
17. A display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at each of a plurality of positions, the method comprising: a range determination step of determining ranges in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity; a first position specifying step of specifying, for each range determined in the range determination step, the position at the center of that range; and a third control step of displaying an image indicating the position specified in the first position specifying step.
18. A display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at each of a plurality of positions, the method comprising: a range determination step of determining ranges in which the intensity of light detected by the detection unit is equal to or greater than a predetermined reference intensity; a second position specifying step of specifying, for each range determined in the range determination step, the position within that range at which the light intensity is strongest; and a fourth control step of displaying an image indicating the position specified in the second position specifying step.
19. A display device comprising: display means including a plurality of pixels for displaying an image, and a detection unit for detecting the intensity of received light at each of a plurality of positions; discrimination means for discriminating a pattern in which the intensity of light detected by the detection unit changes; and control means for performing first control on a display image when the pattern discriminated by the discrimination means is a first pattern, and performing second control on the display image when the pattern discriminated by the discrimination means is a second pattern different from the first pattern.
20. The display device according to claim 19, wherein the first control is control for displaying an image indicating the position of the light detected by the detection unit, and the second control is control for operating the display image.
21. A display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at each position, the method comprising: a discrimination step of discriminating a pattern in which the intensity of light detected by the detection unit changes; and a fifth control step of performing first control on a display image when the pattern discriminated in the discrimination step is a first pattern, and performing second control on the display image when the pattern discriminated in the discrimination step is a second pattern different from the first pattern.
22. A display device comprising: display means including a plurality of pixels for displaying an image, and a detection unit for detecting the intensity of received light at each of a plurality of positions; shape discrimination means for discriminating a shape formed by reference positions at which the intensity of light detected by the detection unit is detected to be equal to or greater than a predetermined reference intensity; and control means for performing predetermined control on the display image in accordance with the shape discriminated by the shape discrimination means.
23. The display device according to claim 22, wherein the predetermined control is control for scrolling the display image.
24. The display device according to claim 22, wherein the predetermined control is control for displaying the display image enlarged or reduced, or for returning an enlarged or reduced display image to its original state and displaying it.
25. A display method for displaying an image on a display device including a plurality of pixels for displaying an image and a detection unit for detecting the intensity of received light at each position, the method comprising: a shape discrimination step of discriminating a shape formed by reference positions at which the intensity of light detected by the detection unit is detected to be equal to or greater than a predetermined reference intensity; and a sixth control step of performing predetermined control on the display image in accordance with the shape discriminated in the shape discrimination step.
26. The display method according to claim 25, wherein the predetermined control is control for scrolling the display image.
27. The display method according to claim 25, wherein the predetermined control is control for displaying the display image enlarged or reduced, or for returning an enlarged or reduced display image to its original state and displaying it.
PCT/JP2009/057939 2008-04-25 2009-04-21 Display device and display method WO2009131131A1 (en)

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
JP2008116231A JP2009265439A (en) 2008-04-25 2008-04-25 Display device and display method
JP2008-116230 2008-04-25
JP2008-116234 2008-04-25
JP2008116232A JP2009266036A (en) 2008-04-25 2008-04-25 Display device and display method
JP2008-116231 2008-04-25
JP2008116233A JP2009266037A (en) 2008-04-25 2008-04-25 Display device and display method
JP2008-116232 2008-04-25
JP2008-116229 2008-04-25
JP2008116234A JP2009265440A (en) 2008-04-25 2008-04-25 Display device and display method
JP2008-116233 2008-04-25
JP2008116229A JP2009265438A (en) 2008-04-25 2008-04-25 Display device and display method
JP2008116230A JP5308705B2 (en) 2008-04-25 2008-04-25 Display device and display method

Publications (1)

Publication Number Publication Date
WO2009131131A1 true WO2009131131A1 (en) 2009-10-29

Family

ID=41216862

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/057939 WO2009131131A1 (en) 2008-04-25 2009-04-21 Display device and display method

Country Status (1)

Country Link
WO (1) WO2009131131A1 (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04371063A (en) * 1991-06-19 1992-12-24 Sony Corp Image display device
JPH06102994A (en) * 1992-09-18 1994-04-15 Matsushita Electric Ind Co Ltd Position instruction device
JP2001175413A (en) * 1999-12-16 2001-06-29 Sanyo Electric Co Ltd Display device
JP2002244813A (en) * 2001-02-14 2002-08-30 Sony Corp System and method for image display
JP2005539247A (en) * 2001-12-31 2005-12-22 インテル コーポレイション Light-emitting diode display that senses energy
JP2006018374A (en) * 2004-06-30 2006-01-19 Casio Comput Co Ltd Image processor with projector function, and program
JP2006277552A (en) * 2005-03-30 2006-10-12 Casio Comput Co Ltd Information display system and information display method
JP2007042073A (en) * 2005-07-01 2007-02-15 Canon Inc Video presentation system, video presentation method, program for causing computer to execute video presentation method and storage medium
JP2007011233A (en) * 2005-07-04 2007-01-18 Toshiba Matsushita Display Technology Co Ltd Flat display device and imaging method using same
JP2008022058A (en) * 2006-07-10 2008-01-31 Sony Corp Display apparatus, operating method of display apparatus, and video display system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103309551A (en) * 2012-03-16 2013-09-18 宇龙计算机通信科技(深圳)有限公司 Mobile terminal and icon display processing method

Similar Documents

Publication Publication Date Title
US9176599B2 (en) Display device, display system, and data supply method for display device
KR100714722B1 (en) Apparatus and method for implementing pointing user interface using signal of light emitter
US8943231B2 (en) Display device, projector, display system, and method of switching device
JP5943335B2 (en) Presentation device
JP2008015706A (en) Image processor
JP2019525365A (en) Device for generating computer program and method for generating computer program
KR20090125165A (en) Projector system
JP2008251020A (en) Image display device and method of driving image display device
US20180255266A1 (en) Projector and control method of projector
US20200264729A1 (en) Display method, display device, and interactive projector
WO2009131131A1 (en) Display device and display method
JP4968360B2 (en) Image display device
US11093085B2 (en) Position detection method, position detection device, and interactive projector
JP2009266037A (en) Display device and display method
JP4572758B2 (en) Position coordinate input device
CN103649879B (en) Digitizer using position-unique optical signals
JP2006258683A (en) Colorimetric device, colorimetric method and colorimetric program
JP2009266036A (en) Display device and display method
JP2009265438A (en) Display device and display method
JP5308705B2 (en) Display device and display method
US11144164B2 (en) Position detection method, position detection device, and interactive projector
JP2009245366A (en) Input system, pointing device, and program for controlling input system
JP2009265439A (en) Display device and display method
CN102999233B (en) The noise spot automatic shield method and device of electronic whiteboard based on imageing sensor
US20190287236A1 (en) Image processing system, image processing device and image processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09735624

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09735624

Country of ref document: EP

Kind code of ref document: A1