WO2014208168A1 - Information processing apparatus, control method, program, and storage medium - Google Patents
Information processing apparatus, control method, program, and storage medium
- Publication number: WO2014208168A1 (application PCT/JP2014/059530)
- Authority: WIPO (PCT)
- Prior art keywords: laser pointer, laser, information, unit, pointer
Classifications
- G06F3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G02B27/20: Optical systems or apparatus for optical projection, for imaging minute objects, e.g. light-pointer
- G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0354: Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/0386: Control and interface arrangements for pointing devices, e.g. for light pen
- G06F3/0425: Digitisers characterised by opto-electronic transducing means, using a single imaging device such as a video camera for tracking the absolute position of one or more objects with respect to an imaged reference surface, e.g. a display or projection screen
- G06F3/04845: GUI interaction techniques for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F2203/0383: Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN
- G06F2203/0384: Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
- G06F2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Description
- the present disclosure relates to an information processing apparatus, a control method, a program, and a storage medium.
- Projectors that can project and display images on a large screen are used in various situations such as company meetings, presentations, and school classes.
- In such situations, a laser pointer that projects laser light onto the projected image is used to point at locations in the image.
- Techniques for using such a laser pointer, which has a function of projecting laser light, for UI operation of the projector have been proposed, as follows.
- Patent Document 1 discloses a control system that recognizes the movement of a laser pointer by calculating the difference between the projected image data and captured image data of the projected image plane, and executes a command associated with a predetermined movement of the laser pointer. Specifically, when the pointer irradiated by the laser pointer moves so as to draw a rightward arrow, the control system executes the associated display command "Proceed to next slide".
- Patent Document 2 presents a determination system that correctly detects the position indicated by the laser pointer on the image projected by the projector even when the brightness of the screen installation location changes. Specifically, the determination system sets a pointer-position determination threshold before starting projection, calculates difference image data between the captured image data of the current frame and that of the previous frame, and determines the pixel position exceeding the threshold as the irradiation position of the laser pointer.
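- As a rough illustration of the frame-difference detection described in these documents, the following Python sketch (not from the patent; the frame format, helper names, and the fixed threshold are assumptions) finds the pixel whose brightness changed most between the previous and current captured frames, and reports it as the irradiation position only if the change exceeds the preset threshold.

```python
import numpy as np

def detect_pointer(prev_frame: np.ndarray, cur_frame: np.ndarray,
                   threshold: int = 40):
    """Return (x, y) of the laser dot, or None when no pixel change
    exceeds the threshold. Frames are same-shape 8-bit grayscale
    arrays; the threshold would be calibrated before projection
    starts, as Patent Document 2 describes."""
    diff = np.abs(cur_frame.astype(np.int16) - prev_frame.astype(np.int16))
    y, x = np.unravel_index(np.argmax(diff), diff.shape)
    if diff[y, x] <= threshold:
        return None  # only ambient change; no confident detection
    return int(x), int(y)
```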
- In Patent Documents 1 and 2 described above, however, only the coordinates of the laser beam irradiation position are recognized based on the captured image.
- To operate an object in the projected image, an input device such as a mouse connected to the computer that transmits the projection image data to the projector still has to be operated, which is difficult in a situation where the presenter is standing and explaining near the screen.
- In addition, an operation that moves the cursor on the screen via an input device such as a mouse incurs a processing delay that the user has to tolerate.
- Here, the processing delay is the accumulation of the delays that occur in the detection processing of the input device, the internal processing of the computer, and the display processing on the display.
- The present disclosure therefore proposes a new and improved information processing apparatus, control method, program, and storage medium capable of intuitive operation input on an object in a projected image by moving a laser pointer.
- According to the present disclosure, an information processing apparatus is proposed that comprises: a recognition unit that recognizes an irradiation position of laser light emitted by a laser pointer onto a projection image; an acquisition unit that acquires movement information of the laser pointer; and an output unit that outputs a control signal for changing the display of an object in the projection image corresponding to the irradiation position in accordance with the movement information.
- According to the present disclosure, a control method is proposed that includes: a step of recognizing an irradiation position of laser light emitted by a laser pointer onto a projection image; a step of acquiring movement information of the laser pointer; and a step of outputting a control signal for changing the display of an object in the projection image corresponding to the irradiation position in accordance with the movement information.
- According to the present disclosure, a program is proposed for causing a computer to function as: a recognition unit that recognizes an irradiation position of laser light emitted by a laser pointer onto a projection image; an acquisition unit that acquires movement information of the laser pointer; and an output unit that outputs a control signal for changing the display of an object in the projection image corresponding to the irradiation position in accordance with the movement information.
- Further, according to the present disclosure, a storage medium is proposed that stores a program for causing a computer to function as: a recognition unit that recognizes an irradiation position of laser light emitted by a laser pointer onto a projection image; an acquisition unit that acquires movement information of the laser pointer; and an output unit that outputs a control signal for changing the display of an object in the projection image corresponding to the irradiation position in accordance with the movement information.
- an operation system according to an embodiment of the present disclosure includes a projector 1, a laser pointer 2, and a PC (personal computer) 3 that outputs content for projection to the projector 1.
- the projection content includes charts, sentences, various other graphic images, maps, websites, 3D objects, and the like, and is hereinafter referred to as projection image data.
- the projector 1 projects the image data (display signal) received from the PC 3 onto a projection screen or a wall (hereinafter, the screen S is used as an example) in accordance with a control signal from the PC 3.
- The projected image on the screen S includes, for example, editable objects obj1 to obj3.
- the laser pointer 2 has a function of irradiating visible laser light in response to a pressing operation of the operation button 20a by the user (lecturer).
- the user can use the laser pointer 2 to irradiate an image projected on the screen S with a laser beam and give a presentation while indicating the irradiation position P according to the explanation location.
- the PC 3 electronically generates an image for projection, transmits the image data to the projector 1 by wire / wireless, and performs projection control.
- Although FIG. 1 shows a notebook PC as an example, the PC 3 according to the present embodiment is not limited to a notebook PC, and may be a desktop PC or a server on a network (cloud).
- In Patent Documents 1 and 2, only the coordinates of the irradiation position of the laser beam are recognized, based on a captured image of the projection image. Therefore, operation input on the editable objects obj1 to obj3 in the projection image cannot be performed with the laser pointer, and in order to move the objects obj1 to obj3, an input device (mouse, touchpad, keyboard, etc.) connected to the PC had to be operated.
- In contrast, the operation system according to each embodiment of the present disclosure allows intuitive operation input on an object in the projection image by moving the laser pointer. Specifically, for example, as shown in FIG. 1, after the operation button 20a of the laser pointer 2 is pressed and the object obj1 is determined as the operation target, the object obj1 can be rotated by rotating the laser pointer 2.
- Hereinafter, system configuration examples of the operation system according to an embodiment of the present disclosure will be described in detail.
- FIG. 2 is a diagram for describing the first system configuration example of the operation system according to an embodiment of the present disclosure. As shown in FIG. 2, the first system configuration example is formed by a projector 1 (an information processing apparatus according to the present disclosure), a laser pointer 2, and a PC 3.
- the projector 1 is wired / wirelessly connected to the PC 3 and projects an image on the screen S based on the projection image data (display signal) received from the PC 3.
- the projector 1 also has a visible light imaging unit 13v for recognizing the irradiation position P of the visible laser beam V by the laser pointer 2 on the projected image.
- the visible light imaging unit 13v may be built in the projector 1 or may be externally attached.
- the visible light imaging unit 13v is provided in the projector 1, and thus can automatically calibrate the range of the projection image that is the imaging target. Specifically, the visible light imaging unit 13v can change the imaging range and imaging direction in conjunction with the projection direction by the projector 1. In addition, when the visible light imaging unit 13v captures an area wider than the range of the projection image, laser irradiation outside the range of the projection image (outside the screen) can also be used for the UI operation.
- the laser pointer 2 emits a visible laser beam V that can be seen by human eyes in response to pressing of the operation button 20a provided on the laser pointer 2.
- the laser pointer 2 is used by the user to point an arbitrary position on the projection image with the laser light V. Specifically, the laser pointer 2 irradiates the laser beam V when the operation button 20a is half-pressed (an example of an irradiation instruction operation).
- the laser pointer 2 may irradiate a visible light marker image (hereinafter also referred to as a visible light marker) in addition to the visible light laser beam V.
- the visible light marker is, for example, an arbitrary figure (cross shape, heart shape, etc.) or a one-dimensional / two-dimensional barcode in which information such as a user ID is embedded. Further, the visible light marker is not limited to a still image, and may be a moving image whose color or shape changes.
- When the operation button 20a is fully pressed (an example of a determination operation), the laser pointer 2 continues emitting the laser beam V and transmits information indicating that the determination operation has been detected (hereinafter also referred to as determination operation information) to the projector 1 by wireless communication.
- The full press of the operation button 20a is an example of a determination operation for determining an object in the projection image as the operation target.
- The user holds the laser pointer 2, points it at the screen S, half-presses the operation button 20a to emit the laser beam V, and adjusts the irradiation position P to an arbitrary object among the objects in the projection image. The user then fully presses the operation button 20a to determine the operation target.
- After that, the user can move the laser pointer 2 to perform intuitive operation input on the determined object. Specifically, by moving the laser pointer 2 vertically or in parallel, rotating it, or moving it in the pan/tilt direction with respect to the projection image on the screen S, the user can perform an operation input on the object in the projection image corresponding to the movement of the laser pointer 2. For example, as shown in FIG. 1, rotating the laser pointer 2 rotates the object obj1 determined as the operation target.
- Such movement information of the laser pointer 2 is detected by various sensors such as a gyro sensor, an acceleration sensor, and a geomagnetic sensor provided on the laser pointer 2, and transmitted to the projector 1 by wireless communication.
- the laser light V irradiated by the laser pointer 2 is imaged by the visible light imaging unit 13v provided in the projector 1, and the projector 1 recognizes the position coordinates of the irradiation position P. Further, when the determination operation information is transmitted from the laser pointer 2, the projector 1 determines an object in the projection image located at the coordinate position of the irradiation position P as an operation target.
- The projector 1 also receives the movement information of the laser pointer 2 from the laser pointer 2. The projector 1 then transmits (outputs) to the PC 3 a control signal (hereinafter also referred to as pointer information) for changing the display of the object determined as the operation target in accordance with the received movement information.
- the PC 3 executes display change processing for the object according to the control signal (pointer information) received from the projector 1, and transmits the image data after execution to the projector 1.
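- To make the flow above concrete, here is a minimal hit-testing sketch (illustrative only; the object model with axis-aligned bounding boxes and all names are assumptions) for the step in which, upon receiving the determination operation information, the projector determines which object in the projection image lies at the recognized irradiation position P.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    x: float
    y: float
    w: float
    h: float  # bounding box in projection-image coordinates

def determine_operation_target(objects, irradiation_pos):
    """On receipt of the determination operation information, return the
    object whose bounding box contains the recognized irradiation
    position P, or None if the laser points at the background."""
    px, py = irradiation_pos
    for obj in objects:
        if obj.x <= px <= obj.x + obj.w and obj.y <= py <= obj.y + obj.h:
            return obj
    return None

objects = [SceneObject("obj1", 100, 80, 200, 150),
           SceneObject("obj2", 400, 60, 120, 120)]
print(determine_operation_target(objects, (180, 140)).name)  # -> obj1
```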
- FIG. 3 is a diagram for describing a second system configuration example of the operation system according to the embodiment of the present disclosure.
- the second system configuration of the operation system is formed of a projector 1b (information processing apparatus according to the present disclosure), a laser pointer 2b, and a PC 3.
- The laser beam emitted from the laser pointer is not limited to visible light; invisible laser light may also be emitted.
- the laser pointer 2b emits a visible laser beam V and an invisible laser beam N such as an infrared ray that cannot be seen by human eyes in accordance with the operation of the operation button 20a.
- Specifically, when the operation button 20a is half-pressed, the visible laser beam V is emitted; when the operation button 20a is pressed further (fully pressed), the invisible laser beam N is emitted while the irradiation of the visible laser beam V continues.
- Thereby, the lecturer and audience visually recognize the irradiation position Pv indicated by the visible laser beam V, while the projector 1b captures the projected image with the invisible light imaging unit 13n provided in the projector 1b and recognizes the irradiation position Pn indicated by the invisible laser beam N based on the captured image.
- The irradiation position Pn of the invisible laser beam N is located at the same position as, or near, the irradiation position Pv of the visible laser beam V.
- In this way, the projector 1b (an information processing apparatus according to the present disclosure) can recognize the irradiation position Pn of the invisible laser beam N emitted from the laser pointer 2b.
- When determination operation information (for example, full-press operation information of the operation button 20a) is received from the laser pointer 2b, the projector 1b determines the object indicated by the laser beam N on the projection image (the object located at the coordinate position of the irradiation position Pn) as the operation target.
- The laser pointer 2b also transmits movement information (vertical/parallel movement, rotational movement, tilt/pan operation) detected by a gyro sensor or the like provided in the laser pointer 2b to the projector 1b.
- The projector 1b then transmits (outputs) to the PC 3 a control signal (pointer information) for changing the display of the object determined as the operation target in accordance with the movement information received from the laser pointer 2b.
- FIG. 4 is a diagram for describing a third system configuration example of the operation system according to the embodiment of the present disclosure.
- the third system configuration of the operation system is formed of a projector 1c (information processing device according to the present disclosure), a laser pointer 2c, and a PC 3.
- In the first and second system configuration examples described above, the projectors 1 and 1b receive (acquire) the movement information of the laser pointers 2 and 2b by wireless communication. However, the acquisition method according to the present embodiment is not limited to this; for example, the movement information may be acquired based on an invisible light marker M emitted from the laser pointer.
- The laser pointer 2c emits a visible laser beam V and an invisible light marker M, such as an infrared marker that cannot be seen by the human eye, in accordance with the operation of the operation button 20a. Specifically, when the operation button 20a is half-pressed, the visible laser beam V is emitted; when the operation button 20a is pressed further (fully pressed), the invisible light marker M is emitted while the irradiation of the visible laser beam V continues.
- Thereby, the lecturer and audience visually recognize the irradiation position P indicated by the visible laser beam V, while the projector 1c captures the projected image with the invisible light imaging unit 13n provided in the projector 1c and recognizes the position coordinates (irradiation position) of the invisible light marker M based on the captured image.
- the irradiation position of the invisible light marker M is located at the same position as or near the irradiation position P of the laser beam V.
- the invisible light marker M is an image having an area such as an arbitrary figure (cross shape, heart shape, etc.) or a 1D / 2D barcode in which information such as a user ID is embedded.
- The projector 1c can recognize the position of the laser pointer 2c relative to the projected image by analyzing the shape, size, inclination, distortion, and so on of the invisible light marker M, and can further acquire the movement information of the laser pointer 2c by continuously analyzing changes in the marker's shape, size, distortion, and so on.
- the invisible light marker M is not limited to a still image, and may be a moving image whose color or shape changes.
- Determination operation information (for example, full-press operation information of the operation button 20a) is transmitted from the laser pointer 2c to the projector 1c by wireless communication.
- When the projector 1c receives the determination operation information, it determines the object indicated by the invisible light marker M on the projection image as the operation target.
- The projector 1c then transmits (outputs) to the PC 3 a control signal (pointer information) for changing the display of the object determined as the operation target in accordance with the movement information of the laser pointer 2c.
- the projectors 1 to 1c are collectively referred to as the projector 1 when it is not necessary to individually describe the projectors 1 to 1c.
- the laser pointers 2 to 2c are collectively referred to as the laser pointer 2 when it is not necessary to individually explain them.
- The operation unit of the laser pointer 2 is not limited to a single operation button 20a that detects user operations in multiple stages (half press, full press) as shown in FIGS. 2 to 4; a configuration in which multiple stages of user operation are detected by a plurality of buttons may also be used.
- For example, operation buttons may be provided on the upper and lower surfaces of the housing of the laser pointer 2: when the upper operation button is pressed, the first-stage user operation (corresponding to the half press) is detected, and when the lower operation button is also pressed, the second-stage user operation (corresponding to the full press) is detected.
- In this way, the laser pointer 2 can detect two stages of user operation: a stage where one button is pressed (first stage) and a stage where both buttons are pressed simultaneously (second stage).
- the operation unit provided in the laser pointer 2 is not limited to a button having a physical structure, and may be realized by a sensor that detects contact / proximity of a finger.
- For example, a touch panel may be provided on the upper surface of the housing of the laser pointer 2 as the operation unit, in which case the laser pointer 2 detects multi-stage operations according to the number of touches (number of taps) based on the detection of finger contact/proximity.
- The shape of the laser pointer 2 according to the present embodiment is not limited to the rectangular parallelepiped shown in FIGS. 2 to 4, and may be, for example, the shape of a pointing rod provided with an irradiation unit at the tip.
- As described above, projection image data (a display signal) is transmitted from the PC 3 to the projector 1, and pointer information (a control signal) is transmitted from the projector 1 to the PC 3. Display control processing according to the pointer information is then executed in the PC 3, and the image data (display signal) after execution is transmitted to the projector 1.
- the configuration of the operation system according to the present embodiment is not limited to the example illustrated in FIG. 5A.
- For example, as illustrated in FIG. 5B, a projector 1' that includes the functions of the PC 3 may be configured to acquire content data (photos, videos, games, websites, etc.) from a content server 4 on a network (cloud).
- In this case, the projector 1' generates a display signal (projection image data) based on the acquired content data and performs projection control.
- The projector 1' also executes display control processing according to the pointer information and projects the image data after execution.
- FIG. 6 is a block diagram illustrating an example of an internal configuration of the operation system according to the present embodiment.
- the projector 1 includes a projection image reception unit 10, an image projection unit 11, an imaging unit 13, a position recognition unit 14, an information acquisition unit 15, and a pointer information output unit 16.
- The projection image receiving unit 10 receives image data for projection from the PC 3 by wire or wirelessly and outputs the received image data to the image projection unit 11.
- The image projection unit 11 projects the image data sent from the projection image receiving unit 10 onto a projection screen or a wall.
- the imaging unit 13 captures the projection image projected on the screen S and outputs the captured image to the position recognition unit 14.
- The imaging unit 13 is realized by an invisible light imaging unit 13n that performs invisible light imaging, such as an infrared camera or an ultraviolet camera, or by a visible light imaging unit 13v that performs visible light imaging.
- The position recognition unit 14 functions as a recognition unit that recognizes, based on the visible/invisible light captured image, the irradiation position Pv of the visible laser beam V, the irradiation position Pn of the invisible laser beam N, or the coordinate position of the invisible light marker M with respect to the projected image.
- Specifically, the position recognition unit 14 detects the irradiation position (position coordinates) by taking the difference between the image projected by the image projection unit 11 and the visible/invisible light captured image of the projection image.
- The position recognition unit 14 can further increase accuracy by also analyzing the difference between the visible/invisible light captured image of the preceding frame and that of the currently projected image.
- the position recognition unit 14 outputs the recognized irradiation position information (position coordinate information) to the pointer information output unit 16.
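- The two differences mentioned above can be combined, for example by requiring both signals to agree before accepting a pixel as the irradiation position. A minimal sketch, assuming grayscale frames already warped into a common coordinate system (the combination rule and threshold are illustrative assumptions):

```python
import numpy as np

def recognize_irradiation(projected: np.ndarray,
                          captured_prev: np.ndarray,
                          captured_cur: np.ndarray,
                          threshold: int = 40):
    """Combine (a) the difference between the image being projected and
    the current captured frame with (b) the frame-to-frame difference
    of the captured images, in the spirit of the position recognition
    unit 14 described above. All frames are assumed to be 8-bit
    grayscale and aligned to the same coordinate system."""
    d_static = np.abs(captured_cur.astype(np.int16) - projected.astype(np.int16))
    d_motion = np.abs(captured_cur.astype(np.int16) - captured_prev.astype(np.int16))
    score = np.minimum(d_static, d_motion)  # both signals must agree
    y, x = np.unravel_index(np.argmax(score), score.shape)
    return (int(x), int(y)) if score[y, x] > threshold else None
```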
- the information acquisition unit 15 has a function of receiving determination operation information and / or movement information from the laser pointer 2 wirelessly.
- The wireless communication method between the projector 1 and the laser pointer 2 is not particularly limited; data is transmitted and received by, for example, Wi-Fi (registered trademark) or Bluetooth (registered trademark).
- The information acquisition unit 15 may also acquire the movement information of the laser pointer 2 by analyzing the invisible light captured image captured by the invisible light imaging unit 13n, according to the shape, size, inclination, distortion, and so on of the invisible light marker M.
- the information acquisition unit 15 outputs the acquired determination operation information and movement information to the pointer information output unit 16.
- The pointer information output unit 16 determines the object to be operated in the projection image projected from the image projection unit 11, based on the determination operation information output from the information acquisition unit 15 and the irradiation position information output from the position recognition unit 14. The pointer information output unit 16 then generates, as pointer information, a control signal for changing the display of the determined object according to the movement information output from the information acquisition unit 15, and transmits (outputs) it to the PC 3 by wire or wirelessly.
- The laser pointer 2 includes an operation unit 20, a visible light laser irradiation unit 21, an invisible light irradiation unit 22, a transmission unit 23, and a posture detection sensor 24.
- The operation unit 20 has a function of detecting multiple stages of user operation and is realized by one or more operation buttons or a touch panel. When the first-stage user operation (specifically, a half press of the operation button 20a) is detected, the operation unit 20 notifies the visible light laser irradiation unit 21 that an irradiation instruction operation has been detected. When the second-stage user operation (specifically, a full press of the operation button 20a) is detected, the operation unit 20 notifies the invisible light irradiation unit 22 and the transmission unit 23 that a determination operation has been detected.
- The visible light laser irradiation unit 21 has a function of emitting the visible laser beam V when the first-stage user operation (irradiation instruction operation) is detected by the operation unit 20. The irradiation of the laser beam V by the visible light laser irradiation unit 21 continues even while the second-stage user operation (determination operation) is detected.
- The invisible light irradiation unit 22 has a function of emitting the invisible laser beam N or the invisible light marker M (also referred to as an invisible light image) when the second-stage user operation (determination operation) is detected by the operation unit 20.
- the invisible light irradiation by the invisible light irradiation unit 22 is performed at the same position as or near the irradiation position Pv of the visible laser beam V irradiated by the visible light laser irradiation unit 21.
- The invisible light irradiation unit 22 may also emit an invisible light marker M based on a one-dimensional/two-dimensional barcode in which information such as a user ID is embedded when the first-stage user operation (irradiation instruction operation) is detected by the operation unit 20.
- the transmission unit 23 has a function of wirelessly transmitting determination operation information to the projector 1 when a second stage user operation (determination operation) is detected in the operation unit 20.
- While the second-stage user operation (determination operation) continues to be detected (for example, while the operation button 20a is fully pressed), the transmission unit 23 continuously and wirelessly transmits the detection result output from the posture detection sensor 24 to the projector 1 as movement information.
- The posture detection sensor 24 is a detection unit that detects posture information of the laser pointer 2 and is realized by, for example, a gyro sensor, an acceleration sensor, or a geomagnetic sensor. Thereby, the laser pointer 2 can acquire its own up/down/left/right movement, rotational movement, and movement in the pan/tilt direction (movement information). The detection result of the posture detection sensor 24 is output to the transmission unit 23.
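- As one way to picture how the posture detection sensor's output becomes movement information, the sketch below integrates gyro angular velocities into small pitch/yaw/roll deltas and keeps transmitting them while the determination operation continues. The callables `read_gyro`, `button_is_fully_pressed`, and `send`, and the sampling period, are hypothetical stand-ins, not part of the patent.

```python
import time

def stream_movement_info(read_gyro, button_is_fully_pressed, send, dt=0.02):
    """Integrate angular velocities (rad/s about x/y/z) into small
    rotation deltas and keep transmitting them while the determination
    operation continues, mirroring how the transmission unit 23 keeps
    sending the output of the posture detection sensor 24. The three
    callables stand in for the device's sensor, button, and radio."""
    while button_is_fully_pressed():
        wx, wy, wz = read_gyro()
        send({"pitch": wx * dt,   # tilting up/down    -> rotation about x
              "yaw":   wy * dt,   # panning left/right -> rotation about y
              "roll":  wz * dt})  # twisting           -> rotation about z
        time.sleep(dt)
```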
- the PC 3 includes a control unit 30, an image output unit 31, and an operation input unit 32.
- The control unit 30 has a function of controlling each component of the PC 3. Specifically, the control unit 30 executes display control processing according to the operation input information detected by the operation input unit 32 (including the pointer information transmitted from the projector 1), and outputs (transmits) the image data after execution to the projector 1 from the image output unit 31. Concretely, the control unit 30 changes the display of the object in the projection image determined as the operation target according to the movement information of the laser pointer 2 (up/down/left/right movement, rotational movement, movement in the pan/tilt direction). For example, when the movement information indicates that the laser pointer 2 has rotated to the right with respect to the projection image, the control unit 30 rotates the determined object in the projection image to the right in the same manner.
- the operation input unit 32 has a function of accepting user operation input (operation input information) from a keyboard or mouse connected to the PC 3.
- the operation input unit 32 also functions as a receiving unit that receives pointer information (control signal) from the projector 1 as operation input information.
- the operation input unit 32 outputs the received operation input information to the control unit 30.
- the image output unit 31 has a function of transmitting image data for projection to the projector 1 by wire / wireless.
- the transmission of the image data for projection may be continuously performed.
- With the configuration described above, the user can use the laser pointer 2 to perform intuitive operation input on an object in the projection image according to the movement of the laser pointer 2.
- FIG. 7 is a sequence diagram showing the operation processing of the operation system according to the present embodiment. As shown in FIG. 7, first, in step S103, the laser pointer 2a and the projector 1a are paired (connected and configured) automatically or manually.
- In step S106, the PC 3 and the projector 1a are connected by wire or wirelessly.
- the connection method is not particularly limited in this specification.
- In step S109, the image output unit 31 of the PC 3 transmits the image data for projection to the projector 1a.
- In step S112, the image projection unit 11 of the projector 1a projects the projection image received from the PC 3 by the projection image receiving unit 10 onto the screen S.
- In step S115, the projector 1a starts visible light imaging of the range of the projected image with the visible light imaging unit 13v.
- Next, the laser pointer 2a emits the laser beam V in accordance with a user operation. Specifically, for example, the laser pointer 2a emits the laser beam V from the visible light laser irradiation unit 21 when the operation button 20a is half-pressed. Thereby, the user (speaker) can explain to the audience while indicating an arbitrary place in the projection image with the laser beam V.
- In step S124, the position recognition unit 14 of the projector 1a recognizes the irradiation position P (coordinate position) of the laser beam V based on the visible light captured image captured by the visible light imaging unit 13v.
- In step S127, the laser pointer 2a detects a determination operation (for example, a full press of the operation button 20a) with the operation unit 20.
- In step S130, the laser pointer 2a wirelessly transmits the determination operation information (determination command) detected by the operation unit 20 to the projector 1a.
- In step S131, the pointer information output unit 16 of the projector 1a receives the determination operation information from the laser pointer 2a and determines, as the operation target, the object in the projected image located at the coordinate position of the irradiation position P recognized by the position recognition unit 14.
- Meanwhile, the laser pointer 2a detects its own posture with the posture detection sensor 24. Specifically, the posture detection sensor 24 detects the orientation, inclination, angular velocity, and the like of the laser pointer 2a.
- In step S135, the transmission unit 23 of the laser pointer 2a continuously transmits the detection result of the posture detection sensor 24 to the projector 1a as movement information of the laser pointer 2a. The transmission unit 23 may continue transmitting the movement information while the determination operation continues to be detected by the operation unit 20.
- In step S136, the pointer information output unit 16 of the projector 1a generates pointer information (a control signal) for changing the display of the object determined in S131 according to the movement information received from the laser pointer 2a by the information acquisition unit 15.
- In step S139, the pointer information output unit 16 of the projector 1a transmits the pointer information to the PC 3.
- In step S142, the control unit 30 of the PC 3 executes display control processing according to the pointer information received from the projector 1a by the operation input unit 32.
- In step S145, the image output unit 31 of the PC 3 transmits the projection image after the display control processing (after the processing according to the pointer information) to the projector 1a.
- << Example of operation input with laser pointer >> Next, intuitive operation input using the laser pointer 2 in the operation system according to the present embodiment will be described with specific examples.
- Of the first to third system configuration examples of the operation system according to the present embodiment described above, the operation input by the laser pointer 2a in the first system configuration example is described below as a representative example.
- In the operation system according to the present embodiment, operation input based on the trajectory of the irradiation position P can be performed in addition to the operation input corresponding to the movement information of the laser pointer 2 described above.
- Specifically, the projector 1a recognizes changes in the irradiation position P based on the visible light captured image captured by the visible light imaging unit 13v, grasps the trajectory of the irradiation position P, and outputs, as pointer information, the command associated with a gesture that draws a predetermined trajectory.
- FIG. 8 is a diagram showing specific examples of operation input based on the trajectory of the irradiation position P.
- For example, when the irradiation position P follows a predetermined trajectory, the projector 1a outputs a determination command as pointer information.
- When the irradiation position P of the laser beam V emitted by the laser pointer 2a follows a trajectory that draws a horizontal line, the projector 1a outputs a cancel or delete command as pointer information.
- Further, when the irradiation positions Pr and Pl of two laser pointers 2a-R and 2a-L are pinched out (Pr and Pl move apart) or pinched in (Pr and Pl move closer), the projector 1a outputs an enlargement or reduction command as pointer information.
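- The pinch-out/pinch-in command above amounts to tracking the distance between the two irradiation positions. A minimal sketch (the function name, dead zone, and ratio-based mapping are assumptions):

```python
import math

def pinch_scale(prev_pr, prev_pl, cur_pr, cur_pl, dead_zone=0.05):
    """Turn the change in distance between the irradiation positions
    Pr and Pl into an enlargement/reduction factor: >1 for pinch-out
    (enlarge), <1 for pinch-in (reduce), None inside the dead zone."""
    d_before = math.dist(prev_pr, prev_pl)
    d_after = math.dist(cur_pr, cur_pl)
    if d_before == 0:
        return None
    ratio = d_after / d_before
    return ratio if abs(ratio - 1.0) > dead_zone else None

# the two dots move apart: distance 200 -> 260, i.e. enlarge by 1.3x
print(pinch_scale((100, 200), (300, 200), (80, 200), (340, 200)))
```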
- FIG. 9 is a diagram for explaining the types of movement of the laser pointer 2a according to the present embodiment.
- As shown in FIG. 9, the laser pointer 2a can generally be moved with six degrees of freedom: translation along, and rotation about, three axes.
- In the operation system according to the present embodiment, display control processing is performed in which the determined object in the projection image is moved (its display is changed) in accordance with these six degrees of freedom.
- Specifically, the object is translated in the x, y, and z directions according to the corresponding translations of the laser pointer 2a.
- The object is also rotated about the x axis (pitch) according to the movement of tilting the laser pointer 2a up and down, rotated about the y axis (yaw) according to the movement of panning it left and right, and rotated about the z axis (roll) according to its rotational movement.
- For viewpoint control in 3D CAD or 3D CG modeling software, for example, a two-degree-of-freedom input device such as a mouse requires switching between modes such as translation in the xy/yz/xz directions, yaw/pitch rotation, and roll rotation by means of GUI buttons, keyboard operations, and mouse button states, so the operation is complicated. According to the present embodiment, these viewpoint position operations can be performed seamlessly, without modes.
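- The modeless property can be illustrated as follows: every movement packet updates all six degrees of freedom of the determined object's transform at once, so nothing ever selects between translate and rotate modes. A sketch under assumed packet field names:

```python
from dataclasses import dataclass

@dataclass
class Transform:
    tx: float = 0.0
    ty: float = 0.0
    tz: float = 0.0      # translation
    pitch: float = 0.0
    yaw: float = 0.0
    roll: float = 0.0    # rotation in radians

def apply_movement(t: Transform, m: dict) -> Transform:
    """Apply one packet of movement information to the determined
    object's transform. Every axis is applied on every packet; no GUI
    button or keyboard modifier selects a translate or rotate mode,
    which is what makes the operation modeless."""
    t.tx += m.get("dx", 0.0)
    t.ty += m.get("dy", 0.0)
    t.tz += m.get("dz", 0.0)
    t.pitch += m.get("pitch", 0.0)
    t.yaw += m.get("yaw", 0.0)
    t.roll += m.get("roll", 0.0)
    return t

obj = Transform()
apply_movement(obj, {"yaw": 0.1, "dz": -0.5})  # pan right while pushing in
print(obj)
```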
- FIG. 10 is a diagram for explaining an operation input example using the movement of the roll rotation.
- As shown on the left of FIG. 10, the user half-presses the operation button 20a of the laser pointer 2a to emit the laser beam V and aligns the irradiation position P with an arbitrary object (obj) in the image projected on the screen S. The user then fully presses the operation button 20a to determine the object (obj) as the operation target, and rotates the laser pointer 2a (roll rotation about the z axis) while keeping the operation button 20a fully pressed.
- At this time, movement information (tilt and angular velocity) is transmitted from the laser pointer 2a to the projector 1a, and the projector 1a transmits to the PC 3 a control signal (pointer information) for changing the display of the object determined as the operation target according to the movement information of the laser pointer 2a.
- According to the pointer information, the PC 3 performs display control so that the object determined as the operation target rotates in accordance with the rotation of the laser pointer 2a. The PC 3 then transmits the display-controlled projection image to the projector 1a, and the projector 1a projects it onto the screen S.
- FIG. 11 is a diagram for explaining an operation input example using a front-rear movement perpendicular to the screen.
- As shown on the left of FIG. 11, the user half-presses the operation button 20a of the laser pointer 2a to emit the laser beam V and aligns the irradiation position P with an arbitrary object (for example, a map image) in the image projected on the screen S.
- The user then fully presses the operation button 20a of the laser pointer 2a to determine the object as the operation target, and moves the laser pointer 2a toward or away from the screen S (translation in the z-axis direction) while keeping the operation button 20a fully pressed.
- At this time, movement information (acceleration and direction) is transmitted from the laser pointer 2a to the projector 1a, and the projector 1a transmits to the PC 3 a control signal (pointer information) for changing the display of the object determined as the operation target according to the movement information of the laser pointer 2a.
- According to the pointer information, the PC 3 performs display control so that the object determined as the operation target is enlarged or reduced in accordance with the front-rear movement of the laser pointer 2a. The PC 3 then transmits the display-controlled projection image to the projector 1a, and the projector 1a projects it onto the screen S.
- Thereby, when the laser pointer 2a is moved backward away from the screen S, the map image (object) is reduced (zoomed out), and when the laser pointer 2a is moved forward toward the screen S, the map image (object) is enlarged (zoomed in).
- FIG. 12 is a diagram for describing an operation input example using a movement of panning left and right.
- As shown on the left of FIG. 12, the user half-presses the operation button 20a of the laser pointer 2a to emit the laser beam V and aligns the irradiation position P with an arbitrary object (for example, a 3D object) in the image projected on the screen S.
- The user then fully presses the operation button 20a of the laser pointer 2a to determine the object (obj) as the operation target, and pans the laser pointer 2a horizontally (swings it left and right) with respect to the screen S while keeping the operation button 20a fully pressed.
- At this time, movement information (acceleration and direction) is transmitted from the laser pointer 2a to the projector 1a, and the projector 1a transmits to the PC 3 a control signal (pointer information) for changing the display of the object determined as the operation target according to the movement information of the laser pointer 2a.
- According to the pointer information, the PC 3 performs display control so that the object determined as the operation target rotates (yaws) about the y axis in accordance with the pan movement of the laser pointer 2a. The PC 3 then transmits the yaw-controlled projection image to the projector 1a, and the projector 1a projects it onto the screen S.
- Thereby, the 3D object is rotated (yawed) about the y axis.
- FIG. 13 is a diagram for explaining a case where a viewpoint position operation of a 3D object (rigid body) is performed using a plurality of laser pointers 2a.
- As shown in FIG. 13, the user holds the laser pointers 2a-R and 2a-L in both hands and half-presses each operation button 20a to emit the laser beams V.
- The projector 1a recognizes the coordinate positions of the irradiation positions Pr and Pl of the laser pointers 2a-R and 2a-L based on the visible light captured image of the projection image, and determines the 3D object held at both ends by the irradiation positions Pr and Pl as the operation target.
- movement information is transmitted from the laser pointers 2a-R and 2a-L to the projector 1a.
- the projector 1a transmits to the PC 3 a control signal (pointer information) for changing the display of the object determined as the operation target in accordance with the movement information of the laser pointers 2a-R and 2a-L.
- According to the pointer information, the PC 3 performs display control so that the object determined as the operation target rotates (yaws) about the y axis in accordance with the pan movement of the laser pointers. The PC 3 then transmits the yaw-controlled projection image to the projector 1a, and the projector 1a projects it onto the screen S.
- In this way, the 3D object held at the irradiation positions Pr and Pl of the laser beams V emitted from the laser pointers 2a-R and 2a-L can be rotated about the y axis by panning the laser pointers 2a-R and 2a-L.
- In the example shown in FIG. 13, the 3D object determined as the operation target is formed from a single rigid body. However, the operation target according to the present embodiment is not limited to this example; it may be a 3D object formed by connecting a plurality of rigid bodies via joints. This will be specifically described below with reference to FIG. 14.
- FIG. 14 is a diagram for explaining the operation of a 3D object formed by connecting a plurality of rigid bodies via joints.
- As shown in FIG. 14, the user holds the laser pointers 2a-R and 2a-L in both hands, half-presses each operation button 20a to emit the laser beams V, and places the irradiation positions Pr and Pl on the screen.
- Specifically, the user aligns the irradiation position Pl with the joint between the rigid bodies G1 and G2 that form the 3D object (obj), and aligns the irradiation position Pr with the rigid body G2 connected via that joint.
- Next, the user fully presses the operation buttons 20a of the laser pointers 2a-R and 2a-L to determine the operation target (the rigid body G2 indicated by the irradiation position Pr), and moves the laser pointer 2a-R in the horizontal direction while keeping the operation button 20a fully pressed.
- the projector 1a recognizes the coordinate positions of the irradiation positions Pr and Pl by the laser pointers 2a-R and 2a-L based on the visible light captured image of the projection image.
- movement information is transmitted from the laser pointers 2a-R and 2a-L to the projector 1a.
- The projector 1a then transmits to the PC 3 a control signal (pointer information) for bending, at the joint designated by the irradiation position Pl, the rigid body G2 designated by the irradiation position Pr in accordance with the movement information of the laser pointer 2a-R.
- According to the pointer information, the PC 3 performs display control so that the rigid body G2 of the 3D object determined as the operation target bends at the joint in accordance with the horizontal movement of the laser pointer 2a-R. The PC 3 then transmits the projection image whose display is controlled to bend the rigid body G2 at the designated joint to the projector 1a, and the projector 1a projects it onto the screen S.
- In this way, the rigid body G2, connected at the joint of the 3D object indicated by the irradiation position Pl of the laser pointer 2a-L, can be moved in accordance with the movement of the laser pointer 2a-R.
- Next, operation input based on the invisible light marker M in the third system configuration example is described. The invisible light marker M emitted from the laser pointer 2c rotates in the same way as the laser pointer 2c rotates. The projector 1c, which captures an invisible light image of the projected image irradiated with the invisible light marker M, can therefore analyze the captured image, determine the inclination of the invisible light marker M, and acquire the movement information (rotational movement) of the laser pointer 2c according to that inclination.
- The invisible light marker M emitted from the laser pointer 2c also appears larger as the laser pointer 2c moves away from the screen S. The projector 1c can therefore analyze the size of the invisible light marker M in the invisible light captured image and acquire the movement information (front-rear vertical movement) of the laser pointer 2c according to the change in the marker's size.
- Further, the shape of the invisible light marker M emitted from the laser pointer 2c becomes distorted as the laser pointer 2c moves obliquely with respect to the screen S. The projector 1c can therefore analyze the shape of the invisible light marker M in the invisible light captured image and acquire the movement information (pan/tilt movement) of the laser pointer 2c according to the change (distortion) in the marker's shape.
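- The three marker cues above (inclination, size, distortion) can be summarized in one routine. The sketch below is illustrative only: the observation fields and the assumption that the projected marker's linear size grows roughly in proportion to the pointer-to-screen distance are not from the patent.

```python
import math

def marker_movement(prev, cur):
    """Derive the three movement cues from two successive marker
    observations taken by the projector-side invisible light camera.
    Each observation is a dict with the marker's measured angle (deg),
    apparent area (px^2), and width/height aspect ratio. Because the
    beam diverges, the marker's linear size is assumed to grow roughly
    in proportion to the pointer-to-screen distance, so the distance
    ratio is approximated by sqrt(area_cur / area_prev)."""
    roll = cur["angle"] - prev["angle"]          # marker rotates with the pointer
    distance_ratio = math.sqrt(cur["area"] / prev["area"])  # >1: moved away
    skew = cur["aspect"] - prev["aspect"]        # growing distortion: oblique pan/tilt
    return {"roll_deg": roll,
            "distance_ratio": distance_ratio,
            "pan_tilt_skew": skew}

print(marker_movement({"angle": 0, "area": 625, "aspect": 1.0},
                      {"angle": 15, "area": 900, "aspect": 1.2}))
# -> roll 15 deg, distance ratio 1.2 (moved away), slight oblique skew
```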
- The operation input based on the invisible light marker M has been described above. An example of the invisible light marker M emitted from the laser pointer 2c is a two-dimensional barcode as shown in FIG. 15, and its recognition uses an image recognition technique such as CyberCode. Although an invisible light marker M is used here, operation input using a visible light marker is also possible.
- The operation system according to the present embodiment can also accept independent operation inputs from a plurality of users.
- FIG. 16 is a diagram for explaining identification of each irradiation position by a plurality of laser pointers 2c.
- As shown in FIG. 16, by using invisible light markers M1 to M3 that differ in shape, the projector 1c can identify each irradiation position even when many invisible light lasers are emitted in parallel.
- Furthermore, the invisible light marker M is not limited to simple differences in shape or color; it may be a two-dimensional barcode in which a user ID can be embedded (the invisible light marker M4 shown in FIG. 16). This avoids interference when even more users irradiate simultaneously and enables robust identification.
- In this way, the projector 1c can identify the operators (users) of the laser pointers 2c-1 to 2c-4, either from differences in shape and color as in the invisible light markers M1 to M3 or from the user ID read from the invisible light marker M4. When each user can be identified, the projector 1c can set priorities determining whose operation takes precedence. For example, the projector 1c may accept a specific user's operations with priority, give priority to the user who first started irradiating, give priority to a specific user for a certain period after irradiation starts, or transfer priority in a later-wins manner that favors interrupts. In particular, for operations that greatly change the state that everyone is viewing, such as a screen transition of the projected image triggered by a button press or scrolling of the entire screen, accepting operations according to one of the above priority methods improves convenience when multiple users operate simultaneously.
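- The priority methods listed above could be arbitrated, for example, as follows (the policy names and the operation tuple layout are assumptions):

```python
def arbitrate(operations, policy="first_wins", vip=None):
    """Choose which user's operation to accept from simultaneous inputs.
    Each operation is a (user_id, start_time, payload) tuple; the
    policies mirror those listed above."""
    if not operations:
        return None
    if policy == "vip" and vip is not None:
        for op in operations:
            if op[0] == vip:
                return op              # a specific user has priority
    if policy == "later_wins":
        return max(operations, key=lambda op: op[1])  # interrupts win
    return min(operations, key=lambda op: op[1])      # first irradiator wins

ops = [("user1", 10.0, "scroll"), ("user2", 12.5, "next-slide")]
print(arbitrate(ops))                 # -> user1 (started irradiating first)
print(arbitrate(ops, "later_wins"))   # -> user2 (interrupt takes priority)
```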
- Alternatively, the operation system may be configured so that an imaging unit is provided on the laser pointer 2 side, the projected image is captured in invisible light, and the irradiation position P is recognized based on the invisible light captured image.
- In this case, power consumption can be reduced by having the imaging unit on the laser pointer 2 side perform invisible light imaging only while the operation button 20a is pressed.
- FIG. 17 is a diagram for explaining a modification of the system configuration of the operation system according to the present embodiment.
- The system configuration according to the present modification is formed of a projector 1d (an information processing apparatus according to the present disclosure), a laser pointer 2d, and a PC 3.
- the projector 1d is connected to the PC 3 by wire / wireless, receives the projection image data from the PC 3, and projects the received projection image data on the screen S. Furthermore, the projector 1d according to the present embodiment superimposes and projects a coordinate specifying map (also referred to as a coordinate recognition image) Q of invisible light such as infrared rays on a screen S (image projection area).
- the projection area of the invisible light coordinate specifying map Q may be a range including the image projection area.
- The projector 1d may project one type of coordinate specifying map Q on the screen S, or may distribute and project different coordinate specifying maps Q at several places on the screen S. By projecting a plurality of different coordinate specifying maps Q, the irradiation position can be specified from a partial angle of view even when the entire screen S does not fall within the angle of view of the imaging unit provided in the laser pointer 2d.
- the laser pointer 2d irradiates the visible laser beam V according to the pressed state of the operation button 20a. Specifically, for example, the laser pointer 2d irradiates the visible laser beam V when the operation button 20a is half-pressed, and continues to irradiate the laser beam V when it is fully pressed.
- the laser pointer 2d captures an invisible light image of a range including the irradiation position P of the laser light V when the operation button 20a is fully pressed.
- the laser pointer 2d recognizes the coordinate specifying map Q ′ included in the invisible light captured image, and reads the coordinate specifying information, the size, inclination, distortion, and the like of the coordinate specifying map Q ′.
- the laser pointer 2d transmits the read information (hereinafter also referred to as read information) to the projector 1d by wireless communication.
- The projector 1d that has received the determination operation information and the read information from the laser pointer 2d recognizes the irradiation position P of the laser pointer 2d based on the read information, and determines the object to be operated. Further, the projector 1d acquires the movement (movement information) of the laser pointer 2d based on the size, inclination, distortion, and the like of the coordinate specifying map Q' captured with invisible light and indicated by the read information.
- The projector 1d then outputs a control signal (pointer information) for changing the display of the determined object in accordance with the movement information of the laser pointer 2d, and transmits it to the PC 3.
- the PC 3 executes display control processing according to the transmitted pointer information, and transmits the image data for projection after the execution to the projector 1d.
- In this way, the invisible light coordinate specifying map is superimposed and projected on the projection image from the projector 1d, and the invisible light is imaged on the laser pointer 2d side, whereby the irradiation position P by the laser pointer 2d is recognized based on the invisible light captured image.
- FIG. 18 is a block diagram illustrating an example of the internal configuration of the operation system according to the present modification. Each configuration will be specifically described below.
- The internal configuration of the PC 3 has been described above with reference to FIG. 6, so description thereof is omitted here.
- the projector 1d includes a projection image reception unit 10, an image projection unit 11, a non-visible light image generation unit 17, a non-visible light projection unit 18, an information acquisition unit 15d, a position recognition unit 14d, and a pointer information output unit 16.
- the invisible light image generation unit 17 generates a map Q for specifying the coordinates of invisible light in which coordinate specifying information used when recognizing the irradiation position P by the laser pointer 2d is embedded.
- the invisible light projection unit 18 superimposes and projects the invisible light coordinate specifying map Q generated by the invisible light image generation unit 17 on the projection image of the screen S.
- the projection by the invisible light projection unit 18 and the image projection unit 11 may be a projection through different filters with the same light source.
- the information acquisition unit 15d wirelessly communicates with the laser pointer 2d, and receives determination operation information and reading information from the laser pointer 2d.
- The position recognition unit 14d recognizes the irradiation position P (coordinate position) by the laser pointer 2d based on the coordinate specifying map Q created by the invisible light image generation unit 17 and on the coordinate specifying information read from the coordinate specifying map Q' captured with invisible light, which is included in the read information received by the information acquisition unit 15d. For example, the position recognition unit 14d compares the coordinate specifying map Q with the coordinate specifying map Q' indicated by the coordinate specifying information, and specifies the position of the coordinate specifying map Q' within the coordinate specifying map Q. Then, the position recognition unit 14d recognizes the center position of the coordinate specifying map Q' as the irradiation position P (coordinate position) by the laser pointer 2d.
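As an illustration of this comparison, the following sketch assumes the coordinate specifying map Q is a grid whose cells each carry a unique code, so that any captured patch Q' can be located by exhaustive matching (a toy stand-in for the coordinate-embedding scheme, which the patent leaves unspecified):

```python
def locate_submap(full_map, captured):
    """Find where the captured partial map Q' sits inside the generated
    coordinate specifying map Q, and return the irradiation position P as
    the center of Q'. Both maps are 2D lists of cell codes.
    """
    H, W = len(full_map), len(full_map[0])
    h, w = len(captured), len(captured[0])
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            if all(full_map[y + dy][x + dx] == captured[dy][dx]
                   for dy in range(h) for dx in range(w)):
                return (x + w / 2.0, y + h / 2.0)  # center of Q' = P
    return None

# Q: every cell carries a unique code, so any captured patch is locatable.
Q = [[10 * r + c for c in range(8)] for r in range(6)]
Q_prime = [row[3:6] for row in Q[2:4]]  # what the pointer-side camera saw
print(locate_submap(Q, Q_prime))        # (4.5, 3.0)
```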
- the pointer information output unit 16 determines, as an operation target, an object in the projection image corresponding to the position of the irradiation position P recognized by the position recognition unit 14d when receiving the determination operation information by the information acquisition unit 15d.
- the pointer information output unit 16 then outputs a control signal (pointer information) for changing the display of the determined object in accordance with the movement (movement information) of the laser pointer 2d indicated by the read information received by the information acquisition unit 15d.
- The output pointer information is transmitted to the PC 3 by wire or wirelessly.
- The laser pointer 2d includes an operation unit 20, a visible light laser irradiation unit 21, a non-visible light imaging unit 25, an information reading unit 26, a transmission unit 23, and an attitude detection sensor 24.
- the visible light laser irradiation unit 21 has a function of irradiating the visible laser beam V when a first-stage user operation (irradiation instruction operation) is detected in the operation unit 20. Specifically, for example, the visible light laser irradiation unit 21 emits the laser light V when the operation button 20a is half-pressed.
- The non-visible light imaging unit 25 has a function of capturing, with invisible light, a range including the position irradiated with the laser beam V (irradiation position P) when the second-stage user operation (decision operation) is detected in the operation unit 20. Specifically, the invisible light imaging unit 25 performs invisible light imaging when the operation button 20a is fully pressed.
- the information reading unit 26 recognizes the coordinate specifying map Q ′ based on the invisible light captured image, and reads the coordinate specifying information, the size, inclination, distortion, and the like of the coordinate specifying map Q ′.
- The transmission unit 23 transmits the information read by the information reading unit 26 (read information) and the information indicating the second-stage user operation detected by the operation unit 20 (decision operation information) to the projector 1d by wireless communication.
- Further, while the second-stage user operation (decision operation) is detected (for example, while the operation button 20a is fully pressed), the transmission unit 23 continuously transmits the detection results output from the attitude detection sensor 24 to the projector 1d by wireless communication as movement information.
- The attitude detection sensor 24 has been described above with reference to FIG. 6.
- As described above, in this modification, the laser pointer 2d is provided with the invisible light imaging unit 25 and captures the projected image with invisible light.
- the projector 1d receives the coordinate specifying information read from the coordinate specifying map Q ′ captured with invisible light on the laser pointer 2d side, and based on the coordinate specifying information, determines the irradiation position P by the laser pointer 2d. Can be recognized. Further, the projector 1d can acquire the movement (movement information) of the laser pointer 2d based on the size, inclination, distortion, and the like of the coordinate specifying map Q ′ captured with invisible light.
- the projector 1 can estimate the relative position of the laser pointer 2 with respect to the screen S based on the direction and movement information of the laser light emitted from the laser pointer 2.
- The projector 1 assumes that the operator (user) is at the estimated position of the laser pointer 2, and outputs to the PC 3 a control signal for displaying the display screen (objects in the projection image) so that the user can easily view it.
- As shown on the left of FIG. 19, a user who gives a presentation while operating the laser pointer 2 often views the screen while irradiating the laser beam V at a shallow angle with respect to the screen S, and has had to mentally correct the display screen (obj) that appears obliquely distorted.
- In contrast, the projector 1 outputs to the PC 3 a control signal for displaying the display screen (obj) facing the user direction; the PC 3 executes display control processing according to the control signal and transmits the resulting projection image data to the projector 1, which projects it. Thereby, as shown on the right of FIG. 19, display control is performed so that the display screen (obj) faces the user (faces the relative position of the laser pointer 2 estimated based on the direction in which the laser beam V is irradiated, and the like).
- This display control may be performed for each screen item positioned corresponding to the coordinate position of the irradiation position P, so that the user can selectively confirm the screen item, and it is explicitly shown toward whom the screen item is displayed. Further, since the area of the buttons and other elements forming the screen item is guaranteed for the operator, deterioration of the operational feeling when the screen S is operated from a shallow angle can be prevented.
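One way to realize such facing-the-user display control, under the simplifying assumption that only a horizontal (yaw) rotation is applied and that the user's position has been estimated as described above (the geometry and names below are illustrative, not the patent's method):

```python
import math

def facing_yaw(item_x, user_x, user_dist):
    """Yaw angle (degrees) that turns a screen item at horizontal position
    item_x toward a user estimated at horizontal position user_x and at
    distance user_dist from the screen plane."""
    return math.degrees(math.atan2(user_x - item_x, user_dist))

# User estimated far to the left of the item and close to the screen:
# a large turn results, as on the right of FIG. 19 (prints about -77.1).
print(round(facing_yaw(item_x=2.0, user_x=-1.5, user_dist=0.8), 1))
```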
- In the operation system according to the present embodiment, the irradiation area of the laser beam from the laser pointer 2 can be limited to the screen S.
- Since the projector 1 according to the present embodiment can recognize the coordinate position of the irradiation position P by the laser pointer 2 as described above, when the irradiation position P deviates from the screen S, the projector 1 wirelessly transmits to the laser pointer 2 a control signal for stopping the laser beam irradiation. Thereby, the laser pointer 2 automatically stops the laser beam irradiation when the irradiation position P deviates from the screen S.
- Further, when the projector 1 detects that there is a person between the laser pointer 2 and the screen S, the projector 1 wirelessly transmits a control signal for causing the laser pointer 2 to stop irradiating the laser beam. Thereby, even if the laser beam irradiated from the laser pointer 2 is accidentally directed at the person, the irradiation is automatically stopped, so that safety can be maintained.
- a specific description will be given with reference to FIG.
- FIG. 21 is a diagram for explaining an irradiable region.
- The irradiable area D is determined, for example, by the projector 1 performing face recognition or the like on a captured image that is taken by an imaging unit provided in the projector 1 and that includes the projected image within its angle of view, and judging whether a person is present.
- As shown in FIG. 21, when a person 5 is detected between the laser pointer 2 and the screen S, the projector 1 sets, as the irradiable region D, the area on the screen S excluding the area corresponding to the person 5. The projector 1 then lets the irradiation continue while the irradiation position P is located within the irradiable region D. Further, as shown in the lower left of FIG. 21, when the irradiation position P leaves the irradiable region D, the projector 1 wirelessly transmits to the laser pointer 2 a control signal for stopping the laser beam irradiation.
- the laser pointer 2 can automatically stop the irradiation of the laser beam when it is out of the irradiable region D.
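A minimal sketch of the irradiable-region check, assuming the screen and the detected person are approximated by axis-aligned rectangles (the representation is an assumption; the patent describes the behavior, not the data structures):

```python
def irradiable(p, screen, person_boxes):
    """Return True if irradiation position p=(x, y) lies inside screen
    (x0, y0, x1, y1) and outside every detected person box; otherwise the
    projector would wirelessly send a stop-irradiation control signal.
    """
    x, y = p
    x0, y0, x1, y1 = screen
    if not (x0 <= x <= x1 and y0 <= y <= y1):
        return False
    return all(not (bx0 <= x <= bx1 and by0 <= y <= by1)
               for bx0, by0, bx1, by1 in person_boxes)

screen = (0, 0, 1920, 1080)
person = [(600, 200, 900, 1080)]  # person 5 detected in front of the screen
print(irradiable((500, 500), screen, person))   # True: within region D
print(irradiable((700, 500), screen, person))   # False: stop signal sent
```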
- When the angle of view of the imaging unit 13 provided in the projector 1 also includes the periphery of the screen S, the coordinates of an irradiation position P outside the screen S (or outside the set screen range) can also be recognized, as shown on the left of FIG.
- Thereby, the operation system can respond to gesture operations such as a swipe-in, in which the irradiation position P moves from outside the screen to inside the screen, and a swipe-out, in which the irradiation position P moves from inside the screen to outside.
- For example, an indicator 40 indicating that a swipe-in operation is possible is displayed (projected) on the screen S, as shown on the left of FIG.
- When the swipe-in is performed, an operation such as displaying a menu is activated, as shown on the right of FIG.
- In addition to the swipe operation, the operation system can handle various gesture operations such as drawing a circle outside the screen S. In this case, a guide display corresponding to the gesture operations possible outside the screen S is displayed (projected) at the inner edge of the screen S (at a position close to the irradiation position P outside the screen S).
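The swipe-in/swipe-out recognition described above can be sketched as a boundary-crossing test on the trajectory of the irradiation position P (smoothing, timing thresholds, and the full gesture vocabulary are omitted and would be implementation choices):

```python
def classify_swipe(trajectory, screen):
    """Classify a sequence of irradiation positions as 'swipe_in'
    (outside -> inside the screen), 'swipe_out' (inside -> outside), or
    None. trajectory is a list of (x, y); screen is (x0, y0, x1, y1).
    """
    x0, y0, x1, y1 = screen

    def inside(p):
        return x0 <= p[0] <= x1 and y0 <= p[1] <= y1

    first, last = inside(trajectory[0]), inside(trajectory[-1])
    if not first and last:
        return "swipe_in"
    if first and not last:
        return "swipe_out"
    return None

screen = (0, 0, 1920, 1080)
print(classify_swipe([(-80, 500), (10, 500), (200, 500)], screen))  # swipe_in
print(classify_swipe([(1800, 300), (1930, 300)], screen))           # swipe_out
```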
- the operation system can display (project) the operator information (user information) of the laser pointer 2 near the irradiation position P.
- Specifically, the projector 1 and the PC 3 identify the laser pointer 2 and acquire the user information based on information received from the laser pointer 2 and on the analysis result of the invisible/visible light marker formed by a one-dimensional/two-dimensional barcode irradiated from the laser pointer 2.
- Then, the projector 1 can superimpose (project) an image 44 indicating the user information in the vicinity of the irradiation position P by the laser pointer 2, so that viewers looking at the screen S can easily understand who is irradiating.
- Further, the projector 1 can project a cursor 45 indicating operable information on the projected image, emphasize the irradiation position P of the laser pointer 2, and make the operation state easy to understand.
- As described above, in the operation system according to the present embodiment, an operation input can be performed intuitively on an object in the projection image by moving the laser pointer 2.
- Irradiation of the laser beam V is started by a first-stage operation (for example, half-pressing the operation button 20a), and the object to be operated is determined according to the subsequent second-stage operation (for example, fully pressing the operation button 20a).
- By then moving the laser pointer 2 while the second-stage operation continues, an operation input for changing the display of the determined object can be performed intuitively.
- Further, since the irradiation distance of a laser beam is very long compared to general radio waves, the system is useful in situations where a large projection image is operated from a distant position in a wide venue.
- a computer-readable storage medium storing the computer program is also provided.
- Additionally, the present technology may also be configured as below.
- (1) An information processing apparatus including: a recognition unit that recognizes an irradiation position of a laser beam by a laser pointer on a projection image; an acquisition unit that acquires movement information of the laser pointer; and an output unit that outputs a control signal for changing the display of an object in the projection image corresponding to the irradiation position according to the movement information.
- (2) The information processing apparatus according to (1), wherein the output unit determines, as the object to be operated, the object positioned corresponding to the irradiation position when determination operation information detected by an operation unit provided in the laser pointer is received.
- (3) The information processing apparatus according to (2), further including a reception unit that receives information indicating the determination operation from the laser pointer.
- (4) The information processing apparatus according to any one of (1) to (3), wherein the recognition unit recognizes the irradiation position of a visible laser beam or a non-visible laser beam irradiated from the laser pointer.
- (5) The information processing apparatus according to any one of (1) to (3), wherein the recognition unit recognizes the position coordinates of a visible light marker or a non-visible light marker irradiated from the laser pointer as the irradiation position of the laser beam.
- (6) The information processing apparatus according to (5), wherein the visible light marker or the non-visible light marker is a figure or a one-dimensional/two-dimensional barcode.
- (7) The information processing apparatus according to (5) or (6), wherein the acquisition unit acquires the movement information of the laser pointer based on at least one of the size, inclination, and distortion of the visible light marker or the non-visible light marker.
- (8) The information processing apparatus according to any one of (1) to (7), wherein the movement information is information indicating vertical/horizontal movement, rotational movement, or pan/tilt movement of the laser pointer with respect to the projection image.
- (9) The information processing apparatus according to any one of (1) to (8), wherein the acquisition unit receives the movement information from the laser pointer.
- (10) The information processing apparatus according to (1), wherein the recognition unit recognizes the irradiation position of the laser beam with respect to the projection image by the laser pointer based on a non-visible light captured image obtained by capturing, with a non-visible light imaging unit provided in the laser pointer, a non-visible light coordinate recognition image projected superimposed on the projection image.
- (11) The information processing apparatus according to any one of (1) to (10), wherein the recognition unit identifies and recognizes each irradiation position by a plurality of laser pointers based on the shape or color of each visible light/non-visible light marker irradiated onto the projection image from the plurality of laser pointers, or a user ID embedded in the marker.
- (12) A control method including: recognizing an irradiation position of a laser beam by a laser pointer on a projection image; acquiring movement information of the laser pointer; and outputting a control signal for changing the display of an object in the projection image corresponding to the irradiation position according to the movement information.
- (13) A program for causing a computer to function as: a recognition unit that recognizes an irradiation position of a laser beam by a laser pointer on a projection image; an acquisition unit that acquires movement information of the laser pointer; and an output unit that outputs a control signal for changing the display of an object in the projection image corresponding to the irradiation position according to the movement information.
- (14) A storage medium storing a program for causing a computer to function as: a recognition unit that recognizes an irradiation position of a laser beam by a laser pointer on a projection image; an acquisition unit that acquires movement information of the laser pointer; and an output unit that outputs a control signal for changing the display of an object in the projection image corresponding to the irradiation position according to the movement information.
Abstract
Description
1. Overview of the operation system according to an embodiment of the present disclosure
2. System configuration examples
2-1. First system configuration example
2-2. Second system configuration example
2-3. Third system configuration example
3. Internal configuration and operation processing
3-1. Internal configuration
3-2. Operation processing
4. Examples of operation input by the laser pointer
4-1. Operation input by the trajectory of the irradiation position
4-2. Operation input according to movement
5. Modification
5-1. System configuration
5-2. Internal configuration
6. Supplement
7. Conclusion
First, an overview of the operation system according to an embodiment of the present disclosure will be described with reference to FIG. 1. As shown in FIG. 1, the operation system according to an embodiment of the present disclosure includes a projector 1, a laser pointer 2, and a PC (personal computer) 3 that outputs content for projection to the projector 1. The content for projection includes charts, text, various other graphic images, maps, websites, 3D objects, and the like, and is hereinafter referred to as image data for projection.
Here, as described above, Patent Literatures 1 and 2 merely recognize the coordinates of the irradiation position of the laser beam by the laser pointer based on a captured image of the projection image. Therefore, operation input on the editable objects obj1 to obj3 in the projection image could not be performed with the laser pointer; to move the objects obj1 to obj3, an input device of the PC 3 (mouse, touchpad, keyboard, or the like) had to be operated.
<2-1. First system configuration example>
FIG. 2 is a diagram for explaining a first system configuration example of the operation system according to an embodiment of the present disclosure. As shown in FIG. 2, the first system configuration example is formed of a projector 1, a laser pointer 2 (an information processing apparatus according to the present disclosure), and a PC 3.
Next, a second system configuration example of the operation system according to an embodiment of the present disclosure will be described with reference to FIG. 3. FIG. 3 is a diagram for explaining the second system configuration example of the operation system according to an embodiment of the present disclosure. As shown in FIG. 3, the second system configuration of the operation system is formed of a projector 1b (an information processing apparatus according to the present disclosure), a laser pointer 2b, and a PC 3.
Next, a third system configuration example of the operation system according to an embodiment of the present disclosure will be described with reference to FIG. 4. FIG. 4 is a diagram for explaining the third system configuration example of the operation system according to an embodiment of the present disclosure. As shown in FIG. 4, the third system configuration of the operation system is formed of a projector 1c (an information processing apparatus according to the present disclosure), a laser pointer 2c, and a PC 3.
For example, operation buttons are provided on the top and bottom surfaces of the housing of the laser pointer 2; when the operation button on the top surface is pressed, a first-stage user operation (corresponding to the half-press operation described above) is detected, and when the operation button on the bottom surface is pressed as well, a second-stage user operation (corresponding to the full-press operation described above) is detected. In this way, the laser pointer 2 can detect two stages of user operation: a stage in which one button is pressed (first stage) and a stage in which two buttons are pressed simultaneously (second stage).
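A minimal sketch of this two-button, two-stage detection (the mapping below follows the description above; bottom-button-only input is left unhandled as a simplification):

```python
def operation_stage(top_pressed, bottom_pressed):
    """Map the two housing buttons to the two-stage user operation:
    stage 1 (irradiation instruction, like a half-press) when the top
    button is held, stage 2 (decision operation, like a full press)
    when both buttons are held simultaneously.
    """
    if top_pressed and bottom_pressed:
        return 2   # decision operation: determine the target object
    if top_pressed:
        return 1   # irradiation instruction: emit laser beam V
    return 0       # idle: no irradiation

print(operation_stage(True, False))   # 1
print(operation_stage(True, True))    # 2
```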
Next, the internal configuration and operation processing of the operation system according to the present embodiment will be described with reference to FIGS. 6 and 7.
FIG. 6 is a block diagram showing an example of the internal configuration of the operation system according to the present embodiment.
As shown in FIG. 6, the projector 1 includes a projection image reception unit 10, an image projection unit 11, an imaging unit 13, a position recognition unit 14, an information acquisition unit 15, and a pointer information output unit 16.
As shown in FIG. 6, the laser pointer 2 includes an operation unit 20, a visible light laser irradiation unit 21, a non-visible light irradiation unit 22, a transmission unit 23, and an attitude detection sensor 24.
As shown in FIG. 6, the PC 3 includes a control unit 30, an image output unit 31, and an operation input unit 32.
Next, the operation processing of the operation system according to the present embodiment will be described. Here, among the first to third system configuration examples of the operation system according to the present embodiment described above, the operation processing in the first system configuration example will be described as an example.
Next, intuitive operation input using the laser pointer 2 in the operation system according to the present embodiment will be described with specific examples. Here, among the first to third system configuration examples of the operation system according to the present embodiment described above, specific examples of operation input by the laser pointer 2a in the first system configuration example will be described.
In the operation system according to the present embodiment, in addition to the operation input according to the movement information of the laser pointer 2 described above, operation input based on the trajectory of the irradiation position P is also possible. The projector 1a recognizes changes in the position of the irradiation position P based on the visible light captured image taken by the visible light imaging unit 13v, grasps the trajectory of the irradiation position P, and outputs, as pointer information, an instruction command associated with a gesture drawing a predetermined trajectory. This will be described below with reference to FIG. 8.
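As an illustration, the following sketch classifies a trajectory of irradiation positions into two toy gestures; the actual matching method and the table of instruction commands are not specified by the patent:

```python
import math

def match_gesture(trajectory):
    """Match the trajectory of irradiation position P to a gesture.
    Only two toy gestures are recognized here: a roughly closed loop
    ('circle') and a mostly horizontal stroke ('hswipe').
    """
    (sx, sy), (ex, ey) = trajectory[0], trajectory[-1]
    span_x = max(p[0] for p in trajectory) - min(p[0] for p in trajectory)
    span_y = max(p[1] for p in trajectory) - min(p[1] for p in trajectory)
    if math.hypot(ex - sx, ey - sy) < 0.2 * max(span_x, span_y, 1):
        return "circle"        # start and end nearly coincide
    if span_x > 3 * span_y:
        return "hswipe"        # long, flat stroke
    return None

circle = [(math.cos(t / 10) * 100, math.sin(t / 10) * 100) for t in range(63)]
print(match_gesture(circle))                # circle
print(match_gesture([(0, 0), (400, 20)]))   # hswipe
```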
Next, intuitive operation input using the movement (movement information) of the laser pointer 2a will be specifically described with reference to FIGS. 9 to 14.
FIG. 10 is a diagram for explaining an example of operation input using a roll rotation movement. As shown on the left of FIG. 10, the user half-presses the operation button 20a of the laser pointer 2a to irradiate the laser beam V, and aligns the irradiation position P with an arbitrary object (obj) in the image projected on the screen S. Thereafter, the user fully presses the operation button 20a of the laser pointer 2a to determine the object (obj) to be operated, and rotates the laser pointer 2a (rotation about the z axis, rolling) while keeping the operation button 20a fully pressed. At this time, movement information (inclination and angular velocity) is transmitted from the laser pointer 2a to the projector 1a, and the projector 1a transmits to the PC 3 a control signal (pointer information) for changing the display of the object determined as the operation target according to the movement information of the laser pointer 2a. Following the pointer information, the PC 3 performs display control so that the object determined as the operation target rotates in accordance with the rotation of the laser pointer 2a. The PC 3 then transmits the display-controlled projection image with the rotated object to the projector 1a, and the projector 1a projects the transmitted projection image on the screen S.
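The roll-rotation flow above can be sketched as a small controller that turns attitude-sensor readings into pointer information for the PC 3 (message formats and names are assumptions; the patent does not define a wire format):

```python
class RollRotateControl:
    """While the decision operation (full press) continues, the roll
    angle reported by the attitude detection sensor 24 is forwarded as
    a rotation of the determined object.
    """
    def __init__(self):
        self.target = None
        self.start_roll = None

    def on_full_press(self, obj, roll_deg):
        """The object under irradiation position P becomes the target."""
        self.target, self.start_roll = obj, roll_deg

    def on_motion(self, roll_deg):
        """Return pointer information for the PC 3 as a dict."""
        if self.target is None:
            return None
        return {"object": self.target,
                "rotate_deg": roll_deg - self.start_roll}

ctl = RollRotateControl()
ctl.on_full_press("obj", roll_deg=5.0)   # target object determined
print(ctl.on_motion(roll_deg=35.0))      # rotate obj by +30 degrees
```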
FIG. 11 is a diagram for explaining an example of operation input using a back-and-forth movement perpendicular to the screen. As shown in FIG. 11, the user half-presses the operation button 20a of the laser pointer 2a to irradiate the laser beam V, and aligns the irradiation position P with an arbitrary object (for example, a map image) in the image projected on the screen S.
FIG. 12 is a diagram for explaining an example of operation input using a left-right panning movement. As shown on the left of FIG. 12, the user half-presses the operation button 20a of the laser pointer 2a to irradiate the laser beam V, and aligns the irradiation position P with an arbitrary object (for example, a 3D object) in the image projected on the screen S.
In the operation system according to the present embodiment, operation input can also be performed using a plurality of laser pointers 2a simultaneously. This will be described below with reference to FIGS. 13 and 14.
As described with reference to FIG. 4, in the operation system formed by the third system configuration, the movement information of the laser pointer 2c is acquired by the projector 1c via the invisible light marker M. Here, operation input by the invisible light marker M will be described with reference to FIG. 15.
The operation system according to the present embodiment can also accept independent operation inputs from a plurality of users. This will be described below with reference to FIG. 16.
Next, a modification of the system configuration of the operation system according to the present embodiment will be described with reference to FIGS. 17 and 18. In the first to third system configuration examples described above, the imaging unit 13 provided in the projector 1 captured the image projected on the screen S (projection image) with visible/invisible light, and the irradiation position P was recognized based on the captured image.
FIG. 17 is a diagram for explaining a modification of the system configuration of the operation system according to the present embodiment. As shown in FIG. 17, the system configuration according to this modification is formed of a projector 1d (an information processing apparatus according to the present disclosure), a laser pointer 2d, and a PC 3.
Next, the internal configuration of each device included in the operation system according to this modification will be specifically described with reference to FIG. 18. FIG. 18 is a block diagram showing an example of the internal configuration of the operation system according to this modification. Each configuration will be specifically described below. Since the internal configuration of the PC 3 has been described above with reference to FIG. 6, description thereof is omitted here.
The projector 1d includes a projection image reception unit 10, an image projection unit 11, a non-visible light image generation unit 17, a non-visible light projection unit 18, an information acquisition unit 15d, a position recognition unit 14d, and a pointer information output unit 16.
As shown in FIG. 18, the laser pointer 2d includes an operation unit 20, a visible light laser irradiation unit 21, a non-visible light imaging unit 25, an information reading unit 26, a transmission unit 23, and an attitude detection sensor 24.
Next, supplementary notes on the operation system according to the present embodiment are given.
The projector 1 according to the present embodiment can estimate the relative position of the laser pointer 2 with respect to the screen S based on the direction and movement information of the laser beam irradiated from the laser pointer 2.
When the luminance of the image projected by the projector 1 is high, or when the visible laser beam V irradiated from the laser pointer 2 falls on a display area of a similar color, the trajectory of the laser pointer 2 can become difficult to see. Therefore, as shown in FIG. 20, when the irradiation position P by the laser pointer 2 is detected on the screen S, the projector 1 and the PC 3 according to the present embodiment perform display control such as darkening the display area near the irradiation position P or reducing its color saturation. This prevents the trajectory of the laser pointer 2 from becoming difficult to see.
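A minimal sketch of this display control, darkening the area around the irradiation position P in a luminance buffer (the radius and dimming factor are illustrative; the patent leaves them unspecified):

```python
def dim_near_pointer(frame, p, radius=60, factor=0.5):
    """Darken the projected frame around irradiation position P so the
    visible laser trajectory stays visible. frame is a mutable 2D list
    of luminance values indexed as frame[y][x].
    """
    px, py = p
    for y in range(max(0, py - radius), min(len(frame), py + radius + 1)):
        for x in range(max(0, px - radius), min(len(frame[0]), px + radius + 1)):
            if (x - px) ** 2 + (y - py) ** 2 <= radius ** 2:
                frame[y][x] = int(frame[y][x] * factor)

frame = [[255] * 200 for _ in range(100)]
dim_near_pointer(frame, (100, 50))
print(frame[50][100], frame[0][0])  # 127 near P, 255 elsewhere
```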
In the operation system according to the present embodiment, the irradiation area of the laser beam from the laser pointer 2 can be limited to the screen S. For example, since the projector 1 according to the present embodiment can recognize the coordinate position of the irradiation position P by the laser pointer 2 as described above, when the irradiation position P deviates from the screen S, the projector 1 wirelessly transmits to the laser pointer 2 a control signal for stopping the laser beam irradiation. Thereby, the laser pointer 2 automatically stops the laser beam irradiation when the irradiation position P deviates from the screen S.
The operation input performed by irradiating the laser beam V, the invisible light marker M, and the like from the laser pointer 2 within the range of the screen S has been described above; however, the range of operation input in the operation system according to the present embodiment is not limited to the range of the screen S.
The operation system according to the present embodiment can display (project) the operator information (user information) of the laser pointer 2 near the irradiation position P. Specifically, the projector 1 and the PC 3 identify the laser pointer 2 and acquire the user information based on information received from the laser pointer 2 and on the analysis result of the invisible/visible light marker formed by a one-dimensional/two-dimensional barcode irradiated from the laser pointer 2.
As described above, in the operation system according to the present embodiment, an operation input can be performed intuitively on an object in the projection image by moving the laser pointer 2. Irradiation of the laser beam V is started by a first-stage operation (for example, half-pressing the operation button 20a), and the object to be operated is determined according to the subsequent second-stage operation (for example, fully pressing the operation button 20a). Then, by moving the laser pointer 2 up, down, left, right, back and forth, panning/tilting it, or rotating it while the second-stage operation continues, an operation input for similarly changing the display of the determined object can be performed intuitively.
2, 2a to 2d Laser pointer
3 PC
10 Projection image reception unit
11 Image projection unit
13 Imaging unit
13v Visible light imaging unit
13n Non-visible light imaging unit
14, 14d Position recognition unit
15, 15d Information acquisition unit
16 Pointer information output unit
17 Non-visible light image generation unit
18 Non-visible light projection unit
20 Operation unit
20a Operation button
21 Visible light laser irradiation unit
23 Transmission unit
24 Attitude detection sensor
25 Non-visible light imaging unit
26 Information reading unit
30 Control unit
31 Image output unit
32 Operation input unit
Claims (14)
- An information processing apparatus comprising: a recognition unit that recognizes an irradiation position of a laser beam by a laser pointer on a projection image; an acquisition unit that acquires movement information of the laser pointer; and an output unit that outputs a control signal for changing the display of an object in the projection image corresponding to the irradiation position according to the movement information.
- The information processing apparatus according to claim 1, wherein the output unit determines, as the object to be operated, the object positioned corresponding to the irradiation position when determination operation information detected by an operation unit provided in the laser pointer is received.
- The information processing apparatus according to claim 2, further comprising a reception unit that receives information indicating the determination operation from the laser pointer.
- The information processing apparatus according to claim 1, wherein the recognition unit recognizes the irradiation position of a visible laser beam or a non-visible laser beam irradiated from the laser pointer.
- The information processing apparatus according to claim 1, wherein the recognition unit recognizes the position coordinates of a visible light marker or a non-visible light marker irradiated from the laser pointer as the irradiation position of the laser beam.
- The information processing apparatus according to claim 5, wherein the visible light marker or the non-visible light marker is a figure or a one-dimensional/two-dimensional barcode.
- The information processing apparatus according to claim 5, wherein the acquisition unit acquires the movement information of the laser pointer based on at least one of the size, inclination, and distortion of the visible light marker or the non-visible light marker.
- The information processing apparatus according to claim 1, wherein the movement information is information indicating vertical/horizontal movement, rotational movement, or pan/tilt movement of the laser pointer with respect to the projection image.
- The information processing apparatus according to claim 1, wherein the acquisition unit receives the movement information from the laser pointer.
- The information processing apparatus according to claim 1, wherein the recognition unit recognizes the irradiation position of the laser beam with respect to the projection image by the laser pointer based on a non-visible light captured image obtained by capturing, with a non-visible light imaging unit provided in the laser pointer, a non-visible light coordinate recognition image projected superimposed on the projection image.
- The information processing apparatus according to claim 1, wherein the recognition unit identifies and recognizes each irradiation position by a plurality of laser pointers based on the shape or color of each visible light/non-visible light marker irradiated onto the projection image from the plurality of laser pointers, or a user ID embedded in the marker.
- A control method comprising: recognizing an irradiation position of a laser beam by a laser pointer on a projection image; acquiring movement information of the laser pointer; and outputting a control signal for changing the display of an object in the projection image corresponding to the irradiation position according to the movement information.
- A program for causing a computer to function as: a recognition unit that recognizes an irradiation position of a laser beam by a laser pointer on a projection image; an acquisition unit that acquires movement information of the laser pointer; and an output unit that outputs a control signal for changing the display of an object in the projection image corresponding to the irradiation position according to the movement information.
- A storage medium storing a program for causing a computer to function as: a recognition unit that recognizes an irradiation position of a laser beam by a laser pointer on a projection image; an acquisition unit that acquires movement information of the laser pointer; and an output unit that outputs a control signal for changing the display of an object in the projection image corresponding to the irradiation position according to the movement information.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201480035101.0A CN105308549B (zh) | 2013-06-26 | 2014-03-31 | 信息处理装置、控制方法、程序和存储介质 |
JP2015523897A JP6372487B2 (ja) | 2013-06-26 | 2014-03-31 | 情報処理装置、制御方法、プログラム、および記憶媒体 |
US14/901,683 US11029766B2 (en) | 2013-06-26 | 2014-03-31 | Information processing apparatus, control method, and storage medium |
EP14817801.5A EP3015961B1 (en) | 2013-06-26 | 2014-03-31 | Information processing device, control method, program, and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013133891 | 2013-06-26 | ||
JP2013-133891 | 2013-06-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014208168A1 true WO2014208168A1 (ja) | 2014-12-31 |
Family
ID=52141522
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/059530 WO2014208168A1 (ja) | 2013-06-26 | 2014-03-31 | 情報処理装置、制御方法、プログラム、および記憶媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US11029766B2 (ja) |
EP (1) | EP3015961B1 (ja) |
JP (1) | JP6372487B2 (ja) |
CN (1) | CN105308549B (ja) |
WO (1) | WO2014208168A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104602094B (zh) * | 2014-12-26 | 2018-06-01 | 联想(北京)有限公司 | 信息处理方法及电子设备 |
JP2017021237A (ja) * | 2015-07-13 | 2017-01-26 | キヤノン株式会社 | 画像投影装置、画像投影システム、表示装置、および、表示システム |
US10347002B2 (en) * | 2016-07-01 | 2019-07-09 | Guangdong Virtual Reality Technology Co., Ltd. | Electronic tracking device, electronic tracking system and electronic tracking method |
GB2557285A (en) * | 2016-12-05 | 2018-06-20 | Meiban Int Pte Ltd | Smart working professional ring |
EP3518222B1 (en) * | 2018-01-30 | 2020-08-19 | Alexander Swatek | Laser pointer |
JP2021165865A (ja) * | 2018-07-03 | 2021-10-14 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、及び、記録媒体 |
CN110297550B (zh) * | 2019-06-28 | 2023-05-16 | 北京百度网讯科技有限公司 | 一种标注显示方法、装置、投屏设备、终端和存储介质 |
CN112328158A (zh) * | 2020-07-23 | 2021-02-05 | 深圳Tcl新技术有限公司 | 交互方法、显示装置、发射装置、交互系统及存储介质 |
CN112860083B (zh) * | 2021-01-08 | 2023-01-24 | 深圳市华星光电半导体显示技术有限公司 | 激光笔光源定位方法及显示装置 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001125738A (ja) | 1999-09-21 | 2001-05-11 | Seiko Epson Corp | プレゼンテーション制御システム及びその制御方法 |
JP2008015560A (ja) | 2006-06-30 | 2008-01-24 | Casio Comput Co Ltd | レーザポインタ位置判定システムおよびレーザポインタ位置判定方法 |
JP2010152717A (ja) * | 2008-12-25 | 2010-07-08 | Canon Inc | 画像処理装置、方法及びプログラム |
JP2012053545A (ja) * | 2010-08-31 | 2012-03-15 | Canon Inc | 画像処理システムおよびその制御方法 |
JP2013522766A (ja) * | 2010-03-16 | 2013-06-13 | インターフェイズ・コーポレーション | 対話型表示システム |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5712658A (en) * | 1993-12-28 | 1998-01-27 | Hitachi, Ltd. | Information presentation apparatus and information display apparatus |
JP3770308B2 (ja) * | 2000-07-31 | 2006-04-26 | セイコーエプソン株式会社 | 光ポインタ |
US6910778B2 (en) * | 2001-09-28 | 2005-06-28 | Fujinon Corporation | Presentation system using laser pointer |
US6764185B1 (en) * | 2003-08-07 | 2004-07-20 | Mitsubishi Electric Research Laboratories, Inc. | Projector as an input and output device |
US7683881B2 (en) * | 2004-05-24 | 2010-03-23 | Keytec, Inc. | Visual input pointing device for interactive display system |
US7284866B2 (en) * | 2005-01-05 | 2007-10-23 | Nokia Corporation | Stabilized image projecting device |
US7486274B2 (en) * | 2005-08-18 | 2009-02-03 | Mitsubishi Electric Research Laboratories, Inc. | Method for stabilizing and precisely locating pointers generated by handheld direct pointing devices |
US7852315B2 (en) * | 2006-04-07 | 2010-12-14 | Microsoft Corporation | Camera and acceleration based interface for presentations |
EP2291833A1 (en) * | 2008-06-19 | 2011-03-09 | Koninklijke Philips Electronics N.V. | Remote control pointing technology |
JP2010072977A (ja) * | 2008-09-19 | 2010-04-02 | Sony Corp | 画像表示装置および位置検出方法 |
US20110230238A1 (en) * | 2010-03-17 | 2011-09-22 | Sony Ericsson Mobile Communications Ab | Pointer device to navigate a projected user interface |
GB201020282D0 (en) * | 2010-11-30 | 2011-01-12 | Dev Ltd | An improved input device and associated method |
WO2012102690A1 (en) * | 2011-01-28 | 2012-08-02 | Hewlett-Packard Development Company, L.P. | Filter |
US8446364B2 (en) * | 2011-03-04 | 2013-05-21 | Interphase Corporation | Visual pairing in an interactive display system |
US9733713B2 (en) * | 2012-12-26 | 2017-08-15 | Futurewei Technologies, Inc. | Laser beam based gesture control interface for mobile devices |
US11269431B2 (en) * | 2013-06-19 | 2022-03-08 | Nokia Technologies Oy | Electronic-scribed input |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104667526A (zh) * | 2015-01-21 | 2015-06-03 | 深圳华侨城文化旅游科技有限公司 | 一种基于红外激光的射击点识别系统及方法 |
JP2017157916A (ja) * | 2016-02-29 | 2017-09-07 | 国立大学法人東京工業大学 | 多重情報表示システム及びこれに用いる照光装置 |
JP2018005806A (ja) * | 2016-07-08 | 2018-01-11 | 株式会社スクウェア・エニックス | 位置特定プログラム、コンピュータ装置、位置特定方法、及び、位置特定システム |
JP2019532444A (ja) * | 2016-08-23 | 2019-11-07 | リアヴィーレ インコーポレイティドReavire,Inc. | 仮想光線を用いたオブジェクト制御 |
JP2019078845A (ja) * | 2017-10-23 | 2019-05-23 | セイコーエプソン株式会社 | プロジェクターおよびプロジェクターの制御方法 |
JPWO2019188046A1 (ja) * | 2018-03-26 | 2021-02-12 | 富士フイルム株式会社 | 投影システム、投影制御装置、投影制御方法、投影制御プログラム |
JPWO2020148601A1 (ja) * | 2019-01-18 | 2020-07-23 | ||
JP7384836B2 (ja) | 2019-01-18 | 2023-11-21 | 株式会社半導体エネルギー研究所 | 表示装置、表示システム |
CN112365828A (zh) * | 2020-11-09 | 2021-02-12 | 深圳Tcl新技术有限公司 | 智能调整显示效果的方法、装置、设备及介质 |
CN112365828B (zh) * | 2020-11-09 | 2024-03-12 | 深圳Tcl新技术有限公司 | 智能调整显示效果的方法、装置、设备及介质 |
Also Published As
Publication number | Publication date |
---|---|
US20160370883A1 (en) | 2016-12-22 |
EP3015961B1 (en) | 2021-07-14 |
CN105308549A (zh) | 2016-02-03 |
JPWO2014208168A1 (ja) | 2017-02-23 |
US11029766B2 (en) | 2021-06-08 |
JP6372487B2 (ja) | 2018-08-15 |
CN105308549B (zh) | 2018-08-21 |
EP3015961A4 (en) | 2017-02-22 |
EP3015961A1 (en) | 2016-05-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6372487B2 (ja) | 情報処理装置、制御方法、プログラム、および記憶媒体 | |
KR102098277B1 (ko) | 시선 추적을 이용한 시인성 개선 방법, 저장 매체 및 전자 장치 | |
JP5966510B2 (ja) | 情報処理システム | |
US20130135199A1 (en) | System and method for user interaction with projected content | |
US20130063345A1 (en) | Gesture input device and gesture input method | |
CN107291221B (zh) | 基于自然手势的跨屏幕自适应精度调整方法及装置 | |
EP3121686A1 (en) | Apparatus and method for remote control using camera-based virtual touch | |
JP5645444B2 (ja) | 画像表示システムおよびその制御方法 | |
CN103092432A (zh) | 人机交互操作指令的触发控制方法和系统及激光发射装置 | |
JP2015014882A (ja) | 情報処理装置、操作入力検出方法、プログラム、および記憶媒体 | |
TW201324235A (zh) | 手勢輸入的方法及系統 | |
CN110489027B (zh) | 手持输入设备及其指示图标的显示位置控制方法和装置 | |
US20160078679A1 (en) | Creating a virtual environment for touchless interaction | |
JP2012238293A (ja) | 入力装置 | |
US9740294B2 (en) | Display apparatus and method for controlling display apparatus thereof | |
CN105912101B (zh) | 一种投影控制方法和电子设备 | |
KR20200040716A (ko) | 시선 추적을 이용한 시인성 개선 방법, 저장 매체 및 전자 장치 | |
JP6866467B2 (ja) | ジェスチャー認識装置、ジェスチャー認識方法、ジェスチャー認識装置を備えたプロジェクタおよび映像信号供給装置 | |
JP2008181198A (ja) | 画像表示システム | |
KR20160072306A (ko) | 스마트 펜 기반의 콘텐츠 증강 방법 및 시스템 | |
US20170357336A1 (en) | Remote computer mouse by camera and laser pointer | |
US20220244788A1 (en) | Head-mounted display | |
JP2013218423A (ja) | 指向性映像コントロール装置及びその方法 | |
EP4400942A1 (en) | Touchless user interface control method, system, computer program and computer-readable medium | |
WO2015156068A1 (ja) | 画像処理装置及び画像処理方法 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| WWE | Wipo information: entry into national phase | Ref document number: 201480035101.0; Country of ref document: CN
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14817801; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2015523897; Country of ref document: JP; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 2014817801; Country of ref document: EP
| NENP | Non-entry into the national phase | Ref country code: DE
| WWE | Wipo information: entry into national phase | Ref document number: 14901683; Country of ref document: US