WO2016157920A1 - Information processing device, information processing method, and program - Google Patents


Info

Publication number
WO2016157920A1
WO2016157920A1 (PCT/JP2016/050258)
Authority
WO
WIPO (PCT)
Prior art keywords
projection, information processing, processing apparatus, information, unit
Application number
PCT/JP2016/050258
Other languages
French (fr)
Japanese (ja)
Inventor
Tatsuo Fujiwara (達雄 藤原)
Original Assignee
Sony Corporation (ソニー株式会社)
Application filed by Sony Corporation (ソニー株式会社)
Publication of WO2016157920A1

Classifications

    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 - Projectors or projection-type viewers; Accessories therefor
    • G03B21/14 - Details
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10 - Intensity circuits
    • G09G5/36 - Control arrangements characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37 - Details of the operation on graphic patterns
    • G09G5/377 - Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/74 - Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, and a program.
  • Patent Document 1 discloses an example of a usage form of a so-called projector.
  • Meanwhile, there is a demand for a mechanism capable of controlling the projection mode of the video so that the size of the projected image (that is, the angle of view), the resolution of the video, and the like are set more appropriately for the projection scene.
  • the present disclosure proposes an information processing apparatus, an information processing method, and a program capable of projecting a video in a more preferable manner according to a usage scene.
  • According to the present disclosure, there is provided an information processing apparatus including a control unit that controls operations of a plurality of projection units that display the display information by projecting it onto at least a part of the projection area in the projection plane, wherein the control unit selectively switches among a plurality of operation modes that differ from each other in the manner of superimposing the projection regions corresponding to at least two of the plurality of projection units.
  • According to the present disclosure, there is also provided an information processing method including controlling, by a processor, the operations of a plurality of projection units that display the display information by projecting it onto at least a part of the projection area in the projection plane.
  • According to the present disclosure, there is also provided a program that causes a computer to control the operations of a plurality of projection units that display the display information by projecting it onto at least a part of the projection area in the projection plane.
  • As described above, according to the present disclosure, an information processing apparatus, an information processing method, and a program capable of projecting a video in a more preferable mode according to a usage scene are provided.
  • FIG. 1 is an explanatory diagram for describing an overview of an information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 2 is an explanatory diagram for describing an example of the schematic configuration of the information processing apparatus according to the embodiment.
  • FIG. 3 is an explanatory diagram for describing an example of the schematic operation of the information processing apparatus according to the embodiment.
  • FIG. 4 is an explanatory diagram for describing another example of the schematic operation of the information processing apparatus according to the embodiment.
  • FIG. 5 is a block diagram illustrating an example of the functional configuration of the information processing apparatus according to the embodiment.
  • A flowchart illustrating an example of the flow of a series of operations of the information processing apparatus according to the embodiment.
  • Explanatory diagrams for describing an outline of the operation of an information processing apparatus according to Modification 1.
  • An explanatory diagram for describing an example of the operation of an information processing apparatus according to Modification 3.
  • An explanatory diagram for describing another example of the operation of the information processing apparatus according to Modification 3.
  • An explanatory diagram for describing an example of the operation of an information processing apparatus according to Modification 4.
  • Explanatory diagrams for describing examples of the operation of an information processing apparatus according to Modification 5.
  • A diagram illustrating an example of the hardware configuration of the information processing apparatus according to the embodiment.
  • FIG. 1 is an explanatory diagram for explaining an overview of the information processing apparatus according to the present embodiment.
  • the information processing apparatus 1 is configured as a so-called projector, and projects a video on which desired display information is presented on a desired projection surface, thereby providing the display information to the user.
  • the information processing apparatus 1 is installed vertically above the table 140 or the like, and projects an image on the projection plane R10 with the top surface of the table 140 as the projection plane R10.
  • In the following description, the vertical direction is referred to as the z direction, and the directions orthogonal to each other on the horizontal plane (that is, the directions orthogonal to the z direction and to each other) are referred to as the x direction and the y direction.
  • reference numerals R11a and R11b schematically indicate areas in the projection surface R10 (hereinafter, may be referred to as “projection areas”) on which the information processing apparatus 1 projects an image.
  • projection areas R11a and R11b may be simply referred to as “projection area R11” unless they are particularly distinguished.
  • That is, the information processing apparatus 1 is configured so that at least a part of the projection plane R10 can serve as a projection area R11, and so that it can project an image onto each of a plurality of projection areas R11.
  • FIG. 2 is an explanatory diagram for describing an example of a schematic configuration of the information processing apparatus 1 according to the present embodiment.
  • the information processing apparatus 1 according to the present embodiment includes output units 30a and 30b.
  • the output units 30a and 30b may be simply referred to as “output unit 30” unless they are particularly distinguished.
  • the information processing apparatus 1 may include the input unit 10.
  • the output unit 30 is configured to present various information to the user.
  • In the example illustrated in FIG. 2, the output unit 30 is installed apart from the table 140 so as to face the top surface side (that is, the projection plane R10 side) of the table 140.
  • the output unit 30 includes a configuration for projecting an image in a so-called projector (that is, a projection unit), and is configured to be able to project an image on a projection region R11 in the projection surface R10.
  • the output unit 30 according to the present embodiment is configured to be able to control the direction in which an image is projected and the size of the image to be projected (in other words, the angle of view). That is, the output unit 30 is configured to be able to project an image with a desired size at a desired position in the projection plane R10. In other words, the output unit 30 is configured to be able to control the position and size of the projection region R11.
  • the projection region R11a indicates a region where the output unit 30a projects an image.
  • the projection region R11b indicates a region where the output unit 30b projects an image.
  • In addition, the output unit 30 may include an acoustic device, such as a so-called speaker, for outputting acoustic information.
  • the input unit 10 is configured to input the operation contents of the user who uses the information processing apparatus 1 and the shape and pattern of an object placed on the table 140.
  • In the example illustrated in FIG. 2, the input unit 10 is installed apart from the table 140 so as to face the projection plane R10 side onto which the output unit 30 projects an image.
  • The input unit 10 can include, for example, a camera that images the table 140 with one imaging optical system (for example, a series of lens groups), or a stereo camera that can record information in the depth direction by imaging the table 140 with two imaging optical systems.
  • The input unit 10 may also include a sound collection device (for example, a microphone) for collecting acoustic information such as the voice of a user who uses the information processing apparatus 1 and the environmental sound of the environment in which the information processing apparatus 1 is placed.
  • When the input unit 10 includes a camera, the information processing apparatus 1 can detect an object placed on the table 140 by, for example, analyzing an image captured by the camera.
  • When the input unit 10 includes a stereo camera, for example, a visible-light camera or an infrared camera can be used as the stereo camera. In this case, the input unit 10 can acquire depth information.
  • the information processing apparatus 1 can detect a real object such as a hand or an object placed on the table 140, for example.
  • Accordingly, the information processing apparatus 1 can detect the contact and approach of an operating body, such as the user's hand, with respect to the projection surface R10 (that is, the top surface of the table 140), as well as the separation of the operating body from the projection surface R10.
  • the user's operation may be detected by a touch panel that detects contact of the user's finger or the like.
  • Other user operations that can be acquired by the input unit 10 may include, for example, a stylus operation on the projection plane R10, a gesture operation on the camera, and the like.
  • the input unit 10 may acquire a voice accompanying the user's utterance as a voice input.
  • the input unit 10 may include a detection device for detecting a change in the external environment, such as an optical sensor. In this case, the input unit 10 may acquire a detection result by the detection device as input information.
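  • As an illustrative sketch (not part of the disclosure), the depth information described above could be used to distinguish contact, approach, and separation of an operating body by comparing its height above the projection surface with thresholds; the threshold values below are assumptions chosen for illustration:

```python
# Illustrative sketch: classifying an operating body's state from depth data.
# The threshold values are assumptions for illustration; the disclosure does
# not specify concrete distances.

TOUCH_MM = 10.0    # within 10 mm of the surface -> "contact"
HOVER_MM = 80.0    # within 80 mm of the surface -> "approach"

def classify_operating_body(surface_depth_mm, hand_depth_mm):
    """Classify the hand's state from distances measured by a depth sensor.

    surface_depth_mm: distance from the sensor to the projection surface R10.
    hand_depth_mm:    distance from the sensor to the nearest point of the hand.
    """
    height = surface_depth_mm - hand_depth_mm  # hand's height above the surface
    if height <= TOUCH_MM:
        return "contact"
    if height <= HOVER_MM:
        return "approach"
    return "separated"
```

In practice the two distances would come from the stereo camera's depth map rather than scalar measurements, but the thresholding logic is the same.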
  • Based on the various types of input information acquired by the input unit 10, the information processing apparatus 1 can recognize input from the user (for example, operation content and instruction content) as well as changes in various states and situations, and can control the operation of the output unit 30 according to the recognition result.
  • The configuration of the information processing apparatus 1 illustrated in FIGS. 1 and 2 is merely an example; as long as the information processing apparatus 1 includes a plurality of output units 30 and can control each of them independently, its configuration is not necessarily limited to the examples shown in FIGS. 1 and 2.
  • For example, the information processing apparatus 1 may project an image onto a wall surface extending in the vertical direction serving as the projection surface (that is, project the image toward the wall surface from a position facing it).
  • the information processing apparatus 1 may be configured as a so-called short focus projector.
  • Further, the information processing apparatus 1 may be configured as a so-called rear-projection projector that projects an image from the back side of a projection surface formed of a transparent material such as a glass plate or a transparent plastic plate.
  • Although an example of a configuration including the input unit 10 and the output units 30a and 30b has been described, the configuration of the information processing apparatus 1 is not necessarily limited to the examples shown in FIGS. 1 and 2.
  • For example, at least a part of the input unit 10 and the output units 30a and 30b may be provided outside the information processing apparatus 1.
  • the information processing apparatus 1 may establish communication with a configuration provided outside and control the operation of the configuration via the communication.
  • Although the example in which the information processing apparatus 1 includes the input unit 10 has been described, the input unit 10 is not necessarily included as long as the information processing apparatus 1 includes a plurality of output units 30 and can control their operations.
  • Although the example in which the information processing apparatus 1 includes two output units 30 (that is, the output units 30a and 30b) has been described, the number of output units 30 is not necessarily limited to the examples shown in FIGS. 1 and 2, as long as a plurality of output units 30 are included.
  • the information processing apparatus 1 may include four output units 30, and each of the four output units 30 may project an image onto the projection plane R10.
  • FIG. 3 is an explanatory diagram for explaining an example of a schematic operation of the information processing apparatus 1 according to the present embodiment.
  • In FIG. 3, an example is illustrated in which the information processing apparatus 1 can control the operations of four output units 30 (here, the output units 30a to 30d).
  • reference numeral R11a indicates a projection area in which the output unit 30a projects an image.
  • reference numerals R11b to R11d indicate projection areas in which the output units 30b to 30d project images.
  • the information processing apparatus 1 is configured to be able to control the operations of the plurality of output units 30.
  • Each of the output units 30 is configured to be able to control the position and size of the projection region R11.
  • With this configuration, the information processing apparatus 1 individually controls each operation of the plurality of output units 30, so that it is possible to project an image of a desired orientation and size at a desired position in the projection plane R10.
  • In the example illustrated in FIG. 3, the information processing apparatus 1 recognizes the positions of the users Ua to Uc with respect to the projection plane R10 based on the input information from the input unit 10, and controls each operation of the output units 30a to 30d according to the recognition result.
  • Specifically, the information processing apparatus 1 controls the operation of the output unit 30c based on the recognition result of the position of the user Ua so that the projection region R11c corresponding to the output unit 30c is located in the vicinity of the user Ua. At this time, the information processing apparatus 1 may cause the output unit 30c to adjust the orientation of the video so that the video projected on the projection region R11c faces the user Ua.
  • Similarly, the information processing apparatus 1 controls the operation of the output unit 30d based on the recognition result of the position of the user Ub so that the projection region R11d corresponding to the output unit 30d is located in the vicinity of the user Ub.
  • Likewise, based on the recognition result of the position of the user Uc, the information processing apparatus 1 controls the operations of the output units 30a and 30b so that the projection regions R11a and R11b corresponding to them are located in the vicinity of the user Uc.
  • The information processing apparatus 1 may also individually control the angle of view of each output unit 30, thereby controlling the size of the projection region corresponding to that output unit 30 (in other words, the size of the image projected by the output unit 30).
  • the information processing apparatus 1 may independently control each of the plurality of output units 30 and cause each output unit 30 to project an image individually.
  • Alternatively, the information processing apparatus 1 may cause at least some two or more of the plurality of output units 30 to operate in cooperation with each other so that the two or more output units 30 project one image.
  • In general, the size of the image projected onto the projection plane R10 is controlled by the setting of the angle of view and the distance between the output unit 30 and the projection plane R10 (that is, the projection distance).
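  • The dependence of projected image size on the angle of view and the projection distance can be sketched with simple trigonometry (a geometric idealization; the disclosure itself gives no formula):

```python
import math

def projected_width(projection_distance, angle_of_view_deg):
    """Width of the projected image on the projection plane for a projector
    whose horizontal angle of view is angle_of_view_deg, at the given
    perpendicular projection distance (same length unit as the input)."""
    half_angle = math.radians(angle_of_view_deg) / 2.0
    return 2.0 * projection_distance * math.tan(half_angle)
```

Widening the angle of view or increasing the projection distance enlarges the image while the pixel count stays fixed, which is why the perceived pixel density drops as the image grows.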
  • However, even if the projected image is enlarged, its resolution does not change, so the user may perceive the image quality as coarser as the image becomes larger.
  • In particular, as shown in FIGS. 1 and 2, when the top surface of the table 140 is the projection plane R10, the distance between the user and the projection plane R10 tends to be short, so the user tends to perceive the coarseness of the image quality even more strongly.
  • In addition, the resolution and the number of pixels of the video that the output unit 30 can project are determined in advance according to the performance and characteristics of the output unit 30. Therefore, when the output unit 30 is made to project an image whose resolution or number of pixels exceeds its own, the resolution or number of pixels of the projected image is restricted to that of the output unit 30.
  • In view of this, the information processing apparatus 1 forms a single video from the videos projected from at least two or more output units 30, thereby projecting a video in a more preferable mode (for example, projecting a video with a higher resolution and pixel count).
  • FIG. 4 is an explanatory diagram for explaining another example of the schematic operation of the information processing apparatus 1 according to the present embodiment.
  • In the example illustrated in FIG. 4, the information processing apparatus 1 forms one display region by combining the projection regions R11a to R11d of the output units 30a to 30d, and causes the output units 30a to 30d to cooperate with each other to project a video (in other words, display information) onto the display region.
  • Specifically, the information processing apparatus 1 controls the direction in which each of the output units 30a to 30d projects an image and the size of the projected image (in other words, the width of the angle of view), so that the projection regions R11a to R11d are combined as shown in FIG. 4.
  • Then, the information processing apparatus 1 divides the video (display information) to be projected into partial videos according to the coupling relationship between the projection regions R11a to R11d, and outputs each divided partial video to the output unit 30 corresponding to its projection region R11.
  • the partial images projected from the output units 30a to 30d are combined on the projection plane R10 according to the connection relationship between the projection regions R11a to R11d, so that one image is formed.
  • Note that the resolution and the number of pixels of the image projected onto each of the projection regions R11a to R11d correspond to those of the output unit 30 corresponding to that projection region R11. Therefore, in the example shown in FIG. 4, when the resolutions and pixel counts of the output units 30a to 30d are equal, it is possible to project an image with four times as many pixels compared with the case where an image of the same size is projected by a single output unit 30.
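  • A minimal sketch of dividing a video into partial images according to a grid-shaped coupling relationship of projection regions (the 2×2 arrangement and even divisibility are illustrative assumptions):

```python
def split_into_partial_images(frame, rows, cols):
    """Divide a frame (2-D list of pixels) into rows*cols partial images,
    one per output unit, according to a grid coupling of projection regions.

    Assumes the frame dimensions divide evenly by the grid; real tiling
    would also account for overlap bands between adjacent regions."""
    h, w = len(frame), len(frame[0])
    th, tw = h // rows, w // cols  # tile height and width per output unit
    tiles = {}
    for r in range(rows):
        for c in range(cols):
            tiles[(r, c)] = [row[c * tw:(c + 1) * tw]
                             for row in frame[r * th:(r + 1) * th]]
    return tiles
```

With four units of equal pixel count arranged 2×2, the combined display area carries four times the pixels of a single unit at the same physical size, matching the factor-of-four improvement described above.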
  • At this time, the information processing apparatus 1 may control the positions at which the projection regions R11a to R11d are projected so that adjacent projection regions R11 partially overlap each other. Such control makes it possible to render the boundary between adjacent projection regions R11 (in other words, the difference in luminance value between the projection regions R11) inconspicuous.
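  • One common way to make such overlapping boundaries inconspicuous is to cross-fade luminance across the shared band so that the two projectors' contributions sum to a constant; the linear ramp below is an illustrative assumption, not a method stated in the disclosure:

```python
def blend_weight(x, overlap_start, overlap_end, rising=True):
    """Luminance weight for one projector across a shared overlap band.

    Inside [overlap_start, overlap_end] the weight ramps linearly from
    0 to 1 (rising=True) or from 1 to 0 (rising=False); giving the two
    adjacent projectors opposite ramps makes their weights sum to 1
    everywhere in the overlap, hiding the seam between regions."""
    if x <= overlap_start:
        return 0.0 if rising else 1.0
    if x >= overlap_end:
        return 1.0 if rising else 0.0
    t = (x - overlap_start) / (overlap_end - overlap_start)
    return t if rising else 1.0 - t
```

Real edge blending would also compensate for the projectors' gamma curves, but the complementary-ramp idea is the core of keeping the boundary luminance uniform.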
  • As described above, the information processing apparatus 1 may be configured to selectively switch between an operation mode in which each output unit 30 is controlled individually as illustrated in FIG. 3 and an operation mode in which two or more output units 30 are linked with each other as illustrated in FIG. 4.
  • In other words, the information processing apparatus 1 according to the present embodiment may be configured to selectively switch among a plurality of operation modes that differ from each other in the manner of superimposing the projection regions R11 corresponding to the plurality of output units 30.
  • Note that the manner of superimposing the projection regions R11 corresponding to the plurality of output units 30 can also include, as one aspect, a mode in which the projection regions R11 corresponding to the respective output units 30 are not superimposed.
  • The trigger for switching the operation mode is not particularly limited as long as the information processing apparatus 1 can appropriately and selectively switch between the operation modes illustrated in FIGS. 3 and 4.
  • As a specific example, the information processing apparatus 1 may recognize various operations (for example, gesture operations such as tap and pinch-in/pinch-out) based on the input information acquired by the input unit 10, and switch the operation mode according to the contents of the operations.
  • The type of operation is not particularly limited as long as the information processing apparatus 1 can recognize the switching-destination operation mode according to the content of the user's operation.
  • the information processing apparatus 1 may switch the operation mode based on the contents of operations through various operation devices such as a mouse, a button, and a touch panel.
  • As another example, the information processing apparatus 1 may switch to a more suitable operation mode in accordance with information related to the projected video (for example, information related to the content to be projected).
  • As a specific example, when projecting an image with a resolution higher than that of each individual output unit 30, the information processing apparatus 1 may switch to the operation mode in which two or more output units 30 are linked so as to form one display area, as illustrated in FIG. 4. At this time, the information processing apparatus 1 may switch the number of output units 30 to be linked with each other according to the resolution of the video to be projected.
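  • Such switching could be sketched as choosing the number of output units to link from the pixel count of the content relative to that of a single unit (the pixel-count arithmetic and the fallback behavior are illustrative assumptions):

```python
import math

def units_needed(content_pixels, unit_pixels, available_units):
    """Number of output units to link so their combined pixel count covers
    the content; returns None if even all available units are insufficient."""
    needed = max(1, math.ceil(content_pixels / unit_pixels))
    if needed > available_units:
        return None
    return needed

def choose_mode(content_pixels, unit_pixels, available_units):
    """Pick ("individual", 1) when one unit suffices, otherwise link
    enough units cooperatively; fall back to all units as a best effort."""
    n = units_needed(content_pixels, unit_pixels, available_units)
    if n is None:
        return ("cooperative", available_units)
    return ("individual", 1) if n == 1 else ("cooperative", n)
```

For example, 4K content on Full-HD units needs four linked units, matching the four-fold pixel gain of the 2×2 arrangement described earlier.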
  • As another example, the information processing apparatus 1 may recognize the state of the external environment according to the detection results of detection devices such as various sensors, and switch to a more suitable operation mode according to the recognition result. As a specific example, the information processing apparatus 1 may recognize the position and orientation of the user with respect to the projection plane R10 according to the detection result of the detection device, and determine the operation mode according to the recognition result.
  • As described above, the information processing apparatus 1 according to the present embodiment is configured to be able to selectively switch the manner of superimposing the projection regions R11 corresponding to two or more output units 30. Based on such a configuration, the information processing apparatus 1 according to the present embodiment selectively switches the operation mode according to the usage scene, based on the purpose of use, the usage mode, the type of video (content) to be projected, and the like, and can thereby project the video in a more preferable manner according to the usage scene. Moreover, with such a configuration, the information processing apparatus 1 according to the present embodiment can realize functions and performance equivalent to those of a higher-performance output unit (projector) by linking a plurality of output units 30, which may make it possible to reduce costs and achieve miniaturization.
  • FIG. 5 is a block diagram illustrating an example of a functional configuration of the information processing apparatus 1 according to the present embodiment.
  • the information processing apparatus 1 includes an input unit 10, a control unit 20, a storage unit 40, and a plurality of output units 30.
  • In the illustrated example, the information processing apparatus 1 includes the output units 30a and 30b as the plurality of output units 30. Note that the input unit 10 and the output units 30a and 30b illustrated in FIG. 5 correspond to the input unit 10 and the output units 30a and 30b in the example illustrated in FIG. 2.
  • the input unit 10 includes an imaging unit 11, for example.
  • the input unit 10 may include a detection unit 13.
  • The imaging unit 11 corresponds to the configuration for capturing an image of the projection plane R10 described above with reference to FIG. 2, and can be configured as, for example, a camera including one imaging optical system or a stereo camera including two imaging optical systems.
  • the imaging unit 11 outputs the captured image of the projection plane R10 to the control unit 20.
  • the detection unit 13 corresponds to a detection device such as various sensors for detecting various states.
  • For example, the detection unit 13 may be configured as a so-called depth sensor, measure the distance between the information processing apparatus 1 and an object (for example, an operating body such as a hand, or the projection plane R10) positioned within a predetermined detection range starting from the information processing apparatus 1, and generate control information indicating the measurement result.
  • As another example, the detection unit 13 may be configured as a sensor for detecting a user, such as a so-called optical sensor or human-presence sensor, and may generate control information indicating the detection result of a user located in the vicinity of the projection plane R10. The detection unit 13 then outputs the control information indicating the detection results of the various states to the control unit 20.
  • the output unit 30 includes a video output unit 31.
  • The video output unit 31 corresponds to the configuration for projecting a video in the output unit 30 (that is, the projection unit), and projects an image onto at least a part of a desired projection plane (that is, the projection region R11 in the projection plane R10) based on control from the control unit 20 described later.
  • the video output unit 31 is configured to be able to control the direction in which the video is projected and the size of the projected video (in other words, the angle of view).
  • the video output unit 31a corresponds to the video output unit 31 of the output unit 30a
  • the video output unit 31b corresponds to the video output unit 31 of the output unit 30b.
  • the control unit 20 includes an image analysis unit 21, an input analysis unit 22, a process execution unit 23, a mode determination unit 25, and an output control unit 26.
  • the image analysis unit 21 acquires the captured image of the projection plane R10 from the imaging unit 11.
  • the image analysis unit 21 performs an image analysis on the acquired image to detect an operating body such as a user's hand existing on the projection plane R10. Further, the image analysis unit 21 may detect a user located in the vicinity of the projection plane R10 by analyzing the acquired image. Then, the image analysis unit 21 outputs the analysis result of the image of the projection plane R10 (for example, the detection result of the operation body such as the user's hand) to the input analysis unit 22. Further, the image analysis unit 21 may output an analysis result of the image of the projection plane R10 (for example, a detection result of the user, the user's hand, etc.) to the mode determination unit 25 described later.
  • The input analysis unit 22 acquires the analysis result of the image of the projection plane R10 from the image analysis unit 21 and, based on the acquired analysis result, extracts an operating body such as the user's hand from the image of the projection plane R10. The input analysis unit 22 then recognizes the content of the user's operation from the movement of the extracted operating body along the time series. At this time, the coordinates of the projection plane R10 onto which the display information is projected and the contact coordinates of the operating body, such as the user's hand, on the projection plane R10 are calibrated in advance, so that the input analysis unit 22 can detect, for example, which part of the projected display information (for example, a GUI) the operating body has touched.
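  • The calibration described above could be sketched as a mapping from sensed contact coordinates to display coordinates, followed by a hit test against the projected GUI (the affine model and the widget layout are illustrative assumptions):

```python
def make_affine(scale_x, scale_y, offset_x, offset_y):
    """Calibration result: maps camera/touch coordinates to display
    coordinates. A full calibration would typically use a homography;
    an axis-aligned affine map is the simplest illustrative case."""
    def to_display(x, y):
        return (x * scale_x + offset_x, y * scale_y + offset_y)
    return to_display

def hit_test(touch_xy, to_display, widgets):
    """Return the name of the projected GUI element under the touch point.

    widgets: {name: (left, top, right, bottom)} in display coordinates.
    """
    dx, dy = to_display(*touch_xy)
    for name, (left, top, right, bottom) in widgets.items():
        if left <= dx <= right and top <= dy <= bottom:
            return name
    return None
```

Because the mapping is fixed by calibration, any contact reported by the input unit can be resolved directly to the GUI element the output unit projected at that position.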
  • Based on the recognized operation content, the input analysis unit 22 recognizes the content of the instruction from the user, specifies the target function (for example, an application) or content corresponding to the instruction, and outputs control information indicating the specified content to the process execution unit 23.
  • the input analysis unit 22 may output information indicating the recognized operation content (for example, content indicated by the gesture operation) to the mode determination unit 25.
  • the process execution unit 23 is configured to execute various functions.
  • the process execution unit 23 acquires control information indicating functions and contents from the input analysis unit 22.
  • the process execution unit 23 reads various data (for example, a library for executing an application and content data) from the storage unit 40 for executing a target function based on the acquired control information.
• the storage unit 40 stores various data for the process execution unit 23 to execute various functions.
  • the process execution unit 23 may output control information indicating the execution result of the function specified based on the operation from the user to the output control unit 26 described later.
• the output control unit 26 can present the execution result of the function to the user by causing at least one of the output units 30a and 30b (more specifically, the video output units 31a and 31b) to project information indicating the execution result of the function specified based on the operation from the user onto the projection plane R10.
  • the process execution unit 23 may output various types of information according to the execution of the function specified based on the operation from the user to the mode determination unit 25 described later.
• For example, when image content is reproduced, the process execution unit 23 may output information related to the content (for example, attribute information such as resolution and the number of pixels) to the mode determination unit 25.
  • the mode determination unit 25 can recognize various information such as the resolution and the number of pixels of the reproduced image content.
  • the mode determination unit 25 switches the operation mode for controlling the operations of the plurality of output units 30 to one of a plurality of preset operation modes according to various types of input information.
• As the operation mode, for example, the operation mode for individually controlling the operations of the plurality of output units 30 described with reference to FIG. 3, or the operation mode, described with reference to FIG. above, in which at least some (two or more) of the plurality of output units 30 cooperate with each other, can be used. Note that other examples of operation modes by which the information processing apparatus 1 according to the present embodiment controls the operations of the plurality of output units 30 will be described separately later as modifications.
  • the mode determination unit 25 may determine the operation mode based on the recognition result of the operation by the user.
• the mode determination unit 25 acquires information indicating the recognition result of the operation content by the user (for example, the content indicated by a gesture operation) from the input analysis unit 22, and may determine the operation mode according to the operation content indicated by the information.
  • the type of operation for instructing the operation mode is not particularly limited.
• For example, the information processing apparatus 1 may cause the output unit 30 to project a video in which a UI for designating an operation mode is presented, and determine the operation mode based on an operation from the user on the UI.
  • the mode determination unit 25 compares the position in the projection plane R10 on which the UI is projected with the position of an operation (for example, a tap operation) performed by the user on the projection plane R10. What is necessary is just to recognize the operation mode designated by the user.
  • the mode determination unit 25 may acquire various types of information according to the execution of various functions by the processing execution unit 23, and determine the operation mode based on the information.
• the mode determination unit 25 acquires information related to the reproduced content (for example, attribute information such as resolution and the number of pixels) from the processing execution unit 23, and may determine the operation mode based on the acquired information. In this case, for example, when the resolution of the reproduced content is higher than the resolution of each output unit 30, the mode determination unit 25 may select an operation mode in which the projection areas of two or more output units 30 are combined to form one display area as shown in FIG. (that is, an operation mode in which two or more output units 30 are linked).
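• The decision described above can be summarized in a minimal sketch: if the reproduced content exceeds the resolution of a single output unit 30, the linked operation mode is selected; otherwise the units are driven individually. The mode names and the threshold rule are assumptions for illustration, not part of the disclosure.

```python
# A minimal sketch of the decision made by the mode determination unit 25:
# content larger than one output unit's resolution selects the linked
# ("combined") operation mode; otherwise the units operate individually.
# Identifiers are illustrative assumptions.

INDIVIDUAL = "individual"   # each output unit 30 controlled separately
LINKED = "linked"           # projection areas combined into one display area

def determine_mode(content_res, unit_res):
    cw, ch = content_res
    uw, uh = unit_res
    if cw > uw or ch > uh:
        return LINKED
    return INDIVIDUAL

print(determine_mode((3840, 2160), (1920, 1080)))  # prints: linked
print(determine_mode((1280, 720), (1920, 1080)))   # prints: individual
```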
  • the mode determination unit 25 may determine an operation mode based on a detection result by the detection unit 13 or an image analysis result by the image analysis unit 21.
  • the mode determination unit 25 recognizes the position and orientation of the user existing around the projection plane R10 based on the detection result by the detection unit 13 and the analysis result of the image by the image analysis unit 21, and The operation mode may be determined according to the recognition result.
• In this case, the mode determination unit 25 may select the operation mode for individually controlling the operations of the plurality of output units 30 as shown in FIG.
• the mode determination unit 25 may assign one of the plurality of output units 30 to each detected user, and generate setting information corresponding to the operation mode so that display information is projected by the assigned output unit 30 in the vicinity of that user.
  • the mode determination unit 25 also assigns the output unit 30 to each user according to the positional relationship between each output unit 30 and each detected user, and the direction in which the output unit 30 projects an image. And the angle of view may be determined.
• the mode determination unit 25 may determine the output unit 30 assigned to each user and the direction and angle of view in which that output unit 30 projects an image so that the video projected from each output unit 30 onto the projection plane R10 is not blocked by the detected users (in other words, so that no user intervenes between each output unit 30 and the projection region R11 corresponding to that output unit 30).
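• The assignment logic described above can be sketched as follows. This is a hypothetical 2D simplification: `blocked` tests whether any user stands near the straight line from an output unit to its projection target, and `assign_units` greedily gives each user the nearest unobstructed unit. The function names, the geometry, and the blocking radius are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical 2D sketch of assigning output units 30 to detected users so
# that no user blocks a projection path. All names and the 0.3 radius are
# illustrative assumptions.
import math

def blocked(unit_pos, target_pos, user_positions, radius=0.3):
    """True if any user stands within `radius` of the segment unit->target."""
    (ux, uy), (tx, ty) = unit_pos, target_pos
    for px, py in user_positions:
        dx, dy = tx - ux, ty - uy
        seg_len2 = dx * dx + dy * dy
        t = max(0.0, min(1.0, ((px - ux) * dx + (py - uy) * dy) / seg_len2))
        cx, cy = ux + t * dx, uy + t * dy
        if math.hypot(px - cx, py - cy) < radius:
            return True
    return False

def assign_units(units, users):
    """Greedy: each user gets the nearest free unit whose projection toward
    that user is not blocked by the *other* users."""
    assignment = {}
    free = dict(units)
    for uid, upos in users.items():
        others = [p for k, p in users.items() if k != uid]
        for name, pos in sorted(free.items(),
                                key=lambda kv: math.dist(kv[1], upos)):
            if not blocked(pos, upos, others):
                assignment[uid] = name
                del free[name]
                break
    return assignment

units = {"30a": (0, 0), "30b": (4, 0)}
users = {"U1": (0, 2), "U2": (4, 2)}
print(assign_units(units, users))  # prints: {'U1': '30a', 'U2': '30b'}
```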
  • the mode determination unit 25 determines an operation mode for controlling operations of the plurality of output units 30 according to various types of input information, and sends information indicating the determined operation mode to the output control unit 26. Output. At this time, the mode determination unit 25 controls the operation of each output unit 30 in the determined operation mode, according to the setting information corresponding to the operation mode (for example, the direction or image in which each output unit 30 projects an image). Information indicating a corner) may be output to the output control unit 26.
  • the output control unit 26 is configured to control the operation of the output unit 30 (particularly, the video output unit 31).
• the output control unit 26 acquires information indicating the operation mode for controlling the operations of the plurality of output units 30 from the mode determination unit 25, and controls the operation of each output unit 30 (for example, the direction in which the video is projected, the angle of view, and the content of the video to be projected) according to the operation mode indicated by the acquired information.
  • the output control unit 26 causes each output unit 30 to project display information to be presented to the user according to the determined operation mode.
  • the output control unit 26 may acquire control information indicating the execution result of the function specified based on the operation from the user from the process execution unit 23.
  • the output control unit 26 may project the display information indicating the execution result of the function specified based on the operation from the user on each output unit 30 according to the determined operation mode.
• For example, the output control unit 26 acquires, from the mode determination unit 25, setting information indicating the direction and angle of view in which each output unit 30 projects an image, determined according to the detected position and orientation of each user. Then, based on the acquired setting information, the output control unit 26 may control the operation of each output unit 30 (for example, the direction and angle of view for projecting the video) and cause the output unit 30 to project display information to be presented to the target user (for example, information indicating the result of an operation on the video (display information) projected by that output unit 30).
  • the output control unit 26 determines the direction in which each of the two or more output units 30 projects an image so that the projection regions R11 corresponding to the two or more output units 30 to be linked to each other are coupled to each other. Control the angle of view. Further, the output control unit 26 divides the video (display information) to be projected into a plurality of partial images according to the coupling relationship between the projection regions R11 corresponding to each of the two or more output units 30, and is divided. The partial image may be output to the output unit 30 corresponding to each projection region R11.
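• As a simplified sketch of the split described above, the frame to be projected can be divided into per-unit partial images according to each unit's projection region. Here a frame is modeled as a list of pixel rows; the function name and the region format are illustrative assumptions, not part of the disclosure.

```python
# A simplified sketch of the split performed by the output control unit 26
# when linked output units form one display area: the frame is divided into
# per-unit partial images according to each unit's projection region.
# A frame is a list of pixel rows; regions map a unit to (x, y, w, h).

def split_frame(frame, regions):
    """Return {unit: partial image} cropped from `frame`."""
    parts = {}
    for unit, (x, y, w, h) in regions.items():
        parts[unit] = [row[x:x + w] for row in frame[y:y + h]]
    return parts

# A 2x4 "frame" shared left/right between output units 30a and 30b.
frame = [[0, 1, 2, 3],
         [4, 5, 6, 7]]
regions = {"30a": (0, 0, 2, 2), "30b": (2, 0, 2, 2)}
parts = split_frame(frame, regions)
print(parts["30a"])  # prints: [[0, 1], [4, 5]]
print(parts["30b"])  # prints: [[2, 3], [6, 7]]
```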
  • the example of the functional configuration of the information processing apparatus 1 according to the present embodiment has been described above with reference to FIG.
  • the configuration illustrated in FIG. 5 is merely an example, and the functional configuration of the information processing device 1 is not necessarily limited to the example illustrated in FIG. 5 as long as the various operations of the information processing device 1 described above can be realized.
  • at least one of the input unit 10 and the plurality of output units 30 may be externally attached to the information processing device 1 as an external device different from the information processing device 1.
  • at least a part of the configuration of the control unit 20 may be provided in an external device (for example, a server connected via a network) different from the information processing device 1.
  • FIG. 6 is a flowchart illustrating an example of a flow of a series of operations of the information processing apparatus 1 according to the present embodiment.
  • Steps S101 and S103 First, the information processing apparatus 1 analyzes various input information (step S101), and determines an operation mode for controlling the operations of the plurality of output units 30 based on the analysis result (step S103).
  • the mode determination unit 25 may determine the operation mode based on the recognition result of the operation by the user.
  • the image analysis unit 21 acquires the image of the projection plane R10 acquired from the imaging unit 11, and performs image analysis on the image, thereby allowing the user's hand, etc., present on the projection plane R10 to exist. Detect the operating body. Then, the image analysis unit 21 outputs the analysis result of the image of the projection plane R10 (for example, the detection result of the operation body such as the user's hand) to the input analysis unit 22.
• the input analysis unit 22 acquires the analysis result of the image of the projection plane R10 from the image analysis unit 21, and recognizes the content of the user operation (for example, the content of a gesture operation by an operating body such as a hand) based on the analysis result. Then, the input analysis unit 22 outputs information indicating the recognized operation content (for example, the content indicated by the gesture operation) to the mode determination unit 25 (step S101).
  • the mode determination unit 25 acquires information indicating the recognition result of the content of the operation by the user from the input analysis unit 22, and determines the operation mode according to the operation content indicated by the information (step S103).
• As another example, the mode determination unit 25 may acquire information related to the reproduced content (for example, attribute information such as resolution and the number of pixels) from the processing execution unit 23, and determine the operation mode based on the acquired information.
• In this case, the mode determination unit 25 may select an operation mode in which the projection areas of two or more output units 30 are combined to form one display area (that is, an operation mode in which two or more output units 30 are linked), as shown in FIG. (step S103).
  • the mode determination unit 25 determines an operation mode for controlling operations of the plurality of output units 30 according to various types of input information, and sends information indicating the determined operation mode to the output control unit 26. Output.
• Step S105 The output control unit 26 acquires information indicating the operation mode for controlling the operations of the plurality of output units 30 from the mode determination unit 25, and controls the operation of each output unit 30 (for example, the direction in which the video is projected, the angle of view, and the content of the video to be projected) according to the operation mode indicated by the acquired information.
• Step S107 Then, the output control unit 26 causes each output unit 30 to project display information to be presented to the user according to the determined operation mode.
  • the output control unit 26 acquires the control information indicating the execution result of the function specified based on the operation from the user from the process execution unit 23, and the display information indicating the execution result based on the control information. May be projected on each output unit 30 according to the determined operation mode.
• In the above, description has been given focusing on the operation in which the information processing apparatus 1 determines an operation mode and projects a video onto the projection plane R10 according to the determined operation mode.
• Modification 1: Control Example for Projecting a High Frame Rate Video
• In Modification 1, the information processing apparatus 1 adjusts the direction in which each output unit 30 projects an image and the angle of view so that the projection regions R11 of the two or more output units 30 linked to each other overlap on the projection plane R10.
• In the following description, it is assumed that the information processing apparatus 1 causes the output units 30a and 30b to cooperate with each other as the two or more output units 30.
  • FIG. 7 is an explanatory diagram for explaining the outline of the operation of the information processing apparatus 1 according to the first modification.
• In FIG. 7, an example of a state in which the projection region R11a corresponding to the output unit 30a and the projection region R11b corresponding to the output unit 30b are superimposed is shown.
  • the information processing apparatus 1 controls the operation of the output unit 30 so that two or more output units 30 (for example, the output units 30a and 30b) to be linked with each other project images at different timings.
  • FIG. 8 is an explanatory diagram for explaining an outline of the operation of the information processing apparatus 1 according to the first modification.
• In Modification 1, the information processing apparatus 1 causes the output units 30a and 30b to cooperate with each other, thereby realizing projection of a moving image at a higher frame rate than the frame rate based on the performance of each of the output units 30a and 30b.
  • FIG. 8 schematically shows the order in which the frames F10a to F12a and F10b to F12b of the moving image to be projected are projected.
• Of the series of frames F10a to F12a and F10b to F12b illustrated in FIG. 8, the frames F10a to F12a are projected by the output unit 30a, and the frames F10b to F12b are projected by the output unit 30b.
  • the information processing apparatus 1 first causes the output unit 30a to project the frame F10a onto the projection region R11, and then causes the output unit 30b to project the frame F10b onto the projection region R11. After the frame F10b is projected, the information processing apparatus 1 causes the output unit 30a to project the frame F11a onto the projection region R11, and then causes the output unit 30b to project the frame F11b onto the projection region R11.
• In this manner, the information processing apparatus 1 superimposes the projection regions R11 of the output units 30a and 30b, and causes the output units 30a and 30b to alternately project the frames of the moving image in time series.
  • the information processing apparatus 1 according to Modification 1 can project a moving image at a frame rate higher than the frame rate based on the performance of each output unit 30.
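• The scheduling of Modification 1 can be sketched as follows: the frames of a double-rate moving image are assigned alternately to the two superimposed output units, so each unit runs at its native frame rate while the combined projection doubles it. The identifiers are illustrative, not from the disclosure.

```python
# A minimal sketch of the Modification 1 scheduling: frames are assigned
# alternately to two superimposed output units, preserving presentation
# order, so the combined projection doubles the per-unit frame rate.

def interleave(frames, units=("30a", "30b")):
    """Return a per-unit playlist: even-indexed frames to the first unit,
    odd-indexed frames to the second."""
    schedule = {u: [] for u in units}
    for i, frame in enumerate(frames):
        schedule[units[i % len(units)]].append(frame)
    return schedule

frames = ["F10a", "F10b", "F11a", "F11b", "F12a", "F12b"]
print(interleave(frames))
# prints: {'30a': ['F10a', 'F11a', 'F12a'], '30b': ['F10b', 'F11b', 'F12b']}
```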
• Note that, when a moving image having a higher frame rate than the frame rate based on the performance of each output unit 30 is a projection target, for example, the information processing apparatus 1 may control the operation of each output unit 30 based on the operation mode for realizing the high frame rate described above.
• As Modification 1, an example of the operation of the information processing apparatus 1 for realizing projection of a moving image at a higher frame rate than the frame rate based on the performance of each output unit 30 by linking two or more output units 30 has been described with reference to FIGS. 7 and 8.
  • the information processing apparatus 1 may realize projection of an image with a wider dynamic range by causing the output units 30a and 30b to cooperate with each other.
• Specifically, the information processing apparatus 1 controls the operations of the output units 30a and 30b so that the projection regions R11 corresponding to the output units 30a and 30b (that is, the projection regions R11a and R11b) overlap each other.
• Then, the information processing apparatus 1 individually controls the light amount (for example, luminance) of the light emitted from the light sources of the output units 30a and 30b, so that an image can be projected in a wider dynamic range than the dynamic range that each output unit 30 can individually realize.
• Note that, when an image expressed in a wider dynamic range than the dynamic range that can be individually realized by each output unit 30 is a projection target, the information processing apparatus 1 may control the operation of each output unit 30 based on the operation mode for realizing the wide dynamic range described above.
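• The wide-dynamic-range idea can be illustrated with a toy calculation: because the superimposed projections add their light, a target luminance beyond one unit's maximum can be split across the two units. The normalized luminance model below is an assumption for illustration, not part of the disclosure.

```python
# A toy sketch of the wide-dynamic-range idea: two superimposed projectors
# add their light, so the combined luminance range exceeds what either unit
# can realize alone. Luminances are normalized per unit (assumption).

def combined_luminance(target, unit_max=1.0):
    """Split a target luminance (0..2*unit_max) between two stacked units."""
    a = min(target, unit_max)                 # first unit up to its maximum
    b = min(max(target - a, 0.0), unit_max)   # second unit covers the rest
    return a, b

a, b = combined_luminance(1.5)
print(a + b)  # prints: 1.5 (beyond a single unit's maximum of 1.0)
```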
  • the information processing apparatus 1 may realize an image projection with a wider color gamut by causing the output units 30a and 30b to cooperate with each other.
• Specifically, the information processing apparatus 1 controls the operations of the output units 30a and 30b so that the projection regions R11 corresponding to the output units 30a and 30b (that is, the projection regions R11a and R11b) overlap each other.
• Then, the information processing apparatus 1 controls the light sources of the output units 30a and 30b (that is, the light emitted from the light sources) so that the color gamuts of the light emitted from the output units 30a and 30b are different from each other.
  • the method is not particularly limited as long as the color gamut of light emitted from each of the output units 30a and 30b can be controlled.
  • the information processing apparatus 1 may control the color gamut of light emitted from the output unit 30 by controlling the output of the light source.
• As another example, the information processing apparatus 1 may control the color gamut of the light emitted from the output unit 30 by controlling the optical system (for example, a filter or the like) through which the light emitted from the light source passes before being emitted to the outside of the output unit 30.
  • the information processing apparatus 1 can project an image expressed in a wider color gamut than the color gamut that each output unit 30 can individually express.
• Note that the information processing apparatus 1 may control the operation of each output unit 30 based on the operation mode for realizing the expression in the wide color gamut described above.
• FIG. 9 is an explanatory diagram for explaining an example of the operation of the information processing apparatus 1 according to Modification 3, and shows an example of the case where the output units 30a and 30b are linked to each other.
  • FIG. 9 schematically shows the positional relationship of each pixel of the image projected on the projection plane R10 from each of the output units 30a and 30b.
  • reference symbol R13a schematically shows pixels of an image projected on the projection plane R10 by the output unit 30a (that is, video pixels projected on the projection region R11a).
  • reference symbol R13b schematically indicates pixels of an image projected on the projection plane R10 by the output unit 30b (that is, pixels of an image projected on the projection region R11b).
• Specifically, the information processing apparatus 1 adjusts the position of each projection region R11 so that the difference between the position of the projection region R11a corresponding to the output unit 30a and the position of the projection region R11b corresponding to the output unit 30b is in half-pixel units (or in subpixel units).
• Then, the information processing apparatus 1 superimposes the images projected from the output units 30a and 30b on the projection plane R10, and can thereby simulate the projection of an image with a higher resolution than the resolution of each output unit 30 (for example, four times the resolution in the case of the example shown in FIG. 9).
  • the information processing apparatus 1 may generate the pixel R13a projected by the output unit 30a and the pixel R13b projected by the output unit 30b based on each pixel in the image to be projected. As a specific example, the information processing apparatus 1 sets the pixel R13a projected by the output unit 30a to a pixel in a region corresponding to the pixel R13a in the image to be projected or a pixel located in the vicinity of the region. You may generate based on. The same applies to the pixel R13b projected by the output unit 30b.
• Note that, as an example of various filter processes for generating the images projected by the output units 30a and 30b, the information processing apparatus 1 may use, for example, the technique described in Niranjan Damera-Venkata and Nelson L. Chang, "Realizing Super-Resolution with Superimposed Projection," IEEE International Workshop on Projector-Camera Systems IV (ProCams), 18 June 2007, Minneapolis, MN.
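• A much-simplified version of this decomposition can be sketched as follows: a high-resolution target image is reduced to two lower-resolution sub-images sampled on grids offset by one high-resolution pixel (half a projector pixel), whose projections are summed optically on the projection plane. Here each sub-pixel is simply the average of the high-resolution pixels it covers; practical systems use optimized filters such as those in the cited paper. All names are illustrative assumptions.

```python
# A simplified sketch of Modification 3: the high-resolution target is
# decomposed into two lower-resolution sub-images whose projection grids
# are offset by half a projector pixel. Each sub-pixel here is the plain
# average of the 2x2 high-resolution pixels it covers (a crude stand-in
# for the optimized filters referenced above).

def subimage(hi, offset):
    """Average the 2x2 blocks of `hi` whose top-left corners lie on the
    grid shifted by `offset` (0 or 1) in both axes."""
    n = len(hi)
    out = []
    for y in range(offset, n - 1, 2):
        row = []
        for x in range(offset, n - 1, 2):
            total = (hi[y][x] + hi[y][x + 1]
                     + hi[y + 1][x] + hi[y + 1][x + 1])
            row.append(total / 4)
        out.append(row)
    return out

# A 4x4 high-resolution target; unit 30a projects the unshifted grid and
# unit 30b the grid shifted by one high-resolution pixel.
hi = [[4 * r + c for c in range(4)] for r in range(4)]
print(subimage(hi, 0))  # prints: [[2.5, 4.5], [10.5, 12.5]]
print(subimage(hi, 1))  # prints: [[7.5]]
```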
• Note that the control shown in FIG. 9 can also be realized by a single output unit 30.
  • the information processing apparatus 1 may project the pixel R13a projected onto the projection region R11a and the pixel R13b projected onto the projection region R11b in a time division manner.
  • FIG. 10 is an explanatory diagram for explaining another example of the operation of the information processing apparatus 1 according to the modified example 3.
• That is, FIG. 10 shows an example of control for realizing projection of an image with a higher resolution than the resolution of each output unit 30 by a method different from the control shown in FIG. 9.
  • the image to be projected is divided into a plurality of partial images, and each partial image is projected in a time division manner, thereby projecting an image with a higher resolution than the resolution of each output unit 30.
• the reference symbol R20 schematically indicates the range in which the video to be projected is projected.
  • Reference numerals R111 to R114 schematically indicate projection areas in which video is projected by the output unit 30 in a time division manner.
• the information processing apparatus 1 controls the operation of the output unit 30 (for example, the projection direction, the angle of view, and the partial image to be projected) so that the image to be projected is divided into four partial images and the corresponding partial images are sequentially projected onto the projection regions R111 to R114 in a time division manner.
• Here, a so-called projector can control the size of the image projected onto the projection plane R10 by controlling the angle of view, and the resolution of the video does not change even when the size of the image is changed. For this reason, when the angle of view is narrowed and the size of the image is controlled to be smaller, the size of each pixel in the projected image becomes smaller and the interval between the pixels becomes shorter.
• Using this characteristic of the projector, the information processing apparatus 1 projects each partial image at the resolution of the output unit 30 in a time division manner, and thereby projects the video to be projected at a higher resolution than the resolution of the output unit 30 (that is, four times the resolution in this example).
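• The time-division control described above can be sketched as follows: the target image is split into four quadrant partial images, which the output unit 30 projects in turn onto the regions R111 to R114 at a narrowed angle of view. The function name and the row-list image model are illustrative assumptions, not part of the disclosure.

```python
# A minimal sketch of the FIG. 10 style control: the target image is split
# into four quadrant partial images for time-division projection onto the
# regions R111 to R114. An image is modeled as a list of pixel rows.

def quadrants(image):
    """Split an even-sized image into four partial images, in the order
    top-left, top-right, bottom-left, bottom-right."""
    h, w = len(image), len(image[0])
    mh, mw = h // 2, w // 2
    return [
        [row[:mw] for row in image[:mh]],   # R111
        [row[mw:] for row in image[:mh]],   # R112
        [row[:mw] for row in image[mh:]],   # R113
        [row[mw:] for row in image[mh:]],   # R114
    ]

image = [[1, 2], [3, 4]]
for i, part in enumerate(quadrants(image), start=1):
    print(f"R11{i}:", part)
# prints R111: [[1]] through R114: [[4]]
```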
  • control shown in FIG. 10 may be realized by control of a single output unit 30, or may be realized by linking two or more output units 30 to each other.
• In this case, the information processing apparatus 1 may control the operation of each output unit 30 so that the output units 30 project different partial images (that is, so that the same partial image is not projected at the same timing).
• Further, the information processing apparatus 1 may realize projection of an image with a resolution higher than the resolution of each output unit 30 by combining the control illustrated in FIG. 9, in which at least two or more output units 30 cooperate with each other, with the control illustrated in FIG. 10.
  • each output unit 30 may be controlled based on the operation mode for realizing the high-resolution expression.
• As Modification 3, an example of the operation of the information processing apparatus 1 for realizing projection of an image with a resolution higher than the resolution of each output unit 30 has been described with reference to FIGS. 9 and 10.
  • the controls of the information processing apparatus 1 described as the first to third modifications may be appropriately combined.
• For example, the information processing apparatus 1 can realize projection of a higher quality image by combining the control described as Modification 2 for realizing expression in a wider color gamut with the control described as Modification 3 for realizing expression at a higher resolution.
• Similarly, the information processing apparatus 1 may combine the control described as Modification 1 for realizing expression at a higher frame rate with the control described as Modification 3 for realizing expression at a higher resolution.
• Note that, in this case, the number of output units 30 to be linked may be adjusted as appropriate.
  • FIG. 11 is an explanatory diagram for explaining an example of the operation of the information processing apparatus 1 according to the modification 4.
  • the information processing apparatus 1 causes the output units 30a and 30b to cooperate with each other.
  • reference sign R11a indicates a projection region R11 corresponding to the output unit 30a.
  • Reference numeral R11b indicates a projection region R11 corresponding to the output unit 30b.
• the information processing apparatus 1 is projecting an operation screen for operating the information processing apparatus 1 onto the projection surface R10 on the top surface of the table 140.
  • a reference sign V11 schematically indicates a display object (for example, a window displayed based on the reproduction of content) presented in the operation screen. That is, the information processing apparatus 1 projects the entire operation screen toward the projection area R11b, and projects a partial area (partial image) on which the display object V11 is presented in the operation screen onto the projection area R11a. The image is superimposed on the operation screen projected on the projection region R11b.
• Specifically, the information processing apparatus 1 widens the angle of view of the output unit 30b so that the projection region R11b is formed over the entire projection surface R10, and causes the output unit 30b to project the entire operation screen.
  • the information processing apparatus 1 superimposes the projection region R11a on a partial region of the projection region R11b by controlling the angle of view and the projection direction for the output unit 30a.
• Then, the information processing apparatus 1 causes the output unit 30a to project the partial image of the operation screen corresponding to the area indicated by the projection region R11a (that is, the partial image in which the display object V11 is presented) in the operation screen projected onto the projection region R11b.
• With such a configuration, the information processing apparatus 1 can present the area in which the display object V11 is presented at a partially higher resolution than the entire operation screen projected onto the projection region R11b.
• Note that the position and size of the projection region R11a formed in the projection region R11b may be switched dynamically. For example, when the position or size of the display object V11 is changed, the information processing apparatus 1 may change the position and size of the projection region R11a in accordance with the change.
• In the example illustrated in FIG. 12, the information processing apparatus 1 controls the projection region R11a to be smaller by narrowing the angle of view of the output unit 30a, and projects the video in which the display object V11 is presented onto the projection region R11a. That is, the information processing apparatus 1 sets a part of the projection surface R10 as the projection region R11a and causes the output unit 30a to project the video in which the display object V11 is presented toward the projection region R11a. By projecting the video only onto the limited area of the projection region R11a in this way, the display object V11 is presented on the projection plane R10 at a higher resolution than when the video is projected over the entire projection plane R10.
• When the information processing apparatus 1 recognizes that the position or size of the display object V11 has been changed by an operation of the user U as illustrated in FIG. 13, it temporarily stops the projection of the video onto the projection region R11a by the output unit 30a and instead causes the output unit 30b to project the video.
  • the reference symbol R11b shown in FIG. 13 indicates the projection region R11 corresponding to the output unit 30b.
• Specifically, the information processing apparatus 1 controls the angle of view of the output unit 30b and the direction in which the output unit 30b projects the video so that the projection region R11b becomes a wider region including the projection region R11a (for example, a region corresponding to the entire projection surface R10).
• Then, the information processing apparatus 1 controls the video projected by the output unit 30b so that the video previously projected onto the projection region R11a is projected onto the area corresponding to the projection region R11a in the projection region R11b (that is, the area where the display object V11 is displayed). That is, as shown in FIG. 13, the information processing apparatus 1 controls the video output by the output unit 30b so that the display object V11 is projected at the same position and with the same size as in the example shown in FIG. 12 even when switching to the state in which the video is projected onto the projection region R11b.
• In this state, the information processing apparatus 1 controls the display position of the display object V11 in the video projected onto the projection region R11b based on the operation content of the user U. At this time, by controlling the display position of the display object V11 based on image processing, the information processing apparatus 1 controls the position where the display object V11 is projected on the projection surface R10 without mechanical drive. Therefore, during the control illustrated in FIG. 15, although the resolution of the display object V11 temporarily decreases, the control of the position where the display object V11 is projected can be made to follow the operation by the user U in a more preferable manner (that is, the responsiveness can be improved).
  • FIG. 14 shows a state in which the operation for moving the display object V11 by the user U is completed.
  • when the operation is completed, the information processing apparatus 1 controls the direction in which the output unit 30a projects the image so that the position where the display object V11 is displayed falls within the projection region R11a. Then, the information processing apparatus 1 restarts the projection, by the output unit 30a, of the image in which the display object V11 is presented. At this time, the information processing apparatus 1 may stop the projection of the video onto the projection region R11b by the output unit 30b.
  • as a result, the display object V11 is presented again on the projection surface R10 at a higher resolution than when the video is projected over the entire projection surface R10.
  • the above description deals with the case where the information processing apparatus 1 controls the position of the display object V11 based on the operation content of the user U; it goes without saying that the same applies to the case where the size of the display object V11 is controlled.
  • as described above, when a display mode such as the position or size of the display object V11 is changed, the information processing apparatus 1 projects an image from the output unit 30 for which the wider projection region R11 is set. The information processing apparatus 1 then controls the position and size of the display object V11 through image processing, thereby controlling the position and size at which the display object V11 is projected onto the projection plane R10 without any mechanical driving. Such control temporarily reduces the resolution of the display object V11, but allows the display mode, such as the position and size at which the display object V11 is projected, to follow the operation by the user U in a more preferable manner.
  • note that, as in the information processing apparatus 1 according to the above-described Modification 4 (see FIG. 11), the information processing apparatus 1 may project a video (for example, the entire operation screen) onto the projection region R11b (that is, a projection region R11 that includes the projection region R11a and is wider than it) while no operation by the user U is recognized.
  • in other words, the information processing apparatus 1 need only project an image onto both the projection regions R11a and R11b when no operation by the user U is recognized, and project an image onto the projection region R11b when an operation by the user U is recognized.
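As a minimal sketch of the switching rule in the two bullets above (the function name and unit identifiers are illustrative, not from the patent): which output units are driven depends only on whether a user operation is currently recognized.

```python
def active_output_units(operation_recognized):
    """Return the set of output units that should currently project.

    While an operation is recognized, only the wide-region unit 30b projects,
    so the display object can be moved by image processing alone; otherwise
    both units project and 30a adds local high resolution in R11a.
    """
    if operation_recognized:
        return {"30b"}
    return {"30a", "30b"}
```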
  • FIG. 15 is a diagram illustrating an example of a hardware configuration of the information processing apparatus 1 according to an embodiment of the present disclosure.
  • the information processing apparatus 1 includes a processor 901, a memory 903, a storage 905, an operation device 907, a notification device 909, a detection device 911, an imaging device 913, and a bus 917. The information processing apparatus 1 may further include a communication device 915.
  • the processor 901 may be, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), or a SoC (System on Chip), and executes various processes of the information processing apparatus 1.
  • the processor 901 can be configured by, for example, an electronic circuit for executing various arithmetic processes. Each configuration of the control unit 20 described above can be realized by the processor 901.
  • the memory 903 includes a RAM (Random Access Memory) and a ROM (Read Only Memory), and stores programs executed by the processor 901 and their data.
  • the storage 905 can include a storage medium such as a semiconductor memory or a hard disk.
  • the storage unit 40 described above can be realized by at least one of the memory 903 and the storage 905, or a combination of both.
  • the operation device 907 has a function of generating an input signal for a user to perform a desired operation.
  • the operation device 907 can be configured as a touch panel, for example.
  • the operation device 907 may be composed of an input device for the user to input information, such as buttons, switches, or a keyboard, and a control circuit that generates an input signal based on the user's input and supplies it to the processor 901.
  • the notification device 909 is an example of an output device.
  • like a so-called projector, the notification device 909 may notify the user of information by projecting predetermined information onto a projection surface.
  • the output unit 30 described above can be realized by the notification device 909.
  • the notification device 909 may be a device such as a liquid crystal display (LCD) device or an organic EL (Organic Light Emitting Diode) display. In this case, the notification device 909 can notify the user of predetermined information by displaying a screen.
  • the notification device 909 may be a device that notifies a user of predetermined information by outputting a predetermined acoustic signal, such as a speaker.
  • the notification device 909 described above is merely an example, and the aspect of the notification device 909 is not particularly limited as long as predetermined information can be notified to the user.
  • the notification device 909 may be a device that notifies the user of predetermined information using a lighting or blinking pattern, such as an LED (Light Emitting Diode).
  • the imaging device 913 includes an imaging element that captures a subject and obtains digital data of the captured image, such as a CMOS (Complementary Metal-Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor. That is, the imaging device 913 has a function of capturing a still image or a moving image via an optical system such as a lens in accordance with the control of the processor 901.
  • the imaging device 913 may store the captured image in the memory 903 or the storage 905.
  • the imaging unit 11 described above can be realized by the imaging device 913.
  • the detection device 911 is a device for detecting various states.
  • the detection device 911 can be configured by a so-called distance measuring sensor such as a stereo image sensor. Further, the detection device 911 may be configured by a sensor for detecting a predetermined target, such as a so-called optical sensor.
  • the detection unit 13 described above can be realized by the detection device 911.
  • the communication device 915 is a communication unit included in the information processing apparatus 1 and communicates with an external device via a network.
  • the communication device 915 is a wired or wireless communication interface.
  • the communication device 915 may include a communication antenna, an RF (Radio Frequency) circuit, a baseband processor, and the like.
  • the communication device 915 has a function of performing various kinds of signal processing on a signal received from an external device, and can supply a digital signal generated from the received analog signal to the processor 901.
  • the bus 917 connects the processor 901, the memory 903, the storage 905, the operation device 907, the notification device 909, the detection device 911, the imaging device 913, and the communication device 915 to each other.
  • the bus 917 may include a plurality of types of buses.
  • as described above, the information processing apparatus 1 is configured to selectively switch among a plurality of operation modes that differ from each other in the manner of superimposing the projection regions R11 corresponding to the plurality of output units 30.
  • at least some of the operation modes that the information processing apparatus 1 can switch among include an operation mode in which at least two or more of the plurality of output units 30 cooperate with each other.
  • at least some of the operation modes that the information processing apparatus 1 can switch among may include an operation mode for individually controlling the operations of at least some of the plurality of output units 30.
  • with such a configuration, the information processing apparatus 1 can cause the output units 30 to project a video in a more preferable manner according to the usage scene of the information processing apparatus 1 (that is, the usage purpose, the usage mode, the content to be projected, and the like).
  • the plurality of output units 30 to be controlled by the information processing apparatus 1 are not necessarily required to have the same performance, and may have different characteristics.
  • the information processing apparatus 1 may select, as a control target in the selected operation mode, the output unit 30 that is more suitable for that operation mode (in other words, more suitable for the usage scene) according to the performance of each output unit 30.
  • by configuring each output unit 30 to be detachable from the information processing apparatus 1, the number of output units 30 to be controlled by the information processing apparatus 1 may be changed as appropriate.
  • in this case, the information processing apparatus 1 may dynamically switch the operation-mode candidates to be selected according to the number of output units 30 recognized as control targets and the performance of each output unit 30.
  • (1) An information processing apparatus including: a control unit that controls operations of a plurality of projection units that display display information by projecting the display information onto at least a part of a projection region in a projection plane, wherein the control unit selectively switches among a plurality of operation modes that differ from each other in a manner for superimposing the projection regions corresponding to at least two or more of the plurality of projection units.
  • (2) The information processing apparatus according to (1), wherein the control unit selectively switches the operation mode according to information associated with the display information to be displayed.
  • the information processing apparatus according to any one of (1) to (3), wherein the control unit selectively switches the operation mode according to a detection result of an external environment.
  • The information processing apparatus according to any one of (1) to (4), wherein the control unit, in at least some of the plurality of operation modes, combines the projection regions corresponding to two or more projection units to form one display region, and projects the display information onto the display region.
  • The information processing apparatus according to any one of (1) to (5), wherein the control unit, in at least some of the plurality of operation modes, controls the operations of the two or more projection units whose projection regions are superimposed on each other in mutually different manners.
  • the control unit causes each of the two or more projection units whose projection regions are superimposed on each other to project the display information controlled to have mutually different color gamuts.
  • The information processing apparatus according to (6), wherein the control unit controls the dynamic range of the display information projected onto the projection region by individually controlling the luminance of each of the two or more projection units whose projection regions are superimposed on each other.
  • the control unit controls the operation of each of the two or more projection units such that the difference between the projection positions of the projection regions superimposed on each other is in sub-pixel units.
  • the information processing apparatus causes the two or more projection units whose projection regions are superimposed on each other to project the display information at mutually different timings.
  • The information processing apparatus according to any one of (1) to (10), wherein the control unit, in at least some of the plurality of operation modes, causes at least some of the projection units to project, in a time-division manner, a plurality of partial images into which the display information is divided onto the corresponding partial areas in the projection region.
  • in at least some of the plurality of operation modes, the control unit performs the first projection onto a first projection region corresponding to the first projection unit among the plurality of projection units.
  • The information processing apparatus according to any one of (1) to (12), wherein the control unit, in at least some of the plurality of operation modes, when controlling the display mode of first display information projected onto a first projection region corresponding to a first projection unit among the plurality of projection units, causes a second projection unit, instead of the first projection unit, to project second display information in which the first display information is presented onto a second projection region including the first projection region, and controls the display mode of the first display information within the second display information.
  • An information processing method including: controlling, by a processor, operations of a plurality of projection units that display display information by projecting the display information onto at least a part of a projection region in a projection plane; and selectively switching among a plurality of operation modes that differ from each other in a manner for superimposing the projection regions corresponding to at least two or more of the plurality of projection units.
  • A program that causes a computer to execute: controlling operations of a plurality of projection units that display display information by projecting the display information onto at least a part of a projection region in a projection plane; and selectively switching among a plurality of operation modes that differ from each other in a manner for superimposing the projection regions corresponding to at least two or more of the plurality of projection units.
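Several of the items above describe uses of superimposed projection regions: mutually different color gamuts, individually controlled luminance for a wider dynamic range, sub-pixel offsets between projection positions, and mutually different projection timings. A hedged sketch of how such per-unit parameters might be derived follows; all numbers and helper names are illustrative assumptions, not the patent's implementation.

```python
def subpixel_offsets(n):
    """Evenly spaced sub-pixel shifts for n superimposed units; e.g. for
    n=2 the second unit is offset by half a pixel, giving the kind of
    sub-pixel difference between projection positions described above."""
    return [i / n for i in range(n)]

def split_luminance(target_nits, unit_max_nits):
    """Stack units until a target peak luminance is reached; returns per-unit
    drive levels in [0, 1]. Assumes unit_max_nits > 0."""
    levels, remaining = [], target_nits
    while remaining > 0:
        levels.append(min(1.0, remaining / unit_max_nits))
        remaining -= unit_max_nits
    return levels

def frame_timings(n, base_fps):
    """Start offsets (seconds) that interleave n units to multiply the
    effective frame rate: unit i is delayed by i / (n * base_fps)."""
    return [i / (n * base_fps) for i in range(n)]

# Two 100-nit units reach 200 nits; a 250-nit target needs a third at half drive:
# split_luminance(250, 100) -> [1.0, 1.0, 0.5]
# subpixel_offsets(2) -> [0.0, 0.5]; frame_timings(2, 60) -> [0.0, 1/120]
```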

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The purpose of the present invention is to provide an information processing device capable of projecting a video in a more suitable mode corresponding to the usage scene. The information processing device is provided with a control unit that controls the operation of a plurality of projection units that display display information by projecting the display information onto at least some of the projection areas in the projection plane, the control unit selectively switching among a plurality of operation modes that differ from each other in a mode for superimposing the projection areas corresponding to two or more of the plurality of projection units.

Description

Information processing apparatus, information processing method, and program
The present disclosure relates to an information processing apparatus, an information processing method, and a program.
In recent years, so-called projectors, which present information by projecting a video onto a projection surface such as a screen, have come to be used not only in offices but also in homes, and their range of uses has expanded. In addition, the forms in which projectors are used have diversified, as with so-called short-throw projectors, whose housing can be installed near the wall onto which the video is projected. For example, Patent Document 1 discloses an example of such a usage form of a projector.
JP 2005-352171 A
Meanwhile, with the diversification of projector usage scenes (for example, diversification of usage purposes, usage forms, and content to be projected), there is a demand for a mechanism capable of controlling the projection mode of a video so that the size of the projected video (that is, the width of the angle of view), the resolution of the video, and the like are set more appropriately according to the usage scene.
Therefore, the present disclosure proposes an information processing apparatus, an information processing method, and a program capable of projecting a video in a more preferable manner according to the usage scene.
According to the present disclosure, there is provided an information processing apparatus including a control unit that controls operations of a plurality of projection units that display display information by projecting the display information onto at least a part of a projection region in a projection plane, wherein the control unit selectively switches among a plurality of operation modes that differ from each other in a manner for superimposing the projection regions corresponding to at least two or more of the plurality of projection units.
Further, according to the present disclosure, there is provided an information processing method including: controlling, by a processor, operations of a plurality of projection units that display display information by projecting the display information onto at least a part of a projection region in a projection plane; and selectively switching among a plurality of operation modes that differ from each other in a manner for superimposing the projection regions corresponding to at least two or more of the plurality of projection units.
Further, according to the present disclosure, there is provided a program that causes a computer to execute: controlling operations of a plurality of projection units that display display information by projecting the display information onto at least a part of a projection region in a projection plane; and selectively switching among a plurality of operation modes that differ from each other in a manner for superimposing the projection regions corresponding to at least two or more of the plurality of projection units.
As described above, according to the present disclosure, an information processing apparatus, an information processing method, and a program capable of projecting a video in a more preferable manner according to the usage scene are provided.
Note that the above effects are not necessarily limited; together with or in place of the above effects, any of the effects shown in the present specification, or other effects that can be grasped from the present specification, may be achieved.
FIG. 1 is an explanatory diagram for describing an overview of an information processing apparatus according to an embodiment of the present disclosure.
FIG. 2 is an explanatory diagram for describing an example of the schematic configuration of the information processing apparatus according to the embodiment.
FIG. 3 is an explanatory diagram for describing an example of the schematic operation of the information processing apparatus according to the embodiment.
FIG. 4 is an explanatory diagram for describing another example of the schematic operation of the information processing apparatus according to the embodiment.
FIG. 5 is a block diagram showing an example of the functional configuration of the information processing apparatus according to the embodiment.
FIG. 6 is a flowchart showing an example of the flow of a series of operations of the information processing apparatus according to the embodiment.
FIG. 7 is an explanatory diagram for describing an outline of the operation of an information processing apparatus according to Modification 1.
FIG. 8 is an explanatory diagram for describing an outline of the operation of the information processing apparatus according to Modification 1.
FIG. 9 is an explanatory diagram for describing an example of the operation of an information processing apparatus according to Modification 3.
FIG. 10 is an explanatory diagram for describing another example of the operation of the information processing apparatus according to Modification 3.
FIG. 11 is an explanatory diagram for describing an example of the operation of an information processing apparatus according to Modification 4.
FIG. 12 is an explanatory diagram for describing an example of the operation of an information processing apparatus according to Modification 5.
FIG. 13 is an explanatory diagram for describing an example of the operation of the information processing apparatus according to Modification 5.
FIG. 14 is an explanatory diagram for describing an example of the operation of the information processing apparatus according to Modification 5.
FIG. 15 is a diagram showing an example of the hardware configuration of the information processing apparatus according to the embodiment.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
The description will be made in the following order.
1. Overview
2. Functional configuration
3. Processing
4. Modifications
4.1. Modification 1: Control example for projecting high-frame-rate video
4.2. Modification 2: Control example 1 for projecting high-quality video
4.3. Modification 3: Control example 2 for projecting high-quality video
4.4. Modification 4: Control example for realizing partial image-quality enhancement
4.5. Modification 5: Control example for improving followability when the display mode is changed
5. Hardware configuration
6. Summary
<1. Overview>
First, in order to make the features of the information processing apparatus according to an embodiment of the present disclosure easier to understand, an overview of the information processing apparatus according to the present embodiment will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram for describing an overview of the information processing apparatus according to the present embodiment.
The information processing apparatus 1 according to the present embodiment is configured as a so-called projector, and presents desired display information to the user by projecting, onto a desired projection surface, a video in which the display information is presented. For example, in the example illustrated in FIG. 1, the information processing apparatus 1 is installed vertically above a table 140 or the like, and projects a video onto the projection surface R10, with the top surface of the table 140 serving as the projection surface R10. In the following description, with respect to the example shown in FIG. 1, the vertical direction may be referred to as the z direction, and the mutually orthogonal directions in the horizontal plane (that is, the directions orthogonal to the z direction and to each other) may be referred to as the x direction and the y direction.
In FIG. 1, reference symbols R11a and R11b schematically indicate regions in the projection surface R10 onto which the information processing apparatus 1 projects a video (hereinafter sometimes referred to as "projection regions"). In the following description, the projection regions R11a and R11b may be referred to simply as the "projection region R11" when they are not particularly distinguished.
That is, as shown in FIG. 1, the information processing apparatus 1 according to the present embodiment is configured so that at least a part of the projection surface R10 serves as a projection region R11, and so that a video can be projected onto each of a plurality of projection regions R11.
Here, an example of the schematic configuration of the information processing apparatus 1 according to the present embodiment will be described with reference to FIG. 2. FIG. 2 is an explanatory diagram for describing an example of the schematic configuration of the information processing apparatus 1 according to the present embodiment. As shown in FIG. 2, the information processing apparatus 1 according to the present embodiment includes output units 30a and 30b. In the following description, the output units 30a and 30b may be referred to simply as the "output unit 30" when they are not particularly distinguished. The information processing apparatus 1 may also include an input unit 10.
The output unit 30 is a configuration for presenting various types of information to the user. For example, in the example illustrated in FIG. 2, the output unit 30 is installed apart from the table 140 so as to face the top surface side of the table 140 (that is, the projection surface R10 side). The output unit 30 includes a configuration for projecting a video as in a so-called projector (that is, a projection unit), and is configured to be able to project a video onto the projection region R11 in the projection surface R10. Further, the output unit 30 according to the present embodiment is configured to be able to control the direction in which a video is projected and the size of the projected video (in other words, the width of the angle of view). That is, the output unit 30 is configured to be able to project a video of a desired size at a desired position in the projection surface R10. In other words, the output unit 30 is configured to be able to control the position and size of the projection region R11.
In FIGS. 1 and 2, the projection region R11a indicates the region onto which the output unit 30a projects a video. Similarly, the projection region R11b indicates the region onto which the output unit 30b projects a video.
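The relationship between the projection direction, the angle of view, and the resulting position and size of the projection region R11 can be sketched with simple flat-surface geometry. The mounting height, angle values, and the one-axis simplification below are illustrative assumptions, not figures from the disclosure.

```python
import math

def projection_region(height_mm, half_fov_deg, tilt_deg=0.0):
    """One-axis model of a downward-facing projector mounted height_mm above
    a flat surface: returns (center_offset_mm, width_mm) of the projected
    strip. Tilting the unit (tilt_deg) moves the region along the surface;
    widening half_fov_deg enlarges it."""
    center = height_mm * math.tan(math.radians(tilt_deg))
    near = height_mm * math.tan(math.radians(tilt_deg - half_fov_deg))
    far = height_mm * math.tan(math.radians(tilt_deg + half_fov_deg))
    return center, far - near

# Pointing straight down from 1000 mm with a 20 degree half angle, the region
# is centered under the unit and roughly 728 mm wide; tilting by 10 degrees
# shifts its center by about 176 mm.
```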
The output unit 30 may also include an acoustic device for outputting acoustic information, such as a so-called speaker.
The input unit 10 is a configuration for inputting the operation content of the user who uses the information processing apparatus 1, and the shape, pattern, and the like of an object placed on the table 140. For example, in the example illustrated in FIG. 2, the input unit 10 is installed apart from the table 140 so as to face the projection surface R10 side onto which the output unit 30 projects a video.
The input unit 10 may include, for example, a camera that images the table 140 with one imaging optical system (for example, a series of lens groups), or a stereo camera that images the table 140 with two imaging optical systems and can record information in the depth direction. The input unit 10 may also include a sound collection device (for example, a microphone) for collecting acoustic information such as the voice of the user using the information processing apparatus 1 and the ambient sound of the environment in which the information processing apparatus 1 is placed.
When the input unit 10 includes a camera that images the table 140 with one imaging optical system, the information processing apparatus 1 detects an object placed on the table 140 by, for example, analyzing the image captured by that camera. When the input unit 10 includes a stereo camera, a visible-light camera, an infrared camera, or the like can be applied as the stereo camera. When the input unit 10 includes a stereo camera, the input unit 10 can acquire depth information. By the input unit 10 acquiring depth information, the information processing apparatus 1 can detect a real object, such as a hand or another object, placed on the table 140. Furthermore, by the input unit 10 acquiring depth information, the information processing apparatus 1 can detect the contact and proximity of an operating body, such as the user's hand, with respect to the projection surface R10 (that is, the top surface of the table 140), as well as the separation of the operating body from the projection surface R10. In the following description, the user bringing an operating body such as a hand into contact with, or into proximity to, the projection surface R10 may be collectively referred to simply as "contact".
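The contact/proximity/separation detection described above can be sketched as a simple depth-threshold test. The thresholds and the idea of comparing a fingertip depth against the known surface depth are illustrative assumptions, not the actual algorithm of the disclosure:

```python
def classify_touch(finger_depth_mm, surface_depth_mm,
                   touch_mm=5.0, near_mm=30.0):
    """Classify a fingertip from stereo-camera depth readings.

    Depths are distances from the camera, so the fingertip height above the
    surface is surface_depth - finger_depth."""
    gap = surface_depth_mm - finger_depth_mm
    if gap <= touch_mm:
        return "contact"       # treated as touching the projection surface
    if gap <= near_mm:
        return "proximity"     # hovering near the surface
    return "released"          # operating body has left the surface

# With the surface 1200 mm from the camera:
# classify_touch(1197.0, 1200.0) -> "contact"
# classify_touch(1180.0, 1200.0) -> "proximity"
# classify_touch(1100.0, 1200.0) -> "released"
```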
 In the following, the case in which a user operation performed with an operating body such as a hand is detected from an image captured by the input unit 10 will mainly be described, but the present disclosure is not limited to this example. The user operation may instead be detected by a touch panel that senses contact of the user's finger or the like. Other user operations that the input unit 10 can acquire include, for example, a stylus operation on the projection surface R10 and a gesture operation toward the camera.
 When the input unit 10 includes a sound collection device such as a microphone, the input unit 10 may acquire the voice accompanying the user's utterance as voice input.
 The input unit 10 may also include a detection device, such as an optical sensor, for detecting changes in the external environment. In this case, the input unit 10 may acquire the detection result of the detection device as input information.
 With the configuration described above, the information processing apparatus 1 can, for example, recognize input from the user (for example, the content of an operation or instruction) and changes in various states and situations on the basis of the various kinds of input information acquired by the input unit 10, and control the operation of the output units 30 according to the recognition result.
 An example of the schematic configuration of the information processing apparatus 1 according to the present embodiment has been described above with reference to FIGS. 1 and 2.
 Note that the configuration of the information processing apparatus 1 shown in FIGS. 1 and 2 is merely an example. As long as the apparatus includes a plurality of output units 30 and can control each of the plurality of output units 30 independently, the configuration of the information processing apparatus 1 is not necessarily limited to the examples shown in FIGS. 1 and 2. As a specific example, the information processing apparatus 1 may be configured to project video onto a wall surface extending in the vertical direction, using that wall surface as the projection surface (that is, projecting video toward the wall surface from a position facing it). The information processing apparatus 1 may also be configured as a so-called short-throw projector. Alternatively, the information processing apparatus 1 may be configured as a so-called rear-projection projector that projects video from the back side of a projection surface formed of a transparent material such as a glass plate or a transparent plastic plate.
 In the example described with reference to FIGS. 1 and 2, the information processing apparatus 1 includes the input unit 10 and the output units 30a and 30b, but the configuration of the information processing apparatus 1 is not necessarily limited to that example. As a specific example, at least some of the input unit 10 and the output units 30a and 30b may be provided outside the information processing apparatus 1. In that case, the information processing apparatus 1 may, for example, establish communication with the externally provided component and control the operation of that component via the communication. Furthermore, although the example above describes the information processing apparatus 1 as including the input unit 10, the apparatus need not necessarily include the input unit 10 as long as it includes a plurality of output units 30 and can control the operation of each of them.
 Also, in the example described with reference to FIGS. 1 and 2, the information processing apparatus 1 includes two output units 30 (that is, the output units 30a and 30b); however, as long as a plurality of output units 30 are included, their number is not necessarily limited to the examples shown in FIGS. 1 and 2. As a specific example, the information processing apparatus 1 may include four output units 30 and cause each of the four output units 30 to project video onto the projection surface R10.
 Next, an example of the schematic operation of the information processing apparatus 1 according to the present embodiment will be described. For example, FIG. 3 is an explanatory diagram for describing an example of the schematic operation of the information processing apparatus 1 according to the present embodiment, and shows a case in which the information processing apparatus 1 can control the operation of four output units 30 (here, output units 30a to 30d). In FIG. 3, reference numeral R11a denotes the projection region onto which the output unit 30a projects video. Similarly, reference numerals R11b to R11d denote the projection regions onto which the output units 30b to 30d project video.
 As described above, the information processing apparatus 1 according to the present embodiment is configured to be able to control the operation of each of the plurality of output units 30. Each output unit 30 is in turn configured to be able to control the position and size of its projection region R11. Based on this configuration, the information processing apparatus 1 can, for example, individually control the operation of each of the plurality of output units 30 as shown in FIG. 3, and thereby project video of a desired orientation and size at a desired position on the projection surface R10.
 For example, in the example shown in FIG. 3, the information processing apparatus 1 recognizes the positions of users Ua to Uc relative to the projection surface R10 on the basis of the input information from the input unit 10, and controls the operation of each of the output units 30a to 30d according to the recognition results.
 Specifically, based on the recognition result of the position of the user Ua, the information processing apparatus 1 controls the operation of the output unit 30c such that the projection region R11c corresponding to the output unit 30c is located near the user Ua. At this time, the information processing apparatus 1 may cause the output unit 30c to adjust the orientation of the video so that the video projected onto the projection region R11c directly faces the user Ua.
 Similarly, based on the recognition result of the position of the user Ub, the information processing apparatus 1 controls the operation of the output unit 30d such that the projection region R11d corresponding to the output unit 30d is located near the user Ub. Likewise, based on the recognition result of the position of the user Uc, the information processing apparatus 1 controls the operation of the output units 30a and 30b such that the projection regions R11a and R11b corresponding to the output units 30a and 30b are located near the user Uc.
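The per-user placement and orientation control just described can be sketched as follows. This sketch is not from the publication: the fixed offset, the table coordinate system, and the rule of orienting the image toward the table centre are all assumptions made for illustration.

```python
import math

TABLE_CENTER = (0.0, 0.0)   # assumed origin at the centre of the table top
REGION_OFFSET = 0.25        # metres from the user toward the table centre (assumed)

def place_region_for_user(user_pos):
    """Return (centre, rotation_deg) for a projection region near one user.

    The region centre is offset from the user toward the table centre, and
    the image is rotated so that its "up" direction points away from the
    user, making the projected video face the user directly.
    """
    ux, uy = user_pos
    dx, dy = TABLE_CENTER[0] - ux, TABLE_CENTER[1] - uy
    dist = math.hypot(dx, dy)
    nx, ny = dx / dist, dy / dist              # unit vector toward table centre
    centre = (ux + nx * REGION_OFFSET, uy + ny * REGION_OFFSET)
    rotation_deg = math.degrees(math.atan2(nx, ny))  # 0 when the user is "south"
    return centre, rotation_deg
```

Running this for each recognized user yields one placement per output unit, matching the individual-control mode of FIG. 3.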
 Note that the information processing apparatus 1 may also control the size of the projection region corresponding to each output unit 30 (in other words, the size of the video projected by that output unit 30) by individually controlling the angle of view of each output unit 30.
 In this way, the information processing apparatus 1 according to the present embodiment may, for example, control each of the plurality of output units 30 independently and cause each output unit 30 to project video individually.
 Alternatively, the information processing apparatus 1 according to the present embodiment may cause at least two or more of the plurality of output units 30 to operate in cooperation with one another so that the cooperating two or more output units 30 project a single video.
 For example, when an output unit 30 projects video onto the projection surface R10, the size of the video projected onto the projection surface R10 is controlled according to the setting of the angle of view and the distance between the output unit 30 and the projection surface R10 (that is, the throw distance). On the other hand, even when the size of the video projected onto the projection surface R10 changes, the resolution of the projected video does not change, so depending on the size of the video, the user may perceive its image quality as coarse. In particular, when the top surface of the table 140 serves as the projection surface R10 as shown in FIGS. 1 and 2, the distance between the user and the projection surface R10 tends to be short, so when the video is controlled to be larger, the user tends to perceive its image quality as even coarser.
 The resolution and pixel count of the video that an output unit 30 can project are also determined in advance according to the performance and characteristics of the output unit 30. Therefore, when an output unit 30 is asked to project an image whose resolution or pixel count is higher than its own, the resolution or pixel count of that image may be limited to that of the output unit 30.
 Accordingly, the information processing apparatus 1 according to the present embodiment forms a single video from the videos projected from each of at least two or more output units 30, thereby enabling projection of video in a more suitable manner (for example, projection of video with a higher resolution or pixel count).
 For example, FIG. 4 is an explanatory diagram for describing another example of the schematic operation of the information processing apparatus 1 according to the present embodiment. In the example shown in FIG. 4, the information processing apparatus 1 forms a single display region by joining the projection regions R11a to R11d of the output units 30a to 30d, and projects video (in other words, display information) onto that display region by causing the output units 30a to 30d to cooperate with one another.
 More specifically, the information processing apparatus 1 joins the projection regions R11a to R11d as shown in FIG. 4 by controlling the direction in which each of the output units 30a to 30d projects video and the size of the projected video (in other words, the width of the angle of view). The information processing apparatus 1 also divides the video (display information) to be projected into partial videos according to the joining relationship among the projection regions R11a to R11d, and causes the output unit 30 corresponding to each projection region R11 to output the corresponding partial video. Through such control, the partial videos projected from the output units 30a to 30d are joined on the projection surface R10 according to the joining relationship among the projection regions R11a to R11d, forming a single video.
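The division into partial videos can be sketched for the 2x2 arrangement of FIG. 4. This is an illustrative assumption about the joining relationship (the publication does not fix the layout); a frame is represented here simply as a list of pixel rows.

```python
def split_into_quadrants(frame):
    """Split a frame (list of rows of pixels) into four partial videos for
    an assumed 2x2 joining of projection regions:

        R11a | R11b
        -----+-----
        R11c | R11d
    """
    h, w = len(frame), len(frame[0])
    mh, mw = h // 2, w // 2
    return {
        "R11a": [row[:mw] for row in frame[:mh]],  # top-left partial video
        "R11b": [row[mw:] for row in frame[:mh]],  # top-right partial video
        "R11c": [row[:mw] for row in frame[mh:]],  # bottom-left partial video
        "R11d": [row[mw:] for row in frame[mh:]],  # bottom-right partial video
    }
```

Each partial video is then handed to the output unit whose projection region occupies the corresponding quadrant, so the four projections reassemble the original frame on the projection surface R10.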
 At this time, the resolution and pixel count of the video projected onto each of the projection regions R11a to R11d correspond to the resolution and pixel count of the output unit 30 corresponding to that projection region R11. Therefore, in the example shown in FIG. 4, if the output units 30a to 30d have equal resolutions and pixel counts, it is possible to project video with four times the resolution and pixel count compared with projecting video of the same overall size with a single output unit 30.
 As shown in FIG. 4, the information processing apparatus 1 may also control the positions at which the projection regions R11a to R11d are projected so that adjacent projection regions R11 at least partially overlap. Such control makes the boundaries between adjacent projection regions R11 (in other words, the differences in luminance value between the projection regions R11) less noticeable.
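One common way to exploit such an overlap is linear edge blending; the publication only states that the overlap hides the boundary, so the following weighting scheme is an assumption added for illustration. Each projector attenuates its columns inside the overlap band so that the two contributions always sum to full intensity.

```python
def blend_weight(x: int, width: int, overlap: int,
                 overlaps_left: bool, overlaps_right: bool) -> float:
    """Per-column intensity weight for one projector's partial video.

    Inside an overlap band the weight ramps linearly between 0 and 1, so
    the weights of the two overlapping projectors sum to 1 at every column
    and the luminance step at the seam disappears.
    """
    if overlaps_left and x < overlap:
        return (x + 0.5) / overlap          # fade in on the left edge
    if overlaps_right and x >= width - overlap:
        return (width - x - 0.5) / overlap  # fade out on the right edge
    return 1.0                              # full intensity elsewhere
```

Here column `x` of the left projector's overlap band corresponds to column `x - (width - overlap)` of the right projector's band, which is why the two ramps are mirror images of each other.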
 Note that the information processing apparatus 1 according to the present embodiment may be configured to selectively switch between an operation (operation mode) in which each output unit 30 is controlled individually as shown in FIG. 3 and an operation (operation mode) in which two or more output units 30 cooperate with one another as shown in FIG. 4. In other words, the information processing apparatus 1 according to the present embodiment may be configured to selectively switch among a plurality of operation modes that differ from one another in the manner in which the projection regions R11 corresponding to the plurality of output units 30 are overlapped. In this description, the manners of overlapping the projection regions R11 corresponding to the plurality of output units 30 also include, as one such manner, the case in which the projection regions R11 corresponding to the respective output units 30 are not overlapped, as shown in FIG. 3.
 As long as the information processing apparatus 1 according to the present embodiment can selectively switch among operation modes such as those shown in FIGS. 3 and 4 as appropriate, the trigger for switching the operation mode is not particularly limited.
 As a specific example, the information processing apparatus 1 may recognize various user operations (for example, gesture operations such as tap and pinch-in/pinch-out) on the basis of the input information acquired by the input unit 10, and switch the operation mode according to the content of the operation. Of course, as long as the information processing apparatus 1 can recognize the operation mode to switch to from the content of the user's operation, the type of operation is not particularly limited. As a specific example, the information processing apparatus 1 may switch the operation mode on the basis of the content of an operation performed via various operation devices such as a mouse, buttons, or a touch panel.
 As another example, the information processing apparatus 1 may switch to a more suitable operation mode according to information about the video to be projected (for example, information about the content to be projected). As a specific example, when projecting video whose resolution is higher than the resolution of each output unit 30, the information processing apparatus 1 may switch to an operation mode in which two or more output units 30 cooperate so that a plurality of projection regions R11 are joined to form a single display region, as shown in FIG. 4. At this time, the information processing apparatus 1 may also switch the number of output units 30 made to cooperate with one another according to the resolution of the video to be projected.
 As yet another example, the information processing apparatus 1 may recognize the state of the external environment according to the detection results of detection devices such as various sensors, and switch to a more suitable operation mode according to the recognition result of the external environment. As a specific example, the information processing apparatus 1 may recognize the position and orientation of the user relative to the projection surface R10 according to the detection result of the detection device, and determine the operation mode according to the recognition result.
 As described above, the information processing apparatus 1 according to the present embodiment is configured to be able to selectively switch the manner in which the projection regions R11 corresponding to two or more output units 30 are overlapped. Based on this configuration, the information processing apparatus 1 according to the present embodiment can project video in a manner better suited to the usage scene by selectively switching the operation mode according to the usage scene, which depends on the intended use, the form of use, the type of video (content) to be projected, and so on. Furthermore, with this configuration, the information processing apparatus 1 according to the present embodiment can realize functions or performance equivalent to those of a higher-performance output unit (projector) by making a plurality of output units 30 cooperate, which in some cases also makes it possible to reduce cost and size.
 The overview of the information processing apparatus 1 according to the present embodiment has been described above with reference to FIGS. 1 to 4. In the following, the information processing apparatus 1 according to the present embodiment will be described in further detail.
 <2. Functional configuration>
 First, an example of the functional configuration of the information processing apparatus 1 according to the present embodiment will be described with reference to FIG. 5. FIG. 5 is a block diagram showing an example of the functional configuration of the information processing apparatus 1 according to the present embodiment.
 As shown in FIG. 5, the information processing apparatus 1 according to the present embodiment includes the input unit 10, a control unit 20, a storage unit 40, and a plurality of output units 30. In the example shown in FIG. 5, the information processing apparatus 1 includes output units 30a and 30b as the plurality of output units 30. The input unit 10 and the output units 30a and 30b shown in FIG. 5 correspond to the input unit 10 and the output units 30a and 30b in the example shown in FIG. 2.
 As shown in FIG. 5, the input unit 10 includes, for example, an imaging unit 11. The input unit 10 may also include a detection unit 13.
 The imaging unit 11 corresponds to the configuration for capturing an image of the projection surface R10 described above with reference to FIG. 2, and may be configured, for example, as a camera including a single imaging optical system or as a stereo camera including two imaging optical systems. The imaging unit 11 outputs the captured image of the projection surface R10 to the control unit 20.
 The detection unit 13 corresponds to detection devices such as various sensors for detecting various states. As a specific example, the detection unit 13 may be configured as a so-called depth sensor that measures the distance between the information processing apparatus 1 and an object located within a predetermined detection range originating from the information processing apparatus 1 (for example, an operating body such as a hand, or the projection surface R10), and generates control information indicating the measurement result. As another example, the detection unit 13 may be configured as a sensor for detecting a user, such as a so-called optical sensor or human presence sensor, and generate control information indicating the detection result of a user located near the projection surface R10. The detection unit 13 then outputs control information indicating the detection results of the various states to the control unit 20.
 The output unit 30 includes a video output unit 31. The video output unit 31 corresponds to the configuration within the output unit 30 for projecting video (that is, the projection unit), and projects video onto at least part of a desired projection surface (that is, onto the projection region R11 within the projection surface R10) under the control of the control unit 20 described later. The video output unit 31 is configured to be able to control the direction in which it projects video and the size of the projected video (in other words, the width of the angle of view).
 In the example shown in FIG. 5, the video output unit 31a corresponds to the video output unit 31 of the output unit 30a, and the video output unit 31b corresponds to the video output unit 31 of the output unit 30b.
 The control unit 20 includes an image analysis unit 21, an input analysis unit 22, a process execution unit 23, a mode determination unit 25, and an output control unit 26.
 The image analysis unit 21 acquires the captured image of the projection surface R10 from the imaging unit 11. By performing image analysis on the acquired image, the image analysis unit 21 detects an operating body, such as the user's hand, present on the projection surface R10. The image analysis unit 21 may also detect a user located near the projection surface R10 by analyzing the acquired image. The image analysis unit 21 then outputs the analysis result of the image of the projection surface R10 (for example, the detection result of an operating body such as the user's hand) to the input analysis unit 22. The image analysis unit 21 may also output the analysis result of the image of the projection surface R10 (for example, the detection result of the user, the user's hand, or the like) to the mode determination unit 25 described later.
 The input analysis unit 22 acquires the analysis result of the image of the projection surface R10 from the image analysis unit 21. Based on the acquired analysis result, the input analysis unit 22 extracts an operating body, such as the user's hand, from the image of the projection surface R10. The input analysis unit 22 then recognizes the content of the user operation performed with the operating body on the basis of the time-series movement of the extracted operating body. At this time, by calibrating in advance so that the coordinates on the projection surface R10 onto which display information is projected match the contact coordinates of an operating body such as the user's hand on the projection surface R10, the input analysis unit 22 can, for example, detect which part of the projected display information (for example, a GUI) the operating body has touched.
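The calibration and hit test just described can be sketched minimally. The affine form of the calibration and the widget-rectangle representation are assumptions for illustration; the publication only specifies that camera coordinates and projection-surface coordinates are calibrated in advance.

```python
# Illustrative sketch (assumed, not from the publication): map a touch
# detected in camera pixels to projection-surface coordinates via a
# pre-computed 2x3 affine calibration, then test it against projected GUI
# widgets.

def apply_affine(m, point):
    """Apply a 2x3 affine matrix [[a, b, tx], [c, d, ty]] to (x, y)."""
    x, y = point
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

def hit_test(touch_cam_px, calib, widgets):
    """Return the name of the projected GUI widget containing the touch,
    or None. widgets: {name: (x0, y0, x1, y1)} in surface coordinates."""
    sx, sy = apply_affine(calib, touch_cam_px)
    for name, (x0, y0, x1, y1) in widgets.items():
        if x0 <= sx <= x1 and y0 <= sy <= y1:
            return name
    return None
```

A full affine calibration would be fitted from a few known correspondences between camera and surface coordinates; here the matrix is simply taken as given.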
 Based on the recognized operation content, the input analysis unit 22 recognizes the content of the user's instruction conveyed by that operation, identifies the target function (for example, an application) or content according to the instruction, and outputs control information indicating that function or content to the process execution unit 23.
 The input analysis unit 22 may also output information indicating the recognized operation content (for example, the content indicated by a gesture operation) to the mode determination unit 25.
 The process execution unit 23 is a component for executing various functions. The process execution unit 23 acquires control information indicating a function or content from the input analysis unit 22. Based on the acquired control information, the process execution unit 23 reads from the storage unit 40 various data for executing the target function (for example, a library for executing an application, or content data). The storage unit 40 is a storage component in which various data for the process execution unit 23 to execute various functions are stored.
 The process execution unit 23 then executes the function indicated by the acquired control information on the basis of the various data read from the storage unit 40.
 Note that the process execution unit 23 may output control information indicating the execution result of the function identified based on the user's operation to the output control unit 26 described later. This allows the output control unit 26 to present the execution result of the function to the user by causing at least one of the output units 30a and 30b (more specifically, the video output units 31a and 31b) to project information indicating that execution result onto the projection surface R10.
 The process execution unit 23 may also output various information corresponding to the execution of the function identified based on the user's operation to the mode determination unit 25 described later. As a specific example, when the process execution unit 23 executes a function related to playback of image content such as a still image or a moving image, it may output information about that image content (for example, attribute information such as resolution and pixel count) to the mode determination unit 25. This allows the mode determination unit 25 to recognize various information such as the resolution and pixel count of the played-back image content.
 The mode determination unit 25 switches the operation mode for controlling the operation of the plurality of output units 30 to one of a plurality of preset operation modes according to various kinds of input information. Examples of such operation modes include the operation mode described with reference to FIG. 3, in which the operation of each of the plurality of output units 30 is controlled individually, and the operation mode described with reference to FIG. 4, in which at least two or more of the plurality of output units 30 are made to cooperate with one another. Another example of an operation mode by which the information processing apparatus 1 according to the present embodiment controls the operation of the plurality of output units 30 will be described separately later as a modification.
 For example, the mode determination unit 25 may determine the operation mode on the basis of the recognition result of a user operation. In this case, the mode determination unit 25 may acquire, from the input analysis unit 22, information indicating the recognition result of the content of the user operation (for example, the content indicated by a gesture operation) and determine the operation mode in accordance with the operation content indicated by that information.
 Note that, as long as the user can instruct the information processing apparatus 1 as to the operation mode, the type of operation used to give that instruction is not particularly limited. As a specific example, the information processing apparatus 1 may cause the output units 30 to project video presenting a UI for designating an operation mode, and determine the operation mode on the basis of a user operation on that UI. In this case, the mode determination unit 25 can recognize the operation mode designated by the user by comparing the position on the projection plane R10 at which the UI is projected with the position of the user's operation (for example, a tap operation) on the projection plane R10.
 As another example, the mode determination unit 25 may acquire various kinds of information corresponding to the execution of various functions by the process execution unit 23 and determine the operation mode on the basis of that information. As a specific example, the mode determination unit 25 may acquire information about the reproduced content (for example, attribute information such as the resolution and the number of pixels) from the process execution unit 23 and determine the operation mode on the basis of the acquired information. In this case, when the resolution of the reproduced content is higher than the resolution of each output unit 30, the mode determination unit 25 may select, as shown in FIG. 4, the operation mode in which the projection areas of two or more output units 30 are combined to form a single display area (that is, the operation mode in which two or more output units 30 cooperate).
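 As an illustrative sketch only (it does not appear in the specification), the resolution-based branch described above can be expressed as a simple comparison; the names `OperationMode` and `choose_mode` and the width/height rule are assumptions introduced here for illustration:

```python
from enum import Enum, auto

class OperationMode(Enum):
    INDIVIDUAL = auto()  # control each output unit 30 separately (FIG. 3)
    COMBINED = auto()    # join projection areas into one display area (FIG. 4)

def choose_mode(content_w, content_h, unit_w, unit_h):
    """Pick an operation mode from content attribute information.

    Hypothetical rule: if the reproduced content exceeds the resolution a
    single output unit 30 can project, select the mode in which two or
    more units cooperate and their projection areas are combined.
    """
    if content_w > unit_w or content_h > unit_h:
        return OperationMode.COMBINED
    return OperationMode.INDIVIDUAL

# 4K content on HD-class output units triggers the cooperative mode:
print(choose_mode(3840, 2160, 1920, 1080))
```

The mode determination unit 25 would then pass the chosen mode on to the output control unit 26, as described above.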
 As yet another example, the mode determination unit 25 may determine the operation mode on the basis of the detection result from the detection unit 13 or the image analysis result from the image analysis unit 21. As a specific example, the mode determination unit 25 may recognize the positions and orientations of users present around the projection plane R10 on the basis of the detection result from the detection unit 13 or the image analysis result from the image analysis unit 21, and determine the operation mode in accordance with the recognition result.
 As a more specific example, when a plurality of users are detected, the mode determination unit 25 may select the operation mode in which the operations of the plurality of output units 30 are controlled individually, as shown in FIG. 3. In this case, the mode determination unit 25 may, for example, assign one of the plurality of output units 30 to each detected user and generate setting information corresponding to that operation mode so that each output unit 30 projects the display information to be presented to its assigned user in the vicinity of that user.
 At this time, the mode determination unit 25 may also determine which output unit 30 to assign to each user, and the direction and angle of view at which that output unit 30 projects video, in accordance with the positional relationship between each output unit 30 and each detected user. In this case, the mode determination unit 25 may, for example, determine the assignments, projection directions, and angles of view so that the video projected from each output unit 30 onto the projection plane R10 is not occluded by the detected users (in other words, so that no user stands between an output unit 30 and the projection area R11 corresponding to that output unit 30).
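 One hedged way to sketch this occlusion-avoiding assignment is a simple 2-D geometric check: a unit is usable for a user only if no detected user stands near the line from that unit to its projection area R11. The function names, the greedy strategy, and the `clearance` threshold are all assumptions for illustration, not part of the specification:

```python
import math

def point_to_segment_distance(p, a, b):
    """Distance from 2-D point p to the segment from a to b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def assign_units(users, units, region_centers, clearance=0.5):
    """Greedily assign each detected user an unoccluded output unit 30.

    A unit/region pair is considered occluded when any detected user is
    within `clearance` of the projection path from the unit to the
    center of its projection area R11.
    """
    assignment = {}
    for ui, user in enumerate(users):
        for ki, (unit, center) in enumerate(zip(units, region_centers)):
            if ki in assignment.values():
                continue  # each unit serves at most one user
            blocked = any(
                point_to_segment_distance(other, unit, center) < clearance
                for other in users
            )
            if not blocked:
                assignment[ui] = ki
                break
    return assignment
```

With two users standing clear of both projection paths, each user receives a distinct unit; when a user stands on a unit's path, that unit is skipped.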
 As described above, the mode determination unit 25 determines the operation mode for controlling the operations of the plurality of output units 30 in accordance with various kinds of input information, and outputs information indicating the determined operation mode to the output control unit 26. At this time, the mode determination unit 25 may also output, to the output control unit 26, setting information corresponding to the determined operation mode for controlling the operation of each output unit 30 in that mode (for example, information indicating the direction and angle of view at which each output unit 30 projects video).
 The output control unit 26 is a component for controlling the operation of the output units 30 (in particular, the video output units 31). The output control unit 26 acquires, from the mode determination unit 25, information indicating the operation mode for controlling the operations of the plurality of output units 30, and controls the operation of each output unit 30 (for example, the direction in which video is projected, the angle of view, and the content of the projected video) in accordance with the operation mode indicated by the acquired information.
 The output control unit 26 also causes each output unit 30 to project display information to be presented to the user in accordance with the determined operation mode. At this time, the output control unit 26 may acquire, from the process execution unit 23, control information indicating the execution result of the function specified on the basis of the user operation. In this case, the output control unit 26 may cause each output unit 30 to project display information indicating that execution result in accordance with the determined operation mode.
 As a more specific example, consider the case where the operation mode in which the operations of the plurality of output units 30 are controlled individually, as shown in FIG. 3, has been selected. In this case, the output control unit 26 acquires from the mode determination unit 25, for example, setting information indicating the projection direction and angle of view of each output unit 30, determined according to the detected position and orientation of each user. On the basis of the acquired setting information, the output control unit 26 then controls the operation of each output unit 30 (for example, its projection direction and angle of view) and causes that output unit 30 to project display information for the target user (for example, information indicating the result of an operation on the video (display information) projected by that output unit 30).
 As another example, consider the case where the operation mode has been selected in which at least two or more of the plurality of output units 30 cooperate and their respective projection areas are combined to form a single display area, as shown in FIG. 4. In this case, the output control unit 26 controls the projection direction and angle of view of each of the two or more cooperating output units 30 so that their respective projection areas R11 are joined to each other. The output control unit 26 then divides the video (display information) to be projected into a plurality of partial images in accordance with the joining relationship between the projection areas R11 of the two or more output units 30, and causes the output unit 30 corresponding to each projection area R11 to output the corresponding partial image.
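 As a minimal sketch of this division into partial images (assuming, purely for illustration, that the joined projection areas tile the display area as a uniform grid), the split can be written as:

```python
def split_for_tiling(image, rows, cols):
    """Split a projection target into one partial image per output unit 30.

    `image` is a 2-D list of pixel values; the combined display area is
    assumed to be tiled by a rows x cols grid of projection areas R11,
    with the image height/width evenly divisible by the grid.
    """
    tile_h = len(image) // rows
    tile_w = len(image[0]) // cols
    tiles = []
    for r in range(rows):
        for c in range(cols):
            tile = [row[c * tile_w:(c + 1) * tile_w]
                    for row in image[r * tile_h:(r + 1) * tile_h]]
            tiles.append(tile)
    return tiles  # tiles[k] is sent to the unit covering region k
```

For a 1 x 2 arrangement, `tiles[0]` is the left half projected by one unit and `tiles[1]` the right half projected by the other.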
 An example of the functional configuration of the information processing apparatus 1 according to the present embodiment has been described above with reference to FIG. 5. Note that the configuration shown in FIG. 5 is merely an example; as long as the various operations of the information processing apparatus 1 described above can be realized, the functional configuration of the information processing apparatus 1 is not necessarily limited to the example shown in FIG. 5. As a specific example, at least one of the input unit 10 and the plurality of output units 30 may be externally attached to the information processing apparatus 1 as an external device separate from the information processing apparatus 1. As another example, at least part of the control unit 20 may be provided in an external device separate from the information processing apparatus 1 (for example, a server connected via a network).
 <3. Processing>
 Next, an example of the flow of a series of processes performed by the information processing apparatus 1 according to the present embodiment will be described with reference to FIG. 6, focusing in particular on the operations in which the information processing apparatus 1 determines an operation mode and projects video onto the projection plane R10 in accordance with the determined operation mode. FIG. 6 is a flowchart showing an example of the flow of a series of operations of the information processing apparatus 1 according to the present embodiment.
 (Steps S101 and S103)
 First, the information processing apparatus 1 analyzes various kinds of input information (step S101) and, on the basis of the analysis result, determines the operation mode for controlling the operations of the plurality of output units 30 (step S103).
 As a specific example, the mode determination unit 25 may determine the operation mode on the basis of the recognition result of a user operation. In this case, the image analysis unit 21 acquires an image of the projection plane R10 from the imaging unit 11 and performs image analysis on that image, thereby detecting an operating body, such as the user's hand, present on the projection plane R10. The image analysis unit 21 then outputs the analysis result of the image of the projection plane R10 (for example, the detection result of an operating body such as the user's hand) to the input analysis unit 22.
 The input analysis unit 22 acquires the analysis result of the image of the projection plane R10 from the image analysis unit 21 and, on the basis of that analysis result, recognizes the content of the user operation (for example, the content of a gesture operation performed with an operating body such as a hand). The input analysis unit 22 then outputs information indicating the recognized operation content (for example, the content indicated by the gesture operation) to the mode determination unit 25 (step S101).
 In this case, the mode determination unit 25 acquires the information indicating the recognition result of the content of the user operation from the input analysis unit 22 and determines the operation mode in accordance with the operation content indicated by that information (step S103).
 As another example, the mode determination unit 25 may acquire information about the reproduced content (for example, attribute information such as the resolution and the number of pixels) from the process execution unit 23 and determine the operation mode on the basis of the acquired information. In this case, when the resolution of the reproduced content is higher than the resolution of each output unit 30 (step S101), the mode determination unit 25 may select, as shown in FIG. 4, the operation mode in which the projection areas of two or more output units 30 are combined to form a single display area (that is, the operation mode in which two or more output units 30 cooperate) (step S103).
 As described above, the mode determination unit 25 determines the operation mode for controlling the operations of the plurality of output units 30 in accordance with various kinds of input information, and outputs information indicating the determined operation mode to the output control unit 26.
 (Step S105)
 The output control unit 26 acquires, from the mode determination unit 25, information indicating the operation mode for controlling the operations of the plurality of output units 30, and controls the operation of each output unit 30 (for example, the direction in which video is projected, the angle of view, and the content of the projected video) in accordance with the operation mode indicated by the acquired information.
 (Step S107)
 The output control unit 26 then causes each output unit 30 to project display information to be presented to the user in accordance with the determined operation mode. At this time, the output control unit 26 may acquire control information indicating the execution result of the function specified on the basis of the user operation from the process execution unit 23 and, on the basis of that control information, cause each output unit 30 to project display information indicating that execution result in accordance with the determined operation mode.
 An example of the flow of a series of processes performed by the information processing apparatus 1 according to the present embodiment has been described above with reference to FIG. 6, focusing in particular on the operations in which the information processing apparatus 1 determines an operation mode and projects video onto the projection plane R10 in accordance with the determined operation mode.
 <4. Modifications>
 Next, as modifications of the information processing apparatus 1 according to the present embodiment, examples of control in which the information processing apparatus 1 causes at least two or more of the plurality of output units 30 to cooperate with each other will be described.
 [4.1. Modification 1: Control example for projecting high-frame-rate video]
 First, as Modification 1, an example of the operation of the information processing apparatus 1 will be described for the case where two or more output units 30 are made to cooperate to realize the projection of a moving image at a frame rate higher than the frame rate achievable by each output unit 30 on its own.
 In the information processing apparatus 1 according to Modification 1, the projection direction and angle of view of each of the two or more cooperating output units 30 are adjusted so that their respective projection areas R11 overlap on the projection plane R10. In the following description, the information processing apparatus 1 is assumed to make the output units 30a and 30b cooperate with each other as the two or more output units 30.
 For example, FIG. 7 is an explanatory diagram for describing an overview of the operation of the information processing apparatus 1 according to Modification 1, and shows an example of a state in which the projection area R11a corresponding to the output unit 30a and the projection area R11b corresponding to the output unit 30b are superimposed on the projection plane R10.
 The information processing apparatus 1 then controls the operations of the two or more cooperating output units 30 (for example, the output units 30a and 30b) so that they project video at mutually different timings.
 For example, FIG. 8 is an explanatory diagram for describing an overview of the operation of the information processing apparatus 1 according to Modification 1. In the example shown in FIG. 8, the information processing apparatus 1 makes the output units 30a and 30b cooperate with each other, thereby realizing the projection of a moving image at a frame rate higher than the frame rate achievable by each of the output units 30a and 30b on its own.
 Specifically, reference signs F10a to F12a and F10b to F12b schematically denote the frames of the moving image to be projected, and the horizontal direction in FIG. 8 indicates time t. That is, FIG. 8 schematically shows the order in which the frames F10a to F12a and F10b to F12b of the moving image to be projected are projected.
 Here, in the information processing apparatus 1 according to Modification 1, of the series of frames F10a to F12a and F10b to F12b shown in FIG. 8, the frames F10a to F12a are projected by the output unit 30a, and the frames F10b to F12b are projected by the output unit 30b.
 That is, the information processing apparatus 1 first causes the output unit 30a to project the frame F10a onto the projection area R11, and then causes the output unit 30b to project the frame F10b onto the projection area R11. After the frame F10b has been projected, the information processing apparatus 1 causes the output unit 30a to project the frame F11a onto the projection area R11, and then causes the output unit 30b to project the frame F11b onto the projection area R11.
 In this way, the information processing apparatus 1 superimposes the projection areas R11 of the output units 30a and 30b and causes the output units 30a and 30b to project the frames of the moving image alternately in time series. With this configuration, the information processing apparatus 1 according to Modification 1 can project a moving image at a frame rate higher than the frame rate achievable by each output unit 30 on its own.
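 The alternating schedule above can be sketched as a simple round-robin distribution of the time-ordered frames; the function name `interleave_frames` is an assumption for illustration:

```python
def interleave_frames(frames, num_units=2):
    """Distribute a time-ordered frame sequence across output units whose
    projection areas R11 overlap on the projection plane R10.

    Each unit then projects only every num_units-th frame (running at a
    fraction of the content frame rate), while the superimposed result
    on the projection plane plays back at the full frame rate.
    """
    schedule = {k: [] for k in range(num_units)}
    for i, frame in enumerate(frames):
        schedule[i % num_units].append(frame)
    return schedule

# Frames F10a, F10b, F11a, F11b, ... alternate between units 30a and 30b:
print(interleave_frames(["F10a", "F10b", "F11a", "F11b", "F12a", "F12b"]))
```

Unit 30a thus receives F10a, F11a, F12a and unit 30b receives F10b, F11b, F12b, matching the order described for FIG. 8.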
 Note that, for example, when a moving image with a frame rate higher than the frame rate achievable by each output unit 30 becomes the projection target, the information processing apparatus 1 may control the operation of each output unit 30 on the basis of the operation mode for realizing the high frame rate described above.
 As described above, as Modification 1, an example of the operation of the information processing apparatus 1 for realizing the projection of a moving image at a frame rate higher than the frame rate achievable by each output unit 30 on its own, by making two or more output units 30 cooperate, has been described with reference to FIGS. 7 and 8.
 [4.2. Modification 2: Control example 1 for projecting high-quality video]
 Next, as Modification 2, an example of the operation of the information processing apparatus 1 will be described for the case where two or more output units 30 are made to cooperate to realize the projection of an image with higher quality than the image each output unit 30 can project on its own. In the following description, the information processing apparatus 1 is assumed to make the output units 30a and 30b cooperate with each other.
 For example, the information processing apparatus 1 according to Modification 2 may realize the projection of an image with a wider dynamic range by making the output units 30a and 30b cooperate with each other.
 Specifically, as shown in FIG. 7, the information processing apparatus 1 controls the operations of the output units 30a and 30b so that their respective projection areas R11 (that is, the projection areas R11a and R11b) overlap each other.
 At this time, video is projected from each of the output units 30a and 30b onto the region where the projection areas R11a and R11b overlap, and the two projections are superimposed there. As a result, the gradation of the video projected onto the region where the projection areas R11a and R11b overlap is the sum of the gradation of the video projected from the output unit 30a and the gradation of the video projected from the output unit 30b.
 That is, by individually controlling the amount of light (for example, the luminance) emitted from the light source of each of the output units 30a and 30b, the information processing apparatus 1 can project an image expressed with a dynamic range wider than the dynamic range each output unit 30 can realize on its own.
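 Because the gradations of the superimposed projections add on the projection plane, a target level up to twice the per-unit maximum can be split between the two units. The following sketch illustrates this additive split; the function name and the simple "fill unit 30a first" policy are assumptions introduced here for illustration:

```python
def split_gradation(target, unit_max=255):
    """Split a target gradation value between two superimposed units.

    Light from the overlapping projection areas R11a and R11b adds on
    the projection plane R10, so a level up to 2 * unit_max can be
    reproduced even though each unit alone tops out at unit_max.
    """
    if not 0 <= target <= 2 * unit_max:
        raise ValueError("target outside the combined dynamic range")
    a = min(target, unit_max)  # contribution from unit 30a
    b = target - a             # unit 30b covers the remainder
    return a, b

print(split_gradation(300))  # → (255, 45): the sum on the surface is 300
```

In practice the split could instead be balanced between the units; the point is only that the two contributions sum to the target gradation.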
 Note that, for example, when an image expressed with a dynamic range wider than the dynamic range each output unit 30 can realize on its own becomes the projection target, the information processing apparatus 1 may control the operation of each output unit 30 on the basis of the operation mode for realizing the wide-dynamic-range expression described above.
 As another example, the information processing apparatus 1 according to Modification 2 may realize the projection of an image with a wider color gamut by making the output units 30a and 30b cooperate with each other.
 Specifically, as shown in FIG. 7, the information processing apparatus 1 controls the operations of the output units 30a and 30b so that their respective projection areas R11 (that is, the projection areas R11a and R11b) overlap each other.
 The information processing apparatus 1 then controls the output from the light source of each of the output units 30a and 30b (that is, the light emitted from each light source) so that the color gamuts of the light emitted from the output units 30a and 30b differ from each other. Note that the method is not particularly limited as long as the color gamut of the light emitted from each of the output units 30a and 30b can be controlled. As a specific example, the information processing apparatus 1 may control the color gamut of the light emitted from an output unit 30 by controlling the output of its light source. As another example, the information processing apparatus 1 may control the color gamut of the light emitted from an output unit 30 by controlling the optical system (for example, a filter) that the light emitted from the light source passes through before being emitted to the outside of the output unit 30.
 Through such control, video expressed in different color gamuts is projected from each of the output units 30a and 30b and superimposed on the projection plane R10. With this configuration, the information processing apparatus 1 according to Modification 2 can project an image expressed in a color gamut wider than the color gamut each output unit 30 can express on its own.
 Note that, for example, when an image expressed in a color gamut wider than the color gamut each output unit 30 can express on its own becomes the projection target, the information processing apparatus 1 may control the operation of each output unit 30 on the basis of the operation mode for realizing the wide-gamut expression described above.
 [4.3. Modification 3: Control example 2 for projecting high-quality video]
 Next, as Modification 3, an example of the operation of the information processing apparatus 1 for realizing the projection of an image with a resolution higher than the resolution of each output unit 30 will be described.
 For example, FIG. 9 is an explanatory diagram for describing an example of the operation of the information processing apparatus 1 according to Modification 3, and shows an example in which the output units 30a and 30b are made to cooperate with each other to realize the projection of a higher-resolution image. FIG. 9 schematically shows the positional relationship of the pixels of the images projected onto the projection plane R10 from the output units 30a and 30b.
 In FIG. 9, reference sign R13a schematically denotes a pixel of the image projected onto the projection plane R10 by the output unit 30a (that is, a pixel of the video projected onto the projection area R11a). Similarly, reference sign R13b schematically denotes a pixel of the image projected onto the projection plane R10 by the output unit 30b (that is, a pixel of the video projected onto the projection area R11b).
 That is, in the example shown in FIG. 9, the information processing apparatus 1 adjusts the position of each projection area R11 so that the difference between the position of the projection area R11a corresponding to the output unit 30a and the position of the projection area R11b corresponding to the output unit 30b is half a pixel (or a sub-pixel unit).
 Under such control, the information processing apparatus 1 superimposes the images projected from the output units 30a and 30b on the projection plane R10, thereby making it possible to simulate the projection of an image with a higher resolution than that of each output unit 30 (for example, an image with four times the resolution in the example shown in FIG. 9).
 Note that the information processing apparatus 1 may generate the pixel R13a projected by the output unit 30a and the pixel R13b projected by the output unit 30b based on the pixels in the image to be projected. As a specific example, the information processing apparatus 1 may generate the pixel R13a projected by the output unit 30a based on the pixels in the region of the projection-target image corresponding to the pixel R13a, or on pixels located in the vicinity of that region. The same applies to the pixel R13b projected by the output unit 30b.
 In addition, an example of various filter processes by which the information processing apparatus 1 generates the images to be projected by the output units 30a and 30b as shown in FIG. 9 is disclosed in, for example, Niranjan Damera-Venkata and Nelson L. Chang, "Realizing Super-Resolution with Superimposed Projection," IEEE International Workshop on Projector-Camera Systems (ProCams), 18 June 2007, Minneapolis, MN.
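 As a rough illustration of how the two half-pixel-offset frames might be derived from a single high-resolution source, the following sketch uses simple 2x2 block averaging (a minimal NumPy sketch with assumed function names and a fixed 2x factor; the paper cited above describes more elaborate filter designs):

```python
import numpy as np

def derive_projector_frames(target):
    """Derive the two low-resolution frames projected by the output units
    30a and 30b from a high-resolution target image (2H x 2W), assuming
    30b's projection region is offset from 30a's by half a low-resolution
    pixel diagonally. Each low-resolution pixel is generated from the 2x2
    block of target pixels its footprint covers (the "corresponding region
    and its neighborhood" mentioned in the text)."""
    h, w = target.shape[0] // 2, target.shape[1] // 2
    # Frame for 30a: average each grid-aligned 2x2 block of the target.
    frame_a = target.reshape(h, 2, w, 2).mean(axis=(1, 3))
    # Frame for 30b: the same averaging applied to the target shifted by one
    # target pixel (= half a low-resolution pixel) in each direction.
    shifted = np.roll(target, (-1, -1), axis=(0, 1))
    frame_b = shifted.reshape(h, 2, w, 2).mean(axis=(1, 3))
    return frame_a, frame_b
```

 Optically superimposing the two frames at their offset positions then yields four distinct sample positions per low-resolution pixel, which is what allows the fourfold resolution described above to be simulated.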
 The control shown in FIG. 9 can also be realized by a single output unit 30. In this case, the information processing apparatus 1 may, for example, project the pixels R13a for the projection region R11a and the pixels R13b for the projection region R11b in a time-division manner.
 FIG. 10 is an explanatory diagram for describing another example of the operation of the information processing apparatus 1 according to Modification 3, and shows an example of control for realizing projection of an image with a higher resolution than that of each output unit 30 by a method different from the control shown in FIG. 9.
 In the example shown in FIG. 10, the video to be projected is divided into a plurality of partial images, and each partial image is projected in a time-division manner, thereby realizing projection of an image with a higher resolution than that of each output unit 30. Specifically, in the example shown in FIG. 10, reference symbol R20 schematically indicates the range over which the video to be projected is projected, and reference symbols R111 to R114 schematically indicate the projection regions onto which the output unit 30 projects video in a time-division manner.
 That is, in the example shown in FIG. 10, the information processing apparatus 1 divides the image to be projected into four partial images and controls the operation of the output unit 30 (for example, the projection direction, the angle of view, and the partial image to be projected) so that the corresponding partial images are sequentially projected onto the projection regions R111 to R114 in a time-division manner.
 As described above, a so-called projector can control the size of the video projected onto the projection plane R10 by controlling the angle of view, and the resolution of the video does not change even when its size is changed. Therefore, when the angle of view is narrowed so that the video becomes smaller, the size of each pixel in the projected video becomes smaller and the interval between pixels becomes shorter. In the example shown in FIG. 10, the information processing apparatus 1 utilizes this characteristic of the projector and projects each partial image at the resolution of the output unit 30 in a time-division manner, thereby projecting the video to be projected at a resolution higher than that of the output unit 30 (that is, four times the resolution).
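 The division into four partial images and their time-division schedule can be sketched as follows (a minimal Python illustration with assumed names; the mechanical control of the projection direction and angle of view per time slot is outside the sketch):

```python
import numpy as np

def partial_images(image):
    """Split the projection-target image into the four partial images that
    are projected time-divisionally onto the projection regions R111-R114
    (here assumed to be the four quadrants: top-left, top-right,
    bottom-left, bottom-right)."""
    h, w = image.shape[0] // 2, image.shape[1] // 2
    return [image[:h, :w], image[:h, w:], image[h:, :w], image[h:, w:]]

def time_division_schedule(image, regions=("R111", "R112", "R113", "R114")):
    """Pair each projection region with its partial image, in projection
    order. Each partial image is projected at the output unit's native
    resolution, so the assembled result has four times the pixel count."""
    return list(zip(regions, partial_images(image)))
```

 When two or more output units cooperate, the schedule would simply be partitioned so that no two units receive the same (region, partial image) pair in the same time slot.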
 Note that the control shown in FIG. 10 may be realized by controlling a single output unit 30, or by making two or more output units 30 cooperate with each other. When two or more output units 30 are made to cooperate, the information processing apparatus 1 may control the operation of the output units 30 so that each output unit 30 projects a different partial image (that is, so that the same partial image is not projected at the same timing).
 Note that the information processing apparatus 1 may also realize projection of an image with a higher resolution than that of each output unit 30 by combining the control shown in FIG. 9 and the control shown in FIG. 10 while making at least two or more output units 30 cooperate with each other.
 In addition, when an image expressed at a resolution higher than the individual resolution of each output unit 30 becomes the projection target, the information processing apparatus 1 may control the operation of each output unit 30 based on the operation modes for realizing high-resolution expression described above with reference to FIGS. 9 and 10.
 As described above, as Modification 3, an example of the operation of the information processing apparatus 1 for realizing projection of an image with a resolution higher than that of each output unit 30 has been described with reference to FIGS. 9 and 10.
 Needless to say, the controls of the information processing apparatus 1 described above as Modifications 1 to 3 may be combined as appropriate. As a specific example, the information processing apparatus 1 may realize projection of a higher-quality image by combining the control for realizing expression in a wider color gamut described as Modification 2 with the control for realizing expression at a higher resolution described as Modification 3. As another example, the information processing apparatus 1 may combine the control for realizing expression at a higher frame rate described as Modification 1 with the control for realizing expression at a higher resolution described as Modification 3. Needless to say, when the controls described above are combined, the number of output units 30 to be made to cooperate may be adjusted as appropriate.
[4.4. Modification 4: Control Example for Realizing Partially Higher Image Quality]
As described above, with a so-called projector, the larger the projected video, the larger the size of each pixel in the projected video and the longer the interval between pixels. Conversely, the smaller the projected video, the smaller the size of each pixel and the shorter the interval between pixels. The information processing apparatus 1 according to Modification 4 utilizes this characteristic and makes two or more output units 30 cooperate with each other, thereby realizing higher image quality for a part (for example, a portion where specific content is displayed) of a series of images (for example, an operation screen) to be projected, that is, partially higher image quality. An example of the operation of the information processing apparatus 1 according to Modification 4 will be described below.
 For example, FIG. 11 is an explanatory diagram for describing an example of the operation of the information processing apparatus 1 according to Modification 4. In this description, it is assumed that the information processing apparatus 1 makes the output units 30a and 30b cooperate with each other. In FIG. 11, reference symbol R11a indicates the projection region R11 corresponding to the output unit 30a, and reference symbol R11b indicates the projection region R11 corresponding to the output unit 30b.
 The example shown in FIG. 11 illustrates a case in which the information processing apparatus 1 projects an operation screen for operating the information processing apparatus 1 onto the projection plane R10 on the top surface of the table 140. In FIG. 11, reference symbol V11 schematically indicates a display object (for example, a window displayed based on the reproduction of content) presented in the operation screen. That is, the information processing apparatus 1 projects the entire operation screen toward the projection region R11b, and superimposes the partial region (partial image) of the operation screen in which the display object V11 is presented, as the image projected onto the projection region R11a, on the operation screen projected onto the projection region R11b.
 Specifically, the information processing apparatus 1 widens the angle of view of the output unit 30b so that the projection region R11b is formed over the entire projection plane R10, and causes the output unit 30b to project the entire operation screen to be projected onto the projection region R11b. For the output unit 30a, the information processing apparatus 1 narrows the angle of view and controls the projection direction, thereby superimposing the projection region R11a on a partial region of the projection region R11b. The information processing apparatus 1 then causes the output unit 30a to project the image of the partial region of the operation screen corresponding to the region indicated by the projection region R11a (that is, the partial image in which the display object V11 is presented).
 Through such control, the information processing apparatus 1 can partially express the region in which the display object V11 is presented at a higher resolution than that of the entire operation screen projected onto the projection region R11b.
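 The extraction of the partial image that the narrow-angle output unit 30a superimposes can be sketched as follows (a hypothetical helper; the normalized-rectangle convention used to describe R11a's placement within R11b is an assumption and is not specified in the text):

```python
import numpy as np

def partial_image_for_overlay(screen, region):
    """Crop, from the full operation screen (an H x W pixel array projected
    onto the projection region R11b), the sub-image covered by the
    projection region R11a, given as a normalized (x, y, width, height)
    rectangle in [0, 1] relative to R11b. The crop is what the output
    unit 30a projects with a narrowed angle of view, so the same screen
    area receives more projector pixels and thus higher resolution."""
    h, w = screen.shape[:2]
    x, y, rw, rh = region
    top, left = int(round(y * h)), int(round(x * w))
    bottom, right = int(round((y + rh) * h)), int(round((x + rw) * w))
    return screen[top:bottom, left:right]
```

 When the display object V11 is moved or resized, the same rectangle would simply be recomputed, matching the dynamic switching of R11a described below.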
 Note that, in the example shown in FIG. 11, the position and size of the projection region R11a formed within the projection region R11b may be switched dynamically. As a specific example, when the position or size of the display object V11 in the operation screen is changed based on an operation by the user U, the information processing apparatus 1 may change the position and size of the projection region R11a in accordance with the change.
 As described above, as Modification 4, an example of the operation of the information processing apparatus 1 for realizing higher image quality for some partial images in a series of images to be projected, by making two or more output units 30 cooperate with each other, has been described with reference to FIG. 11.
[4.5. Modification 5: Control Example for Improving Followability when Changing the Display Mode]
Next, as Modification 5, an example of control in a case where, while an image is being projected onto a part of the projection plane R10, the display mode of the image, such as its projected position or size, is changed will be described with reference to FIGS. 12 to 14. FIGS. 12 to 14 are explanatory diagrams for describing an example of the operation of the information processing apparatus 1 according to Modification 5, and show an example of control for obtaining more suitable followability when the position or size (that is, the display mode) of an image (for example, the display object V11) projected onto a part of the projection plane R10 is changed.
 For example, in the example shown in FIG. 12, the information processing apparatus 1 narrows the angle of view of the output unit 30a so that the projection region R11a becomes smaller, and projects an image in which the display object V11 is presented onto the projection region R11a. That is, in the example shown in FIG. 12, the information processing apparatus 1 sets a part of the projection plane R10 as the projection region R11a and causes the output unit 30a to project the video in which the display object V11 is presented toward the projection region R11a. By presenting the display object V11 through projection onto only the limited area of the projection region R11a in this manner, the display object V11 is presented on the projection plane R10 at a higher resolution than when the video is projected over the entire projection plane R10.
 Here, in the state shown in FIG. 12, assume that the position of the display object V11 is changed by an operation of the user U and the display object V11 is moved out of the projection region R11a. At this time, changing the position or size of the projection region R11a involves mechanical driving to control the angle of view of the output unit 30a (in particular, the video output unit 31 corresponding to the projector) and the direction in which the video is projected. Therefore, when the position or size of the display object V11 is changed in accordance with an operation of the user U, suitable followability is not always obtained, depending on the performance of the output unit 30a.
 Thus, as shown in FIG. 13, when the information processing apparatus 1 according to Modification 5 recognizes that the position or size of the display object V11 has been changed by an operation of the user U, it temporarily stops the projection of the video onto the projection region R11a by the output unit 30a and instead causes the output unit 30b to project the video.
 For example, reference symbol R11b shown in FIG. 13 indicates the projection region R11 corresponding to the output unit 30b. As shown in FIG. 13, the information processing apparatus 1 controls the angle of view of the output unit 30b and the direction in which the output unit 30b projects video so that the projection region R11b becomes a wider region including the projection region R11a (for example, a region corresponding to the entire projection plane R10).
 In addition, the information processing apparatus 1 controls the video projected by the output unit 30b so that the image that was projected onto the projection region R11a is projected onto the region of the projection region R11b corresponding to the projection region R11a (that is, the region where the display object V11 was displayed). That is, as shown in FIG. 13, even when switching to the state in which the video is projected onto the projection region R11b, the information processing apparatus 1 controls the video output by the output unit 30b so that the display object V11 is projected at the same position and with the same size as in the example shown in FIG. 12.
 Then, the information processing apparatus 1 controls the display position of the display object V11 in the image projected onto the projection region R11b based on the operation content of the user U. At this time, the information processing apparatus 1 controls the display position of the display object V11 through image processing, and thus controls the position at which the display object V11 is projected on the projection plane R10 without involving mechanical driving. Therefore, during the control shown in FIG. 13, although the resolution of the display object V11 temporarily decreases, the control of the position at which the display object V11 is projected can follow the operation by the user U in a more suitable manner (that is, the response can be improved).
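 The hand-off between the two output units across FIGS. 12 to 14 can be sketched as a small state machine (the interface and state names below are assumptions for illustration; the patent defines no API):

```python
class FollowabilityController:
    """Sketch of the Modification-5 hand-off: the narrow-angle output unit
    30a projects the display object V11 at high resolution but must be
    aimed mechanically, while the wide-angle output unit 30b covers the
    whole projection plane and can reposition V11 purely by image
    processing."""

    def __init__(self):
        self.active_unit = "30a"   # FIG. 12: high-resolution projection
        self.aim_30a = None        # last mechanically aimed position

    def on_drag_start(self):
        # FIG. 13: stop 30a and switch to 30b, so the object can be moved
        # every frame without waiting for mechanical re-aiming.
        self.active_unit = "30b"

    def on_drag_end(self, final_pos):
        # FIG. 14: re-aim 30a at the object's final position, resume
        # high-resolution projection, and stop 30b.
        self.aim_30a = final_pos
        self.active_unit = "30a"
```

 The temporary resolution drop described in the text corresponds to the interval spent in the "30b" state.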
 Next, refer to FIG. 14. FIG. 14 shows the state in which the operation of moving the display object V11 by the user U has been completed.
 When the information processing apparatus 1 recognizes the completion of the operation of moving the display object V11 by the user U, as shown in FIG. 14, it controls the direction in which the output unit 30a projects video so that the position at which the display object V11 is displayed becomes the projection region R11a. The information processing apparatus 1 then causes the output unit 30a to resume the projection of the image in which the display object V11 is presented. At this time, the information processing apparatus 1 may stop the projection of the video onto the projection region R11b by the output unit 30b.
 Through the control described above, the display object V11 is presented again on the projection plane R10 at a higher resolution than when the video is projected over the entire projection plane R10. In the example described above, the case where the information processing apparatus 1 controls the position of the display object V11 based on the operation content of the user U has been described, but it goes without saying that the same applies to the case where the size of the display object V11 is controlled.
 As described above with reference to FIGS. 12 to 14, when changing the display mode of the display object V11, such as its position or size, the information processing apparatus 1 according to Modification 5 causes the output unit 30 for which the wider projection region R11 is set to project the video. The information processing apparatus 1 then controls the position and size of the display object V11 through image processing, thereby controlling the position and size at which the display object V11 is projected on the projection plane R10 without involving mechanical driving. With such control, although the resolution of the display object V11 temporarily decreases, the control of the display mode, such as the position and size at which the display object V11 is projected, can follow the operation by the user U in a more suitable manner.
 Note that, in the examples shown in FIGS. 12 to 14, the information processing apparatus 1 suppresses the projection of video onto the projection region R11b in the state in which no operation by the user U is recognized (for example, the states shown in FIGS. 12 and 14), but the control is not necessarily limited to this mode.
 As a specific example, like the information processing apparatus 1 according to Modification 4 described above (see FIG. 11), the information processing apparatus 1 may project video (for example, the entire operation screen) onto the projection region R11b (that is, the projection region R11 that includes the projection region R11a and is wider than the projection region R11a) even in the state in which no operation by the user U is recognized. In this case, the information processing apparatus 1 may project video onto both the projection regions R11a and R11b in the state in which no operation by the user U is recognized, and project video only onto the projection region R11b in the state in which an operation by the user U is recognized.
<5. Hardware configuration>
Next, an example of the hardware configuration of the information processing apparatus 1 according to an embodiment of the present disclosure will be described with reference to FIG. 15. FIG. 15 is a diagram illustrating an example of the hardware configuration of the information processing apparatus 1 according to an embodiment of the present disclosure.
 As shown in FIG. 15, the information processing apparatus 1 according to the present embodiment includes a processor 901, a memory 903, a storage 905, an operation device 907, a notification device 909, a detection device 911, an imaging device 913, and a bus 917. The information processing apparatus 1 may also include a communication device 915.
 The processor 901 may be, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), or an SoC (System on Chip), and executes various processes of the information processing apparatus 1. The processor 901 can be configured by, for example, an electronic circuit for executing various arithmetic processes. Each configuration of the control unit 20 described above can be realized by the processor 901.
 The memory 903 includes a RAM (Random Access Memory) and a ROM (Read Only Memory), and stores programs and data executed by the processor 901. The storage 905 may include a storage medium such as a semiconductor memory or a hard disk. For example, the storage unit 40 described above can be realized by at least one of the memory 903 and the storage 905, or a combination of both.
 The operation device 907 has a function of generating an input signal for the user to perform a desired operation. The operation device 907 may be configured as, for example, a touch panel. As another example, the operation device 907 may be composed of an input unit for the user to input information, such as buttons, switches, and a keyboard, and an input control circuit that generates an input signal based on the user's input and supplies it to the processor 901.
 The notification device 909 is an example of an output device and, like a so-called projector, may notify the user of predetermined information by projecting the information onto a projection plane. The output unit 30 described above can be realized by the notification device 909.
 The notification device 909 may also be a device such as a liquid crystal display (LCD: Liquid Crystal Display) device or an organic EL (OLED: Organic Light Emitting Diode) display. In this case, the notification device 909 can notify the user of predetermined information by displaying a screen.
 The notification device 909 may also be a device, such as a speaker, that notifies the user of predetermined information by outputting a predetermined acoustic signal.
 Note that the examples of the notification device 909 described above are merely examples, and the mode of the notification device 909 is not particularly limited as long as predetermined information can be notified to the user. As a specific example, the notification device 909 may be a device, such as an LED (Light Emitting Diode), that notifies the user of predetermined information by a lighting or blinking pattern.
 The imaging device 913 includes an imaging element, such as a CMOS (Complementary Metal-Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor, that captures a subject and obtains digital data of the captured image. That is, the imaging device 913 has a function of capturing a still image or a moving image via an optical system such as a lens under the control of the processor 901. The imaging device 913 may store the captured images in the memory 903 or the storage 905. The imaging unit 11 described above can be realized by the imaging device 913.
 The detection device 911 is a device for detecting various states. The detection device 911 may be configured by a so-called distance measuring sensor, such as a stereo image sensor. The detection device 911 may also be configured by a sensor for detecting a predetermined target, such as a so-called optical sensor. The detection unit 13 described above can be realized by the detection device 911.
 The communication device 915 is a communication means included in the information processing apparatus 1, and communicates with an external device via a network. The communication device 915 is a wired or wireless communication interface. When the communication device 915 is configured as a wireless communication interface, the communication device 915 may include a communication antenna, an RF (Radio Frequency) circuit, a baseband processor, and the like.
 The communication device 915 has a function of performing various kinds of signal processing on a signal received from an external device, and can supply a digital signal generated from a received analog signal to the processor 901.
 The bus 917 connects the processor 901, the memory 903, the storage 905, the operation device 907, the notification device 909, the detection device 911, the imaging device 913, and the communication device 915 to one another. The bus 917 may include a plurality of types of buses.
 It is also possible to create a program for causing hardware such as a processor, memory, and storage built into a computer to perform functions equivalent to those of the configuration of the information processing apparatus 1 described above. A computer-readable storage medium on which that program is recorded may also be provided.
 <6. Summary>
 As described above, the information processing apparatus 1 according to an embodiment of the present disclosure is configured to selectively switch among a plurality of operation modes that differ from one another in how the projection regions R11 corresponding to the plurality of output units 30 are superimposed. In other words, the plurality of operation modes subject to switching by the information processing apparatus 1 includes at least one operation mode in which at least two of the plurality of output units 30 are made to cooperate with each other. The plurality of operation modes subject to switching may also include an operation mode for individually controlling the operation of at least some of the plurality of output units 30.
 With such a configuration, the information processing apparatus 1 according to an embodiment of the present disclosure can cause each output unit 30 to project video in a manner better suited to the usage scene of the apparatus (that is, the intended use, the form of use, the content to be projected, and so on).
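The scene-dependent mode switching described above can be illustrated with a minimal sketch. The mode names (TILED, STACKED_HDR, INDEPENDENT) and the metadata keys are hypothetical labels chosen for illustration; the disclosure does not prescribe them:

```python
from enum import Enum, auto

class OperationMode(Enum):
    TILED = auto()        # projection regions combined into one large display region
    STACKED_HDR = auto()  # regions superimposed, per-unit luminance controlled
    INDEPENDENT = auto()  # each output unit driven individually

def select_mode(content_meta: dict) -> OperationMode:
    """Pick an operation mode from information associated with the content
    to be displayed. The keys below are hypothetical examples of such
    associated information."""
    if content_meta.get("wide_format"):
        return OperationMode.TILED
    if content_meta.get("high_dynamic_range"):
        return OperationMode.STACKED_HDR
    return OperationMode.INDEPENDENT
```

A controller built this way could equally take user operations or external-environment detection results as inputs, per configurations (3) and (4) below.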
 Note that the plurality of output units 30 controlled by the information processing apparatus 1 need not have the same performance and may have characteristics that differ from one another. In that case, the information processing apparatus 1 may select, according to the performance of each output unit 30, the output unit 30 better suited to the selected operation mode (in other words, better suited to the usage scene) as the control target in that mode. Further, by making each output unit 30 attachable to and detachable from the information processing apparatus 1, the number of output units 30 under its control may be changed as appropriate. In that case, the information processing apparatus 1 may dynamically switch the candidate operation modes available for selection according to the number of output units 30 it recognizes as control targets and the performance of each output unit 30.
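The dynamic narrowing of mode candidates by the number and capabilities of attached output units might look like the following sketch; the mode names and the `dimmable` capability flag are assumptions made for illustration:

```python
def candidate_modes(units: list[dict]) -> set[str]:
    """Return the operation modes selectable for the currently attached
    output units. Mode names and capability keys are illustrative only."""
    modes = {"independent"}                   # always available
    if len(units) >= 2:
        modes |= {"tiled", "superimposed"}    # cooperation needs 2+ units
    # a stacked high-dynamic-range mode might additionally require every
    # superimposed unit to support fine-grained luminance control
    if len(units) >= 2 and all(u.get("dimmable", False) for u in units):
        modes.add("stacked_hdr")
    return modes
```

Attaching or detaching a unit then simply means recomputing this set before the next mode selection.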
 The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical idea described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
 The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure may exhibit other effects that are apparent to those skilled in the art from the description of this specification, in addition to or instead of the above effects.
 The following configurations also belong to the technical scope of the present disclosure.
(1)
 An information processing apparatus including a control unit that controls operations of a plurality of projection units each of which displays display information by projecting the display information onto at least a part of a projection region in a projection plane,
 wherein the control unit selectively switches among a plurality of operation modes that differ from one another in how the projection regions corresponding to at least two of the plurality of projection units are superimposed.
(2)
 The information processing apparatus according to (1), wherein the control unit selectively switches the operation mode according to information associated with the display information to be displayed.
(3)
 The information processing apparatus according to (1) or (2), wherein the control unit selectively switches the operation mode according to acquired control information indicating the content of a user operation.
(4)
 The information processing apparatus according to any one of (1) to (3), wherein the control unit selectively switches the operation mode according to a detection result of an external environment.
(5)
 The information processing apparatus according to any one of (1) to (4), wherein, in at least some of the plurality of operation modes, the control unit combines the projection regions corresponding to two or more of the projection units into a single display region and causes the display information to be projected onto that display region.
(6)
 The information processing apparatus according to any one of (1) to (5), wherein, in at least some of the plurality of operation modes, the control unit controls the operations of two or more of the projection units whose projection regions are superimposed on each other in mutually different manners.
(7)
 The information processing apparatus according to (6), wherein the control unit causes each of the two or more projection units whose projection regions are superimposed on each other to project the display information controlled to have mutually different color gamuts.
(8)
 The information processing apparatus according to (6), wherein the control unit controls the dynamic range of the display information projected onto the projection region by individually controlling the luminance of each of the two or more projection units whose projection regions are superimposed on each other.
(9)
 The information processing apparatus according to (6), wherein the control unit controls the operations of the two or more projection units such that the difference between the projection positions of the plurality of mutually superimposed projection regions is in units of sub-pixels.
(10)
 The information processing apparatus according to (6), wherein the control unit causes each of the two or more projection units whose projection regions are superimposed on each other to project the display information at mutually different timings.
(11)
 The information processing apparatus according to any one of (1) to (10), wherein, in at least some of the plurality of operation modes, the control unit causes at least some of the projection units to project, in a time-division manner, a plurality of partial images into which the display information is divided onto corresponding partial regions within the projection region.
(12)
 The information processing apparatus according to any one of (1) to (11), wherein, in at least some of the plurality of operation modes, the control unit superimposes, on a first projection region corresponding to a first projection unit among the plurality of projection units, a second projection region corresponding to a second projection unit, in which a part of the display information projected onto the first projection region is displayed.
(13)
 The information processing apparatus according to any one of (1) to (12), wherein, in at least some of the plurality of operation modes, when controlling the display mode of first display information projected onto a first projection region corresponding to a first projection unit among the plurality of projection units, the control unit causes a second projection unit, instead of the first projection unit, to project second display information in which the first display information is presented onto a second projection region encompassing the first projection region, and controls the display mode of the first display information within the second display information.
(14)
 An information processing method including, by a processor:
 controlling operations of a plurality of projection units each of which displays display information by projecting the display information onto at least a part of a projection region in a projection plane; and
 selectively switching among a plurality of operation modes that differ from one another in how the projection regions corresponding to at least two of the plurality of projection units are superimposed.
(15)
 A program for causing a computer to execute:
 controlling operations of a plurality of projection units each of which displays display information by projecting the display information onto at least a part of a projection region in a projection plane; and
 selectively switching among a plurality of operation modes that differ from one another in how the projection regions corresponding to at least two of the plurality of projection units are superimposed.
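As one way to read configuration (8): when two or more units superimpose their projection regions, a target peak luminance can be apportioned across them. Below is a minimal sketch under the assumption that the superimposed outputs simply add at the shared region; the function name and the greedy split are hypothetical illustrations, not the claimed control method:

```python
def stack_luminance(target_nits: float, unit_max_nits: list[float]) -> list[float]:
    """Greedily assign each superimposed projection unit a share of the
    target luminance, saturating at each unit's maximum output."""
    shares = []
    remaining = target_nits
    for cap in unit_max_nits:
        share = min(cap, max(remaining, 0.0))
        shares.append(share)
        remaining -= share
    return shares
```

For example, two 200-nit units asked for a 250-nit highlight would be driven at 200 and 50 nits respectively, extending the dynamic range beyond what either unit achieves alone.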
 DESCRIPTION OF SYMBOLS
 1   Information processing apparatus
 10  Input unit
 11  Imaging unit
 13  Detection unit
 20  Control unit
 21  Image analysis unit
 22  Input analysis unit
 23  Processing execution unit
 25  Mode determination unit
 26  Output control unit
 30  Output unit
 31  Video output unit
 40  Storage unit

Claims (15)

  1.  An information processing apparatus comprising a control unit that controls operations of a plurality of projection units each of which displays display information by projecting the display information onto at least a part of a projection region in a projection plane,
     wherein the control unit selectively switches among a plurality of operation modes that differ from one another in how the projection regions corresponding to at least two of the plurality of projection units are superimposed.
  2.  The information processing apparatus according to claim 1, wherein the control unit selectively switches the operation mode according to information associated with the display information to be displayed.
  3.  The information processing apparatus according to claim 1, wherein the control unit selectively switches the operation mode according to acquired control information indicating the content of a user operation.
  4.  The information processing apparatus according to claim 1, wherein the control unit selectively switches the operation mode according to a detection result of an external environment.
  5.  The information processing apparatus according to claim 1, wherein, in at least some of the plurality of operation modes, the control unit combines the projection regions corresponding to two or more of the projection units into a single display region and causes the display information to be projected onto that display region.
  6.  The information processing apparatus according to claim 1, wherein, in at least some of the plurality of operation modes, the control unit controls the operations of two or more of the projection units whose projection regions are superimposed on each other in mutually different manners.
  7.  The information processing apparatus according to claim 6, wherein the control unit causes each of the two or more projection units whose projection regions are superimposed on each other to project the display information controlled to have mutually different color gamuts.
  8.  The information processing apparatus according to claim 6, wherein the control unit controls the dynamic range of the display information projected onto the projection region by individually controlling the luminance of each of the two or more projection units whose projection regions are superimposed on each other.
  9.  The information processing apparatus according to claim 6, wherein the control unit controls the operations of the two or more projection units such that the difference between the projection positions of the plurality of mutually superimposed projection regions is in units of sub-pixels.
  10.  The information processing apparatus according to claim 6, wherein the control unit causes each of the two or more projection units whose projection regions are superimposed on each other to project the display information at mutually different timings.
  11.  The information processing apparatus according to claim 1, wherein, in at least some of the plurality of operation modes, the control unit causes at least some of the projection units to project, in a time-division manner, a plurality of partial images into which the display information is divided onto corresponding partial regions within the projection region.
  12.  The information processing apparatus according to claim 1, wherein, in at least some of the plurality of operation modes, the control unit superimposes, on a first projection region corresponding to a first projection unit among the plurality of projection units, a second projection region corresponding to a second projection unit, in which a part of the display information projected onto the first projection region is displayed.
  13.  The information processing apparatus according to claim 1, wherein, in at least some of the plurality of operation modes, when controlling the display mode of first display information projected onto a first projection region corresponding to a first projection unit among the plurality of projection units, the control unit causes a second projection unit, instead of the first projection unit, to project second display information in which the first display information is presented onto a second projection region encompassing the first projection region, and controls the display mode of the first display information within the second display information.
  14.  An information processing method comprising, by a processor:
     controlling operations of a plurality of projection units each of which displays display information by projecting the display information onto at least a part of a projection region in a projection plane; and
     selectively switching among a plurality of operation modes that differ from one another in how the projection regions corresponding to at least two of the plurality of projection units are superimposed.
  15.  A program for causing a computer to execute:
     controlling operations of a plurality of projection units each of which displays display information by projecting the display information onto at least a part of a projection region in a projection plane; and
     selectively switching among a plurality of operation modes that differ from one another in how the projection regions corresponding to at least two of the plurality of projection units are superimposed.
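The time-division projection of partial images recited in claim 11 can be illustrated as a scheduling sketch; the round-robin assignment below is one hypothetical policy, not the claimed method itself:

```python
def time_division_schedule(num_parts: int, num_units: int) -> list[tuple[int, int, int]]:
    """Return (time_slot, unit_index, part_index) triples: in each time slot,
    each projection unit projects one of the partial images into which the
    display information has been divided."""
    slots = -(-num_parts // num_units)  # ceiling division
    schedule = []
    for slot in range(slots):
        for unit in range(num_units):
            part = slot * num_units + unit
            if part < num_parts:
                schedule.append((slot, unit, part))
    return schedule
```

With 4 partial images and 2 units, each unit projects two parts across two successive time slots, so the full display information is refreshed every two slots.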
PCT/JP2016/050258 2015-03-31 2016-01-06 Information processing device, information processing method, and program WO2016157920A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015072541A JP2016191854A (en) 2015-03-31 2015-03-31 Information processor, information processing method, and program
JP2015-072541 2015-03-31

Publications (1)

Publication Number Publication Date
WO2016157920A1 (en) 2016-10-06

Family

ID=57004946

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/050258 WO2016157920A1 (en) 2015-03-31 2016-01-06 Information processing device, information processing method, and program

Country Status (2)

Country Link
JP (1) JP2016191854A (en)
WO (1) WO2016157920A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113010133A (en) * 2021-04-08 2021-06-22 腾讯科技(深圳)有限公司 Data display method
CN113010133B (en) * 2021-04-08 2023-04-07 腾讯科技(深圳)有限公司 Data display method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6661282B2 (en) * 2015-05-01 2020-03-11 パラマウントベッド株式会社 Control device, image display system and program

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006165949A (en) * 2004-12-07 2006-06-22 Seiko Epson Corp Projector system and projector
JP2010020290A (en) * 2008-06-09 2010-01-28 Sony Corp Video-signal processing apparatus, video-signal processing method, and video-signal processing system, program, and recording medium
JP2010113598A (en) * 2008-11-07 2010-05-20 Seiko Epson Corp Input system, input method, input device and program
JP2010183229A (en) * 2009-02-04 2010-08-19 Seiko Epson Corp Projector, projection system, image display method, and image display program
JP2011033805A (en) * 2009-07-31 2011-02-17 Sanyo Electric Co Ltd Video controller, projection-type video display device, and video display system
US8013283B2 (en) * 2007-10-01 2011-09-06 Samsung Electronics Co., Ltd. Projector having a communication unit which communicates with at least one other projector different from the projector and multiple projection control method of the projector
JP2011248238A (en) * 2010-05-28 2011-12-08 Canon Inc Projection system
JP2012047849A (en) * 2010-08-25 2012-03-08 Canon Inc Projection type display system and projection type display apparatus
JP2014033381A (en) * 2012-08-06 2014-02-20 Ricoh Co Ltd Information processing apparatus and program, and image processing system



Also Published As

Publication number Publication date
JP2016191854A (en) 2016-11-10


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16771806; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16771806; Country of ref document: EP; Kind code of ref document: A1)