WO2016190193A1 - Information processing apparatus, output control apparatus, information processing system, and video data output method - Google Patents

Information processing apparatus, output control apparatus, information processing system, and video data output method Download PDF

Info

Publication number
WO2016190193A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
image
information processing
output
image data
Prior art date
Application number
PCT/JP2016/064755
Other languages
French (fr)
Japanese (ja)
Inventor
貴志 島津
勇一郎 中村
Original Assignee
株式会社ソニー・インタラクティブエンタテインメント (Sony Interactive Entertainment Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ソニー・インタラクティブエンタテインメント (Sony Interactive Entertainment Inc.)
Publication of WO2016190193A1 publication Critical patent/WO2016190193A1/en

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 - Image reproducers
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/001 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 - Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/436 - Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home

Definitions

  • the present invention relates to a technique for transmitting and processing image data associated with display of a captured image and a drawn image.
  • An HMD (head mounted display) is a display device worn on the user's head.
  • the HMD is basically premised on the enjoyment of content, such as video, that is displayed only to the user wearing it.
  • Because the wearer's own movement interacts with the video, use of the HMD has tended to be limited to personal play and personal viewing of video content. It is therefore desirable to realize a content-providing technology that allows a user to share the experience with other users while wearing an HMD and enjoying a sense of immersion.
  • the present invention has been made in view of these problems, and an object of the present invention is to provide a technology that allows a plurality of people to enjoy content including image display on the HMD.
  • An aspect of the present invention relates to an information processing apparatus.
  • the information processing apparatus includes an output data generation unit that generates moving image data to be displayed on the display device, and a communication unit that sequentially outputs moving image data generated by the output data generation unit.
  • When a plurality of display devices are connected, the output data generation unit causes the communication unit to output composite image data obtained by combining, for each frame, a plurality of moving images to be displayed on the respective display devices.
  • Another aspect of the present invention relates to an output control device. The output control device includes: an input unit that acquires display moving image data output from the information processing device; an output data forming unit that, when the acquired data is composite image data in which a plurality of moving images to be displayed independently on a plurality of display devices are combined for each frame, extracts from the composite image data the partial area to be displayed on the display device connected to the output control device itself and forms the final display image; and an output unit that outputs the final display image data to the connected display device.
  • Still another aspect of the present invention relates to an information processing system. This information processing system comprises an information processing device that outputs moving image data and an output control device that acquires the moving image data and causes a display device to display it.
  • the output data generation unit causes the communication unit to transmit composite image data obtained by combining, for each frame, a plurality of moving images to be displayed independently on the respective display devices.
  • the output control device extracts, from the composite image data, the partial area to be displayed on the display device connected to the output control device itself.
  • the moving image data output method includes a step of generating moving image data to be displayed on a display device and a step of sequentially outputting the generated moving image data.
  • the step of generating moving image data includes, when a plurality of display devices are connected, a step of generating composite image data formed by combining, for each frame, a plurality of moving images to be displayed independently on the respective display devices.
  • Still another aspect of the present invention also relates to a moving image data output method.
  • In this method, the partial region to be displayed on the connected display device is extracted from the composite image data to form the final display image, and the method includes a step of outputting the final display image data to the connected display device.
  • According to the present invention, content including image display on an HMD can be diversified.
  • FIG. 1 shows a configuration example of an information processing system to which this embodiment can be applied.
  • the information processing system 8 includes an imaging device 12 that captures an object, an information processing device 10 that performs information processing based on the captured image, a flat panel display 16 that displays an image obtained as a result of the information processing, HMDs 18a and 18b worn by the respective users, and input devices 14a and 14b operated by the users.
  • the information processing device 10 may be connected to the imaging device 12, the input devices 14a and 14b, the flat display 16, and the HMDs 18a and 18b by a cable or by a known wireless communication technology such as Bluetooth (registered trademark). Further, depending on the information processing performed by the information processing device 10, the imaging device 12, the flat display 16, and the input devices 14a and 14b may be omitted. The external shapes of these devices are not limited to those shown in the drawings. Further, the number of users wearing an HMD is not limited to two, and may be one, or three or more.
  • the imaging device 12 includes a camera that captures an object such as the user at a predetermined frame rate, generates output data of the captured image by performing general processing such as demosaicing, and outputs the data to the information processing apparatus 10.
  • the camera includes a visible light sensor used in general digital cameras and digital video cameras, such as a CCD (Charge-Coupled Device) sensor and a CMOS (Complementary Metal-Oxide Semiconductor) sensor.
  • the imaging device 12 may include only one camera, or may be a so-called stereo camera in which two cameras are arranged on the left and right sides at a known interval as illustrated.
  • the imaging device 12 may be configured by a combination of a device that irradiates an object with reference light such as infrared rays and measures the reflected light and a monocular camera.
  • When a stereo camera or a reflected-light measurement mechanism is introduced, the position of a subject in three-dimensional real space can be obtained, and information processing and display images can be further diversified.
  • Techniques for specifying a subject's distance from the camera are widely known, such as determining it from stereo images by the triangulation principle, or measuring reflected light by TOF (Time of Flight) or pattern irradiation.
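The triangulation principle mentioned above can be sketched as follows. This is a generic illustration, not code from the patent, and the parameter names (`focal_px`, `baseline_m`, `disparity_px`) are assumptions for the example.

```python
# Illustrative sketch of stereo triangulation: for a rectified stereo pair,
# a subject's distance Z follows from Z = f * B / d, where f is the focal
# length in pixels, B the camera baseline, and d the horizontal disparity.

def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Return subject distance in metres for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("subject at infinity or invalid disparity")
    return focal_px * baseline_m / disparity_px

# A subject shifted 40 px between cameras 0.1 m apart, with f = 800 px,
# lies at 800 * 0.1 / 40 = 2.0 m from the camera.
print(depth_from_disparity(40, 800, 0.1))  # -> 2.0
```

The larger the disparity, the closer the subject; a disparity approaching zero corresponds to a subject at infinity, which is why the sketch rejects non-positive values.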
  • the information processing apparatus 10 outputs image data to be displayed on the HMDs 18a and 18b and the flat display 16.
  • the image data may be data saved in a completed form, such as a movie or a captured moving image, or may be drawn in real time inside the information processing apparatus 10. In the latter case, the information processing apparatus 10 performs, for example, general face detection and tracking processing on captured images acquired from the imaging apparatus 12 at a predetermined frame rate, thereby progressing a game in which a character reflecting the user's actions as a subject appears, or converting the user's movement into command input to perform information processing.
  • the information processing apparatus 10 may acquire each user's movement using markers and acceleration sensors provided in the HMDs 18a and 18b and the input devices 14a and 14b, acquire user operation information from the input devices 14a and 14b, and generate a different image for each user. For example, by tracking the images of the markers of the HMDs 18a and 18b in the captured image, the position and movement of each user's head is specified. By changing the viewpoint on the virtual world displayed on the HMDs 18a and 18b in accordance with that movement, each user can see the virtual world from his or her own viewpoint and obtain a sense of presence and immersion.
  • the information processing apparatus 10 may output audio data such as BGM in addition to such image data.
  • the HMDs 18a and 18b are display devices that, when worn on the user's head, display an image on a display panel such as an organic EL panel positioned in front of the user's eyes. For example, a pair of parallax images viewed from the left and right viewpoints is generated and displayed in the left and right areas obtained by dividing the display screen into two, thereby allowing the image to be viewed stereoscopically.
  • the present embodiment is not limited to this, and one image may be displayed on the entire display screen.
  • the HMDs 18a and 18b may further be equipped with a light-emitting marker composed of one or more elements that emit light of a predetermined color, a speaker or earphones that output sound at positions corresponding to the user's ears, and an acceleration sensor that detects the tilt of the head of the user wearing the device.
  • the flat display 16 may be a television having a display that outputs a two-dimensional image and a speaker that outputs sound; specifically, it may be a liquid crystal television, an organic EL television, a plasma television, a PC display, or the like, or the display and speaker of a tablet or portable terminal.
  • the input devices 14a and 14b are operated by the user to accept requests for processing start and end, function selection, and various command inputs, and supply them to the information processing apparatus 10 as electrical signals.
  • the input devices 14a and 14b may be realized by any one of general input devices such as a game controller, a keyboard, a mouse, a joystick, a touch pad provided on the display screen of the flat display 16, or a combination thereof.
  • the input devices 14a and 14b may further include a light-emitting marker composed of one or more elements that emit light of a predetermined color. In this case, if the information processing apparatus 10 tracks the movement of the marker using the captured image, the movement of the input devices 14a and 14b themselves can serve as a user operation. The input devices 14a and 14b may also each be configured solely as a light-emitting marker with a grip.
  • output data can be transmitted from one information processing apparatus 10 to a plurality of HMDs 18a and 18b.
  • For example, it is possible to realize a mode in which a plurality of users participate in one game and operate the input devices 14a and 14b while watching the game situation from their respective viewpoints.
  • Alternatively, a user acting as a spectator can watch the game.
  • the data output by the information processing apparatus 10 is first transmitted to the HMD 18a, and then transmitted from the HMD 18a to the HMD 18b, as indicated by the arrows.
  • transmission is performed sequentially between the HMDs.
  • the number of users wearing HMD can be increased without increasing the number of communication terminals of each device.
  • the flat panel display 16 can also be included as one node of such a chain of HMDs.
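The daisy chain described above can be modeled as a minimal sketch. All class and field names here are hypothetical, and a frame is simplified to a flat list in which each device owns one column range of the composite frame; each device extracts its own slice and forwards the whole frame downstream.

```python
# Sketch (hypothetical names) of the chained transmission: the information
# processing apparatus emits one composite frame; each output control device
# extracts the slice for its own display and forwards the unmodified frame
# to the next device in the chain.

class OutputControlDevice:
    def __init__(self, region, downstream=None):
        self.region = region          # (start, stop) slice owned by this device
        self.downstream = downstream  # next output control device, or None
        self.displayed = None

    def receive(self, composite_frame):
        start, stop = self.region
        self.displayed = composite_frame[start:stop]   # image shown on own HMD
        if self.downstream is not None:
            self.downstream.receive(composite_frame)   # pass the frame along

# Two HMDs sharing one composite frame: elements 0-3 and 4-7.
hmd_b = OutputControlDevice(region=(4, 8))
hmd_a = OutputControlDevice(region=(0, 4), downstream=hmd_b)
hmd_a.receive(list(range(8)))
print(hmd_a.displayed, hmd_b.displayed)  # [0, 1, 2, 3] [4, 5, 6, 7]
```

Because each device forwards the frame untouched, adding another HMD only means appending one more device to the chain, matching the point that the number of wearers can grow without adding communication terminals to any one device.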
  • the HMDs 18a and 18b may be collectively referred to as the HMD 18.
  • the information processing apparatus 10 can output audio data to each display device, but the following description will be made with the focus on image data.
  • FIG. 2 shows an example of the external shape of the HMD 18.
  • the HMD 18 includes an output mechanism unit 102 and a mounting mechanism unit 104.
  • the mounting mechanism unit 104 includes a mounting band 106 that goes around the head when worn by the user to fix the device.
  • the wearing band 106 is made of a material or a structure whose length can be adjusted according to the head circumference of each user.
  • an elastic body such as rubber may be used, or a buckle, a gear, or the like may be used.
  • the output mechanism unit 102 includes a housing 108 shaped to cover the left and right eyes when the user wears the HMD 18 and includes a display panel inside so as to face the eyes when worn.
  • the display panel is realized by a liquid crystal panel or an organic EL panel.
  • the housing 108 may further include a pair of lenses that are positioned between the display panel and the user's eyes when the HMD 18 is mounted and that enlarge the viewing angle of the user.
  • the HMD 18 may further include a speaker or an earphone at a position corresponding to the user's ear when worn.
  • light emitting markers 110a, 110b, 110c, and 110d are provided on the outer surface of the housing 108.
  • the number and arrangement of the light emitting markers are not particularly limited.
  • In this example, the light-emitting markers are provided at the four corners of the front surface of the casing of the output mechanism unit 102. They may further be provided on both side surfaces behind the mounting band 106. Since the light-emitting markers 110c and 110d are below the output mechanism unit 102 and not normally visible from the viewpoint of FIG. 2, their outlines are represented by dotted lines.
  • the HMD 18 realizes display and audio output by being connected to an output control device 20 that relays data output from the information processing device 10.
  • the output control device 20 basically includes one input terminal 5 and two output terminals 7a and 7b.
  • the input terminal 5 is an interface that acquires data output from the information processing apparatus 10 directly or indirectly.
  • the output terminals 7a and 7b are interfaces for outputting data to the HMD 18, the flat display 16, or another output control device 20.
  • this is not intended to limit the number of input terminals and output terminals and the installation location.
  • FIG. 3 shows an internal circuit configuration of the information processing apparatus 10.
  • the information processing apparatus 10 includes a CPU (Central Processing Unit) 22, a GPU (Graphics Processing Unit) 24, and a main memory 26. These units are connected to each other via a bus 30.
  • An input / output interface 28 is further connected to the bus 30.
  • the input/output interface 28 is connected to a peripheral device interface such as USB or IEEE 1394, a communication unit 32 including a wired or wireless LAN network interface, a storage unit 34 such as a hard disk drive or nonvolatile memory, an output unit 36 for outputting data to an external device, an input unit 38 for inputting data from an external device, and a recording medium driving unit 40 for driving a removable recording medium such as a magnetic disk, an optical disk, or a semiconductor memory.
  • the CPU 22 controls the entire information processing apparatus 10 by executing an operating system stored in the storage unit 34.
  • the CPU 22 also executes various programs read from the removable recording medium and loaded into the main memory 26 or downloaded via the communication unit 32.
  • the GPU 24 has a function of a geometry engine and a function of a rendering processor, performs a drawing process according to a drawing command from the CPU 22, and outputs data from the output unit 36 to the output control device 20.
  • the main memory 26 is composed of RAM (Random Access Memory) and stores programs and data necessary for processing.
  • the output control device 20 may basically have the same configuration. However, depending on the application to be executed and the design of the apparatus, the information processing apparatus 10 may perform almost all of the processing, with only simple processing left to the output control apparatus 20; in that case, the output control device 20 can omit part of the illustrated configuration.
  • FIG. 4 is a diagram for explaining images displayed on the HMD 18 and the flat display 16.
  • the left image (a) is an image displayed on the HMD 18, and the right image (b) is an image displayed on the flat display 16.
  • For simplicity, a grid-like pattern is the display target here, but in practice various images are displayed depending on the content of information processing, such as a virtual world or a game image.
  • the image (a) for the HMD 18 assumes stereoscopic viewing: a pair of parallax images, a left-eye image and a right-eye image, is arranged in the left and right regions obtained by dividing the image plane corresponding to the display panel into two. Each image is further distorted in advance, taking the lens into account, so that it appears normal when viewed through the lenses of the HMD 18.
  • Such correction processing may be performed by the information processing apparatus 10 or the output control apparatus 20.
  • In the former case, the output control device 20 that has received the data of the image (a) outputs it to the HMD 18 as it is, and, for the flat display 16, generates and outputs the image (b) by taking either the left-eye or right-eye half of the image (a) and applying the reverse of the above-described distortion correction.
  • In the latter case, the output control device 20 performs distortion correction on the uncorrected parallax images transmitted from the information processing device 10 to generate the image (a) and outputs it to the HMD 18, while one of the parallax images, that is, the image (b), is output to the flat display 16 as it is.
  • In the following description, the former mode is assumed, but the latter mode can be applied similarly.
  • A general method can be applied to the distortion correction for the lens and to its reverse correction.
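As a concrete illustration of such a general method, the sketch below uses one common radial distortion model, r' = r(1 + k r^2). The specific model and the fixed-point inversion are assumptions for the example; the patent does not prescribe a particular formula.

```python
# Assumed radial model: a point at radius r from the optical centre is
# displaced to r * (1 + k * r^2). The pre-correction for the HMD lens
# applies this model; recovering the flat-display image applies its
# inverse, computed here by fixed-point iteration.

def distort(x, y, k):
    r2 = x * x + y * y
    s = 1.0 + k * r2
    return x * s, y * s

def undistort(xd, yd, k, iterations=20):
    # Iterate x <- xd / (1 + k * |x|^2); converges for small k * r^2.
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        s = 1.0 + k * r2
        x, y = xd / s, yd / s
    return x, y

xd, yd = distort(0.3, 0.4, k=-0.2)   # pre-correct a point for the lens
x, y = undistort(xd, yd, k=-0.2)     # reverse correction for the flat display
print(round(x, 6), round(y, 6))      # recovers approximately 0.3 0.4
```

Applying `distort` and then `undistort` returns the original coordinates, which is exactly the round trip between the image (a) prepared for the HMD and the image (b) shown on the flat display.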
  • FIG. 5 shows a configuration of functional blocks of the information processing apparatus 10 and an output control apparatus (referred to as an output control apparatus 20a) that outputs image data to a certain HMD 18a.
  • the output control devices 20 are connected in sequence starting from the information processing device 10, which transmits the output data, and each output control device 20 transmits data in an appropriate format to the HMD 18 or the flat display 16 connected to it.
  • FIG. 5 shows the configuration of the information processing apparatus 10 and the output control apparatus 20a directly connected thereto, but the other output control apparatuses 20 have the same configuration.
  • the side closer to the information processing device 10 in the array of the output control devices 20 connected in this way is expressed as a higher level, and the far side is expressed as a lower level.
  • Each functional block shown in FIG. 5 can be realized, in terms of hardware, by the CPU, GPU, and various memories shown in FIG. 3, and, in terms of software, by a program loaded from a recording medium into memory that provides functions such as data input, data holding, image processing, and communication. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof, and are not limited to any one of them.
  • the information processing device 10 includes an input data acquisition unit 72 that acquires input data from the imaging device 12 and the input devices 14a and 14b, an information processing unit 74 that performs information processing according to the application to be executed, such as a game, an output data generation unit 78 that generates data, such as game images and sound, to be output as a result of the information processing, a communication unit 76 that communicates with the highest-level output control device 20a, a connection state specifying unit 80 that specifies the connection state of the various devices including the output control device 20a, and a data storage unit 81 that stores basic data necessary for generating output data.
  • the input data acquisition unit 72 sequentially acquires images captured by the imaging device 12 at a predetermined frame rate, and also acquires information indicating the content of user operations via the input devices 14a and 14b. Here, the user operations may include operations common to general information processing, such as selection of an application to execute, start and end of processing, and command input. The input data acquisition unit 72 supplies the acquired information to the information processing unit 74.
  • the input data acquisition unit 72 may further acquire the result measured by the acceleration sensor in each HMD 18 from the communication unit 76 and supply the result to the information processing unit 74.
  • the input data acquisition unit 72 also controls the start and end of shooting in the imaging device 12 according to processing start/end requests from the user acquired via the input devices 14a and 14b, and may control the type of data acquired from the imaging device 12 according to the processing result in the information processing unit 74.
  • the data acquired by the input data acquisition unit 72 may differ depending on the content of information processing executed by the information processing unit 74.
  • the information processing unit 74 performs information processing such as a game designated by the user.
  • This information processing may include a process of detecting an image of an object in a captured image and tracking it. For example, a part of the user's body, such as the head or a hand, may be tracked using its contour line, or an object having a specific pattern or shape may be tracked by pattern matching. A general image processing technique can be applied to these tracking processes. Further, the position and posture of the user's head may be specified in detail by integrating information such as the acceleration sensor measurements transmitted from the HMDs 18.
  • the information processing unit 74 uses the movement and position of the user thus obtained, or user operations via the input devices 14a and 14b, as input values to progress the game or perform corresponding processing.
  • the contents of information processing are not limited to these.
  • the output data generation unit 78 generates image and audio data to be output as a result of information processing in accordance with a request from the information processing unit 74.
  • For example, a virtual world viewed from a viewpoint corresponding to the position and posture of the user's head is rendered as left and right parallax images, distortion correction taking the lens into account is applied, and image data with the two images arranged side by side is generated. If these images are displayed in front of the left and right eyes at a predetermined frame rate on the HMD 18 and the sound of the virtual world is also output, the user can feel as if he or she has entered the virtual world.
  • the communication unit 76 establishes communication with the output control device 20a using a predetermined protocol such as HDMI (registered trademark) (High-Definition Multimedia Interface) or USB (Universal Serial Bus), and transmits the image and sound output data generated by the output data generation unit 78. Further, the communication unit 76 detects the devices connected beyond the output control device 20a.
  • the connection state specifying unit 80 integrates information on the detected devices, specifies information related to the connection state such as the type, number, and connection relationship, and notifies the information processing unit 74 of the information.
  • the information processing unit 74 requests the output data generation unit 78 to generate a display image corresponding to each HMD 18.
  • the information processing unit 74 further requests the output data generation unit 78 to generate a combined image obtained by reducing the data of the plurality of display images generated as described above and connecting them for each frame.
  • the communication unit 76 similarly transmits the composite image data generated in this manner to the output control apparatus 20a. A specific synthesis example will be described in detail later.
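One possible synthesis, sketched under the assumption of a simple layout in which two half-size images share the top of one frame (the patent's specific synthesis example is described later), is to reduce each per-HMD frame and tile the results into a single frame of the original dimensions:

```python
import numpy as np

# Sketch of per-frame synthesis (layout is an assumption): each per-HMD
# frame is reduced to half size and the results are placed side by side
# in one composite frame with the original dimensions.

def compose(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    h, w = frame_a.shape[:2]
    half_a = frame_a[::2, ::2]          # crude 1/2 downscale by decimation
    half_b = frame_b[::2, ::2]
    composite = np.zeros_like(frame_a)
    composite[:h // 2, :w // 2] = half_a   # HMD 18a's image, top-left
    composite[:h // 2, w // 2:] = half_b   # HMD 18b's image, top-right
    return composite

a = np.full((8, 8), 1, dtype=np.uint8)
b = np.full((8, 8), 2, dtype=np.uint8)
c = compose(a, b)
print(c.shape, c[0, 0], c[0, 7])  # (8, 8) 1 2
```

The point of the synthesis is that both HMDs' moving images travel as one frame over a single connection, so the downstream output control devices only need to know which region belongs to them.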
  • the output control device 20a includes a first communication unit 82 that acquires the output data transmitted from the information processing device 10, an output data forming unit 84 that processes the acquired output data into an appropriate format as necessary, a second communication unit 86 that outputs data to the connected HMD 18a, and a third communication unit 88 that outputs data to a further connected device.
  • the first communication unit 82 establishes communication with a device connected to a higher level by a predetermined protocol, and acquires output data transmitted from the information processing device 10 directly or indirectly. Since the output control apparatus 20a shown in FIG. 5 is directly connected to the information processing apparatus 10, it can also directly acquire output data.
  • the lower-order output control device 20 sequentially acquires output data from the higher-order output control device 20.
  • the output data forming unit 84 processes the output data acquired by the first communication unit 82 as necessary and passes it to the second communication unit 86 in a state that can be output to the HMD 18a. Specifically, when the acquired output data includes composite image data, only the image to be displayed on the HMD 18a is extracted and enlarged to the original image size, whereby an image like the image (a) in FIG. 4 is generated.
  • When the output data acquired by the first communication unit 82 does not include a composite image, that is, when the original image is already in the state of FIG. 4(a), the output data forming unit 84 may output the data to the second communication unit 86 as it is. The latter case may occur when no HMD 18 other than the HMD 18a is connected, or when a common image is displayed on all of the plurality of HMDs 18.
  • the output data forming unit 84 further appropriately generates and outputs data that can be output to the device connected to the third communication unit 88.
  • As a device that may be connected to the third communication unit 88, another output control device 20b, the flat display 16, or another HMD 18b is conceivable.
  • the output data forming unit 84 may output the output data acquired by the first communication unit 82 to the third communication unit 88 as it is.
  • the output data forming unit 84 processes the output data acquired by the first communication unit 82 and generates an image for the flat display. Specifically, as described above, an image such as the image (b) in FIG. 4 is generated by applying reverse correction to the left or right image of the HMD image.
  • When another HMD 18b is connected to the third communication unit 88, the same processing as for the data output to the HMD 18a connected to the second communication unit 86 is performed: when the acquired output data includes composite image data, the image to be displayed on the HMD 18b is extracted and enlarged to the original image size, and when no composite image is included, the data is passed to the third communication unit 88 as it is. In this way, the output data forming unit 84 switches the data passed to the third communication unit 88 according to the type of device connected to it. In addition, the output data forming unit 84 performs processing such as decoding of audio data included in the output data as necessary.
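The extract-and-enlarge step performed by the output data forming unit might look like the following sketch; the region coordinates and the 2x nearest-neighbour upscale are assumptions for illustration, chosen to match a composite built from half-size images.

```python
import numpy as np

# Sketch of the output data forming unit's extraction step: cut the
# region assigned to this device's HMD out of the composite frame and
# enlarge it back to the original size (nearest-neighbour, via repeat).

def extract_and_enlarge(composite: np.ndarray, region) -> np.ndarray:
    top, bottom, left, right = region
    part = composite[top:bottom, left:right]
    return part.repeat(2, axis=0).repeat(2, axis=1)  # 2x upscale

composite = np.zeros((8, 8), dtype=np.uint8)
composite[:4, 4:] = 7                        # region assigned to this device
restored = extract_and_enlarge(composite, (0, 4, 4, 8))
print(restored.shape, restored[0, 0])        # (8, 8) 7
```

This mirrors the `compose` step on the information processing device side: whatever reduction was applied during synthesis is undone here, so each HMD receives a full-size frame containing only its own image.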
  • The second communication unit 86 establishes communication with the connected HMD 18a according to a predetermined protocol and transmits the output data formed by the output data forming unit 84, thereby causing the HMD 18a to output moving images and audio.
  • The third communication unit 88 establishes communication with another output control device 20b, the flat panel display 16, or another HMD 18b, and transmits the output data that the output data forming unit 84 has formed for those devices.
  • The communication unit 76 of the information processing device 10 and the first communication unit 82, second communication unit 86, and third communication unit 88 of the output control device 20a detect the type of each connected device by transmitting and receiving signals defined by the protocol used for communication. By aggregating this information on the information processing apparatus 10 side according to the processing procedure of the protocol, the connection status specifying unit 80 can identify the number of connected output control devices 20 and the type of display device connected to each.
  • For example, when connected by HDMI, a connected device can be detected by a hot-plug detection signal, the type of device can be recognized using the HDMI-CEC (Consumer Electronics Control) mechanism, and the connection order can be specified by addressing.
  • When connected by USB, the device detection function and addressing function of USB can be used similarly.
  • FIG. 6 is a diagram for explaining the composite image data output by the information processing apparatus 10.
  • the images 208a and 208b in the upper part of the figure are images that appear appropriately when displayed on the HMD 18 as they are.
  • the image 208a is displayed on the HMD 18a and the image 208b is displayed on the HMD 18b.
  • the output data generation unit 78 first generates images 208a and 208b corresponding to the HMDs 18a and 18b.
  • the generated images 208a and 208b are reduced by a factor of 1/2 in the horizontal direction and connected to the left and right to generate a composite image 210 for output.
  • the reduced image of the first image 208a is arranged in the left region 212a and the reduced image of the second image 208b is arranged in the right region 212b.
  • the information processing apparatus 10 generates such a composite image 210 for each frame and sequentially outputs it.
  • When N HMDs 18 are connected, the first image, the second image, ..., the Nth image are each reduced horizontally to 1/N and simply connected. By doing so, the amount of data to be output is constant regardless of the number of HMDs 18 connected, so there is no need to expand the communication hardware, and the transfer rate also remains constant.
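The reduce-and-connect step above can be sketched as follows. This is an illustrative model, not part of the patent disclosure: pixel grids are nested lists, and the 1/N horizontal reduction is done by nearest-neighbor column sampling.

```python
def reduce_width(image, factor):
    """Keep every factor-th pixel column of a row-major pixel grid."""
    return [row[::factor] for row in image]

def make_composite(images):
    """Reduce N equally sized images to 1/N width and join left to right,
    so the composite is always the size of a single original image."""
    n = len(images)
    reduced = [reduce_width(img, n) for img in images]
    height = len(images[0])
    # Row i of the composite is row i of every reduced image, in order.
    return [sum((img[i] for img in reduced), []) for i in range(height)]
```

Because the composite keeps the size of one original image, the same transmission channel serves any number of connected HMDs, as the text notes.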
  • The output data generation unit 78 includes, as additional information in the output data, information associating each region in the composite image 210 with the HMD 18 on which it is to be enlarged and displayed. This information may be transmitted at the start of output of a series of moving image data, or may be transmitted periodically, for example added to each frame.
  • The information processing unit 74 requests the output data generation unit 78 to generate each display image in association with the identification information assigned in advance to each HMD 18, generates the correspondence between the identification information and the arrangement of data in the composite image, and notifies the output data generation unit 78 of it when requesting generation of the composite image.
  • the model data necessary for generating the display image, the arrangement rule for generating the composite image, and the like are stored in the data storage unit 81.
  • the output data forming unit 84 of the output control device 20 extracts, for example, the region 212a from the composite image 210 in FIG. 6 and doubles it in the horizontal direction to restore the image 208a and output it to the HMD 18.
  • the area to be extracted is specified by referring to the additional information of the output data transmitted from the information processing apparatus 10.
  • For the enlargement processing, any general method such as the nearest-neighbor method, bilinear interpolation, bicubic interpolation, or the Lanczos method may be used.
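The extract-and-restore step can likewise be sketched as below. Nearest-neighbor enlargement (column repetition) is shown for brevity; a real implementation might use the bilinear, bicubic, or Lanczos filters named above. Function names and inclusive (start, end) bounds are illustrative assumptions.

```python
def extract_region(image, start, end):
    """Cut out columns start..end (inclusive) of a row-major pixel grid,
    as designated in the additional information."""
    return [row[start:end + 1] for row in image]

def enlarge_width(image, factor):
    """Nearest-neighbor horizontal enlargement: repeat each column."""
    return [[px for px in row for _ in range(factor)] for row in image]

# e.g. restoring image 208a from the left half of a composite of width w:
#   enlarge_width(extract_region(composite, 0, w // 2 - 1), 2)
```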
  • FIG. 7 shows a case where the HMD 18 and the flat display 16 are connected to one output control device 20 as a basic example.
  • the information processing apparatus 10 transmits the image 120 as it is to the output control apparatus 20 (S10).
  • the output control device 20 outputs and displays the image 120 as it is on the connected HMD 18 (S12), generates a display image 122 for a flat panel display, and transmits it to the flat panel display 16 (S14).
  • the HMD image 120 corresponds to the image (a) in FIG. 4, and an ellipse labeled “L” in the rectangle indicates the left-eye image, and an ellipse labeled “R” indicates the right-eye image.
  • the flat display image 122 corresponds to the image (b) in FIG. 4.
  • an image obtained by reversely correcting the left-eye image of the image 120 is represented by a rectangular portion denoted as “L”.
  • Alternatively, the right-eye image may be reversely corrected to serve as the image for the flat panel display, or both the left-eye and right-eye images may be reverse-corrected and displayed simultaneously on the flat panel display for stereoscopic viewing.
  • FIG. 8 shows a case where a second HMD 18b is connected in place of the flat panel display 16 in the connection state shown in FIG. 7. It is assumed that the same image is displayed on the two HMDs 18a and 18b.
  • When the information processing apparatus 10 generates an image to be commonly displayed on the HMDs 18a and 18b, it transmits the image 120 as it is to the output control apparatus 20 (S20).
  • the output control device 20 outputs the image 120 as it is to the two connected HMDs 18a and 18b, and displays them on each (S22 and S24). This allows two users to watch the same video or share experiences in the virtual world.
  • FIG. 9 shows a case where two HMDs 18a and 18b are connected to one output control device 20, as in the connection state shown in FIG. 8.
  • When the information processing apparatus 10 generates separate images to be displayed on the HMDs 18a and 18b, it transmits to the output control apparatus 20 a composite image 124 in which the images, each reduced in the horizontal direction by a factor of 1/2, are connected left and right (S30).
  • In the composite image 124, the ellipses labeled “L1” and “R1” are the left and right display images for the HMD 18a, and the ellipses labeled “L2” and “R2” are the left and right display images for the HMD 18b, each shown reduced in the horizontal direction.
  • The output control device 20 generates the images 126 and 128 to be displayed on the HMDs 18a and 18b based on the correspondence, included in the additional information transmitted from the information processing device 10, between regions in the composite image and the identification information of each HMD 18, and outputs them to the respective HMDs 18a and 18b for display (S32, S34).
  • Specifically, the left half area including the ellipses labeled “L1” and “R1” is extracted from the composite image 124 and doubled in the horizontal direction to generate the display image 126 for the HMD 18a, which is transmitted to the HMD 18a (S32).
  • Similarly, the right half area including the ellipses labeled “L2” and “R2” is extracted from the composite image 124 and doubled in the horizontal direction to generate the display image 128 for the HMD 18b, which is transmitted to the HMD 18b (S34). According to this aspect, two users can view the same virtual world from different viewpoints or view different screens of the same game.
  • FIG. 10 shows a case where two HMDs 18a and 18b are connected to two output control devices 20a and 20b, respectively, and a flat panel display 16 is connected to a lower output control device 20b.
  • In this case, the information processing apparatus 10 first transmits to the upper-level output control apparatus 20a the composite image 124 in which the images to be displayed on the HMDs 18a and 18b, each reduced by a factor of 1/2 in the horizontal direction as described with reference to FIG. 9, are connected left and right (S40).
  • Based on the correspondence between regions in the composite image and the identification information of each HMD 18 included in the additional information transmitted from the information processing apparatus 10, the output control apparatus 20a specifies the region corresponding to the connected HMD 18a, in the example of FIG. 10 the region including the ellipses labeled “L1” and “R1”. It then extracts the region from the composite image 124, generates the display image 126 for the HMD 18a by enlarging it twice in the horizontal direction, and transmits it to the HMD 18a (S42). The output control device 20a also transmits the data of the composite image 124 transmitted from the information processing device 10 as it is to the output control device 20b connected at the lower level (S44).
  • Similarly, based on the correspondence between regions in the composite image and the identification information of each HMD 18 included in the additional information transmitted from the information processing device 10, the output control device 20b specifies the region corresponding to the connected HMD 18b, in the example of FIG. 10 the region including the ellipses labeled “L2” and “R2”. It then extracts the region from the composite image 124, generates the display image 128 for the HMD 18b by enlarging it twice in the horizontal direction, and transmits it to the HMD 18b (S46).
  • The output control device 20b also extracts the region corresponding to the left-eye image of the HMD 18a from the composite image 124, enlarges it twice in the horizontal direction, and applies reverse correction, thereby generating the display image 130 for the flat panel display and transmitting it to the flat panel display 16 (S48).
  • FIG. 11 shows an example in which four HMDs 18 are connected.
  • That is, four output control devices 20a, 20b, 20c, and 20d are connected in series to the information processing device 10, and the HMDs 18a, 18b, 18c, and 18d are connected to the output control devices 20a, 20b, 20c, and 20d, respectively.
  • When displaying individual images on the four HMDs 18a, 18b, 18c, and 18d, the information processing apparatus 10 generates the composite image 134, in which the display images corresponding to the HMDs 18a, 18b, 18c, and 18d, each reduced by a factor of 1/4 in the horizontal direction, are connected in the horizontal direction, and transmits it to the highest-level output control device 20a (S50).
  • the display images corresponding to the HMDs 18a, 18b, 18c, and 18d are represented as “L1 / R1”, “L2 / R2”, “L3 / R3”, and “L4 / R4”, respectively.
  • The output control device 20a refers to the additional information, extracts the region corresponding to the connected HMD 18a from the composite image 134, and enlarges it four times in the horizontal direction, thereby generating the display image 136 for the HMD 18a and transmitting it to the HMD 18a (S52).
  • The output control device 20a also transmits the data of the composite image 134 transmitted from the information processing device 10 as it is to the output control device 20b connected at the lower level (S54). Thereafter, the data of the composite image 134 is transmitted in sequence to the output control devices 20c and 20d (S58, S62), while the respective display images 138, 140, and 142 are restored and transmitted to the HMDs 18b, 18c, and 18d connected to the output control devices 20b, 20c, and 20d (S56, S60, S64). Individual images are thereby displayed on the four HMDs 18a to 18d.
  • By the same mechanism, images can be displayed in parallel on a display system that combines any number of HMDs 18 and any number of flat panel displays 16.
  • Here, the connection order of the HMDs 18a to 18d corresponds to the arrangement order of the display images in the composite image 134; however, as long as they are associated in the additional information as described above, the order is not particularly limited.
  • In the examples above, the data of the composite images 124 and 134 output from the information processing apparatus 10 is transmitted as it is between the plurality of output control apparatuses 20. Alternatively, a higher-level output control device 20 may exclude the image it has extracted from the composite image and transmit the remaining image data to the next lower-level output control apparatus 20. In this way, the size of the image data to be transmitted decreases as it passes through each output control device 20.
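The alternative forwarding scheme amounts to dropping one column range per hop. A minimal sketch, assuming inclusive (start, end) column bounds from the additional information and a nested-list pixel grid; the function name is illustrative:

```python
def forward_without_own_region(composite, start, end):
    """Remove the columns this device already extracted before forwarding
    the remainder to the next lower-level output control device."""
    return [row[:start] + row[end + 1:] for row in composite]
```

Each hop thus forwards a strictly narrower image, which is the bandwidth saving the text describes.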
  • FIG. 12 shows a case where four HMDs 18a, 18b, 18c, and 18d connected in the same manner as in FIG. 11 form a plurality of groups, and individual images are displayed for each group.
  • As indicated by the labels appended to the HMDs 18a to 18d in the figure, the HMDs 18a, 18c, and 18d belong to group “a” and the HMD 18b belongs to group “b”.
  • In this case, the information processing apparatus 10 reduces the display image corresponding to each group (denoted “La/Ra” and “Lb/Rb”) by a factor of 1/2 in the horizontal direction, and transmits the composite image 154 in which they are connected left and right to the highest-level output control device 20a (S70).
  • The output control device 20a refers to the additional information, extracts from the composite image 154 the region corresponding to the group “a” to which the connected HMD 18a belongs, and doubles it in the horizontal direction, thereby generating the display image 156 for group “a” and transmitting it to the HMD 18a (S72).
  • the output control device 20a also transmits the data of the composite image 154 transmitted from the information processing device 10 to the output control device 20b connected to the lower level as it is (S74).
  • The output control device 20b generates the display image 158 for group “b” by extracting from the composite image 154 the region corresponding to the group “b” to which the connected HMD 18b belongs and doubling it in the horizontal direction, and transmits it to the HMD 18b (S76).
  • The output control device 20b also transmits the data of the composite image 154 transmitted from the information processing device 10 as it is to the output control device 20c connected at the lower level (S78). Like the output control device 20a, the output control device 20c generates the display image 156 for group “a” and transmits it to the connected HMD 18c (S80), and transmits the data of the composite image 154 from the information processing apparatus 10 as it is to the output control device 20d connected at the lower level (S82).
  • the output control device 20d also generates a display image 156 for the group “a” and transmits it to the connected HMD 18d (S84).
  • Here, the four HMDs 18a to 18d are divided into the two groups “a” and “b”, but the numbers of HMDs 18 and of groups are not particularly limited.
  • Such an aspect can be applied to cases where an individual field of view is displayed for each team in a team-battle game, or where the display screen is changed depending on a participant's role, such as game player or spectator.
  • The screen of the flat panel display 16 may also be divided into a plurality of areas and, for example, the left-eye images of the images displayed on the respective HMDs 18 may be displayed in parallel.
  • The composite image output by the information processing apparatus 10 is not limited to consisting only of HMD images; it may include an image that can be displayed on the flat panel display 16 as it is, or a plurality of images each to be displayed on one of multiple flat panel displays 16.
  • the information processing apparatus 10 may transmit such composite image data to another information processing apparatus or server via a network and display the data on a display device at a remote location.
  • FIG. 13 shows an example of the data structure of the connection status of the devices managed by the information processing device 10 and the correspondence information of the parameters used for processing.
  • the device management information 160 includes a display device ID column 162, a physical ID column 164, an output control device ID column 166, a display device type column 168, a group ID column 170, and an image region column 172.
  • First, the connection state specifying unit 80 detects the connected devices using the communication function and assigns identification information (a logical address) to each physical address.
  • The information described in the display device ID column 162 of the illustrated device management information 160 is the identification number assigned to each display device, that is, each HMD 18 or flat panel display 16, among the detected devices. The assignment also represents information related to the order of connection.
  • the physical ID column 164 represents a physical address that each display device has.
  • the output control device ID column 166 represents identification information of the output control device 20 to which each display device is connected.
  • the display device type column 168 represents information for identifying the type of each display device, that is, whether it is an HMD or a flat panel display (shown as “TV” in the figure).
  • In the illustrated example, the display devices with identification numbers “01” to “04” are HMDs, and the display device with identification number “05” is a “TV”, that is, a flat panel display.
  • the HMDs with the identification numbers “01” to “04” are connected to the output control devices 20 with the identification information “A” to “D”, respectively.
  • The flat panel display with identification number “05” is further connected to the output control device 20 with identification information “D”.
  • the connection state identification unit 80 collects these pieces of correspondence information and notifies the information processing unit 74 of them.
  • The information processing unit 74 determines, depending on the content of the game, whether to display a common image on all display devices, a different image on each, or a different image for each group.
  • When displaying a different image for each group, the group to which each display device belongs is specified. For example, if each HMD 18 is assigned to a particular user, the identification number of each display device can be associated with a group by separately holding information that associates user IDs with the physical addresses of the HMDs 18 and grouping the user IDs, for example by user operation at the start of the game. However, those skilled in the art will understand that various methods are conceivable for associating the HMDs 18 with users and with groups.
  • the group ID column 170 shows the identification information of the group associated with each HMD 18 in this way.
  • The flat panel display 16 with identification number “05” belongs to both groups “a” and “b”, and is thereby set to display both images in parallel on a divided screen. Alternatively, only one of the images may be displayed, as described above.
  • the group ID column 170 can be omitted for games that do not require grouping.
  • the information processing unit 74 determines the arrangement of the image data to be displayed on the HMD 18 of each group in the composite image.
  • the lower part of FIG. 13 schematically shows an example of the arrangement.
  • In the illustrated example, the left half of the composite image 174 holds the data of the left and right images of group “a”, and the right half holds the data of the left and right images of group “b”.
  • the horizontal range of the composite image is set in the image area column 172 of the device management information 160.
  • If the horizontal size of the composite image 174 is X pixels, the range is specified on a coordinate axis whose left end is 0 and whose right end is X, in a format such as (start point coordinate, end point coordinate).
  • an area of (0, X / 2-1) representing the left half of the composite image 174 is designated for the HMDs 18 having the identification numbers “01”, “03”, and “04” belonging to the group “a”.
  • For the HMD 18 with identification number “02” belonging to group “b”, the area (X/2, X-1) representing the right half of the composite image is designated.
  • For the flat panel display 16 with identification number “05”, which simultaneously displays the images of both groups, the areas (0, X/4-1) and (X/2, 3X/4-1), corresponding to the left-eye images of each group, are designated.
  • Such device management information 160 is stored in the data storage unit 81 and is shared inside the information processing device 10.
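The device management information 160 and the area designations above can be modeled as a small table. A minimal sketch, assuming X = 1024 and the dictionary layout and field names shown here, which are illustrative and not the patent's literal data format:

```python
X = 1024  # assumed horizontal size of the composite image, in pixels

# One record per display device: output control device, type, group, and
# the (start, end) horizontal range(s) in the composite image, inclusive.
device_management = {
    "01": {"ctrl": "A", "type": "HMD", "group": "a", "region": (0, X // 2 - 1)},
    "02": {"ctrl": "B", "type": "HMD", "group": "b", "region": (X // 2, X - 1)},
    "03": {"ctrl": "C", "type": "HMD", "group": "a", "region": (0, X // 2 - 1)},
    "04": {"ctrl": "D", "type": "HMD", "group": "a", "region": (0, X // 2 - 1)},
    # The flat panel display belongs to both groups and shows the
    # left-eye quarter of each group's region side by side.
    "05": {"ctrl": "D", "type": "TV", "group": ("a", "b"),
           "region": ((0, X // 4 - 1), (X // 2, 3 * X // 4 - 1))},
}

def region_for(device_id):
    """Region(s) an output control device extracts for a display device."""
    return device_management[device_id]["region"]
```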
  • The output data generation unit 78 performs the image generation processing requested by the information processing unit 74 in association with the identification number of each display device or the identification information of each group to generate the display images, then reduces them horizontally and connects them as specified in the image area column 172 to generate the composite image 174. The information processing apparatus 10 also transmits the correspondence relationships represented by the device management information 160 to the output control apparatus 20 as additional information.
  • each output control device 20 that has received the data of the composite image 174 specifies an area corresponding to the display device connected to itself based on the additional information, and extracts the final display image from the composite image 174.
  • the fields of the device management information 160 and the format of the notation shown here are merely examples, and various forms are conceivable depending on the content of the game, the communication protocol, and the like.
  • the user ID may be further associated, or may be associated with the input device ID of each user.
  • FIG. 14 is a flowchart illustrating a processing procedure in which the information processing apparatus 10 outputs image data to be displayed on each display device.
  • First, the communication unit 76 and the connection state specifying unit 80 detect the connected devices and assign identification numbers (S100). This information is notified to the information processing unit 74 and to each detected device.
  • Next, a display image is generated through the cooperation of the information processing unit 74 and the output data generation unit 78 and, if necessary, each display image is reduced to 1/N in the horizontal direction and arranged in its associated region to form the image for output.
  • FIG. 15 is a flowchart showing a processing procedure for causing the output control device 20 to display an image on each display device.
  • First, the output control device 20 acquires information indicating the correspondence between display devices or groups and regions in the composite image (S120). This information is notified from the information processing apparatus 10 in S110 of FIG. 14.
  • the timing of notifying the correspondence information is not particularly limited, and may be periodically added to the output image such as every frame or every predetermined number of frames.
  • The output control device 20 then acquires the image data transmitted from the information processing device 10 (S122). If the acquired image is a composite image (Y in S124), the region corresponding to the HMD 18 connected to the device itself, or to the group to which that HMD belongs, is extracted from the composite image and enlarged in the horizontal direction to generate the final display image (S126). Whether the transmitted image is a composite image, the region to be extracted, and the enlargement ratio can all be determined from the correspondence information acquired in S120.
  • If the image transmitted from the information processing apparatus 10 is not a composite image, either only one HMD 18 is connected or a common image is displayed on all of the plurality of HMDs 18 (N in S124).
  • If the flat panel display 16 is connected (Y in S128), the output control device 20 extracts the data of one of the parallax images, such as the left-eye image, and applies reverse correction to generate a display image for the flat panel display (S130). If the flat panel display 16 is not connected (N in S128), the processing of S130 is skipped.
  • the display image data generated in S126 is output to the connected HMD 18 and the display image data generated in S130 is output to the flat panel display 16, thereby displaying the image on each display device (S132).
  • If another output control device 20 is connected, the image data acquired from the information processing device 10 in S122 is output to it as it is.
  • Until it becomes necessary to stop the display (N in S134), the processing of S124 to S132 is repeated for the image data transmitted directly or indirectly from the information processing apparatus 10 (S122). As a result, a moving image such as a game screen is displayed on each HMD 18 and on the flat panel display 16. When it becomes necessary to stop the display, all processing is terminated (Y in S134).
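The branch in S124 to S126 can be sketched per frame as below. Here `correspondence` stands in for the information acquired in S120, pixel grids are nested lists, and nearest-neighbor enlargement is an illustrative simplification of the interpolation methods mentioned earlier.

```python
def frame_for_hmd(frame, correspondence, device_id):
    """Per-frame branch of S124-S126 for one connected HMD."""
    region = correspondence.get(device_id)   # no entry => not a composite
    if region is None:
        return frame                          # N in S124: use the frame as is
    start, end = region                       # Y in S124: extract and enlarge
    factor = len(frame[0]) // (end - start + 1)
    cropped = [row[start:end + 1] for row in frame]
    # Nearest-neighbor horizontal enlargement back to full width (S126).
    return [[px for px in row for _ in range(factor)] for row in cropped]
```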
  • FIG. 16 is a diagram for explaining a composite image in which display images are connected in the vertical direction.
  • the upper part of the figure is a composite image 180 in this case, and the lower part is images 182 and 184 to be displayed on the two HMDs 18 respectively.
  • The images 182 and 184 originally have the left and right parallax images arranged in regions obtained by dividing the screen in the horizontal direction; to form the vertically connected composite image 180 as illustrated, the left and right images constituting each display image must also be rearranged. That is, for the image 182, the left-eye image labeled “L1” is arranged in the first row of the composite image 180 and the right-eye image labeled “R1” in the second row.
  • Similarly, for the image 184, the left-eye image labeled “L2” is arranged in the third row of the composite image 180 and the right-eye image labeled “R2” in the fourth row.
  • If the images “L1”, “R1”, “L2”, and “R2” are each enlarged by a factor of 2 in the horizontal direction, reduced by a factor of 1/4 in the vertical direction, and then connected in the vertical direction, a composite image 180 of the same size as the original images 182 and 184 is obtained, as illustrated.
  • Generalizing to N display images, the left-eye and right-eye images of each display image are enlarged by a factor of 2 in the horizontal direction, reduced to 1/(2N) in the vertical direction, and connected in a predetermined order. This produces a composite image in which the left-eye and right-eye images corresponding to one display image are arranged vertically within each of the N regions obtained by dividing the image area in the vertical direction.
  • Each output control device 20 extracts the region corresponding to the connected HMD 18, enlarges the left-eye and right-eye images by a factor of 2N in the vertical direction, and reduces them by a factor of 1/2 in the horizontal direction, thereby restoring the images 182 and 184.
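The size bookkeeping of the vertical arrangement can be checked with simple arithmetic; the function and the 1920x1080 example values are illustrative assumptions.

```python
def vertical_tile_size(w, h, n):
    """Size of one parallax image inside the vertical composite, given an
    original w x h display image and n display images: each w/2-wide
    parallax half is enlarged 2x horizontally and reduced to 1/(2n)
    vertically."""
    return (w // 2) * 2, h // (2 * n)

# With two HMDs (n = 2) and 1920x1080 display images, the 2n = 4 stacked
# tiles again total 1920x1080, matching the composite image 180.
tile_w, tile_h = vertical_tile_size(1920, 1080, 2)
assert (tile_w, tile_h * 4) == (1920, 1080)
```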
  • the processing procedure may be the same as that shown in FIGS. 14 and 15 except that the area designation in the image area column 172 of the device management information 160 indicates the vertical coordinate axis.
  • In this case, a buffer memory having a capacity for storing at least the entire reduced left-eye image is required. Also, as the number of images included in the composite image increases, the input of data located lower in the composite image is delayed, so the start of display of an image using that data is delayed by a small amount of time that is at most the frame period.
  • On the other hand, since information in the horizontal direction represents the left-right parallax, reducing an image in that direction tends to affect the stereoscopic effect.
  • Whether the display images are connected in the horizontal direction or the vertical direction is therefore determined appropriately according to the available memory capacity, the display content, and whether stereoscopic viewing is performed.
  • When the composite image is composed only of the data of the left and right parallax images, the images may simply be reduced to 1/N in the vertical direction.
  • FIGS. 17 and 18 are diagrams for explaining a modification of the transmission mode of image data to be displayed on a plurality of HMDs.
  • FIG. 17 shows an example in which a plurality of images to be displayed are connected and transmitted without being reduced.
  • the upper part of the figure shows the composite image 220 in this case, and the lower part shows the images 222, 224, 226, and 228 to be displayed on the four HMDs 18, respectively.
  • the composite image 220 is an image in which four display images 222, 224, 226, and 228 are connected as they are, two in the horizontal direction and two in the vertical direction.
  • Since the composite image 220 is four times the size of the original images, the transmission band between the information processing device 10 and each output control device 20 must be quadrupled. On the other hand, since enlargement processing becomes unnecessary, the processing load on the output control device 20 is reduced.
  • the number of images to be connected as a composite image is not limited to four, and the connection order is not limited to that illustrated.
  • Also, one display image may be divided and connected across a plurality of rows or columns. Furthermore, this may be combined with the mode of reducing the display images, so that more display images can be combined while allowing the composite image to grow in size.
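Locating a display image inside the unreduced composite reduces to grid arithmetic. The row-major, two-column layout below matches the 2 x 2 arrangement of FIG. 17; the function name and parameters are assumptions.

```python
def tile_origin(i, w, h, cols=2):
    """Top-left (x, y) of the i-th w x h display image in a composite
    laid out row-major with `cols` images per row."""
    return (i % cols) * w, (i // cols) * h
```

An output control device can then crop its w x h tile at that origin without any enlargement step.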
  • FIG. 18 shows an example in which a plurality of images to be displayed are transmitted at a high rate without being reduced.
  • the upper part of the figure shows an image sequence 230 to be output in a normal output period for one frame, and the lower part shows images 232 and 234 to be displayed on the two HMDs 18 respectively.
  • As shown in the figure, the images 232 and 234 to be displayed on the respective HMDs 18 are arranged along the time axis indicated by “t”. Since the images 232 and 234 are each frame images of moving images, the frame images of the two moving images are, as a result, output alternately at double the rate. When three or more moving images are included, their frame images are likewise output in order.
  • The output control device 20 extracts the data at the timing when a frame to be displayed on the corresponding HMD 18 arrives. To this end, the information processing apparatus 10 determines the output order according to the number of HMDs 18 and transmits additional information associating that order with the HMDs 18 on which the frames are to be displayed. In this aspect, when the number of images to be displayed is N, the output rate is increased N times, so the transmission band between the information processing device 10 and each output control device 20 must be N times larger.
  • On the other hand, each HMD 18 can display the acquired image without enlargement, which reduces the processing load on the output control device 20. This mode may also be combined with the mode of reducing the display images, so that more display images can be output while allowing some increase in the output rate.
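The time-division variant can be sketched as round-robin interleaving on the sender and strided extraction on each output control device; the slot assignment below stands in for the additional information described above and is an assumed structure.

```python
def interleave(streams):
    """Round-robin merge of N equal-length frame streams: the sender
    outputs frame 0 of every stream, then frame 1, and so on, at N times
    the normal rate."""
    return [frame for group in zip(*streams) for frame in group]

def frames_for_slot(wire, n, slot):
    """Frames the output control device assigned to `slot` (0..n-1)
    extracts from the interleaved sequence."""
    return wire[slot::n]
```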
  • FIG. 19 is a diagram for explaining a method of embedding information in an image output from the information processing apparatus 10 as one embodiment of the latter.
  • the image 190 is an image transmitted from the information processing apparatus 10 to the output control apparatus 20.
  • this is a composite image obtained by connecting the display image for group "a" and the display image for group "b".
  • the output control device 20 can identify which region of the image 190 it should extract based on the color of the invisible region. In an aspect in which no groups are formed, the HMDs 18 and the colors, or the output control devices 20 and the colors, may be associated directly. In any aspect, it suffices for the information processing apparatus 10 and each output control device 20 to share the correspondence; for example, if the correspondence between colors and the physical addresses of the HMDs 18 or output control devices 20 is set in advance in both, no information about the correspondence needs to be sent or received.
  • the information processing apparatus 10 may determine which pattern to use at the start of a game, depending on the content, the connection state of the display devices, and so on, and notify the output control device 20 of the choice.
  • the operation is the same if each invisible region is filled with a different color. Conversely, even when the output is not a composite image, that fact can be indicated by making the colors of all invisible regions the same.
  • as the fill colors, by selecting, for example, from the eight color patterns formed by setting each of red, green, and blue to either 0 or the maximum gradation, the output control device 20 can easily determine the region to be extracted without misidentification.
  • the information processing apparatus 10 outputs data of the image 190 in which the correspondence information is represented by the color of the invisible region for all the frames to be output or frames at a predetermined interval.
  • the area determination process in the output control device 20 can be simplified, and even if the area to be extracted changes partway through due to an increase or decrease in the number of connected HMDs 18, the appropriate areas can be extracted without delay. Note that instead of filling all invisible areas, only some areas of a predetermined size may be filled. Even within the invisible region corresponding to one display image, the amount of embedded information may be increased by using different colors, or by displaying marks or patterns, depending on the position.
  • the left half invisible area 192a and the right half invisible area 192b are each composed of six small areas.
  • information may be expressed in color, for example with the area in the upper left corner indicating the group and the area in the upper center indicating the enlargement ratio of the extracted image.
  • the above-described pattern of the correspondence relationship between the color and the apparatus may be represented by a color at any location. Further, the information may be expressed by making the invisible region a striped pattern and changing the direction, width, color combination, and the like.
  • an output control device is connected in series to an information processing device that generates image data, and a display device such as an HMD or a flat panel display is connected to each output control device.
  • the information processing apparatus generates and outputs a single display image, or a composite image obtained by reducing and connecting a plurality of display images, according to the connection state and the content of the images to be displayed on the display devices. This makes it possible to increase the number of connected display devices and the number of images displayed on them without increasing the number of output terminals of the information processing apparatus or of each output control device, and without increasing the size of the data to be transmitted.
  • This mode is particularly effective for content using HMDs, on which each user views a display image individually, and makes it easy to realize a distinctive mode in which a plurality of users view the same game or virtual world from individual viewpoints. If an HMD and an output control device are provided as a paired set, users can enjoy the desired content at minimum cost by purchasing additional sets according to the content and the number of participants.
  • a composite image is composed of images that have been subjected to distortion correction corresponding to the lens for display by the HMD.
  • Information for specifying an area to be extracted from the composite image by each output control device is represented by a color or a graphic in the invisible area generated thereby.
  • 8 information processing system, 10 information processing apparatus, 12 imaging device, 14a input device, 16 flat panel display, 18 HMD, 20 output control device, 72 input data acquisition unit, 74 information processing unit, 76 communication unit, 78 output data generation unit, 80 connection state specifying unit, 81 data storage unit, 82 first communication unit, 84 output data forming unit, 86 second communication unit, 88 third communication unit.
  • the present invention can be used for computers, display devices, game devices, information processing devices, image processing devices, systems including them, and the like that process various kinds of electronic content.
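The double-rate output of FIG. 18 discussed above can be sketched as a minimal model in Python; the function names are illustrative and do not appear in the specification:

```python
def interleave_frames(streams):
    """Information-processing side: merge the per-HMD frame streams into
    one sequence output at N times the normal rate, one frame per HMD
    per time step, in a fixed order."""
    out = []
    for frames in zip(*streams):  # one frame from each stream per step
        out.extend(frames)        # HMD 0, HMD 1, ..., HMD N-1
    return out

def extract_for_hmd(sequence, hmd_index, n):
    """Output-control-device side: keep only the frames that arrive in
    the time slot assigned to this HMD (every n-th frame)."""
    return sequence[hmd_index::n]

# Two HMDs: frames a0, a1 for the first, b0, b1 for the second.
seq = interleave_frames([["a0", "a1"], ["b0", "b1"]])
assert seq == ["a0", "b0", "a1", "b1"]            # double-rate alternation
assert extract_for_hmd(seq, 0, 2) == ["a0", "a1"]
assert extract_for_hmd(seq, 1, 2) == ["b0", "b1"]
```

The additional information transmitted by the information processing apparatus corresponds here to the `hmd_index` assignment: each output control device must be told which slot in the output order is its own.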
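Likewise, the color-keyed invisible regions of FIG. 19 admit a simple decoder. The sketch below assumes, as described above, fill colors drawn from the eight patterns formed by setting each of red, green, and blue to either 0 or the maximum gradation; the particular color-to-group dictionary is illustrative, not from the specification:

```python
def classify_fill_color(pixel, max_level=255):
    """Quantize an invisible-region fill color to one of the eight
    patterns in which each of R, G, B is either 0 or the maximum
    gradation. Thresholding at the midpoint keeps the decision robust
    against noise introduced in transmission."""
    return tuple(int(c > max_level // 2) for c in pixel)

# Illustrative correspondence shared in advance by the apparatus and the
# output control devices: pure red -> group "a", pure green -> group "b".
color_to_group = {(1, 0, 0): "a", (0, 1, 0): "b"}

assert classify_fill_color((250, 3, 7)) == (1, 0, 0)   # slightly noisy red
assert color_to_group[classify_fill_color((250, 3, 7))] == "a"
assert color_to_group[classify_fill_color((10, 240, 2))] == "b"
```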

Abstract

The present invention connects output control apparatuses 20a and 20b serially to an information processing apparatus 10 so that a display apparatus such as an HMD 18a, an HMD 18b, or a flat panel display 16 can be connected to each of the output control apparatuses 20a and 20b. The information processing apparatus 10 determines their connection states and outputs a composite image 124 generated by reducing and connecting the images to be displayed on the HMDs 18a and 18b, respectively. Each of the output control apparatuses 20a and 20b receives the composite image, extracts the area corresponding to the display apparatus connected to it, and generates and outputs the final display images 126, 128, and 130, thereby causing them to be displayed.
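The reduce-connect-extract flow summarized above can be sketched as follows, treating an image as a list of pixel rows. The halving by column decimation and the enlargement by pixel doubling are simplifications chosen for brevity, not the scaling method of the embodiment:

```python
def compose(images):
    """Information-processing side: halve each display image horizontally
    (simple column decimation) and connect the results side by side, so
    two images fit in one frame of the original width."""
    reduced = [[row[::2] for row in img] for img in images]
    return [sum(rows, []) for rows in zip(*reduced)]

def extract(frame, index, count):
    """Output-control side: cut out this display's region of the
    composite and enlarge it back to full width by pixel doubling."""
    w = len(frame[0]) // count
    region = [row[index * w:(index + 1) * w] for row in frame]
    return [[p for p in row for _ in (0, 1)] for row in region]

img_a = [[1, 1, 1, 1], [2, 2, 2, 2]]  # 4x2 display image for HMD 18a
img_b = [[3, 3, 3, 3], [4, 4, 4, 4]]  # 4x2 display image for HMD 18b
frame = compose([img_a, img_b])       # composite, still 4 pixels wide
assert frame == [[1, 1, 3, 3], [2, 2, 4, 4]]
assert extract(frame, 0, 2) == img_a  # pixel doubling restores the size
assert extract(frame, 1, 2) == img_b
```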

Description

Information processing apparatus, output control apparatus, information processing system, and moving image data output method
 The present invention relates to a technique for transmitting and processing image data associated with the display of captured images and rendered images.
 A head-mounted display (hereinafter "HMD"), which a user wears on the head to display images in front of the eyes, has been put into practical use. In recent years, HMDs have been proposed that present stereoscopic video by displaying parallax images to the user's left and right eyes, and that further track the movement of the user's head and reflect the result in the video, giving a sense of immersion and presence in a virtual space (see, for example, Patent Document 1).
JP 2014-93704 A
 Unlike a flat panel display, an HMD is fundamentally premised on only the wearing user enjoying the displayed content, such as video. Particularly when displaying a virtual space whose viewpoint changes in response to head movement, the user's individual movements interact with the video, so the use of HMDs has tended to be limited to personal play and personal viewing of video content. It is therefore desirable to realize a content providing technology that allows a user to share the experience with other users while wearing an HMD and enjoying the sense of immersion.
 The present invention has been made in view of these problems, and an object thereof is to provide a technology that allows a plurality of people to enjoy content including image display on an HMD.
 One aspect of the present invention relates to an information processing apparatus. The information processing apparatus includes an output data generation unit that generates moving image data to be displayed on a display device, and a communication unit that sequentially outputs the moving image data generated by the output data generation unit. When a plurality of display devices are connected, the output data generation unit causes the communication unit to output data of a composite image obtained by combining, frame by frame, a plurality of moving images to be displayed independently on the respective display devices.
 Another aspect of the present invention relates to an output control device. The output control device includes an input unit that acquires display moving image data output by an information processing apparatus; an output data forming unit that, when the display moving image data includes data of a composite image in which a plurality of moving images to be displayed independently on a plurality of display devices are combined frame by frame, extracts from the composite image data the partial region to be displayed on the display device connected to the output control device and forms the final display image; and an output unit that outputs the final display image data to the connected display device.
 Still another aspect of the present invention relates to an information processing system. The information processing system includes an information processing apparatus that outputs moving image data, and an output control device that acquires the moving image data and causes a display device to display it. The information processing apparatus includes an output data generation unit that generates moving image data to be displayed on the display devices, and a communication unit that sequentially outputs the generated moving image data to the output control devices. When a plurality of display devices are connected, the output data generation unit causes the communication unit to transmit data of a composite image obtained by combining, frame by frame, a plurality of moving images to be displayed independently on the respective display devices. The output control device includes an input unit that acquires the moving image data; an output data forming unit that, when the moving image data includes composite image data, extracts from the composite image data the partial region to be displayed on the display device connected to the output control device and forms the final display image; and an output unit that outputs the final display image data to the connected display device.
 Still another aspect of the present invention relates to a moving image data output method. The moving image data output method includes a step of generating moving image data to be displayed on display devices, and a step of sequentially outputting the generated moving image data. When a plurality of display devices are connected, the generating step includes generating data of a composite image obtained by combining, frame by frame, a plurality of moving images to be displayed independently on the respective display devices.
 Still another aspect of the present invention also relates to a moving image data output method. This method includes a step of acquiring display moving image data output by an information processing apparatus; a step of, when the display moving image data includes data of a composite image in which a plurality of moving images to be displayed independently on a plurality of display devices are combined frame by frame, extracting from the composite image data the partial region to be displayed on the connected display device to form the final display image; and a step of outputting the final display image data to the connected display device.
 Note that any combination of the above components, and conversions of the expression of the present invention between a method, an apparatus, a system, a computer program, a recording medium on which a computer program is recorded, and the like, are also effective as aspects of the present invention.
 According to the present invention, content including image display on an HMD can be diversified.
FIG. 1 is a diagram showing a configuration example of an information processing system to which the present embodiment can be applied.
FIG. 2 is a diagram showing an example of the external shape of the HMD in the present embodiment.
FIG. 3 is a diagram showing the internal circuit configuration of the information processing apparatus in the present embodiment.
FIG. 4 is a diagram for explaining images displayed on the HMD and the flat panel display in the present embodiment.
FIG. 5 is a diagram showing the configuration of functional blocks of the information processing apparatus and the output control device in the present embodiment.
FIG. 6 is a diagram for explaining the composite image data output by the information processing apparatus in the present embodiment.
FIGS. 7 to 12 are diagrams showing connection states of the devices in the information processing system of the present embodiment and configuration examples of the image data transmitted between them.
FIG. 13 is a diagram showing an example data structure of information, managed by the information processing apparatus in the present embodiment, associating device connection states with parameters used for processing.
FIG. 14 is a flowchart showing the procedure by which the information processing apparatus in the present embodiment outputs data of the images to be displayed on each display device.
FIG. 15 is a flowchart showing the procedure by which the output control device in the present embodiment causes each display device to display images.
FIG. 16 is a diagram for explaining a composite image in which display images are connected vertically in the present embodiment.
FIGS. 17 and 18 are diagrams for explaining modifications of the manner of transmitting image data to be displayed on a plurality of HMDs in the present embodiment.
FIG. 19 is a diagram for explaining a method of embedding, in an image output by the information processing apparatus, information associating regions in the image with HMDs in the present embodiment.
 FIG. 1 shows a configuration example of an information processing system to which the present embodiment can be applied. In this example, the information processing system 8 includes an imaging device 12 that captures images of objects, an information processing apparatus 10 that performs information processing based on the captured images, a flat panel display 16 that displays images obtained as a result of the information processing, HMDs 18a and 18b worn by the respective users, and input devices 14a and 14b operated by the respective users.
 The information processing apparatus 10 may be connected to the imaging device 12, the input devices 14a and 14b, the flat panel display 16, and the HMDs 18a and 18b by cables, or by a known wireless communication technology such as Bluetooth (registered trademark). Depending on the information processing performed by the information processing apparatus 10, the imaging device 12, the flat panel display 16, and the input devices 14a and 14b may be omitted. The external shapes of these devices are not limited to those illustrated. Furthermore, the number of users wearing HMDs is not limited to two; it may be one, or three or more.
 The imaging device 12 includes a camera that captures images of objects such as users at a predetermined frame rate, and a mechanism that generates output data of the captured images by applying common processing such as demosaicing to the camera's output signal and sends the data to the information processing apparatus 10. The camera includes a visible light sensor of the kind used in common digital cameras and digital video cameras, such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor. The imaging device 12 may include only one camera, or may be a so-called stereo camera in which two cameras are arranged left and right at a known interval, as illustrated.
 Alternatively, the imaging device 12 may be configured as a combination of a monocular camera and a device that irradiates an object with reference light such as infrared light and measures the reflected light. When a stereo camera or a reflected light measurement mechanism is introduced, the position of a subject in three-dimensional real space can be obtained, allowing the information processing and the display images to be further diversified. Techniques for determining a subject's distance from the camera by the principle of triangulation, using stereo images captured from left and right viewpoints, and for determining the distance by measuring reflected light with TOF (Time of Flight) or pattern projection methods, are both widely known.
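The triangulation principle cited here reduces to the well-known relation Z = fB/d: depth equals the focal length (in pixels) times the camera baseline, divided by the disparity between the two views. A brief sketch of that standard relation, not code from the embodiment:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Stereo triangulation: Z = f * B / d, with focal length f in
    pixels, baseline B in metres, and disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A point seen 50 px apart by two cameras 0.1 m apart, with f = 500 px,
# lies 1 metre from the cameras.
assert depth_from_disparity(500, 0.1, 50) == 1.0
```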
 The information processing apparatus 10 outputs data of the images to be displayed on the HMDs 18a and 18b and the flat panel display 16. The image data may be pre-existing complete data such as a movie or recorded video, or may be rendered in real time inside the information processing apparatus 10. In the latter case, the information processing apparatus 10, for example, applies common face detection and tracking processing to captured images acquired from the imaging device 12 at a predetermined frame rate, thereby progressing a game featuring a character that reflects the movements of the user appearing as a subject, or converting the user's movements into command inputs for information processing.
 At this time, the information processing apparatus 10 may acquire each user's movements using markers and acceleration sensors provided on the HMDs 18a and 18b and the input devices 14a and 14b, or acquire the users' operation information from the input devices 14a and 14b, and generate a different image for each user. For example, by tracking the images of the markers of the HMDs 18a and 18b in the captured images, the position and movement of each user's head is identified. By changing the viewpoint of the virtual world displayed on the HMDs 18a and 18b according to that movement, each user can view the virtual world from his or her own viewpoint and obtain a sense of presence and immersion. In addition to such image data, the information processing apparatus 10 may also output audio data such as background music.
 The HMDs 18a and 18b are display devices that, when worn on the user's head, display images on a display panel such as an organic EL panel positioned in front of the eyes. For example, parallax images viewed from left and right viewpoints are generated and displayed in the left and right regions formed by dividing the display screen in two, allowing the image to be viewed stereoscopically. However, the present embodiment is not limited to this, and a single image may be displayed on the entire display screen. The HMDs 18a and 18b may further be equipped with light emitting markers consisting of an element, or a set of elements, that emits light of a predetermined color; speakers or earphones that output sound at positions corresponding to the user's ears; and an acceleration sensor that detects the tilt of the wearer's head.
 The flat panel display 16 may be a television having a display that outputs two-dimensional images and a speaker that outputs sound; specifically, it may be a liquid crystal television, an organic EL television, a plasma television, a PC display, or the like. Alternatively, it may be the display and speaker of a tablet or mobile terminal. The input devices 14a and 14b, when operated by the users, accept requests such as starting and ending processing, selecting functions, and entering various commands, and supply them to the information processing apparatus 10 as electrical signals.
 The input devices 14a and 14b may be realized by any common input device, such as a game controller, keyboard, mouse, joystick, or a touch pad provided on the display screen of the flat panel display 16, or by a combination of these. The input devices 14a and 14b may further include a light emitting marker consisting of an element, or a set of elements, that emits light of a predetermined color. In this case, the information processing apparatus 10 can track the movement of the marker using the captured images, so that the movement of the input devices 14a and 14b can itself serve as a user operation. The input devices 14a and 14b may also consist solely of a light emitting marker with a grip.
 Thus, in the present embodiment, output data can be transmitted from a single information processing apparatus 10 to the plurality of HMDs 18a and 18b. This makes it possible, for example, for a plurality of users to participate in a single game, each operating the input devices 14a and 14b while viewing the game situation from his or her own viewpoint. In addition, by displaying either or both of the images shown on the HMDs 18a and 18b on the flat panel display 16, spectating users can watch the game.
 The data output by the information processing apparatus 10 is first transmitted to the HMD 18a, as indicated by the arrows, and then from the HMD 18a to the HMD 18b. When the number of users increases, the data is relayed sequentially between the HMDs. In this way, the number of users wearing HMDs can be increased without increasing the number of communication terminals on each device. The flat panel display 16 can also be included as one link in such a chain of HMDs. In the following description, the HMDs 18a and 18b may be referred to collectively as the HMD 18. As described above, the information processing apparatus 10 can output audio data to each display device, but the following description focuses on image data.
 FIG. 2 shows an example of the external shape of the HMD 18. In this example, the HMD 18 comprises an output mechanism unit 102 and a mounting mechanism unit 104. The mounting mechanism unit 104 includes a mounting band 106 that, when worn, goes around the head to fix the device in place. The mounting band 106 is made of a material, or has a structure, whose length can be adjusted to each user's head circumference; for example, it may be an elastic body such as rubber, or use a buckle, gears, or the like.
 The output mechanism unit 102 includes a housing 108 shaped to cover the left and right eyes when the user wears the HMD 18, and contains a display panel that directly faces the eyes when worn. The display panel is realized by a liquid crystal panel, an organic EL panel, or the like. The housing 108 may further contain a pair of lenses, positioned between the display panel and the user's eyes when the HMD 18 is worn, that widen the user's viewing angle. The HMD 18 may also include speakers or earphones at positions corresponding to the user's ears when worn.
 Light emitting markers 110a, 110b, 110c, and 110d are provided on the outer surface of the housing 108. Although the number and arrangement of the light emitting markers are not particularly limited, in the illustrated example they are provided at the four corners of the front of the housing of the output mechanism unit 102. They may also be provided on both side surfaces toward the rear of the mounting band 106. Since the light emitting markers 110c and 110d are on the underside of the output mechanism unit 102 and are not actually visible from the viewpoint of FIG. 2, their outlines are drawn with dotted lines.
 As illustrated, the HMD 18 realizes display and audio output by being connected to an output control device 20 that relays the data output by the information processing apparatus 10. The output control device 20 basically includes one input terminal 5 and two output terminals 7a and 7b. The input terminal 5 is an interface that acquires, directly or indirectly, the data output from the information processing apparatus 10. The output terminals 7a and 7b are interfaces that output data to the HMD 18, the flat panel display 16, or another output control device 20. However, this is not intended to limit the number or placement of the input and output terminals.
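The relay described here can be modelled minimally as follows: each output control device consumes the stream arriving on its input terminal, drives its display from one output, and forwards the unmodified stream downstream from the other. Class and attribute names are illustrative, and a frame is reduced to a flat list for brevity:

```python
class OutputControlDevice:
    """Toy model of the relay: one input, two outputs."""
    def __init__(self, region_index, region_count, downstream=None):
        self.region_index = region_index  # which share of the frame is ours
        self.region_count = region_count  # how many displays share the frame
        self.downstream = downstream      # next device in the chain, if any
        self.displayed = None

    def receive(self, frame):
        # Output terminal 7a: extract and display this device's share.
        w = len(frame) // self.region_count
        start = self.region_index * w
        self.displayed = frame[start:start + w]
        # Output terminal 7b: pass the full composite on downstream.
        if self.downstream is not None:
            self.downstream.receive(frame)

second = OutputControlDevice(1, 2)
first = OutputControlDevice(0, 2, downstream=second)
first.receive(["a", "a", "b", "b"])  # composite: left half for the first HMD
assert first.displayed == ["a", "a"]
assert second.displayed == ["b", "b"]
```

Forwarding the full composite rather than only the remainder means each device in the chain needs no knowledge of how many devices follow it, matching the daisy-chain arrangement described above.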
 FIG. 3 shows the internal circuit configuration of the information processing device 10. The information processing device 10 includes a CPU (Central Processing Unit) 22, a GPU (Graphics Processing Unit) 24, and a main memory 26, interconnected via a bus 30. An input/output interface 28 is also connected to the bus 30. Connected to the input/output interface 28 are a communication unit 32 comprising peripheral-device interfaces such as USB and IEEE 1394 and a wired or wireless LAN network interface, a storage unit 34 such as a hard disk drive or nonvolatile memory, an output unit 36 that outputs data to external devices, an input unit 38 that receives data from external devices, and a recording-medium drive unit 40 that drives a removable recording medium such as a magnetic disk, an optical disc, or a semiconductor memory.
 The CPU 22 controls the entire information processing device 10 by executing the operating system stored in the storage unit 34. The CPU 22 also executes various programs read from a removable recording medium and loaded into the main memory 26, or downloaded via the communication unit 32. The GPU 24 has the functions of a geometry engine and a rendering processor; it performs rendering in accordance with drawing commands from the CPU 22 and outputs the result from the output unit 36 to the output control device 20.
 The main memory 26 comprises RAM (Random Access Memory) and stores the programs and data needed for processing. The output control device 20 may basically have the same configuration. Depending on the application being executed and the design of the devices, however, the information processing device 10 may perform almost all of the processing, in which case simple processing suffices for the output control device 20 and part of the illustrated configuration can be omitted from it.
 FIG. 4 illustrates the images displayed on the HMD 18 and the flat-panel display 16. Image (a) on the left side of the figure is the image displayed on the HMD 18, and image (b) on the right side is the image displayed on the flat-panel display 16. A grid pattern is used as the display target in the figure for clarity, but in practice the displayed image varies with the content of the information processing, such as a virtual world or a game image. Image (a) for the HMD 18 assumes stereoscopic viewing: it consists of a pair of parallax images, a left-eye image and a right-eye image, placed respectively in the left and right halves of an image plane corresponding to the display panel.
 This example assumes that a lens is placed in front of the display panel in the HMD 18 to widen the user's field of view. In this case the lens imparts a so-called "pincushion" distortion to the image displayed on the panel, stretching pixels with larger displacement the farther they are from the center. Since the amount of displacement also varies with color, chromatic aberration occurs as well. Accordingly, distortion correction is applied that pre-distorts the image in the opposite direction, taking this distortion and chromatic aberration into account, and a "barrel-shaped" image such as image (a) is displayed. As a result, when viewed through the lens, the original image such as image (b) appears stereoscopically.
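As a minimal illustration of this pre-distortion, the sketch below applies a simple radial ("barrel") model to a grayscale image: each destination pixel samples the source at a radially scaled position. The coefficients k1 and k2, the nearest-neighbor sampling, and the single-channel image are illustrative assumptions only; an actual implementation would use per-color coefficients matched to the lens to also compensate chromatic aberration.

```python
def barrel_predistort(src, k1=0.25, k2=0.05):
    """Pre-distort a grayscale image (list of rows) so that a lens with
    pincushion distortion shows it undistorted.  For each destination
    pixel, sample the source at a radially pushed-out position
    (nearest-neighbor); pixels mapped outside the panel become black."""
    h, w = len(src), len(src[0])
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0  # optical center
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # normalized offset from the optical center
            nx, ny = (x - cx) / cx, (y - cy) / cy
            r2 = nx * nx + ny * ny
            scale = 1.0 + k1 * r2 + k2 * r2 * r2  # radial distortion model
            sx = int(round(cx + nx * scale * cx))
            sy = int(round(cy + ny * scale * cy))
            if 0 <= sx < w and 0 <= sy < h:
                out[y][x] = src[sy][sx]
    return out
```

Because the sampling position moves outward faster near the edges, the visible content is pulled toward the center, yielding the barrel shape of image (a); the inverse correction for the flat-panel display would use the same model with the scaling inverted.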
 Such correction processing may be performed by either the information processing device 10 or the output control device 20. In the former case, the output control device 20 that receives the data of image (a) outputs that data as-is to the HMD 18, while for the flat-panel display 16 it applies the inverse of the above distortion correction to either the left-eye or right-eye image within image (a) to generate image (b), which it then outputs.
 In the latter case, the output control device 20 applies distortion correction to the uncorrected parallax images transmitted from the information processing device 10 to generate image (a) and outputs it to the HMD 18, while it outputs either of the uncorrected parallax images, that is, image (b), as-is to the flat-panel display 16. The following description assumes the former mode, but applies equally to the latter. Common techniques can be used for the lens distortion correction and its inverse.
 FIG. 5 shows the functional block configuration of the information processing device 10 and an output control device (referred to as output control device 20a) that outputs image data to a given HMD 18a. As described above, in this embodiment output control devices 20 are connected to the information processing device 10 in series to relay the output data, and each output control device 20 outputs an image in the appropriate format to the HMD 18 or flat-panel display 16 connected to it. FIG. 5 shows the configuration of the information processing device 10 and the output control device 20a directly connected to it, but the other output control devices 20 have the same configuration. In the following description, within the chain of output control devices 20 connected in this way, the side closer to the information processing device 10 is referred to as upstream and the farther side as downstream.
 In hardware terms, each functional block shown in FIG. 5 can be implemented by the CPU, GPU, and various memories shown in FIG. 3; in software terms, by programs loaded from a recording medium or the like into memory that provide functions such as data input, data retention, image processing, and communication. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware alone, software alone, or a combination thereof, and they are not limited to any one of these.
 The information processing device 10 includes an input data acquisition unit 72 that acquires input data from the imaging device 12 and the input devices 14a and 14b, an information processing unit 74 that performs information processing according to the application being executed, such as a game, an output data generation unit 78 that generates the data to be output as the result of the information processing, such as a game image and its audio, a communication unit 76 that communicates with the uppermost output control device 20a, a connection state identification unit 80 that identifies the connection state of the various devices including the output control device 20a, and a data storage unit 81 that stores the basic data needed to generate the output data.
 The input data acquisition unit 72 sequentially acquires the images captured by the imaging device 12 at a predetermined frame rate. It also acquires information indicating the content of user operations performed via the input devices 14a and 14b. The user operations here may be those of general information processing, such as selecting the application to execute, starting or ending processing, and entering commands. The input data acquisition unit 72 supplies the acquired information to the information processing unit 74.
 The input data acquisition unit 72 may further acquire, via the communication unit 76, the measurement results of the acceleration sensor inside each HMD 18 and supply them to the information processing unit 74. It may also control the start and end of image capture by the imaging device 12 in accordance with processing start/end requests received from the user via the input devices 14a and 14b, or control the type of data acquired from the imaging device 12 according to the results of processing in the information processing unit 74. The data acquired by the input data acquisition unit 72 may vary with the content of the information processing executed by the information processing unit 74.
 The information processing unit 74 performs information processing such as a game designated by the user. This processing may include detecting the image of a target object in the captured image and tracking it. For example, a part of the user's body such as the head or a hand may be tracked using its contour, or an object having a specific pattern or shape may be tracked by pattern matching. Common image processing techniques can be applied to such tracking. Further, the position and posture of the user's head may be identified in detail by integrating information such as the acceleration-sensor measurements transmitted from each HMD 18. The information processing unit 74 uses the user's motion and position obtained in this way, or user operations made via the input devices 14a and 14b, as input values to advance the game or perform corresponding processing. The content of the information processing, however, is not limited to these.
 The output data generation unit 78 generates the image and audio data to be output as the result of the information processing, in accordance with requests from the information processing unit 74. For example, as described above, it renders the virtual world seen from a viewpoint corresponding to the position and posture of the user's head as a pair of left and right parallax images, applies distortion correction that accounts for the lenses, and generates image data in which the two are placed side by side. If this image is displayed in front of the left and right eyes on the HMD 18 at a predetermined frame rate, with the sound of the virtual world also output, the user feels as if immersed in the virtual world.
 The communication unit 76 establishes communication with the output control device 20a using a predetermined protocol such as HDMI (registered trademark) (High-Definition Multimedia Interface) or USB (Universal Serial Bus), and transmits the image and audio output data generated by the output data generation unit 78. The communication unit 76 also detects, via the output control device 20a, the devices connected beyond it. The connection state identification unit 80 integrates the information on the detected devices, identifies information about the connection state such as their types, number, and connection relationships, and notifies the information processing unit 74.
 When multiple HMDs 18 are detected and a different image is to be displayed on each, the information processing unit 74 requests the output data generation unit 78 to generate a display image corresponding to each HMD 18. The information processing unit 74 further requests the output data generation unit 78 to generate, frame by frame, a composite image in which the data of the plurality of display images generated in this way are reduced and joined together. The communication unit 76 transmits the composite image data generated in this manner to the output control device 20a in the same way. Concrete examples of compositing are described in detail later.
 The output control device 20a includes a first communication unit 82 that acquires the output data transmitted from the information processing device 10, an output data forming unit 84 that processes the acquired output data into the appropriate format as needed, a second communication unit 86 that outputs data to the connected HMD 18a, and a third communication unit 88 that outputs data to another connected device. The first communication unit 82 establishes communication with the upstream device using a predetermined protocol and acquires, directly or indirectly, the output data transmitted from the information processing device 10. Since the output control device 20a shown in FIG. 5 is connected directly to the information processing device 10, it can acquire the output data directly. Output control devices 20 farther downstream acquire the output data in sequence from the output control device 20 above them.
 The output data forming unit 84 processes the output data acquired by the first communication unit 82 as needed and passes it to the second communication unit 86 in a state that can be output to the HMD 18a. Specifically, when the acquired output data contains composite image data, it extracts only the image to be displayed on the HMD 18a and enlarges it to the original image size, generating an image like image (a) in FIG. 4. When the output data acquired by the first communication unit 82 does not contain a composite image, that is, when it is already in the state of image (a) in FIG. 4, the output data forming unit 84 may pass the data to the second communication unit 86 as-is. The latter case arises when no HMD 18 other than the HMD 18a is connected, or when a common image is displayed on all of the HMDs 18.
 The output data forming unit 84 also generates and outputs, as appropriate, data in a state that can be output to the device connected to the third communication unit 88. Devices that may be connected to the third communication unit 88 include another output control device 20b, the flat-panel display 16, and another HMD 18b. When another output control device 20b is connected, the output data forming unit 84 may pass the output data acquired by the first communication unit 82 to the third communication unit 88 as-is. When the flat-panel display 16 is connected, the output data forming unit 84 processes the output data acquired by the first communication unit 82 to generate an image for the flat-panel display. Specifically, as described above, it applies the inverse correction to either the left or right half of the HMD image to generate an image like image (b) in FIG. 4.
 When another HMD 18b is connected to the third communication unit 88, the same processing is performed as when generating the data output to the HMD 18a connected to the second communication unit 86. That is, when the acquired output data contains composite image data, the image to be displayed on the HMD 18b is extracted and enlarged to the original image size; when it does not, the data is passed to the third communication unit 88 as-is. In this way the output data forming unit 84 switches the data it outputs to the third communication unit 88 according to the type of device connected to it. The output data forming unit 84 also performs processing such as decoding of the audio data contained in the output data as needed.
 The second communication unit 86 establishes communication with the connected HMD 18a using a predetermined protocol and transmits the output data that the output data forming unit 84 has made ready for output, causing the HMD 18a to output video and audio. The third communication unit 88, when another output control device 20b, the flat-panel display 16, or another HMD 18b is connected, establishes communication with that device and transmits to it the output data that the output data forming unit 84 has made ready for output.
 The communication unit 76 of the information processing device 10 and the first, second, and third communication units 82, 86, and 88 of the output control device 20a detect the type of the device they are connected to through the exchange of signals defined by the communication protocol in use. By aggregating this information on the information processing device 10 side according to the procedures of the protocol, the connection state identification unit 80 can identify the number of connected output control devices 20 and the type of display device connected to each.
 General communication techniques can be applied to such processing. For example, with an HDMI connection, a connected device can be detected from the hot-plug detect signal, and the HDMI-CEC (Consumer Electronics Control) mechanism can be used to recognize the type of each device and determine the connection order through addressing. Alternatively, the device detection and addressing functions of USB can be used.
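As a toy model of the aggregation performed by the connection state identification unit 80 (not the HDMI-CEC or USB mechanism itself, which operates at the protocol level), the daisy chain can be walked recursively to list every display together with its level in the chain. The node kinds and the `Node` structure are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Node:
    """One detected device: "OCD" (output control device), "HMD", or "FLAT"."""
    kind: str
    children: List["Node"] = field(default_factory=list)

def enumerate_displays(ocd: Node, level: int = 1) -> List[Tuple[int, str]]:
    """Walk downstream from the output control device nearest the
    information processing apparatus; return (level, kind) for each
    display so that count, types, and connection order are known."""
    found = []
    for child in ocd.children:
        if child.kind == "OCD":
            # a downstream output control device: descend one level
            found.extend(enumerate_displays(child, level + 1))
        else:
            found.append((level, child.kind))
    return found
```

For the topology of FIG. 10 (an HMD on the upstream device, and an HMD plus a flat-panel display on the downstream one), the walk would report one display at level 1 and two at level 2.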
 FIG. 6 illustrates the composite image data output by the information processing device 10. The images 208a and 208b in the upper part of the figure are each images that appear correctly when displayed as-is on an HMD 18; suppose, for example, that image 208a is to be displayed on the HMD 18a and image 208b on the HMD 18b. In this case the output data generation unit 78 first generates the images 208a and 208b for the HMDs 18a and 18b, respectively. It then reduces each of the generated images 208a and 208b to 1/2 scale in the horizontal direction and joins them side by side to generate the composite image 210 for output.
 That is, in this case the composite image 210 has a configuration in which the image area is divided in two horizontally, with the reduced version of the first image 208a placed in the left region 212a and the reduced version of the second image 208b in the right region 212b. The information processing device 10 generates such a composite image 210 for every frame and outputs them in sequence. When different images are to be displayed on all N HMDs 18, the first through Nth images are each reduced to 1/N scale in the horizontal direction and joined side by side. With this scheme the amount of data output is constant regardless of the number of HMDs 18 connected, so there is no need to expand the communication hardware and the transfer rate also remains constant.
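The 1/N horizontal reduction and side-by-side joining described above can be sketched as follows for single-channel frames of identical size. Simple column decimation (nearest-neighbor) is used here for brevity; as noted later in the description, any common resampling method such as bilinear, bicubic, or Lanczos could be substituted.

```python
def make_composite(frames):
    """Join N per-HMD frames (lists of pixel rows, all the same size)
    into one composite frame: each frame is reduced to 1/N of its width
    by taking every N-th column, and the reductions are placed side by
    side, so the composite has the dimensions of a single frame."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    sub_w = w // n  # width of each region in the composite
    composite = [[0] * (sub_w * n) for _ in range(h)]
    for i, frame in enumerate(frames):
        for y in range(h):
            for x in range(sub_w):
                # nearest-neighbor 1/N reduction: sample every N-th column
                composite[y][i * sub_w + x] = frame[y][x * n]
    return composite
```

Since the composite keeps the size of a single frame, the transmitted data volume stays constant no matter how many HMDs share the link, which is exactly the property the scheme relies on.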
 When generating the composite image 210, the output data generation unit 78 includes in the output data, as additional information, data that associates each region in the composite image 210 with the HMD 18 on which it should be enlarged and displayed. This information may be transmitted at the start of output of a series of moving-image data, or transmitted periodically, for example attached to each frame. The information processing unit 74 requests the output data generation unit 78 to generate each display image in association with identification information assigned in advance to each HMD 18, and also requests generation of the composite image by determining the correspondence between that identification information and the placement of the data within the composite image and notifying the output data generation unit 78.
 The model data needed to generate the display images, the placement rules used when generating a composite image, and the like are stored in the data storage unit 81. The output data forming unit 84 of the output control device 20 extracts, for example, the region 212a of the composite image 210 in FIG. 6 and enlarges it twofold in the horizontal direction, thereby restoring the image 208a, which it outputs to the HMD 18. The region it should extract is identified by referring to the additional information in the output data transmitted from the information processing device 10. Any common technique may be used for the image reduction and enlargement, such as nearest-neighbor, bilinear interpolation, bicubic interpolation, or the Lanczos method.
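The complementary step on the output control device side, extracting one region and enlarging it N-fold horizontally, can be sketched as below. Nearest-neighbor pixel repetition is used for the enlargement purely for illustration; per the passage above, bilinear, bicubic, or Lanczos resampling would serve equally. The `index` parameter stands in for the region assignment carried in the additional information.

```python
def extract_and_enlarge(composite, index, n):
    """Recover the frame for the HMD assigned region `index` from a
    composite of n equal vertical strips: cut out the index-th strip
    and stretch it n-fold horizontally by repeating each column."""
    h = len(composite)
    sub_w = len(composite[0]) // n  # width of one strip
    out = []
    for y in range(h):
        strip = composite[y][index * sub_w:(index + 1) * sub_w]
        row = []
        for v in strip:
            row.extend([v] * n)  # nearest-neighbor horizontal enlargement
        out.append(row)
    return out
```

Note that with nearest-neighbor reduction and enlargement the round trip discards every other source column, so the restored frame matches the original in size but at halved horizontal resolution; higher-quality filters reduce, but cannot eliminate, this loss.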
 FIGS. 7 to 12 show connection states of the devices in the information processing system 8 of this embodiment and configuration examples of the image data transmitted between them. First, as a basic example, FIG. 7 shows a case in which an HMD 18 and a flat-panel display 16 are connected to a single output control device 20. In this case, once the information processing device 10 has generated the image 120 to be displayed on the HMD 18, it transmits the image 120 as-is to the output control device 20 (S10). The output control device 20 outputs the image 120 unmodified to the connected HMD 18 for display (S12), and also generates a display image 122 for the flat-panel display and transmits it to the flat-panel display 16 (S14).
 In the figure, the HMD image 120 corresponds to image (a) in FIG. 4; within its rectangle, the ellipse labeled "L" represents the left-eye image and the ellipse labeled "R" the right-eye image. The flat-panel display image 122 corresponds to image (b) in FIG. 4; in this case, an image obtained by inverse-correcting the left-eye image of the image 120 is shown in the rectangular portion labeled "L". As the flat-panel display image, however, an inverse-corrected right-eye image may be used instead, or both the left-eye and right-eye images may be inverse-corrected and displayed simultaneously on the flat-panel display for stereoscopic viewing.
 FIG. 8 shows a second HMD 18b connected in place of the flat-panel display 16 in the connection state of FIG. 7. Suppose the same image is to be displayed on the two HMDs 18a and 18b. In this case, once the information processing device 10 has generated the image to be displayed in common on the HMDs 18a and 18b, it transmits the image 120 as-is to the output control device 20 (S20). The output control device 20 outputs the image 120 unmodified to the two connected HMDs 18a and 18b for display on each (S22, S24). This allows two users to watch the same video or share an experience in a virtual world.
 FIG. 9 shows a case in which, as in the connection state of FIG. 8, two HMDs 18a and 18b are connected to a single output control device 20, but different images are to be displayed on them. In this case, once the information processing device 10 has generated the images to be displayed on the HMDs 18a and 18b, it reduces each to 1/2 scale in the horizontal direction, joins them side by side, and transmits the resulting composite image 124 to the output control device 20 (S30). In the rectangle of the composite image 124, the ellipses labeled "L1" and "R1" represent the horizontally reduced left and right display images for the HMD 18a, and the ellipses labeled "L2" and "R2" the horizontally reduced left and right display images for the HMD 18b.
 Based on the correspondence between regions in the composite image and the identification information of the HMDs 18 contained in the additional information transmitted from the information processing device 10, the output control device 20 generates the images 126 and 128 to be displayed on the HMDs 18a and 18b, respectively, and outputs them to the respective HMDs 18a and 18b for display (S32, S34). In the illustrated example, for the HMD 18a the left half region containing the ellipses labeled "L1" and "R1" is extracted from the composite image 124 and enlarged twofold in the horizontal direction to generate the display image 126 for the HMD 18a, which is transmitted to the HMD 18a (S32).
 Similarly, for the HMD 18b the right half region containing the ellipses labeled "L2" and "R2" is extracted from the composite image 124 and enlarged twofold in the horizontal direction to generate the display image 128 for the HMD 18b, which is transmitted to the HMD 18b (S34). This arrangement makes it possible for two users to view the same virtual world from individual viewpoints or to see different screens of the same game.
 FIG. 10 shows a case in which two HMDs 18a and 18b are connected to two output control devices 20a and 20b, respectively, with the flat-panel display 16 further connected to the downstream output control device 20b. When different images are to be displayed on the HMDs 18a and 18b in this connection state, the information processing device 10 first, as described for FIG. 9, reduces the images to be displayed on the HMDs 18a and 18b to 1/2 scale in the horizontal direction, joins them side by side, and transmits the composite image 124 to the upstream output control device 20a (S40).
 Based on the correspondence between regions in the composite image and the identification information of the HMDs 18 contained in the additional information transmitted from the information processing apparatus 10, the output control device 20a identifies the region corresponding to the connected HMD 18a; in the example of the figure, this is the region containing the ellipses labeled "L1" and "R1". It then extracts that region from the composite image 124 and enlarges it by a factor of two in the horizontal direction to generate the display image 126 for the HMD 18a, which it transmits to the HMD 18a (S42). The output control device 20a also forwards the data of the composite image 124 received from the information processing apparatus 10, unmodified, to the output control device 20b connected below it (S44).
 Likewise, based on the correspondence between regions in the composite image and the identification information of the HMDs 18 contained in the additional information transmitted from the information processing apparatus 10, the output control device 20b identifies the region corresponding to the connected HMD 18b; in the example of the figure, this is the region containing the ellipses labeled "L2" and "R2". It then extracts that region from the composite image 124 and enlarges it by a factor of two in the horizontal direction to generate the display image 128 for the HMD 18b, which it transmits to the HMD 18b (S46). The output control device 20b also extracts from the composite image 124 the region corresponding to the left-eye image of the HMD 18a, enlarges it by a factor of two in the horizontal direction, and applies reverse correction to generate the display image 130 for the flat panel display, which it transmits to the flat panel display 16 (S48).
 As shown in FIG. 10, even if each output control device 20 has only two output terminals, connecting another output control device 20 to one of them enables parallel display on three or more display devices. Increasing the number of connected output control devices 20 further increases the number of display devices that can display in parallel. As an example, FIG. 11 shows a case in which four HMDs 18 are connected. In this example, four output control devices 20a, 20b, 20c, and 20d are connected in series to the information processing apparatus 10, and the HMDs 18a, 18b, 18c, and 18d are connected to the output control devices 20a, 20b, 20c, and 20d, respectively.
 To display individual images on the four HMDs 18a, 18b, 18c, and 18d, the information processing apparatus 10 reduces the display image for each HMD to one quarter of its width, joins the reduced images horizontally to generate the composite image 134, and transmits it to the top-level output control device 20a (S50). In the illustrated example, the display images corresponding to the HMDs 18a, 18b, 18c, and 18d are labeled "L1/R1", "L2/R2", "L3/R3", and "L4/R4", respectively. Referring to the additional information, the output control device 20a extracts the region corresponding to the connected HMD 18a from the composite image 134 and enlarges it by a factor of four in the horizontal direction, thereby generating the display image 136 for the HMD 18a, which it transmits to the HMD 18a (S52).
 The output control device 20a also forwards the data of the composite image 134 received from the information processing apparatus 10, unmodified, to the output control device 20b connected below it (S54). Thereafter, the data of the composite image 134 is transmitted in turn to the output control devices 20c and 20d (S58, S62), while the respective display images 138, 140, and 142 are restored and transmitted to the HMDs 18b, 18c, and 18d connected to the output control devices 20b, 20c, and 20d (S56, S60, S64). As a result, individual images are displayed on the four HMDs 18a to 18d.
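The daisy-chain behavior of S50 to S64 can be simulated in a few lines. In this hedged sketch (device names and region boundaries are illustrative, and a single scanline stands in for the whole frame), each device in chain order takes the quarter assigned to its HMD while the composite travels downstream unchanged.

```python
def run_chain(composite_width, assignments):
    """Simulate a chain of output control devices.

    assignments: list of (device_id, (x0, x1)) pairs in chain order,
    where (x0, x1) is the half-open column range assigned to that
    device's HMD within the composite image.
    """
    frame = list(range(composite_width))  # stand-in for one scanline
    delivered = {}
    for device_id, (x0, x1) in assignments:
        delivered[device_id] = frame[x0:x1]  # region for the local HMD
        # `frame` is then forwarded unmodified to the next device.
    return delivered

out = run_chain(8, [("HMD18a", (0, 2)), ("HMD18b", (2, 4)),
                    ("HMD18c", (4, 6)), ("HMD18d", (6, 8))])
# Every device receives the full-width composite but keeps only its slice.
```

Because the forwarded data never grows, the required transfer rate on every link stays constant regardless of the number of chained devices, which is the point made in the text below.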
 With the same configuration, different images can be displayed on any number of HMDs 18 while keeping the required transfer rate constant. Moreover, by further connecting the flat panel display 16 to the output control device 20d, or by replacing at least one of the HMDs 18a to 18d with a flat panel display 16, images can be displayed in parallel on a display system combining any number of flat panel displays 16, or any number of HMDs 18 with flat panel displays 16. Although in the figure the connection order of the HMDs 18a to 18d matches the arrangement order of the display images in the composite image 134, the two orders are not limited in any particular way as long as they are associated with each other in the additional information as described above.
 In the modes shown in FIGS. 10 and 11, the data of the composite images 124 and 134 output by the information processing apparatus 10 is passed between the output control devices 20 unmodified. Alternatively, when the HMDs 18 all display different images as in these examples, an upper-level output control device 20 may remove the image it has extracted from the original composite image and transmit only the remaining image data to the next, lower-level output control device 20. In this way, the size of the image data to be transmitted decreases each time it passes through an output control device 20.
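The trimming variant just described differs from the earlier simulation only in that each device strips its own slice before forwarding. A minimal sketch, again using one scanline as a stand-in for the frame and assuming each device's slice sits at the current left edge:

```python
def run_trimming_chain(frame, widths):
    """Each device keeps the leftmost `w` columns for its own display
    and forwards only the remainder downstream."""
    delivered = []
    for w in widths:
        delivered.append(frame[:w])  # region for the local display
        frame = frame[w:]            # shrunken composite sent onward
    return delivered, frame

slices, leftover = run_trimming_chain(list(range(8)), [2, 2, 2, 2])
# Each successive link carries 2 fewer columns; nothing remains at the end.
```

With four devices and equal quarters, the first link carries 8 columns, the last only 2, illustrating why the transmitted data size shrinks as the frame passes through the chain.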
 FIG. 12 shows a case in which four HMDs 18a, 18b, 18c, and 18d, connected in the same manner as in FIG. 11, form multiple groups, with an individual image displayed for each group. In the illustrated example, as annotated on the HMDs 18a to 18d, the HMDs 18a, 18c, and 18d belong to group "a" and the HMD 18b belongs to group "b". In this case, the information processing apparatus 10 reduces the display images corresponding to the groups (labeled "La/Ra" and "Lb/Rb") to half their width, joins them side by side to generate the composite image 154, and transmits it to the top-level output control device 20a (S70).
 Referring to the additional information, the output control device 20a extracts the region corresponding to group "a", to which the connected HMD 18a belongs, from the composite image 154 and enlarges it by a factor of two in the horizontal direction, thereby generating the display image 156 for group "a", which it transmits to the HMD 18a (S72). The output control device 20a also forwards the data of the composite image 154 received from the information processing apparatus 10, unmodified, to the output control device 20b connected below it (S74). The output control device 20b extracts the region corresponding to group "b", to which the connected HMD 18b belongs, from the composite image 154 and enlarges it by a factor of two in the horizontal direction, thereby generating the display image 158 for group "b", which it transmits to the HMD 18b (S76).
 The output control device 20b also forwards the data of the composite image 154 received from the information processing apparatus 10, unmodified, to the output control device 20c connected below it (S78). Like the output control device 20a, the output control device 20c generates the display image 156 for group "a" and transmits it to the connected HMD 18c (S80), while forwarding the data of the composite image 154 received from the information processing apparatus 10, unmodified, to the output control device 20d connected below it (S82). The output control device 20d likewise generates the display image 156 for group "a" and transmits it to the connected HMD 18d (S84).
 In the illustrated example, the four HMDs 18a to 18d are divided into two groups, a and b, but the numbers of HMDs 18 and of groups are not limited in any particular way. Such a mode is applicable, for example, to displaying a separate field of view for each team in a team-versus-team game, or to differentiating the display screen by role, such as game participants versus spectators. When the flat panel display 16 is connected as described above, its screen may also be divided into multiple regions in which, for example, the left-eye images of the images displayed on the respective HMDs 18 are displayed in parallel.
 In this way, the images that the users participating in, say, a racing game are individually viewing on their HMDs 18 can be shown side by side, allowing users not participating in the game to enjoy watching it. The composite image output by the information processing apparatus 10 need not consist solely of images for HMDs; it may include an image that can be displayed on the flat panel display 16 as-is, or multiple images to be displayed on multiple flat panel displays 16. The information processing apparatus 10 may also transmit such composite image data to another information processing apparatus or server via a network and have it displayed on a display device at a remote location.
 FIG. 13 shows an example data structure of the correspondence information, managed by the information processing apparatus 10, between the connection states of the devices and the parameters used in processing. The device management information 160 includes a display device ID field 162, a physical ID field 164, an output control device ID field 166, a display device type field 168, a group ID field 170, and an image region field 172. As described above, when the output control devices 20, the HMDs 18, and the flat panel display 16 are connected to the information processing apparatus 10, the connection state identification unit 80 uses its communication function to detect the connected devices and assigns identification information (a logical address) to the physical address of each device.
 In the illustrated device management information 160, the display device ID field 162 holds the identification number assigned to each display device among the detected devices, i.e., to each HMD 18 or flat panel display 16; by assigning numbers in ascending order from the top level, this field also conveys the connection order. The physical ID field 164 holds the physical address of each display device. The output control device ID field 166 holds the identification information of the output control device 20 to which each display device is connected. The display device type field 168 holds information identifying the type of each display device, i.e., whether it is an HMD or a flat panel display (denoted "TV" in the figure).
 In the illustrated example, the display devices with identification numbers "01" to "04" are HMDs, and the display device with identification number "05" is a "TV", i.e., a flat panel display. The HMDs with identification numbers "01" to "04" are connected to the output control devices 20 with identification information "A" to "D", respectively, and the flat panel display with identification number "05" is additionally connected to the output control device 20 with identification information "D". The connection state identification unit 80 collects this correspondence information and notifies the information processing unit 74. The information processing unit 74, in turn, determines in advance, according to the content of the game or the like, whether to display a common image on all display devices, a different image on each, or a different image for each group.
 When a different image is to be displayed for each group, the group to which each display device belongs is identified. For example, if each user has their own designated HMD 18, information associating user IDs with the physical addresses of the HMDs 18 can be held separately, and grouping the user IDs by user operation at the start of a game or the like will, as a result, link the identification number of each display device to a group. Those skilled in the art will appreciate, however, that various methods are conceivable for associating the HMDs 18 with users and with groups.
 In the illustrated device management information 160, the group ID field 170 holds the identification information of the group thus associated with each HMD 18. In the illustrated example, the three HMDs 18 with identification numbers "01", "03", and "04" belong to group "a", and the HMD with identification number "02" belongs to group "b". This example corresponds to the grouping shown in FIG. 12. Since identification number "05" is the flat panel display 16, in this example it is set to belong to both groups "a" and "b", so that both images are displayed in parallel on a split screen. As described above, however, only one of the images may be displayed instead. Naturally, the group ID field 170 can be omitted for games and the like that require no grouping.
 Next, the information processing unit 74 determines the arrangement, within the composite image, of the image data to be displayed on the HMDs 18 of each group. The lower part of FIG. 13 schematically shows an example of this arrangement. As described with reference to FIG. 12, in this example the left half of the composite image 174 holds the data of the left and right images of group "a", and the right half holds the data of the left and right images of group "b". When the image region is divided in the horizontal direction in this way and the data of each display image is arranged accordingly, the horizontal range within the composite image is set in the image region field 172 of the device management information 160. For example, as illustrated, if the horizontal size of the composite image 174 is X pixels, the range is specified in a format such as (start coordinate, end coordinate) on a coordinate axis whose value is 0 at the left edge of the image and X at the right edge.
 In the illustrated example, for the HMDs 18 with identification numbers "01", "03", and "04", belonging to group "a", the region (0, X/2-1), representing the left half of the composite image 174, is specified. For the HMD 18 with identification number "02", belonging to group "b", the region (X/2, X), representing the right half of the composite image, is specified. Furthermore, for the flat panel display 16 with identification number "05", which displays the images of both groups simultaneously, the regions (0, X/4-1) and (X/2, 3X/4-1), corresponding to the left-eye images of the respective groups, are specified.
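A minimal data model of the device management information 160 in FIG. 13 might look like the following. The dictionary layout, field names, and the concrete width X = 1920 are our own assumptions; only the column semantics and the example region values mirror the text above.

```python
X = 1920  # hypothetical horizontal size of the composite image, in pixels

# One entry per detected display device, mirroring the columns of Fig. 13.
device_management = [
    {"display_id": "01", "type": "HMD", "group": "a", "region": (0, X // 2 - 1)},
    {"display_id": "02", "type": "HMD", "group": "b", "region": (X // 2, X)},
    {"display_id": "03", "type": "HMD", "group": "a", "region": (0, X // 2 - 1)},
    {"display_id": "04", "type": "HMD", "group": "a", "region": (0, X // 2 - 1)},
    # The TV belongs to both groups and shows both left-eye images side by side.
    {"display_id": "05", "type": "TV", "group": ("a", "b"),
     "region": ((0, X // 4 - 1), (X // 2, 3 * X // 4 - 1))},
]

def regions_for(display_id):
    """Look up the composite-image region(s) assigned to a display device."""
    for entry in device_management:
        if entry["display_id"] == display_id:
            return entry["region"]
    raise KeyError(display_id)
```

An output control device holding this table (received as additional information) needs only a lookup like `regions_for("02")` to know which columns to extract for its connected HMD.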
 Such device management information 160 is stored in the data storage unit 81, and the information is shared within the information processing apparatus 10. The output data generation unit 78 performs the image generation processing requested by the information processing unit 74 in association with the identification numbers of the display devices or the identification information of the groups to generate the display images, then reduces them in the horizontal direction and joins them as specified in the image region field 172 to generate the composite image 174. The information processing apparatus 10 also transmits the correspondence represented by the device management information 160 to the output control devices 20 as additional information.
 Each output control device 20 that receives the data of the composite image 174, in turn, identifies the region corresponding to the display device connected to it based on the additional information, and extracts that region from the composite image 174 to form the final display image. The fields and notation format of the device management information 160 shown here are merely examples; various forms are conceivable depending on the content of the game, the communication protocol, and so on. For example, user IDs may additionally be associated, or the IDs of the input devices held by the users may be associated.
 Next, the operation of the information processing system 8 realized by the configuration described so far will be explained. FIG. 14 is a flowchart showing the processing procedure by which the information processing apparatus 10 outputs the data of the images to be displayed on the display devices. First, when the information processing system 8 is constructed by connecting the output control devices 20 and so on to the information processing apparatus 10, the communication unit 76 and the connection state identification unit 80 detect the connected devices and assign identification numbers (S100). This information is notified to the information processing unit 74 and to each detected device.
 When the users form groups according to the content of the game or the like (Y in S102), the HMDs 18 are divided into groups according to separately set grouping information, correspondence information between users and HMDs, or the like, and the number of groups is assigned to a variable N (S104). When no groups are formed (N in S102), the number of HMDs is assigned to the variable N (S106). The regions obtained by dividing the image region into N parts in the horizontal direction are then associated with the respective HMDs or groups (S108), and the correspondence information is notified to each output control device 20 (S110). When only one HMD 18 is connected, N = 1 and no division is performed.
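The region assignment of S108 amounts to mapping each HMD (or group) to a 1/N-wide horizontal strip. A sketch under one assumption that differs from Fig. 13: half-open (start, end) column ranges are used here instead of the inclusive end coordinates in the image region field.

```python
def assign_regions(targets, width):
    """Map each target (an HMD ID or group ID) to a 1/N-wide strip of a
    `width`-pixel-wide composite image, in list order."""
    n = len(targets)
    return {t: (i * width // n, (i + 1) * width // n)
            for i, t in enumerate(targets)}

# Two groups over a 1920-pixel-wide composite:
regions = assign_regions(["a", "b"], 1920)
# -> {"a": (0, 960), "b": (960, 1920)}
```

With N = 1 (a single HMD and no grouping), the sole target simply receives the whole width, matching the flowchart's note that no division occurs.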
 A display image is then generated through the cooperation of the information processing unit 74 and the output data generation unit 78, reduced to 1/N of its width as necessary, and placed in its associated region to generate and output the image for output (S112). The image for output is a single display image when N = 1, or a composite image consisting of multiple display images when N > 1. While there is no need to end the processing, the generation and output of the image for output continue at a predetermined frame rate (N in S114, S112); when it becomes necessary to end the processing, for example because the user has input an end request or the game has ended, all processing is terminated (Y in S114).
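The shrink-and-join of S112 can be sketched as a single composition function. This is an illustrative model: frames are lists of pixel rows of equal size, and the 1/N horizontal reduction uses simple decimation (keeping every N-th pixel) where a real implementation would filter or average.

```python
def compose(display_images):
    """Shrink each of the N frames to 1/N of its width and join them
    side by side, producing a composite the size of one original frame."""
    n = len(display_images)
    composite = []
    for rows in zip(*display_images):  # corresponding rows of each frame
        joined = []
        for row in rows:
            joined += row[::n]  # decimate: keep every n-th pixel
        composite.append(joined)
    return composite

# Two 1x4 frames compose into one 1x4 composite: left half from the
# first frame, right half from the second.
out = compose([[["a", "a", "b", "b"]], [["c", "c", "d", "d"]]])
# -> [["a", "b", "c", "d"]]
```

Since the composite is always the size of one display image, the output data rate stays constant regardless of N, which is the property the text relies on.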
 FIG. 15 is a flowchart showing the processing procedure by which the output control device 20 causes each display device to display an image. First, the output control device 20 acquires information indicating the correspondence between display devices or groups and regions within the composite image (S120). This information is notified from the information processing apparatus 10 in S110 of FIG. 14. As described above, however, the timing of notifying this correspondence information is not limited in any particular way; it may be added to the output image periodically, such as every frame or every predetermined number of frames.
 Next, the output control device 20 acquires the image data transmitted from the information processing apparatus 10 (S122). When the acquired image is a composite image (Y in S124), the region corresponding to the HMD 18 connected to the device, or to the group to which that HMD belongs, is extracted from the composite image and enlarged in the horizontal direction to generate the final display image (S126). Whether the image transmitted from the information processing apparatus 10 is a composite image, the region to be extracted, and the enlargement ratio can all be determined from the correspondence information acquired in S120.
 When the image transmitted from the information processing apparatus 10 is not a composite image, either only one HMD 18 is connected or a common image is to be displayed on all of the HMDs, so the received image is used as the display image as-is (N in S124). Next, when the flat panel display 16 is connected to the output control device 20 (Y in S128), the device generates the display image for the flat panel display, for example by extracting the data of one of the left-eye images and applying reverse correction (S130). When the flat panel display 16 is not connected (N in S128), the processing of S130 is skipped.
 The data of the display image generated in S126 is then output to the connected HMD 18, and the data of the display image generated in S130 to the flat panel display 16, causing each display device to display its image (S132). At this time, when another output control device 20 is connected below, the image data acquired from the information processing apparatus 10 in S122 is output to that output control device 20 unmodified. While there is no need to end the processing, the processing of S124 to S132 is repeated for the image data transmitted directly or indirectly from the information processing apparatus 10 (N in S134, S122). As a result, moving images such as game screens are displayed on each HMD 18 and on the flat panel display 16. When it becomes necessary to end the processing, for example due to a notification from the information processing apparatus 10, all processing is terminated (Y in S134).
 In the examples described so far, as illustrated in FIG. 6, the display image of each HMD 18 is reduced to 1/N of its width and placed in one of the regions obtained by dividing the image region of the composite image into N parts in the horizontal direction, so that an image of the same size can be transmitted regardless of the number of HMDs 18. The composition of the composite image is not limited to this; for example, the reduced display images may be placed in regions obtained by dividing the image into N parts in the vertical direction. FIG. 16 is a diagram for explaining a composite image in which the display images are joined in the vertical direction. The upper part of the figure shows the composite image 180 in this case, and the lower part shows the images 182 and 184 to be displayed on two HMDs 18, respectively.
 In this example, the images 182 and 184 originally consist of left and right parallax images arranged side by side in regions obtained by dividing the screen in the horizontal direction, so producing the vertical arrangement of the composite image 180 as illustrated also requires rearranging the left and right images that make up each display image. That is, within the image 182, the left-eye image labeled "L1" is placed in the first row of the composite image 180, and the right-eye image labeled "R1" in its second row. Similarly, within the image 184, the left-eye image labeled "L2" is placed in the third row of the composite image 180, and the right-eye image labeled "R2" in its fourth row. By enlarging each of the "L1", "R1", "L2", and "R2" images by a factor of two in the horizontal direction, reducing each to 1/4 of its height, and then joining them vertically, a composite image 180 of the same size as the original images 182 and 184 is obtained, as illustrated.
 Similarly, when N HMD images are to be included in the composite image, the left-eye and right-eye images of each are enlarged by a factor of two in the horizontal direction, reduced to 1/(2N) of their height, and then joined in a predetermined order, such as left-eye image followed by right-eye image. This produces a composite image in which the left-eye and right-eye images corresponding to one display image are arranged vertically in each of the regions obtained by dividing the image region into N parts in the vertical direction. Each output control device 20 extracts the region corresponding to the connected HMD 18, enlarges the left-eye and right-eye images by a factor of 2N in the vertical direction, reduces them to half their width, and places them side by side, thereby restoring the images 182 and 184. In this case, the processing procedure may be the same as shown in FIGS. 14 and 15, except that the region specification in the image region field 172 of the device management information 160 indicates a vertical coordinate axis.
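The reordering step of the vertical layout, which splits each HMD frame into its left and right halves and stacks them as L1, R1, L2, R2, can be sketched as follows. The scaling steps (2x horizontal enlargement, 1/(2N) vertical reduction) are deliberately omitted so that only the rearrangement is shown; names and the toy frames are illustrative.

```python
def split_lr(frame):
    """Split a side-by-side stereo frame into (left_half, right_half)."""
    w = len(frame[0]) // 2
    return [row[:w] for row in frame], [row[w:] for row in frame]

def stack_vertically(hmd_frames):
    """Order the sub-images L1, R1, L2, R2, ... from top to bottom,
    as in the composite image 180 of Fig. 16."""
    stacked = []
    for frame in hmd_frames:
        left, right = split_lr(frame)
        stacked.append(left)
        stacked.append(right)
    return stacked

# Two single-row stereo frames: ["L1","R1"] and ["L2","R2"].
order = stack_vertically([[["L1", "R1"]], [["L2", "R2"]]])
# -> four sub-images stacked in the order L1, R1, L2, R2
```

The receiving device would undo this by taking its two sub-images, enlarging them 2N-fold vertically, halving their width, and rejoining them side by side.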
 When a plurality of display images are connected horizontally as shown in FIG. 6, the information processing device 10 outputs the pixel values of the composite image in raster order, and the output control device 20 need only interpolate and output the pixels of its corresponding region to the HMD 18 in the order they arrive, so the required buffer memory capacity is small. When a plurality of display images are connected vertically as shown in FIG. 16, the data are output in raster order as "L1", "R1", "L2", "R2". The output control device 20 must therefore, for example, temporarily store the data up to the last row of "L1" before it can output the first row of "R1" as part of the display image.
 That is, the vertical connection mode requires a buffer memory large enough to store at least the entire reduced left-eye image. Moreover, as the number of images included in the composite image increases, the data located lower in the composite image arrive later, so the start of display of an image using those data is delayed by a small amount of time no longer than the frame period. On the other hand, vertical connection does not require the original display images to be reduced horizontally, so no horizontal information is lost. In a mode where left and right parallax images are displayed on the HMD 18 for stereoscopic viewing, the horizontal information carries the left-right parallax and therefore strongly affects the stereoscopic effect.
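The buffering difference can be put in rough numbers. The model below is a sketch under the stated assumption (in the vertical mode, one reduced left-eye band must be held before side-by-side output can begin); it is not from the patent, and a real device would add a few rows of working margin in either mode.

```python
def min_buffer_pixels(w, h, n, vertical):
    """Rough lower bound on the pixels an output control device must
    buffer before it can start emitting its display image.
    w, h: composite image size; n: number of combined display images."""
    if not vertical:
        # Horizontal connection: pixels arrive in raster order and each
        # device's columns can be forwarded (with interpolation) as they
        # come, so on the order of one row suffices.
        return w
    # Vertical connection: the reduced left-eye band (w pixels wide,
    # h/(2n) rows tall) must be held in full before the first row of
    # the right-eye band can be paired with it for side-by-side output.
    return w * (h // (2 * n))
```

For a 1920 x 1080 composite carrying two stereo images, this gives one row (1920 pixels) for the horizontal mode versus a full 270-row band in the vertical mode.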
 As a result, the vertical connection mode is less likely to impair the stereoscopic effect. For these reasons, whether display images are connected horizontally or vertically is determined as appropriate according to the available memory capacity, the display content, whether stereoscopic viewing is used, and so on. The example of FIG. 16 assumed a composite image consisting only of left and right parallax image data; when a single image to be displayed on the flat panel display 16 or the like is included, that image is simply reduced to 1/N in the vertical direction.
 In the modes described so far, the images to be displayed on the respective HMDs 18 are reduced and combined into a composite image of the same size as each original image, so that the amount of data to be transmitted per unit time (the transmission bandwidth) is unaffected by the number of HMDs 18. The following variations are also conceivable. FIGS. 17 and 18 illustrate variations of the transmission mode for image data to be displayed on a plurality of HMDs. First, FIG. 17 shows an example in which a plurality of images to be displayed are connected and transmitted without reduction. The upper part of the figure shows the composite image 220 in this case; the lower part shows the images 222, 224, 226, and 228 to be displayed on the four HMDs 18.
 As illustrated, the composite image 220 is formed by connecting the four display images 222, 224, 226, and 228 as they are, two horizontally and two vertically. The composite image 220 is consequently four times the size of each original image. The transmission bandwidth between the information processing device 10 and each output control device 20 is then quadrupled, but each HMD 18 can display the image extracted from the composite image 220 without enlargement, which reduces the processing load on the output control device 20. The number of images connected into a composite image is not limited to four, and the connection order is not limited to that illustrated. In some cases, one display image may be divided and connected across a plurality of rows or columns. Furthermore, this mode may be combined with the mode of reducing display images, so that more display images are combined while allowing the composite image size to grow.
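A minimal sketch of the FIG. 17 style composition, with hypothetical function names (not from the patent): the tiles are connected unreduced, so extraction on the output side is a pure crop with no enlargement step.

```python
def tile_2x2(imgs):
    """Connect four equally sized images (lists of pixel rows) as-is,
    two across and two down, yielding a composite of 4x the area."""
    top = [la + ra for la, ra in zip(imgs[0], imgs[1])]
    bottom = [lb + rb for lb, rb in zip(imgs[2], imgs[3])]
    return top + bottom

def extract_tile(composite, index, w, h):
    """Cut tile `index` (row-major, 0..3) back out of the 2x2
    composite; no enlargement is needed on the output side."""
    ty, tx = divmod(index, 2)
    return [row[tx * w:(tx + 1) * w]
            for row in composite[ty * h:(ty + 1) * h]]
```

The round trip `extract_tile(tile_2x2(imgs), i, w, h)` returns `imgs[i]` unchanged, which is the property that lightens the output control device's load at the cost of 4x bandwidth.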
 FIG. 18 shows an example in which a plurality of images to be displayed are transmitted at a higher rate without reduction. The upper part of the figure shows the image sequence 230 output within the normal output period of one frame; the lower part shows the images 232 and 234 to be displayed on the two HMDs 18. That is, in this example the images 232 and 234 to be displayed on the respective HMDs 18 are arranged not on the image plane but along the time axis, indicated by "t" in the figure. Since images 232 and 234 are frames of moving images, the frames of the two moving images are output alternately at double the rate. When three or more moving images are included, their frames are likewise output in rotation.
 In this case, the output control device 20 extracts the data of a frame at the timing when a frame to be displayed on its HMD 18 arrives. The information processing device 10 therefore determines the output order according to the number of HMDs 18 and transmits additional information associating that order with the HMD 18 on which each frame is to be displayed. In this mode, the output rate is multiplied by N when there are N images to display, so the transmission bandwidth between the information processing device 10 and each output control device 20 is multiplied by N. On the other hand, as in the mode of FIG. 17, each HMD 18 can display the acquired image without enlargement, which reduces the processing load on the output control device 20. This mode, too, may be combined with the mode of reducing display images, so that more display images can be output while allowing an increase in output rate.
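The time-multiplexed variation can be sketched as a round-robin schedule. The fixed modulo order and the function names below are illustrative assumptions; in the described mode, the actual order is determined and signalled by the information processing device 10 as additional information.

```python
def device_for_slot(slot, n):
    """In a time-multiplexed stream, frames are emitted at n times the
    base rate; under a simple round-robin order, slot t carries the
    frame destined for device t mod n."""
    return slot % n

def frames_for_device(stream, device, n):
    """Pick out, from the interleaved frame stream, the frames
    destined for one device (stride-n slicing)."""
    return stream[device::n]
```

With two HMDs, a stream interleaved as frame0-of-A, frame0-of-B, frame1-of-A, frame1-of-B, ... yields each device every second frame at the doubled rate.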
 Next, techniques for transmitting the correspondence information between the regions of the image transmitted from the information processing device 10 and the HMDs 18 are exemplified. As described above, the correspondence information may be transmitted to the output control devices 20 before the information processing device 10 transmits a series of output data, or it may be included in the output data as additional information. FIG. 19 is a diagram for explaining, as an embodiment of the latter, a technique of embedding the information in the image output by the information processing device 10. The image 190 is an image transmitted from the information processing device 10 to the output control devices 20; in the illustrated example, as exemplified in FIGS. 12 and 13, it is a composite image connecting the display image for group "a" and the display image for group "b".
 When displaying parallax images that have undergone distortion correction for the lenses, as in this example, the content to be displayed, such as a game screen or virtual world, appears only in the portions of the image 190 labeled "La", "Ra", "Lb", and "Rb"; the remaining regions are invisible regions. By using these invisible regions to represent the correspondence information, the necessary information can be transmitted without increasing the amount of data. For example, in the image 190, the left-half invisible region 192a, which accompanies the display image data of group "a", and the right-half invisible region 192b, which accompanies the display image data of group "b", are filled with different colors.
 If the color corresponding to the group to which its connected HMD 18 belongs is given to each output control device 20 in advance, the output control device 20 can identify which region of the image 190 it should extract based on the color of the invisible region. In a mode in which no groups are formed, the same identification is possible by associating colors directly with the HMDs 18 or with the output control devices 20. In either mode, it suffices for the information processing device 10 and each output control device 20 to share the correspondence; for example, if the physical addresses of the HMDs 18 or output control devices 20 and the colors are set in both in advance, there is no need to exchange any information about the correspondence.
 Alternatively, several correspondence patterns may be prepared, and the information processing device 10 may decide which pattern to use, for example at the start of a game, according to the content and the connection state of the display devices, and notify the output control devices 20. For a composite image consisting of three or more display images, operation is the same if each invisible region is filled with a different color. When the image is not a composite image, that fact itself can be indicated by the colors of all the invisible regions being the same. If the fill color is selected from the eight colors in which each of the red, green, and blue components is either 0 or the maximum gradation, the output control device 20 can easily and unambiguously determine the region it should extract.
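Decoding the eight marker colors described above can be sketched as follows. Thresholding each component at half the maximum gradation makes the decision robust against small signal deviations; the 3-bit packing order (R as the high bit) is an assumption for illustration, not specified by the patent.

```python
def color_code(pixel, max_level=255):
    """Map a sampled invisible-region pixel (r, g, b), nominally with
    each component at 0 or max_level, to a 3-bit region code by
    thresholding each component at half the maximum."""
    r, g, b = pixel
    half = max_level // 2
    return (int(r > half) << 2) | (int(g > half) << 1) | int(b > half)
```

A pure red marker thus decodes to 0b100 even if transmission shifts it slightly, e.g. to (250, 10, 3), so the output control device can look up its assigned region without misidentification.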
 The information processing device 10 outputs the data of the image 190, in which the correspondence information is represented by the colors of the invisible regions, for every output frame or for frames at predetermined intervals. This simplifies the region determination processing in the output control device 20, and even if the region to be extracted changes midway, for example because the number of connected HMDs 18 increases or decreases, the appropriate region can be extracted by the same processing without delay. The invisible regions need not be filled entirely; only portions of a predetermined size may be filled. The amount of embedded information may also be increased by using different colors at different positions within the invisible region corresponding to one display image, or by displaying marks or patterns there.
 For example, in the image 190, the left-half invisible region 192a and the right-half invisible region 192b each consist of six small regions. Of these, the color of the upper-left region may represent, for instance, the corresponding group, and the color of the upper-center region the enlargement factor for the extracted image. The pattern of the color-to-device correspondence described above may itself be represented by the color at one of these locations. Information may also be represented by giving the invisible regions a striped pattern and varying its direction, width, color combination, and so on.
 According to the present embodiment described above, output control devices are connected in series to an information processing device that generates image data, and a display device such as an HMD or flat panel display is connected to each output control device. Depending on their connection state and the content of the images to be displayed on the display devices, the information processing device generates and outputs a single display image or a composite image in which a plurality of display images are reduced and connected. This makes it possible to increase the number of connected display devices and the number of images displayed on them without increasing the number of output terminals on the information processing device or on each output control device, and without increasing the size of the transmitted data.
 This mode is particularly effective for content using HMDs, in which each user views a display image individually, and easily realizes distinctive modes of use, such as a plurality of users viewing the same game or virtual world from individual viewpoints. If an HMD and an output control device are provided as a paired set, users can enjoy the desired content at minimum cost by purchasing additional sets to match the content and the number of participants.
 Further, the composite image is composed of images that have undergone distortion correction for the lenses so that they can be displayed on the HMDs. In the invisible regions that this correction produces, the information each output control device needs to identify the region it should extract from the composite image is represented by colors or figures. This eliminates the need to transmit separate data for that information, making data transfer and extraction processing more efficient, and even if the composition of the composite image changes midway, the output control device can extract the appropriate region by the same processing.
 The present invention has been described above based on embodiments. Those skilled in the art will understand that the above embodiments are exemplary, that various modifications are possible in the combinations of their constituent elements and processing steps, and that such modifications are also within the scope of the present invention.
 8 information processing system, 10 information processing device, 12 imaging device, 14a input device, 16 flat panel display, 18 HMD, 20 output control device, 72 input data acquisition unit, 74 information processing unit, 76 communication unit, 78 output data generation unit, 80 connection state specifying unit, 81 data storage unit, 82 first communication unit, 84 output data forming unit, 86 second communication unit, 88 third communication unit.
 As described above, the present invention is applicable to computers, display devices, game devices, information processing devices, and image processing devices that process various kinds of electronic content, and to systems including them.

Claims (22)

  1.  An information processing apparatus comprising:
     an output data generation unit that generates data of moving images to be displayed on display devices; and
     a communication unit that sequentially outputs the moving image data generated by the output data generation unit,
     wherein, when a plurality of the display devices are connected, the output data generation unit causes the communication unit to output data of a composite image obtained by combining, frame by frame, a plurality of moving images to be displayed independently on the respective display devices.
  2.  The information processing apparatus according to claim 1, wherein the output data generation unit generates the composite image by reducing each frame image of the plurality of moving images at a scale factor corresponding to the number of the frame images and connecting the reduced frame images.
  3.  The information processing apparatus according to claim 1 or 2, further comprising a connection state specifying unit that specifies the connection state of the display devices,
     wherein the output data generation unit generates, based on the connection state, correspondence information associating the placement of each frame image to be combined in the composite image with the display device on which it is to be displayed, and
     the communication unit also outputs the correspondence information.
  4.  The information processing apparatus according to claim 3, wherein the output data generation unit generates data of a moving image to be displayed on a head-mounted display, which has a lens in front of its display panel, as the display device, and represents an image corresponding to the correspondence information in a region of each frame image that is invisible when viewed through the lens.
  5.  The information processing apparatus according to any one of claims 1 to 4, wherein, when the number of frame images to be combined is N (N being a natural number of 2 or more), the output data generation unit generates the composite image by reducing the horizontal size of each frame image to 1/N and connecting the frame images horizontally.
  6.  The information processing apparatus according to claim 5, wherein the output data generation unit generates data of a moving image whose frame images display, side by side, a pair of a left-eye image and a right-eye image for stereoscopic viewing on a head-mounted display as the display device, and generates the composite image by reducing each of the left-eye and right-eye images to 1/N horizontally and connecting them.
  7.  The information processing apparatus according to any one of claims 1 to 4, wherein, when the number of frame images to be combined is N (N being a natural number of 2 or more), the output data generation unit generates the composite image by reducing the vertical size of each frame image to 1/N and connecting the frame images vertically.
  8.  The information processing apparatus according to claim 7, wherein the output data generation unit generates data of a moving image whose frame images display, side by side, a pair of a left-eye image and a right-eye image for stereoscopic viewing on a head-mounted display as the display device, and generates the composite image by enlarging each of the left-eye and right-eye images by a factor of 2 horizontally, reducing them to 1/N vertically, and connecting them vertically.
  9.  The information processing apparatus according to any one of claims 1 to 8, wherein, when the plurality of connected display devices form groups, the output data generation unit generates the composite image by generating and combining the moving image data for each group.
  10.  An output control apparatus comprising:
     an input unit that acquires data of a display moving image output by an information processing apparatus;
     an output data forming unit that, when the display moving image data includes data of a composite image in which a plurality of moving images to be displayed independently on a plurality of display devices are combined frame by frame, extracts from the composite image data the partial region corresponding to the display device connected to the output control apparatus and forms the final display image; and
     an output unit that outputs the data of the final display image to the connected display device.
  11.  The output control apparatus according to claim 10, wherein the input unit also acquires correspondence information associating the placement of each frame image combined in the composite image with the display device on which it is to be displayed, and
     the output data forming unit specifies and extracts, based on the correspondence information, the partial region corresponding to the display device connected to the output control apparatus.
  12.  The output control apparatus according to claim 11, wherein the composite image is formed by connecting images obtained by reducing the frame images to be combined at a scale factor corresponding to their number, and
     the output data forming unit generates the final display image by enlarging the extracted partial region at a scale factor, specified based on the correspondence information, corresponding to the number of the combined frame images.
  13.  The output control apparatus according to claim 11 or 12, wherein the plurality of display devices include a head-mounted display having a lens in front of its display panel, and
     the output data forming unit specifies the correspondence information associated with a combined frame image by reading out the image represented in a region of that frame image that is invisible when viewed through the lens.
  14.  The output control apparatus according to any one of claims 10 to 13, wherein the output unit further outputs the display moving image data output by the information processing apparatus to another output control apparatus.
  15.  An information processing system comprising an information processing apparatus that outputs moving image data and an output control apparatus that acquires the moving image data and displays it on display devices, wherein
     the information processing apparatus includes:
     an output data generation unit that generates data of moving images to be displayed on the display devices; and
     a communication unit that sequentially outputs the moving image data generated by the output data generation unit to the output control apparatus,
     and, when a plurality of the display devices are connected, the output data generation unit causes the communication unit to transmit data of a composite image obtained by combining, frame by frame, a plurality of moving images to be displayed independently on the respective display devices; and
     the output control apparatus includes:
     an input unit that acquires the moving image data;
     an output data forming unit that, when the moving image data includes the composite image data, extracts from the composite image data the partial region to be displayed on the display device connected to the output control apparatus and forms the final display image; and
     an output unit that outputs the data of the final display image to the connected display device.
  16.  The information processing system according to claim 15, wherein a plurality of the output control apparatuses are connected in series to the information processing apparatus, and
     each output control apparatus outputs the data of the final display image to the head-mounted display connected to it, while relaying, in that serial order, the moving image data output by the information processing apparatus.
  17.  A moving image data output method performed by an information processing apparatus, comprising:
     generating data of moving images to be displayed on display devices; and
     sequentially outputting the generated moving image data,
     wherein generating the moving image data includes, when a plurality of the display devices are connected, generating data of a composite image obtained by combining, frame by frame, a plurality of moving images to be displayed independently on the respective display devices.
  18.  A moving image data output method performed by an output control apparatus, comprising:
     acquiring data of a display moving image output by an information processing apparatus;
     when the display moving image data includes data of a composite image in which a plurality of moving images to be displayed independently on a plurality of display devices are combined frame by frame, extracting from the composite image data the partial region to be displayed on the display device connected to the output control apparatus, and forming the final display image; and
     outputting the data of the final display image to the connected display device.
  19.  A computer program causing a computer to realize:
     a function of generating data of moving images to be displayed on display devices; and
     a function of sequentially outputting the generated moving image data,
     wherein the function of generating the moving image data generates, when a plurality of the display devices are connected, data of a composite image obtained by combining, frame by frame, a plurality of moving images to be displayed independently on the respective display devices.
  20.  A computer program causing a computer to realize:
     a function of acquiring data of a display moving image output by an information processing apparatus;
     a function of, when the display moving image data includes data of a composite image in which a plurality of moving images to be displayed independently on a plurality of display devices are combined frame by frame, extracting from the composite image data the partial region to be displayed on the display device connected to the computer and forming the final display image; and
     a function of outputting the data of the final display image to the connected display device.
  21.  A computer-readable recording medium on which is recorded a computer program causing a computer to realize:
     a function of generating data of moving images to be displayed on display devices; and
     a function of sequentially outputting the generated moving image data,
     wherein the function of generating the moving image data generates, when a plurality of the display devices are connected, data of a composite image obtained by combining, frame by frame, a plurality of moving images to be displayed independently on the respective display devices.
  22.  A computer-readable recording medium having recorded thereon a computer program for causing a computer to realize:
     a function of acquiring data of a display moving image output by an information processing apparatus;
     a function of, when the display moving image data includes data of a composite image in which a plurality of moving images to be independently displayed on a plurality of display devices are combined for each frame, extracting the partial region to be displayed on the display device connected to the computer from the composite image data and setting it as a final display image; and
     a function of outputting the data of the final display image to the connected display device.
PCT/JP2016/064755 2015-05-25 2016-05-18 Information processing apparatus, output control apparatus, information processing system, and video data output method WO2016190193A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015105849A JP2016218366A (en) 2015-05-25 2015-05-25 Information processing device, output control device, information processing system, and moving-image data output method
JP2015-105849 2015-05-25

Publications (1)

Publication Number Publication Date
WO2016190193A1 true WO2016190193A1 (en) 2016-12-01

Family

ID=57393865

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/064755 WO2016190193A1 (en) 2015-05-25 2016-05-18 Information processing apparatus, output control apparatus, information processing system, and video data output method

Country Status (2)

Country Link
JP (1) JP2016218366A (en)
WO (1) WO2016190193A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11308670B2 (en) 2017-03-22 2022-04-19 Sony Corporation Image processing apparatus and method
US10659771B2 (en) * 2017-07-13 2020-05-19 Google Llc Non-planar computational displays

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07154827A (en) * 1993-11-29 1995-06-16 Canon Inc Plural images synthesizing device and image display device
JP2012108211A (en) * 2010-11-15 2012-06-07 Sharp Corp Multi-display system and portable terminal device


Also Published As

Publication number Publication date
JP2016218366A (en) 2016-12-22

Similar Documents

Publication Publication Date Title
US20210006770A1 (en) Methods and apparatus for receiving and/or using reduced resolution images
CN106165415B (en) Stereoscopic viewing
WO2017086263A1 (en) Image processing device and image generation method
JP6563043B2 (en) Video display system
JP6768197B2 (en) Information processing equipment and methods
CN103348682B (en) The method and apparatus that single vision is provided in multi-view system
JP2016537903A (en) Connecting and recognizing virtual reality content
JP6391423B2 (en) Image generating apparatus, image extracting apparatus, image generating method, and image extracting method
US20210185299A1 (en) A multi-camera device and a calibration method
WO2020170455A1 (en) Head-mounted display and image display method
JPWO2019123509A1 (en) Terminal device, system, program and method
WO2013108285A1 (en) Image recording device, three-dimensional image reproduction device, image recording method, and three-dimensional image reproduction method
WO2016190193A1 (en) Information processing apparatus, output control apparatus, information processing system, and video data output method
WO2020170454A1 (en) Image generation device, head-mounted display, and image generation method
JP2024050737A (en) Information processing device, operation method of information processing device, and program
US20210037231A1 (en) Image processing apparatus, image processing method, and image processing program
WO2020170456A1 (en) Display device and image display method
JP2005175539A (en) Stereoscopic video display apparatus and video display method
GB2568241A (en) Content generation apparatus and method
EP3654099A2 (en) Method for projecting immersive audiovisual content
US11688124B2 (en) Methods and apparatus rendering images using point clouds representing one or more objects
TWI825367B (en) Method, 3D display device and 3D terminal for implementing floating touch
US11863902B2 (en) Techniques for enabling high fidelity magnification of video
WO2023079623A1 (en) Image display system, image transmission device, display control device, and image display method
WO2022158220A1 (en) Image display system and image display method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16799899

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16799899

Country of ref document: EP

Kind code of ref document: A1