WO2019003609A1 - Image processing device and image processing method - Google Patents

Image processing device and image processing method

Info

Publication number
WO2019003609A1
WO2019003609A1 PCT/JP2018/016478
Authority
WO
WIPO (PCT)
Prior art keywords
line
image processing
image data
lines
omnidirectional image
Prior art date
Application number
PCT/JP2018/016478
Other languages
French (fr)
Japanese (ja)
Inventor
Hirofumi Kasai
Kazuaki Toba
Gen Ichimura
Kazuo Yamamoto
Original Assignee
Sony Corporation
Priority date
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2019003609A1 publication Critical patent/WO2019003609A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs, involving reformatting operations of video signals for household redistribution, storage or real-time display

Definitions

  • the technology disclosed in the present specification relates to an image processing apparatus and an image processing method for processing omnidirectional image data or performing processing related to transmission of a baseband signal of image data.
  • the omnidirectional image can be viewed using a display device such as a large screen display or a head mounted display, for example.
  • An omnidirectional image can also be defined as an image having information of 360 degrees in all directions vertically and horizontally around the viewpoint when the viewpoint is a certain point in space.
  • The omnidirectional image is spherical and has a three-dimensional spread. For this reason, as image data, the spherical surface may be mapped onto and handled as a two-dimensional rectangle (see, for example, Patent Document 1).
  • An object of the technology disclosed in the present specification is to provide an image processing apparatus and an image processing method for processing omnidirectional image data or performing processing related to transmission of a baseband signal of image data.
  • The first aspect of the technology disclosed herein is an image processing apparatus comprising: a definition unit that defines latitude lines and longitude lines on a sphere representing an omnidirectional image; and an arrangement unit that arranges the pixels of the omnidirectional image on each of the defined latitude lines.
  • the definition unit determines the interval of the latitude line based on either an angle of the spherical surface from the equatorial plane or a height of the spherical surface in the ground axis direction.
  • the placement unit places pixels on each latitude line at substantially equal intervals starting from the location of zero longitude.
  • The image processing apparatus may further include a processing unit that generates a baseband signal having a different number of pixels for each line by sequentially arranging the pixels on each latitude line of the spherical surface into one line, and a transmission unit that transmits the baseband signal to an external device via a predetermined transmission path.
  • The image processing apparatus may further include an acquisition unit that acquires transmission method information on the transmission methods of omnidirectional image data that the external device can handle, and a selection unit that selects a transmission method of the omnidirectional image data based on the acquired transmission method information; the processing unit may then generate the baseband signal according to the selected transmission method.
  • the image processing apparatus may further include a notification unit that notifies the external device of format information related to the format of the omnidirectional image data.
  • The image processing apparatus may include a receiving unit that receives, from an external device through a predetermined transmission path, a baseband signal having a different number of pixels for each line, generated by sequentially arranging the pixels on each latitude line of the spherical surface into one line, and a processing unit that performs processing for displaying an omnidirectional image based on the baseband signal.
  • the image processing apparatus in this case may further include a notification unit that notifies the external device of transmission method information on a transmission method of compatible omnidirectional image data.
  • The image processing apparatus may further include an acquisition unit that acquires format information on the format of the omnidirectional image data from the external device, and the processing unit may perform processing for displaying the omnidirectional image based on the format information.
  • A second aspect of the technology disclosed in the present specification is an image processing method comprising: defining latitude lines and longitude lines on a sphere representing an omnidirectional image; and arranging the pixels of the omnidirectional image on each of the defined latitude lines.
  • According to the technology disclosed in the present specification, it is possible to provide an image processing apparatus and an image processing method for processing omnidirectional image data or performing processing relating to transmission of a baseband signal of image data.
  • FIG. 1 is a diagram showing a method of arranging pixels on a spherical surface.
  • FIG. 2 is a diagram showing a configuration example of the AV system 200.
  • FIG. 3 is a diagram showing a configuration example of the HDMI transmitting unit 218 and the HDMI receiving unit 252 in the AV system 200.
  • FIG. 4 is a diagram showing a configuration example of the omnidirectional image data.
  • FIG. 5 is a diagram showing an example of the configuration of the omnidirectional image data.
  • FIG. 6 is a diagram showing an example of the configuration of the omnidirectional image data.
  • FIG. 7 is a diagram showing an example of TMDS transmission data for transmitting omnidirectional image data represented as a spherical surface.
  • FIG. 8 is a diagram showing an example of a packing format when transmitting image data through three TMDS channels # 0 to # 2 of HDMI.
  • FIG. 9 is a diagram showing an example of the data structure of E-EDID.
  • FIG. 10 is a diagram showing an example of the data structure of the Vender Specific area.
  • FIG. 11 is a diagram showing an example data structure of an AVI InfoFrame packet.
  • FIG. 12 is a flowchart showing the processing procedure performed by the disc player 210 when the display 250 is connected.
  • FIG. 13 is a flowchart showing a processing procedure for the disc player 210 to determine the transmission method of the omnidirectional image data.
  • FIG. 14 is a diagram illustrating how a video signal is transmitted in the transmission format described above.
  • FIG. 15 is a diagram showing a modification of the transmission format of the omnidirectional image data.
  • FIG. 16 is a view showing a method of mapping an omnidirectional image on a two-dimensional plane using equidistant cylindrical projection.
  • FIG. 17 is a view showing a method of projecting an omnidirectional image on a polyhedron and mapping it on a two-dimensional plane.
  • the omnidirectional image is an image having information of 360 degrees in all directions vertically and horizontally around a certain viewpoint in space, and has a three-dimensional spread.
  • Most of the image data currently in use consists of rectangular information having a predetermined aspect ratio such as 4:3 or 16:9, and techniques for recording and transmitting rectangular image data are therefore in general use. Accordingly, the omnidirectional image may be mapped onto and handled as a two-dimensional rectangle.
  • Known methods include defining all intersections of latitude lines and longitude lines on a sphere as pixels and mapping them onto a two-dimensional rectangle (see FIG. 16), and projecting the omnidirectional image onto a polyhedron such as a cube (regular hexahedron) circumscribing the sphere and then mapping it onto a two-dimensional plane (see FIG. 17).
  • the cylinder 1602 is developed into a flat surface as indicated by reference numeral 1603.
  • The image data mapped onto the plane 1603 of two-dimensional coordinates can be compression-encoded using a standard moving-image compression-encoding scheme such as H.264, and can then be transmitted and stored. Conversely, if the image data developed on the two-dimensional plane 1603 is mapped back onto the sphere based on the correspondence (index) between the two-dimensional coordinates and the original three-dimensional coordinates, the original omnidirectional image can be reproduced.
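The correspondence between the two-dimensional plane and the sphere can be sketched as follows (a minimal Python illustration; the function names and the exact grid convention, with x spanning longitude 0 to 2π and y spanning latitude +90° to −90°, are assumptions for illustration, not taken from the patent):

```python
import math

def plane_to_sphere(x, y, width, height):
    """Equidistant cylindrical projection: map plane pixel (x, y) to (lat, lon).

    x in [0, width) spans longitude 0 .. 2*pi; y in [0, height) spans
    latitude +pi/2 .. -pi/2, matching the development of cylinder 1602.
    """
    lon = 2 * math.pi * x / width
    lat = math.pi / 2 - math.pi * y / (height - 1)
    return lat, lon

def sphere_to_plane(lat, lon, width, height):
    """Inverse correspondence, used when mapping image data back onto the sphere."""
    x = round(lon * width / (2 * math.pi)) % width
    y = round((math.pi / 2 - lat) * (height - 1) / math.pi)
    return x, y
```

A round trip through both functions returns the original plane pixel, which is the "index" relationship the reproduction step relies on.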
  • With this mapping, the upper and lower high-latitude areas become high-resolution areas where many pixels are mapped per unit area of the original spherical surface, while the central low-latitude areas become low-resolution areas where few pixels are mapped per unit area of the original spherical surface.
  • the omnidirectional image is projected on the inner periphery of the cube 1702 circumscribing the omnidirectional object 1701.
  • The cube 1702 is developed onto a plane as shown in FIG. 17B, and the image data projected on the side surfaces #1 to #6 of the cube 1702 is further mapped onto the two-dimensional plane 1703 shown in FIG. 17C.
  • The image data mapped onto the plane 1703 of two-dimensional coordinates can likewise be compression-encoded using a standard moving-image compression-encoding scheme such as H.264, and can then be transmitted and stored. Conversely, if the image data developed on the two-dimensional plane 1703 is mapped back onto the sphere based on the correspondence (index) between the two-dimensional coordinates and the original three-dimensional coordinates, the original omnidirectional image can be reproduced.
  • latitude and longitude lines are defined on the spherical surface. Subsequently, on each latitude line, pixels are arranged at equal intervals starting from the location of zero longitude. Then, a latitude index is allocated to each pixel of each latitude line, and a longitude index starting from zero longitude is sequentially allocated to the pixels arranged on each latitude line. This makes it possible to refer to the positions of all the pixels arranged on the sphere by a combination of the latitude index and the longitude index.
  • the number of pixels arranged at the poles corresponding to the north pole and the south pole is 1, and the number of pixels arranged on the latitude line corresponding to the equator is maximum.
  • Each latitude line corresponds to a so-called line (scanning line), and the total number of latitude lines or the latitude-line interval can be said to correspond to the vertical resolution. Likewise, the total number of pixels or the pixel arrangement interval on each latitude line (the longitude direction) can be said to correspond to the so-called horizontal resolution.
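The latitude/longitude indexing described above can be sketched as follows (a hedged Python illustration; the rule that each latitude line holds a pixel count proportional to cos(latitude), with at least one pixel at each pole, is an assumption consistent with the near-uniform density arrangement of FIG. 1, and the function names are hypothetical):

```python
import math

def pixels_per_latitude(lat_rad, equator_pixels):
    """Pixel count on a latitude line, proportional to its circumference.

    The poles get a single pixel; every other line gets at least one pixel
    so that every latitude index remains addressable.
    """
    return max(1, round(equator_pixels * math.cos(lat_rad)))

def build_index(total_lines, equator_pixels):
    """Return, for each latitude index, the list of longitude indices.

    Latitude lines are spaced at equal angles from pole to pole, as in FIG. 4.
    A pixel is then referred to by the pair (latitude index, longitude index).
    """
    index = []
    for i in range(total_lines):
        lat = math.pi / 2 - i * math.pi / (total_lines - 1)  # +90 deg .. -90 deg
        n = pixels_per_latitude(lat, equator_pixels)
        index.append(list(range(n)))  # longitude indices start at zero longitude
    return index
```

For example, with 9 latitude lines and 8 pixels on the equator, each pole line holds one pixel and the equator line holds all 8.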
  • In the pixel arrangement on the spherical surface shown in FIG. 1, the amount of data is small.
  • In the equidistant cylindrical mapping (see FIG. 16), the closer to the high latitudes, the more pixels there are compared with the spherical representation, and the larger the amount of data becomes.
  • In the method of projecting onto a cube (see FIG. 17), the resolution does not become uneven depending on the latitude, but the amount of data becomes large in proportion to the ratio of the cube's surface area to that of the sphere.
  • In the pixel arrangement method on the spherical surface shown in FIG. 1, the pixels are arranged at equal intervals on each latitude line (that is, the pixels are arranged with almost the same density over the spherical surface), so the pixel density is constant and the resolution does not become uneven depending on the latitude.
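Under the same cos-proportional assumption as above, a quick calculation illustrates why the spherical arrangement keeps the data amount small compared with mapping onto a full rectangle (the names and the per-line formula here are illustrative assumptions, not taken from the patent):

```python
import math

def spherical_pixel_count(total_lines, equator_pixels):
    """Total pixels when each latitude line holds circumference-proportional pixels."""
    total = 0
    for i in range(total_lines):
        lat = math.pi / 2 - i * math.pi / (total_lines - 1)
        total += max(1, round(equator_pixels * math.cos(lat)))
    return total

lines, width = 3001, 4000
spherical = spherical_pixel_count(lines, width)
equirect = lines * width  # equirectangular: every line carries the full equator width

# The spherical arrangement needs roughly 2/pi (about 64%) of the rectangle's pixels.
ratio = spherical / equirect
```

The ratio approaches 2/π because the average of cos(latitude) over the sphere's pole-to-pole span is 2/π.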
  • Moreover, the hardware resources on the receiving-device side can be very small. Assume that the receiving side displays the received omnidirectional image as it is, using, for example, a display capable of spherical display. When a transmission format that maps the omnidirectional image onto a two-dimensional plane as in FIG. 16 or FIG. 17 is used, the received image data must be reverse-mapped onto the sphere. In contrast, with the pixel arrangement method shown in FIG. 1, the image data representing the omnidirectional image is transmitted in spherical form, so the receiving side needs no reverse-mapping processing, and image data processing using only a very small memory is sufficient. Hardware resources of the receiver can thus be saved.
  • FIG. 2 shows a configuration example of an AV (Audio Visual) system 200 capable of transmitting image data.
  • the AV system 200 includes a disk player 210 as a source device and a display 250 as a sink device. Image data of a spherically-represented omnidirectional image can be transmitted and received between the disk player 210 and the display 250.
  • The disc player 210 is an omnidirectional-signal reproduction device, and the display 250 is an omnidirectional-signal display device.
  • the disk player 210 and the display 250 are connected via, for example, a high-definition multimedia interface (HDMI) cable 350.
  • the disc player 210 is provided with an HDMI terminal 219 to which an HDMI transmitting unit 218 and a high-speed data line interface (I / F) 215 are connected.
  • the display 250 is provided with an HDMI terminal 259 to which the HDMI receiving unit 252 and the high-speed data line interface 251 are connected. Then, one end of the HDMI cable 350 is connected to the HDMI terminal 219 of the disc player 210, and the other end of the HDMI cable 350 is connected to the HDMI terminal 259 of the display 250.
  • Uncompressed (baseband) image data obtained by reproduction on the disc player 210 side is transmitted to the display 250 via the HDMI cable 350. Then, the display 250 processes the received image data to display an image.
  • uncompressed (baseband) audio data obtained by reproduction on the disc player 210 side is transmitted to the display 250 via the HDMI cable 350, and the display 250 side processes the received audio data. Audio is output.
  • the display 250 displays the omnidirectional image in order to provide the omnidirectional image to the user.
  • the omnidirectional image is an image having a three-dimensional spread, it needs to be displayed on the spherical surface in order to display correctly.
  • the received omnidirectional image data can be displayed as it is.
  • an appropriate rectangular portion can be cut out and displayed from the received omnidirectional image.
  • the configuration of the disc player 210 shown in FIG. 2 will be described focusing on the case of transmitting omnidirectional image data.
  • The disc player 210 includes an HDMI terminal 219, an HDMI transmitting unit 218, and a high-speed data line interface 215. The disc player 210 also includes a central processing unit (CPU) 214, a memory 213, an Ethernet (registered trademark) interface 211, a disc drive 212, a general image processing unit 216, and an omnidirectional image processing unit 217, and these components are interconnected via internal buses 220 and 221.
  • the CPU 214 executes control software stored in the memory 213 and centrally controls the operation of each component in the disk player 210 through the internal buses 220 and 221. Data generated when the CPU 214 executes control software is also stored in the memory 213 as appropriate.
  • The HDMI transmitting unit 218 transmits baseband image (or video) and audio data from the HDMI terminal 219 by communication conforming to HDMI, acting as an HDMI source.
  • Image and audio data are transmitted on a Transition Minimized Differential Signaling (TMDS) channel of HDMI.
  • The high-speed data line interface 215 is an interface for a bidirectional communication path configured using predetermined lines of the HDMI cable 350 (in the present embodiment, a reserve line and an HPD (Hot Plug Detect) line).
  • the high speed data line interface 215 is disposed between the Ethernet interface 211 and the HDMI terminal 219.
  • The high-speed data line interface 215 transmits transmission data supplied from the CPU 214 from the HDMI terminal 219 via the HDMI cable 350 to the counterpart device (that is, the display 250 as the HDMI sink device). The high-speed data line interface 215 also receives data sent from the counterpart device over the HDMI cable 350 via the HDMI terminal 219, and supplies the received data to the CPU 214.
  • Content data recorded on a disk (not shown) loaded in the disk drive 212 (that is, content data reproduced from the disk by the disk drive 212) is sent to the general image processing unit 216 through the internal bus 220.
  • The content data recorded on the disc loaded in the disc drive 212 is, for example, image data compressed in the MPEG (Moving Picture Experts Group) format.
  • the general image processing unit 216 decodes or decompresses the compressed image, and then sends it to the omnidirectional image processing unit 217.
  • If the image data is omnidirectional image data, the omnidirectional image processing unit 217 processes it into a state according to the transmission method.
  • the omnidirectional image data is expressed as a pixel arrangement on a spherical surface. That is, latitude lines and longitude lines are defined on the spherical surface representing the omnidirectional image, and the pixels forming the omnidirectional image are arranged at predetermined intervals (for example, equally spaced) in the longitudinal direction on each latitudinal line. (See Figure 1). Then, when transmitting the omnidirectional image data expressed as the pixel arrangement on the spherical surface in this manner, the omnidirectional image processing unit 217 sets pixel data on each latitude line of the omnidirectional image data for each line. Generate a placed baseband image.
  • the baseband image for transmitting the omnidirectional image data has a configuration in which the number of pixels is different for each line, but the details will be described later.
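A minimal sketch of how such a variable-width baseband frame could be assembled (Python; `line_lengths` and `pack_baseband` are hypothetical helper names, and the cos-proportional per-line pixel count is an assumption, not a detail stated in the patent):

```python
import math

def line_lengths(total_lines, equator_pixels):
    """Pixel count of each baseband line (one baseband line per latitude line)."""
    step = math.pi / (total_lines - 1)
    return [max(1, round(equator_pixels * math.cos(math.pi / 2 - i * step)))
            for i in range(total_lines)]

def pack_baseband(sphere_pixels):
    """Flatten per-latitude pixel lists into one serial baseband stream.

    `sphere_pixels` is a list of lists: sphere_pixels[i][j] is the pixel at
    latitude index i, longitude index j. Lines are emitted top to bottom,
    each starting at zero longitude, so the stream carries a different
    number of pixels per line.
    """
    stream = []
    for line in sphere_pixels:
        stream.extend(line)
    return stream
```

With 5 latitude lines and 4 equator pixels, the per-line counts come out as 1, 3, 4, 3, 1 (12 pixels in total), rather than the 20 pixels a rectangular 5×4 frame would carry.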
  • the omnidirectional image data processed by the omnidirectional image processing unit 217 is supplied to the HDMI transmission unit 218.
  • the HDMI transmitting unit 218 packs the omnidirectional image data and outputs the packed data from the HDMI terminal 219.
  • The HDMI transmitting unit 218 packs the baseband signal of the omnidirectional image data according to a transmission method such as RGB 4:4:4, YCbCr 4:4:4, or YCbCr 4:2:0 (packing is described later).
  • If the image data is not omnidirectional image data, the processing of the omnidirectional image processing unit 217 is skipped, and the image data is supplied to the HDMI transmitting unit 218 as it is. If the image data is omnidirectional image data, it is processed by the omnidirectional image processing unit 217 into a state according to the selected transmission method and then supplied to the HDMI transmitting unit 218.
  • When the content data reproduced from the disc by the disc drive 212 is sent out to a network instead of the HDMI cable 350, the content data is output to the network terminal 222 through the Ethernet interface 211. Similarly, when the content data is sent out not on the TMDS channels of the HDMI cable 350 but on the bidirectional communication path (described above), it is output to the HDMI terminal 219 through the Ethernet interface 211 and the high-speed data line interface 215.
  • the display 250 includes an HDMI terminal 259, an HDMI receiving unit 252, and a high-speed data line interface 251.
  • The display 250 also includes a CPU 253, a memory 254, an Ethernet interface 255, a display panel 258, a general image processing unit 257, and an omnidirectional image processing unit 256, and these components are interconnected via internal buses 260 and 261.
  • the CPU 253 executes control software stored in the memory 254, and centrally controls the operation of each component in the display 250 through the internal buses 260 and 261. Data generated when the CPU 253 executes the control software is also stored in the memory 254 as appropriate.
  • the HDMI receiving unit 252 receives baseband image (or video) and audio data supplied to the HDMI terminal 259 through the HDMI cable 350 by communication conforming to HDMI as an HDMI sink. Image and audio data are transmitted on the TMDS channel of HDMI (same as above).
  • The high-speed data line interface 251 is an interface for a bidirectional communication path configured using predetermined lines of the HDMI cable 350 (the reserve line and the HPD line in this embodiment).
  • the high speed data line interface 251 is disposed between the Ethernet interface 255 and the HDMI terminal 259.
  • The high-speed data line interface 251 transmits transmission data supplied from the CPU 253 from the HDMI terminal 259 via the HDMI cable 350 to the counterpart device (that is, the disc player 210 as the HDMI source device). The high-speed data line interface 251 also receives data sent from the counterpart device over the HDMI cable 350 via the HDMI terminal 259, and supplies the received data to the CPU 253.
  • the HDMI receiving unit 252 receives a baseband image.
  • The HDMI receiving unit 252 depacks a baseband signal packed by a transmission method such as RGB 4:4:4, YCbCr 4:4:4, or YCbCr 4:2:0. If the image data received by the HDMI receiving unit 252 is omnidirectional image data, the omnidirectional image processing unit 256 processes it into a state suitable for the display panel 258 and then sends it to the general image processing unit 257. If the received image data is not omnidirectional image data, the omnidirectional image processing unit 256 performs no processing, and the received image data is sent to the general image processing unit 257 as it is.
  • the omnidirectional image data is expressed as a pixel arrangement on a spherical surface. That is, latitude lines and longitude lines are defined on the spherical surface representing the omnidirectional image, and the pixels forming the omnidirectional image are arranged at predetermined intervals (for example, equally spaced) in the longitudinal direction on each latitudinal line. (See Figure 1).
  • The omnidirectional image processing unit 256 carries out processing that maps each pixel arranged on each line of the baseband image onto the spherical surface representing the omnidirectional image, thereby reproducing the original omnidirectional image.
  • the omnidirectional image processing unit 256 performs processing of the received omnidirectional image data based on the format information of the omnidirectional image data notified from the disc player 210 side. For example, the omnidirectional image processing unit 256 calculates the number of pixels on each line of the baseband image based on the number of pixels on the equator and the total number of lines in the omnidirectional image data acquired as the format information. Also, the omnidirectional image processing unit 256 can perform display position correction when displaying the omnidirectional image, based on the origin position information acquired as the format information.
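The per-line calculation and the reverse mapping on the receiving side might look like the following sketch (Python; the cos-proportional pixel count and the function name are assumptions for illustration, derived only from the format information described above, i.e. the equator pixel count and the total line count):

```python
import math

def pixel_to_sphere(i, j, total_lines, equator_pixels):
    """Map baseband pixel (line i, position j) to (latitude, longitude) in radians.

    Latitude lines are assumed equally spaced in angle from +90 to -90 degrees;
    pixels on a line are equally spaced starting from zero longitude.
    """
    lat = math.pi / 2 - i * math.pi / (total_lines - 1)
    n = max(1, round(equator_pixels * math.cos(lat)))  # pixels on this line
    lon = 2 * math.pi * j / n                          # j-th of n equal steps
    return lat, lon
```

For example, with 3,001 lines and 4,000 equator pixels, the first pixel of line 1500 lands on the equator at zero longitude.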
  • the general image processing unit 257 performs, for example, image quality improvement processing and superimposing processing of graphic data (such as OSD) on the image data. After such general image processing, the image data is sent to the display panel 258, and the image is presented to the user.
  • FIG. 3 shows a configuration example of the HDMI transmitting unit (HDMI source) 218 on the disc player 210 side and the HDMI receiving unit (HDMI sink) 252 on the display 250 side in the AV system 200.
  • the HDMI cable 350 connecting the HDMI transmitting unit 218 and the HDMI receiving unit 252 includes TMDS channels # 0 to # 2 and a TMDS clock channel. Image data and audio data are serially transmitted in one direction in synchronization with the pixel clock from the HDMI transmitting unit 218 to the HDMI receiving unit 252 using the TMDS channels # 0 to # 2. Also, the pixel clock is transmitted using the TMDS clock channel.
  • During the effective image period (hereinafter, "active video period"), which is the interval from one vertical synchronization signal (Vsync) to the next minus the horizontal blanking period and the vertical blanking period, the HDMI transmitting unit 218 transmits differential signals corresponding to the pixel data of one screen of an uncompressed (baseband) image to the HDMI receiving unit 252 in one direction.
  • the HDMI transmitting unit 218 transmits a differential signal corresponding to at least audio data and control data associated with an image to the HDMI receiving unit 252 in one direction in the horizontal blanking interval or the vertical blanking interval.
  • The HDMI transmitter 81 included in the HDMI transmitting unit 218 converts the pixel data of the uncompressed image into corresponding differential signals and serially transmits them in one direction to the HDMI receiving unit 252 over the three TMDS channels #0 to #2 included in the HDMI cable 350.
  • The HDMI transmitter 81 also converts the audio data accompanying the uncompressed image, together with necessary control data and other auxiliary data, into corresponding differential signals and serially transmits them in one direction to the HDMI receiving unit 252 over the three TMDS channels #0 to #2.
  • the HDMI transmitter 81 transmits a pixel clock synchronized with pixel data transmitted on the three TMDS channels # 0 to # 2 to the HDMI receiving unit 252 on the TMDS clock channel included in the HDMI cable 350. For example, in each TMDS channel # 0 to # 2, 10 bits of pixel data are transmitted during one clock of the pixel clock.
  • The HDMI receiving unit 252 receives, on a plurality of channels, the differential signals corresponding to pixel data transmitted in one direction from the HDMI transmitting unit 218 during the active video period. It also receives, on a plurality of channels, the differential signals corresponding to audio data and control data transmitted in one direction from the HDMI transmitting unit 218 during the horizontal blanking interval or the vertical blanking interval.
  • The HDMI receiver 82 included in the HDMI receiving unit 252 receives the differential signals corresponding to pixel data, and the differential signals corresponding to audio data and control data, transmitted in one direction from the HDMI transmitting unit 218 over the three TMDS channels #0 to #2 of the HDMI cable 350, in synchronization with the pixel clock transmitted from the HDMI transmitting unit 218 on the TMDS clock channel.
  • the HDMI cable 350 also includes transmission channels called Display Data Channel (DDC) 83 and Consumer Electronics Control (CEC) line 84 in addition to TMDS channels # 0 to # 2 and TMDS clock channel.
  • the DDC 83 includes two signal lines (not shown) included in the HDMI cable 350.
  • the HDMI transmitting unit 218 uses the DDC 83 in order to read E-EDID (Enhanced Extended Display Identification Data) from the HDMI receiving unit 252 connected via the HDMI cable 350.
  • The HDMI receiving unit 252 has an EDID ROM (Read Only Memory) 85 in addition to the HDMI receiver 82, and stores in the EDID ROM 85 its E-EDID, which is performance information related to its own configuration and capability.
  • The HDMI transmitting unit 218 can read out the E-EDID of the HDMI receiving unit 252 connected via the HDMI cable 350 from the EDID ROM 85 via the DDC 83. Then, based on the E-EDID, the HDMI transmitting unit 218 recognizes the performance of the HDMI receiving unit 252.
  • For example, the HDMI transmitting unit 218 recognizes the image formats (profiles) supported by the electronic device (display 250) having the HDMI receiving unit 252, such as RGB, YCbCr 4:4:4, and YCbCr 4:2:0.
  • the CEC line 84 is formed of a single signal line (not shown) included in the HDMI cable 350.
  • the CEC line 84 is used to perform bi-directional communication of control data between the HDMI transmitting unit 218 and the HDMI receiving unit 252.
  • The HDMI cable 350 further includes an HPD line 86 connected to pin 19 (the HPD pin).
  • the HDMI transmitting unit 218 can detect that the HDMI receiving unit 252 is connected via the HDMI cable 350 using the HPD line 86.
  • The HDMI cable 350 includes a +5 V power supply line 87, connected to pin 18, which is used to supply +5 V power from the source device to the sink device.
  • The HDMI cable 350 also includes a reserve line 88 connected to pin 14.
  • FIGS. 4 and 5 show an example of the configuration of the omnidirectional image data of the original signal.
  • FIG. 4 shows a side view of a spherical surface.
  • Latitude lines are defined at every angular interval θ from the equatorial plane of the sphere.
  • FIG. 5A shows a state in which a latitude line is added on the spherical surface.
  • FIG. 5B shows the latitude line viewed from the top.
  • the omnidirectional image is expressed as a spherical surface by arranging the pixels at equal intervals starting from the location of the zero longitude on each latitude line.
  • the number of pixels arranged at the poles corresponding to the north pole and the south pole is 1, and the number of pixels arranged on the latitude line corresponding to the equator becomes maximum.
  • In this example, the number of pixels on the latitude line corresponding to the equator is 4000, and the angle between adjacent latitude lines is constant at π/3000 ≈ 0.0010472 radians, so the total number of latitude lines, which corresponds to the vertical resolution, is 3001.
  • the number of pixels arranged for each latitude line (line number) corresponding to the scanning line is summarized in the following Table 1 (however, the display of information of some lines is omitted).
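As an illustration of the arrangement above, the per-line pixel counts of Table 1 can be approximated by making each latitude line's pixel count proportional to the circumference of its latitude circle. The sketch below assumes the equal-angle (SLBA) arrangement with the example figures of 4000 equator pixels and 3001 lines; the rounding rule and the function name are illustrative assumptions, not taken from the specification.

```python
import math

def pixels_per_line(total_lines=3001, equator_pixels=4000):
    """Approximate pixel count for each latitude line under the equal-angle
    (SLBA) scheme: lines are spaced pi/(total_lines-1) radians apart, and the
    pixel count shrinks with the circumference of each latitude circle
    (cosine of the latitude), with a minimum of one pixel at each pole."""
    step = math.pi / (total_lines - 1)      # angular spacing between lines
    counts = []
    for k in range(total_lines):
        lat = -math.pi / 2 + k * step       # latitude, south pole to north pole
        counts.append(max(1, round(equator_pixels * math.cos(lat))))
    return counts

counts = pixels_per_line()
# The equator (middle line) carries the maximum of 4000 pixels,
# while each pole line carries a single pixel.
```

This reproduces the boundary values stated in the text (1 pixel at the poles, 4000 on the equator); intermediate lines are an approximation since Table 1 is abbreviated.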
  • The latitude lines need not be defined at equal angular intervals φ from the equatorial plane of the sphere; as shown in FIG. 6, it is also possible to define omnidirectional image data represented as a spherical surface by placing the latitude lines at equal height intervals h along the earth axis. However, the image quality is considered to be higher with the transmission method in which the latitude lines are defined at equal angular intervals φ from the equatorial plane of the sphere.
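For comparison, the equal-height (SLBH) arrangement just described can be sketched by stepping the height h uniformly along the earth axis and converting each height back to a latitude angle. The function name and the clamping of floating-point rounding error are assumptions for illustration.

```python
import math

def slbh_latitudes(total_lines=3001, radius=1.0):
    """Latitude angles (radians) under the equal-height (SLBH) scheme:
    latitude lines are spaced at equal height intervals h along the polar
    axis of the sphere, rather than at equal angles from the equator."""
    h = 2.0 * radius / (total_lines - 1)    # height step along the earth axis
    # Clamp to [-1, 1] so floating-point rounding never leaves asin's domain.
    return [math.asin(max(-1.0, min(1.0, -radius + k * h)))
            for k in range(total_lines)]

lats = slbh_latitudes()
# lats[0] is the south pole (-pi/2), lats[-1] the north pole (+pi/2);
# angular spacing is narrower near the equator and wider near the poles.
```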
  • FIG. 7 shows an example of TMDS transmission data in the case of transmitting the omnidirectional image data represented as a spherical surface as shown in FIG. 5 and FIG. 6 by the TMDS channel of HDMI.
  • The figure shows the sections of the various transmission data when image data of 4000 pixels (width) × 3001 lines (height) is transmitted on the TMDS channels #0 to #2.
  • A video field (Video Field), the period in which transmission data is transmitted on the three TMDS channels #0 to #2 of HDMI, contains three kinds of sections according to the type of transmission data: a video data section (Video Data period), a data island section (Data Island period), and a control section (Control period).
  • The video field period is the interval from the rising edge (active edge) of one vertical synchronization signal to the rising edge of the next vertical synchronization signal, and is divided into a horizontal blanking period, a vertical blanking period, and an active video period (Active Video), which is the video field period minus the horizontal and vertical blanking periods.
  • The video data section is assigned to the active video period.
  • This video data section consists of 3001 lines, and data of up to 4000 active pixels can be transmitted on each line. That is, the video data section can carry the effective-pixel data of 4000 pixels × 3001 lines constituting one uncompressed screen of image data.
  • the data island period and the control period are assigned to the horizontal blanking period and the vertical blanking period.
  • Auxiliary data (Auxiliary data) is transmitted in the data island period and the control period.
  • The dark gray areas in FIG. 7 are the data island sections, which are allocated to parts of the horizontal blanking period and the vertical blanking period.
  • In the data island section, auxiliary data not related to control, for example packets of audio data, is transmitted.
  • The control section is assigned to the remaining parts of the horizontal blanking period and the vertical blanking period.
  • In FIG. 7, the hatched areas are the control sections.
  • In the control section, data relating to control, for example the vertical synchronization signal, the horizontal synchronization signal (Hsync), and control packets, is transmitted.
  • The total number of latitude lines, corresponding to the vertical resolution, is 3001, and the number of pixels varies from latitude line to latitude line. That is, the number of pixels placed at the poles corresponding to the north pole and the south pole is 1, and the number of pixels placed on the latitude line corresponding to the equator is the maximum of 4000 (see, for example, Table 1).
  • The pixels of each latitude line of the omnidirectional image data shown in FIGS. 4 and 5 are arranged sequentially on the corresponding lines of the active video period of 4000 pixels × 3001 lines, forming a video data section of 3001 lines.
  • the number of effective pixels of each line is different for each line. This is because only a minimum of one pixel is arranged on the scan line corresponding to the latitude lines of the north pole and the south pole, but a maximum of 4000 pixels is arranged on the scan line corresponding to the equator.
  • the TMDS transmission data shown in FIG. 7 can be said to have a data format in which the number of pixels is different for each line (or the number of pixels is not constant in each line).
  • The gray area on each line in FIG. 7 is that line's video data section. Since the number of pixels differs from line to line, the length of the video data section also differs for each line.
  • One line can also be defined as a section delimited by synchronization signals. On each line, when the video data section shown in gray ends, the next horizontal synchronization signal is inserted, the position returns to the beginning of the line, and transmission of the next line's signal starts. Therefore, when the number of pixels differs for each line, the interval between horizontal synchronization signals is not the same for each line.
  • FIG. 14 exemplifies how a video signal is transmitted in a section from one vertical synchronization signal to the next vertical synchronization signal in the transmission format shown in FIG.
  • FIG. 15 shows an example of a transmission format of omnidirectional image data in which the length of each line of the video data section is made uniform.
  • In FIG. 15, invalid pixels are filled into the portions surrounded by the dashed rectangles.
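A rough payload comparison of the two formats, under the 4000 × 3001 example figures and the cosine approximation of per-line pixel counts (an assumption, since Table 1 is abbreviated): the variable-line-length format of FIG. 7 carries only the active pixels of each latitude line, while the uniform-length format of FIG. 15 pads every line to the equator width with invalid pixels.

```python
import math

# Assumed per-line pixel count for the variable-length format (FIG. 7):
# max(1, round(4000 * cos(latitude))) pixels per latitude line.
lines, equator = 3001, 4000
step = math.pi / (lines - 1)                       # equal-angle line spacing
active = sum(max(1, round(equator * math.cos(-math.pi / 2 + k * step)))
             for k in range(lines))
padded = equator * lines                           # FIG. 15: every line padded to 4000

# Under this approximation the variable-length format carries roughly
# 2/pi (about 64%) of the padded payload.
print(active, padded, round(active / padded, 3))
```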
  • When transmitting the omnidirectional image data represented as a pixel arrangement on a spherical surface as shown in FIGS. 4 and 5 from the disc player 210 to the display 250, the omnidirectional image processing unit 217 arranges the pixel data of each latitude line of the omnidirectional image data to generate a baseband image as shown in FIG. 7.
  • In this way, image data representing an omnidirectional image as a spherical surface can be defined as a baseband image having a different number of pixels for each line (or in which the number of pixels is not constant across lines) and transmitted on the TMDS channels of HDMI.
  • On the display 250 side, there is no need to perform a process of remapping the received image data onto the sphere, or image data processing with only a small amount of memory suffices.
  • the received omnidirectional image data can be displayed as it is.
  • an appropriate rectangular portion can be cut out and displayed from the received omnidirectional image.
  • In the AV system 200, omnidirectional image data with latitude lines defined as shown in FIG. 6 can likewise be transmitted as TMDS transmission data, that is, as a baseband image.
  • the display 250 side has an advantage that the process of reversely mapping the received image data on the spherical surface is not necessary.
  • FIG. 8 shows an example of packing format when transmitting image data through three TMDS channels # 0 to # 2 of HDMI.
  • In the RGB 4:4:4 scheme, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data are arranged in the data areas of each pixel in the TMDS channels #0, #1, and #2, respectively.
  • Although FIG. 8 exemplifies only the RGB 4:4:4 scheme as a transmission method for image data, other transmission methods such as YCbCr 4:4:4 and YCbCr 4:2:0 may of course be used.
  • When transmitting the omnidirectional image data from the disc player 210 to the display 250, the HDMI transmitting unit 218 packs the baseband signal with a different number of pixels for each line, generated by the omnidirectional image processing unit 217, into a packing format as shown in FIG. 8 and outputs it from the HDMI terminal 219.
  • The HDMI receiving unit 252 on the display 250 side has the EDID ROM 85 that stores the E-EDID related to its own performance. The HDMI transmitting unit 218 on the disc player 210 side reads out the E-EDID of the HDMI receiving unit 252 from the EDID ROM 85 through the DDC 83 included in the HDMI cable 350, and can set the performance of the HDMI receiving unit 252 based on the E-EDID.
  • Based on the E-EDID read from the HDMI receiving unit 252 of the display 250, the CPU 214 of the disc player 210 also recognizes the transmission methods of omnidirectional image data that the display 250 supports. Specifically, the CPU 214 of the disc player 210 recognizes whether the display 250 supports a baseband image in which the number of pixels differs for each line (or is not constant across lines).
  • FIG. 9 shows an example of the data structure of E-EDID.
  • The illustrated E-EDID is composed of a basic block and an extension block.
  • At the top of the basic block, data defined by the E-EDID 1.3 standard, represented by "E-EDID1.3 Basic Structure", is placed, followed by timing information for maintaining compatibility with the conventional EDID, represented by "Preferred timing", and timing information, different from "Preferred timing", for maintaining compatibility with the conventional EDID, represented by "2nd timing".
  • a data area to be expanded to store the omnidirectional image information is defined.
  • FIG. 10 shows an example of the data structure of the Vender Specific area in the extension block of E-EDID.
  • In the Vendor Specific area, 1-byte blocks 0 to N are provided, and a data area for the omnidirectional image information to be stored by the sink device (in the present embodiment, the display 250) is defined.
  • In the sixth byte are placed a flag indicating a function supported by the sink device, represented by "Supports-AI", and flags represented by "DC-48bit", "DC-36bit", and "DC-30bit", respectively.
  • the 8th byte to the 11th byte are information used for transmitting and receiving a stereoscopic image (see, for example, Patent Document 2).
  • Information on a stereoscopic image is stored in the eighth byte to the tenth byte.
  • information related to stereophonic sound is stored in the 11th byte. Since stereoscopic image transmission and reception is not directly related to the technology disclosed herein, detailed description of the eighth to eleventh bytes is omitted.
  • the 12th byte to the 16th byte store information on the omnidirectional image.
  • In the seventh bit and the sixth bit of the twelfth byte, data indicating the methods of omnidirectional image data supported by the sink device is written.
  • The seventh bit indicates the method (SLBA: Sphere Latitude by Angle) in which the interval of the latitude lines defined when arranging each pixel of the omnidirectional image data on the spherical surface is determined by the angle from the equatorial plane of the sphere.
  • The sixth bit indicates the method (SLBH: Sphere Latitude by Height) in which the interval of the latitude lines defined when arranging each pixel of the omnidirectional image data on the spherical surface is determined by the height along the earth axis of the sphere.
  • In the 13th byte and the 14th byte, the maximum number of pixels on the equator (Pixel Number on Equator) that the sink device can handle is written; in this example, the maximum number of pixels, 4000, is written.
  • the maximum number of latitude lines (or the total number of lines in the video data section) that can be handled by the sink device (Total Latitude Line Number) is written.
  • In this way, the sink device can specify, as the omnidirectional image data methods it can handle, how the interval of the latitude lines on which the pixels of the omnidirectional image data are arranged is determined (by angle or by height), the maximum number of pixels placed on the equator, the maximum number of latitude lines it supports (the maximum number of lines), and so on.
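To make the byte layout concrete, here is a hypothetical parser for these Vendor Specific fields. The bit positions in the 12th byte follow the description above; the big-endian order of the multi-byte fields and the assignment of the 15th and 16th bytes to the maximum latitude-line count are assumptions consistent with the text, and every identifier name is illustrative.

```python
def parse_vsdb_omni(vsdb: bytes) -> dict:
    """Hypothetical parser for the omnidirectional-image fields that this
    embodiment defines in the Vendor Specific area of E-EDID (FIG. 10).
    Big-endian multi-byte order is an assumption."""
    b12 = vsdb[12]
    return {
        "slba": bool(b12 & 0x80),                        # bit 7: equal-angle scheme
        "slbh": bool(b12 & 0x40),                        # bit 6: equal-height scheme
        "max_equator_pixels": vsdb[13] << 8 | vsdb[14],  # 13th-14th bytes
        "max_latitude_lines": vsdb[15] << 8 | vsdb[16],  # assumed 15th-16th bytes
    }

# Example: a sink supporting SLBA, 4000 pixels on the equator, 3001 lines.
sample = bytes(12) + bytes([0x80, 0x0F, 0xA0, 0x0B, 0xB9])
info = parse_vsdb_omni(sample)
```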
  • When transmitting omnidirectional image data to the display 250, the disc player 210 can identify the types of omnidirectional image data that the display 250 supports, based mainly on the information described in the 12th to 16th bytes of the Vendor Specific area of the E-EDID read from the HDMI receiving unit 252 on the display 250 side. The disc player 210 then selects one of the methods supported by the destination display 250 and transmits the omnidirectional image data.
  • the information on the omnidirectional image as described above may be arranged not in the Vendor Specific region but in an information storage region newly provided for the omnidirectional image transmission.
  • the means or method for storing information on the omnidirectional image in the sink device, and the means or method for transmitting the information on the omnidirectional image from the sink device to the source device are not limited to specific ones.
  • When transmitting omnidirectional image data, the disc player 210 transmits information on the image format currently being transmitted to the display 250.
  • The information on the image format includes the method (SLBA or SLBH) of determining the interval of the latitude lines defined when arranging each pixel of the omnidirectional image data on the spherical surface, the maximum number of pixels arranged on the equator (Pixel Number on Equator), the number of latitude lines (or total number of lines in the video data section) defined when arranging each pixel of the omnidirectional image data on the sphere (Total Latitude Line Number), the pixel number of the origin position, the line number of the origin position, and so on.
  • The disc player 210 can insert information on the image format of the omnidirectional image data being transmitted into the blanking periods of the omnidirectional image data (uncompressed baseband video signal) transmitted to the display 250.
  • For example, the disc player 210 can use an AVI (Auxiliary Video Information) InfoFrame packet or the like of the HDMI signal to insert information on the image format of the omnidirectional image data being transmitted into the blanking periods of the omnidirectional image data.
  • the AVI InfoFrame packet is placed in a data island period (see FIG. 7) in TMDS transmission data.
  • FIG. 11 shows an example data structure of an AVI InfoFrame packet. According to the HDMI standard, it is possible to transmit incidental information on an image being transmitted from a source device to a sink device using an AVI InfoFrame packet.
  • the second byte describes information indicating the packet length.
  • The packet length of the AVI InfoFrame is currently "0x0E", but when the omnidirectional image output format information is defined in the 18th to 26th bytes as in this embodiment, it becomes "0x17", as shown in FIG. 11.
  • Each piece of information described in the AVI InfoFrame is defined in CEA-861-D Section 6-4. Therefore, detailed description of the third to seventeenth bytes which are not particularly modified or added in the present embodiment will be omitted.
  • the 18th to 26th bytes contain information on the image format of the omnidirectional image data being transmitted.
  • In the eighteenth byte, the transmission method of the omnidirectional image data selected by the source device is designated. Specifically, data indicating the method of the omnidirectional image data selected by the source device is written in the seventh bit and the sixth bit of the eighteenth byte.
  • The seventh bit indicates the method (SLBA) in which the interval of the latitude lines defined when arranging each pixel of the omnidirectional image data on the spherical surface, as shown in FIGS. 4 and 5, is determined by the angle from the equatorial plane of the sphere.
  • The sixth bit indicates the method (SLBH) in which the interval of the latitude lines defined when arranging each pixel of the omnidirectional image data on the spherical surface, as shown in FIG. 6, is determined by the height along the earth axis of the sphere.
  • the maximum number of pixels on the equator (Pixel Number on Equator) in the transmitted omnidirectional image data is set.
  • the number of latitude lines (or the total number of lines in the video data section) (Total Latitude Line Number) in the transmitted omnidirectional image data is set.
  • the origin position information of the transmitted omnidirectional image data is set.
  • The origin position is the reference position of the image.
  • the sink device can use this information to present the origin always in front of the viewer.
  • the pixel number (Pixel Number) of the origin position is set in the 23rd byte and the 24th byte, and the line number (Line Number) of the origin position is set in the 25th byte and the 26th byte.
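The 18th-to-26th-byte layout described above can be sketched as a packing helper. The bit positions of the 18th byte follow the text; the big-endian multi-byte order and the exact byte pairs for the equator pixel count (19th–20th) and latitude-line count (21st–22nd) are assumptions consistent with, but not stated by, the description, and all names are hypothetical.

```python
def omni_infoframe_bytes(slba: bool, equator_pixels: int, latitude_lines: int,
                         origin_pixel: int, origin_line: int) -> bytes:
    """Hypothetical packing of the omnidirectional fields this embodiment adds
    to the AVI InfoFrame (18th to 26th bytes per FIG. 11)."""
    b18 = 0x80 if slba else 0x40          # bit 7: SLBA, bit 6: SLBH

    def u16(v):                           # helper: split a value into two bytes
        return [v >> 8 & 0xFF, v & 0xFF]  # assumed big-endian order

    return bytes([b18] + u16(equator_pixels) + u16(latitude_lines)
                 + u16(origin_pixel) + u16(origin_line))

# Example: SLBA, 4000 equator pixels, 3001 lines, origin at pixel 2000/line 1500.
payload = omni_infoframe_bytes(True, 4000, 3001, 2000, 1500)
assert len(payload) == 9                  # covers the 18th through 26th bytes
```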
  • the display 250 as a sink device can control the processing of the image data transmitted from the disc player 210 based on the information on the image format notified using the AVI InfoFrame packet.
  • When the seventh bit or the sixth bit of the eighteenth byte of the received AVI InfoFrame packet is set, the display 250 can determine that omnidirectional image data is being transmitted from the disc player 210. Furthermore, the display 250 can obtain the number of pixels on the equator and the total number of lines of the transmitted omnidirectional image data from the values set in the 19th byte to the 22nd byte of the same packet, and can calculate the number of pixels on each line from these.
  • packing formats such as RGB 4: 4: 4, YCbCr 4: 4: 4, YCbCr 4: 2: 0, etc. are designated by the sixth bit and the fifth bit of the fourth byte of the AVI InfoFrame packet.
  • the display 250 can perform display position correction at the time of presentation on the display panel 258 based on the origin position information of the transmission image, which is set to the 23rd byte to the 26th byte of the AVI InfoFrame packet.
  • FIG. 12 shows, in the form of a flowchart, a processing procedure performed when the disc player 210 connects to the display 250 via the HDMI cable 350 in the AV system 200 shown in FIG.
  • the illustrated procedure is basically implemented mainly by the CPU 214 in the disk player 210.
  • the disc player 210 checks whether the HPD signal of the HDMI cable 350 connected to its own HDMI terminal 219 is at high level (step S1201).
  • If the HPD signal is not at high level (No in step S1201), the disc player 210 determines that no HDMI sink device is connected via the HDMI cable 350, and ends the process.
  • If the HPD signal is at high level (Yes in step S1201), the disc player 210 uses the DDC 83 of the HDMI cable 350 to read the E-EDID from the EDID ROM 85 in the HDMI receiving unit 252 of the HDMI sink device (display 250) (step S1202).
  • The data structure of the E-EDID is as illustrated in FIG. 9. As shown in FIG. 10, information on the transmission methods of omnidirectional images supported by the HDMI sink device is stored in the 12th to 16th bytes of the Vendor Specific area included in the extension block of the E-EDID. Of these, the seventh bit of the 12th byte indicates the method (SLBA) in which the interval of the latitude lines on which the pixels of the omnidirectional image data are arranged is determined by the angle from the equatorial plane of the sphere, and the sixth bit indicates the method (SLBH) in which the interval of the latitude lines is determined by the height along the earth axis of the sphere.
  • By checking whether the seventh bit or the sixth bit of the 12th byte is set, the disc player 210 can confirm whether the connected HDMI sink device supports display of omnidirectional images. In the processing procedure shown in FIG. 12, it is assumed that the connected HDMI sink device is a device (display 250) that supports display of omnidirectional images.
  • Next, the disc player 210 checks whether there is an omnidirectional image to be transmitted to the connected HDMI sink device (display 250) (step S1203).
  • If there is no omnidirectional image to transmit (No in step S1203), the disc player 210 sets data indicating non-transmission of an omnidirectional image in the AVI InfoFrame packet inserted in the blanking periods of the image data (step S1207), and ends the process.
  • Non-transmission of an omnidirectional image can be indicated by setting both the seventh bit and the sixth bit of the 18th byte of the AVI InfoFrame packet to 0.
  • If there is an omnidirectional image to transmit to the display 250 (Yes in step S1203), the disc player 210 starts the process of transmitting the omnidirectional image data to the display 250.
  • The disc player 210 determines the transmission method of the omnidirectional image data to be transmitted to the display 250 (step S1204). At this time, the disc player 210 selects the transmission method of the omnidirectional image data in consideration of the transmission methods of omnidirectional images supported by the display 250, described in the Vendor Specific area of the EDID read from the display 250 in step S1202.
  • At this time, the disc player 210 determines, for example, the method of determining the interval of the latitude lines, the maximum number of pixels arranged on one line, the total number of latitude lines (or the total number of lines), and the origin position information of the omnidirectional image data.
  • Next, the disc player 210 determines whether transmission of the omnidirectional image data is to be started (step S1205). If transmission of the omnidirectional image data is not started (No in step S1205), the disc player 210 sets data indicating non-transmission of an omnidirectional image in the AVI InfoFrame packet (step S1207), and ends the process.
  • When transmission of the omnidirectional image data is started (Yes in step S1205), the disc player 210 sets data indicating the transmission method of the omnidirectional image data in the 18th to 26th bytes of the AVI InfoFrame packet (step S1206), and ends the process.
  • FIG. 13 shows, in the form of a flowchart, the detailed processing procedure by which the disc player 210 determines the transmission method of the omnidirectional image data in step S1204 of the flowchart shown in FIG. 12.
  • the disc player 210 checks whether or not the seventh bit of the twelfth byte of the Vendor Specific area included in the extension block of the E-EDID acquired from the EDID ROM 85 of the HDMI sink device (display 250) is set. (Step S1301).
  • As described above, there are two methods of determining the spacing of the latitude lines when arranging each pixel of the omnidirectional image data on the sphere: the SLBA method, determined by the angle from the equatorial plane of the sphere, and the SLBH method, determined by the height along the earth axis of the sphere. Of these, the former SLBA method is the transmission method with the higher image quality.
  • Therefore, if the seventh bit of the 12th byte of the Vendor Specific area is set (Yes in step S1301), the disc player 210 preferentially selects the SLBA method, in which the interval of the latitude lines is determined by the angle from the equatorial plane of the sphere (step S1302), and ends the process.
  • If the seventh bit of the 12th byte of the Vendor Specific area is not set (No in step S1301), the disc player 210 next checks whether the sixth bit of the 12th byte of the Vendor Specific area is set (step S1303).
  • If the sixth bit is set (Yes in step S1303), the disc player 210 selects the SLBH method, in which the interval of the latitude lines is determined by the height along the earth axis of the sphere (step S1304), and ends the process.
  • If the sixth bit is not set either (No in step S1303), the disc player 210 determines that there is no method by which it can transmit the omnidirectional image data to the display 250, sets non-output of the omnidirectional image (step S1305), and ends the process.
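The selection logic of the FIG. 13 flowchart reduces to a short priority check on the 12th byte of the Vendor Specific area; this sketch assumes the bit positions described above and uses hypothetical names.

```python
def select_transmission_method(vsdb_byte12: int):
    """Selection logic of FIG. 13: prefer the higher-quality equal-angle
    scheme (SLBA, bit 7), fall back to equal-height (SLBH, bit 6), and
    signal non-transmission when the sink supports neither."""
    if vsdb_byte12 & 0x80:       # steps S1301 / S1302
        return "SLBA"
    if vsdb_byte12 & 0x40:       # steps S1303 / S1304
        return "SLBH"
    return None                  # step S1305: no compatible method

# When the sink supports both methods, SLBA wins by priority.
assert select_transmission_method(0xC0) == "SLBA"
```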
  • As described above, when transmitting omnidirectional image data from the disc player 210 to the display 250, the disc player 210 acquires information on the transmission methods of omnidirectional image data that the display 250 can handle, and transmits the omnidirectional image data using a transmission method supported by the display 250.
  • the disc player 210 when transmitting the omnidirectional image data, transmits the transmission method information to the display 250.
  • In the present embodiment, the disc player 210 describes the transmission method information of the omnidirectional image data in an AVI InfoFrame packet and transmits it to the display 250 using the blanking periods of the image data (video signal).
  • the method for the disc player 210 to transmit the transmission method information of the omnidirectional image data to the display 250 is not limited to the above.
  • the disc player 210 may transmit transmission method information of omnidirectional image data to the display 250 via the CEC line 84 of the HDMI cable 350.
  • the disc player 210 transmits the transmission method information of the omnidirectional image data to the display 250 via the bidirectional communication path (described above) configured by the reserve line 88 of the HDMI cable 350 and the HPD line 86. You may do so.
  • FIG. 2 exemplifies the AV system 200 using an HDMI transmission path, but baseband digital interfaces usable for transmission of omnidirectional image data besides HDMI include DVI (Digital Visual Interface), the DP (DisplayPort) interface, and wireless interfaces using 60 GHz millimeter waves.
  • the technique disclosed in the present specification can be similarly applied to transmission of omnidirectional image data using any of these digital interfaces.
  • The present specification describes a pixel arrangement method that represents omnidirectional image data as a spherical surface by arranging pixels at predetermined intervals in the longitude direction on each latitude line of a sphere on which latitude lines and longitude lines are defined.
  • This pixel layout method can be applied not only to transmission of omnidirectional image data but also to various other processing, such as recording.
  • a definition unit that defines latitude lines and longitude lines on a sphere that represents an omnidirectional image
  • An arrangement unit for arranging the pixels of the omnidirectional image on each of the defined latitude lines
  • An index assignment unit is further provided, which assigns a latitude index to each latitude line and assigns a longitude index starting from zero longitude to pixels arranged on each latitude line.
  • the image processing apparatus as described in said (1).
  • the arrangement unit arranges pixels at approximately equal intervals starting from the location of zero longitude on each latitude line. The image processing apparatus as described in said (1).
  • the definition unit determines the distance between the latitude lines based on either the angle from the equatorial plane of the spherical surface or the height of the spherical surface in the ground axis direction.
  • the image processing apparatus according to any one of the above (1) and (2).
  • a processing unit for sequentially arranging pixels on each of the latitudinal lines of the spherical surface in one line to generate baseband signals having different numbers of pixels for each line;
  • a transmitter configured to transmit the baseband signal to an external device via a predetermined transmission path;
  • The image processing apparatus according to any one of (1) to (3) above. (5) An acquisition unit that acquires transmission method information on the transmission methods of omnidirectional image data supported by the external device, and a selection unit that selects a transmission method of the omnidirectional image data based on the acquired transmission method information, are further provided, and the processing unit generates the baseband signal according to the selected transmission method.
  • the image processing apparatus as described in said (4).
  • the acquisition unit acquires the transmission method information from the external device via the predetermined transmission path.
  • the image processing device according to (5).
  • the predetermined transmission path is a transmission path based on the HDMI standard,
  • the acquisition unit acquires the transmission method information from an E-EDID stored in an EDID ROM included in the external device.
  • the image processing apparatus as described in said (6).
  • The acquisition unit acquires, as the transmission method information, at least one of: the method of determining the interval of the latitude lines, the maximum number of pixels arranged on one line, and the maximum number of latitude lines supported (the maximum number of lines).
  • The selection unit selects, based on the acquired transmission method information, at least one of: the method of determining the interval of the latitude lines when the processing unit generates the baseband signal, the maximum number of pixels arranged on one line, and the maximum number of latitude lines supported (the maximum number of lines),
  • the image processing apparatus according to any one of the above (5) to (7).
  • The image processing apparatus further comprises a notification unit that notifies the external device of format information related to the format of the omnidirectional image data.
  • the image processing apparatus according to any one of the above (4) to (8).
  • the notification unit transmits the format information to the external device via the predetermined transmission path.
  • the image processing apparatus according to (9).
  • the predetermined transmission path is a transmission path based on the HDMI standard,
  • the notification unit transmits the format information using a blanking period when the transmission unit transmits the baseband signal.
  • the image processing apparatus according to any one of the above (9) or (10).
  • The notification unit notifies, as the format information, at least one of: the method of determining the interval of the latitude lines, the maximum number of pixels arranged on one line, the total number of the latitude lines (or the total number of lines) included in the baseband signal, and the origin position information of the omnidirectional image data,
  • the image processing apparatus according to any one of the above (9) to (11).
•	a receiving unit that receives, from an external device via a predetermined transmission path, a baseband signal having a different number of pixels for each line, generated by sequentially arranging the pixels on each latitude line of the spherical surface into one line each,
  • a processing unit that performs processing for displaying an omnidirectional image based on the baseband signal
  • the image processing apparatus according to any one of (1) to (3), further comprising: (14)
•	the image processing apparatus further includes a notification unit that notifies the external device of transmission method information on a transmission method of omnidirectional image data that the apparatus can handle.
  • the image processing apparatus according to (13).
•	the notification unit notifies, as the transmission method information, at least one of: a method of determining the interval of the latitude lines, a maximum number of pixels arranged in one line, and a maximum number of latitude lines (maximum number of lines) that can be supported. The image processing apparatus according to (14) above.
•	the predetermined transmission path is a transmission path based on the HDMI standard, and the apparatus further comprises an EDID ROM storing an E-EDID including the transmission method information. The image processing apparatus according to (14) or (15) above.
•	the image processing apparatus further comprises an acquisition unit for acquiring format information on a format of the omnidirectional image data from the external device, and the processing unit performs processing for displaying an omnidirectional image based on the format information.
•	the acquisition unit acquires from the external device, as the format information, at least one of: a method of determining the interval of the latitude lines, a maximum number of pixels arranged in one line, a total number of the latitude lines or a total number of lines included in the baseband signal, and origin position information of the omnidirectional image data;
  • the image processing apparatus according to (17).
•	the predetermined transmission path is a transmission path based on the HDMI standard, and the acquisition unit acquires the format information using a blanking period when the external device transmits the baseband signal.
  • the image processing apparatus according to any one of the above (18) or (19).
•	An image processing method having: (21) A transmitter comprising: an image processing unit that generates baseband signals having different numbers of pixels for each line from omnidirectional image data configured by arranging pixels on a spherical surface; and a transmission unit configured to transmit the baseband signal to an external device via a predetermined transmission path. (22) The image processing unit arranges, for each line, the pixels on each of the latitude lines of the spherical surface on which the latitude lines and the longitude lines are defined, and generates the baseband signal.
  • the transmitter according to (21).
•	an acquisition unit for acquiring transmission method information on a transmission method of omnidirectional image data that can be handled by the external device; and a selection unit that selects a transmission method of the omnidirectional image data based on the acquired transmission method information; and the image processing unit generates the baseband signal according to the selected transmission method.
  • the transmitter according to any one of the above (21) or (22).
  • the acquisition unit acquires the transmission scheme information from the external device via the predetermined transmission path.
  • the predetermined transmission path is a transmission path based on the HDMI standard,
  • the acquisition unit acquires the transmission method information from an E-EDID stored in an EDID ROM included in the external device.
  • the transmitter according to any one of the above (23) or (24).
  • the omnidirectional image data is configured by arranging each pixel at a predetermined interval in the longitude direction on each latitude line of the sphere on which the latitude line and the longitude line are defined,
•	the acquisition unit acquires, as the transmission method information, at least one of: a method of determining the interval of the latitude lines, a maximum number of pixels arranged in one line, and a maximum number of latitude lines (maximum number of lines) that can be supported.
•	the selection unit selects, based on the acquired transmission method information, at least one of: the method of determining the interval of the latitude lines, the maximum number of pixels arranged in one line, and the maximum number of latitude lines (maximum number of lines) that can be supported, to be used when the image processing unit generates the baseband signal,
  • the transmitter according to any one of (23) to (25).
•	the transmitter further comprises a notification unit that notifies the external device of format information related to the format of the omnidirectional image data.
  • the transmitter according to any one of (21) to (26).
  • the notification unit transmits the format information to the external device via the predetermined transmission path.
  • the transmitter according to (27).
  • the predetermined transmission path is a transmission path based on the HDMI standard
  • the notification unit transmits the format information using a blanking period when the transmission unit transmits the baseband signal.
  • the transmitter according to any one of the above (27) or (28).
  • the omnidirectional image data is configured by arranging each pixel at a predetermined interval in the longitude direction on each latitude line of the spherical surface on which the latitude line and the longitude line are defined,
•	the notification unit notifies, as the format information, at least one of: a method of determining the interval of the latitude lines, a maximum number of pixels arranged in one line, a total number of the latitude lines or a total number of lines included in the baseband signal, and origin position information of the omnidirectional image data. The transmitter according to any one of (27) to (29).
  • a receiver comprising: (33)
•	the receiving device further comprises a notification unit for notifying the external device of transmission method information on a transmission method of omnidirectional image data that the device can handle.
  • the omnidirectional image data is configured by arranging each pixel at a predetermined interval in the longitude direction on each latitude line of the spherical surface on which the latitude line and the longitude line are defined,
•	the notification unit notifies, as the transmission method information, at least one of: the method of determining the interval of the latitude lines, the maximum number of pixels arranged in one line, and the maximum number of latitude lines (maximum number of lines) that can be supported.
  • the receiving device according to (33).
•	the receiving device further comprises an acquisition unit for acquiring format information on the format of the omnidirectional image data from the external device, and the processing unit performs processing for displaying an omnidirectional image based on the format information.
  • the receiver according to any one of (32) to (34).
•	the acquisition unit acquires, as the format information, at least one of: the method of determining the interval of the latitude lines, the maximum number of pixels arranged in one line, the total number of the latitude lines or the total number of lines included in the baseband signal, and origin position information of the omnidirectional image data;
  • the receiving device according to (35).
•	211: Ethernet interface, 212: Disk drive, 213: Memory, 214: CPU, 215: High-speed data line interface, 216: General image processing unit, 217: Omnidirectional image processing unit, 218: HDMI transmitting unit, 219: HDMI terminal, 220 and 221: Internal bus, 222: Network terminal, 250: Display, 251: High-speed data line interface, 252: HDMI receiving unit, 253: CPU, 254: Memory, 255: Ethernet interface, 256: Omnidirectional image processing unit, 257: General image processing unit, 258: Display panel, 259: HDMI terminal, 260 and 261: Internal bus, 262: Network terminal, 350: HDMI cable

Abstract

Provided are an image processing device and an image processing method for processing omnidirectional image data and for performing processing related to the transmission of a baseband signal of the image data. The image processing device is provided with: a definition unit that defines latitude lines and longitude lines on a spherical surface on which the omnidirectional image is represented; and an arrangement unit that arranges the pixels of the omnidirectional image on each of the defined latitude lines. The image processing device is further provided with: a processing unit that sequentially arranges the pixels placed on each latitude line of the spherical surface into a single line, generating baseband signals having a different number of pixels for each line; and a transmission unit that transmits the baseband signals to an external device via a predetermined transmission channel. The image processing device can operate as a source device.

Description

Image processing apparatus and image processing method
The technology disclosed in the present specification relates to an image processing apparatus and an image processing method that process omnidirectional image data or perform processing related to the transmission of a baseband signal of image data.
With the spread of wide-angle cameras and omnidirectional cameras, the content business dealing with omnidirectional images is also expanding. An omnidirectional image can be viewed using a display device such as a large-screen display or a head-mounted display.
An omnidirectional image can also be defined as an image that, taking a certain point in space as the viewpoint, holds information for all 360 degrees around that viewpoint, up, down, left, and right. An omnidirectional image is spherical and has a three-dimensional spread. For this reason, the image data is sometimes handled by mapping the spherical surface onto a two-dimensional rectangle (see, for example, Patent Document 1).
Patent Document 1: JP 2014-127001 A
Patent Document 2: JP 2013-229885 A
An object of the technology disclosed in the present specification is to provide an image processing apparatus and an image processing method that process omnidirectional image data or perform processing related to the transmission of a baseband signal of image data.
A first aspect of the technology disclosed in the present specification is
an image processing apparatus comprising:
a definition unit that defines latitude lines and longitude lines on a spherical surface representing an omnidirectional image; and
an arrangement unit that arranges the pixels of the omnidirectional image on each of the defined latitude lines.
The definition unit determines the interval of the latitude lines based on either the angle from the equatorial plane of the spherical surface or the height of the spherical surface in the direction of its polar axis. The arrangement unit arranges pixels on each latitude line at substantially equal intervals, starting from the position of zero longitude.
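By way of illustration only (not part of the patent disclosure), the two interval-determination rules just described can be sketched for a unit sphere as follows; the function name, parameters, and mode strings are assumptions made for this sketch.

```python
import math


def latitude_angles(num_lines: int, mode: str = "angle") -> list:
    """Return the latitudes (radians, south pole -pi/2 to north pole pi/2)
    of `num_lines` latitude lines.

    mode="angle":  lines equally spaced in angle from the equatorial plane.
    mode="height": lines equally spaced in height z = sin(latitude) along
                   the polar axis of the unit sphere.
    """
    if mode == "angle":
        step = math.pi / (num_lines - 1)
        return [-math.pi / 2 + i * step for i in range(num_lines)]
    if mode == "height":
        step = 2.0 / (num_lines - 1)
        zs = [-1.0 + i * step for i in range(num_lines)]
        # clamp guards against floating-point drift just outside [-1, 1]
        return [math.asin(max(-1.0, min(1.0, z))) for z in zs]
    raise ValueError(mode)
```

With five lines, "angle" mode yields latitudes at every 45 degrees, while "height" mode places the intermediate lines at asin(±0.5), i.e. about ±30 degrees.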
The image processing apparatus may further include a processing unit that sequentially arranges the pixels on each latitude line of the spherical surface into one line each, generating a baseband signal having a different number of pixels for each line, and a transmission unit that transmits the baseband signal to an external device via a predetermined transmission path.
In this case, the image processing apparatus may further include an acquisition unit that acquires transmission method information on the transmission methods of omnidirectional image data that the external device can handle, and a selection unit that selects a transmission method of the omnidirectional image data based on the acquired transmission method information, with the processing unit generating the baseband signal according to the selected transmission method. The image processing apparatus may also further include a notification unit that notifies the external device of format information related to the format of the omnidirectional image data.
Alternatively, the image processing apparatus may further include a receiving unit that receives, from an external device via a predetermined transmission path, a baseband signal having a different number of pixels for each line, generated by sequentially arranging the pixels on each latitude line of the spherical surface into one line each, and a processing unit that performs processing for displaying the omnidirectional image based on the baseband signal.
In this case, the image processing apparatus may further include a notification unit that notifies the external device of transmission method information on the transmission methods of omnidirectional image data that the apparatus can handle. The image processing apparatus may also further include an acquisition unit that acquires format information on the format of the omnidirectional image data from the external device, with the processing unit performing the processing for displaying the omnidirectional image based on the format information.
A second aspect of the technology disclosed in the present specification is
an image processing method having:
a definition step of defining latitude lines and longitude lines on a spherical surface representing an omnidirectional image; and
an arrangement step of arranging the pixels of the omnidirectional image on each of the defined latitude lines.
According to the technology disclosed in the present specification, it is possible to provide an image processing apparatus and an image processing method that process omnidirectional image data or perform processing related to the transmission of a baseband signal of image data.
The effects described in the present specification are merely examples, and the effects of the present invention are not limited thereto. The present invention may also exhibit additional effects beyond those described above.
Still other objects, features, and advantages of the technology disclosed in the present specification will become apparent from the more detailed description based on the embodiments described later and the accompanying drawings.
FIG. 1 is a diagram showing a method of arranging pixels on a spherical surface.
FIG. 2 is a diagram showing a configuration example of the AV system 200.
FIG. 3 is a diagram showing a configuration example of the HDMI transmitting unit 218 and the HDMI receiving unit 252 in the AV system 200.
FIG. 4 is a diagram showing a configuration example of omnidirectional image data.
FIG. 5 is a diagram showing a configuration example of omnidirectional image data.
FIG. 6 is a diagram showing a configuration example of omnidirectional image data.
FIG. 7 is a diagram showing an example of TMDS transmission data for transmitting omnidirectional image data represented as a spherical surface.
FIG. 8 is a diagram showing an example of the packing format used when image data is transmitted over the three TMDS channels #0 to #2 of HDMI.
FIG. 9 is a diagram showing an example of the data structure of the E-EDID.
FIG. 10 is a diagram showing an example of the data structure of the Vendor Specific area.
FIG. 11 is a diagram showing an example of the data structure of an AVI InfoFrame packet.
FIG. 12 is a flowchart showing the processing procedure performed by the disc player 210 when the display 250 is connected.
FIG. 13 is a flowchart showing the processing procedure by which the disc player 210 determines the transmission method of omnidirectional image data.
FIG. 14 is a diagram illustrating how a video signal is transmitted in the transmission format shown in FIG. 7.
FIG. 15 is a diagram showing a modification of the transmission format of omnidirectional image data.
FIG. 16 is a diagram showing a method of mapping an omnidirectional image onto a two-dimensional plane using equirectangular projection.
FIG. 17 is a diagram showing a method of projecting an omnidirectional image onto a polyhedron and mapping it onto a two-dimensional plane.
Hereinafter, embodiments of the technology disclosed in the present specification will be described in detail with reference to the drawings.
An omnidirectional image is an image that holds information for all 360 degrees, up, down, left, and right, around a certain viewpoint in space, and has a three-dimensional spread. On the other hand, most image data currently in use consists of rectangular information with a predetermined aspect ratio such as 4:3 or 16:9, so techniques for recording and transmitting rectangular image data are the norm. For this reason, an omnidirectional image is sometimes handled by mapping it onto a two-dimensional rectangle.
Examples include a method that uses equirectangular projection, defining all intersections of the latitude and longitude lines on the sphere as pixels and mapping them onto a two-dimensional rectangle (see FIG. 16), and a method that projects and maps the omnidirectional image onto a polyhedron, such as a cube (regular hexahedron) circumscribing the sphere (see FIG. 17).
In the former mapping method, first, as shown in FIG. 16(A), the omnidirectional image is projected onto the inner surface of a cylinder 1602 circumscribing the celestial sphere 1601. Next, as shown in FIG. 16(B), the cylinder 1602 is developed into a plane, as indicated by reference numeral 1603. The image data mapped onto the two-dimensional coordinate plane 1603 can be compression-encoded using a standard video compression scheme such as H.264, and can then be transmitted and stored. Furthermore, if the image data developed on the two-dimensional plane 1603 is mapped back onto the sphere based on the correspondence (index) between the two-dimensional coordinates and the original three-dimensional coordinates, the original omnidirectional image can be reproduced.
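The forward half of that correspondence, from a point on the sphere to a pixel of the two-dimensional plane, can be sketched as follows; this is an illustration only, and the pixel-coordinate convention (longitude 0 at the left edge, north pole in the top row) is an assumption of the sketch, not something the patent fixes.

```python
def equirect_index(lat_deg: float, lon_deg: float, width: int, height: int):
    """Map a point (latitude, longitude in degrees) on the sphere to the
    pixel (x, y) of a width x height equirectangular image.

    Convention assumed here: longitude 0 maps to the left edge and
    latitude +90 (north pole) maps to the top row.
    """
    # longitude wraps around the full 360-degree width
    x = int(lon_deg % 360.0 / 360.0 * width) % width
    # latitude runs top (+90) to bottom (-90) over the image height
    y = int((90.0 - lat_deg) / 180.0 * height)
    return x, min(y, height - 1)
```

For a 360x180 image this places the equator/antimeridian point at pixel (180, 90) and clamps the south pole into the bottom row.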
Note that with equirectangular projection, the upper and lower high-latitude regions become high-resolution regions in which many pixels are mapped per unit area of the original spherical surface, while the central low-latitude region becomes a low-resolution region in which few pixels are mapped per unit area of the original spherical surface.
In the latter mapping method, first, as shown in FIG. 17(A), the omnidirectional image is projected onto the inner surfaces of a cube 1702 circumscribing the celestial sphere 1701. Next, the cube 1702 is developed into a plane as shown in FIG. 17(B), and the video data projected onto the faces #1 to #6 of the cube 1702 is mapped onto the two-dimensional coordinate plane 1703 as shown in FIG. 17(C). The image data mapped onto the two-dimensional coordinate plane 1703 can be compression-encoded using a standard video compression scheme such as H.264, and can then be transmitted and stored. Furthermore, if the image data developed on the two-dimensional plane 1703 is mapped back onto the sphere based on the correspondence (index) between the two-dimensional coordinates and the original three-dimensional coordinates, the original omnidirectional image can be reproduced.
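The face-selection step of such a cube projection can be sketched as follows; this is an illustration only, and the face names and the [-1, 1] face coordinates are choices made for the sketch, not a definition from the patent.

```python
def cube_face(x: float, y: float, z: float):
    """Return (face, u, v) for a direction (x, y, z) from the sphere centre:
    the cube face hit by the ray, plus coordinates in [-1, 1] on that face.
    Faces are named +X/-X/+Y/-Y/+Z/-Z after the dominant axis."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:          # ray exits through the +X or -X face
        return ("+X" if x > 0 else "-X", y / ax, z / ax)
    if ay >= ax and ay >= az:          # ray exits through the +Y or -Y face
        return ("+Y" if y > 0 else "-Y", x / ay, z / ay)
    return ("+Z" if z > 0 else "-Z", x / az, y / az)
```

Projecting every sphere pixel through this selection and sampling the image there yields the six face images that FIG. 17(B) unfolds into a plane.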
In contrast, a method of expressing the omnidirectional image as a spherical surface, without mapping it onto a two-dimensional rectangle or the like, is also conceivable.
To express an omnidirectional image as a spherical surface, it is necessary to arrange pixels on the sphere and to have an index with which the position of each pixel on the sphere can be referenced. This specification therefore proposes the method of arranging pixels on a spherical surface shown in FIG. 1.
In FIG. 1, latitude lines and longitude lines are first defined on the spherical surface. Next, on each latitude line, pixels are arranged at equal intervals starting from the position of zero longitude. Then, a latitude index is assigned to every pixel on each latitude line, and a longitude index starting from zero longitude is assigned sequentially to the pixels arranged on each latitude line. This makes it possible to reference the position of every pixel arranged on the sphere by the combination of its latitude index and longitude index.
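A minimal sketch of this arrangement and indexing scheme, assuming lines equally spaced in latitude and a per-line pixel count proportional to cos(latitude) so that the spacing along each line stays roughly equal (the function, its parameters, and the dict representation are illustrative assumptions, not the patent's data format):

```python
import math


def sphere_pixels(num_lat_lines: int, equator_pixels: int):
    """Place pixels at (nearly) equal intervals on each latitude line and
    give each a (latitude_index, longitude_index) pair, with longitude 0 as
    the starting point of every line.

    Returns {(lat_idx, lon_idx): (lat, lon)} with angles in radians; the
    poles carry a single pixel each.
    """
    pixels = {}
    for lat_idx in range(num_lat_lines):
        lat = -math.pi / 2 + lat_idx * math.pi / (num_lat_lines - 1)
        # the circumference of a latitude line shrinks as cos(latitude)
        n = max(1, round(equator_pixels * math.cos(lat)))
        for lon_idx in range(n):
            pixels[(lat_idx, lon_idx)] = (lat, 2 * math.pi * lon_idx / n)
    return pixels
```

With 5 latitude lines and 8 equator pixels, the poles get one pixel each, the equator line gets 8, and the ±45-degree lines get 6 (8·cos 45° rounded).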
Note that, in the pixel arrangement method on the spherical surface shown in FIG. 1, the number of pixels placed at each of the pole points corresponding to the North Pole and the South Pole is one, and the number of pixels placed on the latitude line corresponding to the equator is the largest.
Comparing the pixel arrangement on the spherical surface shown in FIG. 1 with existing methods of representing image data, each latitude line corresponds to a so-called line (scanning line), and the total number of latitude lines or the latitude-line interval (or the total number of pixels on a longitude line or their arrangement interval) corresponds to the vertical resolution. Likewise, the total number of pixels on each latitude line, or their arrangement interval, corresponds to the so-called horizontal resolution.
When omnidirectional image data expressed as a pixel arrangement on a spherical surface as shown in FIG. 1 is used for transmission as it is, the following advantages (1) and (2) can be cited in comparison with the transmission of mapped omnidirectional image data (see, for example, FIGS. 16 and 17).
(1) Small amount of data
For example, with equirectangular projection (see FIG. 16), the number of pixels relative to the spherical representation increases at higher latitudes, i.e. closer to the poles, and the amount of data also grows. With the method of projecting onto a cube (see FIG. 17), the resolution does not become uneven with latitude, but the amount of data grows roughly in proportion to the surface-area ratio relative to the sphere. In contrast, with the pixel arrangement method on the spherical surface shown in FIG. 1, pixels are arranged at equal intervals on each latitude line (that is, at nearly uniform density over the sphere), so the amount of data is kept down and the resolution does not become uneven with latitude.
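The data-amount difference can be checked numerically under the same illustrative assumptions as before (lines equally spaced in latitude, per-line pixel counts shrinking as cos(latitude)); the concrete line and pixel counts below are arbitrary examples.

```python
import math

# Rough pixel-count comparison at the same angular resolution:
# an equirectangular image stores a full-width row at every latitude,
# while the spherical arrangement shrinks each row by cos(latitude).
lines, row = 1000, 2000                      # latitude lines, equator pixels

equirect_total = lines * row
sphere_total = sum(
    max(1, round(row * math.cos(-math.pi / 2 + i * math.pi / (lines - 1))))
    for i in range(lines)
)

# average of cos over [-pi/2, pi/2] is 2/pi, so the ratio approaches ~0.64
ratio = sphere_total / equirect_total
```

In other words, under these assumptions the spherical arrangement carries roughly 2/π (about 64%) of the pixels of an equirectangular image at comparable equatorial resolution, with no oversampling near the poles.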
(2) Very few hardware resources are needed on the receiving device side
Suppose the receiving side displays the received omnidirectional image as it is, using, for example, a display capable of presenting the image on a spherical surface. With a transmission format that maps the omnidirectional image onto a two-dimensional plane, as shown in FIGS. 16 and 17, a process of inversely mapping the received image data back onto the sphere is required. In contrast, with the pixel arrangement method shown in FIG. 1, image data in which the omnidirectional image is expressed spherically is transmitted, so the receiving side needs no inverse mapping, or can process the image data with very little memory. The hardware resources of the receiving device can therefore be saved.
FIG. 2 shows a configuration example of an AV (Audio Visual) system 200 capable of transmitting image data. The AV system 200 consists of a disc player 210 as a source device and a display 250 as a sink device. Image data of an omnidirectional image expressed spherically can be transmitted and received between the disc player 210 and the display 250. The disc player 210 is thus an omnidirectional-signal reproduction device, and the display 250 is an omnidirectional-signal display device.
The disc player 210 and the display 250 are connected via, for example, an HDMI (High-Definition Multimedia Interface) cable 350. The disc player 210 is provided with an HDMI terminal 219 to which an HDMI transmitting unit 218 and a high-speed data line interface (I/F) 215 are connected. The display 250 is provided with an HDMI terminal 259 to which an HDMI receiving unit 252 and a high-speed data line interface 251 are connected. One end of the HDMI cable 350 is connected to the HDMI terminal 219 of the disc player 210, and the other end of the HDMI cable 350 is connected to the HDMI terminal 259 of the display 250.
Uncompressed (baseband) image data obtained by playback on the disc player 210 side is transmitted to the display 250 via the HDMI cable 350, and the display 250 processes the received image data to display an image. Likewise, uncompressed (baseband) audio data obtained by playback on the disc player 210 side is transmitted to the display 250 via the HDMI cable 350, and the display 250 side processes the received audio data to output audio.
When the image data transmitted from the disc player 210 is omnidirectional image data, the display 250 displays the omnidirectional image in order to provide it to the user.
Here, an example of how an omnidirectional image is displayed will be described.
As described above, an omnidirectional image has a three-dimensional spread, so it must be displayed on a spherical surface to be shown correctly. For example, a display capable of correct presentation on a spherical surface can display the received omnidirectional image data as it is. With a rectangular display such as a television, an appropriate rectangular portion can be cut out of the received omnidirectional image and displayed.
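A minimal sketch of cutting such a portion out of spherically arranged pixel data, assuming the data is held as a dict mapping (latitude index, longitude index) to (latitude, longitude) in radians; this in-memory representation, the function name, and its parameters are hypothetical, introduced only for illustration.

```python
import math


def crop_viewport(pixels, lat_c, lon_c, lat_span, lon_span):
    """From {(lat_idx, lon_idx): (lat, lon)} spherical pixel data, keep the
    pixels inside a viewing window centred on (lat_c, lon_c), all angles in
    radians.  Longitude differences are wrapped into (-pi, pi]."""
    def ang_diff(a, b):
        return (a - b + math.pi) % (2 * math.pi) - math.pi

    return {
        k: (lat, lon) for k, (lat, lon) in pixels.items()
        if abs(lat - lat_c) <= lat_span / 2
        and abs(ang_diff(lon, lon_c)) <= lon_span / 2
    }
```

A display would then only need to project this small subset onto its rectangular panel, rather than inverse-mapping a whole two-dimensional frame.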
The configuration of the disc player 210 shown in FIG. 2 will now be described, focusing on the case of transmitting omnidirectional image data.
The disc player 210 includes the HDMI terminal 219, the HDMI transmitting unit 218, and the high-speed data line interface 215. It also includes a CPU (Central Processing Unit) 214, a memory 213, an Ethernet (registered trademark) interface 211, a disc drive 212, a general image processing unit 216, and an omnidirectional image processing unit 217, and these components are interconnected via internal buses 220 and 221.
 CPU214は、メモリ213に格納されている制御ソフトウェアを実行し、内部バス220、221を通じてディスク・プレーヤ210内の各コンポーネントの動作を統括的にコントロールする。CPU214が制御ソフトウェアを実行する際に生成されるデータも、適宜メモリ213に格納される。 The CPU 214 executes control software stored in the memory 213 and centrally controls the operation of each component in the disk player 210 through the internal buses 220 and 221. Data generated when the CPU 214 executes control software is also stored in the memory 213 as appropriate.
 HDMI送信部218は、HDMIソースとして、HDMIに準拠した通信により、ベースバンドの画像(若しくは映像)と音声のデータを、HDMI端子219から送出する。画像及び音声のデータは、HDMIのTMDS(Transition Minimized Differential Signaling)チャネルで送信される。 The HDMI transmitting unit 218 transmits baseband data (or video) and audio data from the HDMI terminal 219 by communication conforming to HDMI as an HDMI source. Image and audio data are transmitted on a Transition Minimized Differential Signaling (TMDS) channel of HDMI.
 高速データ・ライン・インターフェース215は、HDMIケーブル350を構成する所定ライン(本実施形態では、リザーブ・ラインとHPD(Hot-Plug-Detect)ライン)を用いて構成される双方向通信路のインターフェースである。 The high-speed data line interface 215 is an interface of a bidirectional communication path configured using a predetermined line (in the present embodiment, a reserve line and an HPD (Hot-Plug-Detect) line) configuring the HDMI cable 350. is there.
 高速データ・ライン・インターフェース215は、イーサネット・インターフェース211とHDMI端子219の間に配設されている。高速データ・ライン・インターフェース215は、CPU214から供給される送信データを、HDMI端子219からHDMIケーブル350を介して、相手側の機器(すなわち、HDMIシンク機器としてのディスプレイ250)に送信する。また、高速データ・ライン・インターフェース215は、HDMIケーブル350からHDMI端子219を介して相手側の機器からデータを受信し、受信データをCPU214に供給する。 The high speed data line interface 215 is disposed between the Ethernet interface 211 and the HDMI terminal 219. The high-speed data line interface 215 transmits transmission data supplied from the CPU 214 from the HDMI terminal 219 via the HDMI cable 350 to the other device (ie, the display 250 as an HDMI sink device). Also, the high-speed data line interface 215 receives data from the other party's device from the HDMI cable 350 via the HDMI terminal 219 and supplies the received data to the CPU 214.
 ディスク・ドライブ212に装填されたディスク(図示しない)に記録されたコンテンツ・データ(すなわち、ディスク・ドライブ212でディスクから再生されたコンテンツ・データ)は、内部バス220を通じて一般画像処理部216に送られる。ディスク・ドライブ212に記録されたコンテンツ・データは、例えばMPEG(Moving Picture Experts Group)形式で圧縮された圧縮画像である。 Content data recorded on a disk (not shown) loaded in the disk drive 212 (that is, content data reproduced from the disk by the disk drive 212) is sent to the general image processing unit 216 through the internal bus 220. Be The content data recorded in the disk drive 212 is, for example, a compressed image compressed in the MPEG (Moving Picture Experts Group) format.
 一般画像処理部216は、圧縮画像をデコード若しくは伸長処理した後、全天球画像処理部217に送る。全天球画像処理部217は、一般画像処理部216から得た画像データのうち、全天球画像を表示するための全天球画像データを、HDMIのTMDSチャネルで送信する際に、伝送方式に応じた状態に加工処理する。 The general image processing unit 216 decodes or decompresses the compressed image, and then sends it to the omnidirectional image processing unit 217. When transmitting the omnidirectional image data for displaying the omnidirectional image among the image data obtained from the general image processing unit 216 using the TMDS channel of HDMI, the omnidirectional image processing unit 217 transmits the transmission method. Process to the state according to
 本実施形態では、全天球画像データは、球面上の画素配置として表現されている。すなわち、全天球画像を表現する球面上に緯度線及び経度線を定義して、各緯度線上に経度方向に所定の間隔(例えば、等間隔)で全天球画像を構成する各画素が配置されている(図1を参照のこと)。そして、全天球画像処理部217は、このように球面上の画素配置として表現された全天球画像データを伝送する際に、ライン毎に全天球画像データの各緯度線上の画素データを配置したベースバンド画像を生成する。全天球画像データを伝送するベースバンド画像は、ライン毎に画素数が異なる構成となるが、詳細については後述に譲る。 In the present embodiment, the omnidirectional image data is expressed as a pixel arrangement on a spherical surface. That is, latitude lines and longitude lines are defined on the spherical surface representing the omnidirectional image, and the pixels forming the omnidirectional image are arranged at predetermined intervals (for example, equally spaced) in the longitudinal direction on each latitudinal line. (See Figure 1). Then, when transmitting the omnidirectional image data expressed as the pixel arrangement on the spherical surface in this manner, the omnidirectional image processing unit 217 sets pixel data on each latitude line of the omnidirectional image data for each line. Generate a placed baseband image. The baseband image for transmitting the omnidirectional image data has a configuration in which the number of pixels is different for each line, but the details will be described later.
 全天球画像処理部217で処理された全天球画像データは、HDMI送信部218に供給される。HDMI送信部218は、全天球画像データをパッキングして、HDMI端子219から出力する。HDMI送信部218は、例えば、RGB4:4:4、あるいはYCbCr4:4:4やYCbCr4:2:0などの伝送方式で、全天球画像データのベースバンド信号をパッキング(後述)する。 The omnidirectional image data processed by the omnidirectional image processing unit 217 is supplied to the HDMI transmission unit 218. The HDMI transmitting unit 218 packs the omnidirectional image data and outputs the packed data from the HDMI terminal 219. The HDMI transmitting unit 218 packs (described later) baseband signals of the omnidirectional image data according to a transmission method such as RGB 4: 4: 4, YCbCr 4: 4: 4, YCbCr 4: 2: 0, or the like.
 なお、画像データが全天球画像データでない場合には、全天球画像処理部217での処理は実施されないで、HDMI送信部218に供給される。一方、画像データが全天球画像データである場合には、全天球画像処理部217により、全天球画像データを選択された伝送方式に応じた状態に加工処理した後に、HDMI送信部218に供給される。 If the image data is not omnidirectional image data, the processing of the omnidirectional image processing unit 217 is not performed, and the image data is supplied to the HDMI transmission unit 218. On the other hand, if the image data is omnidirectional image data, the omnidirectional image data is processed by the omnidirectional image processing unit 217 into a state according to the selected transmission method, and then the HDMI transmission unit 218 is performed. Supplied to
 また、ディスク・ドライブ212でディスクから再生されたコンテンツ・データを、HDMIケーブル350ではなくネットワークに送出する際には、当該コンテンツ・データは、イーサネット・インターフェース211を介してネットワーク端子222に出力される。同様に、ディスク・ドライブ212でディスクから再生されたコンテンツ・データを、HDMIケーブル350のTMDSチャネルではなく双方向通信路(前述)に送出する際には、当該コンテンツ・データは、イーサネット・インターフェース211及び高速データ・ライン・インターフェース215を介して、HDMI端子219に出力される。 Also, when the content data reproduced from the disk by the disk drive 212 is sent out to the network instead of the HDMI cable 350, the content data is output to the network terminal 222 through the Ethernet interface 211. . Similarly, when the content data reproduced from the disk by the disk drive 212 is sent not to the TMDS channel of the HDMI cable 350 but to the bidirectional communication path (described above), the content data is transmitted to the Ethernet interface 211. And the high speed data line interface 215 to the HDMI terminal 219.
Next, the configuration of the display 250 shown in FIG. 2 will be described, focusing on the case of receiving and displaying omnidirectional image data.
The display 250 includes an HDMI terminal 259, an HDMI receiving unit 252, and a high-speed data line interface 251. The display 250 further includes a CPU 253, an Ethernet interface 255, a display panel 258, a general image processing unit 257, and an omnidirectional image processing unit 256; these components are interconnected via internal buses 260 and 261.
The CPU 253 executes control software stored in a memory 254 and centrally controls the operation of each component in the display 250 through the internal buses 260 and 261. Data generated while the CPU 253 executes the control software is also stored in the memory 254 as appropriate.
The HDMI receiving unit 252, acting as an HDMI sink, receives the baseband image (or video) and audio data supplied to the HDMI terminal 259 via the HDMI cable 350 by communication conforming to HDMI. The image and audio data are transmitted on the HDMI TMDS channels (as above).
Like the high-speed data line interface 215 on the disc player 210 side (described above), the high-speed data line interface 251 is an interface for a bidirectional communication path formed using predetermined lines of the HDMI cable 350 (in the present embodiment, the reserve line and the HPD line).
The high-speed data line interface 251 is disposed between the Ethernet interface 255 and the HDMI terminal 259. It transmits transmission data supplied from the CPU 253 from the HDMI terminal 259 via the HDMI cable 350 to the counterpart device (i.e., the disc player 210 as an HDMI source device). It also receives data arriving from the counterpart device over the HDMI cable 350 via the HDMI terminal 259 and supplies the received data to the CPU 253.
The HDMI receiving unit 252 receives the baseband image. The HDMI receiving unit 252 depacks the baseband signal packed according to a transmission method such as RGB 4:4:4, YCbCr 4:4:4, or YCbCr 4:2:0. When the image data received by the HDMI receiving unit 252 is omnidirectional image data, it is processed by the omnidirectional image processing unit 256 into a state suited to the display panel 258 and then sent to the general image processing unit 257. When the image data received by the HDMI receiving unit 252 is not omnidirectional image data, the omnidirectional image processing unit 256 performs no processing and the received image data is sent to the general image processing unit 257 as it is.
The processing performed when omnidirectional image data is received will now be described. In the present embodiment, the omnidirectional image data is expressed as a pixel arrangement on a spherical surface. That is, latitude lines and longitude lines are defined on the spherical surface representing the omnidirectional image, and the pixels forming the omnidirectional image are arranged on each latitude line at predetermined intervals (for example, equal intervals) in the longitude direction (see FIG. 1). The omnidirectional image processing unit 256 maps the pixels placed on each line of the baseband image onto the spherical surface representing the omnidirectional image, thereby reconstructing the original omnidirectional image.
The omnidirectional image processing unit 256 processes the received omnidirectional image data based on the format information of the omnidirectional image data notified from the disc player 210 side. For example, the omnidirectional image processing unit 256 calculates the number of pixels on each line of the baseband image from the number of pixels on the equator and the total number of lines in the omnidirectional image data, both obtained as format information. The omnidirectional image processing unit 256 can also correct the display position when displaying the omnidirectional image, based on origin position information obtained as format information.
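As a concrete illustration of this line-to-sphere mapping, the following sketch converts a pixel at position k on baseband line i back into a point on the unit sphere. It is a hypothetical helper, not part of the embodiment: it assumes latitude lines at equal angular intervals, pixels equally spaced in longitude starting from longitude zero, and a per-line pixel count obtained by rounding the cosine-scaled equator count (the embodiment does not specify the exact rounding rule).

```python
import math

def pixel_to_sphere(i, k, total_lines, equator_pixels):
    """Map pixel k on baseband line i (0 = south pole) to (x, y, z) on the unit sphere."""
    # Latitude of line i; lines span -pi/2 .. +pi/2 at equal angular intervals.
    lat = (i - (total_lines - 1) / 2) * math.pi / (total_lines - 1)
    # Pixels on this latitude line: proportional to cos(latitude), at least 1 at a pole.
    n = max(1, round(equator_pixels * math.cos(lat)))
    # Pixels are equally spaced in longitude, starting from longitude zero.
    lon = 2.0 * math.pi * k / n
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))
```

For example, with the parameters used later in this description (4000 equator pixels, 3001 lines), pixel 0 on the equator line (i = 1500) maps to the point (1, 0, 0).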
The general image processing unit 257 performs, for example, image quality improvement processing and superimposition of graphics data (such as an OSD) on the image data. After such general image processing, the image data is sent to the display panel 258 and the image is presented to the user.
FIG. 3 shows a configuration example of the HDMI transmitting unit (HDMI source) 218 on the disc player 210 side and the HDMI receiving unit (HDMI sink) 252 on the display 250 side in the AV system 200.
The HDMI cable 350 connecting the HDMI transmitting unit 218 and the HDMI receiving unit 252 includes TMDS channels #0 to #2 and a TMDS clock channel. Image data and audio data are serially transmitted in one direction from the HDMI transmitting unit 218 to the HDMI receiving unit 252 over TMDS channels #0 to #2, in synchronization with a pixel clock. The pixel clock is transmitted over the TMDS clock channel.
In the effective image period (hereinafter also referred to as the "active video period"), which is the period from one vertical synchronization signal (Vsync) to the next minus the horizontal blanking period and the vertical blanking period, the HDMI transmitting unit 218 transmits differential signals corresponding to the pixel data of one screen of uncompressed (baseband) image to the HDMI receiving unit 252 in one direction. In the horizontal blanking period or the vertical blanking period, the HDMI transmitting unit 218 transmits differential signals corresponding at least to the audio data, control data, and other auxiliary data accompanying the image to the HDMI receiving unit 252 in one direction.
That is, the HDMI transmitter 81 of the HDMI transmitting unit 218 converts the pixel data of the uncompressed image into corresponding differential signals and serially transmits them in one direction to the HDMI receiving unit 252 over the three TMDS channels #0 to #2 of the HDMI cable 350.
The HDMI transmitter 81 also converts the audio data accompanying the uncompressed image, as well as necessary control data and other auxiliary data, into corresponding differential signals and serially transmits them in one direction to the HDMI receiving unit 252 over the three TMDS channels #0 to #2 of the HDMI cable 350.
Furthermore, the HDMI transmitter 81 transmits a pixel clock synchronized with the pixel data sent on the three TMDS channels #0 to #2 to the HDMI receiving unit 252 over the TMDS clock channel of the HDMI cable 350. For example, on each of the TMDS channels #0 to #2, 10 bits of pixel data are transmitted during one period of the pixel clock.
In the active video period, the HDMI receiving unit 252 receives, over a plurality of channels, the differential signals corresponding to pixel data transmitted in one direction from the HDMI transmitting unit 218. In the horizontal blanking period or the vertical blanking period, the HDMI receiving unit 252 receives, over a plurality of channels, the differential signals corresponding to audio data and control data transmitted in one direction from the HDMI transmitting unit 218.
That is, the HDMI receiver 82 of the HDMI receiving unit 252 receives the differential signals corresponding to pixel data and the differential signals corresponding to audio data and control data, transmitted in one direction from the HDMI transmitting unit 218 over the three TMDS channels #0 to #2 of the HDMI cable 350, in synchronization with the pixel clock that is likewise transmitted from the HDMI transmitting unit 218 over the TMDS clock channel.
In addition to TMDS channels #0 to #2 and the TMDS clock channel, the HDMI cable 350 includes transmission channels called the DDC (Display Data Channel) 83 and the CEC (Consumer Electronics Control) line 84.
The DDC 83 consists of two signal lines (not shown) included in the HDMI cable 350. The HDMI transmitting unit 218 uses the DDC 83 to read E-EDID (Enhanced Extended Display Identification Data) from the HDMI receiving unit 252 connected via the HDMI cable 350.
In addition to the HDMI receiver 82, the HDMI receiving unit 252 has an EDID ROM (Read Only Memory) 85, in which it stores E-EDID, capability information describing its own configuration and capabilities. The HDMI transmitting unit 218 can read the E-EDID of the HDMI receiving unit 252 connected via the HDMI cable 350 out of the EDID ROM 85 via the DDC 83. Based on that E-EDID, the HDMI transmitting unit 218 configures itself for the capabilities of the HDMI receiving unit 252. For example, the HDMI transmitting unit 218 recognizes the image formats (profiles) supported by the electronic device (the display 250) having the HDMI receiving unit 252, such as RGB, YCbCr 4:4:4, and YCbCr 4:2:0.
The CEC line 84 consists of a single signal line (not shown) included in the HDMI cable 350 and is used for bidirectional communication of control data between the HDMI transmitting unit 218 and the HDMI receiving unit 252.
The HDMI cable 350 further includes an HPD line 86 connected to the HPD pin (pin 19). Using this HPD line 86, the HDMI transmitting unit 218 can detect that the HDMI receiving unit 252 is connected via the HDMI cable 350. The HDMI cable 350 also includes a line on pin 18 (the +5 V power line 87) used to supply +5 V power from the source device to the sink device. Furthermore, the HDMI cable 350 includes a reserve line 88 on pin 14.
Next, the transmission method for the omnidirectional image data will be described.
FIGS. 4 and 5 show a configuration example of the omnidirectional image data of the original signal. Latitude lines and longitude lines are defined on the spherical surface. FIG. 4 shows a side view of the sphere, with latitude lines defined at every angular interval θ from the equatorial plane of the sphere. FIG. 5(A) shows the latitude lines added on the spherical surface, and FIG. 5(B) shows the latitude lines viewed from above. Then, as described with reference to FIG. 1, the omnidirectional image is expressed as a spherical surface by arranging pixels at equal intervals on each latitude line, starting from the point of longitude zero. With this method of arranging pixels on a spherical surface, one pixel is placed at each of the poles corresponding to the north and south poles, and the number of pixels placed on the latitude line corresponding to the equator is the largest.
For example, suppose that the number of pixels on the latitude line corresponding to the equator is 4000 and the angle between latitude lines is a constant 0.001047198 radians, so that the total number of latitude lines, corresponding to the vertical resolution, is 3001. The number of pixels placed on each latitude line (line number) corresponding to a scanning line is summarized in Table 1 below (information for some lines is omitted).
Figure JPOXMLDOC01-appb-T000001
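To illustrate how the per-line pixel counts of Table 1 follow from these parameters, the following hypothetical sketch computes the count for each latitude line. It assumes the count is proportional to the cosine of the line's latitude, with a minimum of one pixel at each pole; the exact rounding rule is an assumption, since the description does not specify one.

```python
import math

EQUATOR_PIXELS = 4000
LINE_SPACING = 0.001047198   # radians between adjacent latitude lines (= pi / 3000)
TOTAL_LINES = 3001           # vertical resolution: lines from south pole to north pole

def pixels_on_line(line_number):
    """Pixel count on latitude line `line_number`
    (0 = south pole, 1500 = equator, 3000 = north pole)."""
    latitude = (line_number - (TOTAL_LINES - 1) / 2) * LINE_SPACING
    return max(1, round(EQUATOR_PIXELS * math.cos(latitude)))
```

With these parameters, the poles (lines 0 and 3000) each carry a single pixel and the equator (line 1500) carries the maximum of 4000, matching the extremes described above.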
Instead of defining the latitude lines at equal angular intervals θ from the equatorial plane of the sphere as shown in FIGS. 4 and 5, the latitude lines may be defined at equal height intervals h along the polar axis of the sphere, as shown in FIG. 6, to construct omnidirectional image data expressed as a spherical surface. It is expected, however, that the transmission method defining the latitude lines at equal angular intervals θ from the equatorial plane yields higher image quality.
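Under the equal-height definition of FIG. 6, the circumference of the latitude circle at height z on a unit sphere is proportional to sqrt(1 - z²), so a per-line pixel count for that variant can be sketched as follows. This is an illustrative assumption, not part of the description: the rounding rule and the minimum of one pixel per pole are hypothetical.

```python
import math

def pixels_on_line_equal_height(i, total_lines, equator_pixels):
    """Pixels on latitude line i when lines are spaced at equal height
    intervals h = 2 / (total_lines - 1) along the polar axis of a unit sphere."""
    z = -1.0 + 2.0 * i / (total_lines - 1)     # height of line i: -1 (south pole) .. +1 (north pole)
    radius = math.sqrt(max(0.0, 1.0 - z * z))  # radius of the latitude circle at height z
    return max(1, round(equator_pixels * radius))
```

Note that with equal height spacing the lines near the poles are far apart in latitude angle, which is consistent with the expectation above that the equal-angle scheme gives higher image quality.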
FIG. 7 shows an example of TMDS transmission data when omnidirectional image data expressed as a spherical surface, as shown in FIGS. 5 and 6, is transmitted over the HDMI TMDS channels. The figure shows the periods of the various kinds of transmission data when image data of 4000 pixels × 3001 lines (width × height) is transmitted over TMDS channels #0 to #2.
In the video field (TMDS period) in which transmission data is carried over the three HDMI TMDS channels #0 to #2, there are three kinds of periods according to the kind of transmission data: the video data period, the data island period, and the control period.
The video field period is the period from the rising (active) edge of one vertical synchronization signal to the rising edge of the next, and is divided into the horizontal blanking period, the vertical blanking period, and the active video period, which is the video field period minus the horizontal and vertical blanking periods.
The video data period is assigned to the active video period. This video data period consists of 3001 lines, and data for 4000 active pixels can be transmitted on each line. That is, in the video data period, data for the 4000 pixels × 3001 lines of active pixels that make up one screen of uncompressed image data can be transmitted.
The data island period and the control period are assigned to the horizontal and vertical blanking periods. Auxiliary data is transmitted in the data island period and the control period.
The data island period corresponds to the dark gray areas in FIG. 7 and is assigned to part of the horizontal and vertical blanking periods. In the data island period, the auxiliary data not related to control, such as audio data packets, is transmitted.
The control period is assigned to the remaining parts of the horizontal and vertical blanking periods and corresponds to the hatched areas in FIG. 7. In the control period, the auxiliary data related to control, such as the vertical synchronization signal, the horizontal synchronization signal (Hsync), and control packets, is transmitted.
In the configuration example of the omnidirectional image data expressed as a spherical surface shown in FIGS. 4 and 5, the total number of latitude lines, corresponding to the vertical resolution, is 3001, and the number of pixels varies from latitude line to latitude line. That is, one pixel is placed at each of the poles corresponding to the north and south poles, and a maximum of 4000 pixels is placed on the latitude line corresponding to the equator (see, for example, FIG. 1).
In the TMDS transmission data shown in FIG. 7, the video data period of 3001 lines is constructed by placing the pixels of each latitude line of the omnidirectional image data of FIGS. 4 and 5, in order, on the corresponding lines of the active video period of 4000 pixels × 3001 lines. In this case, the number of active pixels differs from line to line: the scanning lines corresponding to the latitude lines at the north and south poles carry only a minimum of one pixel, while the scanning line corresponding to the equator carries a maximum of 4000 pixels.
The TMDS transmission data shown in FIG. 7 can thus be described as a data format in which the number of pixels differs from line to line (in other words, the number of pixels is not constant across lines). The gray areas on the lines of the video data period in FIG. 7 are the video data periods; because the number of pixels differs from line to line, the length of the video data period also differs from line to line. A line can also be defined as the interval delimited by synchronization signals. On each line, when the video data period shown in gray ends, the next horizontal synchronization signal is inserted, the scan returns to the beginning of the line, and transmission of the next line's signal begins. Consequently, when the number of pixels differs from line to line, the interval between horizontal synchronization signals is also not the same from line to line. For reference, FIG. 14 illustrates how a video signal is transmitted in the period from one vertical synchronization signal to the next in the transmission format of FIG. 7.
When transmitting image data in which the number of active pixels differs from line to line, instead of varying the video data period from line to line, another conceivable transmission format pads each line having few active pixels, after its last active pixel, with invalid pixels (indicated by, for example, the value 0) so that the video data periods of all lines have the same length. FIG. 15 shows an example of such a transmission format for omnidirectional image data in which the length of every line of the video data period is made uniform; the areas enclosed by dashed rectangles in the figure are filled with invalid pixels. Padding with invalid pixels converts the omnidirectional image data expressed as a spherical surface into rectangular image data, which can then be compressed with an existing compression method such as MPEG. With this transmission format, however, the amount of transmission data is larger than with the transmission format shown in FIG. 7.
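To make the data-volume difference concrete, the following illustrative sketch compares the total number of pixels carried per frame by the variable-length format of FIG. 7 with the uniformly padded format of FIG. 15, using the example parameters of 4000 equator pixels and 3001 lines. The per-line counts assume pixel counts proportional to the cosine of the latitude with at least one pixel per pole, which is an assumption about the rounding rule.

```python
import math

TOTAL_LINES = 3001
EQUATOR_PIXELS = 4000
SPACING = math.pi / (TOTAL_LINES - 1)  # angle between adjacent latitude lines

# Active pixels per frame when each line carries only its own latitude's pixels
# (the FIG. 7 format): counts proportional to cos(latitude), minimum 1 per pole.
variable_total = sum(
    max(1, round(EQUATOR_PIXELS * math.cos((i - (TOTAL_LINES - 1) / 2) * SPACING)))
    for i in range(TOTAL_LINES)
)

# Pixels per frame when every line is padded to the equator width (the FIG. 15 format).
padded_total = EQUATOR_PIXELS * TOTAL_LINES

print(variable_total, padded_total, variable_total / padded_total)
```

With these parameters the variable-length format carries roughly 64% of the padded total (the average of cos(latitude) over the lines approaches 2/π), which quantifies the increase in transmission data incurred by padding.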
 ディスク・プレーヤ210からディスプレイ250に、図4及び図5に示したような球面上の画素配置として表現された全天球画像データを伝送する際に、全天球画像処理部217は、ライン毎に全天球画像データの各緯度線上の画素データを配置して、図7に示すようなベースバンド画像を生成する。 When transmitting the omnidirectional image data represented as the pixel arrangement on the spherical surface as shown in FIGS. 4 and 5 from the disc player 210 to the display 250, the omnidirectional image processing unit 217 The pixel data on each latitudinal line of the omnidirectional image data is arranged to generate a baseband image as shown in FIG.
 図7に示すTMDS伝送データのフォーマットによれば、全天球画像を球面として表現した画像データを、ライン毎に画素数の異なる(若しくは、各ラインで画素数が一定でない)ベースバンド画像として、HDMIのTMDSチャネルで伝送することができる。図2に示したAVシステム200において、ディスク・プレーヤ210からこのようなTMDS伝送データを送信すると、ディスプレイ250側では、受信した画像データを球面上に逆マッピングする処理が必要ないか、若しくは非常に少ないメモリを用いた画像データ処理で十分である。例えば、球面上に正しく表示できるディスプレイの場合には、受信した全天球画像データをそのまま表示することができる。また、例えばテレビのような矩形のディスプレイの場合には、受信した全天球画像から適当な矩形部分を切り出して表示することができる。 According to the format of TMDS transmission data shown in FIG. 7, image data representing an omnidirectional image as a spherical surface is defined as a baseband image having a different number of pixels for each line (or the number of pixels is not constant in each line) It can be transmitted by the TMDS channel of HDMI. When such a TMDS transmission data is transmitted from the disc player 210 in the AV system 200 shown in FIG. 2, the display 250 does not need to perform a process of demapping the received image data on the sphere, or Image data processing using a small amount of memory is sufficient. For example, in the case of a display that can be correctly displayed on a spherical surface, the received omnidirectional image data can be displayed as it is. Further, for example, in the case of a rectangular display such as a television, an appropriate rectangular portion can be cut out and displayed from the received omnidirectional image.
Of course, omnidirectional image data expressed on a spherical surface with latitude lines defined at every height interval h along the earth axis of the sphere, as shown in FIG. 6, can also be transmitted on the AV system 200 as TMDS transmission data, that is, a baseband image in which the number of pixels differs from line to line (the number of pixels per line is not constant), as shown in FIG. 7. In this case as well, the display 250 has the advantage of not needing to map the received image data back onto the sphere.
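To make concrete how the per-line pixel count falls off toward the poles in the two sphere representations, the following sketch computes plausible line lengths. The formulas are an illustrative reading of FIGS. 4 to 6, not taken from the patent: a latitude circle of latitude φ has circumference proportional to cos φ, and under equal height steps the height z gives cos φ = √(1 − z²):

```python
import math

def pixels_per_line_slba(equator_pixels, total_lines):
    """Angle-based spacing (SLBA): latitude lines at equal angular steps;
    each line's pixel count scales with cos(latitude)."""
    counts = []
    for i in range(total_lines):
        lat = math.pi * (i + 0.5) / total_lines - math.pi / 2  # -90 deg .. +90 deg
        counts.append(max(1, round(equator_pixels * math.cos(lat))))
    return counts

def pixels_per_line_slbh(equator_pixels, total_lines):
    """Height-based spacing (SLBH): latitude lines at equal height steps h
    along the earth axis; at height z the circumference scales with sqrt(1 - z^2)."""
    counts = []
    for i in range(total_lines):
        z = 2.0 * (i + 0.5) / total_lines - 1.0  # -1 .. +1 along the axis
        counts.append(max(1, round(equator_pixels * math.sqrt(1.0 - z * z))))
    return counts

# Example with the figures used later in the text: 4000 equator pixels.
slba = pixels_per_line_slba(4000, 2000)
slbh = pixels_per_line_slbh(4000, 2000)
```

Either way the result is a list of per-line lengths, which is exactly the variable-width baseband image the TMDS format of FIG. 7 carries.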
FIG. 8 shows an example of the packing format used when image data is transmitted over the three HDMI TMDS channels #0 to #2. In the figure, the TMDS clock and the pixel clock are in the relationship TMDS clock = pixel clock.
In the RGB 4:4:4 scheme, 8-bit blue (B) data, 8-bit green (G) data, and 8-bit red (R) data are placed in the data area of each pixel in the TMDS channels #0 to #2, respectively. Although FIG. 8 illustrates only the RGB 4:4:4 transmission scheme, other transmission schemes such as YCbCr 4:4:4 or YCbCr 4:2:0 may of course be used.
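The RGB 4:4:4 packing can be pictured as splitting each pixel's three components across the three channels, one component per channel per pixel clock. The sketch below is illustrative only (function name assumed; channel-to-component mapping as described above, with B on channel #0, G on #1, and R on #2):

```python
def pack_rgb444(pixels):
    """Split a list of (R, G, B) 8-bit pixels onto the three TMDS
    channels: channel #0 carries B, channel #1 carries G, channel #2 carries R."""
    ch0 = [b for (_, _, b) in pixels]  # TMDS channel #0: blue
    ch1 = [g for (_, g, _) in pixels]  # TMDS channel #1: green
    ch2 = [r for (r, _, _) in pixels]  # TMDS channel #2: red
    return ch0, ch1, ch2

ch0, ch1, ch2 = pack_rgb444([(255, 128, 0), (10, 20, 30)])
```

Since one pixel is emitted per TMDS clock, this mapping is consistent with the TMDS clock = pixel clock relationship noted for FIG. 8.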
In the present embodiment, when transmitting omnidirectional image data from the disc player 210 to the display 250, the HDMI transmitting unit 218 packs the baseband signal generated by the omnidirectional image processing unit 217, in which the number of pixels differs from line to line, in a packing format such as that shown in FIG. 8, and outputs it from the HDMI terminal 219.
As described above, the HDMI receiving unit 252 on the display 250 side has the EDID ROM 85, which stores the E-EDID describing its own capabilities. The HDMI transmitting unit 218 on the disc player 210 side can read the E-EDID of the HDMI receiving unit 252 from the EDID ROM 85 via the DDC 83 included in the HDMI cable 350, and determine the capabilities of the HDMI receiving unit 252 based on that E-EDID.
In the AV system 200 according to the present embodiment, the CPU 214 on the disc player 210 side also recognizes, based on the E-EDID read from the HDMI receiving unit 252 on the display 250 side, which omnidirectional image data transmission schemes the display 250 supports. Specifically, the CPU 214 of the disc player 210 recognizes whether the display 250 supports baseband images in which the number of pixels differs from line to line (the number of pixels per line is not constant).
FIG. 9 shows an example of the data structure of the E-EDID. The illustrated E-EDID consists of a basic block and an extension block.
At the head of the basic block, data defined by the E-EDID 1.3 standard and represented by "E-EDID1.3 Basic Structure" is placed, followed by timing information represented by "Preferred timing" for maintaining compatibility with conventional EDID, and timing information represented by "2nd timing", different from "Preferred timing", also for maintaining compatibility with conventional EDID.
Following "2nd timing", the basic block contains, in order, information indicating the name of the display device, represented by "Monitor NAME", and information indicating the number of displayable pixels for the aspect ratios 4:3 and 16:9, represented by "Monitor Range Limits".
At the head of the extension block, the following are arranged in order: data represented by "Short Video Descriptor", describing information such as displayable image sizes (resolutions), frame rates, whether the image is interlaced or progressive, and aspect ratios; data represented by "Short Audio Descriptor", describing information such as playable audio codec schemes, sampling frequencies, cutoff bands, and codec bit counts; and information on the left and right speakers, represented by "Speaker Allocation".
Following "Speaker Allocation", the extension block contains data uniquely defined by each manufacturer, represented by "Vendor Specific", timing information represented by "3rd timing" for maintaining compatibility with conventional EDID, and timing information represented by "4th timing", also for maintaining compatibility with conventional EDID.
In the present embodiment, a data area extended to store omnidirectional image information is defined in this Vendor Specific area.
FIG. 10 shows an example of the data structure of the Vendor Specific area in the extension block of the E-EDID. The illustrated Vendor Specific area is provided with a 0th to an Nth block, each one byte long. The data area for the omnidirectional image information to be stored by the sink device (the display 250 in the present embodiment) is defined in the 12th to 16th bytes, which follow the already-defined 0th to 11th bytes.
First, the 0th to 7th bytes will be described. The 0th byte, placed at the head of the data represented by "Vendor Specific", contains a header indicating the data area of the "Vendor Specific" data, represented by "Vendor-Specific tag code (=3)", and information indicating the length of the "Vendor Specific" data, represented by "Length (=N)".
In the 1st to 3rd bytes, information indicating the number "0x000C03" registered for HDMI, represented by "24bit IEEE Registration Identifier (0x000C03) LSB first", is placed. Further, in the 4th and 5th bytes, information indicating the 24-bit physical address of the sink device, represented by "A", "B", "C", and "D", is placed.
The 6th byte contains a flag indicating functions supported by the sink device, represented by "Supports-AI"; pieces of information specifying the number of bits per pixel, represented by "DC-48bit", "DC-36bit", and "DC-30bit"; and a flag, represented by "DC-Y444", indicating whether the sink device supports transmission of YCbCr 4:4:4 images.
In the 7th byte, information indicating the maximum frequency of the TMDS pixel clock, represented by "Max-TMDS-Clock", is placed.
The 8th to 11th bytes carry information used in transmitting and receiving stereoscopic images (see, for example, Patent Document 2). Information on stereoscopic images is stored in the 8th to 10th bytes, and information on stereoscopic audio is stored in the 11th byte. Since stereoscopic image transmission and reception is not directly related to the technology disclosed in this specification, a detailed description of the 8th to 11th bytes is omitted.
Next, the 12th to 16th bytes will be described. Information on omnidirectional images is stored in the 12th to 16th bytes.
In the 7th and 6th bits of the 12th byte, data indicating the omnidirectional image data schemes supported by the sink device is written. Specifically, the 7th bit corresponds to the scheme in which the interval between the latitude lines, defined when the pixels of the omnidirectional image data are arranged on the spherical surface, is determined by the angle from the equatorial plane of the sphere (SLBA: Sphere Latitude by Angle), as shown in FIGS. 4 and 5. The 6th bit corresponds to the scheme in which the interval between the latitude lines is determined by the height along the earth axis of the sphere (SLBH: Sphere Latitude by Height), as shown in FIG. 6. Of the two, SLBA, indicated by the 7th bit, is considered to be the transmission scheme with higher image quality.
In the 13th and 14th bytes, the maximum number of pixels on the equator (Pixel Number on Equator) that the sink device can handle is written. With the pixel arrangement methods shown in FIGS. 5 and 6, the maximum pixel count of 4000 is written in the 13th and 14th bytes.
In the 15th and 16th bytes, the maximum number of latitude lines (that is, the total number of lines in the video data period) (Total Latitude Line Number) that the sink device can handle is written.
As for the 13th to 16th bytes, instead of writing the pixel-count and line-count information itself, commonly used combinations may be defined as VICs (Video format Identification Codes) with bits assigned to them, and the sink device may write data indicating the VICs it supports.
According to the Vendor Specific area shown in FIG. 10, the sink device can specify, as the omnidirectional image data schemes it supports, the scheme that determines the interval between the latitude lines on which the pixels of the omnidirectional image data are arranged (whether by angle or by height), the maximum number of pixels arranged on the equator, the maximum number of latitude lines (maximum number of lines) it can handle, and so on.
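A source-side capability check along these lines could look like the following sketch. The byte and bit offsets follow the FIG. 10 layout as described above, but the function name, the two-byte field order (shown LSB first), and the sample data block are assumptions; real EDID parsing (checksums, block iteration) is omitted:

```python
def parse_sphere_caps(vsdb):
    """Parse the omnidirectional-image capability fields from a Vendor
    Specific Data Block laid out as in FIG. 10 (bytes 12 through 16).
    `vsdb` is the raw VSDB as a bytes object."""
    b12 = vsdb[12]
    return {
        "slba": bool(b12 & 0x80),                      # bit 7: angle-based latitude lines
        "slbh": bool(b12 & 0x40),                      # bit 6: height-based latitude lines
        "equator_pixels": vsdb[13] | (vsdb[14] << 8),  # bytes 13-14 (byte order assumed)
        "max_lines": vsdb[15] | (vsdb[16] << 8),       # bytes 15-16 (byte order assumed)
    }

# Hypothetical VSDB: header, HDMI identifier, and physical address in
# bytes 0-11, followed by the omnidirectional image information.
vsdb = bytes([0x70, 0x03, 0x0C, 0x00, 0x10, 0x00, 0x00, 0x00,
              0x00, 0x00, 0x00, 0x00,
              0xC0,          # byte 12: both SLBA (bit 7) and SLBH (bit 6) supported
              0xA0, 0x0F,    # bytes 13-14: 4000 pixels on the equator
              0xD0, 0x07])   # bytes 15-16: 2000 latitude lines
caps = parse_sphere_caps(vsdb)
```

Such a parsed structure is what the source device would consult in step S1204 described later when choosing a transmission scheme.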
In the AV system 200 shown in FIG. 2, when transmitting omnidirectional image data to the display 250, the disc player 210 can identify the omnidirectional image data schemes that the display 250 supports, based mainly on the information described in the 12th to 16th bytes of the Vendor Specific area of the E-EDID read from the HDMI receiving unit 252 on the display 250 side. The disc player 210 then selects one of the schemes supported by the destination display 250 and transmits the omnidirectional image data. Note that the information on omnidirectional images described above may be placed not in the Vendor Specific area but in an information storage area newly provided for omnidirectional image transmission. In short, the means or method by which the sink device stores the information on omnidirectional images, and the means or method by which that information is conveyed from the sink device to the source device, are not limited to any particular ones.
Further, when transmitting omnidirectional image data, the disc player 210 sends information on the image format currently being transmitted to the display 250.
Here, the information on the image format includes such items as the scheme (SLBA or SLBH) that determines the interval between the latitude lines defined when the pixels of the omnidirectional image data are arranged on the spherical surface, the maximum number of pixels arranged on the equator (Pixel Number on Equator), the number of latitude lines defined when the pixels of the omnidirectional image data are arranged on the spherical surface (that is, the total number of lines in the video data period) (Total Latitude Line Number), the pixel number of the origin position, and the line number of the origin position.
For example, the disc player 210 can send the information on the image format of the omnidirectional image data being transmitted to the display 250 by inserting it into a blanking period of the omnidirectional image data (the uncompressed (baseband) video signal) transmitted to the display 250.
Specifically, the disc player 210 can insert the information on the image format of the omnidirectional image data being transmitted into a blanking period of the omnidirectional image data, using an AVI (Auxiliary Video Information) InfoFrame packet or the like in the HDMI signal.
The AVI InfoFrame packet is placed in a data island period (see FIG. 7) of the TMDS transmission data. FIG. 11 shows an example of the data structure of the AVI InfoFrame packet. Under the HDMI standard, the AVI InfoFrame packet allows incidental information on the image being transmitted to be conveyed from the source device to the sink device.
In the 0th byte, "Packet Type", indicating the type of the data packet, is defined; the "Packet Type" of the AVI InfoFrame packet is "0x82". The 1st byte describes the version information of the packet data definition. The version of the AVI InfoFrame packet is currently "0x03", but if the omnidirectional image data transmission schemes are defined in the 18th to 26th bytes as in the present embodiment, the version will be updated to "0x04", as shown in the figure.
The 2nd byte describes information indicating the packet length. The packet length of the AVI InfoFrame is currently "0x0E", but if the omnidirectional image output format information is defined in the 18th to 26th bytes as in the present embodiment, it becomes "0x17", as shown in FIG. 11. Each piece of information described in the AVI InfoFrame is defined in CEA-861-D Section 6-4; a detailed description of the 3rd to 17th bytes, which are neither changed nor extended in the present embodiment, is therefore omitted.
The 18th to 26th bytes describe information on the image format of the omnidirectional image data being transmitted.
The 18th byte designates the one omnidirectional image data transmission scheme that the source device has selected. Specifically, data indicating the scheme of the omnidirectional image data selected by the source device is written in the 7th and 6th bits of the 18th byte. The 7th bit corresponds to the scheme (SLBA) in which the interval between the latitude lines, defined when the pixels of the omnidirectional image data are arranged on the spherical surface, is determined by angle, as shown in FIGS. 4 and 5. The 6th bit corresponds to the scheme (SLBH) in which that interval is determined by the height along the earth axis of the sphere, as shown in FIG. 6.
In the 19th and 20th bytes, the maximum number of pixels on the equator (Pixel Number on Equator) of the omnidirectional image data being transmitted is set.
In the 21st and 22nd bytes, the number of latitude lines (that is, the total number of lines in the video data period) (Total Latitude Line Number) of the omnidirectional image data being transmitted is set.
In the 23rd to 26th bytes, the origin position information of the omnidirectional image data being transmitted is set. The origin position is the reference position of the image; for example, the sink device can use this information to present the origin so that it always faces the viewer. The pixel number (Pixel Number) of the origin position is set in the 23rd and 24th bytes, and the line number (Line Number) of the origin position is set in the 25th and 26th bytes.
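Filling in the 18th through 26th bytes as described could be sketched as follows. This is illustrative only: the bit positions in the 18th byte follow the description above, while the function name and the two-byte field order (shown LSB first) are assumptions:

```python
def sphere_infoframe_bytes(scheme, equator_pixels, total_lines,
                           origin_pixel, origin_line):
    """Build the 9 extension bytes (bytes 18 through 26) of the AVI
    InfoFrame carrying the omnidirectional image format, per the layout
    above. `scheme` is "SLBA" (byte 18, bit 7), "SLBH" (bit 6), or
    None for non-transmission (both bits 0)."""
    b18 = {"SLBA": 0x80, "SLBH": 0x40, None: 0x00}[scheme]
    payload = bytes([b18])
    for value in (equator_pixels, total_lines, origin_pixel, origin_line):
        payload += bytes([value & 0xFF, (value >> 8) & 0xFF])  # LSB first (assumed)
    return payload

# Example: SLBA, 4000 equator pixels, 2000 lines, origin at pixel 0 / line 1000.
ext = sphere_infoframe_bytes("SLBA", 4000, 2000, 0, 1000)
```

Appending these 9 bytes is also what grows the packet length from 0x0E (14) to 0x17 (23) as noted for FIG. 11.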
The display 250, as a sink device, can control its processing of the image data transmitted from the disc player 210 based on the image format information notified by the AVI InfoFrame packet.
Specifically, when either the 7th bit or the 6th bit of the 18th byte of the received AVI InfoFrame packet is set, the display 250 can determine that omnidirectional image data is being transmitted from the disc player 210. Furthermore, based on the values set in the 19th to 22nd bytes of the same packet, the display 250 can obtain the number of pixels on the equator and the total number of lines of the omnidirectional image data being transmitted, and from these it can calculate the number of pixels on each line.
In addition, packing formats such as RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 are designated by the 6th and 5th bits of the 4th byte of the AVI InfoFrame packet.
Furthermore, the display 250 can correct the display position when presenting the image on the display panel 258, based on the origin position information of the transmitted image set in the 23rd to 26th bytes of the AVI InfoFrame packet.
FIG. 12 shows, in flowchart form, the processing procedure performed when the disc player 210 is connected to the display 250 via the HDMI cable 350 in the AV system 200 shown in FIG. 2. The illustrated procedure is basically carried out mainly by the CPU 214 in the disc player 210.
First, the disc player 210 checks whether the HPD signal of the HDMI cable 350 connected to its own HDMI terminal 219 is at high level (step S1201).
If the HPD signal of the HDMI cable 350 is at low level (No in step S1201), the disc player 210 determines that no HDMI sink device is connected via the HDMI cable 350, and ends this processing.
On the other hand, if the HPD signal of the HDMI cable 350 is at high level (Yes in step S1201), the disc player 210 uses the DDC 83 of the HDMI cable 350 to read the E-EDID from the EDID ROM 85 in the HDMI receiving unit 252 of the connected HDMI sink device (the display 250) (step S1202).
The data structure of the E-EDID is as illustrated in FIG. 9. As shown in FIG. 10, the 12th to 16th bytes of the Vendor Specific area included in the extension block of the E-EDID store information on the omnidirectional image transmission schemes that the HDMI sink device supports. Of these, the 7th bit of the 12th byte indicates the scheme (SLBA) in which the interval between the latitude lines on which the pixels of the omnidirectional image data are arranged is determined by the angle from the equatorial plane of the sphere, and the 6th bit indicates the scheme (SLBH) in which that interval is determined by the height along the earth axis of the sphere. Therefore, by checking whether the 7th bit or the 6th bit of the 12th byte is set, the disc player 210 can confirm that the connected HDMI sink device supports display of omnidirectional images. The processing procedure shown in FIG. 12 assumes that the connected HDMI sink device is a device (the display 250) that supports display of omnidirectional images.
Subsequently, the disc player 210 checks whether there is an omnidirectional image to be transmitted to the connected HDMI sink device (the display 250) (step S1203).
If there is no omnidirectional image to be transmitted (No in step S1203), the disc player 210 sets data indicating non-transmission of omnidirectional images in the AVI InfoFrame packet inserted in the blanking period of the image data (step S1207), and ends this processing. For example, non-transmission of omnidirectional images can be indicated by setting both the 7th bit and the 6th bit of the 18th byte of the AVI InfoFrame packet to 0.
If there is an omnidirectional image to be transmitted to the display 250 (Yes in step S1203), the disc player 210 starts processing for transmitting the omnidirectional image data to the display 250.
Here, the disc player 210 determines the transmission scheme of the omnidirectional image data to be transmitted to the display 250 (step S1204). At this time, the disc player 210 determines the transmission scheme of the omnidirectional image data in consideration of the omnidirectional image transmission schemes supported by the display 250, as described in the Vendor Specific area of the EDID read from the display 250 in step S1202. As the transmission scheme of the omnidirectional image data, the disc player 210 determines, for example, the scheme for determining the interval between the latitude lines, the maximum number of pixels arranged on one line, the total number of latitude lines (or total number of lines), and the origin position information of the omnidirectional image data.
Then, the disc player 210 determines whether transmission of the omnidirectional image data is to start (step S1205). If transmission of the omnidirectional image data is not to start (No in step S1205), the disc player 210 sets data indicating non-transmission of omnidirectional images in the AVI InfoFrame packet (step S1207), and ends this processing.
If transmission of the omnidirectional image data is to start (Yes in step S1205), the disc player 210 sets data indicating the transmission scheme of the omnidirectional image data in the 18th to 26th bytes and elsewhere in the AVI InfoFrame packet (step S1206), and ends this processing.
FIG. 13 shows, in flowchart form, the detailed processing procedure by which the disc player 210 determines the transmission scheme of the omnidirectional image data, performed in step S1204 of the flowchart shown in FIG. 12.
First, the disc player 210 checks whether the 7th bit of the 12th byte of the Vendor Specific area included in the extension block of the E-EDID acquired from the EDID ROM 85 of the HDMI sink device (the display 250) is set (step S1301).
In the present embodiment, as described above, two schemes are provided for determining the interval between the latitude lines when the pixels of the omnidirectional image data are arranged on the spherical surface: the SLBA scheme, in which the interval is determined by the angle from the equatorial plane of the sphere, and the SLBH scheme, in which it is determined by the height along the earth axis of the sphere. Of these, the former, SLBA, is the transmission scheme with higher image quality.
Therefore, when the 7th bit of the 12th byte of the Vendor Specific area is set and the display 250 thereby indicates support for the SLBA scheme (Yes in step S1301), the disc player 210 preferentially selects the SLBA scheme, in which the interval between the latitude lines is determined by the angle from the equatorial plane of the sphere (step S1302), and ends this processing.
If the 7th bit of the 12th byte of the Vendor Specific area is not set (No in step S1301), the disc player 210 next checks whether the 6th bit of the 12th byte of the Vendor Specific area is set (step S1303).
Here, when the 6th bit of the 12th byte of the Vendor Specific area is set and the display 250 thereby indicates support for the SLBH scheme (Yes in step S1303), the disc player 210 selects the SLBH scheme, in which the interval between the latitude lines is determined by the height along the earth axis of the sphere (step S1304), and ends this processing.
If neither the 7th bit nor the 6th bit of the 12th byte of the Vendor Specific area is set (No in step S1303), the disc player 210 determines that the display 250 supports no transmission scheme capable of carrying omnidirectional image data, sets non-selection of omnidirectional image output (step S1305), and ends this processing.
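The decision procedure of steps S1301 through S1305 amounts to a simple priority check on two bits. A minimal sketch (function name assumed; bit positions per the FIG. 10 layout):

```python
def select_scheme(vsdb_byte12):
    """Mirror the FIG. 13 decision: prefer SLBA (bit 7 of byte 12 of the
    Vendor Specific area), fall back to SLBH (bit 6), and otherwise
    deselect omnidirectional image output."""
    if vsdb_byte12 & 0x80:   # step S1301 Yes -> S1302: SLBA supported
        return "SLBA"
    if vsdb_byte12 & 0x40:   # step S1303 Yes -> S1304: SLBH supported
        return "SLBH"
    return None              # step S1305: no usable scheme
```

The ordering of the two checks is what implements the stated preference for SLBA as the higher-quality scheme when the sink supports both.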
 以上説明してきたように、図2に示すAVシステム200において、ディスク・プレーヤ210からディスプレイ250に全天球画像データを送信する際に、ディスク・プレーヤ210は、ディスプレイ250が対応可能な全天球画像データの伝送方式情報を取得して、ディスプレイ250が対応可能な伝送方式により全天球画像データを送信する。また、ディスク・プレーヤ210は、全天球画像データを送信する際に、その伝送方式情報をディスプレイ250に送信する。 As described above, in the AV system 200 shown in FIG. 2, when transmitting the omnidirectional image data from the disc player 210 to the display 250, the disc player 210 is an omnidirectional sphere that the display 250 can handle. The transmission method information of the image data is acquired, and the omnidirectional image data is transmitted by the transmission method compatible with the display 250. Also, when transmitting the omnidirectional image data, the disc player 210 transmits the transmission method information to the display 250.
 したがって、AVシステム200において、ディスク・プレーヤ210とディスプレイ250間で全天球画像データの伝送を良好に行なうことができる。 Accordingly, in the AV system 200, omnidirectional image data can be transmitted satisfactorily between the disc player 210 and the display 250.
 また、本実施形態では、ディスク・プレーヤ210は、AVI InfoFrameパケットに全天球画像データの伝送方式情報を記載して、画像データ(映像信号)のブランキング期間を利用して、ディスプレイ250に送信するようにしている。 Further, in the present embodiment, the disc player 210 writes the transmission method information of the omnidirectional image data into an AVI InfoFrame packet and sends it to the display 250 using the blanking period of the image data (video signal).
 但し、ディスク・プレーヤ210がディスプレイ250に全天球画像データの伝送方式情報を送信する方法は、上記に限定されない。例えば、ディスク・プレーヤ210は、HDMIケーブル350のCECライン84を介して、ディスプレイ250に全天球画像データの伝送方式情報を送信するようにしてもよい。あるいは、ディスク・プレーヤ210は、HDMIケーブル350のリザーブ・ライン88及びHPDライン86で構成される双方向通信路(前述)を介して、ディスプレイ250に全天球画像データの伝送方式情報を送信するようにしてもよい。 However, the method by which the disc player 210 transmits the transmission method information of the omnidirectional image data to the display 250 is not limited to the above. For example, the disc player 210 may transmit the transmission method information to the display 250 via the CEC line 84 of the HDMI cable 350. Alternatively, the disc player 210 may transmit the transmission method information to the display 250 via the bidirectional communication path (described above) formed by the reserve line 88 and the HPD line 86 of the HDMI cable 350.
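Purely to illustrate the idea in the two paragraphs above, namely carrying the method information in a small packet sent over the chosen channel, the following sketch packs a few fields into a byte string. Every field position and width here is an assumption; the actual AVI InfoFrame bit assignments are defined by the HDMI specification and by this description's figures:

```python
def build_method_info_payload(method: str, total_lines: int, max_pixels_per_line: int) -> bytes:
    """Pack hypothetical transmission-method fields into a payload that a
    source could embed in an InfoFrame-style packet during blanking."""
    flags = {"SLBA": 0x80, "SLBH": 0x40}[method]       # method flag byte (illustrative)
    return (bytes([flags])
            + total_lines.to_bytes(2, "big")           # total latitude lines
            + max_pixels_per_line.to_bytes(2, "big"))  # max pixels in one line
```

The same payload could equally be carried over the CEC line or the reserve/HPD bidirectional channel; only the transport differs, not the information.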
 図2には、HDMIの伝送路を用いるAVシステム200を例示したが、全天球画像データの伝送に用いるベース・バンド・デジタル・インターフェースとしては、HDMI以外にも、DVI(Digital Visual Interface)、DP(Display Port)インターフェース、60GHzミリ波を利用したワイヤレス・インターフェースなどを挙げることができる。これらのうちいずれのデジタル・インターフェースを用いて全天球画像データを伝送する場合にも、同様に本明細書で開示する技術を適用することができる。 FIG. 2 illustrates the AV system 200 using an HDMI transmission path, but baseband digital interfaces usable for transmitting omnidirectional image data include, besides HDMI, DVI (Digital Visual Interface), the DP (DisplayPort) interface, and wireless interfaces using the 60 GHz millimeter-wave band. The technology disclosed herein can be applied similarly when omnidirectional image data is transmitted over any of these digital interfaces.
 また、ディスク・プレーヤとディスプレイ以外の組み合わせからなるソース機器とシンク機器の間で全天球画像データを伝送する場合にも、同様に本明細書で開示する技術を適用することができる。 The technology disclosed herein can likewise be applied when omnidirectional image data is transmitted between a source device and a sink device in combinations other than a disc player and a display.
 以上、特定の実施形態を参照しながら、本明細書で開示する技術について詳細に説明してきた。しかしながら、本明細書で開示する技術の要旨を逸脱しない範囲で当業者が該実施形態の修正や代用を成し得ることは自明である。 The technology disclosed herein has been described in detail above with reference to specific embodiments. However, it is obvious that those skilled in the art can make modifications and substitutions of the embodiment without departing from the scope of the technology disclosed herein.
 本明細書では、HDMI規格を採用するAVシステムにおいて、球面として表現された全天球画像データをTMDS伝送データとして伝送する実施形態を中心に説明してきたが、本明細書で開示する技術の要旨はこれに限定されるものではない。HDMI以外の伝送規格に基づくシステムや、TMDS以外の方式でベースバンド画像を伝送するさまざまなシステムに対しても、同様に本明細書で開示する技術を適用することができる。 This specification has focused on an embodiment in which, in an AV system adopting the HDMI standard, omnidirectional image data represented as a spherical surface is transmitted as TMDS transmission data, but the gist of the technology disclosed herein is not limited to this. The technology disclosed herein can be applied similarly to systems based on transmission standards other than HDMI and to various systems that transmit baseband images by methods other than TMDS.
 全天球画像データの伝送に用いるベース・バンド・デジタル・インターフェースとしては、HDMI以外にも、DVI、DPインターフェース、60GHzミリ波を利用したワイヤレス・インターフェースなどを挙げることができる。これらのうちいずれのデジタル・インターフェースを用いて全天球画像データを伝送する場合にも、同様に本明細書で開示する技術を適用することができる。 Baseband digital interfaces usable for transmitting omnidirectional image data include, besides HDMI, DVI, the DP interface, and wireless interfaces using the 60 GHz millimeter-wave band. The technology disclosed herein can be applied similarly when omnidirectional image data is transmitted over any of these digital interfaces.
 また、本明細書は、緯度線及び経度線が定義された球面の各緯度線上に経度方向に所定の間隔で各画素を配置して、全天球画像データを球面として表現する画素配置方法について説明してきたが、このような画素配置方法を全天球画像データの伝送時だけでなく、記録時などさまざまな処理を行なう際にも適用することができる。 This specification has also described a pixel arrangement method that represents omnidirectional image data as a spherical surface by arranging pixels at predetermined intervals in the longitude direction on each latitude line of a sphere on which latitude and longitude lines are defined. Such a pixel arrangement method can be applied not only to transmission of omnidirectional image data but also to various other processing, such as recording.
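To make the arrangement concrete, the following sketch (illustrative only; the function name, the equal-angle spacing, and the rounding are assumptions) computes a per-latitude-line pixel count that keeps the longitude-direction spacing roughly uniform. The circumference of each latitude circle shrinks in proportion to the cosine of the latitude, which is exactly what makes the resulting baseband signal carry a different pixel count on each line:

```python
import math

def pixels_per_latitude_line(num_lines: int, equator_pixels: int):
    """Pixel count for each of num_lines latitude lines, spaced at equal
    angles from pole to pole (SLBA-style spacing), so that pixels sit at
    roughly equal longitude intervals along every line."""
    counts = []
    for i in range(num_lines):
        # latitude of line i, from near -90 degrees to near +90 degrees
        lat = math.pi * (i + 0.5) / num_lines - math.pi / 2
        # circumference (hence pixel count) shrinks as cos(latitude)
        counts.append(max(1, round(equator_pixels * math.cos(lat))))
    return counts
```

Lines near the equator then hold close to equator_pixels pixels while lines near the poles hold only a few, giving the line-by-line varying pixel count that the transmission and recording processing described above must handle.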
 また、球面以外の曲面にマッピングされた画像データにおいても、ライン毎に画素数が異なるベースバンド画像として伝送する方法を適用することができる。 The method of transmitting image data as a baseband image whose number of pixels differs from line to line can also be applied to image data mapped onto curved surfaces other than a sphere.
 要するに、例示という形態により本明細書で開示する技術について説明してきたが、本明細書の記載内容を限定的に解釈するべきではない。本明細書で開示する技術の要旨を判断するためには、特許請求の範囲を参酌すべきである。 In short, the technology disclosed herein has been described by way of example, and the contents of this specification should not be interpreted restrictively. To determine the gist of the technology disclosed herein, the claims should be consulted.
 なお、本明細書の開示の技術は、以下のような構成をとることも可能である。
(1)全天球画像を表現する球面上に緯度線及び経度線を定義する定義部と、
 前記定義した各緯度線上に前記全天球画像の画素を配置する配置部と、
を具備する画像処理装置。
(1-1)各緯度線に緯度インデックスを割り振るとともに、各緯度線上に配置された画素に対して経度ゼロを起点とする経度インデックスを割り振るインデックス割当部をさらに備える、
上記(1)に記載の画像処理装置。
(2)前記配置部は、各緯度線上において、経度ゼロの場所を起点としてほぼ等間隔に画素を配置する、
上記(1)に記載の画像処理装置。
(3)前記定義部は、前記緯度線の間隔を、前記球面の赤道面からの角度又は前記球面の地軸方向の高さのいずれかに基づいて決める、
上記(1)又は(2)のいずれかに記載の画像処理装置。
(4)前記球面の各緯度線上の画素をそれぞれ1ラインに順次配置して、ライン毎に画素数の異なるベースバンド信号を生成する処理部と、
 前記ベースバンド信号を所定の伝送路を介して外部機器に送信する送信部と、
をさらに備える、上記(1)乃至(3)のいずれかに記載の画像処理装置。
(5)前記外部機器が対応可能な全天球画像データの伝送方式に関する伝送方式情報を取得する取得部と、
 取得した前記伝送方式情報に基づいて全天球画像データの伝送方式を選択する選択部と、
をさらに備え、
 前記処理部は、前記選択された伝送方式に従って、前記ベースバンド信号を生成する、
上記(4)に記載の画像処理装置。
(6)前記取得部は、前記所定の伝送路を介して前記外部機器から前記伝送方式情報を取得する、
上記(5)に記載の画像処理装置。
(7)前記所定の伝送路は、HDMI規格に基づく伝送路であり、
 前記取得部は、前記外部機器が備えるEDID ROMに格納されたE-EDIDから前記伝送方式情報を取得する、
上記(6)に記載の画像処理装置。
(8)前記取得部は、前記伝送方式情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、対応可能な最大緯度線数(最大ライン数)のうち少なくとも1つを取得し、
 前記選択部は、取得した前記伝送方式情報に基づいて、前記画像処理部が前記ベースバンド信号を生成する際の、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、対応可能な最大緯度線数(最大ライン数)のうち少なくとも1つを選択する、
上記(5)乃至(7)のいずれかに記載の画像処理装置。
(9)前記全天球画像データのフォーマットに関するフォーマット情報を前記外部機器に通知する通知部をさらに備える、
上記(4)乃至(8)のいずれかに記載の画像処理装置。
(10)前記通知部は、前記所定の伝送路を介して前記外部機器に前記フォーマット情報を送信する、
上記(9)に記載の画像処理装置。
(11)前記所定の伝送路は、HDMI規格に基づく伝送路であり、
 前記通知部は、前記送信部が前記ベースバンド信号を送信する際のブランキング期間を利用して前記フォーマット情報を送信する、
上記(9)又は(10)のいずれかに記載の画像処理装置。
(12)前記通知部は、前記フォーマット情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、前記緯度線の総数若しくは前記ベースバンド信号に含まれる総ライン数、前記全天球画像データの原点位置情報のうち少なくとも1つを通知する、
上記(9)乃至(11)のいずれかに記載の画像処理装置。
(13)前記球面の各緯度線上の画素をそれぞれ1ラインに順次配置して生成された、ライン毎に画素数の異なるベースバンド信号を、所定の伝送路を介して外部機器から受信する受信部と、
 前記ベースバンド信号を基に全天球画像を表示するための処理を行なう処理部と、
をさらに備える、上記(1)乃至(3)のいずれかに記載の画像処理装置。
(14)対応可能な全天球画像データの伝送方式に関する伝送方式情報を前記外部機器に通知する通知部をさらに備える、
上記(13)に記載の画像処理装置。
(15)前記通知部は、前記伝送方式情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、対応可能な最大緯度線数(最大ライン数)のうち少なくとも1つを通知する、
上記(14)に記載の画像処理装置。
(16)前記所定の伝送路は、HDMI規格に基づく伝送路であり、
 前記伝送方式情報を含んだE-EDIDを格納するEDID ROMを備える、
上記(14)又は(15)のいずれかに記載の画像処理装置。
(17)前記全天球画像データのフォーマットに関するフォーマット情報を前記外部機器から取得する取得部をさらに備え、
 前記処理部は、前記フォーマット情報に基づいて全天球画像を表示するための処理を行なう、
上記(13)乃至(16)のいずれかに記載の画像処理装置。
(18)前記取得部は、前記フォーマット情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、前記緯度線の総数若しくは前記ベースバンド信号に含まれる総ライン数、前記全天球画像データの原点位置情報のうち少なくとも1つを前記外部機器から取得する、
上記(17)に記載の画像処理装置。
(19)前記所定の伝送路は、HDMI規格に基づく伝送路であり、
 前記取得部は、前記送信部が前記ベースバンド信号を送信する際のブランキング期間を利用して前記フォーマット情報を取得する、
上記(17)又は(18)のいずれかに記載の画像処理装置。
(20)全天球画像を表現する球面上に緯度線及び経度線を定義する定義ステップと、
 前記定義した各緯度線上に前記全天球画像の画素を配置する配置ステップと、
を有する画像処理方法。
(21)球面上に画素を配置して構成される全天球画像データから、ライン毎に画素数の異なるベースバンド信号を生成する画像処理部と、
 前記ベースバンド信号を所定の伝送路を介して外部機器に送信する送信部と、
を具備する送信装置。
(22)前記画像処理部は、緯度線及び経度線が定義された前記球面の各緯度線上の画素をライン毎に配置して、前記ベースバンド信号を生成する、
上記(21)に記載の送信装置。
(23)前記外部機器が対応可能な全天球画像データの伝送方式に関する伝送方式情報を取得する取得部と、
 取得した前記伝送方式情報に基づいて全天球画像データの伝送方式を選択する選択部と、
をさらに備え、
 前記画像処理部は、前記選択された伝送方式に従って、前記ベースバンド信号を生成する、
上記(21)又は(22)のいずれかに記載の送信装置。
(24)前記取得部は、前記所定の伝送路を介して前記外部機器から前記伝送方式情報を取得する、
上記(23)に記載の送信装置。
(25)前記所定の伝送路は、HDMI規格に基づく伝送路であり、
 前記取得部は、前記外部機器が備えるEDID ROMに格納されたE-EDIDから前記伝送方式情報を取得する、
上記(23)又は(24)のいずれかに記載の送信装置。
(26)前記全天球画像データは、緯度線及び経度線が定義された前記球面の各緯度線上に経度方向に所定の間隔で各画素を配置して構成され、
 前記取得部は、前記伝送方式情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、対応可能な最大緯度線数(最大ライン数)のうち少なくとも1つを取得し、
 前記選択部は、取得した前記伝送方式情報に基づいて、前記画像処理部が前記ベースバンド信号を生成する際の、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、対応可能な最大緯度線数(最大ライン数)のうち少なくとも1つを選択する、
上記(23)乃至(25)のいずれかに記載の送信装置。
(27)前記全天球画像データのフォーマットに関するフォーマット情報を前記外部機器に通知する通知部をさらに備える、
上記(21)乃至(26)のいずれかに記載の送信装置。
(28)前記通知部は、前記所定の伝送路を介して前記外部機器に前記フォーマット情報を送信する、
上記(27)に記載の送信装置。
(29)前記所定の伝送路は、HDMI規格に基づく伝送路であり、
 前記通知部は、前記送信部が前記ベースバンド信号を送信する際のブランキング期間を利用して前記フォーマット情報を送信する、
上記(27)又は(28)のいずれかに記載の送信装置。
(30)前記全天球画像データは、緯度線及び経度線が定義された前記球面の各緯度線上に経度方向に所定の間隔で各画素を配置して構成され、
 前記通知部は、前記フォーマット情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、前記緯度線の総数若しくは前記ベースバンド信号に含まれる総ライン数、前記全天球画像データの原点位置情報のうち少なくとも1つを通知する、
上記(27)乃至(29)のいずれかに記載の送信装置。
(31)球面上に画素を配置して構成される全天球画像データから、ライン毎に画素数の異なるベースバンド信号を生成する画像処理ステップと、
 前記ベースバンド信号を所定の伝送路を介して外部機器に送信する送信ステップと、
を有する送信方法。
(32)球面上に画素を配置して構成される全天球画像データから生成された、ライン毎に画素数の異なるベースバンド信号を、所定の伝送路を介して外部機器から受信する受信部と、
 前記ベースバンド信号を基に全天球画像を表示するための処理を行なう処理部と、
を具備する受信装置。
(33)対応可能な全天球画像データの伝送方式に関する伝送方式情報を前記外部機器に通知する通知部をさらに備える、
上記(32)に記載の受信装置。
(34)前記全天球画像データは、緯度線及び経度線が定義された前記球面の各緯度線上に経度方向に所定の間隔で各画素を配置して構成され、
 前記通知部は、前記伝送方式情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、対応可能な最大緯度線数(最大ライン数)のうち少なくとも1つを通知する、
上記(33)に記載の受信装置。
(35)前記全天球画像データのフォーマットに関するフォーマット情報を前記外部機器から取得する取得部をさらに備え、
 前記処理部は、前記フォーマット情報に基づいて全天球画像を表示するための処理を行なう、
上記(32)乃至(34)のいずれかに記載の受信装置。
(36)前記取得部は、前記フォーマット情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、前記緯度線の総数若しくは前記ベースバンド信号に含まれる総ライン数、前記全天球画像データの原点位置情報のうち少なくとも1つを取得する、
上記(35)に記載の受信装置。
(37)球面上に画素を配置して構成される全天球画像データから生成された、ライン毎に画素数の異なるベースバンド信号を外部機器から受信する受信ステップと、
 前記ベースバンド信号を基に全天球画像を表示するための処理を行なう処理ステップと、
を有する受信方法。
Note that the technology disclosed in the present specification can also be configured as follows.
(1) A definition unit that defines latitude lines and longitude lines on a sphere that represents an omnidirectional image;
An arrangement unit for arranging the pixels of the omnidirectional image on each of the defined latitude lines;
An image processing apparatus comprising the above.
(1-1) An index assignment unit is further provided, which assigns a latitude index to each latitude line and assigns a longitude index starting from zero longitude to pixels arranged on each latitude line.
The image processing apparatus as described in said (1).
(2) The arrangement unit arranges pixels at approximately equal intervals starting from the location of zero longitude on each latitude line.
The image processing apparatus as described in said (1).
(3) The definition unit determines the distance between the latitude lines based on either the angle from the equatorial plane of the spherical surface or the height of the spherical surface in the ground axis direction.
The image processing apparatus according to any one of the above (1) and (2).
(4) A processing unit for sequentially arranging pixels on each of the latitudinal lines of the spherical surface in one line to generate baseband signals having different numbers of pixels for each line;
A transmitter configured to transmit the baseband signal to an external device via a predetermined transmission path;
The image processing apparatus according to any one of (1) to (3), further comprising:
(5) An acquisition unit that acquires transmission method information on the transmission methods of omnidirectional image data supported by the external device;
A selection unit that selects a transmission method of the omnidirectional image data based on the acquired transmission method information;
And further
The processing unit generates the baseband signal according to the selected transmission scheme.
The image processing apparatus as described in said (4).
(6) The acquisition unit acquires the transmission method information from the external device via the predetermined transmission path.
The image processing device according to (5).
(7) The predetermined transmission path is a transmission path based on the HDMI standard,
The acquisition unit acquires the transmission method information from an E-EDID stored in an EDID ROM included in the external device.
The image processing apparatus as described in said (6).
(8) The acquisition unit acquires, as the transmission method information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, and a maximum number of supported latitude lines (maximum number of lines), and
the selection unit selects, based on the acquired transmission method information, at least one of the method for determining the spacing of the latitude lines, the maximum number of pixels arranged in one line, and the maximum number of supported latitude lines (maximum number of lines) to be used when the image processing unit generates the baseband signal.
The image processing apparatus according to any one of the above (5) to (7).
(9) A notification unit that notifies the external device of format information on the format of the omnidirectional image data is further provided.
The image processing apparatus according to any one of the above (4) to (8).
(10) The notification unit transmits the format information to the external device via the predetermined transmission path.
The image processing apparatus according to (9).
(11) The predetermined transmission path is a transmission path based on the HDMI standard,
The notification unit transmits the format information using a blanking period when the transmission unit transmits the baseband signal.
The image processing apparatus according to any one of the above (9) or (10).
(12) The notification unit notifies, as the format information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, a total number of the latitude lines or a total number of lines included in the baseband signal, and origin position information of the omnidirectional image data.
The image processing apparatus according to any one of the above (9) to (11).
(13) A receiving unit that receives, from an external device via a predetermined transmission path, a baseband signal that is generated by sequentially arranging the pixels on each latitude line of the spherical surface into respective lines and whose number of pixels differs from line to line; and
A processing unit that performs processing for displaying an omnidirectional image based on the baseband signal;
The image processing apparatus according to any one of (1) to (3), further comprising:
(14) A notification unit that notifies the external device of transmission method information on the transmission methods of omnidirectional image data that can be supported is further provided.
The image processing apparatus according to (13).
(15) The notification unit notifies, as the transmission method information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, and a maximum number of supported latitude lines (maximum number of lines).
The image processing apparatus as described in said (14).
(16) The predetermined transmission path is a transmission path based on the HDMI standard,
An EDID ROM storing an E-EDID including the transmission method information,
The image processing apparatus according to any one of the above (14) or (15).
(17) An acquisition unit that acquires format information on the format of the omnidirectional image data from the external device is further provided, and
The processing unit performs processing for displaying an omnidirectional image based on the format information.
The image processing apparatus according to any one of the above (13) to (16).
(18) The acquisition unit acquires from the external device, as the format information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, a total number of the latitude lines or a total number of lines included in the baseband signal, and origin position information of the omnidirectional image data.
The image processing apparatus according to (17).
(19) The predetermined transmission path is a transmission path based on the HDMI standard,
The acquisition unit acquires the format information using a blanking period when the transmission unit transmits the baseband signal.
The image processing apparatus according to either (17) or (18) above.
(20) A definition step of defining latitude lines and longitude lines on a spherical surface representing an omnidirectional image; and
an arrangement step of arranging the pixels of the omnidirectional image on each of the defined latitude lines.
An image processing method comprising the above steps.
(21) An image processing unit that generates baseband signals having different numbers of pixels for each line from omnidirectional image data configured by arranging pixels on a spherical surface;
A transmitter configured to transmit the baseband signal to an external device via a predetermined transmission path;
A transmitting apparatus comprising the above.
(22) The image processing unit arranges, for each line, pixels on each of the latitudinal lines of the spherical surface in which the latitude line and the longitude line are defined, and generates the baseband signal.
The transmitter according to (21).
(23) An acquisition unit that acquires transmission method information on the transmission methods of omnidirectional image data supported by the external device;
A selection unit that selects a transmission method of the omnidirectional image data based on the acquired transmission method information;
And further
The image processing unit generates the baseband signal according to the selected transmission scheme.
The transmitter according to any one of the above (21) or (22).
(24) The acquisition unit acquires the transmission scheme information from the external device via the predetermined transmission path.
The transmitter according to (23).
(25) The predetermined transmission path is a transmission path based on the HDMI standard,
The acquisition unit acquires the transmission method information from an E-EDID stored in an EDID ROM included in the external device.
The transmitter according to any one of the above (23) or (24).
(26) The omnidirectional image data is configured by arranging each pixel at a predetermined interval in the longitude direction on each latitude line of the sphere on which the latitude line and the longitude line are defined,
The acquisition unit acquires, as the transmission method information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, and a maximum number of supported latitude lines (maximum number of lines), and
the selection unit selects, based on the acquired transmission method information, at least one of the method for determining the spacing of the latitude lines, the maximum number of pixels arranged in one line, and the maximum number of supported latitude lines (maximum number of lines) to be used when the image processing unit generates the baseband signal.
The transmitter according to any one of (23) to (25).
(27) A notification unit that notifies the external device of format information on the format of the omnidirectional image data is further provided.
The transmitter according to any one of (21) to (26).
(28) The notification unit transmits the format information to the external device via the predetermined transmission path.
The transmitter according to (27).
(29) The predetermined transmission path is a transmission path based on the HDMI standard,
The notification unit transmits the format information using a blanking period when the transmission unit transmits the baseband signal.
The transmitter according to any one of the above (27) or (28).
(30) The omnidirectional image data is configured by arranging each pixel at a predetermined interval in the longitude direction on each latitude line of the spherical surface on which the latitude line and the longitude line are defined,
The notification unit notifies, as the format information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, a total number of the latitude lines or a total number of lines included in the baseband signal, and origin position information of the omnidirectional image data.
The transmitter according to any one of (27) to (29).
(31) an image processing step of generating baseband signals having different numbers of pixels for each line from omnidirectional image data configured by arranging pixels on a spherical surface;
Transmitting the baseband signal to an external device through a predetermined transmission path;
A transmitting method comprising the above steps.
(32) A receiving unit that receives, from an external device via a predetermined transmission path, a baseband signal that is generated from omnidirectional image data configured by arranging pixels on a spherical surface and whose number of pixels differs from line to line; and
A processing unit that performs processing for displaying an omnidirectional image based on the baseband signal;
A receiving apparatus comprising the above.
(33) A notification unit that notifies the external device of transmission method information on the transmission methods of omnidirectional image data that can be supported is further provided.
The receiving device according to (32).
(34) The omnidirectional image data is configured by arranging each pixel at a predetermined interval in the longitude direction on each latitude line of the spherical surface on which the latitude line and the longitude line are defined,
The notification unit notifies, as the transmission method information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, and a maximum number of supported latitude lines (maximum number of lines).
The receiving device according to (33).
(35) An acquisition unit that acquires format information on the format of the omnidirectional image data from the external device is further provided, and
The processing unit performs processing for displaying an omnidirectional image based on the format information.
The receiver according to any one of (32) to (34).
(36) The acquisition unit acquires, as the format information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, a total number of the latitude lines or a total number of lines included in the baseband signal, and origin position information of the omnidirectional image data.
The receiving device according to (35).
(37) a receiving step of receiving, from an external device, baseband signals having different numbers of pixels for each line, generated from omnidirectional image data configured by arranging pixels on a spherical surface;
A processing step of performing processing for displaying an omnidirectional image based on the baseband signal;
A receiving method comprising the above steps.
 81…HDMIトランスミッタ、82…HDMIレシーバ
 83…DDC、84…CECライン
 85…EDID ROM、86…HPDライン
 87…電源ライン、88…リザーブ・ライン
 200…AVシステム
 210…ディスク・プレーヤ
 211…イーサネット・インターフェース
 212…ディスク・ドライブ、213…メモリ、214…CPU
 215…高速データ・ライン・インターフェース
 216…一般画像処理部、217…全天球画像処理部
 218…HDMI送信部
 219…HDMI端子、220、221…内部バス
 222…ネットワーク端子
 250…ディスプレイ
 251…高速データ・ライン・インターフェース
 252…HDMI受信部、253…CPU、254…メモリ
 255…イーサネット・インターフェース
 256…全天球画像処理部、257…一般画像処理部
 258…表示パネル、259…HDMI端子
 260、261…内部バス、262…ネットワーク端子
 350…HDMIケーブル
81 ... HDMI transmitter, 82 ... HDMI receiver
83 ... DDC, 84 ... CEC line
85 ... EDID ROM, 86 ... HPD line
87 ... power supply line, 88 ... reserve line
200 ... AV system
210 ... disc player
211 ... Ethernet interface
212 ... disc drive, 213 ... memory, 214 ... CPU
215 ... high-speed data line interface
216 ... general image processing unit, 217 ... omnidirectional image processing unit
218 ... HDMI transmitting unit
219 ... HDMI terminal, 220, 221 ... internal bus
222 ... network terminal
250 ... display
251 ... high-speed data line interface
252 ... HDMI receiving unit, 253 ... CPU, 254 ... memory
255 ... Ethernet interface
256 ... omnidirectional image processing unit, 257 ... general image processing unit
258 ... display panel, 259 ... HDMI terminal
260, 261 ... internal bus, 262 ... network terminal
350 ... HDMI cable

Claims (20)

  1.  全天球画像を表現する球面上に緯度線及び経度線を定義する定義部と、
     前記定義した各緯度線上に前記全天球画像の画素を配置する配置部と、
    を具備する画像処理装置。
    A definition unit that defines latitude lines and longitude lines on a sphere that represents an omnidirectional image;
    An arrangement unit for arranging the pixels of the omnidirectional image on each of the defined latitude lines;
    An image processing apparatus comprising the above.
  2.  前記配置部は、各緯度線上において、経度ゼロの場所を起点としてほぼ等間隔に画素を配置する、
    請求項1に記載の画像処理装置。
    The arrangement unit arranges pixels at substantially equal intervals starting from the location of zero longitude on each latitude line.
    The image processing apparatus according to claim 1.
  3.  前記定義部は、前記緯度線の間隔を、前記球面の赤道面からの角度又は前記球面の地軸方向の高さのいずれかに基づいて決める、
    請求項1に記載の画像処理装置。
    The definition unit determines the distance between the latitude lines based on either the angle from the equatorial plane of the spherical surface or the height of the spherical surface in the ground axis direction.
    The image processing apparatus according to claim 1.
  4.  前記球面の各緯度線上の画素をそれぞれ1ラインに順次配置して、ライン毎に画素数の異なるベースバンド信号を生成する処理部と、
     前記ベースバンド信号を所定の伝送路を介して外部機器に送信する送信部と、
    をさらに備える、請求項1に記載の画像処理装置。
    A processing unit that sequentially arranges pixels on each of the latitudinal lines of the spherical surface in one line, and generates baseband signals having different numbers of pixels for each line;
    A transmitter configured to transmit the baseband signal to an external device via a predetermined transmission path;
    The image processing apparatus according to claim 1, further comprising:
  5.  前記外部機器が対応可能な全天球画像データの伝送方式に関する伝送方式情報を取得する取得部と、
     取得した前記伝送方式情報に基づいて全天球画像データの伝送方式を選択する選択部と、
    をさらに備え、
     前記処理部は、前記選択された伝送方式に従って、前記ベースバンド信号を生成する、
    請求項4に記載の画像処理装置。
    An acquisition unit configured to acquire transmission method information on a transmission method of all-sky image data compatible with the external device;
    A selection unit that selects a transmission method of the omnidirectional image data based on the acquired transmission method information;
    And further
    The processing unit generates the baseband signal according to the selected transmission scheme.
    The image processing apparatus according to claim 4.
  6.  前記取得部は、前記所定の伝送路を介して前記外部機器から前記伝送方式情報を取得する、
    請求項5に記載の画像処理装置。
    The acquisition unit acquires the transmission method information from the external device via the predetermined transmission path.
    The image processing apparatus according to claim 5.
  7.  前記所定の伝送路は、HDMI規格に基づく伝送路であり、
     前記取得部は、前記外部機器が備えるEDID ROMに格納されたE-EDIDから前記伝送方式情報を取得する、
    請求項6に記載の画像処理装置。
    The predetermined transmission path is a transmission path based on the HDMI standard,
    The acquisition unit acquires the transmission method information from an E-EDID stored in an EDID ROM included in the external device.
    The image processing apparatus according to claim 6.
  8.  前記取得部は、前記伝送方式情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、対応可能な最大緯度線数(最大ライン数)のうち少なくとも1つを取得し、
     前記選択部は、取得した前記伝送方式情報に基づいて、前記画像処理部が前記ベースバンド信号を生成する際の、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、対応可能な最大緯度線数(最大ライン数)のうち少なくとも1つを選択する、
    請求項5に記載の画像処理装置。
    The acquisition unit acquires, as the transmission method information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, and a maximum number of supported latitude lines (maximum number of lines), and
    the selection unit selects, based on the acquired transmission method information, at least one of the method for determining the spacing of the latitude lines, the maximum number of pixels arranged in one line, and the maximum number of supported latitude lines (maximum number of lines) to be used when the image processing unit generates the baseband signal.
    The image processing apparatus according to claim 5.
  9.  前記全天球画像データのフォーマットに関するフォーマット情報を前記外部機器に通知する通知部をさらに備える、
    請求項4に記載の画像処理装置。
    And a notification unit for notifying the external device of format information on the format of the omnidirectional image data.
    The image processing apparatus according to claim 4.
  10.  前記通知部は、前記所定の伝送路を介して前記外部機器に前記フォーマット情報を送信する、
    請求項9に記載の画像処理装置。
    The notification unit transmits the format information to the external device via the predetermined transmission path.
    The image processing apparatus according to claim 9.
  11.  前記所定の伝送路は、HDMI規格に基づく伝送路であり、
     前記通知部は、前記送信部が前記ベースバンド信号を送信する際のブランキング期間を利用して前記フォーマット情報を送信する、
    請求項9に記載の画像処理装置。
    The predetermined transmission path is a transmission path based on the HDMI standard,
    The notification unit transmits the format information using a blanking period when the transmission unit transmits the baseband signal.
    The image processing apparatus according to claim 9.
  12.  前記通知部は、前記フォーマット情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、前記緯度線の総数若しくは前記ベースバンド信号に含まれる総ライン数、前記全天球画像データの原点位置情報のうち少なくとも1つを通知する、
    請求項9に記載の画像処理装置。
    The notification unit notifies, as the format information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, a total number of the latitude lines or a total number of lines included in the baseband signal, and origin position information of the omnidirectional image data.
    The image processing apparatus according to claim 9.
  13.  前記球面の各緯度線上の画素をそれぞれ1ラインに順次配置して生成された、ライン毎に画素数の異なるベースバンド信号を、所定の伝送路を介して外部機器から受信する受信部と、
     前記ベースバンド信号を基に全天球画像を表示するための処理を行なう処理部と、
    をさらに備える、請求項1に記載の画像処理装置。
    A receiving unit that receives baseband signals of different numbers of pixels for each line, which are generated by sequentially arranging the pixels on each of the spherical latitude lines in one line, from an external device via a predetermined transmission path;
    A processing unit that performs processing for displaying an omnidirectional image based on the baseband signal;
    The image processing apparatus according to claim 1, further comprising:
  14.  対応可能な全天球画像データの伝送方式に関する伝送方式情報を前記外部機器に通知する通知部をさらに備える、
    請求項13に記載の画像処理装置。
    A notification unit that notifies the external device of transmission method information on the transmission methods of omnidirectional image data that can be supported is further provided.
    The image processing apparatus according to claim 13.
  15.  前記通知部は、前記伝送方式情報として、前記緯度線の間隔を決める方式、1ラインに配置される最大画素数、対応可能な最大緯度線数(最大ライン数)のうち少なくとも1つを通知する、
    請求項14に記載の画像処理装置。
    The notification unit notifies, as the transmission method information, at least one of a method for determining the spacing of the latitude lines, a maximum number of pixels arranged in one line, and a maximum number of supported latitude lines (maximum number of lines).
    The image processing apparatus according to claim 14.
  16.  前記所定の伝送路は、HDMI規格に基づく伝送路であり、
     前記伝送方式情報を含んだE-EDIDを格納するEDID ROMを備える、
    請求項14に記載の画像処理装置。
    The predetermined transmission path is a transmission path based on the HDMI standard,
    An EDID ROM storing an E-EDID including the transmission method information,
    The image processing apparatus according to claim 14.
  17.  The image processing apparatus according to claim 13, further comprising an acquisition unit that acquires, from the external device, format information regarding the format of the omnidirectional image data, wherein the processing unit performs processing for displaying an omnidirectional image based on the format information.
  18.  The image processing apparatus according to claim 17, wherein the acquisition unit acquires from the external device, as the format information, at least one of: a method of determining the interval of the latitude lines, a maximum number of pixels arranged in one line, a total number of the latitude lines or a total number of lines included in the baseband signal, and origin position information of the omnidirectional image data.
  19.  The image processing apparatus according to claim 18, wherein the predetermined transmission path is a transmission path based on the HDMI standard, and the acquisition unit acquires the format information using a blanking period during which the transmission unit transmits the baseband signal.
  20.  An image processing method comprising:
     a definition step of defining latitude lines and longitude lines on a spherical surface representing an omnidirectional image; and
     an arrangement step of arranging the pixels of the omnidirectional image on each of the defined latitude lines.
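The arrangement of claim 20 yields a baseband signal whose lines carry different pixel counts (claim 13), because a latitude circle's circumference shrinks toward the poles. The sketch below shows one plausible scheme, equally spaced latitudes with cosine-scaled pixel counts; the patent leaves the interval method as a signalled parameter, so this is an assumption, not the claimed method itself.

```python
import math

def pixels_per_latitude_line(total_lines: int, max_pixels: int) -> list[int]:
    """For each of total_lines equally spaced latitude lines on the sphere,
    compute how many pixels to place on that line.  The circumference of a
    latitude circle scales with cos(latitude), so lines near the poles get
    fewer pixels than lines near the equator (which approach max_pixels)."""
    counts = []
    for i in range(total_lines):
        # Latitude of line i, sampled at line centers from +90 to -90 degrees.
        lat = math.pi / 2 - math.pi * (i + 0.5) / total_lines
        counts.append(max(1, round(max_pixels * math.cos(lat))))
    return counts

print(pixels_per_latitude_line(total_lines=8, max_pixels=16))
# → [3, 9, 13, 16, 16, 13, 9, 3]
```

Compared with an equirectangular frame (every line max_pixels wide), this arrangement avoids the heavy oversampling of polar regions at the cost of a variable line length, which is exactly what the claimed baseband signal format accommodates.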
PCT/JP2018/016478 2017-06-26 2018-04-23 Image processing device and image processing method WO2019003609A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-124465 2017-06-26
JP2017124465 2017-06-26

Publications (1)

Publication Number Publication Date
WO2019003609A1 true WO2019003609A1 (en) 2019-01-03

Family

ID=64741432

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/016478 WO2019003609A1 (en) 2017-06-26 2018-04-23 Image processing device and image processing method

Country Status (1)

Country Link
WO (1) WO2019003609A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021020274A1 (en) * 2019-07-26 2021-02-04 株式会社イッツ・エムエムシー Position space identification method, position space identifier imparting device, and computer program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009135614A (en) * 2007-11-28 2009-06-18 Sony Corp Transmitting apparatus, receiving apparatus, communication system, transmitting method, receiving method, and programs thereof
WO2017029885A1 (en) * 2015-08-18 2017-02-23 株式会社ソニー・インタラクティブエンタテインメント Image generating device and image display control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18824620; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18824620; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)