WO2016167160A1 - Data generation device and playback device - Google Patents

Data generation device and playback device

Info

Publication number
WO2016167160A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
image
azimuth
frame
display
Prior art date
Application number
PCT/JP2016/061159
Other languages
English (en)
Japanese (ja)
Inventor
北浦 竜二
Original Assignee
シャープ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社 filed Critical シャープ株式会社
Publication of WO2016167160A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/10 Geometric effects
    • G06T 15/20 Perspective computation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H04N 5/91 Television signal processing therefor

Definitions

  • the present invention mainly relates to a data generation device that generates data relating to an omnidirectional image and a reproduction device that reproduces an image cut out from the omnidirectional image.
  • An omnidirectional image with a 360-degree field of view can be captured by a method using a camera equipped with a wide-angle fisheye lens or an omnidirectional mirror, or by a method of photographing the same scene from a plurality of cameras at different viewpoint positions and stitching together the captured images of each camera.
  • Using a large number of omnidirectional images obtained by shooting at various locations, the image provider allows the viewer to browse the field of view corresponding to an arbitrary viewpoint position and an arbitrary line-of-sight direction specified by the viewer.
  • image data indicating the state of the field of view is extracted from omnidirectional images at various viewpoint positions held by the supply center, and the extracted image data is transmitted to the terminal device of the viewer.
  • For example, Google Inc. (headquarters: California, USA) provides such a service, "Street View" (registered trademark), which displays such visual field images.
  • Patent Document 1: Japanese Laid-Open Patent Publication No. 2001-008232 (published January 12, 2001). Patent Document 2: JP 2013-090257 A (published May 13, 2013).
  • The technique of Patent Document 1 and "Street View" present still images. That is, in Patent Document 1 and "Street View", when a viewer designates an arbitrary line-of-sight direction from an arbitrary viewpoint position, a still image corresponding to that designation is presented.
  • Because the image viewed by the viewer is not a moving image taken while moving, the technique of Patent Document 1 and the technique of "Street View" have the following problem.
  • The invention of Patent Document 2 performs the following processing, based on an operation of moving the current location on a map along a virtual route, on a plurality of omnidirectional still images obtained by photographing the surroundings while moving along the real-world route corresponding to the virtual route.
  • For each of the plurality of omnidirectional still images, a visual field image showing the field of view in the line-of-sight direction designated by the viewer is cut out from the omnidirectional still image, and the cut-out visual field image is displayed.
  • As a result, a moving image showing the surroundings as viewed from the shooting point at each shooting time is displayed.
  • Patent Document 2 solves the problems of the technique of Patent Document 1 and the “street view” technique described above to some extent.
  • In Patent Document 2, when a viewer browsing the moving image alone wishes to see the state of a certain gaze direction, the viewer can see that state by performing an operation to change the gaze direction as necessary.
  • However, Patent Document 2 has a problem in that it cannot satisfy a viewer who wants to confirm, at the same time, the state of the field of view when viewing a certain direction from a certain point and the state of the field of view when viewing another direction from the same point. This is because the invention disclosed in Patent Document 2 cannot display the visual field image showing the former field of view and the visual field image showing the latter field of view simultaneously.
  • The present invention has been made in view of the above problems. Its main object is to realize a data generation device that generates data enabling a viewer of a display moving image, cut out from an omnidirectional moving image generated by moving shooting, to confirm at the same time the state of each field of view when viewing a plurality of different directions from the same shooting point. It is also an object of the present invention to realize a playback device that allows the viewer to check, at the same time, the state of each field of view when viewing a plurality of different directions from the same shooting point.
  • In order to solve the above problem, a data generation device according to one aspect of the present invention includes a data generation unit that generates, for all or some frames of an omnidirectional video generated by moving shooting, a plurality of pieces of data that a playback device should refer to in order to cut out a plurality of display target images from the target frame. The data generation unit generates azimuth data indicating each of a plurality of different azimuths, thereby generating a plurality of pieces of azimuth data as the plurality of pieces of data. Each piece of azimuth data is referred to by the playback device in order to cut out, as a display target image, an image showing the state of the field of view when the azimuth indicated by that azimuth data is viewed from the shooting point of the target frame.
  • In order to solve the above problem, a playback device according to one aspect of the present invention includes a data reference processing unit that refers, for all or some frames of an omnidirectional video generated by moving shooting, to a plurality of pieces of azimuth data in order to cut out a plurality of display target images from the target frame, and a reproduction processing unit that reproduces, for all or some of the frames, the plurality of images cut out from the target frame. Each piece of azimuth data is data that the data reference processing unit should refer to in order to cut out, as a display target image, an image showing the state of the field of view when the azimuth indicated by that azimuth data is viewed from the shooting point of the target frame.
  • The data generation device according to one aspect of the present invention has the effect of generating data that enables a viewer of a display moving image cut out from an omnidirectional moving image generated by moving shooting to confirm, at the same time, the state of each field of view when viewing a plurality of different directions from the same shooting point.
  • The playback device according to one aspect of the present invention has the effect of allowing the viewer to check, at the same time, the state of each field of view when viewing a plurality of different directions from the same shooting point.
  • FIG. 1 is a block diagram showing the configuration of the free viewpoint video data generation device according to Embodiment 1 of the present invention; the next figure is an explanatory diagram for explaining the map data used by that device.
  • The content creator, who is the user of the free viewpoint video data generation device, causes the free viewpoint video data generation device, through arbitrary operations, to generate metadata for making the free viewpoint video playback device reproduce a recommended scene.
  • FIG. 17 is a flowchart showing another step of the flowchart of FIG. 16 in detail. A further figure illustrates the display mode of the visual field image showing the state of the field of view when the above-mentioned other direction is viewed from the above-mentioned shooting point, on the free viewpoint video playback device set to the second display mode. A schematic diagram shows the system according to Embodiment 3, which includes the free viewpoint video data generation device.
  • Embodiment 1 which is a preferred embodiment of a data generation apparatus according to the present invention will be described below in detail with reference to the accompanying drawings.
  • For convenience of explanation, members having the same functions are given the same reference symbols even in different drawings, and their description is not repeated.
  • the free viewpoint moving image data generation device generates free viewpoint moving image data that is referenced by a free viewpoint moving image playback device according to a second embodiment to be described later for playing back a free viewpoint moving image.
  • the free viewpoint moving image data generated by the free viewpoint moving image data generating device includes omnidirectional moving image data input to the free viewpoint moving image data generating device.
  • This omnidirectional video data is generated by shooting (moving shooting) while moving a shooting route including a plurality of shooting points.
  • The free viewpoint video data also includes metadata that allows the user of the free viewpoint video playback device according to Embodiment 2 to confirm, at the same time, the state of each field of view when viewing a plurality of different directions from the same shooting point.
  • In the following, this metadata is also referred to as reproduction control data.
  • Examples of the free viewpoint video data generation device include a broadcasting device that generates the free viewpoint video data, a server on the cloud, and a PC (Personal Computer) on which software that generates the free viewpoint video data is installed.
  • FIG. 1 is a block diagram showing the configuration of a free-viewpoint video data generation device 100 (hereinafter abbreviated as “data generation device 100”) according to an embodiment of the present invention. First, an outline of the configuration of the data generation device 100 will be described with reference to this figure.
  • the data generation device 100 includes a control unit 110 and an operation reception unit 120.
  • the control unit 110 is a CPU and controls the entire data generation apparatus 100 in an integrated manner.
  • the operation reception unit 120 is an operation device that receives an operation by a content creator (a user of the data generation device 100).
  • By executing a specific program, the control unit 110 functions as a basic azimuth data generation unit 111 (data generation unit), an extended azimuth data generation unit 112 (data generation unit), a shooting point data conversion processing unit 113, an encoding processing unit 114, a reproduction control data generation unit 115, and a multiplexing processing unit 116.
  • the encoding processing unit 114 and the multiplexing processing unit 116 may be realized by hardware (LSI) instead of software.
  • The basic azimuth data generation unit 111 generates, for each frame of the omnidirectional video data, data to be referred to by the free viewpoint video playback device according to Embodiment 2 in order to cut out the visual field image to be reproduced from the target frame.
  • This field-of-view image is an image showing the state of the field of view when viewing a certain direction (the direction specified by the content creator or the default direction) from the shooting point of the target frame.
  • the basic azimuth data generation unit 111 generates azimuth data indicating the certain azimuth (hereinafter referred to as basic azimuth data) as the data. Specific contents of the basic azimuth data will be described later.
  • Typically, the certain azimuth is designated so that, of the various subjects included in the target frame, the subject that the content creator most wants to show to the user of the free viewpoint video playback device according to Embodiment 2 is included in the visual field image.
  • the extended azimuth data generation unit 112 generates, for each image frame of the omnidirectional video data, data to be referred to so that the free viewpoint video playback device according to the second embodiment cuts out the playback target view image from the target frame.
  • This field-of-view image is an image showing the state of the field of view when another azimuth (the azimuth designated by the content creator or the default azimuth) is viewed from the shooting point of the target frame.
  • the extended azimuth data generation unit 112 generates azimuth data indicating another azimuth (extended azimuth data indicating an azimuth different from the azimuth indicated by the basic azimuth data) as the data. Specific contents of the extended azimuth data will be described later.
  • the extended azimuth data generation unit 112 may generate only one extended azimuth data as the extended azimuth data related to the target frame, or may generate a plurality of extended azimuth data indicating different azimuths.
  • the shooting point data conversion processing unit 113 outputs map data input from the outside to the multiplexing processing unit 116.
  • For each frame, the shooting point data conversion processing unit 113 converts the GPS coordinate data indicating the shooting point of the frame into coordinate data (map coordinate data) indicating the corresponding position on the map indicated by the map data input from the outside.
  • the GPS coordinate data may be included in the omnidirectional video data as additional information of the omnidirectional video data. That is, the GPS coordinate data may be generated by this camera when shooting an omnidirectional video by a camera incorporating a GPS module.
  • Alternatively, the additional information may include a type of "coordinate data indicating a shooting point" different from GPS coordinate data.
  • the coordinate data may be generated by using a GPS module and a gyro sensor and / or a vehicle speed sensor in combination, or may be manually input as additional information of an omnidirectional video.
  • FIG. 2 is an explanatory diagram for explaining the map data.
  • the map data representing the map 10 in FIG. 2 is two-dimensional map image data indicating a map of the area where the above-described shooting route is located.
  • Examples of the map data include a sightseeing map and a road map.
  • For example, assume that the map coordinate data (shooting point data) corresponding to the shooting point of the first frame of the omnidirectional video data is the coordinates indicating the point 11 on the map indicated by the map data in FIG. 2.
  • Also assume that the map coordinate data (shooting point data) corresponding to the shooting point of the last frame of the omnidirectional video data is the coordinates indicating the point 13 on the map indicated by the map data in FIG. 2.
  • In this case, each piece of map coordinate data (shooting point data) corresponding to the shooting points of the remaining frames of the omnidirectional video data is coordinates on the line 12 connecting the points 11 and 13 in FIG. 2.
  • The shooting point data conversion processing unit 113 may perform the process of converting the GPS coordinate data indicating the shooting point of the target frame into map coordinate data for all frames, or only for some frames (key frames).
  • In the latter case, for each frame that is not a key frame, the shooting point data conversion processing unit 113 may generate the map coordinate data indicating the shooting point of the target frame from the map coordinate data indicating the shooting point of the key frame immediately preceding the target frame and the map coordinate data indicating the shooting point of the key frame immediately following the target frame.
  • Specifically, the shooting point data conversion processing unit 113 may generate the map coordinate data indicating the shooting point of a frame that is not a key frame on the assumption that the camera shooting the omnidirectional video moved at a constant speed, over the period from the shooting time of the immediately preceding key frame to the shooting time of the immediately following key frame, along the line segment connecting the shooting points of those two key frames.
  • Alternatively, for each frame that is not a key frame, the shooting point data conversion processing unit 113 may generate the map coordinate data indicating the shooting point of the target frame based on speed information (a velocity vector) input by the content creator for the target frame.
  • In this case, the shooting point data conversion processing unit 113 may generate the map coordinate data indicating the shooting point of the target frame on the assumption that the camera shooting the omnidirectional video moved from the shooting point of the frame immediately preceding the target frame at the speed indicated by the speed information for a fixed time.
  • The fixed time may be, for example, the quotient obtained by dividing the total shooting time of the omnidirectional video by the total number of frames n of the omnidirectional video.
  • The map coordinate data for a frame consists of the frame number of the frame and the map coordinate value indicating the shooting point of that frame; the shooting time of the frame may be used instead of the frame number.
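  • The key-frame interpolation described above amounts to linear interpolation along the segment between the bounding key frames. The following is a minimal sketch of that computation, assuming a hypothetical helper name and map coordinates expressed as (x, y) positions on the map image (neither is specified in the source):

```python
def interpolate_shooting_point(frame_no, prev_kf, next_kf):
    """Linearly interpolate map coordinates for a non-key frame.

    prev_kf / next_kf are (frame_no, (x, y)) tuples for the key frames
    immediately before and after the target frame; the camera is assumed
    to move at constant speed between them, as the source describes.
    """
    (f0, (x0, y0)), (f1, (x1, y1)) = prev_kf, next_kf
    t = (frame_no - f0) / (f1 - f0)  # fraction of the key-frame interval elapsed
    return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))

# Example: key frames 10 and 20; target frame 15 lies halfway between them.
print(interpolate_shooting_point(15, (10, (100.0, 40.0)), (20, (160.0, 80.0))))
# -> (130.0, 60.0)
```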
  • the encoding processing unit 114 encodes omnidirectional moving image data input from the outside, and outputs the encoded omnidirectional moving image data to the multiplexing processing unit 116.
  • As the encoding method used by the encoding processing unit 114, any method for encoding a moving image can be applied, such as the MPEG-2 method standardized by MPEG (Moving Picture Experts Group), the H.264 method, or the HEVC (High Efficiency Video Coding) method. Since these encoding methods do not directly characterize the present invention and are known techniques, their detailed description is omitted.
  • Note that the encoding processing unit 114 is not essential in the data generation device according to the present invention. That is, a data generation device according to an embodiment different from Embodiment 1 may output the input omnidirectional video data uncompressed, without encoding it.
  • When the reproduction control data generation unit 115 receives the basic azimuth data, one or more pieces of extended azimuth data, and the shooting point data (map coordinate data), it generates reproduction control data including these data and outputs the generated reproduction control data to the multiplexing processing unit 116. The specific contents of the reproduction control data will be described later.
  • the multiplexing processing unit 116 generates free viewpoint moving image data by multiplexing map data, reproduction control data for each frame, and encoded omnidirectional moving image data that are input from the outside, The generated free viewpoint video data is output to the outside.
  • The free viewpoint video data (distribution data) generated by the multiplexing processing unit 116 is distributed to the free viewpoint video playback device according to Embodiment 2, either transmitted automatically or delivered manually via a removable recording medium.
  • FIG. 3 is a flowchart illustrating an example of the operation of the data generation device 100.
  • the data generation device 100 starts the operation according to the flowchart of FIG. 3 at the timing when the omnidirectional video data and the map data are input from the outside.
  • the data generation device 100 performs the processing from step S1 to step S6 for each frame of the omnidirectional video data.
  • In step S1, the shooting point data conversion processing unit 113 converts the GPS coordinate data indicating the shooting point of the target frame into map coordinate data using the map data input from the outside, and outputs the map coordinate data to the reproduction control data generation unit 115.
  • Next, the basic azimuth data generation unit 111 generates basic azimuth data (step S2).
  • Step S2 will be specifically described with reference to FIG.
  • FIG. 4 (a) shows a flowchart showing step S2 in detail.
  • In step S21, the basic azimuth data generation unit 111 determines whether the operation reception unit 120 has received an operation for generating basic azimuth data.
  • If it determines that the operation reception unit 120 has received such an operation, the basic azimuth data generation unit 111 proceeds to step S22; if it determines that the operation reception unit 120 has not received such an operation, it proceeds to step S23.
  • In step S22, the basic azimuth data generation unit 111 generates basic azimuth data corresponding to the contents of the above operation (specifically, an operation of inputting the cut-out center coordinates described below).
  • In step S23, the basic azimuth data generation unit 111 automatically generates basic azimuth data.
  • FIG. 5 is a diagram showing an example of a visual field image that the free viewpoint video playback device according to Embodiment 2 cuts out from an omnidirectional image shot at a certain shooting point, showing the state of the field of view when a certain direction is viewed from that shooting point.
  • The omnidirectional video data according to the present embodiment contains the full 360-degree video, but the free viewpoint video playback device according to Embodiment 2 (for example, a large display such as a TV or projector, a small display such as a tablet or smartphone, or a head mounted display) cannot display each entire image.
  • That is, for each frame of the omnidirectional video, the free viewpoint video playback device according to Embodiment 2 needs to cut out a part (a visual field image) from the target frame and display that visual field image.
  • In step S22, the basic azimuth data generation unit 111 generates azimuth data including the coordinates on the target frame (omnidirectional image) that correspond to the center position of the visual field image the content creator wants the free viewpoint video playback device according to Embodiment 2 to cut out (the cut-out center coordinates).
  • For example, when the free viewpoint video playback device acquires, as the azimuth data for the target frame, azimuth data including the cut-out center coordinates corresponding to the point 15 at the center of the image 14 in FIG. 5, it can cut out the image 14 from the target frame as a visual field image.
  • the basic azimuth data generation unit 111 may proceed to step S22 when an operation of inputting a cut-out center coordinate is accepted, or when another type of operation is accepted.
  • This other type of operation will be described below with reference to FIG. 6, which is an explanatory diagram for explaining it.
  • the basic azimuth data generation unit 111 may proceed to step S22 when an operation for designating an arbitrary point on the map 10 indicated by the map data (for example, the point 16 in FIG. 6) is received.
  • In this case, the basic azimuth data generation unit 111 may identify the azimuth in which the point designated by the above operation is located as seen from the point on the map 10 corresponding to the shooting point of the target frame, and generate basic azimuth data including a value indicating that azimuth instead of the cut-out center coordinates.
  • Alternatively, the basic azimuth data generation unit 111 may generate basic azimuth data including a coordinate value indicating the point on the map corresponding to the shooting point of the target frame and a coordinate value indicating the point on the map designated by the operation.
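  • The azimuth of a designated map point as seen from the shooting point can be derived from the two coordinate pairs. A minimal sketch, assuming map coordinates with x increasing eastward and y increasing northward (the axis convention is an assumption, not stated in the source):

```python
import math

def azimuth_of_point(shooting_pt, designated_pt):
    """Return the azimuth (degrees clockwise from north) of the designated
    map point as seen from the shooting point."""
    dx = designated_pt[0] - shooting_pt[0]  # eastward offset
    dy = designated_pt[1] - shooting_pt[1]  # northward offset
    return math.degrees(math.atan2(dx, dy)) % 360.0

print(azimuth_of_point((100.0, 40.0), (130.0, 40.0)))  # -> 90.0 (due east)
```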
  • After step S2, the extended azimuth data generation unit 112 generates extended azimuth data (step S3), and the process proceeds to step S4.
  • In step S4, the data generation device 100 returns to step S3 if an operation for generating further extended azimuth data is received, and proceeds to step S5 if no such operation is received.
  • Step S3 will be specifically described with reference to FIG.
  • FIG. 4 (b) shows a flowchart showing step S3 in detail.
  • In step S31, the extended azimuth data generation unit 112 determines whether the operation reception unit 120 has received an operation for generating extended azimuth data.
  • If it determines that the operation reception unit 120 has received such an operation, the extended azimuth data generation unit 112 proceeds to step S32; if it determines that the operation reception unit 120 has not received such an operation, it proceeds to step S33.
  • In step S32, the extended azimuth data generation unit 112 generates extended azimuth data corresponding to the contents of the above operation (specifically, an operation of inputting cut-out center coordinates different from the cut-out center coordinates included in the basic azimuth data).
  • In step S33, the extended azimuth data generation unit 112 automatically generates extended azimuth data including cut-out center coordinates different from the cut-out center coordinates included in the basic azimuth data.
  • For example, the extended azimuth data generation unit 112 may automatically generate four pieces of extended azimuth data indicating east, west, south, and north respectively, or may automatically generate only one piece of extended azimuth data indicating a predetermined direction.
  • the extended azimuth data generation unit 112 may automatically generate extended azimuth data including cut-out center coordinates corresponding to a combination of a predetermined azimuth angle and a predetermined elevation angle.
  • Alternatively, the extended azimuth data generation unit 112 may automatically generate extended azimuth data such that coordinates in a moving region of the omnidirectional video become the cut-out center coordinates.
  • Specifically, the extended azimuth data generation unit 112 may calculate a motion vector per pixel or per block with reference to the preceding or following frame of the omnidirectional video, and then automatically generate extended azimuth data such that the coordinates of the center or centroid of each of one or more regions satisfying either of the following conditions become the cut-out center coordinates (a sketch of this region detection follows the list).
  • Per pixel: the direction of the calculated motion vector differs from the direction of the motion vector of each pixel surrounding the region.
  • Per block: the direction of the calculated motion vector differs from the direction of the motion vector of each block surrounding the region.
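  • As referenced above, a region qualifies when its motion direction differs from that of all its surroundings. A minimal sketch of the block-based variant, assuming the block motion vectors are already available as a 2D grid of (vx, vy) pairs; the angular threshold is an assumption, not taken from the source:

```python
import math

def moving_block_centers(vectors, block_size, angle_thresh_deg=45.0):
    """Return pixel-center coordinates of blocks whose motion direction
    differs from every neighbouring block by more than the threshold.
    `vectors` is a 2D grid (rows x cols) of (vx, vy) block motion vectors."""
    def angle(v):
        return math.degrees(math.atan2(v[1], v[0]))

    centers = []
    rows, cols = len(vectors), len(vectors[0])
    for r in range(rows):
        for c in range(cols):
            neighbours = [vectors[r + dr][c + dc]
                          for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                          if (dr or dc) and 0 <= r + dr < rows and 0 <= c + dc < cols]
            # Signed angular difference folded into [-180, 180), then abs().
            diffs = [abs((angle(vectors[r][c]) - angle(n) + 180) % 360 - 180)
                     for n in neighbours]
            if diffs and min(diffs) > angle_thresh_deg:  # differs from all neighbours
                centers.append(((c + 0.5) * block_size, (r + 0.5) * block_size))
    return centers
```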
  • Alternatively, the extended azimuth data generation unit 112 may compare image data input by the user with the target frame of the omnidirectional video to determine whether the subject image is included in the target frame. When it determines that the subject image is included in the target frame, the extended azimuth data generation unit 112 may automatically generate extended azimuth data such that the coordinates of the center or centroid of the region containing the subject image, among the plurality of predetermined regions, become the cut-out center coordinates.
  • the image data input by the user may be image data taken at an opportunity different from the shooting of the omnidirectional video.
  • the image data may be data of an image captured by a user when a desired field-of-view image of a desired frame (omnidirectional still image) in the omnidirectional video is displayed by a playback device.
  • Note that the extended azimuth data generation unit 112 may generate extended azimuth data in each of the n executions of step S33 (one per frame of the omnidirectional video), or only in some of them.
  • the operation receiving unit 120 can receive input of a plurality of different cut-out center coordinates.
  • the extended azimuth data generation unit 112 generates a plurality of extended azimuth data by generating extended azimuth data that includes the cut out center coordinates for each of the plurality of input cut out center coordinates.
  • FIG. 7 is a diagram showing that there are a plurality of subjects that the content creator recommends browsing around a certain section in the shooting route.
  • FIG. 8 is a diagram showing an example of a visual field image that the free viewpoint video playback device cuts out from a frame shot at a certain shooting point, showing the state of the field of view when a different direction (the direction of the subject 18) is viewed from that shooting point.
  • the point on the map 10 represented by the map data corresponding to the point where the subject 18 exists is the point 20, and the point on the map 10 corresponding to the point where the subject 19 exists is the point 21.
  • It is assumed that the subject 18 appears in P frames shot at P shooting points corresponding to the part of the line 12 included in the region 22, and that the subject 19 appears in Q frames shot at Q shooting points corresponding to the part of the line 12 included in the region 23.
  • In this case, for each of the P frames, the content creator need only input cut-out center coordinates (for example, the coordinates in the omnidirectional image corresponding to the point 25 of FIG. 18) that cause the free viewpoint video playback device to cut out from the target frame (omnidirectional image) a visual field image showing the subject 18 (for example, the visual field image 24 in FIG. 18).
  • Similarly, for each of the Q frames, the content creator need only input cut-out center coordinates that cause the free viewpoint video playback device to cut out from the frame a visual field image showing the subject 19.
  • As a result, for a frame in which both subjects 18 and 19 appear, the extended azimuth data generation unit 112 generates extended azimuth data including one set of cut-out center coordinates and extended azimuth data including another set of cut-out center coordinates.
  • Each piece of extended azimuth data may include identification information corresponding to the azimuth represented by the X component of its cut-out center coordinates.
  • the content creator can highlight a part of the frames in the omnidirectional video.
  • When a scene or a plurality of subjects that the content creator finds interesting or wants to show to the viewer is included, this makes it possible to prevent the viewer from overlooking them.
  • Also, the viewer can view a visual field image of a desired azimuth among the plurality of azimuths corresponding to the plurality of extended azimuth data. For example, the viewer can browse recommended sightseeing spots, famous landmarks on a road map, memorable expressions of acquaintances and family members, and the like.
  • In step S5, the reproduction control data generation unit 115 generates reproduction control data including the input shooting point data, basic azimuth data, and extended azimuth data.
  • The reproduction control data generated by the reproduction control data generation unit 115 will be described in detail with reference to FIGS. 9 to 11.
  • FIG. 9 is a diagram schematically showing an example of the data structure of the free-viewpoint video data generated by the data generation device 100.
  • FIG. 10 is a diagram schematically illustrating an example of the data structure of the reproduction control data included in the free viewpoint moving image data.
  • FIG. 11 is a diagram schematically illustrating another example of the data structure of the reproduction control data included in the free viewpoint moving image data.
  • As shown in FIG. 9, the free viewpoint video data 26 includes the omnidirectional video data, the map data, and, for each i from 1 to n, reproduction control data Vi for the frame whose frame number is i (hereinafter also referred to as frame i).
  • For example, the reproduction control data V1 includes shooting point data P1 indicating the shooting point of frame 1, basic azimuth data O1 for frame 1, and one or more pieces of extended azimuth data E1j (j = 1 to m) for frame 1 (m, n, i, and j are positive integers).
  • As described above, each piece of reproduction control data in the free viewpoint video data generated by the data generation device 100 of the present embodiment may include one or more pieces of extended azimuth data.
  • However, some reproduction control data (for example, the reproduction control data Vn) may include no extended azimuth data.
  • Next, the azimuth data included in the reproduction control data will be specifically described with reference to FIGS. 10 and 11.
  • The basic azimuth data includes cut-out center coordinates and identification information for distinguishing the basic azimuth data from the extended azimuth data.
  • Each piece of extended azimuth data includes cut-out center coordinates and identification information for distinguishing that extended azimuth data from the basic azimuth data and from the other extended azimuth data.
  • Furthermore, the basic azimuth data and the extended azimuth data may include priority display order data, as shown in FIG. 11.
  • The priority display order data is data indicating the order in which the visual field image cut out using the azimuth data containing it should be displayed, among the plurality of visual field images that can be cut out from frame i.
  • For example, when the value of the priority display order data included in certain azimuth data of the reproduction control data Vi is 1, the free viewpoint video playback device should display first, among the plurality of visual field images that can be cut out from frame i, the visual field image cut out using that azimuth data.
  • Similarly, when the value of the priority display order data included in certain azimuth data of the reproduction control data Vi is k (k is an integer of 2 or more), the free viewpoint video playback device should display k-th, among the visual field images that can be cut out from frame i, the visual field image cut out using that azimuth data.
  • When the content creator wants the free viewpoint video playback device to display a plurality of visual field images cut out from frame i so that they can be viewed simultaneously, the same value may be included in the plurality of azimuth data used to cut out those images.
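  • The structure of FIGS. 9 to 11 can be summarized as plain records. A minimal sketch, assuming hypothetical field names (the source specifies only the kinds of data carried, not a concrete serialization):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AzimuthData:
    cut_center: tuple          # (x, y) cut-out center coordinates on the omnidirectional frame
    identification: str        # distinguishes basic from extended azimuth data
    display_order: Optional[int] = None  # priority display order data (1 = display first)

@dataclass
class ReproductionControlData:                    # Vi in FIG. 9
    shooting_point: tuple                         # Pi: map coordinates of frame i's shooting point
    basic: AzimuthData                            # Oi: basic azimuth data
    extended: list = field(default_factory=list)  # Eij: zero or more extended azimuth data

# Equal display_order values mean the corresponding visual field images
# should be displayed simultaneously.
v1 = ReproductionControlData(
    shooting_point=(100.0, 40.0),
    basic=AzimuthData((960, 540), "basic", display_order=1),
    extended=[AzimuthData((1800, 500), "ext-1", display_order=1)],
)
```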
  • Furthermore, viewpoint restriction data may be included in each piece of azimuth data.
  • The viewpoint restriction data may include information indicating whether there is a viewpoint restriction (a restriction that allows the free viewpoint video playback device to cut out the visual field image only from a partial region of the entire omnidirectional image).
  • Viewpoint restriction data containing information indicating that there is a viewpoint restriction may further include information indicating that partial region.
  • The data generation device 100 may be configured so that the content creator can perform an operation of specifying whether there is a viewpoint restriction and of designating the above partial region (for example, a region of the omnidirectional image that the creator particularly wants the viewer to notice, or a region of the omnidirectional image that the viewer is permitted to browse).
  • In this way, the content creator can allow the viewer, who is the user of the free viewpoint video playback device, to browse only visual field images in the region the creator particularly wants noticed.
  • Conversely, the content creator can restrict browsing of visual field images in other regions that the creator does not want the viewer to browse.
  • Regions that the content creator does not want the viewer to browse include regions containing scenes that should not be viewed from the standpoint of confidentiality, portrait rights, public morals, or the like.
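  • One way a playback device could honor such a restriction is to clamp requested cut-out center coordinates into the permitted region. A minimal sketch, under the assumption that the partial region is given as an axis-aligned rectangle (the source does not specify how the region is encoded):

```python
def clamp_to_allowed_region(cut_center, region):
    """Clamp cut-out center coordinates into the permitted partial region.
    `region` is (x_min, y_min, x_max, y_max) on the omnidirectional frame."""
    x_min, y_min, x_max, y_max = region
    x = min(max(cut_center[0], x_min), x_max)
    y = min(max(cut_center[1], y_min), y_max)
    return (x, y)

# A viewer request outside the allowed region is pulled back to its edge.
print(clamp_to_allowed_region((3500, 200), (800, 100, 2400, 980)))  # -> (2400, 200)
```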
  • A data generation device 100 that streams omnidirectional video in real time and a data generation device 100 that distributes omnidirectional video by broadcast can likewise have the configuration described above.
  • In step S6, the encoding processing unit 114 encodes the input frame (omnidirectional image) using a preset encoding method (for example, HEVC) and outputs the encoded data of the omnidirectional image.
  • the data generation device 100 proceeds to step S7 after performing the above steps S1 to S6 for each image frame.
  • In step S7, the multiplexing processing unit 116 multiplexes the encoded data of each frame, the map data, and the reproduction control data of each frame to generate the free viewpoint video data, and the data generation device 100 ends the operation.
  • the data generation device 100 generates free-viewpoint video data using omnidirectional video data, map data, and playback control data for each frame.
  • The reproduction control data for each frame includes shooting point data indicating the point on the map corresponding to the shooting point of the target frame, basic azimuth data for cutting out a visual field image from the target frame, and extended azimuth data.
  • For each frame of the omnidirectional video data included in the free viewpoint video data created in this way, the free viewpoint video playback device uses the reproduction control data for the target frame in order to cut out from the target frame, and display, a visual field image showing a part of it (a highlight subject, scenery, or scene).
  • the data generation device 100 can reduce (or eliminate) the time and effort required for the viewer to search for a subject, scenery, or scene with a highlight.
  • In particular, the data generation device 100 can solve the problem that, while the viewer is searching for a highlight scene, the time at which the scene could be played passes and the viewer misses it.
  • Using the data generation device 100, the content creator can create, for each of a plurality of subjects to be shown to the viewer, azimuth data for displaying a visual field image including that subject on the free viewpoint video playback device.
  • The content creator can also use the data generation device 100 to control the display priority of the plurality of visual field images in each frame, and can therefore produce and distribute content that satisfies various viewers with different preferences.
  • the data generation apparatus may be configured to receive omnidirectional image data (still image data) and output free viewpoint still image data.
  • In this case, the content creator can use this data generation device for displaying advertisements on signage or the like. That is, for each of the multiple visual field images that the signage cuts out from the omnidirectional image data, the content creator can sell to an advertiser or the like the right to determine the display priority of that visual field image.
  • When the signage is configured so that an advertisement video or message can be superimposed on each visual field image, the content creator may also sell, for each visual field image, the right to superimpose an advertisement video or message on that visual field image.
  • The free viewpoint video data generation device may also be configured to generate free viewpoint video data that is referred to by a free viewpoint video playback system according to an embodiment different from Embodiment 2 in order to play back a free viewpoint video.
  • Such a free viewpoint video playback system has, for example, a plurality of display devices (displays, projectors, etc.) surrounding the viewer, and a seat with a structure that makes the viewer face a predetermined direction when seated.
  • In this case, the basic azimuth data generated by the free viewpoint video data generation device may include cut-out center coordinates, that is, the coordinates on the image frame (omnidirectional image) corresponding to the center position of the visual field image displayed on the display device located in front of the viewer of the free viewpoint video playback system (the display device positioned in the predetermined direction as seen from the seat).
  • Likewise, the extended azimuth data may include coordinates (cut-out center coordinates) on the image frame (omnidirectional image) corresponding to the center position of a visual field image displayed on a display device of the free viewpoint video playback system.
  • the basic azimuth data generation unit 111 may calculate the cut-out center coordinates included in the basic azimuth data by performing image processing on the omnidirectional moving image data.
  • For example, the basic azimuth data generation unit 111 may divide two adjacent image frames (omnidirectional images) into a plurality of regions using the same division pattern and obtain a motion vector for each region by block matching or the like.
  • The basic azimuth data generation unit 111 may then determine, from the direction and magnitude of the motion vector of each region, which region lies in the traveling direction of the camera, and use the center coordinates of the region determined to be in the traveling direction as the cut-out center coordinates included in the basic azimuth data (a sketch follows).
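  • A minimal sketch of this travel-direction step, assuming region motion vectors from block matching are already available. The selection rule here, picking the region with the smallest mean motion magnitude because the focus of expansion moves least during forward motion, is one plausible reading of "direction and magnitude" and an assumption, not the source's exact rule:

```python
def travel_direction_center(region_vectors, region_centers):
    """Pick the region most likely to lie in the camera's direction of travel.

    `region_vectors` is a list of lists of (vx, vy) motion vectors, one list
    per region; `region_centers` holds each region's (x, y) center, which
    becomes the cut-out center coordinates of the basic azimuth data.
    """
    def mean_magnitude(vecs):
        return sum((vx ** 2 + vy ** 2) ** 0.5 for vx, vy in vecs) / len(vecs)

    best = min(range(len(region_vectors)),
               key=lambda i: mean_magnitude(region_vectors[i]))
    return region_centers[best]
```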
  • Alternatively, the basic azimuth data generation unit 111 may divide the image frame (omnidirectional image) into a plurality of regions and, when the photographer appears in a certain region of the omnidirectional image, recognize that region as lying in the direction opposite to the traveling direction of the camera, thereby determining which region lies in the traveling direction.
  • The basic azimuth data generation unit 111 may then use the center coordinates of the region determined to be in the traveling direction as the cut-out center coordinates included in the basic azimuth data.
  • The region where the photographer appears in the omnidirectional image may be designated by an operation of the content creator, or the basic azimuth data generation unit 111 may recognize as the photographer a subject that appears throughout the entire shooting period.
  • Step S33 is not essential in the data generation device according to the present invention. That is, the extended azimuth data generation unit 112 may not generate the extended azimuth data when it is determined that the operation reception unit 120 has not received an operation for generating the extended azimuth data.
  • Likewise, step S4 is not essential in the data generation device according to the present invention. That is, the extended azimuth data generation unit 112 may generate only one piece of extended azimuth data for each frame.
  • The data generation device 100 may generate the priority display order data based on preference information of a specific viewer input in advance, or with reference to browsing information of many viewers who have already viewed the target omnidirectional video (big data indicating which part of the omnidirectional image many viewers watched at each playback time of the omnidirectional video).
  • The data generation device 100 may also generate the priority display order data based on the season and time of day at playback. In this way, the data generation device 100 can cause the free viewpoint video playback device to preferentially display, for example, a visual field image containing a subject or landscape that is hard to see outside that season or time of day. This is particularly useful for tourist promotion and navigation.
  • the viewpoint restriction data may include level information related to viewpoint restriction instead of information indicating presence / absence of viewpoint restriction.
  • the level represented by the level information may be any of the following three levels, for example.
  • A level at which the viewer can unconditionally display any visual field image in the omnidirectional image on the free viewpoint video playback device.
  • A level at which only viewers who satisfy a predetermined condition (for example, having paid the content creator) can display an arbitrary visual field image on the free viewpoint video playback device.
  • A level at which the viewer can display only visual field images in a partial region of the omnidirectional image on the free viewpoint video playback device.
  • In this way, the content creator can allow a viewer who does not satisfy the predetermined condition to browse only some of the visual field images.
  • the content creator can loosen the viewpoint restriction according to the amount of money paid by the viewer to the content creator.
  • the reproduction control data may not be included in the free viewpoint moving image data.
  • For example, when the data generation device 100 is implemented as a PC with a function of distributing free viewpoint video data, the data generation device 100 may distribute separately the free viewpoint video data containing the map data and the omnidirectional video data, and the reproduction control data of each frame of the omnidirectional video data (a reproduction control data set).
  • In this case, the content creator who is the user of the data generation device 100 can send two different reproduction control data sets, created according to his or her preference, to the partner to whom the free viewpoint video data is distributed. The content creator can then have the partner's PC play the omnidirectional video based on the first reproduction control data set and play it again based on the second reproduction control data set.
  • In this way, the content creator can show the partner a certain landscape seen while shooting the omnidirectional video (the landscape in the visual field image displayed by the former playback) and also another landscape seen while shooting (the landscape in the visual field image displayed by the latter playback).
  • The reproduction control data may include information about the content creator (copyright-related information such as the content creator's name). This mechanism is useful when the content creator wishes to handle the reproduction control data itself as a derivative work.
  • In the above, the data generation device 100 that handles omnidirectional video data has been described, but the data generation device according to the present invention is not limited to such a configuration. That is, the scope of the data generation device according to the present invention also includes a data generation device that handles, instead of omnidirectional video data, a three-dimensional model (a three-dimensional model created with CG, or one created by photographing from a plurality of directions).
  • Embodiment 2 which is a preferred embodiment of a playback apparatus according to the present invention will be described below in detail with reference to the accompanying drawings.
  • For convenience of explanation, members having the same functions as those described in Embodiment 1 are given the same reference symbols even in different drawings, and their description is not repeated.
  • The free viewpoint video playback device is a device that plays back an omnidirectional video by referring to the free viewpoint video data generated by the data generation device 100 according to Embodiment 1; it is not configured to display each entire image frame.
  • the free-viewpoint video playback apparatus is configured to display a field-of-view image by cutting out the field-of-view image from the target frame for each frame of the omnidirectional video.
  • the free viewpoint video playback device may be a device having a touch panel function such as a smartphone or a tablet.
  • The free viewpoint video playback device may be a device that reads and plays content data from an optical disc typified by DVD or Blu-ray (registered trademark) Disc, and/or from a semiconductor memory typified by USB memory or an SD (registered trademark) card.
  • the free-viewpoint video playback device may be a television receiver that receives broadcast waves of TV broadcasts, or a device that receives content data distributed from the Internet or other communication lines.
  • The free viewpoint video playback device may be configured to include an HDMI (High-Definition Multimedia Interface) (registered trademark) receiver that receives an image signal from an external device such as a Blu-ray (registered trademark) Disc player.
  • the free viewpoint video playback device may be any device as long as it has a function of receiving content data from outside and playing back the input content data.
  • FIG. 12 is a block diagram showing a configuration of a free-viewpoint video playback apparatus 200 (hereinafter abbreviated as “playback apparatus 200”) according to the present embodiment.
  • First, the outline of the configuration of the playback device 200 will be described with reference to FIG. 12.
  • the playback device 200 includes a control unit 210, a display unit 220, and an operation reception unit 230.
  • the control unit 210 is a CPU and controls the entire playback apparatus 200 in an integrated manner.
  • the display unit 220 is a display on which a visual field image is displayed.
  • the operation accepting unit 230 is an operation device that accepts an operation by a viewer of the omnidirectional video (a user of the playback device 200).
  • the control unit 210 functions as a demultiplexing processing unit 211, a map display processing unit 212, a decoding processing unit 213, an orientation data analysis unit 214, and an image cutout unit 215 by executing a specific program.
  • Note that the decoding processing unit 213 and the demultiplexing processing unit 211 may be realized by hardware (LSI) instead of software.
  • When the demultiplexing processing unit 211 receives input of free viewpoint video data from the outside, it performs demultiplexing processing on the free viewpoint video data, thereby extracting the map data, the reproduction control data of each frame, and the encoded omnidirectional video data from the free viewpoint video data.
  • The demultiplexing processing unit 211 outputs the map data and the shooting point data of each frame to the map display processing unit 212, outputs each encoded frame to the decoding processing unit 213, and outputs the azimuth data group of each frame to the azimuth data analysis unit 214.
  • the map display processing unit 212 displays a map represented by the map data on the display unit 220, and displays a line indicating the shooting route on the map using the shooting point data of each frame.
  • the decoding processing unit 213 decodes each frame input from the outside and outputs each decoded frame to the image cutout unit 215.
  • the azimuth data analysis unit 214 selects, for each frame, all or part of one or more azimuth data related to the frame.
  • The azimuth data analysis unit 214 performs this selection process based on a viewer operation or automatically.
  • the azimuth data analysis unit 214 outputs the cutout coordinates included in the azimuth data to the image cutout unit 215 for each selected azimuth data.
  • The image cutout unit 215 cuts out one or more visual field images from each frame with reference to the one or more cut-out coordinates for that frame.
  • Specifically, for each of the one or more referenced cut-out center coordinates, the image cutout unit 215 cuts out from the frame, as a visual field image, a region of predetermined height and width centered on those cut-out center coordinates.
  • Then, for each frame, the image cutout unit 215 displays the one or more visual field images cut out from the frame on the display unit 220 within the period of that frame.
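  • For an equirectangular omnidirectional frame, this cut-out step can be sketched as a crop centered on the cut-out center coordinates. The horizontal wrap-around and the fixed output size are assumptions (the source says only "a predetermined length and width"), and a real playback device might apply a perspective reprojection instead of a plain crop:

```python
import numpy as np

def cut_out_view(frame, cut_center, out_w=1280, out_h=720):
    """Crop a visual field image of out_w x out_h pixels centered on
    cut_center from an equirectangular frame (H x W x 3 ndarray), wrapping
    horizontally because the 360-degree image is continuous in azimuth."""
    h, w = frame.shape[:2]
    cx, cy = int(cut_center[0]), int(cut_center[1])
    top = min(max(cy - out_h // 2, 0), h - out_h)             # clamp vertically
    cols = [(cx - out_w // 2 + i) % w for i in range(out_w)]  # wrap horizontally
    return frame[top:top + out_h][:, cols]
```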
  • the playback apparatus 200 has the following first mode and second mode, and the operation in the first mode is different from the operation in the second mode.
  • First mode: the visual field image cut out with reference to the cut-out center coordinates of the basic azimuth data is displayed full screen, and the visual field images cut out with reference to the cut-out center coordinates of the extended azimuth data are displayed in small screens (wipe display).
  • Second mode: the visual field image cut out with reference to the cut-out center coordinates of the basic azimuth data is displayed by default, and the display is switched to a visual field image cut out with reference to the cut-out center coordinates of the extended azimuth data based on a viewer operation for switching the displayed visual field image.
  • FIG. 13 is a flowchart showing an example of the operation of the playback device 200.
  • FIG. 14 is a flowchart showing in detail one step of the flowchart of FIG.
  • FIG. 15 is a diagram illustrating a display mode (PinP display) of the visual field image by the playback device 200 set in the first display mode.
  • the playback device 200 starts the operation according to the flowchart of FIG. 13 at the timing when the free viewpoint video data is input from the outside.
  • In step S41, the demultiplexing processing unit 211 performs demultiplexing processing on the free viewpoint video data, thereby extracting the map data, the reproduction control data of each frame, and the encoded omnidirectional video data from the free viewpoint video data.
  • the playback device 200 performs the processing from step S43 to step S46 for each frame of the omnidirectional video data within the period of the frame.
• In step S43, the decoding processing unit 213 decodes the frame i (an omnidirectional image) using a preset decoding method (for example, HEVC), and outputs the decoded frame i to the image cutout unit 215.
• In step S44, the azimuth data analysis unit 214 selects one or more azimuth data included in the reproduction control data Vi.
  • the azimuth data analysis unit 214 extracts the cutout center coordinates from the azimuth data for each of the selected one or more azimuth data, and outputs the extracted cutout center coordinates to the image cutout unit 215.
• Step S44 will be specifically described with reference to FIG. 14.
• In step S441, the azimuth data analysis unit 214 determines whether or not priority display order data is included in each of the one or more azimuth data of the reproduction control data Vi.
• If the azimuth data analysis unit 214 determines that priority display order data is included in each azimuth data of the reproduction control data Vi, the process proceeds to step S442; if it determines that priority display order data is not included, the process proceeds to step S443.
• In step S442, the azimuth data analysis unit 214 selects the azimuth data including the priority display order data indicating the highest priority display order among all the azimuth data included in the reproduction control data Vi, and proceeds to step S444.
• In step S443, the azimuth data analysis unit 214 selects the basic azimuth data by referring to the identification information of each azimuth data included in the reproduction control data Vi.
• In step S444, the azimuth data analysis unit 214 extracts the cut-out center coordinates from the azimuth data selected in step S442 or step S443 (the specific azimuth data, i.e., the azimuth data that the playback device 200 should use most preferentially to cut out the field-of-view images).
  • the orientation data analysis unit 214 outputs the extracted cut-out center coordinates to the image cut-out unit 215.
  • the orientation data analysis unit 214 proceeds to step S445 after step S444.
• In step S445, the azimuth data analysis unit 214 determines whether or not extended azimuth data is included in the reproduction control data Vi.
• The playback device 200 proceeds to step S446 when it determines that extended azimuth data is included in the reproduction control data Vi, and proceeds to step S45 when it determines that no extended azimuth data is included.
• In step S446, the azimuth data analysis unit 214 determines whether or not the operation receiving unit 230 has received an operation for causing the playback device 200 to select extended azimuth data.
• When the playback device 200 determines that the operation receiving unit 230 has accepted the operation, the process proceeds to step S447; when it determines that the operation has not been accepted, the process proceeds to step S45.
• In step S447, the azimuth data analysis unit 214 selects extended azimuth data according to the operation. Further, in step S448, the azimuth data analysis unit 214 extracts the cut-out center coordinates from the extended azimuth data selected in step S447, and outputs the extracted cut-out center coordinates to the image cutout unit 215.
  • the reproducing device 200 proceeds to step S45 after step S448.
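• The selection logic of steps S441 to S448 can be sketched as follows; this is a minimal illustration that assumes the reproduction control data Vi is available as a list of records with hypothetical keys 'id', 'center', and 'priority' (a smaller value meaning a higher priority display order).

```python
def select_cutout_centers(vi, extended_requested=None):
    """Return the cut-out centers chosen by steps S441-S448 (sketch)."""
    if all('priority' in d for d in vi):              # S441 -> S442
        primary = min(vi, key=lambda d: d['priority'])
    else:                                             # S441 -> S443
        primary = next(d for d in vi if d['id'] == 'basic')
    centers = [primary['center']]                     # S444
    extended = [d for d in vi if d['id'] == 'extended']
    if extended and extended_requested is not None:   # S445 -> S447/S448
        centers.append(extended[extended_requested]['center'])
    return centers

vi = [{'id': 'basic', 'center': (2048, 1024)},
      {'id': 'extended', 'center': (512, 900)}]
print(select_cutout_centers(vi, extended_requested=0))
# -> [(2048, 1024), (512, 900)]
```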
• In step S45, the image cutout unit 215 refers to the cut-out center coordinates output in step S444 and cuts out the field-of-view image for full-screen display from the frame i.
• Also in step S45, the image cutout unit 215 refers to the cut-out center coordinates output in step S448 and cuts out the field-of-view image for sub-screen display from the frame i.
• After step S45, the image cutout unit 215 displays the full-screen field-of-view image in full screen.
• The image cutout unit 215 then superimposes the field-of-view image for sub-screen display on that field-of-view image.
• As a result, as shown in FIG. 15, the display unit 220 displays a full-screen field-of-view image (full-screen image) 27 and a sub-screen field-of-view image (wipe image) 28.
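• A minimal sketch of the picture-in-picture composition described above, assuming both images are NumPy arrays; the corner placement and margin are illustrative, not taken from the patent (the patent itself positions the wipe image according to the cut-out center coordinates, as described later).

```python
import numpy as np

def compose_pinp(full_image, wipe_image, margin=16):
    """Paste the wipe image into the top-right corner of the full image."""
    out = full_image.copy()
    wh, ww = wipe_image.shape[:2]
    out[margin:margin + wh, -margin - ww:-margin] = wipe_image
    return out

full = np.zeros((720, 1280, 3), dtype=np.uint8)     # full-screen image 27
wipe = np.full((180, 320, 3), 255, dtype=np.uint8)  # wipe image 28
print(compose_pinp(full, wipe).shape)               # -> (720, 1280, 3)
```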
  • the playback apparatus 200 may be configured to accept an operation for displaying an image of an arbitrary desired orientation included in the frame i (omnidirectional image) as a visual field image.
• In that case, the playback device 200 may perform the following processing within the period of the frame i. That is, the playback device 200 may display the designated field-of-view image over a predetermined valid period, and perform the process of step S45 after the valid period expires.
  • the above operation may be an operation for inputting the value of the cutout center coordinate in the omnidirectional image, or an operation for specifying the cutout center coordinate (mouse click or touch operation).
• When the playback device 200 receives a mouse operation, a flick operation, or an operation of pressing a controller button during the predetermined valid period, it may change the displayed field-of-view image of one direction to a field-of-view image of another direction.
• Specifically, the playback device 200 may change the field-of-view image of the direction being displayed to a field-of-view image of another direction according to the amount and direction of movement of the mouse, to a field-of-view image of another direction according to the amount and direction of the flick, or to a field-of-view image of another direction according to the type of button pressed.
• The image cutout unit 215 may perform a process of correcting the cut-out center coordinates as necessary for each frame except the first frame.
• Specifically, when the distance between the cut-out center coordinate C1 of the frame immediately before the target frame and the cut-out center coordinate C2 of the target frame exceeds a predetermined value, it is desirable to correct the cut-out center coordinate of the target frame to a coordinate C3 on the line segment connecting C1 and C2 whose distance from C1 equals the predetermined value (see the sketch below).
• This is because the viewer can then always grasp which field-of-view image is being displayed, and it also prevents the viewer from getting motion sickness.
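• The correction described above amounts to limiting the per-frame movement of the cut-out center; a minimal sketch (which, for simplicity, ignores the horizontal wrap-around of a 360° panorama):

```python
import math

def limit_center_motion(c1, c2, max_step):
    """If the jump from C1 to C2 exceeds max_step, return the point C3 on
    the segment C1-C2 at distance max_step from C1; otherwise keep C2."""
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    dist = math.hypot(dx, dy)
    if dist <= max_step:
        return c2
    t = max_step / dist
    return (c1[0] + t * dx, c1[1] + t * dy)

print(limit_center_motion((0, 0), (300, 400), max_step=100))  # -> (60.0, 80.0)
```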
  • the size of the visual field image cut out from the omnidirectional image by the image cutout unit 215 may be a preset size corresponding to the size of the display area of the display unit 220.
  • the reproduction control data Vi may include size information indicating the size of the visual field image cut out from the omnidirectional image by the image cutout unit 215.
  • the image cutout unit 215 cuts out the visual field image with reference to the size information and the cutout center coordinates.
  • the image cutout unit 215 may perform a process of correcting distortion (distortion due to a lens or a mirror) with respect to the omnidirectional image, and may cut out the visual field image from the corrected omnidirectional image. Note that the distortion correction method does not directly characterize the present invention, and a known method can be applied, and thus detailed description thereof is omitted.
• The azimuth data analysis unit 214 may output, together with the cut-out center coordinates, control information indicating that the field-of-view image to be cut out with reference to those coordinates should be displayed in full screen.
• Likewise, the azimuth data analysis unit 214 may output, together with the cut-out center coordinates, control information indicating that the field-of-view image to be cut out with reference to those coordinates should be wipe-displayed.
• The image cutout unit 215 may then refer to the control information to determine whether the field-of-view image cut out with reference to the accompanying cut-out center coordinates should be displayed in full screen or wipe-displayed.
• Alternatively, the azimuth data analysis unit 214 may output the priority display order data included in each azimuth data to the image cutout unit 215 together with the cut-out center coordinates.
  • the image cutout unit 215 may specify priority display order data indicating the highest priority display order from the plurality of acquired priority display order data.
• In that case, the image cutout unit 215 may determine to display in full screen the field-of-view image cut out with reference to the cut-out center coordinates acquired together with that priority display order data, and to wipe-display the field-of-view images cut out with reference to the cut-out center coordinates acquired together with the other priority display order data.
• The image cutout unit 215 may also compare the value of the X component of the cut-out center coordinate of the basic azimuth data with the value of the X component of the cut-out center coordinate of the extended azimuth data: when the former is smaller than the latter, the image cutout unit 215 may display the wipe image 28 at the right end of the display, and when the former is greater, at the left end.
• When a field-of-view image of one direction showing a certain subject (for example, a wonderful landscape) is displayed as the full-screen image 27, and a field-of-view image of another direction showing a subject related to it (for example, the face of a person moved by seeing the landscape) is displayed as the wipe image 28, the viewer can feel as if he or she were simultaneously viewing the wonderful landscape and the face of the person impressed by it.
  • the display position of the wipe screen may be the upper end and the lower end instead of the left end and the right end.
• Similarly, the image cutout unit 215 may compare the value of the Y component of the cut-out center coordinate of the basic azimuth data with the value of the Y component of the cut-out center coordinate of the extended azimuth data.
• When the former is smaller than the latter, the image cutout unit 215 may display the wipe image 28 at the upper end of the display, and when the former is greater, at the lower end.
  • the image cutout unit 215 may create an image button using the field-of-view image 28 and display the created image button instead of wiping the field-of-view image 28.
• When the image button is pressed, the image cutout unit 215 may delete the image button and the field-of-view image 27 displayed in full screen, and then perform the following processing. That is, the image cutout unit 215 may display the field-of-view image 28 in full screen, create an image button using the field-of-view image 27, and display the created image button.
• Thereby, the user can quickly cause the playback device 200 to display in full screen the field-of-view image cut out with reference to the cut-out center coordinates of the extended azimuth data.
  • the playback device 200 may play back the omnidirectional video twice.
• In that case, the playback device 200 may display the field-of-view image of each frame using only the cut-out center coordinates of the basic azimuth data during the first playback, and using only the cut-out center coordinates of the extended azimuth data during the second and subsequent playbacks.
• The playback device 200 configured as described above may be a digital signage device.
• The playback device 200 as a digital signage device can let the viewer browse the various advertisements shown in the omnidirectional video without the viewer getting bored.
• Alternatively, the playback device 200 may display, on the full-screen field-of-view image, a button corresponding to the identification information included in each of the plurality of azimuth data.
• When any of the displayed buttons is pressed, the playback device 200 may refer to the cut-out center coordinates of the azimuth data identified by the identification information corresponding to the pressed button, and switch the image displayed in full screen to the field-of-view image cut out using those cut-out center coordinates.
  • the playback device 200 may display a button for setting whether or not to display a field-of-view image of the orientation indicated by the target extended orientation data for each extended orientation data.
  • the playback device 200 may wipe-display only the visual field image to be displayed based on the setting by the button.
• When a frame is associated with a plurality of field-of-view images, the playback device 200 may wipe-display them in order according to the priority display order indicated by the priority display order data during the period of the frame, or may prepare a plurality of wipe screens and wipe-display all the field-of-view images.
• As described above, the playback device 200 separates the omnidirectional video data, the map data, and the reproduction control data of each frame from the free viewpoint video data generated by the data generation device 100.
• The playback device 200 selects all or part of the azimuth data included in the reproduction control data, and cuts out and displays part of the omnidirectional image (a field-of-view image) using the cut-out center coordinates included in the selected azimuth data.
• Thereby, the playback device 200 (a device having a display screen of limited resolution and size, such as a tablet, a TV, a PC, a smartphone, or an HMD) can display, for each frame (omnidirectional image) to be played back, a field-of-view image of a specific direction within the target omnidirectional image (the field-of-view image that the content creator wants the viewer to browse).
• Even when the viewer performs an operation to display an image of a desired orientation as the field-of-view image, the playback device 200 can return, after a predetermined period has elapsed, to the field-of-view image that the content creator wants the viewer to browse.
• FIG. 16 is a flowchart showing the operation of the playback device 200 set in the second display mode.
• FIG. 17 is a flowchart showing in detail one step (step S44A) of the flowchart of FIG. 16.
  • FIG. 18 is a diagram illustrating a state in which the playback device 200 set in the second display mode displays a default visual field image on a map.
• FIG. 19 is a flowchart showing in detail another step (step S48) of the flowchart of FIG. 16, and FIG. 20 is a diagram illustrating a state in which the playback device 200 set in the second display mode displays the field-of-view image designated by the user on a map.
  • the playback device 200 starts the operation according to the flowchart of FIG. 16 at the timing when the free viewpoint video data is input from the outside.
  • the playback device 200 proceeds to step S42 after performing step S41 already described.
• In step S42, the map display processing unit 212 displays the map represented by the map data on the display unit 220, and displays a line indicating the shooting route on the map 29 using the shooting point data of each frame.
  • the playback apparatus 200 performs the processing from step S43 to step S48 for each frame of the omnidirectional video data during the frame period.
• The playback device 200 proceeds to step S44A after performing step S43 already described.
• In step S44A, the azimuth data analysis unit 214 automatically selects one azimuth data included in the reproduction control data Vi.
  • the azimuth data analysis unit 214 extracts cutout center coordinates from the selected azimuth data, and outputs the extracted cutout center coordinates to the image cutout unit 215.
• The specific processing in step S44A is as shown in FIG. 17; since the processing in steps S441 to S444 of FIG. 17 is the same as the processing in steps S441 to S444 of FIG. 14 already described, its description is omitted.
• In step S45, the image cutout unit 215 cuts out the default display field-of-view image 31 from the frame i with reference to the cut-out center coordinates output in step S444.
  • the image cutout unit 215 displays the visual field image 31 for default display on the map 29 after step S45 (step S46).
  • the reproducing device 200 proceeds to step S47 after step S46.
• In step S47, the map display processing unit 212 displays the symbol 30 indicating the shooting point of the frame i at the position on the map indicated by the shooting point data of the frame i.
• Also in step S47, the map display processing unit 212 obtains from the azimuth data analysis unit 214 each azimuth data of the frame i that was not selected in step S44A, and performs the following processing on each such azimuth data.
• Namely, from the cut-out center coordinates included in the target azimuth data and the shooting point data of the frame i, the map display processing unit 212 estimates the position on the map of the subject that enters the field of view when the azimuth indicated by the target azimuth data is viewed from the shooting point of the frame i, and displays the symbol 32 at the estimated position.
• Specifically, the map display processing unit 212 may cut out the field-of-view image from the frame i with reference to the cut-out center coordinates included in the target azimuth data, and apply a known distance estimation technique to that field-of-view image to estimate the distance between the shooting point of the frame i and the point where the subject in the field-of-view image exists. The map display processing unit 212 may then estimate the position of the subject on the map from the shooting point data of the frame i, the cut-out center coordinates indicating the azimuth, and the estimated distance.
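• A sketch of this estimation, assuming the X coordinate of the cut-out center on an equirectangular frame maps linearly to a compass-like angle and that a distance estimate is already available; the angle convention (0 = north, increasing clockwise) is an assumption, not taken from the patent.

```python
import math

def estimate_subject_position(shoot_xy, center_x, frame_width, distance):
    """Project from the shooting point along the azimuth implied by the
    cut-out center's X coordinate, by the estimated distance."""
    azimuth = 2.0 * math.pi * (center_x / frame_width)
    return (shoot_xy[0] + distance * math.sin(azimuth),   # east-west
            shoot_xy[1] + distance * math.cos(azimuth))   # north-south

print(estimate_subject_position((10.0, 20.0), center_x=1024,
                                frame_width=4096, distance=50.0))
# -> approximately (60.0, 20.0): due east of the shooting point
```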
  • the reproducing device 200 proceeds to step S48 after step S47.
• In step S48, the playback device 200 performs a field-of-view image switching process.
• The field-of-view image switching process in step S48 will be specifically described with reference to FIG. 19.
• First, the azimuth data analysis unit 214 determines whether or not extended azimuth data is included in the reproduction control data Vi (step S481).
• If the playback device 200 determines that the reproduction control data Vi does not include extended azimuth data, it ends step S48; if it determines that the reproduction control data Vi includes extended azimuth data, the process proceeds to step S482.
• In step S482, the image cutout unit 215 performs the following processing on each extended azimuth data included in the reproduction control data Vi.
• Namely, the image cutout unit 215 obtains the cut-out center coordinates included in the target extended azimuth data from the azimuth data analysis unit 214, cuts out the field-of-view image from the frame i with reference to the obtained cut-out center coordinates, and displays the cut-out field-of-view image as a thumbnail.
• Also in step S482, the image cutout unit 215 displays on the map a broken line connecting the thumbnail 33 of the field-of-view image and the symbol 32 indicating the position of the subject in that field-of-view image.
• In step S483, the azimuth data analysis unit 214 determines whether the operation receiving unit 230 has received an operation for selecting any thumbnail.
• If it determines that the operation receiving unit 230 has not received an operation for selecting any thumbnail, the playback device 200 ends step S48; if it determines that such an operation has been received, the process proceeds to step S484.
• In step S484, the azimuth data analysis unit 214 selects the extended azimuth data corresponding to the selected thumbnail 33 from the one or more extended azimuth data related to the frame i, and the process proceeds to step S485.
• In step S485, the azimuth data analysis unit 214 extracts the cut-out center coordinates from the selected extended azimuth data, and outputs the extracted cut-out center coordinates to the image cutout unit 215.
• In step S486, the image cutout unit 215 refers to the cut-out center coordinates output in step S485, cuts out the field-of-view image from the frame i, and displays the cut-out field-of-view image.
• In step S487, the image cutout unit 215 displays a thick frame surrounding the selected thumbnail 33 (a thick frame indicating that the field-of-view image corresponding to the thumbnail 33 is being displayed as a result of the user selecting the thumbnail 33).
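• The switching behavior of steps S44A and S48 can be sketched as follows, again using hypothetical record keys for the azimuth data.

```python
def second_mode_center(vi, selected_thumbnail=None):
    """Default view: the basic azimuth data's center (step S44A); after a
    thumbnail is selected, the corresponding extended azimuth data's
    center is used instead (steps S484-S486)."""
    basic = next(d for d in vi if d['id'] == 'basic')
    extended = [d for d in vi if d['id'] == 'extended']
    if selected_thumbnail is not None and extended:
        return extended[selected_thumbnail]['center']
    return basic['center']

vi = [{'id': 'basic', 'center': (2048, 1024)},
      {'id': 'extended', 'center': (512, 900)}]
print(second_mode_center(vi))                        # -> (2048, 1024)
print(second_mode_center(vi, selected_thumbnail=0))  # -> (512, 900)
```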
• In addition, the map display processing unit 212 may display the symbol 30 indicating the shooting point of the frame i large while the default field-of-view image is displayed, and small while the field-of-view image designated by the user is displayed.
• In step S482, the image cutout unit 215 may perform the following processing instead of the processing of displaying the thumbnails.
  • the image cutout unit 215 may display a button corresponding to the orientation indicated by the extended orientation data for each extended orientation data included in the reproduction control data Vi.
  • the image cutout unit 215 may display a plurality of the above buttons side by side on the edge of the display screen.
• When any of the buttons is pressed, the image cutout unit 215 may display a field-of-view image of the orientation corresponding to the pressed button.
• Alternatively, the image cutout unit 215 may display a pull-down menu from which the field-of-view image indicated by any one of the one or more extended azimuth data included in the reproduction control data Vi can be selected for display.
• In this way, the viewer can confirm the shooting position of the current frame and the playback time of the current frame by viewing the map screen.
• In addition, the viewer can display a desired field-of-view image while confirming the thumbnails of the field-of-view images, the positions of the subjects shown in them, and the orientations in which those subjects are viewed from the shooting point.
• The playback device 200 is widely applicable not only to playback devices such as the television and digital video recorder described above, but also to devices that handle moving images, such as digital cameras, digital camcorders, portable movie players, mobile phones, car navigation systems, portable DVD players, and PCs.
• The playback device according to the present invention is not limited to a playback device including a display; a playback device that does not include a display and displays a moving image on an external display is also included in the scope of the playback device according to the present invention.
• Although the playback device 200 described above has both the first mode and the second mode, the playback device according to the present invention is not limited to such a configuration.
  • a playback apparatus that performs only one of the first mode operation and the second mode operation is also included in the scope of the playback apparatus according to the present invention.
  • the playback device 200 may play back an omnidirectional video as follows.
• Namely, the playback device 200 may include a virtual dome-shaped screen onto which the omnidirectional video is pasted as a movie texture, and a virtual camera that is arranged at the center of the screen and whose orientation and position can be changed; a certain area of the display screen may be used for one display, and the portion of the omnidirectional video reflected in the virtual camera may be displayed in another area of the display screen.
• In that case, the playback device 200 may identify, from the cut-out center coordinates included in the basic azimuth data, the direction in which the virtual camera should face, and set the orientation of the virtual camera so that it faces the identified direction.
• Alternatively, the basic azimuth data may include, instead of the cut-out center coordinates, data indicating the direction in which the virtual camera should face, and the playback device 200 may set the orientation of the virtual camera with reference to that data.
• In that case, in step S2, the data generation device 100 accepts an operation for designating the orientation of the virtual camera instead of the operation for inputting the cut-out center coordinates, and generates basic azimuth data that includes the data generated by that operation instead of the cut-out center coordinates.
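• Deriving the virtual camera's orientation from the cut-out center coordinates can be sketched as follows, assuming an equirectangular mapping in which X spans 360° of yaw and Y spans 180° of pitch; these conventions are assumptions, not taken from the patent.

```python
def camera_angles_from_center(center_xy, frame_w, frame_h):
    """Map a cut-out center on an equirectangular frame to yaw/pitch."""
    cx, cy = center_xy
    yaw = 360.0 * cx / frame_w - 180.0    # -180 .. +180 degrees
    pitch = 90.0 - 180.0 * cy / frame_h   # +90 (up) .. -90 (down)
    return yaw, pitch

print(camera_angles_from_center((1024, 512), frame_w=4096, frame_h=2048))
# -> (-90.0, 45.0)
```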
  • FIG. 21 is a schematic diagram of the free viewpoint moving image processing system according to the present embodiment.
• The free viewpoint video processing system 1 (hereinafter, "system 1") according to the present embodiment includes the data generation device 100 according to the first embodiment and the playback device 200 according to the second embodiment.
  • the data generation device 100 generates free viewpoint video data using the method described in the first embodiment, and the playback device 200 reads the free viewpoint video data and uses the method described in the second embodiment. Play an omnidirectional video.
  • the method of passing free viewpoint moving image data from the data generation device 100 to the playback device 200 may be a method using broadcasting or communication, or a method using a removable recording medium as a medium.
• The system 1 may be owned by one user, or may be shared by a first user who owns the data generation device 100 and a second user who owns the playback device 200.
• In the latter case, the first user may create free viewpoint video data using the data generation device 100 of the system 1, and the second user may browse the omnidirectional video using the playback device 200 of the system 1.
• The control blocks of the data generation device 100 and the playback device 200 may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
• In the latter case, the data generation device 100 and the playback device 200 each include a CPU that executes the instructions of a program, i.e., software that implements each function, and a recording medium such as a ROM in which the program and various data are recorded so as to be readable by a computer (or the CPU).
• The object of the present invention is achieved when the computer (or CPU) reads the program from the recording medium and executes it.
• As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • the present invention can also be realized in the form of a data signal embedded in a carrier wave in which the program is embodied by electronic transmission.
• A data generation device according to Aspect 1 of the present invention includes a data generation unit (for example, the basic azimuth data generation unit 111 and the extended azimuth data generation unit 112) that generates, for all or part of the frames of an omnidirectional video generated by moving shooting, a plurality of data (for example, basic azimuth data and extended azimuth data) to be referred to by a playback device in order to cut out a plurality of display target images from the target frame.
• The data generation unit generates, as the plurality of data, a plurality of azimuth data by generating, for each of a plurality of mutually different azimuths, azimuth data indicating that azimuth; each azimuth data is data to be referred to by the playback device in order to cut out, as a display target image, an image showing the state of the field of view when the azimuth indicated by the azimuth data is viewed from the shooting point of the target frame.
  • the omnidirectional video refers to a video in which all or almost all directions from each shooting point on the moving shooting shooting route are shown. Further, it is assumed that the target frame is omnidirectional still image data shot at a certain shooting point.
• According to the above configuration, the data generation device generates a plurality of azimuth data for cutting out from the target frame an image showing the state of the field of view when viewing a certain direction from the shooting point, and an image showing the state of the field of view when viewing another direction from the shooting point.
  • the playback device cuts out these images from the target frame with reference to the plurality of azimuth data, and plays back these images in a period in which these images should be played back.
• It can therefore be said that the data generation device has the effect of generating data that allows a viewer (a user of the playback device) of the playback video cut out from the omnidirectional video data generated by moving shooting to confirm, at substantially the same time, the states of the fields of view when viewing a plurality of different directions from the same shooting point.
• In the data generation device according to Aspect 2 of the present invention, in Aspect 1, the data generation unit (for example, a portion including the basic azimuth data generation unit 111, the extended azimuth data generation unit 112, and the reproduction control data generation unit 115) generates control data (for example, reproduction control data) including the plurality of azimuth data, and the data generation unit may generate, as the control data, control data that includes, for each of the plurality of azimuth data, identification information for identifying that azimuth data from the other azimuth data.
• According to the above configuration, the data generation device has the further effect of enabling the playback device to specify which of the plurality of azimuth data a given azimuth data is, simply by referring to that azimuth data.
• In the data generation device according to Aspect 3 of the present invention, in Aspect 1 or 2, the data generation unit generates control data including the plurality of azimuth data, and the data generation unit may generate control data that includes, among the plurality of azimuth data, specific azimuth data (for example, basic azimuth data) that the playback device should refer to most preferentially in order to cut out the image.
• Here, it is assumed that, among the plurality of images that can be cut out from the target frame using the plurality of azimuth data, the specific image is the image that the user of the data generation device most wants the user of the playback device to view.
• According to the above configuration, by generating, as the specific azimuth data, the azimuth data for the playback device to cut out the specific image from the target frame, the data generation device can make it easier for the specific image to catch the user's eye.
• In the data generation device according to Aspect 4 of the present invention, in any one of Aspects 1 to 3, the data generation unit may generate display order data (for example, priority display order data) indicating, for each of the plurality of azimuth data, the display order of the image that the playback device cuts out with reference to that azimuth data, so that the playback device displays the plurality of images such that an image with a relatively high display order is displayed relatively early during the period of the target frame.
• According to the above configuration, the data generation device has the additional effect of being able to cause the user of the playback device to view the plurality of images in the order in which the user of the data generation device wants them to be viewed.
• In the data generation device according to Aspect 5 of the present invention, in any one of Aspects 1 to 4, the omnidirectional video is a video generated by shooting while moving along a predetermined route, and the data generation device may generate map data representing a map of the region where the route is located.
• According to the above configuration, the data generation device has the additional effect that the user of the playback device can check the map of the area where the omnidirectional video was shot.
• A playback device according to Aspect 6 of the present invention includes: a data reference processing unit (for example, the azimuth data analysis unit 214) that refers, for all or part of the frames of an omnidirectional video generated by moving shooting, to a plurality of data to be referred to in order to cut out a plurality of display target images from the target frame, the plurality of data being a plurality of azimuth data indicating mutually different azimuths, each azimuth data being data to be referred to by the data reference processing unit in order to cut out, as a display target image, an image showing the state of the field of view when the azimuth indicated by the azimuth data is viewed from the shooting point of the target frame; and a reproduction processing unit (for example, the image cutout unit 215) that reproduces, for all or part of the frames, the plurality of images cut out from the target frame.
  • the omnidirectional video refers to a video in which all or almost all directions from each shooting point on the moving shooting shooting route are shown. Further, it is assumed that the target frame is omnidirectional still image data shot at a certain shooting point.
• As described above, the data generation device generates a plurality of azimuth data for cutting out from the target frame an image showing the state of the field of view when viewing a certain direction from the shooting point, and an image showing the state of the field of view when viewing another direction from the shooting point.
  • the playback device cuts out these images from the target frame with reference to the plurality of azimuth data, and plays back these images in a period in which these images should be played back.
• It can therefore be said that the playback device has the effect of allowing a viewer (a user of the playback device) of the playback video cut out from the omnidirectional video data generated by moving shooting to confirm, at substantially the same time, the states of the fields of view when viewing a plurality of different directions from the same shooting point.
• In the playback device according to Aspect 7 of the present invention, in Aspect 6, the data reference processing unit refers to control data including the plurality of azimuth data, and the control data referred to by the data reference processing unit may include, among the plurality of azimuth data, specific azimuth data, which is the azimuth data to be referred to most preferentially by the device in order to cut out the image.
• Here, it is assumed that, among the plurality of images that can be cut out from the target frame using the plurality of azimuth data, the specific image is the image that the producer of the omnidirectional video most wants the user of the playback device to view, and that the specific image is the image cut out using the specific azimuth data.
• According to the above configuration, the playback device has the additional effect of being able to reproduce the omnidirectional video in such a manner that the specific image easily catches the user's eye.
• In the playback device according to Aspect 8 of the present invention, in Aspect 6 or 7, the data reference processing unit refers to display order data indicating, for each of the plurality of azimuth data, the display order of the image that the device cuts out with reference to that azimuth data, and the reproduction processing unit may reproduce the plurality of images such that an image with a relatively high display order is displayed relatively early during the period of the target frame.
• According to the above configuration, the playback device has the additional effect that the user can browse the plurality of images in the order in which the creator of the omnidirectional video wants the user to browse them.
• In the playback device according to Aspect 9 of the present invention, in any one of Aspects 6 to 8, the omnidirectional video is a video generated by shooting while moving along a predetermined route, the playback device further includes a map display processing unit (for example, the map display processing unit 212) that displays a map (for example, the map 29) of the region where the predetermined route is located, and the map display processing unit may display on the map, for each frame of the omnidirectional video, information (for example, the symbol 30) indicating the shooting point of the target frame.
• According to the above configuration, the playback device has the additional effect of being able to make the user grasp the shooting point of each frame on the map.
• In the playback device according to Aspect 10 of the present invention, in Aspect 9, the map display processing unit may display on the map, for each frame of the omnidirectional video, information indicating the shooting point of the target frame and information (for example, the symbol 32) indicating the position of the subject in the image that the reproduction processing unit cuts out from the target frame during the period of the target frame.
• According to the above configuration, the playback device has the additional effect of being able to make the user grasp the approximate position of the subject (an inanimate object such as a building) shown in the field-of-view image when that field-of-view image is displayed.
  • the present invention can also be configured as follows.
• (Configuration 1) An image data generation device to which omnidirectional image data having a 360° field of view, shot using an omnidirectional camera while moving along a predetermined route, and map data representing a map of the route are input, and which generates, from the input data, free viewpoint image data enabling reproduction of an image in an arbitrary line-of-sight direction from a shooting position corresponding to coordinates on the map data, the device including:
• a basic line-of-sight direction data generation unit that generates basic line-of-sight direction data, which is initial line-of-sight direction data, for each frame of the omnidirectional image data;
• an extended line-of-sight direction data generation unit that generates, for each frame of the omnidirectional image data, at least one extended line-of-sight direction data, which is line-of-sight direction data different from the basic line-of-sight direction data;
• a map position data generation unit that generates map position data obtained by converting the shooting position of each frame of the omnidirectional image data into coordinates on the map data; and
• a line-of-sight direction control data generation unit that generates line-of-sight direction control data including the basic line-of-sight direction data and the extended line-of-sight direction data for each frame of the omnidirectional image data,
• wherein the basic line-of-sight direction data and the plurality of extended line-of-sight direction data each include information for identifying the data and information on the line-of-sight direction used when displaying the omnidirectional image.
• (Configuration 2) The image data generation device according to Configuration 1, wherein, when the basic line-of-sight direction data and at least one extended line-of-sight direction data are included for each frame of the omnidirectional image data, the line-of-sight direction control data includes information on a ranking indicating which line-of-sight direction data is to be displayed preferentially.
• (Configuration 4) An image data reproducing device including:
• a separation unit that separates the omnidirectional image data, the map data, and the line-of-sight direction control data from the free viewpoint image data;
• a line-of-sight direction control unit that selects at least one basic line-of-sight direction data or extended line-of-sight direction data from the line-of-sight direction control data, and outputs the cut-out center coordinates of the selected line-of-sight direction data;
• an image cutout unit that obtains the cut-out center coordinates and generates a display image by cutting out part of the omnidirectional image data around the cut-out center coordinates; and
• a display unit that obtains and displays the display image.
• (Configuration 5) The image data reproducing device according to Configuration 4, wherein the line-of-sight direction control unit outputs the cut-out center coordinates of a plurality of line-of-sight direction data for the same frame of the omnidirectional image data, and notifies the display unit of which of the display images simultaneously generated by the image cutout unit using those cut-out center coordinates is the main screen.
  • the present invention can be suitably applied to a device that distributes an omnidirectional video and a device that plays back an omnidirectional video.
• 100 Free viewpoint video data generation device (data generation device)
• 110 Control unit
• 111 Basic azimuth data generation unit (data generation unit)
• 112 Extended azimuth data generation unit (data generation unit)
• 115 Reproduction control data generation unit (data generation unit)
• Multiplexing processing unit (distribution data generation processing unit)
• 200 Free viewpoint video playback device (playback device)
• Control unit
• 212 Map display processing unit (map display processing unit)
• 214 Azimuth data analysis unit (data reference processing unit)
• 215 Image cutout unit (reproduction processing unit)

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The present invention enables a viewer of display target moving images cut out from an omnidirectional moving image generated by moving shooting to confirm the state of each field of view at substantially the same time when a plurality of different directions are viewed from the same shooting point. A data generation device (100) includes a basic azimuth data generation unit (111) for generating, for each frame of an omnidirectional moving image, basic azimuth data to be referred to by a playback device in order to cut out a display target image from the frame, and an extended azimuth data generation unit (112) for generating extended azimuth data to be referred to by the playback device for the same purpose.
PCT/JP2016/061159 2015-04-17 2016-04-05 Dispositif de production de données et dispositif de reproduction WO2016167160A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-085406 2015-04-17
JP2015085406 2015-04-17

Publications (1)

Publication Number Publication Date
WO2016167160A1 true WO2016167160A1 (fr) 2016-10-20

Family

ID=57126136

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/061159 WO2016167160A1 (fr) 2015-04-17 2016-04-05 Dispositif de production de données et dispositif de reproduction

Country Status (1)

Country Link
WO (1) WO2016167160A1 (fr)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013183249A (ja) * 2012-03-01 2013-09-12 Dainippon Printing Co Ltd Moving image presentation device
JP2014132325A (ja) * 2012-12-04 2014-07-17 Nintendo Co Ltd Information processing system, information processing device, program, and display method
JP2014228952A (ja) * 2013-05-20 2014-12-08 政人 矢川 Information providing system, method, and program


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021114787A (ja) * 2017-07-04 2021-08-05 キヤノン株式会社 Information processing apparatus, information processing method, and program
JP7087158B2 (ja) 2017-07-04 2022-06-20 キヤノン株式会社 Information processing apparatus, information processing method, and program
CN110892361A (zh) * 2017-07-19 2020-03-17 三星电子株式会社 Display device, control method of display device, and computer program product thereof
JP7459195B2 (ja) 2018-03-09 2024-04-01 キヤノン株式会社 Generation device, generation method, and program
CN111739121A (zh) * 2020-06-08 2020-10-02 北京联想软件有限公司 Method, apparatus, device, and storage medium for drawing virtual lines

Similar Documents

Publication Publication Date Title
US10778951B2 (en) Camerawork generating method and video processing device
US10271082B2 (en) Video distribution method, video reception method, server, terminal apparatus, and video distribution system
US10691202B2 (en) Virtual reality system including social graph
US10805592B2 (en) Apparatus and method for gaze tracking
JP6558587B2 (ja) 情報処理装置、表示装置、情報処理方法、プログラム、および情報処理システム
JP6309749B2 (ja) 画像データ再生装置および画像データ生成装置
US8730354B2 (en) Overlay video content on a mobile device
KR101482025B1 (ko) 증강 현실 프리젠테이션
KR101210315B1 (ko) 3차원 비디오 위에 그래픽 객체를 오버레이하기 위한 추천 깊이 값
JP2015187797A (ja) 画像データ生成装置および画像データ再生装置
US10623792B1 (en) Dynamic generation of on-demand video
JP2014215828A (ja) 画像データ再生装置、および視点情報生成装置
JP2013505636A (ja) 双方向テレビに対しハイパーリンクされた3dビデオインサート
WO2016167160A1 (fr) Dispositif de production de données et dispositif de reproduction
US10051342B1 (en) Dynamic generation of on-demand video
US20230018560A1 (en) Virtual Reality Systems and Methods
WO2020206647A1 (fr) Procédé et appareil pour commander, au moyen du suivi du mouvement d'utilisateur, la lecture d'un contenu vidéo
KR102140077B1 (ko) 서버, 사용자 단말 장치 및 그 제어 방법
WO2021015035A1 (fr) Appareil de traitement d'images, système de restitution d'images et procédé de traitement d'images
JP6934052B2 (ja) 表示制御装置、表示制御方法及びプログラム
KR102084970B1 (ko) 가상현실 관람 방법 및 가상현실 관람 시스템
JP2020101847A (ja) 画像ファイル生成装置及び画像ファイル生成方法、画像生成装置及び画像生成方法、画像生成システム、並びにプログラム
US11287658B2 (en) Picture processing device, picture distribution system, and picture processing method
JP2022545880A (ja) コードストリームの処理方法、装置、第1端末、第2端末及び記憶媒体
US11863902B2 (en) Techniques for enabling high fidelity magnification of video

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16779951

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 16779951

Country of ref document: EP

Kind code of ref document: A1