WO2011121920A1 - Imaging control device, immersive position information generation device, imaging control method, and immersive position information generation method - Google Patents
- Publication number
- WO2011121920A1 (PCT/JP2011/001555)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- imaging
- optical systems
- virtual viewpoint
- subject
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
- H04N5/2226—Determination of depth image, e.g. for foreground/background separation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/296—Synchronisation thereof; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
Definitions
- the present invention relates to an imaging control device that captures images to be fused (viewed stereoscopically) by a user, an immersive position information generating device, an imaging control method, and an immersive position information generating method.
- in conventional systems, a stereo-effect adjustment mechanism uses a drive motor to match the crossing position of the optical axes of the left and right video cameras to the position of the subject, or controls, from zoom magnification information, the baseline length and convergence angle of the left and right imaging optical systems of a stereoscopic imaging device, so as to create appropriate binocular parallax (see Patent Documents 1 to 4).
- it is also disclosed that a value obtained by multiplying the baseline length before the zoom magnification change by the inverse of the zoom magnification change is set as the new baseline length, and that the convergence angle is set so that the intersection of the optical axes of the left and right imaging optical systems remains at the same position as before the baseline length change, thereby suppressing the shift amount so that it does not differ substantially from before the zoom change and promoting natural fusion of the observer's stereoscopic image (see Patent Document 5).
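The Patent Document 5 rule described above can be sketched numerically as follows. This is an illustrative sketch only: the function and variable names are not taken from the patent, and the convergence angle is modelled with a common isosceles-triangle approximation.

```python
import math

def patent5_adjustment(baseline_m, distance_m, zoom_change):
    """Sketch of the Patent Document 5 rule: the baseline is multiplied by
    the inverse of the zoom-magnification change, and the convergence angle
    is chosen so the optical axes still cross at the unchanged subject
    position. All names here are illustrative, not from the patent."""
    new_baseline = baseline_m * (1.0 / zoom_change)
    # Each axis is toed in by half the convergence angle so that both axes
    # intersect at the original subject distance.
    convergence_angle = 2.0 * math.atan((new_baseline / 2.0) / distance_m)
    return new_baseline, convergence_angle

b, theta = patent5_adjustment(baseline_m=0.06, distance_m=10.0, zoom_change=2.0)
# Doubling the zoom halves the baseline (6 cm -> 3 cm), which reduces the
# convergence angle and thereby suppresses the parallax shift.
```

Under this rule the parallax shrinks as the zoom grows, which is the opposite of what an approaching observer would experience; this is the limitation the immersive zoom below addresses.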
- natural fusion means that the observer perceives the right-eye video and the left-eye video as a single three-dimensional image.
- however, the adjustment method described above, while effective when shooting 3D images at short to medium distances, may not be desirable for broadcast images shot at medium to long distances.
- in computer graphics (CG), an observer's parallax from an arbitrary viewpoint can be set when rendering an image with parallax.
- a zoom in which the observer appears to approach the target object (hereinafter referred to as an immersive zoom for distinction, including the reverse operation when zooming out) is therefore also possible.
- here, an immersive zoom means a zoom method that changes the parallax in addition to the enlargement performed by a normal zoom, so that the viewer appears to actually move closer to (or farther from) the subject. In other words, compared with a zoom that merely enlarges a distant solid (a feeling like looking through binoculars), it gives the observer the sense of approaching the object (a feeling that accompanies movement of the viewpoint).
- in a real environment, the convergence angle should increase as the object is approached, but as shown by the broken line in FIG. 11, in an image obtained by a simple enlargement zoom this is not so: the convergence angle does not change, or changes only slightly. The observer therefore feels uncomfortable with visual information that differs from ordinary experience, which makes fusion difficult, and at playback time the observer's viewpoint position becomes unclear to the observer.
- an object of the present invention is to provide an imaging control apparatus, an immersive position information generation apparatus, an imaging control method, and an immersive position information generation method that capture images realizing an enlargement zoom with an immersive feeling, by matching the virtual viewpoint to the zoom position while keeping the focus position fixed.
- according to the present invention, there is provided an imaging control apparatus that captures at least two images used for causing a user to recognize, as a solid, a subject located a predetermined distance or more away from the imaging point. The apparatus comprises: imaging means having at least two optical systems arranged at an interval of a predetermined baseline length, for imaging the subject; calculating means that, taking the zoom magnification of the at least two optical systems arranged at the baseline-length interval as a reference zoom magnification, calculates an immersion distance based on the reference zoom magnification, a desired zoom magnification relative to that reference, and the distance from the imaging means to the subject, the immersion distance being the distance from the virtual position at which the imaging means would have to be placed to realize the desired zoom magnification to the actual position of the imaging means, and that calculates, based on the calculated immersion distance, the interval between the at least two optical systems when they are arranged at that virtual position; and control means that arranges the at least two optical systems by changing the predetermined baseline length to the calculated interval. The apparatus images the subject with the at least two optical systems arranged by the control means.
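The claimed calculation can be sketched as follows, under two stated assumptions that are mine rather than the patent's: a subject at distance d viewed at magnification m appears as it would from distance d/m, and the convergence angle is modelled as the apex of an isosceles triangle. All names are illustrative.

```python
def immersion_geometry(subject_distance_m, desired_zoom, ref_zoom=1.0,
                       ref_baseline_m=0.06):
    """Illustrative sketch of the claimed calculation (names are mine).
    The virtual position is where a 1x view would show the subject at the
    desired magnification; the immersion distance is how far that position
    lies in front of the real camera."""
    magnification = desired_zoom / ref_zoom
    # Camera-to-subject distance if the camera stood at the virtual position.
    virtual_distance = subject_distance_m / magnification
    immersion_distance = subject_distance_m - virtual_distance
    # Interval between the optical systems chosen so that, from the real
    # position, the convergence angle on the subject equals the angle an
    # observer with the reference baseline would have at the virtual
    # position:  tan(theta/2) = (b/2)/d  must equal  (e/2)/(d/m)  =>  b = m*e
    interval = magnification * ref_baseline_m
    return immersion_distance, interval
```

Under this model the required interval simply scales with the magnification, which is consistent with the numerical example given later in the text.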
- according to the present invention, there is also provided an immersive position information generating apparatus that generates, based on information from the imaging control device that captures at least two images used for causing a user to recognize as a three-dimensional object a subject located a predetermined distance or more away from the imaging point, information for recognizing the subject three-dimensionally. The apparatus comprises receiving means for receiving information on the user's virtual viewpoint position from the imaging control apparatus, and generation means for generating video information in which the received virtual viewpoint position information is added to the captured video from the imaging control apparatus.
- according to the present invention, there is provided an imaging control method for capturing at least two images used for causing a user to recognize, as a three-dimensional object, a subject located a predetermined distance or more away from the imaging point. Taking as a reference zoom magnification the zoom magnification of at least two optical systems arranged at an interval of a predetermined baseline length, the method calculates an immersion distance based on the reference zoom magnification, a desired zoom magnification relative to that reference, and the distance from the imaging means comprising the at least two optical systems to the subject, the immersion distance being the distance from the virtual position at which the imaging means would have to be placed to realize the desired zoom magnification to the actual position of the imaging means. Based on the calculated immersion distance, the method calculates the interval between the at least two optical systems when they are arranged at the virtual position on an extension of the line connecting their actual positions, arranges them at the calculated interval, and includes an imaging step of imaging the subject in that arrangement. With this configuration, an enlargement zoom with an immersive feeling can be realized.
- according to the present invention, there is also provided an immersive position information generation method for generating, based on information from the imaging control device that captures at least two images used for causing a user to recognize the subject as a three-dimensional object, information for recognizing the subject three-dimensionally, the method comprising a receiving step of receiving information on the user's virtual viewpoint position from the imaging control device, and a generation step of generating video information in which the received virtual viewpoint position information is added to the captured video from the imaging control device.
- the imaging control apparatus, immersive position information generation apparatus, imaging control method, and immersive position information generation method of the present invention, which can capture three-dimensional images, realize an enlargement zoom with an immersive feeling.
- diagram outlining the embodiment of the present invention; diagram illustrating an example of how the virtual viewpoint position is notified in the embodiment of the present invention
- diagram illustrating an example to which the imaging control apparatus and the immersive position information generation apparatus according to the embodiment of the present invention are applied; diagram illustrating another such example
- block diagram showing an example configuration of the immersive position information generation apparatus according to the embodiment of the present invention; flowchart showing an example of the processing flow in the imaging control apparatus according to the embodiment of the present invention
- the shooting magnification and viewpoint settings at the time the zoom operation starts are used as the reference.
- the magnification is 1×.
- the installation position of the optical systems is taken as the observer's viewpoint.
- the baseline length is set to the distance between the eyes of a typical viewer.
- here, the “baseline length” is the interval between the optical systems (for example, a default value); when it equals the separation of typical human eyes, it is the interval at which an image captured in that state can be viewed stereoscopically by an observer.
- the focus position of the optical system is set to the position of the subject in a composition in which the subject is captured near the center of the screen.
- FIG. 1 shows subject images 103 and 104 photographed by the optical systems 101 and 102 from the positions of the optical systems 101 and 102, respectively.
- the positions of the optical systems 101 and 102 refer to virtual viewpoints determined from parallax in the composition before zooming.
- the parallax in the composition before zooming refers to the difference in the appearance of the right-eye video and the left-eye video.
- the virtual viewpoint refers to the viewpoint position at which the observer is assumed to be located when the zoom magnification is changed: the position from which the subject, viewed at the reference magnification, would appear at the size the observer now perceives.
- in this embodiment, taking the positions (viewpoints) of the optical systems 101 and 102 as the reference, the convergence angle is controlled to be the angle that would arise if the observer's viewpoint (virtual viewpoint) were moved to the position corresponding to the zoom magnification of the screen enlargement (immersive zoom).
- the convergence angle refers to an angle formed by the two optical systems 101 and 102 with respect to the focal position.
- specifically, the convergence angle formed with respect to the focal position is made to coincide with the (increased) convergence angle that would be formed if the viewpoint were at the virtual viewpoint position corresponding to the magnification of the enlargement zoom. Images 105 and 106 captured in this manner are shown in FIG. 1.
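The convergence angle can be modelled with a common isosceles-triangle approximation (this formula is an assumption, not given in the patent text). The sketch below also shows that widening the real baseline in proportion to the zoom reproduces the virtual viewpoint's angle exactly under this model:

```python
import math

def convergence_angle(baseline_m, focal_distance_m):
    """Angle formed at the focal position by the two optical systems,
    modelled as the apex of an isosceles triangle (an approximation;
    the patent does not state a formula)."""
    return 2.0 * math.atan((baseline_m / 2.0) / focal_distance_m)

# Reference state: 6 cm baseline, subject 10 m away.
ref_angle = convergence_angle(0.06, 10.0)
# Virtual viewpoint for a 4x enlargement: observer 2.5 m from the subject.
virtual_angle = convergence_angle(0.06, 2.5)
# The immersive zoom widens the real cameras' baseline (here 4x) until
# their angle on the subject equals virtual_angle.
widened_angle = convergence_angle(0.06 * 4, 10.0)
```

Here `virtual_angle` is larger than `ref_angle`, matching the statement that the convergence angle must increase, and `widened_angle` coincides with it.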
- the direction of an optical system refers to its rotation angle about the vertical axis.
- note that part of the change that would otherwise be made by increasing the baseline length can be obtained from the convergence angle formed by controlling the rotation direction of each optical system 101, 102. That is, if the convergence angle can be increased by rotating the optical systems 101 and 102, control can be performed by rotation alone, or in combination with control of the baseline length. In other words, when each of the plural optical systems (at least two) can rotate by itself, the virtual viewpoint position calculation unit 704 described later, when calculating the baseline length (interval), takes into account the convergence angle obtainable by rotating the optical systems.
- because the change in composition before and after the immersive zoom varies with the shooting scene, the baseline length to apply and the rotation angle of the optical systems may differ, depending largely on the distance to the focal position, the virtual viewpoint positions before and after zooming, and the position of the main subject on the screen (its displacement from the screen center). In every case, however, the convergence angle is controlled to increase according to the virtual viewpoint position to which the observer's viewpoint moves after zooming, so any control method that achieves this has a similar effect.
- note that the change in composition before and after the immersive zoom includes the case where the composition does not change, in the sense that the subject remains captured at the center of the screen.
- when the change during zooming can be ignored (for example, when it can be cut away by switching to a camera at another viewpoint), or when switching between fixed virtual viewpoint positions, the required baseline length may be secured by using an image from another optical system.
- the other optical system may also be configured as a separate camera unit.
- it is assumed here that the optical systems are at the same horizontal position; at least in the final images to be fused, natural fusion becomes difficult for the observer unless the composition and focus position match.
- This virtual viewpoint position is a viewpoint position obtained from the zoom position with respect to the focal position (distance to the subject to be zoomed) and the camera position (real viewpoint).
- FIG. 2 shows an example of how the virtual viewpoint position is notified.
- the virtual viewpoint position information is transmitted from a camera 201 (including an imaging control device) to a relay station (including a broadcasting station) 202 or the like, and from the relay station 202 or the like it is transmitted together with the shooting content to a receiving device 203 or the like. Note that the flow of the virtual viewpoint position information is not limited to this.
- by notifying the virtual viewpoint position, the position of the observer within the entire shooting area can be shown on a display or the like. For example, as shown in FIG. 3, when providing shooting content from a baseball stadium, where in the stadium the virtual viewpoint is (where the observer is supposed to be) when shooting the batter from the position 301 of the pitcher can be displayed on a display or the like. The camera position 302 is actually behind the baseball field.
- likewise, as shown in FIG. 4, the virtual viewpoint position 402 can be displayed on a display or the like when virtually shooting a player from a short distance by zooming. The camera position 403 is actually outside the playing field 401.
- similarly, the virtual viewpoint position can be displayed on a display or the like when shooting a player from the position of the referee, or shooting the moment of a shot from the goal position. In this way, by allowing the observer (viewer) to grasp his or her own position, a natural sense of presence is conveyed to the observer, and 3D viewpoint sickness (discomfort at the time of fusion, etc.) can be reduced.
- furthermore, the error between the virtual viewpoint position obtained from the camera settings (zoom magnification, etc.) and an evaluation value of the immersive feeling obtained from the video may be derived to correct the virtual viewpoint position information and change the camera settings. The evaluation value of the immersive feeling obtained from the video refers to the immersion position felt by the observer, that is, the position at which the observer is assumed to be present, as a result of controlling the optical systems, when the observer is regarded as having moved from the actual camera position (or the reference position). The discomfort and illusions felt by humans cannot always be resolved using only information derived from the camera's setting state; therefore, the actually captured parallax images are evaluated and the error is corrected.
- a baseball broadcast will be described as an example of video content.
- the actual camera position is behind the center back net, and the batter is photographed from that position (1x zoom).
- the distance (depth) from the camera position to the batter is, for example, 120 m, and the baseline length is, for example, 6 cm.
- when zooming in, the virtual viewpoint position becomes 30 m from the focus (the batter), that is, 90 m from the camera, and the baseline length becomes 24 cm (figures corresponding to a 4× zoom).
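The figures in this example are mutually consistent under the simple geometric model of a 1× view from the virtual position; the lines below merely check the arithmetic (the 4× factor is inferred from the numbers, not stated explicitly):

```python
# Checking the numbers in the baseball example above.
subject_distance = 120.0   # m, camera (behind the backstop net) to batter
ref_baseline = 0.06        # m (6 cm)

zoom = subject_distance / 30.0                # virtual viewpoint 30 m from batter -> 4x
immersion_distance = subject_distance - subject_distance / zoom   # 90 m from camera
new_baseline = ref_baseline * zoom            # 0.24 m, i.e. the 24 cm above
```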
- the virtual viewpoint position is notified to the relay station or the like.
- on the receiving side, a display notifying the viewpoint, for example a mound-view display 501 indicating that the viewpoint is near the mound, is shown on the TV screen 502.
- the virtual viewpoint position is displayed (camera display 503) as if a camera were at that position, and the position of the viewpoint within the stadium (the place where the viewer is supposed to be) can be broadcast as a superimposed graphic or the like.
- a text telop may also be displayed on the TV screen 502 to indicate from which point the view is taken.
- notification may be made by voice, announcement, data guidance, or the like.
- for the director (here including the director and the camera operator), displaying the position corresponding to the movement of the viewpoint by zooming in real time can serve as reference information for the production.
- note that the zoom control and the virtual viewpoint position control may be made independently changeable. Making them independent allows their use as an effect during production.
- for example, the operator can perform various effects using the zoom rate display bar 602 (including the current zoom display 603 and the current virtual viewpoint position display 604) shown on the monitor 601. Specifically, a special effect becomes possible by making the virtual viewpoint follow the camera zoom speed (the change in zoom amount) while keeping an offset. That is, the baseline length control unit 705 and the optical system control unit 703, described later, control the arrangement of the at least two optical systems so that the virtual viewpoint follows, with a delay, the change in zoom of the video captured by the imaging unit 701.
- for example, control is started so that the virtual viewpoint begins to move after a short delay, and its movement is then accelerated so that the virtual viewpoint catches up with the final zoom magnification.
- alternatively, by making the virtual viewpoint follow slowly, at a speed lower than the zoom speed, the change felt by the observer can be suppressed and the burden during viewing reduced.
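A delayed-follow behaviour like the one described can be sketched with a simple exponential lag. This is an illustrative model only; `follow_rate` is an assumed knob, not a parameter named in the patent.

```python
def lagged_viewpoint(zoom_targets, follow_rate=0.3):
    """Sketch of making the virtual viewpoint trail the zoom: each frame
    the viewpoint magnification moves only a fraction of the way toward
    the current zoom value (an exponential lag)."""
    viewpoint = zoom_targets[0]
    trail = []
    for z in zoom_targets:
        viewpoint += follow_rate * (z - viewpoint)  # partial catch-up per frame
        trail.append(viewpoint)
    return trail

# A zoom that jumps from 1x to 4x: the virtual viewpoint approaches 4x
# gradually, so the observer's apparent movement is smoothed.
trail = lagged_viewpoint([1.0] * 3 + [4.0] * 10)
```

A smaller `follow_rate` gives the slow-follow variant described above; ramping it up over time gives the delayed-then-accelerated variant.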
- because such combinations of zoom operation and virtual viewpoint movement may impair the observer's natural fusion, it is desirable that the director be able to manage the scenes in which, and the frequency with which, they are used as effects. Notifying the virtual viewpoint position so that the director (or, in some cases, the photographer) can confirm it is effective for this purpose.
- the imaging control apparatus 700 performs imaging while controlling the optical systems to change the virtual viewpoint position, as in an immersive zoom, and outputs the virtual viewpoint position set as a result of that control.
- components of the imaging control apparatus will be described.
- the image capturing unit 701 has a general camera function for capturing video.
- the present invention includes the function of a camera that uses a general optical system.
- the imaging unit 701 includes, for example, the optical systems 101 and 102 shown in FIG.
- the zoom magnification control unit 702 sends information for achieving the set zoom magnification to the optical system control unit 703, and reflects in its control the feedback information, including control results, received from the optical system control unit 703.
- the optical system control unit 703 cooperates with the zoom magnification control unit 702, the baseline length control unit 705, and the like, and controls the optical systems mechanically and in software in order to guide the image to be captured by the imaging unit 701. In addition, the optical system control unit 703 feeds back to each instruction source the result of control performed based on its instruction. Note that the optical system control unit 703 also performs other controls not described in detail in the present invention, such as control of the rotation direction of the optical systems, camera shake compensation, and brightness control.
- the virtual viewpoint position calculation unit 704 calculates, based on the reference zoom magnification, the current magnification relative to the reference magnification, and other optical system setting values, the immersion position (immersion distance): how far the viewpoint is regarded as having approached (or receded from) the focal position (for example, the position of the subject) from the camera reference position.
- the reference position of the camera refers to, for example, the actual position of the camera (or the actual position of the optical system).
- as the reference zoom magnification, it is desirable to use a state in which appropriate parallax is set (the baseline length corresponding to the distance between the observer's eyes) for an observer viewing from the reference position at 1× zoom (without enlargement or reduction).
- a simple example of setting this reference state: the camera installation position is taken as the observer's position, the zoom magnification of the optical systems is set to 1×, and the baseline length between the optical systems is set to the interocular distance of a typical observer.
- the immersion position is the position (and travel distance) to which the observer is assumed to have moved such that the subject would appear at its enlarged (or reduced) size, rather than the image merely having been enlarged (or reduced).
- when zooming in, the observer is assumed to move closer by the amount that makes the subject look larger than before enlargement (that is, to the immersion position); when the observer views the subject from that position, the convergence angle increases. The baseline length of the optical systems is therefore increased to obtain a corresponding increase in the convergence angle. Conversely, when zooming out, the convergence angle decreases, so the baseline length of the optical systems is reduced to obtain a corresponding decrease in the convergence angle.
- in this way, a state equivalent to viewing at 1× zoom from the virtual viewpoint is maintained. If the camera itself actually moves in the direction of the zoom operation (forward or backward), it is desirable either to correct the virtual viewpoint position so that the baseline length is changed with the amount of movement taken into account, or to associate the information necessary to map the virtual viewpoint position to the actual position.
- the virtual viewpoint position calculation unit 704 calculates the baseline length that yields appropriate parallax (convergence angle) when the virtual viewpoint is moved to the calculated desired immersion position, together with other optical system control amounts (rotation amount of the optical systems, light amount, etc.), and transmits them to the respective control systems.
- note that, depending on the degree and amount of change, the virtual viewpoint position may be varied with fluctuations, errors, or other variation patterns. It may also be changed according to a manual virtual viewpoint position input by the camera operator or the like.
- the baseline length control unit 705 controls the distance between the optical systems in order to realize the baseline length calculated by the virtual viewpoint position calculation unit 704.
- note that, as described above, the baseline length control unit 705 may control the arrangement (spacing) of the at least two optical systems so that the virtual viewpoint follows, with a delay, the change in zoom of the video captured by the imaging unit 701. This allows the zoom operation to be given character as an effect.
- the virtual viewpoint position output unit 706 acquires the calculated virtual viewpoint position and, as necessary, the control values of the optical systems. That is, it acquires information on the observer's virtual viewpoint position based on the calculated immersion distance and outputs it externally. For example, the virtual viewpoint position output unit 706 generates and outputs notification information in which the virtual viewpoint position is mapped against the focal position of the shooting target and the camera position. When only the virtual viewpoint position needs to be output, that is, for example, when the immersive position information generation apparatus described later can perform sufficient mapping using other information it holds, the virtual viewpoint position output unit 706 outputs the virtual viewpoint position as is.
- it is desirable that the virtual viewpoint position be output in real time in synchronization with the captured video (captured content) from the imaging unit 701, or include information that can be associated with a time stamp synchronized with the captured video. That is, when outputting the virtual viewpoint position information, it is output in synchronization with the captured video, or includes information that can be associated with the captured video's time stamp. This makes it easy to match the virtual viewpoint position to each scene of the video in real time, and to synchronize the information during recording (which virtual viewpoint applies at which timing).
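One hedged way to realize the time-stamp association described above is to log viewpoint records keyed by the video's time stamps; the record layout and all names below are assumptions for illustration, not structures defined in the patent.

```python
from dataclasses import dataclass

@dataclass
class ViewpointSample:
    """Illustrative record pairing a virtual viewpoint with the time stamp
    of the synchronized captured video (field names are assumed)."""
    timestamp_ms: int            # time stamp shared with the captured video
    distance_to_subject_m: float
    baseline_m: float

def viewpoint_at(samples, t_ms):
    """Return the most recent sample at or before t_ms (samples assumed
    sorted by time), so a given video frame can be matched to the virtual
    viewpoint that was in effect when it was captured."""
    current = samples[0]
    for s in samples:
        if s.timestamp_ms <= t_ms:
            current = s
    return current

# Example log: the viewpoint moves in as the zoom progresses.
log = [ViewpointSample(0,    120.0, 0.06),
       ViewpointSample(1000,  60.0, 0.12),
       ViewpointSample(2000,  30.0, 0.24)]
```

Emitting a record only when the viewpoint changes, rather than every frame, corresponds to the traffic-saving variant described in the next paragraph.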
- alternatively, the zoom and virtual viewpoint position information may be output as an additional data element only when the angle of view or composition changes, such as at a scene switch or zoom operation; that is, the virtual viewpoint position information is output at predetermined timings. This keeps traffic low when video, audio, and other data must be transmitted.
- information on the camera position and the focal position (the focal distance from the camera position) within the shooting area may also be output, enabling the virtual viewpoint to be mapped to the corresponding place in the overall shooting location.
- the immersive position information generation apparatus 800 receives the virtual viewpoint position output from the imaging control apparatus 700 via some communication interface. The virtual viewpoint position output from the imaging control apparatus 700 may, when other information is included, already be mapped to the positional relationship with the imaging target. The positional relationship with the imaging control apparatus 700 is arbitrary: both devices may be housed within the same equipment (for example, both built into the camera), or the apparatus may be connected remotely to the equipment in which the imaging control apparatus 700 resides.
- the immersive position information generation apparatus 800 converts the captured content, the virtual viewpoint position, and so on, as necessary, into a display form that is easy to understand for the observer referring to a virtual viewpoint position display apparatus.
- for example, information contrasting the virtual viewpoint with the zoom magnification is displayed.
- when the observer is the video producer or a supervisor, information on the state of change of the virtual viewpoint, or on the change pattern used in the production, is displayed.
- mapping information showing the virtual viewpoint position within the overall area is displayed.
- the virtual viewpoint position acquisition unit 801 receives information on the position of the virtual viewpoint from the imaging control apparatus 700. Depending on the configuration of the received information elements, in many cases the information must be converted into a form the observer can grasp intuitively (easily understand), so the received information is sent to the immersion position information notification unit 803.
- the optical system control value acquisition unit 802 receives additional information related to the position of the virtual viewpoint from the imaging control apparatus 700. As with the virtual viewpoint position itself, this information often needs to be converted into a form the observer can grasp intuitively, so the additional information is likewise sent to the immersion position information notification unit 803.
- for example, information such as a series of optical system control values with corresponding time stamps, or information indicating that the angle of view or composition has changed (a scene switch, a zoom operation, etc.), may be acquired. Information on the camera position and the focal position (the focal distance from the camera position) within the shooting area may also be acquired.
- when the virtual viewpoint position acquisition unit 801 can acquire all necessary information collectively as the virtual viewpoint position information, the virtual viewpoint position acquisition unit 801 can be said to also serve as the optical system control value acquisition unit 802.
- the immersion position information notification unit 803 receives information from the virtual viewpoint position acquisition unit 801 and the optical system control value acquisition unit 802 and notifies the observer of information related to the immersion position. At this time, for example, as shown in FIG. 5, video in which information related to the immersion position is added to the captured video may be generated. That is, the immersion position information notification unit 803 generates video information by adding other information from which the virtual viewpoint position can be recognized. Besides position information based on the virtual viewpoint, this can also be used for simple notification of the viewpoint (for example, a caption indicating whose viewpoint is currently shown).
- The notification may take whatever form suits the situation: mapping information in which a symbol indicating the immersion position is placed on a map or schematic diagram of the shooting area; a symbol or audio alert that simply indicates the immersion position has changed from its previous state; or information that an intermediary (a video editor, announcer, and so on) can relay as a spoken comment.
- Furthermore, the immersion position information notification unit 803 can combine information from the virtual viewpoint position acquisition unit 801 and the optical system control value acquisition unit 802 to make the notification easier for the observer to understand. For example, using information that can be matched against the time stamps of the captured video, the virtual viewpoint can be identified for each scene in real time, or located at any point during recording through synchronized metadata. That is, the immersion position information notification unit 803 attaches virtual viewpoint position information to the captured video based on information that corresponds to the video's time stamps.
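A minimal sketch of that timestamp association (the log format and function name are illustrative assumptions, not from the patent): each video frame is paired with the most recent virtual-viewpoint record at or before its timestamp.

```python
import bisect

def attach_viewpoints(frame_timestamps, viewpoint_log):
    """viewpoint_log: list of (timestamp, viewpoint) tuples sorted by
    timestamp.  Returns (frame_timestamp, viewpoint) pairs; frames that
    precede the first record get None."""
    times = [t for t, _ in viewpoint_log]
    paired = []
    for ts in frame_timestamps:
        i = bisect.bisect_right(times, ts) - 1  # latest record <= ts
        paired.append((ts, viewpoint_log[i][1] if i >= 0 else None))
    return paired
```

This supports both uses the text mentions: live matching (feed one frame timestamp at a time) and post-hoc lookup during playback of a recording.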
- Information indicating that the angle of view or composition has changed can be used to notify the observer of the virtual viewpoint switching timing while keeping transmission traffic low when video, audio, and other data must be transmitted together.
- For the observer (here, the camera operator, director, and so on), this form of notification is useful when the baseline is controlled differently for staging or other production purposes, rather than always having the immersive zoom follow the camera's zoom magnification.
- Mapping of the virtual viewpoint position onto the shooting area is performed by locating the focal point from the camera position and the focal distance.
- The control value information (including position information) required for this mapping varies with the shooting area. For example, when the notification must convey the spatial (three-dimensional) situation of the area, the camera's elevation angle is also required and may be one of the acquired items. Once information the observer can readily understand has been assembled, it is output to the observer.
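A hedged sketch of such a mapping (the flat-map geometry, names, and parameters are assumptions for illustration): the focal point lies along the camera's heading at the focal distance, the virtual viewpoint at the immersion distance, and a nonzero elevation angle shortens the ground-plane projection.

```python
import math

def map_to_shooting_area(camera_xy, heading_deg, focal_distance_m,
                         immersion_distance_m, elevation_deg=0.0):
    """Return (virtual_viewpoint_xy, focal_point_xy) on a 2-D map of the
    shooting area.  Both points sit on the line from the camera toward
    the focal point; looking up or down (elevation) foreshortens the
    ground-plane projection by cos(elevation)."""
    ground = math.cos(math.radians(elevation_deg))
    dx = math.cos(math.radians(heading_deg)) * ground
    dy = math.sin(math.radians(heading_deg)) * ground
    x, y = camera_xy
    viewpoint = (x + dx * immersion_distance_m, y + dy * immersion_distance_m)
    focal_point = (x + dx * focal_distance_m, y + dy * focal_distance_m)
    return viewpoint, focal_point
```

The returned coordinates could then drive the map-symbol notification described earlier.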
- the notification method and the notification destination differ depending on the observer.
- When the observer is a viewer, the immersion position information notification unit 803 notifies the observer over a broadcast wave, via a television receiver or the like.
- Alternatively, notification may be made at playback time by recording the material (including the video itself) together with the additional information on a recording medium.
- When the observer is an editor, director, or other staff member at a relay station or broadcaster who edits or controls the video content, notification is made via the monitor of a device such as a switcher or editing equipment.
- When the observer is a person who directly operates the camera, such as a camera operator, notification is made in the camera's viewfinder or information display.
- The virtual viewpoint position calculation unit 704 calculates the immersion distance (for example, 90 m), that is, the distance from the virtual position at which the imaging control device would have to be placed to achieve the desired zoom magnification to the actual position of the imaging control device, based on a predetermined reference zoom magnification (for example, 1×), the desired zoom magnification relative to that reference, and the distance from the imaging control device (imaging unit 701) to the subject (step S901).
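Step S901 can be sketched numerically under a simple pinhole assumption (magnification inversely proportional to subject distance; the function name and example inputs are illustrative, not taken from the patent):

```python
def immersion_distance(subject_distance_m, desired_zoom, reference_zoom=1.0):
    """Distance from the virtual position at which the camera would sit to
    achieve the desired magnification back to the camera's real position.
    Under a pinhole model, the virtual position lies at
    subject_distance * (reference_zoom / desired_zoom) from the subject."""
    if desired_zoom <= 0 or reference_zoom <= 0:
        raise ValueError("zoom magnifications must be positive")
    virtual_distance_m = subject_distance_m * reference_zoom / desired_zoom
    return subject_distance_m - virtual_distance_m
```

For instance, a subject 100 m away viewed at 10× relative to a 1× reference gives an immersion distance of 90 m, consistent with the example figure in the text (the patent does not state the inputs behind that figure).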
- Next, based on the calculated immersion distance, the virtual viewpoint position calculation unit 704 calculates the interval (the baseline length to be set) between the plurality of optical systems when they are arranged on the extension of the line connecting their actual positions, such that the same desired zoom magnification is achieved as if the imaging control device were placed at the virtual position (step S902).
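Step S902 can be sketched under a small-angle stereo model (an assumption of this example, not a formula stated in the patent): to reproduce, from the real position, the convergence angle a natural-baseline rig would have at the virtual position, the baseline scales by the ratio of real to virtual subject distance. The 65 mm interocular default is likewise an assumption.

```python
def required_baseline_m(subject_distance_m, immersion_distance_m,
                        natural_baseline_m=0.065):
    """Baseline the real rig needs so that baseline/distance (the
    small-angle convergence) equals that of a natural_baseline_m rig
    placed at the virtual, immersed position."""
    virtual_distance_m = subject_distance_m - immersion_distance_m
    if virtual_distance_m <= 0:
        raise ValueError("immersion distance must be less than subject distance")
    return natural_baseline_m * subject_distance_m / virtual_distance_m
```

Continuing the earlier numbers, a 100 m subject with a 90 m immersion distance would call for roughly a 0.65 m baseline, illustrating why the required interval can exceed what two optical systems on one rig can physically provide.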
- The baseline length control unit 705 then changes the predetermined baseline length (for example, a default value) to the calculated interval and arranges the plurality of optical systems accordingly (step S903).
- The imaging unit 701 captures an image of the subject with the plurality of optical systems as arranged by the baseline length control unit 705 (step S904).
- Note that the baseline length control unit 705 can also control the arrangement of the at least two optical systems so that the virtual viewpoint follows changes in the zoom of the video captured by the imaging unit 701 with a delay.
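One way to realize such delayed following is to low-pass filter the commanded baseline; the patent does not specify a filter, so the exponential smoothing below and its parameters are assumptions:

```python
def follow_with_lag(target_baselines_m, alpha=0.3, start_m=0.065):
    """Exponentially smooth the target baseline so the virtual viewpoint
    trails a sudden zoom change instead of jumping with it.  Smaller
    alpha means a slower, smoother follow."""
    smoothed, current = [], start_m
    for target in target_baselines_m:
        current += alpha * (target - current)
        smoothed.append(current)
    return smoothed
```

Feeding the per-frame target baseline (for example, the output of the S902 calculation) through this filter yields a gentler viewpoint transition for the observer.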
- the virtual viewpoint position acquisition unit 801 receives information on the virtual viewpoint position of the observer from the imaging control device (step S1001).
- the immersive position information notification unit 803 generates video information in which the received virtual viewpoint position information is added to the captured video received from the imaging control device (step S1002).
- The immersive position information notification unit 803 may generate the video information by adding other information from which the virtual viewpoint position can be recognized (such as the symbols and comments described above), and may attach the virtual viewpoint position information to the captured video based on information that corresponds to the video's time stamps.
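The two notification forms mentioned above, an on-image caption or a map marker, might be composed per frame as follows; the record layout and wording are illustrative assumptions, not from the patent:

```python
def make_notification(frame_id, immersion_distance_m, style="overlay"):
    """Build the per-frame notification record the notification unit
    would attach: either a caption for on-image display or a marker
    record for a map or schematic view."""
    if style == "overlay":
        caption = "Now viewing from {:.0f} m in front of the camera".format(
            immersion_distance_m)
        return {"frame": frame_id, "caption": caption}
    return {"frame": frame_id, "marker_m": immersion_distance_m}
```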
- According to the present invention, immersive zoom can be performed in accordance with the camera's zoom (or with some expressive intent added), and that state can be conveyed to the observer.
- This is effective when video is shot from a distance (especially sports), or when a near-field shot contains no obstacle or other subject in the composition.
- The latter case means the composition can be fixed in advance so that only the subject at the focal position is captured, even when shooting at near or middle distances.
- If the composition does include an obstacle or other subject in the near field, or if the composition cannot be determined in advance, the parallax of the foreground subject may be exaggerated more than necessary, which can hinder stereoscopic fusion for the observer.
- a setting menu or switch may be provided so that the camera operator can switch between the immersive zoom mode and other zoom modes.
- Alternatively, the baseline length control unit 705 may adjust, based on the calculated interval, the baseline length between optical systems other than the plurality of (at least two) optical systems, and have an imaging unit equipped with those other optical systems image the subject, while the imaging unit 701 images the subject with the original plurality of optical systems kept at the predetermined baseline length.
- That is, a third viewpoint may be provided at a longer baseline length, using an optical system separate from the two optical systems.
- In that case, the imaging unit 701 images the subject with the calculated baseline length (interval) secured by the separate optical system.
- Even when the required baseline length is too long for the two optical systems to realize, the object of the present invention can thus be achieved with a third (separate) optical system.
- A preset program, an activation command, or a button for these operations may also be provided.
- the provided video content may be live broadcast or recorded in advance.
- Each functional block used in the above description of the embodiment of the present invention is typically realized as an LSI (Large Scale Integration) integrated circuit. These blocks may be integrated individually into single chips, or a single chip may include some or all of them.
- Although the term LSI (Large Scale Integration) is used here, the circuit may also be called an IC (Integrated Circuit), system LSI, super LSI, or ultra LSI depending on the degree of integration.
- The method of circuit integration is not limited to LSI; implementation with dedicated circuitry or a general-purpose processor is also possible. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- Since the imaging control device, immersion position information generation device, imaging control method, and immersion position information generation method according to the present invention can realize immersive magnified zoom, they are useful as an imaging control device, immersion position information generation device, imaging control method, immersion position information generation method, and the like for capturing images that the user can stereoscopically fuse.
Claims (22)
- An imaging control device that captures at least two images used to cause a user to perceive, as a three-dimensional object, a subject located at least a predetermined distance from an imaging point, the device comprising:
imaging means comprising at least two optical systems arranged at an interval of a predetermined baseline length, for imaging the subject;
calculation means for, taking the zoom magnification of the at least two optical systems arranged at the baseline-length interval as a reference zoom magnification, calculating an immersion distance, which is the distance from a virtual position at which the imaging means would be placed in order to achieve a desired zoom magnification to the actual position of the imaging means, based on the reference zoom magnification, the desired zoom magnification relative to the predetermined reference zoom magnification, and the distance from the imaging means to the subject, and for calculating, based on the calculated immersion distance, the interval between the at least two optical systems when the at least two optical systems are arranged on an extension of a line connecting their actual positions, so as to achieve the same desired zoom magnification as when the imaging means is placed at the virtual position and achieves the desired zoom magnification; and
control means for changing the predetermined baseline length to the calculated interval and arranging the at least two optical systems,
wherein the imaging means images the subject in a state in which the at least two optical systems have been arranged by the control means.
- The imaging control device according to claim 1, further comprising output means for acquiring information on the user's virtual viewpoint position based on the calculated immersion distance information and outputting the information externally.
- The imaging control device according to claim 2, wherein, when outputting the virtual viewpoint position information, the output means outputs the information in synchronization with the captured video, or outputs it including information that can be matched to a time stamp synchronized with the captured video.
- The imaging control device according to claim 2 or 3, wherein the output means outputs the virtual viewpoint position information at a predetermined timing.
- The imaging control device according to claim 1, wherein, in a case where the calculated interval cannot be secured with the at least two optical systems, the imaging means images the subject with the interval secured using an optical system different from the at least two optical systems.
- The imaging control device according to claim 1, wherein, in a case where the at least two optical systems themselves can be rotationally controlled and the convergence angle can be enlarged by rotating the at least two optical systems, the calculation means calculates the interval taking the convergence angle produced by the rotation of the at least two optical systems into account.
- The imaging control device according to claim 1, wherein the control means adjusts, based on the calculated interval, a baseline length between optical systems different from the at least two optical systems and causes imaging means comprising the different optical systems to image the subject, and the imaging means images the subject in a state in which the at least two optical systems are arranged at the interval of the predetermined baseline length.
- The imaging control device according to claim 1, wherein the control means controls the arrangement of the at least two optical systems so that the virtual viewpoint follows, with a delay, changes in the zoom of video captured by the imaging means.
- An immersion position information generation device that generates information for causing a subject to be perceived as three-dimensional, based on information from an imaging control device that captures at least two images used to cause a user to perceive, as a three-dimensional object, the subject located at least a predetermined distance from an imaging point, the device comprising:
receiving means for receiving, from the imaging control device, information on the user's virtual viewpoint position; and
generation means for generating video information in which the received virtual viewpoint position information is added to captured video from the imaging control device.
- The immersion position information generation device according to claim 9, wherein the generation means generates the video information by adding other information from which the virtual viewpoint position can be recognized.
- The immersion position information generation device according to claim 9 or 10, wherein the generation means adds the virtual viewpoint position information to the captured video based on information that can be matched to a time stamp of the captured video.
- An imaging control method for capturing at least two images used to cause a user to perceive, as a three-dimensional object, a subject located at least a predetermined distance from an imaging point, the method comprising:
a calculation step of, taking the zoom magnification of at least two optical systems arranged at an interval of a predetermined baseline length as a reference zoom magnification, calculating an immersion distance, which is the distance from a virtual position at which imaging means comprising the at least two optical systems and imaging the subject would be placed in order to achieve a desired zoom magnification to the actual position of the imaging means, based on the reference zoom magnification, the desired zoom magnification relative to the predetermined reference zoom magnification, and the distance from the imaging means to the subject, and of calculating, based on the calculated immersion distance, the interval between the at least two optical systems when the at least two optical systems are arranged on an extension of a line connecting their actual positions, so as to achieve the same desired zoom magnification as when the imaging means is placed at the virtual position and achieves the desired zoom magnification;
a control step of changing the predetermined baseline length to the calculated interval and arranging the at least two optical systems; and
an imaging step of imaging the subject in a state in which the at least two optical systems have been arranged by the control step.
- The imaging control method according to claim 12, further comprising an output step of acquiring information on the user's virtual viewpoint position based on the calculated immersion distance information and outputting the information externally.
- The imaging control method according to claim 13, wherein, in the output step, when the virtual viewpoint position information is output, the information is output in synchronization with the captured video, or output including information that can be matched to a time stamp synchronized with the captured video.
- The imaging control method according to claim 13 or 14, wherein, in the output step, the virtual viewpoint position information is output at a predetermined timing.
- The imaging control method according to claim 12, wherein, in a case where the calculated interval cannot be secured with the at least two optical systems, the subject is imaged in the imaging step with the interval secured using an optical system different from the at least two optical systems.
- The imaging control method according to claim 12, wherein, in a case where the at least two optical systems can rotate themselves and the convergence angle can be enlarged by rotating the at least two optical systems, the interval is calculated in the calculation step taking the convergence angle produced by the rotation of the at least two optical systems into account.
- The imaging control method according to claim 12, wherein, in the control step, a baseline length between optical systems different from the at least two optical systems is adjusted based on the calculated interval and imaging means comprising the different optical systems is caused to image the subject, and in the imaging step the subject is imaged in a state in which the at least two optical systems are arranged at the interval of the predetermined baseline length.
- The imaging control method according to claim 12, further comprising a step of controlling the arrangement of the at least two optical systems so that the virtual viewpoint follows, with a delay, changes in the zoom of video captured by the imaging means.
- An immersion position information generation method for generating information for causing a subject to be perceived as three-dimensional, based on information from an imaging control device that captures at least two images used to cause a user to perceive, as a three-dimensional object, the subject located at least a predetermined distance from an imaging point, the method comprising:
a receiving step of receiving, from the imaging control device, information on the user's virtual viewpoint position; and
a generation step of generating video information in which the received virtual viewpoint position information is added to captured video from the imaging control device.
- The immersion position information generation method according to claim 20, wherein, in the generation step, the video information is generated by adding other information from which the virtual viewpoint position can be recognized.
- The immersion position information generation method according to claim 20 or 21, wherein, in the generation step, the virtual viewpoint position information is added to the captured video based on information that can be matched to a time stamp of the captured video.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/579,729 US9167134B2 (en) | 2010-03-30 | 2011-03-16 | Imaging control device, immersion position information generation device, imaging control method, immersion position information generation method |
CN201180016549.4A CN102823231B (zh) | 2010-03-30 | 2011-03-16 | 摄像控制装置及摄像控制方法 |
JP2012508053A JP5607146B2 (ja) | 2010-03-30 | 2011-03-16 | 撮像制御装置、没入位置情報生成装置、撮像制御方法、没入位置情報生成方法 |
EP11762173.0A EP2555506B1 (en) | 2010-03-30 | 2011-03-16 | Imaging control device, immersion position information generation device, imaging control method, immersion position information generation method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010076910 | 2010-03-30 | ||
JP2010-076910 | 2010-03-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011121920A1 true WO2011121920A1 (ja) | 2011-10-06 |
Family
ID=44711690
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/001555 WO2011121920A1 (ja) | 2010-03-30 | 2011-03-16 | 撮像制御装置、没入位置情報生成装置、撮像制御方法、没入位置情報生成方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9167134B2 (ja) |
EP (1) | EP2555506B1 (ja) |
JP (2) | JP5607146B2 (ja) |
CN (1) | CN102823231B (ja) |
WO (1) | WO2011121920A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018500690A (ja) * | 2014-12-23 | 2018-01-11 | エルビット システムズ リミテッド | 拡大3d画像を生成するための方法およびシステム |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5607146B2 (ja) * | 2010-03-30 | 2014-10-15 | パナソニック株式会社 | 撮像制御装置、没入位置情報生成装置、撮像制御方法、没入位置情報生成方法 |
JP2012191608A (ja) * | 2011-02-22 | 2012-10-04 | Panasonic Corp | 立体撮像装置および立体撮像方法 |
CN107211085B (zh) * | 2015-02-20 | 2020-06-05 | 索尼公司 | 摄像装置和摄像方法 |
JP6924079B2 (ja) * | 2017-06-12 | 2021-08-25 | キヤノン株式会社 | 情報処理装置及び方法及びプログラム |
US10721419B2 (en) * | 2017-11-30 | 2020-07-21 | International Business Machines Corporation | Ortho-selfie distortion correction using multiple image sensors to synthesize a virtual image |
CA3085185C (en) * | 2017-12-20 | 2024-04-09 | Leia Inc. | Cross-render multiview camera, system, and method |
JP7245013B2 (ja) * | 2018-09-06 | 2023-03-23 | キヤノン株式会社 | 制御装置及び制御方法 |
JP7349808B2 (ja) | 2019-03-20 | 2023-09-25 | 任天堂株式会社 | 画像表示システム、画像表示プログラム、画像表示装置、および画像表示方法 |
JP7301567B2 (ja) * | 2019-03-20 | 2023-07-03 | 任天堂株式会社 | 画像表示システム、画像表示プログラム、画像表示装置、および画像表示方法 |
JPWO2020246292A1 (ja) | 2019-06-07 | 2020-12-10 | ||
US20230396873A1 (en) * | 2020-10-29 | 2023-12-07 | Sony Group Corporation | Information processing device, information processing method, and program |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07322128A (ja) | 1994-05-24 | 1995-12-08 | Canon Inc | 複眼撮像装置 |
JPH1070740A (ja) * | 1996-08-28 | 1998-03-10 | Sony Corp | 立体カメラおよびビデオ映像伝送システム |
JP2003107601A (ja) | 2001-10-01 | 2003-04-09 | Canon Inc | 立体画像撮影装置および立体画像撮影方法 |
JP2005024629A (ja) | 2003-06-30 | 2005-01-27 | Nippon Hoso Kyokai <Nhk> | 立体カメラ用雲台装置 |
JP2005094364A (ja) * | 2003-09-17 | 2005-04-07 | Japan Process Development Co Ltd | 立体画像撮影装置 |
JP2007195091A (ja) * | 2006-01-23 | 2007-08-02 | Sharp Corp | 合成映像生成システム |
JP2008233579A (ja) | 2007-03-22 | 2008-10-02 | Fujifilm Corp | 複眼撮影装置 |
JP2009210957A (ja) | 2008-03-06 | 2009-09-17 | Fujifilm Corp | 複眼カメラおよび撮影方法 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06339155A (ja) * | 1993-05-28 | 1994-12-06 | Sharp Corp | 3次元画像撮影システム |
US5682198A (en) * | 1993-06-28 | 1997-10-28 | Canon Kabushiki Kaisha | Double eye image pickup apparatus |
US6326995B1 (en) * | 1994-11-03 | 2001-12-04 | Synthonics Incorporated | Methods and apparatus for zooming during capture and reproduction of 3-dimensional images |
JPH08251467A (ja) * | 1995-03-09 | 1996-09-27 | Canon Inc | カメラ情報の表示装置 |
JP2791310B2 (ja) * | 1996-08-27 | 1998-08-27 | 幹次 村上 | 多角度撮影用の撮像装置 |
IL155525A0 (en) * | 2003-04-21 | 2009-02-11 | Yaron Mayer | System and method for 3d photography and/or analysis of 3d images and/or display of 3d images |
JP2005341466A (ja) * | 2004-05-31 | 2005-12-08 | Auto Network Gijutsu Kenkyusho:Kk | 車載カメラシステム |
JP3880603B2 (ja) * | 2005-02-22 | 2007-02-14 | 株式会社コナミデジタルエンタテインメント | 画像処理装置、画像処理方法及びプログラム |
JP4928275B2 (ja) * | 2007-01-10 | 2012-05-09 | キヤノン株式会社 | カメラ制御装置及びその制御方法 |
JP5607146B2 (ja) * | 2010-03-30 | 2014-10-15 | パナソニック株式会社 | 撮像制御装置、没入位置情報生成装置、撮像制御方法、没入位置情報生成方法 |
-
2011
- 2011-03-16 JP JP2012508053A patent/JP5607146B2/ja not_active Expired - Fee Related
- 2011-03-16 WO PCT/JP2011/001555 patent/WO2011121920A1/ja active Application Filing
- 2011-03-16 US US13/579,729 patent/US9167134B2/en active Active
- 2011-03-16 CN CN201180016549.4A patent/CN102823231B/zh not_active Expired - Fee Related
- 2011-03-16 EP EP11762173.0A patent/EP2555506B1/en active Active
-
2014
- 2014-06-19 JP JP2014126055A patent/JP5884073B2/ja not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
See also references of EP2555506A4 |
Also Published As
Publication number | Publication date |
---|---|
CN102823231A (zh) | 2012-12-12 |
US20120307020A1 (en) | 2012-12-06 |
JP2014209768A (ja) | 2014-11-06 |
EP2555506B1 (en) | 2021-05-05 |
US9167134B2 (en) | 2015-10-20 |
EP2555506A1 (en) | 2013-02-06 |
JP5607146B2 (ja) | 2014-10-15 |
EP2555506A4 (en) | 2017-05-31 |
CN102823231B (zh) | 2016-03-02 |
JP5884073B2 (ja) | 2016-03-15 |
JPWO2011121920A1 (ja) | 2013-07-04 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180016549.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11762173 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012508053 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13579729 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011762173 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |