WO2017026751A1 - 3D Studio System - Google Patents
- Publication number
- WO2017026751A1 (PCT/KR2016/008651)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- background
- setting data
- driving
- subject
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/245—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/684—Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the integration time
- H04N23/6845—Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the integration time by combination of a plurality of images sequentially taken
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B2210/00—Aspects not specifically covered by any group under G01B, e.g. of wheel alignment, caliper-like sensors
- G01B2210/52—Combining or merging partially overlapping images to an overall image
Definitions
- the present invention relates to a three-dimensional studio system, and more particularly to a three-dimensional studio system for controlling the cameras and the background installed in a booth for photographing a subject in three dimensions (hereinafter "3D").
- high-quality 3D images of a real subject can be secured by 3D shooting using a 3D studio.
- the three-dimensional studio is provided with a large number of cameras distributed around the subject, a plurality of lighting devices for illuminating the subject, and a background against which the subject is photographed.
- in order to obtain a high-quality 3D image, each camera should be given different settings according to the characteristics of the subject.
- each camera has a different orientation, placement position, and placement distance with respect to the subject. Therefore, the exposure or zoom of each camera should be controlled differently in consideration of its orientation, placement position, and placement distance.
- the background should be selected so that the shape and form of the subject are well represented, in consideration of the contrast or color of the subject.
- the shooting environment of the 3D studio therefore has the difficulty that it must be reset whenever the subject, or the state of the subject, changes.
- an object of the present invention is to provide a 3D studio system that can sense at least one of the subject and the background for 3D shooting, and can easily set the shooting environment by using preset data corresponding to the sensing result.
- another object of the present invention is to sense the shape, position, distance, black-and-white state, and color state of the subject, and to control the shooting environment for 3D shooting of the subject by controlling the camera and the driving of the camera using data corresponding to the sensing result.
- another object of the present invention is to store camera setting data for controlling the camera and driving setting data for adjusting the position and orientation of the camera so as to have a correlation with the sensing signal, and to control the shooting environment for 3D shooting of the subject by using the data corresponding to the sensing signal obtained by sensing the subject.
- another object of the present invention is to control each of the cameras distributed around the subject so as to capture an image that includes an overlapping region of a preset range with at least one adjacent camera, in order to obtain a 3D image in the image process.
- another object of the present invention is to change the background of the subject in response to the sensing result.
- another object of the present invention is to correlate with the sensing signal both the data corresponding to the result of sensing the subject and the data reflecting the rendering correction value obtained in the process of performing the image process on the 3D-photographed images.
- another object of the present invention is to provide a shooting environment optimized for 3D imaging by giving the data backed up to a database a correlation that carries a sensing weight and a statistical weight.
- another object of the present invention is to sample the rendering correction values determined to be valid in the image process, and to correlate the sampled data with the sensing signal through a data statistics process that samples data using the sampled rendering correction values.
- to this end, the 3D studio system of the present invention includes: camera modules, each including a camera for photographing a subject in response to a camera control signal and a driving device for adjusting the position and orientation of the camera in response to a driving control signal;
- sensor modules for providing a sensing signal obtained by sensing at least the subject, the camera modules and the sensor modules being spaced apart and distributed around the subject;
- a database for storing camera setting data for controlling the camera and driving setting data for adjusting the position and orientation of the camera so as to have a correlation with the sensing signal;
- a booth control unit configured to transmit the captured image of the camera to an external image processing apparatus, to select from the database the camera setting data and the driving setting data for the sensing signal of the sensor module, and to provide the camera control signal and the driving control signal corresponding to the selected camera setting data and driving setting data to the camera and the driving device, respectively;
- and a statistical analysis unit configured to receive from the booth control unit the sensing signal and the camera setting data and driving setting data selected for the sensing signal, to receive from the image processing apparatus a rendering correction value obtained in the process of performing the image process, and to back up to the database the camera setting data and the driving setting data reflecting the rendering correction value so as to have a correlation with the sensing signal.
- the present invention can easily set the shooting environment by sensing the subject, and thus can reduce the time and effort required to set the shooting environment for 3D shooting.
- the present invention can sense various characteristics of the subject, so that the control of the camera or the driving of the camera can be performed in an optimal state for 3D photographing of the subject.
- the camera setting data for controlling the camera and the driving setting data for adjusting the position and orientation of the camera are stored so as to be correlated with the sensing signal, so that optimal data corresponding to the result of sensing the subject can be used.
- the sensing signal and the data are stored in a database so as to be correlated, so that optimal data for setting the 3D shooting environment can be provided according to the statistical analysis.
- the background of the subject may be changed in response to the sensing result, so that an optimal background of the subject may be provided for 3D photographing.
- FIG. 1 is a block diagram showing a preferred embodiment of the 3D studio system of the present invention.
- FIG. 2 is a perspective view showing an embodiment of the booth of FIG. 1.
- FIG. 3 is a partial perspective view illustrating the camera module.
- FIG. 4 is a perspective view illustrating a cylindrical background module configurable in the booth of FIG. 1.
- FIG. 5 is a perspective view illustrating a dome-shaped background module configurable in the booth of FIG. 1.
- FIG. 6 illustrates an embodiment of a background cell for background change.
- FIG. 7 is a plan view illustrating cameras installed in the background module.
- FIG. 8 is a plan view illustrating an arrangement method of the cameras.
- FIG. 9 is an exemplary diagram for explaining photographing so as to have an overlapping area.
- the 3D studio system of the present invention includes a booth 10, a booth control unit 20, a database 40, a correction value input unit 50, and a statistical analysis unit 60.
- the booth control unit 20 is configured to transmit the photographed image of the camera 132 of the booth 10 to the external image processing apparatus 30.
- the image processing apparatus 30 refers to a device that receives the captured images photographed by the cameras 132 of the booth 10 through the booth control unit 20 and performs 3D rendering using the captured images to generate a 3D image file of the subject.
- the 3D image file refers to electronic information that can be implemented as a 3D image by using a designated application.
- the image processing apparatus 30 may generate a rendering correction value for securing a high quality photographed image of the subject as a result of the image processing.
- the rendering correction value will be described later.
- the booth 10 may include a sensor module 110, a camera module 130, and a background module 150, and may further include a lighting module (not shown).
- the sensor module 110 and the camera module 130 are provided in plural and spatially distributed around the subject. A method of spatially distributing the plurality of sensor modules 110 and the plurality of camera modules 130 will be described later.
- the sensor module 110 provides the booth control unit 20 with a sensing signal obtained by sensing the subject, and is configured to include at least one of a shape recognition sensor 112 for recognizing the shape of the subject, a position recognition sensor 114 for recognizing the position of the subject, a space recognition sensor 116 for recognizing the distance of the subject, a photo sensor 118 for sensing the black-and-white of the subject, and a color sensor 120 for recognizing the color of the subject.
- the sensing signal output from the sensor module 110 can be defined as including at least one of the outputs of the shape recognition sensor 112, the position recognition sensor 114, the space recognition sensor 116, the photo sensor 118, and the color sensor 120.
- the sensing value of each sensor may be determined as a component of the sensing signal.
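As a minimal sketch of how such a composite sensing signal might be represented in software (the field names, units, and value ranges below are illustrative assumptions, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class SensingSignal:
    shape: float      # output of the shape recognition sensor 112 (e.g. a volume estimate)
    position: float   # output of the position recognition sensor 114 (e.g. subject height, m)
    distance: float   # output of the space recognition sensor 116 (subject distance, m)
    contrast: float   # output of the photo sensor 118 (0 = black .. 1 = white)
    color_hue: float  # output of the color sensor 120 (hue angle, degrees)

# a sensing signal whose components are the individual sensing values
signal = SensingSignal(shape=0.8, position=1.7, distance=2.5,
                       contrast=0.6, color_hue=210.0)
print(signal.distance)  # 2.5
```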
- the shape recognition sensor 112 may be configured to have a light receiving structure that senses light reflected from the subject, and recognizes a shape of the subject such as a pose or a volume.
- the position recognition sensor 114 may be configured using a high frequency sensor for recognizing the height of the subject.
- the space recognition sensor 116 may be configured using a laser sensor and recognizes a distance of a subject.
- the photo sensor 118 is for recognizing the black and white of the subject, that is, the contrast of the subject.
- the color sensor 120 is for recognizing the color of the subject.
- the photo sensor 118 and the color sensor 120 may be configured using a sensor having a single or multilayer CMOS pixel.
- the sensor module 110 may be configured to recognize the black-and-white or color of the background as well as of the subject, and to provide a sensing signal corresponding thereto.
- to this end, the sensor module 110 may further include a photo sensor and a color sensor, wherein the photo sensor recognizes the black-and-white, that is, the contrast, of the background, and the color sensor recognizes the color of the background.
- the photo sensors and the color sensors for sensing the black and white or the color of the subject or the background may be configured using one photo sensor 118 and one color sensor 120.
- the sensor module 110 provides the booth control unit 20 with the sensing signal including the sensing value of at least one of the shape recognition sensor 112, the position recognition sensor 114, the space recognition sensor 116, the photo sensor 118, and the color sensor 120.
- the sensor modules 110 may be configured so that one sensor module corresponds to each camera module 130, or so that one sensor module corresponds to a plurality of camera modules, and may be spatially distributed in various ways in parallel with the camera modules 130 in consideration of the sensing method and the sensing target.
- the camera module 130 includes a camera 132 for photographing a subject in response to a camera control signal, and a driving device 134 for adjusting a position and a direction of the camera 132 in response to a driving control signal.
- the camera 132 provides the photographed image to the booth control unit 20, and the photographing operation of the camera 132 and the driving state of the driving device 134 are controlled by the booth control unit 20.
- the database 40 stores the sensing signal, camera setting data for controlling the camera 132, driving setting data for adjusting the position and orientation of the camera 132, and background control data for determining the background of the subject.
- the database 40 stores camera setting data, driving setting data, and background control data of various values that can be optimally matched to a specific sensing signal; a sensing signal and the camera setting data, driving setting data, and background control data that can be matched to it as optimal values can be defined as having a correlation with each other.
- each value included in the camera setting data, the driving setting data, and the background control data may be correlated with a plurality of sensing signals, and each value of the sensing signal may likewise be correlated with a plurality of items of camera setting data, driving setting data, and background control data.
- here, a correlation can be defined as a relationship that can be used for selection.
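One way to picture such a selection-oriented correlation is a nearest-match lookup. The sketch below is a hedged illustration only: the record layout, the Euclidean distance metric, and all values are assumptions, not the patent's actual database scheme.

```python
import math

# hypothetical database rows: each pairs a reference sensing vector
# with the camera, driving, and background settings correlated to it
DATABASE = [
    {"signal": (1.7, 2.5, 0.6), "camera": {"iso": 200, "zoom": 1.2},
     "driving": {"height": 1.2}, "background": "white"},
    {"signal": (0.9, 1.8, 0.2), "camera": {"iso": 400, "zoom": 1.6},
     "driving": {"height": 0.6}, "background": "black"},
]

def select_settings(sensed, database):
    """Return the row whose stored sensing vector best matches `sensed`."""
    return min(database, key=lambda row: math.dist(row["signal"], sensed))

row = select_settings((1.6, 2.4, 0.55), DATABASE)
print(row["background"])  # white
```

The "correlation" is thus operationalized as a distance that the booth control unit could use for selection; any real implementation would depend on the actual sensor outputs.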
- the background module 150 receives the background control signal from the booth control unit 20 and provides a background of a subject corresponding to the background control signal.
- the background module 150 may be configured to change the background to any one of black, white, and one or more colors in response to the background control signal.
- the booth control unit 20 transmits the captured image of the camera 132 to the external image processing apparatus 30, selects from the database 40 the camera setting data and the driving setting data that have a correlation with the sensing signal of the sensor module 110, and provides the camera control signal and the driving control signal corresponding to the selected camera setting data and driving setting data to the camera 132 and the driving device 134, respectively.
- the booth controller 20 selects background control data having a correlation with the sensing signal from the database 40, and provides a background control signal corresponding to the selected background control data to the background module 150.
- the correction value input unit 50 may be configured to input the correction value to the booth control unit 20.
- the correction value input unit 50 provides a correction value for correcting the camera setting data, the driving setting data, and the background control data by a user's input.
- the correction value input unit 50 may include an input device such as a keyboard or a touch pad for providing a correction value, and a display device for displaying the currently set values and the correction value.
- the correction value input to the correction value input unit 50 may be determined manually by a user or may be determined using a separate device.
- the booth control unit 20 controls at least one of the shutter, aperture, sensitivity, and zoom of the camera 132, and controls the camera 132 so that it secures a captured image including an overlapping area of a preset range with one or more adjacent cameras.
- more specifically, the booth control unit 20 provides the camera 132 and the driving device 134 with the camera control signal and the driving control signal for adjusting the zoom of the camera 132 and the position and orientation of the camera 132, so that the camera 132 secures a captured image including an overlapping area of a preset range with one or more adjacent cameras.
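The overlap requirement can be illustrated with simple circle geometry. Assuming, purely for illustration, N cameras evenly spaced on a circle around the subject and the overlap expressed as a fraction of each image's angular coverage (neither the camera count formula nor the fraction is specified by the patent):

```python
# with angular spacing s = 360/N between adjacent cameras and an overlap
# fraction f of each image, the overlap angle is fov - s, so
# f = (fov - s) / fov  =>  fov = s / (1 - f)
def required_fov(num_cameras: int, overlap_fraction: float) -> float:
    """Horizontal field of view (degrees) needed for the preset overlap."""
    spacing = 360.0 / num_cameras
    return spacing / (1.0 - overlap_fraction)

# e.g. 16 guiders (as in FIG. 2) with a hypothetical 30 % overlap:
print(round(required_fov(16, 0.3), 2))  # 32.14
```

The booth control unit could use such a target field of view to derive the zoom component of the camera control signal.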
- the statistical analysis unit 60 receives from the booth control unit 20 the sensing signal and the camera setting data, driving setting data, and background control data selected for the sensing signal, and receives from the image processing apparatus 30 the rendering correction value obtained in the process of performing the image process.
- the statistical analysis unit 60 backs up the database 40 after correcting the camera setting data, the driving setting data, and the background control data provided from the booth control unit 20 using the rendering correction value.
- for history management, the statistical analysis unit 60 keeps the camera setting data, driving setting data, and background control data reflecting the rendering correction value separate from the camera setting data, driving setting data, and background control data before the rendering correction value was reflected, and backs them up to the database 40 so as to have a correlation with the sensing signal.
- the statistical analysis unit 60 may define the sensing signal to have a statistical weight for a specific element included in the sensing signal, and back up the camera setting data, the driving setting data, and the background control data to the database 40 so as to have a high correlation with the weighted sensing signal.
- for example, the sensing value of the position recognition sensor 114 may be given a high weight when it is judged to be highly reliable.
- similarly, the statistical analysis unit 60 may define the sensing signal to have statistical weights for elements whose correction values are statistically small, and back up the camera setting data, driving setting data, and background control data reflecting the correction value to the database 40 so as to have a high correlation with the weighted sensing signal.
- the data backed up to the database 40 can be preferentially selected when the booth control unit 20 selects the camera setting data, driving setting data, and background control data corresponding to the sensing signals provided from the sensor modules 110 of the booth 10. Therefore, the booth control unit 20 may perform the settings for 3D photographing using reliable camera setting data, driving setting data, and background control data.
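A hedged sketch of how such preferential selection might look: each candidate row carries a statistical weight, and among rows that match the sensing signal comparably well, higher-weight (backed-up) rows win. The scoring rule and all values are assumptions for illustration only.

```python
# hypothetical candidate rows: `distance` is how far the row's stored
# sensing vector is from the current sensing signal; `weight` is the
# statistical weight assigned when the row was backed up
ROWS = [
    {"distance": 0.10, "weight": 1.0, "iso": 400},
    {"distance": 0.12, "weight": 3.0, "iso": 200},  # backed-up, high weight
]

def preferred(rows):
    # lower sensing distance is better; higher statistical weight is better
    return min(rows, key=lambda r: r["distance"] / r["weight"])

print(preferred(ROWS)["iso"])  # 200
```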
- the statistical analysis unit 60 then distinguishes between the valid and invalid ones of the rendering correction values and samples the valid rendering correction values.
- the statistical analysis unit 60 samples the camera setting data, the driving setting data, and the background control data by reflecting the valid rendering correction values in the camera setting data, the driving setting data, and the background control data.
- the statistical analyzer 60 backs up the sampled camera setting data, the driving setting data, and the background control data to the database 40 to have a correlation with the sensing signal.
- the image processing apparatus 30 may provide a rendering correction value for each of the captured images photographed by the camera 132 to the statistical analyzer 60.
- the image processing apparatus 30 may provide information that the rendering correction values for the captured images used to generate the 3D image are valid, and that the rendering correction values for the captured images not used to generate the 3D image are invalid. For example, an unnecessary captured image, or a captured image lacking an overlapping area, is not used to generate the 3D image, so the rendering correction values for such captured images can be determined to be invalid.
- the statistical analysis unit 60 samples the camera setting data, the driving setting data, and the background control data by reflecting the valid rendering correction values, among the rendering correction values provided as described above, in the camera setting data, the driving setting data, and the background control data.
- the statistical analyzer 60 backs up the sampled camera setting data, the driving setting data, and the background control data to the database 40 to have a correlation with the sensing signal.
- the sampled camera setting data, the driving setting data, and the background control data may be used for the setting of the camera module 130 that captures the captured image that is not used to generate the 3D image.
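The sampling step above can be sketched as follows. This is an illustrative assumption: correction values for images actually used in the 3D image are treated as valid and applied to the stored settings; the field names and an exposure-only correction are hypothetical simplifications.

```python
# hypothetical per-camera rendering correction values from the image process
corrections = [
    {"camera_id": 0, "used_in_3d": True,  "exposure_delta": +0.3},
    {"camera_id": 1, "used_in_3d": False, "exposure_delta": -1.0},  # invalid
    {"camera_id": 2, "used_in_3d": True,  "exposure_delta": -0.1},
]
# hypothetical stored camera setting data, keyed by camera id
settings = {0: {"exposure": 1.0}, 1: {"exposure": 1.0}, 2: {"exposure": 1.0}}

# keep only the valid corrections, then reflect them in the settings
valid = [c for c in corrections if c["used_in_3d"]]
for c in valid:
    settings[c["camera_id"]]["exposure"] += c["exposure_delta"]

print(settings[0]["exposure"], settings[1]["exposure"])  # 1.3 1.0
```

The sampled result (here, cameras 0 and 2) would then be backed up to the database in correlation with the sensing signal.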
- as described above, the booth control unit 20 selects the camera setting data, driving setting data, and background control data corresponding to the sensing signals of the sensor modules 110 installed in the booth 10, and provides the camera control signals, driving control signals, and background control signals corresponding to the selected data to the cameras 132, the driving devices 134, and the background module 150.
- the cameras 132 installed in the booth 10 set the shutter, aperture, sensitivity, and zoom according to the camera control signals, the driving devices 134 control the position and orientation of the cameras 132 mounted on them according to the driving control signals, and the background module 150 changes the background according to the background control signal.
- the cameras 132 may simultaneously photograph the subject under the control of the booth control unit 20.
- the photographed images photographed by the cameras 132 are transferred to the image processing apparatus 30 through the booth control unit 20.
- the image processing apparatus 30 generates the 3D image by performing a rendering process using the captured images, through its embedded image processing program and the manual operation of an operator. If it is determined during the process of generating the 3D image that correction of some or all of the captured images is necessary, a rendering correction value for the correction may be generated.
- the image processing apparatus 30 provides the rendering correction value to the statistical analyzer 60.
- the statistical analysis unit 60 receives the setting data from the booth control unit 20 and the rendering correction value from the image processing apparatus 30, corrects the camera setting data, the driving setting data, and the background control data provided by the booth control unit 20 using the rendering correction value, and backs them up to the database 40.
- thereafter, the booth control unit 20 may provide the booth 10 with the camera setting data, driving setting data, and background control data to which the rendering correction value has been applied, in response to the sensing signal.
- the 3D studio system of the present invention can secure an optimal captured image corresponding to the subject, and as a result, can generate a high quality 3D image.
- the camera modules 130 are spatially distributed within the booth 10.
- the camera modules 130 may be spatially distributed to form a cylindrical space around a subject, as shown in FIG. 2, and may be spatially distributed to configure a domed space around a subject.
- the sensor modules 110 may also be distributed in a cylindrical or dome-like space as the camera modules 130 are distributed.
- the arrangement of the camera modules 130 in a cylindrical or dome shape with respect to the subject is only an example of the present invention, and the camera modules 130 may be arranged in various other ways with respect to the subject.
- the booth 10 includes array lines 170 and 172 for arranging the camera modules 130 in the cylindrical shape.
- the array lines 170 and 172 may be formed to form a circle at the bottom and may have different diameters.
- the array lines 170 and 172 are regularly spaced apart, and guiders 182, which are radially disposed and regularly spaced, are coupled between the array lines 170 and 172.
- the horizontal movers 134a are installed in the guiders 182, and the horizontal movers 134a are driven to move in the horizontal direction along the combined guider 182. That is, the guiders 182 may guide the horizontal movement of the horizontal moving parts 134a, and the distance of the camera module 130 with respect to the subject may be adjusted by the horizontal moving parts 134a.
- a guider 180 is vertically installed on each horizontal moving part 134a. Two or more camera modules 130 are distributed on each guider 180, and the guider 180 guides the raising and lowering of the two or more camera modules 130.
- the guiders 180 installed vertically with respect to the subject TS are arranged so as to form a cylindrical space, and two or more camera modules 130 are installed on each of the guiders 180 forming the cylindrical space.
- FIG. 2 illustrates 16 guiders 180, each carrying 6 camera modules 130; however, in the present invention the number of guiders 180 and the number of camera modules 130 can be configured in various ways.
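The cylindrical arrangement of FIG. 2 can be sketched as evenly spaced positions on a cylinder. The radius and height values below are illustrative assumptions; only the 16 × 6 counts come from the figure.

```python
import math

def camera_positions(num_guiders=16, cams_per_guider=6,
                     radius=2.0, height=2.4):
    """(x, y, z) positions: guiders evenly spaced on a circle,
    cameras evenly spaced in height along each guider."""
    positions = []
    for g in range(num_guiders):
        angle = 2 * math.pi * g / num_guiders
        x, y = radius * math.cos(angle), radius * math.sin(angle)
        for c in range(cams_per_guider):
            z = height * (c + 1) / (cams_per_guider + 1)
            positions.append((x, y, z))
    return positions

pts = camera_positions()
print(len(pts))  # 96 camera modules in total
```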
- each camera module 130 installed on the guider 180 of FIG. 2 combines a camera 132 with a driving device 134.
- the driving device 134 carries the camera 132 and is capable of raising, lowering, horizontal rotation, and vertical rotation; at least one of the raising, lowering, horizontal rotation, and vertical rotation is controlled in response to the driving control signal, so that the position and orientation of the camera 132 can be adjusted.
- the driving device 134 is configured to move up and down along the guider 180.
- the driving device 134 may be described as including a first driving unit 134c that mounts the camera 132 and performs one of horizontal rotation and vertical rotation in response to the driving control signal, a second driving unit that mounts the first driving unit and performs the other of horizontal rotation and vertical rotation in response to the driving control signal, and a third driving unit that raises and lowers the second driving unit in response to the driving control signal; the second driving unit and the third driving unit may be composed of a single fourth driving unit 134b.
- the horizontal moving part 134a moves horizontally along the guider 182, the first driving part 134c rotates vertically, and the fourth driving part 134b moves up and down.
- the horizontal moving part 134a, the first driving part 134c, and the fourth driving part 134b each have a motor whose driving is controlled by a driving control signal.
- the height of the camera module 130 is adjusted by the raising and lowering of the fourth driving unit 134b: when the subject TS is tall, the fourth driving unit 134b is controlled to ascend, and when the subject TS is short, the fourth driving unit 134b is controlled to descend.
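The lifting rule above reduces to a simple height comparison. A minimal sketch, assuming a hypothetical threshold and command names (none of which the patent specifies):

```python
def lift_command(subject_height_m: float, threshold_m: float = 1.5) -> str:
    """Drive the fourth driving unit 134b up for a tall subject,
    down for a short one (threshold is an illustrative assumption)."""
    return "ascend" if subject_height_m >= threshold_m else "descend"

print(lift_command(1.8))  # ascend
print(lift_command(1.1))  # descend
```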
- the background module 150 may be configured as shown in FIG. 4 in response to the camera modules 130 being distributed in a cylindrical shape around the subject.
- the camera modules 130 may be disposed in the background module 150.
- the background module 150 of FIG. 4, configured in a cylindrical shape, includes a plurality of background cells 150a to 150l; each of the background cells 150a to 150l may be configured as a wall partitioned at a constant angle in a plane with respect to the subject TS, and the cells may be configured to be assembled so as to engage with each other.
- the background module 150 may be configured to arrange at least a portion of the camera module 130 or to have a space in which the camera 132 may photograph the subject TS.
- a circular space formed between the joints of the background cells 150a to 150l of the background module 150 serves to arrange at least a part of the camera module 130, or as a space through which the camera 132 can photograph the subject TS.
- the background module 150 may be configured as shown in FIG. 5 in response to the camera modules 130 being distributed in a dome shape around the subject.
- the background module 150 of FIG. 5 having a dome shape includes a plurality of background cells 150a-150l divided radially and uniformly, and each of the background cells 150a-150l may be configured to be engaged with each other.
- the background module 150 may change the background to any one of black, white, and one or more colors in response to the background control signal. Each of the background cells 150a to 150l may include LEDs as light sources, so that colors such as black (B), white (W), red (R), green (G), blue (B), and yellow (Y) can be variously expressed.
- the background cells 150a to 150l may be configured such that the background color is determined by the emission color of a single light source or by the combined color of multiple light sources.
- the background color of the background module 150 may be determined in consideration of the subject: if the subject is white, the background may be set to black; if the subject is black, the background may be set to white; and, more generally, a complementary color of the subject may be selected as the background.
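The complementary-color rule above can be illustrated with a short sketch. This is a hypothetical example with assumed RGB representation; the patent does not specify how colors are encoded.

```python
def background_color(subject_rgb):
    """Pick a background color considering the subject: black for a white
    subject, white for a black subject, otherwise the RGB complement."""
    r, g, b = subject_rgb
    if (r, g, b) == (255, 255, 255):
        return (0, 0, 0)            # white subject -> black background
    if (r, g, b) == (0, 0, 0):
        return (255, 255, 255)      # black subject -> white background
    return (255 - r, 255 - g, 255 - b)  # complementary color of the subject
```

For example, a red subject (255, 0, 0) would get a cyan background (0, 255, 255), maximizing the contrast between subject and background for segmentation.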
- each of the background cells 150a to 150l of the background module 150 may include a screen capable of representing black, white, and one or more colors, and may be configured to selectively display a background by driving the screen in response to the background control signal.
- the screens of the background cells 150a to 150l can be configured to be driven in either the vertical or the horizontal direction.
- FIG. 6 illustrates the case in which the screens of the background cells 150a to 150l are driven horizontally; the screen 160 is driven by the rotation of the roller 162.
- the camera module 130 may be partially mounted or embedded in the background module 150 as shown in FIG. 7.
- CA indicates the positions where the camera modules 130 are arranged in a plane.
- the camera module may be distributed on two or more placement lines positioned on the same plane with respect to the subject TS.
- FIG. 8 corresponds to a case in which additional camera modules CA2 are arranged inside the basically arranged camera modules CA1, which is useful when additional captured images of the lower or upper part of the subject TS are required.
- the camera module may be distributed on two or more placement lines positioned on different planes with respect to the subject TS.
- embodiments of the present invention preferably control each camera module 130 so that its captured image includes an overlapping area of a preset range with the captured images of one or more adjacent cameras, in order to obtain the 3D image.
- for example, the camera module 130 at position C2 should photograph the subject so that its captured image has areas overlapping with the captured images at positions C1 and C3.
- the captured image at each position should have overlapping areas SP1 and SP2 in a preset range with the captured image of an adjacent camera, as shown in FIGS. 10A to 10C. This ensures that the images can be connected in various directions.
- to secure this overlap, it is desirable to control several factors of the camera module 130 at each position, such as the zoom of the camera 132, the height of the camera 132, and the distance from the camera 132 to the subject.
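The overlap requirement can be sketched with an idealized geometric approximation: for cameras evenly spaced on a circle and aimed at the subject at its center, adjacent views overlap when each camera's horizontal field of view exceeds the angular spacing between cameras. The function names and the simplification are assumptions for illustration, not the patent's method.

```python
def angular_overlap_deg(num_cameras: int, fov_deg: float) -> float:
    """Idealized angular overlap between two adjacent cameras evenly spaced
    on a circle around the subject, all aimed at the center.
    Positive -> adjacent images share a region; zero or negative -> a gap."""
    spacing = 360.0 / num_cameras   # angle between adjacent camera positions
    return fov_deg - spacing

def min_fov_for_overlap(num_cameras: int, required_overlap_deg: float) -> float:
    """Minimum field of view (i.e. a limit on how far the zoom may narrow
    the view) each camera must keep so that adjacent captured images
    overlap by at least the preset range."""
    return 360.0 / num_cameras + required_overlap_deg
```

Under this approximation, 12 cameras with a 40° field of view overlap by about 10° with each neighbor; zooming in past a 30° field of view would open gaps, which is why the zoom, height, and distance must be controlled together.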
- in this way, the 3D photographing environment of the subject may be easily set, and may be set in an optimal state corresponding to the characteristics of the subject.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims (26)
- A three-dimensional studio system comprising: a booth including camera modules, each including a camera that photographs a subject in response to a camera control signal and a driving device that adjusts the position and orientation direction of the camera in response to a driving control signal, and sensor modules that provide at least a sensing signal obtained by sensing the subject, wherein the camera modules and the sensor modules are spatially distributed around the subject; a database that stores camera setting data for controlling the camera and driving setting data for adjusting the position and orientation direction of the camera so as to have a correlation with the sensing signal; a booth controller that transfers the captured image of the camera to an external image processing device, selects from the database the camera setting data and the driving setting data for the sensing signal of the sensor module, and provides the camera control signal and the driving control signal corresponding to the camera setting data and the driving setting data to the camera and the driving device, respectively; and a statistical analyzer that receives from the booth controller the sensing signal and the camera setting data and the driving setting data selected for the sensing signal, receives from the image processing device a rendering correction value obtained in the course of performing an image process, and backs up in the database the camera setting data and the driving setting data, reflecting the rendering correction value, so as to have a correlation with the sensing signal.
- The three-dimensional studio system of claim 1, wherein the booth controller provides the camera and the driving device with the camera control signal and the driving control signal that adjust the zoom of the camera and the position and the orientation direction of the camera so that the camera secures a captured image including an overlapping area of a preset range with one or more other adjacent cameras.
- The three-dimensional studio system of claim 1, wherein the booth controller controls at least one of the shutter, aperture, sensitivity, and zoom of the camera, and controls the zoom of the camera so as to secure a captured image including an overlapping area of a preset range with one or more other adjacent cameras.
- The three-dimensional studio system of claim 1, wherein the driving device mounts the camera and is capable of raising and lowering, horizontal rotation, and vertical rotation, and at least one of the raising and lowering, the horizontal rotation, and the vertical rotation is controlled in response to the driving control signal to adjust the position and orientation direction of the camera.
- The three-dimensional studio system of claim 4, wherein the booth further includes a first guider on which two or more of the camera modules can be installed and which guides the raising and lowering of the driving device, and the driving device is configured to be raised and lowered along the first guider in response to the driving control signal.
- The three-dimensional studio system of claim 4, wherein the driving device includes: a first driver that mounts the camera and performs one of the horizontal rotation and the vertical rotation in response to the driving control signal; a second driver that mounts the first driver and performs the other of the horizontal rotation and the vertical rotation in response to the driving control signal; and a third driver that performs the raising and lowering of the second driver in response to the driving control signal.
- The three-dimensional studio system of claim 6, wherein the second driver and the third driver are included in a single fourth driver.
- The three-dimensional studio system of claim 1, wherein the booth further includes: a first guider on which two or more of the camera modules can be installed in a distributed manner and which guides the raising and lowering of the two or more camera modules; a second guider that guides horizontal movement; and a horizontal mover that supports the first guider and is configured to move in the horizontal direction along the second guider, wherein the horizontal mover moves in response to the driving control signal so that the distance between the subject and the camera module is adjusted.
- The three-dimensional studio system of claim 1, wherein the sensor module includes at least one of: a shape recognition sensor that recognizes the shape of the subject; a position recognition sensor that recognizes the position of the subject; a space recognition sensor that recognizes the distance of the subject; a first photo sensor that senses the black and white of the subject; and a first color sensor that recognizes the color of the subject, and the sensing signal includes the outputs of the shape recognition sensor, the position recognition sensor, the space recognition sensor, the first photo sensor, and the first color sensor.
- The three-dimensional studio system of claim 9, wherein the space recognition sensor is configured using a laser sensor.
- The three-dimensional studio system of claim 9, wherein the sensor module further includes: a second photo sensor that senses the black and white of the background of the subject; and a second color sensor that recognizes the color of the background of the subject, and the sensing signal includes the outputs of the shape recognition sensor, the position recognition sensor, the space recognition sensor, the first photo sensor, the second photo sensor, the first color sensor, and the second color sensor.
- The three-dimensional studio system of claim 11, wherein the first photo sensor and the second photo sensor are configured using a single photo sensor, and the first color sensor and the second color sensor are configured using a single color sensor.
- The three-dimensional studio system of claim 1, wherein the camera modules are spatially distributed so as to form a cylindrical space around the subject.
- The three-dimensional studio system of claim 1, wherein the camera modules are spatially distributed so as to form a dome-shaped space around the subject.
- The three-dimensional studio system of claim 1, wherein the camera modules are distributed on two or more placement lines positioned on the same plane with respect to the subject.
- The three-dimensional studio system of claim 1, wherein the camera modules are distributed on two or more placement lines positioned on different planes with respect to the subject.
- The three-dimensional studio system of claim 1, wherein the booth further includes a background module that provides a background for the subject, the booth controller provides a background control signal corresponding to the sensing signal to the booth, and the background module changes the background to any one of black, white, and one or more colors in response to the background control signal.
- The three-dimensional studio system of claim 17, wherein the background module includes a plurality of background cells, and each background cell changes its emission to black, white, or one or more colors in response to the background control signal so as to selectively express the background for the camera.
- The three-dimensional studio system of claim 17, wherein the background module includes a plurality of background cells, and each background cell includes a screen capable of expressing black, white, and one or more colors, and drives the screen in response to the background control signal so as to selectively express the background.
- The three-dimensional studio system of claim 19, wherein the background module drives the screen in either the vertical or the horizontal direction.
- The three-dimensional studio system of claim 18, wherein the camera module is disposed in front of the background module.
- The three-dimensional studio system of claim 17, wherein the background module accommodates at least a part of the camera module or has a space through which the camera can photograph the subject.
- The three-dimensional studio system of claim 11, wherein the database stores background control data so as to have a correlation with the sensing signal, and the booth controller selects the background control data corresponding to the sensing signal and provides the background control signal corresponding to the selected background control data to the background module.
- The three-dimensional studio system of claim 1, wherein, for history management, the statistical analyzer backs up in the database the camera setting data and the driving setting data reflecting the rendering correction value so as to have a correlation with the same sensing signal, separately from the camera setting data and the driving setting data before the rendering correction value is reflected.
- The three-dimensional studio system of claim 1, wherein the statistical analyzer defines the sensing signal to have a statistical weight for a specific element included in the sensing signal, and backs up the camera setting data and the driving setting data in the database so as to have the correlation with the weighted sensing signal.
- The three-dimensional studio system of claim 1, wherein the statistical analyzer distinguishes valid rendering correction values from invalid ones and samples the valid rendering correction values, reflects the valid rendering correction values in the camera setting data and the driving setting data to sample the camera setting data and the driving setting data, and backs up the sampled camera setting data and the driving setting data in the database so as to have the correlation with the sensing signal.
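The correlation database and feedback loop of claim 1 can be illustrated with a minimal sketch: settings are looked up by a key derived from the sensing signal, and the statistical analyzer writes back settings adjusted by the rendering correction value. All class, function, and field names here are hypothetical; the claims do not specify a data model.

```python
# Hypothetical sketch of the database that correlates sensing signals with
# camera/driving setting data, plus the statistical-analyzer backup step.
class SettingsDatabase:
    def __init__(self):
        # sensing-signal key -> (camera setting data, driving setting data)
        self._store = {}

    @staticmethod
    def key(sensing_signal: dict) -> tuple:
        # correlate settings with the sensed shape, position, and color
        return (sensing_signal["shape"], sensing_signal["position"],
                sensing_signal["color"])

    def select(self, sensing_signal):
        """Booth-controller step: pick settings for the current sensing signal."""
        return self._store.get(self.key(sensing_signal))

    def backup(self, sensing_signal, camera_setting, driving_setting):
        """Store settings so they remain correlated with the sensing signal."""
        self._store[self.key(sensing_signal)] = (camera_setting, driving_setting)

def apply_rendering_correction(camera_setting: dict, correction: dict) -> dict:
    """Statistical-analyzer step: reflect the rendering correction value
    obtained from the image processing device in the camera setting data."""
    updated = dict(camera_setting)
    for field, delta in correction.items():
        updated[field] = updated.get(field, 0) + delta
    return updated
```

Over time, backing up corrected settings against the same sensing-signal keys lets the system converge toward an optimal photographing environment for recurring subject characteristics.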
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/751,949 US10397477B2 (en) | 2015-08-10 | 2016-08-05 | Three-dimensional studio system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2015-0112460 | 2015-08-10 | ||
KR1020150112460A KR101679398B1 (ko) | 2015-08-10 | 2015-08-10 | 3차원 스튜디오 시스템 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017026751A1 true WO2017026751A1 (ko) | 2017-02-16 |
Family
ID=57706815
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2016/008651 WO2017026751A1 (ko) | 2015-08-10 | 2016-08-05 | 3차원 스튜디오 시스템 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10397477B2 (ko) |
KR (1) | KR101679398B1 (ko) |
WO (1) | WO2017026751A1 (ko) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10386184B2 (en) * | 2017-08-29 | 2019-08-20 | Target Brands, Inc. | Three-dimensional photogrammetry system and method |
KR101857788B1 (ko) * | 2017-09-22 | 2018-05-14 | 주식회사 스탠스 | 탈부착조립성과 길이가변성이 우수한 360°다시점 영상 촬영카메라용 리그장치 |
KR101946440B1 (ko) | 2017-10-16 | 2019-02-11 | (주)디메이드쓰리디스튜디오 | 피사체 자동맞춤조절이 가능한 부스형 삼차원 촬영장치 |
KR20190083601A (ko) | 2018-01-04 | 2019-07-12 | (주)큐디스 | Ip카메라와 연속광 조명을 구비하는 3d 영상 촬영장치 |
FR3078564B1 (fr) * | 2018-03-01 | 2020-09-11 | 4D View Solutions | Systeme de modelisation tridimensionnelle d'une scene par photogrammetrie multi-vue |
AU2019308228B2 (en) | 2018-07-16 | 2021-06-03 | Accel Robotics Corporation | Autonomous store tracking system |
KR101987922B1 (ko) * | 2019-01-08 | 2019-06-11 | 오은실 | 라이트 셀의 능동적 제어를 통한 자동 이미지 캡쳐 시스템 |
GB2593126A (en) * | 2019-08-29 | 2021-09-22 | Alexander Lang Gordon | 3D Pose capture system |
US11743418B2 (en) * | 2019-10-29 | 2023-08-29 | Accel Robotics Corporation | Multi-lighting conditions rapid onboarding system for visual item classification |
US11205094B2 (en) * | 2019-10-29 | 2021-12-21 | Accel Robotics Corporation | Multi-angle rapid onboarding system for visual item classification |
US10621472B1 (en) * | 2019-10-29 | 2020-04-14 | Accel Robotics Corporation | Rapid onboarding system for visual item classification |
RU2750650C1 (ru) * | 2020-10-06 | 2021-06-30 | Игорь Сергеевич Лернер | Многофункциональная мультимедийная студия самообслуживания для производства фото/видеоматериалов |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005252831A (ja) * | 2004-03-05 | 2005-09-15 | Mitsubishi Electric Corp | 設備監視支援装置 |
KR20070020764A (ko) * | 2005-08-16 | 2007-02-22 | 구해원 | 동일시각에 동일피사체를 여러 대의 카메라로 촬영하는입체촬영 시스템 |
KR20070049109A (ko) * | 2006-12-04 | 2007-05-10 | 실리콘 옵틱스 인코포레이션 | 파노라마 비전 시스템 및 방법 |
JP2010154052A (ja) * | 2008-12-24 | 2010-07-08 | Hitachi Ltd | 複数カメラ制御システム |
JP2010154306A (ja) * | 2008-12-25 | 2010-07-08 | Olympus Corp | 撮像制御装置、撮像制御プログラム及び撮像制御方法 |
JP2015026931A (ja) * | 2013-07-25 | 2015-02-05 | オリンパス株式会社 | 撮像装置、撮像方法及び撮像プログラム |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20030016452A (ko) | 2001-08-16 | 2003-03-03 | 다본정보기술 주식회사 | 3차원 촬영장치 |
EP3422955B1 (en) * | 2016-02-29 | 2023-10-18 | Packsize International, LLC | System and method for assisted 3d scanning |
US10789764B2 (en) * | 2017-05-31 | 2020-09-29 | Live Cgi, Inc. | Systems and associated methods for creating a viewing experience |
- 2015-08-10 KR KR1020150112460A patent/KR101679398B1/ko active IP Right Grant
- 2016-08-05 US US15/751,949 patent/US10397477B2/en active Active
- 2016-08-05 WO PCT/KR2016/008651 patent/WO2017026751A1/ko active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005252831A (ja) * | 2004-03-05 | 2005-09-15 | Mitsubishi Electric Corp | 設備監視支援装置 |
KR20070020764A (ko) * | 2005-08-16 | 2007-02-22 | 구해원 | 동일시각에 동일피사체를 여러 대의 카메라로 촬영하는입체촬영 시스템 |
KR20070049109A (ko) * | 2006-12-04 | 2007-05-10 | 실리콘 옵틱스 인코포레이션 | 파노라마 비전 시스템 및 방법 |
JP2010154052A (ja) * | 2008-12-24 | 2010-07-08 | Hitachi Ltd | 複数カメラ制御システム |
JP2010154306A (ja) * | 2008-12-25 | 2010-07-08 | Olympus Corp | 撮像制御装置、撮像制御プログラム及び撮像制御方法 |
JP2015026931A (ja) * | 2013-07-25 | 2015-02-05 | オリンパス株式会社 | 撮像装置、撮像方法及び撮像プログラム |
Also Published As
Publication number | Publication date |
---|---|
KR101679398B1 (ko) | 2016-11-28 |
US10397477B2 (en) | 2019-08-27 |
US20180234627A1 (en) | 2018-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017026751A1 (ko) | 3차원 스튜디오 시스템 | |
WO2009145575A2 (ko) | 감시 카메라 장치 | |
WO2014010940A1 (en) | Image correction system and method for multi-projection | |
WO2011087337A2 (ko) | 기판 검사장치 | |
WO2016021790A1 (en) | Imaging sensor capable of detecting phase difference of focus | |
WO2020027607A1 (ko) | 객체 탐지 장치 및 제어 방법 | |
WO2018155742A1 (ko) | 다중 카메라 입력의 합성을 통한 실시간 모니터링 시스템 | |
WO2014189186A1 (ko) | 무선 리모컨 기능을 갖는 ip 카메라를 이용한 전자기기의 제어 방법 | |
WO2017183915A2 (ko) | 영상취득 장치 및 그 방법 | |
WO2017196026A2 (ko) | Ptz 카메라의 촬영영상 설정방법 및 그를 위한 장치 | |
WO2014017816A1 (en) | Apparatus and method to photograph an image | |
WO2013137637A1 (en) | Imaging apparatus and image sensor thereof | |
WO2012161384A1 (ko) | 오토포커스 카메라 시스템 및 그 제어방법 | |
WO2012046903A1 (ko) | 사각지역 촬영이 가능한 감시카메라 및 그 제어방법 | |
WO2024112174A1 (ko) | 이동용 카메라 촬영 모사 시스템, 방법, 및 상기 방법을 실행시키기 위한 컴퓨터 판독 가능한 프로그램을 기록한 기록 매체 | |
WO2018092926A1 (ko) | 사물 인터넷 기반의 실외형 자가촬영사진지원 카메라 시스템 | |
WO2011078615A2 (en) | Distance adaptive 3d camera | |
WO2021137555A1 (en) | Electronic device comprising image sensor and method of operation thereof | |
WO2015178593A1 (ko) | 물체 전방위 촬영 3차원 실물화상기 | |
WO2020040468A1 (ko) | 유기발광소자의 혼색 불량 검출장치 및 검출방법 | |
WO2018092929A1 (ko) | 사물 인터넷 기반의 실내형 자가촬영사진지원 카메라 시스템 | |
WO2015178536A1 (ko) | 화질 개선 장치, 이를 가지는 디지털 촬영 장치 및 화질 개선 방법 | |
WO2017007077A1 (ko) | 감시 방법 | |
JPWO2018020769A1 (ja) | 映像システム装置 | |
WO2014185578A1 (ko) | 직교식 입체 카메라의 광축 정렬 방법 및 직교식 입체 카메라 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16835391 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 15751949 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM F1205A DATED 01.06.2018) |
122 | Ep: pct application non-entry in european phase |
Ref document number: 16835391 Country of ref document: EP Kind code of ref document: A1 |