WO2019082686A1 - Dispositif d'imagerie - Google Patents

Dispositif d'imagerie

Info

Publication number
WO2019082686A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
image
pixel
signal
imaging
Prior art date
Application number
PCT/JP2018/038042
Other languages
English (en)
Japanese (ja)
Inventor
寿伸 杉山
Original Assignee
ソニーセミコンダクタソリューションズ株式会社
Priority date
Filing date
Publication date
Priority claimed from JP2018008193A (published as JP2019083501A)
Application filed by ソニーセミコンダクタソリューションズ株式会社
Priority to US16/756,171 (granted as US11372200B2)
Publication of WO2019082686A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • The present technology relates to an imaging device, and in particular to an imaging device capable of efficiently separating signals having different characteristics.
  • Techniques that use infrared (IR) images in addition to visible images are widely used for object recognition and information sensing in surveillance camera systems, security systems, in-vehicle systems, game machines, and the like.
  • An image sensor capable of simultaneously acquiring visible images and IR images has been proposed.
  • Patent Document 1 proposes that one G pixel in a 2 ⁇ 2 Bayer array is replaced with an IR pixel that transmits only IR light. According to the proposal of Patent Document 1, a visible signal and an IR signal are separated by arithmetic processing of signals from respective R, G, B, and IR pixels, and respective images are output according to the application.
  • The present technology has been made in view of such circumstances, and makes it possible to efficiently separate signals having different characteristics.
  • An imaging device of the present technology includes an imaging unit that images a subject in a state in which light of a predetermined pattern from a structured light source is irradiated onto the projection areas of specific pixels of the imaging unit, and an image generation unit configured to generate an image of the subject based on the pixel signals read out from the imaging unit.
  • The drawings further include: a flowchart explaining the imaging process of the camera system of FIG. 1; a diagram showing an example of a pixel array; a diagram showing a third configuration example of a camera system to which the present technology is applied; a diagram explaining the principle of an SL light source; a block diagram showing a configuration example of an IR light irradiation device and an imaging device; a diagram showing an example of the pixel array of an image sensor; and a diagram explaining the relationship between the angle of view of an imaging device and the irradiation angle of an IR-SL light source.
  • The drawings also include: a diagram schematically showing an overall configuration of an operating room system; a diagram showing a display example of an operation screen on a centralized operation panel; a diagram showing an example of a state of surgery to which the operating room system is applied; and a block diagram showing an example of the functional configuration of the camera head shown in FIG. 33 and the CCU.
  • FIG. 1 is a diagram illustrating a first configuration example of a camera system to which the present technology is applied.
  • In the camera system of the present technology, imaging is performed at a position close to the structured light (SL) light source in a state where light of a dot pattern is projected from the SL light source onto the projection areas of predetermined pixels.
  • SL structured light
  • the camera system 1 of FIG. 1 includes an IR light irradiation device 11 as an IR-SL light source, and an imaging device 12.
  • In FIG. 1, the projection areas of pixels are virtually shown by broken lines on a plane on which a subject is assumed to be located.
  • the projection area is an area corresponding to the pixel array of the image sensor in the imaging device 12.
  • the characters shown in each projection area indicate that the pixels corresponding to each projection area are R, G, B pixels or IR pixels.
  • the IR light irradiation device 11 is a device that emits IR light, and is disposed in the vicinity of the imaging device 12 in a fixed manner.
  • The IR light irradiation device 11 emits IR light in a dot pattern that illuminates only the projection areas corresponding to the IR pixels. Among the projection areas, each dot of the dot pattern is projected as IR light from the IR light irradiation device 11 onto a projection area of an IR pixel, as indicated by the colored circles.
  • the imaging device 12 includes, as one example, an image sensor in which R, G, B pixels and IR pixels are arranged in a 2 ⁇ 2 matrix.
  • the shutter system of the image sensor may be a rolling shutter system or a global shutter system.
  • In FIG. 1, only the projection areas of 4 × 4 pixels are shown, and the projection areas irradiated with IR light are four areas, each illuminated by one dot of IR light.
  • the imaging device 12 images a subject in a state where IR light of a dot pattern that irradiates only a projection area corresponding to an IR pixel is irradiated by the IR light irradiation device 11.
  • At the R, G, and B pixels, visible light from a predetermined light source is received. Thereby, in the imaging device 12, a visible image corresponding to the signals from the R, G, and B pixels and an IR image corresponding to the signals from the IR pixels are generated.
  • FIG. 2 is a diagram for explaining the principle of a structured light source.
  • the IR light irradiation device 11 has a configuration in which the diffraction grating 22 is provided in front of the laser light source 21.
  • By designing the diffraction grating 22 appropriately, it is possible to irradiate IR light of a dot pattern onto arbitrary positions in a matrix (for example, the projection areas of the IR pixels in FIG. 1).
  • FIG. 3 is a diagram for explaining the relationship between the angle of view of the imaging device and the irradiation angle of the IR-SL light source.
  • In FIG. 3, the solid lines within the SL irradiation angle centered on the optical axis 11C of the IR light irradiation device 11 indicate the boundaries of the SL irradiation area, and the broken lines within the angle of view centered on the optical axis 12C of the imaging device 12 indicate the boundaries of the pixel area.
  • the SL irradiation area is an area where dots are irradiated from the IR-SL light source to the pixel area.
  • the SL irradiation angle of the IR light irradiation device 11 and the angle of view of the imaging device 12 are set so as to approximately match.
  • the solid line L1 on the left side of FIG. 3 indicates the projection area of the imaging device 12.
  • the range indicated by the bidirectional arrow corresponds to the projection area of one pixel.
  • An alternate long and short dash line L2 on the right side of the pixel projection area indicates a parallax matching limit distance.
  • the parallax matching limit distance is a distance from the IR light irradiation device 11 and the imaging device 12 at which the parallaxes of the IR light irradiation device 11 and the imaging device 12 substantially coincide with each other.
  • the black rectangles on the solid line L1 and on the dashed-dotted line L2 represent the dots of the dot pattern.
  • the IR light of the dot pattern from the IR light irradiation device 11 of FIG. 3 is irradiated only to the region corresponding to the IR pixel in the pixel array of the imaging device 12. At that time, the imaging device 12 and the IR light irradiation device 11 are fixed.
  • As long as the subject is farther away than the parallax matching limit distance indicated by the alternate long and short dash line L2, the dot pattern irradiated from the IR light irradiation device 11 corresponds 1:1 to the projection areas of the IR pixels, regardless of the distance to the subject. Reflected IR light therefore does not reach the R, G, and B pixels and is received only by the IR pixels.
  • For example, if the distance between the optical axis 11C of the IR light irradiation device 11 and the optical axis 12C of the image sensor (lens) of the imaging device 12 is about 3 mm, the parallax matching limit distance is about 60 cm.
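Purely as a numerical illustration of this geometry (a minimal sketch; the allowable angular disparity used below is a hypothetical tolerance chosen so that the quoted 3 mm / 60 cm pair is reproduced, not a value given in this document):

```python
import math

def parallax_limit_distance(baseline_m: float, allowable_disparity_rad: float) -> float:
    """Distance beyond which the angular disparity between the SL light source
    and the camera stays below the allowable value (small-angle approximation)."""
    return baseline_m / allowable_disparity_rad

baseline = 0.003    # 3 mm offset between optical axes 11C and 12C
tolerance = 0.005   # assumed allowable angular disparity in radians (hypothetical)

print(parallax_limit_distance(baseline, tolerance))   # ~0.6 m, i.e. about 60 cm

# Conversely, the disparity angle for a subject at distance d is baseline / d,
# so it keeps shrinking as the subject moves beyond the limit distance.
for d in (0.3, 0.6, 1.2, 3.0):
    print(d, math.degrees(baseline / d))
```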
  • FIG. 4 is a block diagram showing a configuration example of an imaging device. Imaging of a subject in the camera system 1 is performed by the imaging device 12 in a state in which IR light of a dot pattern from the IR light irradiation device 11 as an IR-SL light source is steadily irradiated.
  • the imaging device 12 includes an optical system 31 such as a lens, an image sensor 32, and an image generation unit 33.
  • the image sensor 32 has a pixel array unit in which R, G, B pixels and IR pixels are arranged in a 2 ⁇ 2 matrix.
  • the image sensor 32 photoelectrically converts incident light and A / D converts the pixel value of each pixel of the pixel array unit to generate a signal of the pixel.
  • The image generation unit 33 generates a visible image using the signals (visible signals) from the R, G, and B pixels among the plurality of pixels forming the pixel array unit of the image sensor 32, and outputs the generated visible image to a signal processing unit at a subsequent stage (not illustrated). Further, the image generation unit 33 generates an IR image using the signals (IR signals) from the IR pixels among the plurality of pixels constituting the pixel array unit of the image sensor 32, and outputs the generated IR image to the signal processing unit at the subsequent stage.
  • The image generation unit 33 includes a signal separation unit 41, interpolation processing units 42-1 and 42-2, and image quality improvement signal processing units 43-1 and 43-2.
  • The signal separation unit 41 separates the visible signals from the signal of the image sensor 32 and outputs them to the interpolation processing unit 42-1. Further, the signal separation unit 41 separates the IR signals from the signal of the image sensor 32 and outputs them to the interpolation processing unit 42-2.
  • The interpolation processing unit 42-1 generates a visible image by performing interpolation processing, such as demosaicing processing that generates the pixel signals of missing colors according to the arrangement of the R, G, and B pixels, and supplies the visible image to the image quality improvement signal processing unit 43-1.
  • the interpolation processing unit 42-2 performs an IR signal interpolation process to generate an IR image, and outputs the IR image to the image quality improvement signal processing unit 43-2.
  • The image quality improvement signal processing unit 43-1 performs image quality improvement processing of the visible image, and outputs the visible image after the processing.
  • The image quality improvement signal processing unit 43-2 performs image quality improvement processing of the IR image, and outputs the IR image after the processing.
  • In step S11 of FIG. 5, the image sensor 32 captures an image of the subject in a state where the IR light of the dot pattern from the IR light irradiation device 11 is irradiated.
  • the image sensor 32 photoelectrically converts incident light, and A / D converts the pixel value of each pixel of the pixel array unit to generate a pixel signal.
  • In step S12, the signal separation unit 41 separates the visible signals and the IR signals from the signal from the image sensor 32.
  • the separated visible signal is output to the interpolation processing unit 42-1, and the IR signal is output to the interpolation processing unit 42-2.
  • In step S13, the interpolation processing unit 42-1 generates a visible image by interpolating the visible signals. Further, the interpolation processing unit 42-2 generates an IR image by performing interpolation processing of the IR signals.
  • the visible image after interpolation is output to the image quality improvement signal processing unit 43-1.
  • the IR image after interpolation is output to the image quality improvement signal processing unit 43-2.
  • In step S14, the image quality improvement signal processing unit 43-1 performs image quality improvement processing of the visible image.
  • The image quality improvement signal processing unit 43-2 performs image quality improvement processing of the IR image.
  • the visible image and the IR image after the image quality improvement processing are output to the signal processing unit in the subsequent stage.
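The separation and interpolation steps above can be pictured with a short sketch (a simplified stand-in for the image generation unit 33, not the actual implementation; the 2 × 2 layout with the IR pixel at the lower-right position is an assumption):

```python
import numpy as np

def separate_and_interpolate(raw: np.ndarray):
    """Split a 2x2 RGB-IR mosaic into a visible (RGB) image and an IR image.

    Assumed mosaic layout per 2x2 cell:  R G
                                         B IR
    Missing samples are filled by nearest-neighbor replication, standing in
    for the demosaicing performed by the interpolation processing units.
    """
    h, w = raw.shape
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    g_mask = np.zeros((h, w), bool); g_mask[0::2, 1::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 0::2] = True
    ir_mask = np.zeros((h, w), bool); ir_mask[1::2, 1::2] = True

    def fill(mask):
        # Replicate each sampled value over its 2x2 cell (nearest-neighbor fill).
        cells = raw[mask].reshape(h // 2, w // 2)
        return np.kron(cells, np.ones((2, 2), raw.dtype))

    visible = np.stack([fill(r_mask), fill(g_mask), fill(b_mask)], axis=-1)
    ir = fill(ir_mask)
    return visible, ir

raw = np.random.randint(0, 1024, (8, 8)).astype(np.float32)  # dummy sensor readout
visible_img, ir_img = separate_and_interpolate(raw)
print(visible_img.shape, ir_img.shape)   # (8, 8, 3) (8, 8)
```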
  • the visible image and the IR image obtained as described above are used, for example, for object recognition of a surveillance camera system. After separation, they may be combined to obtain a color image.
  • The visible image and the IR image are also used for face recognition or iris recognition for security on personal computers and smartphones. It is possible to simultaneously acquire, with one imaging device, an IR image and a visible image used for authentication or for gesture recognition in games and the like.
  • Since the reflected light from the subject caused by the irradiation of the IR light is received only by the IR pixels, it does not affect the visible signals obtained by the R, G, and B pixels. Therefore, it becomes possible to separate the visible signals and the IR signals, which are signals having different characteristics. In addition, it is not necessary to form a dedicated on-chip filter for blocking the IR irradiation light on the R, G, and B pixels.
  • FIG. 6 is an external view showing an arrangement example of the IR-SL light source.
  • the imaging device 12 incorporates the IR light irradiator 11A for irradiating IR light as the IR-SL light source, so that the configuration of imaging and the configuration of irradiation of IR light are in the same housing.
  • the IR light irradiation unit 11A is disposed in the vicinity of the optical system 31 represented by the lens of the imaging device 12.
  • Thereby, the position of the image sensor 32 provided behind the optical system 31 and the position of the IR light irradiation unit 11A can be brought closer to each other, which makes it possible to match the dot pattern and the pixel projection areas even at a short distance.
  • FIG. 6 shows an example in which the imaging device 12 and the IR light irradiation device 11 as an IR-SL light source are configured independently of each other.
  • the IR light irradiation device 11 is detachably mounted on, for example, a housing of the imaging device 12 with an adjuster or the like.
  • In the above, the RGB-IR pixels have been described as being arranged in a 2 × 2 matrix, but the arrangement of the RGB-IR pixels is not limited to a 2 × 2 matrix. For example, a pixel array other than RGB-IR may be used, or the pixels may be arranged in a 3 × 3 or 4 × 4 matrix.
  • As the pattern shape of the IR-SL light source, a dot pattern configured by arranging each dot of IR light in a predetermined pattern has been described as an example. However, the pattern shape of the IR-SL light source is not limited to the dot pattern, and may be another shape, such as a pattern formed so that the light straddles a plurality of pixels, as long as the shape corresponds to the projection areas of the pixels.
  • In the above, each dot of the dot pattern is projected onto the projection areas of the IR pixels, but each dot of the dot pattern may instead be projected onto a specific pixel area among the R, G, and B pixels.
  • In that case, the number of pixels that receive the IR signal increases, so the resolution of the IR image can be increased. The IR signal mixed into, for example, the G pixels is then removed by subtraction processing using a matrix operation, as sketched below.
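A hedged sketch of such a subtraction, with the mixing coefficient treated as a hypothetical calibration value rather than a number from this document:

```python
import numpy as np

# Hypothetical calibration: fraction of the IR dot signal that leaks into the G channel.
ALPHA_G_IR = 0.9   # assumed coefficient; obtained by calibration in practice

def remove_ir_from_g(g_raw: np.ndarray, ir: np.ndarray) -> np.ndarray:
    """Subtract the IR contribution mixed into the G pixels."""
    return g_raw - ALPHA_G_IR * ir

# Equivalent 2x2 matrix form acting on the stacked (G_raw, IR) signals:
M = np.array([[1.0, -ALPHA_G_IR],
              [0.0,  1.0]])
g_raw, ir = 300.0, 120.0
g_clean, ir_out = M @ np.array([g_raw, ir])
print(g_clean, ir_out)   # 192.0 120.0
```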
  • In the above, the case of using one IR-SL light source has been described, but, for example, light from a plurality of light sources may be irradiated onto one pixel in order to increase the intensity of the irradiation light.
  • FIG. 7 is a diagram illustrating a second configuration example of a camera system to which the present technology is applied.
  • the camera system 51 of FIG. 7 includes a light irradiation device 61 as an SL light source and an imaging device 62.
  • projection areas of pixels are virtually shown by broken lines on a plane on which a subject is assumed.
  • the projection area is an area corresponding to the pixel array of the imaging device 62.
  • the characters shown in each projection area indicate that the pixels corresponding to each projection area are R, G, B pixels or IR pixels.
  • The camera system 51 differs from the camera system 1 of FIG. 1 in that it uses a monochrome (W/B) sensor on which no on-chip color filter is mounted.
  • The other parts have the same configuration as the camera system 1 of FIG. 1, so duplicate description of the common parts is omitted.
  • the four light sources included in the light irradiation device 61 are an R light source that emits R light, a G light source that emits G light, a B light source that emits B light, and an IR light source that emits IR light.
  • the R light source emits R light of a dot pattern that irradiates only the projection area corresponding to the R pixel.
  • the G light source emits G light of a dot pattern that irradiates only the projection area corresponding to the G pixel.
  • the B light source emits B light of a dot pattern that irradiates only the projection area corresponding to the B pixel.
  • the IR light source emits IR light of a dot pattern that irradiates only the projection area corresponding to the IR pixel.
  • each dot of the dot pattern is simultaneously projected as light from the four light sources of the light irradiation device 61, as indicated by four types of hatched circles.
  • the imaging device 62 captures an image of the subject in a state in which light of a dot pattern is emitted from the four light sources of the light irradiation device 61 so as to irradiate only the projection area of each pixel of R, G, B, and IR.
  • FIG. 8 is a block diagram showing a configuration example of an imaging device.
  • the imaging of the subject in the camera system 51 is performed by the imaging device 62 in a state in which the light of the dot pattern from the light irradiation device 61 as the SL light source is steadily irradiated.
  • the imaging device 62 includes an optical system 71 such as a lens, an image sensor 72, and an image generation unit 73.
  • the image sensor 72 photoelectrically converts incident light, and A / D converts the pixel value of each pixel of the pixel array unit to generate a signal of the pixel.
  • the image sensor 72 is a black and white (W / B) sensor on which the on-chip color filter is not mounted.
  • the image sensor 72 has a pixel array unit in which R, G, B pixels and IR pixels are arranged in a 2 ⁇ 2 matrix.
  • the image generation unit 73 generates R, G, and B images using R, G, and B signals from R, G, and B pixels among a plurality of pixels forming the pixel array unit of the image sensor 72, The generated R, G, and B images are output to a signal processing unit in a subsequent stage (not shown).
  • the image generation unit 73 generates an IR image using an IR signal from an IR pixel among a plurality of pixels constituting the pixel array unit of the image sensor 72, and generates the generated IR image in a subsequent signal processing (not shown). Output to the unit.
  • The image generation unit 73 is configured to include a signal separation unit 81, interpolation processing units 82-1 to 82-4, and image quality improvement signal processing units 83-1 to 83-4.
  • the signal separation unit 81 separates the R signal from the signal of the image sensor 72, and outputs the R signal to the interpolation processing unit 82-1.
  • the signal separation unit 81 separates the G signal from the signal of the image sensor 72, and outputs the G signal to the interpolation processing unit 82-2.
  • the signal separation unit 81 separates the B signal from the signal of the image sensor 72, and outputs the B signal to the interpolation processing unit 82-3.
  • the signal separation unit 81 separates the IR signal from the signal of the image sensor 72, and outputs the IR signal to the interpolation processing unit 82-4.
  • The interpolation processing units 82-1 to 82-3 perform interpolation processing, such as demosaicing processing for generating the pixel signals of missing colors according to the arrangement of the R, G, and B pixels, thereby generating an R image, a G image, and a B image, which are supplied to the image quality improvement signal processing units 83-1 to 83-3, respectively.
  • the interpolation processing unit 82-4 performs interpolation processing of the IR signal to generate an IR image after interpolation, and outputs the IR image to the image quality improvement signal processing unit 83-4.
  • The image quality improvement signal processing units 83-1 to 83-3 perform image quality improvement processing of the R image, the G image, and the B image, and output the R image, the G image, and the B image after the processing.
  • The image quality improvement signal processing unit 83-4 performs image quality improvement processing of the IR image, and outputs the IR image after the processing.
  • FIG. 9 is an external view showing an arrangement configuration example of the SL light source.
  • FIG. 9 shows an example in which the configuration for imaging and the configuration for light irradiation are integrated in the same casing by incorporating the light irradiation units, including the light irradiation unit 61A-4 for irradiating IR light, into the imaging device 62.
  • the light emitting units 61A-1 to 61A-4 are disposed in the vicinity of the optical system 71 represented by the lens of the imaging device 62.
  • the position of the image sensor 72 provided behind the optical system 71 and the positions of the light emitting units 61A-1 to 61A-4 can be arranged closer to each other. Further, it is possible to make the dot pattern and the pixel projection area coincide with each other at a short distance.
  • the light irradiation device 61 includes a light irradiation unit 61A-1 for irradiating R light, a light irradiation unit 61A-2 for irradiating G light, a light irradiation unit 61A-3 for irradiating B light, and a light irradiation unit for irradiating IR light 61A-4.
  • the light irradiation device 61 is, for example, detachably mounted on the housing of the imaging device 62 with an adjuster or the like.
  • the light irradiation device 61 can be replaced according to the application, and, for example, switching to an SL light source of different wavelength is possible.
  • In step S51 of FIG. 10, the image sensor 72 captures an image of the subject in a state where the light of the dot patterns from the light irradiation units 61A-1 to 61A-4 of the light irradiation device 61 is irradiated.
  • the image sensor 72 photoelectrically converts incident light, and A / D converts the pixel value of each pixel of the pixel array unit to generate a signal of the pixel.
  • In step S52, the signal separation unit 81 separates the R signal, the G signal, the B signal, and the IR signal from the signal from the image sensor 72.
  • the R signal, the G signal, and the B signal are respectively output to the interpolation processing unit 82-1 to the interpolation processing unit 82-3.
  • the IR signal is output to the interpolation processing unit 82-4.
  • In step S53, the interpolation processing units 82-1 to 82-3 respectively perform interpolation processing of the R signal, the G signal, and the B signal to generate R, G, and B images after interpolation.
  • the interpolation processing unit 82-4 performs interpolation processing of the IR signal to generate an IR image after interpolation.
  • the R, G, B images after interpolation are output to the image quality improvement signal processing unit 83-1 to the image quality improvement signal processing unit 83-3, respectively.
  • the IR image after interpolation is output to the image quality improvement signal processing unit 83-4.
  • In step S54, the image quality improvement signal processing units 83-1 to 83-3 perform image quality improvement processing of the R, G, and B images.
  • The image quality improvement signal processing unit 83-4 performs image quality improvement processing of the IR image.
  • the R, G, B images and the IR image after the image quality improvement processing are output to the signal processing unit in the subsequent stage.
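Because the sensor carries no color filters, the "color" of each pixel is defined only by which SL light source illuminates its projection area, so the separation in step S52 reduces to slicing the monochrome frame by the known dot-pattern assignment. A minimal sketch under the 2 × 2 assignment of FIG. 7 (the exact layout is an assumption):

```python
import numpy as np

# Assumed 2x2 assignment of SL light sources to pixel positions:
#   R G
#   B IR
CHANNELS = {"R": (0, 0), "G": (0, 1), "B": (1, 0), "IR": (1, 1)}

def separate_channels(mono_frame: np.ndarray) -> dict:
    """Split a monochrome (W/B) sensor frame into R, G, B, and IR planes
    according to which dot-pattern light source illuminates each pixel."""
    planes = {}
    for name, (row_off, col_off) in CHANNELS.items():
        planes[name] = mono_frame[row_off::2, col_off::2]   # quarter-resolution plane
    return planes

frame = np.random.randint(0, 1024, (8, 8))   # dummy monochrome readout
planes = separate_channels(frame)
print({k: v.shape for k, v in planes.items()})   # each plane is (4, 4)
# Each plane would then be interpolated back to full resolution by the
# corresponding interpolation processing unit 82-1 to 82-4.
```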
  • FIG. 11 is a diagram illustrating an example of the pixel array.
  • A of FIG. 11 shows an example of a pixel arrangement similar to that of FIG. 7. In A of FIG. 11, four light sources of R, G, B, and IR are used as the SL light sources forming the dot patterns, and light of a dot pattern corresponding to each projection area of the 2 × 2 pixels is emitted.
  • By using the combination of the four SL light sources of R, G, B, and IR and the corresponding dot patterns, it is possible to obtain a visible image and an IR image as in the first embodiment.
  • B of FIG. 11 is an example in which the IR pattern of the combination of the SL light sources shown in A of FIG. 11 is replaced with a G pattern.
  • C of FIG. 11 is an example in which a plurality of light sources with wavelength bands other than the R, G, B, and IR wavelength bands (nine types in the case of C of FIG. 11) are used, expanded over a wide 3 × 3 area surrounded by thick lines. By using such a combination of nine SL light sources and the corresponding dot patterns, it is possible to acquire the multispectral and spectral characteristics of an object necessary for analysis, as sketched below. The arrangement may also be expanded over a wide 4 × 4 area.
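For the 3 × 3 multispectral variant, the same slicing idea extends to nine planes, one per SL wavelength; a sketch in which the wavelength-to-position assignment is hypothetical:

```python
import numpy as np

# Hypothetical assignment of nine SL wavelengths (nm) to positions in a 3x3 cell.
WAVELENGTHS_NM = [450, 500, 550, 600, 650, 700, 750, 800, 850]

def multispectral_cube(mono_frame: np.ndarray) -> dict:
    """Build a nine-band spectral cube from a monochrome frame illuminated
    by nine SL dot patterns, one per position of the 3x3 cell."""
    bands = {}
    for idx, wl in enumerate(WAVELENGTHS_NM):
        r, c = divmod(idx, 3)
        bands[wl] = mono_frame[r::3, c::3]
    return bands

frame = np.random.randint(0, 1024, (9, 9))   # dummy monochrome readout
cube = multispectral_cube(frame)
print({wl: band.shape for wl, band in cube.items()})   # nine (3, 3) bands
# The per-band values can then be compared across wavelengths to estimate
# the spectral characteristics of the subject.
```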
  • the wavelength band to be acquired varies depending on the material of the subject, so a camera system that can flexibly change the probe wavelength for analysis like the camera system of the present technology is useful.
  • In the above, the dot pattern has been described as an example of the pattern shape of the SL light sources, but the pattern shape of the SL light sources is not limited to the dot pattern, and may be another shape, such as a pattern formed so that the light straddles a plurality of pixels, as long as it corresponds to the projection areas of the pixels.
  • In the above, the case of arranging only one SL light source per wavelength has been described, but a plurality of SL light sources may be used, for example, to irradiate the dot pattern corresponding to the same pixels in order to increase the intensity of the irradiation light.
  • In the above, the pattern light of one type of SL light source is projected onto the projection area of each pixel, but pattern light from plural types of SL light sources having different wavelength bands may be projected onto the same pixel projection area. These options are selected depending on the application of the spectral analysis.
  • the camera system of the second embodiment can be used for spectral analysis and the like in technical fields such as bio and medical.
  • the imaging of a normal color image, the acquisition of a multispectral spectral image, the observation of a specific fluorescence emission and the like can be performed by using the camera system of the second embodiment.
  • By applying the camera system according to the second embodiment to the technical fields of biotechnology and medicine, it is possible to simultaneously perform fluorescence observation, in which the fluorescence produced by excitation light is observed, and normal color imaging.
  • As described above, in the present technology, imaging is performed in a state where light of a predetermined pattern from a structured light source is emitted onto the projection areas of specific pixels of an imaging unit that captures the subject.
  • An image of the subject is then generated based on the pixel signals.
  • This realizes a camera system capable of separating and simultaneously acquiring images having different characteristics, for example a visible image and an IR image, without causing crosstalk.
  • In a multi-spectral camera, many color filters corresponding to various wavelength bands can be arranged at the pixels of the sensor, and a pixel signal for each band can be acquired. By using a multi-spectral camera, the spectral reflection characteristics of the subject are analyzed, or material identification and analysis of the subject are performed.
  • According to the present technology, it is possible to perform, with one sensor and without requiring on-chip color filters, imaging having spectral characteristics similar to those obtained when on-chip color filters are disposed.
  • FIG. 12 is a diagram illustrating a third configuration example of a camera system to which the present technology is applied.
  • the camera system 101 of FIG. 12 includes an IR light irradiation apparatus 111 as an IR-SL light source, and an imaging apparatus 112.
  • projection areas of pixels are virtually shown by broken lines on a plane on which an object is assumed.
  • The projection area is an area corresponding to the pixel array of the imaging device 112. The characters R, G, B, and T shown in each projection area indicate that the pixels corresponding to each projection area are R, G, and B pixels or pixels for TOF (Time Of Flight), respectively.
  • the IR light irradiation device 111 is a device that irradiates IR light, and is fixedly disposed in the vicinity of the imaging device 112.
  • The IR light irradiation device 111 irradiates, in a blinking manner, IR light of a dot pattern that illuminates only the projection areas corresponding to the TOF pixels.
  • each dot of the dot pattern is projected as IR light from the IR light irradiator 111, as indicated by a colored circle, on the projection area of the TOF pixel.
  • the imaging device 112 includes an image sensor in which R, G, B pixels and TOF pixels are arranged.
  • the shutter system of the image sensor may be a rolling shutter system or a global shutter system.
  • the imaging device 112 images a subject in a state where IR light of a dot pattern that irradiates only the projection area corresponding to the TOF pixel is irradiated by the IR light irradiation device 111.
  • At the R, G, and B pixels, visible light from a predetermined light source is received.
  • Thereby, in the imaging device 112, a visible image corresponding to the signals from the R, G, and B pixels is generated, and distance information is obtained using the signals from the TOF pixels.
  • The IR light irradiation device 111 may be configured integrally with the imaging device 112, or may be configured to be detachable from the imaging device 112.
  • FIG. 13 is a diagram for explaining the principle of a structured light source.
  • the IR light irradiation apparatus 111 has a configuration in which the diffraction grating 122 is provided in front of the laser light source 121.
  • By appropriately designing the diffraction grating 122, it becomes possible to irradiate IR light of a dot pattern onto arbitrary positions in a matrix (for example, the projection areas of the TOF pixels in FIG. 12).
  • FIG. 14 is a block diagram showing a configuration example of an IR light irradiation apparatus and an imaging apparatus.
  • the IR light irradiation apparatus 111 includes a laser light source 121, a diffraction grating 122, and an IR-LED driver 131.
  • the IR-LED driver 131 controls the flickering irradiation operation of the laser light source 121 in accordance with the LED ON / OFF signal and the LED intensity adjustment signal supplied from the imaging device 112.
  • the ON / OFF signal of the LED is a signal indicating ON and OFF of the LED.
  • the LED intensity adjustment signal is a signal for adjusting the intensity of the LED.
  • the imaging device 112 includes an optical system 141 such as a lens, an IR band pass filter 142, an image sensor 143, and a camera DSP 144.
  • an optical system 141 such as a lens, an IR band pass filter 142, an image sensor 143, and a camera DSP 144.
  • the image sensor 143 has a pixel array unit in which R, G, B pixels and TOF pixels are arranged.
  • the image sensor 143 photoelectrically converts incident light and A / D converts the pixel value of each pixel of the pixel array unit to generate a signal of the pixel.
  • The camera DSP 144 generates a color image using the R, G, and B signals from the R, G, and B pixels among the plurality of pixels constituting the pixel array unit of the image sensor 143, and outputs the generated color image to a signal processing unit at a subsequent stage (not shown). Further, the camera DSP 144 calculates a distance using the TOF signals from the TOF pixels among the plurality of pixels constituting the pixel array unit of the image sensor 143. The camera DSP 144 generates an AF (Auto Focus) control signal for controlling AF from the distance information indicating the calculated distance. The generated AF control signal is used to drive the optical system 141.
  • the camera DSP 144 generates the LED ON / OFF signal and the LED intensity adjustment signal, and outputs the generated LED ON / OFF signal and the LED intensity adjustment signal to the IR-LED driver 131.
  • FIG. 15 is a diagram showing an example of the pixel array of the image sensor.
  • the pixel array unit of the image sensor 143 is configured by a pixel array in which G pixels are replaced with TOF pixels every four pixels in the horizontal direction and the vertical direction of the Bayer array.
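This placement can be written out as a small mask generator (a sketch; which G position within each Bayer cell is replaced is an assumption, as is the R/G/G/B cell orientation):

```python
import numpy as np

def pixel_type_map(height: int, width: int) -> np.ndarray:
    """Return pixel-type labels for a Bayer array in which one G pixel is
    replaced by a TOF pixel every 4 pixels horizontally and vertically."""
    # Base Bayer cell (assumed):  R G
    #                             G B
    bayer = np.array([["R", "G"], ["G", "B"]])
    types = np.tile(bayer, (height // 2, width // 2)).astype(object)
    # Replace the G at the assumed (0, 1) position of every 4x4 block with a TOF pixel.
    types[0::4, 1::4] = "TOF"
    return types

print(pixel_type_map(8, 8))
```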
  • FIG. 16 is a diagram for explaining the relationship between the angle of view of the imaging device and the irradiation angle of the IR-SL light source.
  • In FIG. 16, the solid lines within the SL irradiation angle centered on the optical axis 111C of the IR light irradiation device 111 indicate the boundaries of the SL irradiation area, and the broken lines within the angle of view centered on the optical axis 112C of the imaging device 112 indicate the boundaries of the pixel area.
  • the SL irradiation area is an area where dots are irradiated from the IR-SL light source to the pixel area.
  • the SL irradiation angle of the IR light irradiation device 111 and the angle of view of the imaging device 112 are set so as to approximately match.
  • the solid line L1 on the left side of FIG. 16 indicates the projection area of the imaging device 112.
  • the range indicated by the bidirectional arrow corresponds to the projection area of one pixel.
  • An alternate long and short dash line L2 on the right side of the pixel projection area indicates the parallax matching limit distance described above with reference to FIG.
  • the black rectangles on the solid line L1 and on the dashed-dotted line L2 represent the dots of the dot pattern.
  • the IR light of the dot pattern from the IR light irradiation device 111 of FIG. 16 is irradiated only to the region corresponding to the TOF pixel in the pixel array of the imaging device 112. At that time, the imaging device 112 and the IR light irradiation device 111 are fixed.
  • As long as the subject is farther away than the parallax matching limit distance indicated by the alternate long and short dash line L2, the dot pattern irradiated from the IR light irradiation device 111 corresponds 1:1 to the projection areas of the TOF pixels, regardless of the distance to the subject. Reflected IR light therefore does not reach the R, G, and B pixels and is received only by the TOF pixels.
  • The parallax matching limit distance is the same as in the case of FIG. 3, and thus its description is omitted.
  • FIG. 17 is a view for explaining the relationship between the angle of view of the imaging apparatus and the irradiation angle of the IR-SL light source when a dichroic mirror is used.
  • the dichroic mirror 151 is formed so as to reflect light in a direction perpendicular to the incident surface and transmit light in a direction parallel to the incident surface.
  • the dichroic mirror 151 is disposed in front of the optical system 141 of the imaging device 112 so that the center of the optical axis after reflection of the dichroic mirror 151 and the center of the optical axis 112C of the imaging device 112 substantially coincide with each other.
  • the IR light irradiation device 111 is disposed perpendicularly to the optical axis 112 C of the imaging device 112 so as to emit light in the direction perpendicular to the incident surface of the dichroic mirror 151.
  • the center of the optical axis after reflection of the dichroic mirror 151 and the center of the optical axis 112C of the imaging device 112 can be made to substantially coincide.
  • Since the IR light of the dot pattern emitted from the IR light irradiation device 111 is reflected by the dichroic mirror 151 and the light from the subject is transmitted through it, approximately 50% of the IR light can be received by the imaging device 112.
  • the correspondence between the dot pattern and the projection area of the pixel can be made to substantially coincide, including the short distance.
  • Instead of the dichroic mirror 151, a dichroic prism, a polarizing beam splitter, or the like may be disposed.
  • the dichroic mirror 151 may be used also in the camera system of the other embodiments.
  • FIG. 18 is a cross-sectional view showing an exemplary configuration of part of the light incident side of the image sensor.
  • the light receiving pixel 161, the insulating layer 162, the filter layer 163, the color filter layer 164, and the on-chip lens 165 are shown as a part of the configuration on the light incident side in the image sensor 143.
  • the light receiving pixel 161 is configured of a B pixel, a G pixel, an R pixel, and a TOF pixel in order from the left.
  • the insulating layer 162 transmits the light transmitted through the filter layer 163 to the light receiving pixel 161.
  • the filter layer 163 includes an IR blocking filter disposed on the B pixel, the G pixel, the R pixel, and a blue filter disposed on the TOF pixel.
  • the IR blocking filter blocks light in a wavelength range of IR light (for example, around 850 nm).
  • the blue filter is disposed to overlap with the red filter of the color filter layer 164 to transmit only IR light.
  • the color filter layer 164 includes a blue filter disposed on the B pixel, a green filter disposed on the G pixel, and a red filter disposed on the R pixel and the TOF pixel.
  • the blue filter blocks light in the G wavelength range and light in the R wavelength range and transmits light in the B wavelength range.
  • the green filter blocks light in the B wavelength range and light in the R wavelength range and transmits light in the G wavelength range.
  • the red filter blocks light in the G wavelength band and light in the B wavelength band and transmits light in the R wavelength band.
  • the on-chip lens 165 is configured of a lens disposed on each pixel of the light receiving pixel 161.
  • an IR band pass filter 142 is disposed between the optical system 141 and the image sensor 143.
  • the IR band pass filter 142 is a band pass filter having transparency in the visible region and in the wavelength region of IR light.
  • FIG. 19 is a diagram showing spectral characteristics of the IR band pass filter and the IR blocking filter.
  • the IR band pass filter 142 transmits light in wavelength ranges of 400 nm to 680 nm and 830 nm to 870 nm, and blocks light in wavelength ranges other than 400 nm to 680 nm and 830 nm to 870 nm.
  • The IR blocking filter attenuates light in the wavelength range around 850 nm down to a transmittance of about 0.1.
  • The IR band pass filter 142 completely blocks light in wavelength ranges other than the visible range and the wavelength range of the IR light (transmittance 0).
  • The IR blocking filter, however, does not completely block light in the wavelength range of the IR light. For this reason, as shown below, the transmittance in the IR wavelength region is not zero.
  • FIG. 20 is a diagram showing spectral characteristics corresponding to each pixel.
  • The spectral characteristics shown in FIG. 20 also include the spectral sensitivity of the sensor.
  • the R pixel is set to be sensitive to light in a wavelength range of approximately 590 nm to 630 nm.
  • the G pixel is set to be sensitive to light in a wavelength range of approximately 490 nm to 550 nm.
  • the B pixel is set to be sensitive to light in a wavelength range of approximately 440 nm to 470 nm.
  • the TOF pixels are set to be sensitive to light in a wavelength range of approximately 840 nm to 860 nm.
  • In the R, G, and B pixels, the transmittance for light in the IR wavelength range (for example, around 850 nm) is not completely zero. Leakage of IR light into the visible pixels has little effect on the color reproduction of visible images when only the IR light present in the natural environment is involved, but if artificial IR light is mixed in, the effect reaches a level that influences the color reproduction of visible images.
  • In the present technology, the IR light is set to irradiate only the TOF pixels. Therefore, according to the present technology, it is possible to almost completely avoid mixing of the projected IR light into the visible pixels.
  • FIG. 21 is a block diagram showing a configuration example of a camera DSP in an imaging device.
  • the camera DSP 144 includes a signal separation unit 181, an interpolation processing unit 182, a color image signal processing unit 183, a phase difference calculation processing unit 184, a distance calculation processing unit 185, and an AF control signal generation unit 186.
  • the signal separation unit 181 separates the R, G, and B signals from the signal of the image sensor 143, and outputs the signals to the interpolation processing unit 182. Further, the signal separation unit 181 separates the TOF signal from the signal of the image sensor 143, and outputs the TOF signal to the phase difference calculation processing unit 184.
  • The interpolation processing unit 182 uses the R, G, and B signals supplied from the signal separation unit 181 to perform interpolation processing, such as demosaicing processing that generates the pixel signals of missing colors according to the arrangement of the R, G, and B pixels.
  • the interpolation processing unit 182 outputs the color image generated by performing the interpolation processing to the color image signal processing unit 183.
  • the color image signal processing unit 183 performs predetermined signal processing on the color image supplied from the interpolation processing unit 182, and outputs the color image after signal processing to a signal processing unit in the subsequent stage.
  • the phase difference calculation processing unit 184 calculates the phase difference using the TOF signal supplied from the signal separation unit 181, and outputs phase difference information indicating the calculated phase difference to the distance calculation processing unit 185.
  • the distance calculation processing unit 185 calculates a distance using the phase difference information supplied from the phase difference calculation processing unit 184, and outputs distance information indicating the calculated distance.
  • the distance information output from the distance calculation processing unit 185 is supplied to the AF control signal generation unit 186 and a signal processing unit at a subsequent stage (not shown).
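The document does not spell out the phase-difference and distance calculations; the standard indirect-TOF relation, shown here as a hedged sketch, converts the measured phase shift of the modulated IR light into a distance:

```python
import math

C = 299_792_458.0   # speed of light in m/s

def tof_distance(phase_rad: float, modulation_hz: float) -> float:
    """Distance from the phase shift of amplitude-modulated IR light:
    d = c * phase / (4 * pi * f_mod), valid within one unambiguous range."""
    return C * phase_rad / (4.0 * math.pi * modulation_hz)

# Example: an assumed 20 MHz modulation and a measured phase shift of pi/2 radians.
f_mod = 20e6
print(tof_distance(math.pi / 2, f_mod))   # ~1.87 m
print(C / (2 * f_mod))                    # unambiguous range: ~7.5 m
```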
  • the AF control signal generation unit 186 calculates lens position information using the distance information supplied from the distance calculation processing unit 185 and the conversion formula from the distance information to the lens position information.
  • the AF control signal generator 186 generates an AF control signal based on the calculated lens position information.
  • the AF control signal is output to a drive unit (not shown) of the optical system 141.
  • In step S111, the image sensor 143 captures an image of the subject in a state where the IR light of the dot pattern from the IR light irradiation device 111, which is the IR-SL light source, is irradiated.
  • The image sensor 143 photoelectrically converts the incident light and A/D converts the pixel value of each pixel of the pixel array unit to generate a pixel signal.
  • In step S112, the signal separation unit 181 separates the RGB signals and the TOF signal from the signal from the image sensor 143.
  • In step S113, the interpolation processing unit 182 performs interpolation processing of the R, G, and B signals supplied from the signal separation unit 181 to generate a color image, and outputs the color image to the color image signal processing unit 183.
  • In step S114, the color image signal processing unit 183 performs predetermined signal processing on the color image supplied from the interpolation processing unit 182, and outputs the color image after the signal processing to the signal processing unit at the subsequent stage.
  • In step S115, the phase difference calculation processing unit 184 calculates the phase difference using the TOF signal supplied from the signal separation unit 181, and outputs phase difference information indicating the calculated phase difference to the distance calculation processing unit 185.
  • In step S116, the distance calculation processing unit 185 performs distance calculation processing using the phase difference information supplied from the phase difference calculation processing unit 184.
  • The distance information output as a result of the distance calculation processing is supplied to the AF control signal generation unit 186 and a signal processing unit at a subsequent stage (not shown).
  • In step S117, the AF control signal generation unit 186 calculates lens position information from the distance information supplied from the distance calculation processing unit 185 using a conversion formula to lens position information, and generates an AF control signal based on the calculated lens position information.
  • The AF control signal is output to a drive unit (not shown) of the optical system 141.
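The conversion formula from distance information to lens position information is device-specific and is not given in this document; a thin-lens based sketch illustrates one plausible form (the focal length and the zero point of the lens position are assumptions):

```python
def lens_image_distance(subject_distance_m: float, focal_length_m: float) -> float:
    """Thin-lens relation 1/f = 1/d_o + 1/d_i solved for the image distance d_i."""
    return 1.0 / (1.0 / focal_length_m - 1.0 / subject_distance_m)

def af_lens_position(subject_distance_m: float,
                     focal_length_m: float = 0.004) -> float:
    """Lens extension (in meters) relative to the infinity-focus position,
    which the AF control signal would translate into an actuator command."""
    return lens_image_distance(subject_distance_m, focal_length_m) - focal_length_m

# Example: subject at 0.6 m with an assumed 4 mm focal length.
print(af_lens_position(0.6))   # ~2.7e-5 m of extension toward the macro side
```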
  • color images and distance information are used for AF control of portable terminals such as smart phones, security applications requiring color images and distance information such as face recognition, and gesture recognition for games and the like.
  • In the above, the image sensor has been described on the premise of an RGB Bayer array sensor, but the present technology can also be applied to a monochrome sensor or a sensor with a color filter array other than the RGB Bayer array.
  • In the above, the TOF pixels are arranged every four pixels in the RGB Bayer array of the image sensor, but the density of the TOF pixels may be different from the above description. Also, the arrangement of the TOF pixels may be asymmetric in the vertical and horizontal directions. Furthermore, both TOF pixels and image plane phase difference pixels may be arranged.
  • In the above, the dot pattern has been described as an example of the pattern shape of the SL light source. However, the pattern shape of the SL light source is not limited to the dot pattern, and may be another shape, such as a pattern formed so that the light straddles a plurality of pixels, as long as it corresponds to the projection areas of the pixels.
  • the IR blocking filter may not necessarily be used. In particular, it is unnecessary when ambient light does not include IR light, such as indoor use.
  • the reflected light from the subject due to the irradiation of the IR light is received only by the TOF pixels, and thus does not affect the visible signals obtained by the RGB pixels. Therefore, it becomes possible to separate a visible signal and a TOF signal which are signals having different characteristics.
  • FIG. 23 is a diagram illustrating a fourth configuration example of a camera system to which the present technology is applied.
  • the camera system 201 of FIG. 23 includes an IR light irradiation device 211 as an IR-SL light source, and an imaging device 212.
  • In FIG. 23, the projection areas of pixels are virtually shown by broken lines on a plane on which a subject is assumed to be located.
  • the projection area includes a visible pixel projection area and a triangulation projection area corresponding to the pixel array of the imaging device 212.
  • the visible pixel projection area is an area where visible pixels of R, G and B pixels are arranged.
  • the triangulation projection area is an area in which pixels for triangulation are arranged.
  • the triangulation projection area is formed at the center in the vertical direction of the visible pixel projection area.
  • the triangulation projection area is a band-like area with a width of 2 lines.
  • The IR light irradiation device 211 is a device that irradiates IR light, and is fixedly disposed at a distance from the imaging device 212 that is required for triangulation.
  • The IR light irradiation device 211 irradiates IR light of a dot pattern that illuminates only predetermined pixels located at random within the triangulation projection area. Each dot of the dot pattern is projected as IR light from the IR light irradiation device 211 onto the triangulation projection area.
  • the imaging device 212 includes an image sensor in which R, G, B pixels and triangulation pixels are arranged.
  • the shutter system of the image sensor may be a rolling shutter system or a global shutter system.
  • the imaging device 212 images a subject in a state where IR light of a dot pattern that irradiates only predetermined pixels of the triangulation projection area is irradiated by the IR light irradiation device 211.
  • At the R, G, and B pixels, visible light from a predetermined light source is received.
  • Thereby, in the imaging device 212, a visible image corresponding to the signals from the R, G, and B pixels is generated, and distance information is obtained using the signals from the triangulation pixels.
  • The IR light irradiation device 211 may be configured integrally with the imaging device 212, or may be configured to be detachable from the imaging device 212.
  • FIG. 24 is a diagram showing a configuration example of the IR light irradiation device and the imaging device. Among the configurations shown in FIG. 24, duplicate descriptions of configurations that are the same as those already described are omitted as appropriate.
  • The IR light irradiation device 211 is fixedly installed with respect to the imaging device 212 such that the optical axis 211C of the IR light irradiation device 211 and the optical axis 212C of the image sensor (lens) of the imaging device 212 are separated by the baseline distance Lb.
  • The IR light irradiation device 211 includes a laser light source 121, a diffraction grating 122, and an IR-LED driver 131, as in the case described above.
  • the imaging device 212 includes an optical system 141, an image sensor 231, and a camera DSP 232.
  • the image sensor 231 has a pixel array unit in which R, G, B pixels and triangulation pixels are arranged.
  • the image sensor 231 photoelectrically converts incident light and A / D converts the pixel value of each pixel of the pixel array unit to generate a signal of the pixel.
  • The camera DSP 232 generates a color image using the R, G, and B signals from the R, G, and B pixels among the plurality of pixels constituting the pixel array unit of the image sensor 231, and outputs the generated color image to a signal processing unit at a subsequent stage (not shown). Further, the camera DSP 232 calculates a distance using the triangulation signal from the triangulation pixels among the plurality of pixels constituting the pixel array unit of the image sensor 231, and generates an AF control signal from the calculated distance information. The generated AF control signal is used to drive the optical system 141.
  • the camera DSP 232 generates an LED ON / OFF signal and an LED intensity adjustment signal, and outputs the generated LED ON / OFF signal and the LED intensity adjustment signal to the IR-LED driver 131.
  • FIG. 25 is a diagram showing an example of the pixel array of the image sensor.
  • the pixel array unit of the image sensor 231 is configured of a visible pixel area in which R, G, B pixels are arranged, and a triangulation area in which pixels for triangulation are arranged.
  • the pixel array unit of the image sensor 231 is configured such that the first to third rows of the 2 ⁇ 2 Bayer array are visible pixel areas, and the fourth row of the 2 ⁇ 2 Bayer arrays is a triangulation area.
  • FIG. 26 is a block diagram showing a configuration example of a camera DSP in an imaging device.
  • the camera DSP 232 includes a signal separation unit 251, an interpolation processing unit 252, a color image signal processing unit 253, a distance calculation processing unit 254, and an AF control signal generation unit 255.
  • The signal separation unit 251 separates the R, G, and B signals from the signal of the image sensor 231 and outputs them to the interpolation processing unit 252. Further, the signal separation unit 251 separates the triangulation signal from the signal of the image sensor 231, and outputs the triangulation signal to the distance calculation processing unit 254.
  • the interpolation processing unit 252 performs interpolation processing such as demosaicing processing that generates pixel signals of missing colors according to the arrangement of R, G, B pixels using the R, G, B signals supplied from the signal separation unit 251 To generate a color image and output the color image to the color image signal processing unit 253.
  • the color image signal processing unit 253 performs predetermined signal processing on the color image supplied from the interpolation processing unit 252, and outputs the color image subjected to the signal processing to a signal processing unit in the subsequent stage.
• The distance calculation processing unit 254 calculates a distance using the triangulation signal supplied from the signal separation unit 251, and outputs distance information indicating the calculated distance.
  • the output distance information is supplied to the AF control signal generation unit 255 and a signal processing unit at a subsequent stage (not shown).
• The AF control signal generation unit 255 calculates lens position information from the distance information supplied from the distance calculation processing unit 254 using a conversion formula to lens position information, and generates an AF control signal based on the calculated lens position information. The generated AF control signal is output to a driving unit (not shown) of the optical system 141.
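• As a rough, non-authoritative sketch of the data flow of FIG. 26 (signal separation, interpolation, distance calculation, and AF control signal generation), the following Python outline may help; all function names, the placeholder demosaicing, and the thin-lens AF conversion are assumptions made for illustration and are not the patent's implementation.

    # Hypothetical outline of the camera DSP 232 processing blocks (assumed names).
    import numpy as np

    def separate_signals(raw, is_triangulation_row):
        """Split the sensor output into visible (R, G, B) rows and triangulation rows."""
        tri_mask = np.array([is_triangulation_row(r) for r in range(raw.shape[0])])
        return raw[~tri_mask], raw[tri_mask]

    def interpolate_rgb(visible_rows):
        """Placeholder for demosaicing: here the mosaic is simply copied to 3 channels."""
        return np.repeat(visible_rows[..., None], 3, axis=-1)

    def calculate_distance(tri_rows, focal_px, baseline_m):
        """Placeholder triangulation: distance = focal_length * baseline / disparity."""
        disparity = np.maximum(tri_rows.astype(float), 1e-6)  # stand-in for measured dot shift
        return focal_px * baseline_m / disparity

    def generate_af_control(distance_m, focal_m):
        """Thin-lens image distance used here as a stand-in for the AF control value."""
        return 1.0 / (1.0 / focal_m - 1.0 / distance_m)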
• In step S211 in FIG. 27, the image sensor 143 captures an image of the subject in a state in which the IR light of the dot pattern from the IR light irradiation device 211, which is an IR-SL light source, is irradiated.
  • the image sensor 143 photoelectrically converts incident light and A / D converts the pixel value of each pixel of the pixel array unit to generate a pixel signal.
• In step S212, the signal separation unit 251 separates the RGB signals and the triangulation signal from the signal from the image sensor 143.
  • the separated RGB signals are output to the interpolation processing unit 252, and the triangulation signal is output to the distance calculation processing unit 254.
• In step S213, the interpolation processing unit 252 performs interpolation processing on the R, G, and B signals supplied from the signal separation unit 251 to generate a color image, and outputs the color image to the color image signal processing unit 253.
• In step S214, the color image signal processing unit 253 performs predetermined signal processing on the color image supplied from the interpolation processing unit 252, and outputs the color image after the signal processing to the signal processing unit at the subsequent stage.
• In step S215, the distance calculation processing unit 254 performs distance calculation processing using the triangulation signal supplied from the signal separation unit 251.
  • the distance information output as a result of the distance calculation process is supplied to the AF control signal generation unit 255 and a signal processing unit at a subsequent stage (not shown).
• In step S216, the AF control signal generation unit 255 calculates lens position information from the distance information supplied from the distance calculation processing unit 254 using a conversion formula to lens position information, and generates an AF control signal based on the calculated lens position information.
  • the generated AF control signal is output to a driving unit (not shown) of the optical system 141.
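• For intuition only, the following tiny numeric example (with made-up values, not taken from the disclosure) illustrates steps S215 and S216: a structured-light triangulation estimate of the form distance = focal_length × baseline / disparity, followed by a thin-lens conversion from distance to a lens position.

    # Made-up numbers for illustration; the conversion formula actually used is not specified here.
    focal_px = 1400.0      # lens focal length expressed in pixels (assumption)
    baseline_m = 0.05      # baseline distance Lb between IR projector and lens (assumption)
    disparity_px = 14.0    # measured shift of an IR dot on the triangulation pixels

    distance_m = focal_px * baseline_m / disparity_px            # -> 5.0 m to the subject

    focal_m = 0.004        # 4 mm lens (assumption)
    lens_position_m = 1.0 / (1.0 / focal_m - 1.0 / distance_m)   # image distance for focus
    print(distance_m, lens_position_m)                           # 5.0, ~0.004003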
  • color images and distance information are used for AF control of portable terminals such as smart phones, security applications requiring color images and distance information such as face recognition, and gesture recognition for games and the like.
• The image sensor has been described on the premise of an RGB Bayer array sensor, but the present technology can also be applied to a monochrome sensor or to a sensor having a color filter array other than the RGB Bayer array.
• In the above description, the triangulation pixels are arranged every four rows of the vertical Bayer array, but the triangulation pixels may be arranged at a different density. Also, an example in which the triangulation area, which is a band-like area, is configured with a width of two lines has been described, but the triangulation area may be configured with a width of one line or with another width.
• The triangulation pixels described in the fourth embodiment may be used in combination with the TOF pixels described in the third embodiment. That is, it is possible to arrange the triangulation pixels and the TOF pixels in the strip-like triangulation projection area of the fourth embodiment and to perform the triangulation and the TOF distance measurement simultaneously. In this case, the accuracy of distance measurement can be improved.
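• One simple, purely illustrative way such a combination could improve accuracy (this fusion rule is an assumption and is not stated in the disclosure) is to fuse the two independent distance estimates with inverse-variance weights, as in the sketch below.

    # Hypothetical fusion of a triangulation estimate and a TOF estimate (assumed method).
    def fuse_distances(d_tri, var_tri, d_tof, var_tof):
        """Inverse-variance weighted average: the less noisy estimate gets more weight."""
        w_tri, w_tof = 1.0 / var_tri, 1.0 / var_tof
        return (w_tri * d_tri + w_tof * d_tof) / (w_tri + w_tof)

    # e.g. triangulation 4.9 m +/- 0.2 m and TOF 5.1 m +/- 0.1 m fuse to about 5.06 m
    print(fuse_distances(4.9, 0.2 ** 2, 5.1, 0.1 ** 2))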
  • the reflected light from the subject due to the irradiation of the IR light is received only by the triangulation pixels, and therefore does not affect the visible signals obtained by the RGB pixels. Therefore, it becomes possible to separate a visible signal and a triangulation signal which are signals having different characteristics.
  • the series of processes described above can be performed by hardware or software.
  • a program that configures the software is installed on a computer.
• The computer may be a computer incorporated in dedicated hardware, or may be, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
  • FIG. 28 is a block diagram showing an example of the hardware configuration of a computer that executes the series of processes described above according to a program.
• A central processing unit (CPU) 301, a read only memory (ROM) 302, and a random access memory (RAM) 303 are mutually connected via a bus 304.
• An input / output interface 305 is also connected to the bus 304.
  • An input unit 306, an output unit 307, a storage unit 308, a communication unit 309, and a drive 310 are connected to the input / output interface 305.
  • the input unit 306 includes, for example, a keyboard, a mouse, a microphone, a touch panel, an input terminal, and the like.
  • the output unit 307 includes, for example, a display, a speaker, and an output terminal.
  • the storage unit 308 includes, for example, a hard disk, a RAM disk, and a non-volatile memory.
  • the communication unit 309 includes, for example, a network interface.
  • the drive 310 drives removable media 311 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
• The CPU 301 loads the program stored in the storage unit 308 into the RAM 303 via the input / output interface 305 and the bus 304 and executes it, whereby the series of processes described above is performed.
  • the RAM 303 also stores data necessary for the CPU 301 to execute various processes.
• The program executed by the computer (CPU 301) can be provided by being recorded on the removable medium 311 as a package medium or the like, for example.
• The program can be installed in the storage unit 308 via the input / output interface 305 by attaching the removable medium 311 to the drive 310.
  • the program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting. In that case, the program can be received by the communication unit 309 and installed in the storage unit 308.
  • this program can be installed in advance in the ROM 302 or the storage unit 308.
• The program executed by the computer may be a program in which processing is performed in chronological order according to the order described in this specification, or may be a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
  • the technology according to the present disclosure can be applied to various products.
• The technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 29 is a block diagram showing a schematic configuration example of a vehicle control system 7000 which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • Vehicle control system 7000 comprises a plurality of electronic control units connected via communication network 7010.
  • the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside information detection unit 7400, an inside information detection unit 7500, and an integrated control unit 7600.
• The communication network 7010 connecting the plurality of control units may be, for example, an in-vehicle communication network conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
• Each control unit includes a microcomputer that performs arithmetic processing in accordance with various programs, a storage unit that stores the programs executed by the microcomputer and the parameters used for various arithmetic operations, and a drive circuit that drives the devices to be controlled. Each control unit is provided with a network I / F for communicating with other control units via the communication network 7010, and with a communication I / F for communicating with apparatuses or sensors inside or outside the vehicle by wired or wireless communication.
• In FIG. 29, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I / F 7620, a dedicated communication I / F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I / F 7660, an audio image output unit 7670, an in-vehicle network I / F 7680, and a storage unit 7690 are illustrated.
  • the other control units also include a microcomputer, a communication I / F, a storage unit, and the like.
  • Drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs.
• For example, the drive system control unit 7100 functions as a control device of a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the drive system control unit 7100 may have a function as a control device such as an ABS (Antilock Brake System) or an ESC (Electronic Stability Control).
  • Vehicle state detection unit 7110 is connected to drive system control unit 7100.
• The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational movement of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the operation amount of the accelerator pedal, the operation amount of the brake pedal, the steering angle of the steering wheel, the engine speed, the rotational speed of the wheels, and the like.
  • Drive system control unit 7100 performs arithmetic processing using a signal input from vehicle state detection unit 7110 to control an internal combustion engine, a drive motor, an electric power steering device, a brake device, and the like.
  • Body system control unit 7200 controls the operation of various devices equipped on the vehicle body according to various programs.
  • the body control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device of various lamps such as a head lamp, a back lamp, a brake lamp, a blinker or a fog lamp.
  • the body system control unit 7200 may receive radio waves or signals of various switches transmitted from a portable device substituting a key.
  • Body system control unit 7200 receives the input of these radio waves or signals, and controls a door lock device, a power window device, a lamp and the like of the vehicle.
  • the battery control unit 7300 controls the secondary battery 7310 which is a power supply source of the drive motor according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining capacity of the battery is input to the battery control unit 7300 from the battery device provided with the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and performs temperature adjustment control of the secondary battery 7310 or control of a cooling device or the like provided in the battery device.
  • Outside-vehicle information detection unit 7400 detects information outside the vehicle equipped with vehicle control system 7000.
• An imaging unit 7410 and an outside-vehicle information detection unit 7420 are connected to the outside-vehicle information detection unit 7400.
  • the imaging unit 7410 includes at least one of a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, and another camera.
• The outside-vehicle information detection unit 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions and an ambient information detection sensor for detecting another vehicle, an obstacle, a pedestrian, or the like around the vehicle equipped with the vehicle control system 7000.
  • the environment sensor may be, for example, at least one of a raindrop sensor that detects wet weather, a fog sensor that detects fog, a sunshine sensor that detects sunshine intensity, and a snow sensor that detects snowfall.
  • the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a light detection and ranging (LIDAR) device.
  • the imaging unit 7410 and the external information detection unit 7420 may be provided as independent sensors or devices, or may be provided as an integrated device of a plurality of sensors or devices.
  • FIG. 30 shows an example of installation positions of the imaging unit 7410 and the external information detection unit 7420.
  • the imaging units 7910, 7912, 7914, 7916, 7918 are provided at, for example, at least one of the front nose of the vehicle 7900, the side mirror, the rear bumper, the back door, and the upper portion of the windshield of the vehicle interior.
  • An imaging unit 7910 provided in the front nose and an imaging unit 7918 provided in the upper part of the windshield in the vehicle cabin mainly acquire an image in front of the vehicle 7900.
  • the imaging units 7912 and 7914 provided in the side mirror mainly acquire an image of the side of the vehicle 7900.
  • An imaging unit 7916 provided in the rear bumper or back door mainly acquires an image behind the vehicle 7900.
  • the imaging unit 7918 provided on the upper part of the windshield in the passenger compartment is mainly used to detect a leading vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 30 shows an example of the imaging range of each of the imaging units 7910, 7912, 7914, and 7916.
• The imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the back door.
• For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 as viewed from above can be obtained.
  • the external information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, and corners of the vehicle 7900 and above the windshield of the vehicle interior may be, for example, ultrasonic sensors or radar devices.
  • the external information detection units 7920, 7926, 7930 provided on the front nose of the vehicle 7900, the rear bumper, the back door, and the upper part of the windshield of the vehicle interior may be, for example, a LIDAR device.
  • These outside-of-vehicle information detection units 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle or the like.
• The outside-vehicle information detection unit 7400 causes the imaging unit 7410 to capture an image of the outside of the vehicle and receives the captured image data. Further, the outside-vehicle information detection unit 7400 receives detection information from the connected outside-vehicle information detection unit 7420. When the outside-vehicle information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the outside-vehicle information detection unit 7400 causes ultrasonic waves or electromagnetic waves to be transmitted and receives information on the received reflected waves.
  • the external information detection unit 7400 may perform object detection processing or distance detection processing of a person, a car, an obstacle, a sign, a character on a road surface, or the like based on the received information.
  • the external information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions and the like based on the received information.
  • the external information detection unit 7400 may calculate the distance to an object outside the vehicle based on the received information.
  • the external information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a car, an obstacle, a sign, a character on a road surface, or the like based on the received image data.
• The outside-vehicle information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may combine image data captured by different imaging units 7410 to generate an overhead image or a panoramic image.
  • the external information detection unit 7400 may perform viewpoint conversion processing using image data captured by different imaging units 7410.
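• As a hedged illustration of the kind of processing mentioned above (distortion correction followed by a viewpoint conversion toward an overhead view), the following OpenCV-based Python sketch may be useful; the camera matrix, distortion coefficients, and ground-plane point correspondences are placeholder values, not calibration data from the disclosure.

    # Illustrative only: undistort a camera frame, then warp it toward an overhead viewpoint.
    import cv2
    import numpy as np

    def to_overhead(img, camera_matrix, dist_coeffs, src_pts, dst_pts, out_size):
        undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)   # distortion correction
        homography = cv2.getPerspectiveTransform(src_pts, dst_pts)     # viewpoint conversion
        return cv2.warpPerspective(undistorted, homography, out_size)

    # Placeholder calibration and correspondences (assumptions)
    K = np.array([[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]])
    dist = np.zeros(5)
    src = np.float32([[300, 700], [980, 700], [900, 400], [380, 400]])
    dst = np.float32([[300, 700], [980, 700], [980, 100], [300, 100]])
    # overhead = to_overhead(frame, K, dist, src, dst, (1280, 720))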
  • An in-vehicle information detection unit 7500 detects information in the vehicle.
  • a driver state detection unit 7510 that detects a state of a driver is connected to the in-vehicle information detection unit 7500.
  • the driver state detection unit 7510 may include a camera for imaging the driver, a biometric sensor for detecting the driver's biological information, a microphone for collecting sound in the vehicle interior, and the like.
  • the biological sensor is provided, for example, on a seat or a steering wheel, and detects biological information of an occupant sitting on a seat or a driver who grips the steering wheel.
• The in-vehicle information detection unit 7500 may calculate the degree of fatigue or the degree of concentration of the driver based on the detection information input from the driver state detection unit 7510, or may determine whether the driver is dozing off. The in-vehicle information detection unit 7500 may also perform processing such as noise canceling processing on the collected audio signal.
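• Purely as an illustration of what such a fatigue or dozing determination might look like (this heuristic is an assumption, not the method of the disclosure), a PERCLOS-style estimate counts the fraction of recent frames in which the driver's eyes are judged closed:

    # Hypothetical drowsiness heuristic: warn when the eyes are closed in more than
    # a threshold fraction of the most recent frames (PERCLOS-style).
    from collections import deque

    class DrowsinessEstimator:
        def __init__(self, window=300, threshold=0.3):
            self.closed = deque(maxlen=window)   # recent eye-closed flags
            self.threshold = threshold

        def update(self, eye_closed):
            """Feed one frame's eye state; return True if the driver may be dozing."""
            self.closed.append(bool(eye_closed))
            perclos = sum(self.closed) / len(self.closed)
            return perclos > self.threshold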
  • the integrated control unit 7600 controls the overall operation in the vehicle control system 7000 in accordance with various programs.
  • An input unit 7800 is connected to the integrated control unit 7600.
  • the input unit 7800 is realized by, for example, a device such as a touch panel, a button, a microphone, a switch or a lever, which can be input operated by the passenger.
  • the integrated control unit 7600 may receive data obtained by speech recognition of speech input by the microphone.
  • the input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an external connection device such as a mobile phone or a PDA (Personal Digital Assistant) corresponding to the operation of the vehicle control system 7000.
• The input unit 7800 may be, for example, a camera, in which case the passenger can input information by gesture. Alternatively, data obtained by detecting the movement of a wearable device worn by the passenger may be input. Furthermore, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on the information input by the passenger or the like using the above-described input unit 7800 and outputs the generated input signal to the integrated control unit 7600. By operating the input unit 7800, the passenger or the like inputs various data to the vehicle control system 7000 and instructs it to perform processing operations.
  • the storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like.
  • the storage unit 7690 may be realized by a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the general-purpose communication I / F 7620 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 7750.
• The general-purpose communication I / F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark).
• The general-purpose communication I / F 7620 may be connected, for example, to an apparatus (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via a base station or an access point. Furthermore, the general-purpose communication I / F 7620 may be connected to a terminal existing near the vehicle (for example, a terminal of a driver, a pedestrian, or a shop, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
  • the dedicated communication I / F 7630 is a communication I / F that supports a communication protocol designed for use in a vehicle.
• The dedicated communication I / F 7630 may implement, for example, a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
• The dedicated communication I / F 7630 typically performs V2X communication, a concept including one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
• The positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), executes positioning, and generates position information including the latitude, longitude, and altitude of the vehicle. The positioning unit 7640 may specify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, a PHS, or a smartphone.
• The beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from a radio station or the like installed on a road, and acquires information such as the current position, traffic congestion, road closures, or required time.
  • the function of the beacon reception unit 7650 may be included in the above-described dedicated communication I / F 7630.
  • An in-vehicle apparatus I / F 7660 is a communication interface that mediates the connection between the microcomputer 7610 and various in-vehicle apparatuses 7760 existing in the vehicle.
• The in-vehicle device I / F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB). Further, the in-vehicle device I / F 7660 may establish a wired connection such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and a cable, if necessary) not shown.
  • the in-vehicle device 7760 may include, for example, at least one of a mobile device or wearable device owned by a passenger, or an information device carried in or attached to a vehicle. Further, the in-vehicle device 7760 may include a navigation device for performing a route search to any destination.
  • the in-vehicle device I / F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the in-vehicle network I / F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the in-vehicle network I / F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
• The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various programs based on information acquired via at least one of the general-purpose communication I / F 7620, the dedicated communication I / F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I / F 7660, and the in-vehicle network I / F 7680. For example, the microcomputer 7610 may calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the acquired information inside and outside the vehicle, and output a control command to the drive system control unit 7100.
• For example, the microcomputer 7610 may perform cooperative control for the purpose of realizing the functions of an ADAS (advanced driver assistance system), including collision avoidance or shock mitigation of the vehicle, follow-up traveling based on the inter-vehicle distance, vehicle speed maintenance traveling, vehicle collision warning, vehicle lane departure warning, and the like. In addition, the microcomputer 7610 may perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the acquired information on the surroundings of the vehicle.
• The microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures or persons based on information acquired via at least one of the general-purpose communication I / F 7620, the dedicated communication I / F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I / F 7660, and the in-vehicle network I / F 7680, and may create local map information including the peripheral information of the current position of the vehicle. Further, the microcomputer 7610 may predict a danger such as a collision of the vehicle or the approach of a pedestrian or the like based on the acquired information, and may generate a warning signal.
  • the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
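• A minimal, hypothetical sketch of such a danger-prediction rule (the threshold and the form of the rule are assumptions, not part of the disclosure) is to warn when the time-to-collision with a detected object falls below a limit:

    # Illustrative time-to-collision (TTC) check based on measured distance and closing speed.
    def collision_warning(distance_m, closing_speed_mps, ttc_threshold_s=2.0):
        """Return True (emit a warning signal) when the object is approaching and TTC is short."""
        if closing_speed_mps <= 0.0:       # not closing in, so no warning
            return False
        ttc = distance_m / closing_speed_mps
        return ttc < ttc_threshold_s

    # e.g. an obstacle 12 m ahead closing at 8 m/s gives TTC = 1.5 s, so a warning is raised
    print(collision_warning(12.0, 8.0))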
  • the audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or aurally notifying information to a passenger or the outside of a vehicle.
  • an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices.
  • the display unit 7720 may include, for example, at least one of an on-board display and a head-up display.
  • the display portion 7720 may have an AR (Augmented Reality) display function.
  • the output device may be another device such as a headphone, a wearable device such as a glasses-type display worn by a passenger, a projector, or a lamp other than these devices.
• The display device visually displays results obtained by the various processes performed by the microcomputer 7610 or information received from other control units in various formats such as text, images, tables, and graphs.
  • the audio output device converts an audio signal composed of reproduced audio data or audio data into an analog signal and outputs it in an auditory manner.
  • At least two control units connected via the communication network 7010 may be integrated as one control unit.
  • each control unit may be configured by a plurality of control units.
  • the vehicle control system 7000 may comprise another control unit not shown.
  • part or all of the functions of any control unit may be provided to another control unit. That is, as long as transmission and reception of information are performed via the communication network 7010, predetermined arithmetic processing may be performed by any control unit.
• In addition, a sensor or device connected to one of the control units may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
  • the camera system according to the present embodiment described with reference to FIGS. 1 to 27 can be applied to the imaging unit 7410 or the external information detection unit 7420 of FIG.
• By applying the present technology to the imaging unit 7410 or the outside-vehicle information detection unit 7420, detection and distance measurement of surrounding preceding vehicles, pedestrians, obstacles, and the like can be performed accurately.
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be applied to an operating room system.
  • FIG. 31 is a diagram schematically showing an overall configuration of an operating room system 5100 to which the technology according to the present disclosure can be applied.
  • the operating room system 5100 is configured such that devices installed in the operating room are connected to be able to cooperate with each other via an audiovisual controller (AV controller) 5107 and an operating room controller 5109.
  • FIG. 31 various devices may be installed in the operating room.
• In FIG. 31, as an example, a device group 5101 of various devices for endoscopic surgery, a ceiling camera 5187 provided on the ceiling of the operating room for imaging the area around the operator's hands, a surgical field camera 5189 provided on the ceiling of the operating room for imaging the state of the entire operating room, a plurality of display devices 5103A to 5103D, a recorder 5105, a patient bed 5183, and an illumination 5191 are shown.
  • a device group 5101 belongs to an endoscopic surgery system 5113 described later, and includes an endoscope, a display device that displays an image captured by the endoscope, and the like.
  • Each device belonging to the endoscopic surgery system 5113 is also referred to as a medical device.
  • the display devices 5103A to 5103D, the recorder 5105, the patient bed 5183 and the illumination 5191 are devices provided, for example, in the operating room separately from the endoscopic surgery system 5113.
  • Each device which does not belong to the endoscopic surgery system 5113 is also referred to as a non-medical device.
  • the audiovisual controller 5107 and / or the operating room controller 5109 cooperate with each other to control the operation of the medical device and the non-medical device.
  • the audio-visual controller 5107 centrally controls processing relating to image display in medical devices and non-medical devices.
• The device group 5101, the ceiling camera 5187, and the surgical field camera 5189 may be devices having a function of transmitting information to be displayed during surgery (hereinafter also referred to as display information); such a device is hereinafter also referred to as a transmission source device.
  • the display devices 5103A to 5103D can be devices to which display information is output (hereinafter, also referred to as a device of an output destination).
  • the recorder 5105 may be a device that corresponds to both a source device and an output device.
  • the audiovisual controller 5107 controls the operation of the transmission source device and the output destination device, acquires display information from the transmission source device, transmits the display information to the output destination device, and displays or records the function.
  • the display information is various images captured during the operation, various information related to the operation (for example, physical information of the patient, information on a past examination result, information on the operation method, etc.).
  • information about an image of a surgical site in a patient's body cavity captured by the endoscope may be transmitted from the device group 5101 as display information to the audiovisual controller 5107.
  • information on the image of the operator's hand captured by the ceiling camera 5187 can be transmitted as display information.
  • information on an image indicating the appearance of the entire operating room captured by the surgery site camera 5189 may be transmitted from the surgery site camera 5189 as display information.
• In the case where another device having an imaging function exists in the operating room system 5100, the audiovisual controller 5107 may also acquire information on an image captured by that other device from the other device as display information.
• In the recorder 5105, information about these images captured in the past is recorded by the audiovisual controller 5107.
  • the audiovisual controller 5107 can acquire information on an image captured in the past from the recorder 5105 as display information.
  • the recorder 5105 may also record various types of information regarding surgery in advance.
  • the audiovisual controller 5107 causes the acquired display information (that is, the image taken during the operation and various information related to the operation) to be displayed on at least one of the display devices 5103A to 5103D which are output destination devices.
• In the illustrated example, the display device 5103A is a display device suspended from the ceiling of the operating room, the display device 5103B is a display device installed on the wall of the operating room, the display device 5103C is a display device installed on a desk in the operating room, and the display device 5103D is a mobile device (for example, a tablet PC (Personal Computer)) having a display function.
  • the operating room system 5100 may include devices outside the operating room.
  • the apparatus outside the operating room may be, for example, a server connected to a network built inside or outside a hospital, a PC used by medical staff, a projector installed in a conference room of a hospital, or the like.
  • the audiovisual controller 5107 can also display the display information on the display device of another hospital via a video conference system or the like for telemedicine.
  • the operating room control device 5109 centrally controls processing other than processing related to image display in non-medical devices.
  • the operating room controller 5109 controls the driving of the patient bed 5183, the ceiling camera 5187, the operation room camera 5189, and the illumination 5191.
• The operating room system 5100 is provided with a centralized operation panel 5111, and through the centralized operation panel 5111 the user can give instructions for image display to the audiovisual controller 5107 and instructions on the operation of the non-medical devices to the operating room control device 5109.
  • the centralized operation panel 5111 is configured by providing a touch panel on the display surface of the display device.
  • FIG. 32 is a view showing a display example of the operation screen on the centralized operation panel 5111.
  • FIG. 32 shows, as an example, an operation screen corresponding to a case where two display devices are provided as an output destination device in the operating room system 5100.
  • the operation screen 5193 is provided with a source selection area 5195, a preview area 5197, and a control area 5201.
• In the transmission source selection area 5195, a transmission source device provided in the operating room system 5100 and a thumbnail screen representing the display information of that transmission source device are displayed in association with each other. The user can select the display information to be displayed on the display device from any of the transmission source devices displayed in the transmission source selection area 5195.
• In the preview area 5197, previews of the screens displayed on the two display devices which are the output destination devices are displayed.
  • four images are displayed in PinP on one display device.
  • the four images correspond to the display information transmitted from the transmission source device selected in the transmission source selection area 5195.
  • one is displayed relatively large as a main image, and the remaining three are displayed relatively small as sub-images.
  • the user can replace the main image and the sub-image by appropriately selecting the area in which the four images are displayed.
• A status display area 5199 is provided below the area where the four images are displayed, and the status regarding the surgery (for example, the elapsed time of the surgery, the physical information of the patient, and the like) can be displayed appropriately in this area.
• The control area 5201 includes a transmission source operation area 5203 in which GUI (Graphical User Interface) components for performing operations on the transmission source device are displayed, and an output destination operation area 5205 in which GUI components for performing operations on the output destination device are displayed.
• In the illustrated example, the transmission source operation area 5203 is provided with GUI components for performing various operations (pan, tilt, and zoom) on the camera of a transmission source device having an imaging function. The user can control the operation of the camera in the transmission source device by appropriately selecting these GUI components.
  • the transmission source operation area 5203 may be provided with a GUI component for performing an operation such as reproduction, reproduction stop, rewind, fast forward, etc. of the image.
• In the output destination operation area 5205, GUI components for performing various operations (swap, flip, color adjustment, contrast adjustment, and switching between 2D display and 3D display) on the display of the display device which is the output destination device are provided.
  • the user can operate the display on the display device by appropriately selecting these GUI components.
• The operation screen displayed on the centralized operation panel 5111 is not limited to the illustrated example, and the user may be able to perform operation input via the centralized operation panel 5111 to each device that can be controlled by the audiovisual controller 5107 and the operating room control device 5109 provided in the operating room system 5100.
  • FIG. 33 is a diagram showing an example of a state of surgery to which the operating room system described above is applied.
• A ceiling camera 5187 and a surgical field camera 5189 are provided on the ceiling of the operating room, and can image the area around the hands of the operator (doctor) 5181 who performs treatment on the affected part of the patient 5185 on the patient bed 5183 and the state of the entire operating room.
  • the ceiling camera 5187 and the operation room camera 5189 may be provided with a magnification adjustment function, a focal length adjustment function, an imaging direction adjustment function, and the like.
  • the illumination 5191 is provided on the ceiling of the operating room and illuminates at least the hand of the operator 5181.
  • the illumination 5191 may be capable of appropriately adjusting the irradiation light amount, the wavelength (color) of the irradiation light, the irradiation direction of the light, and the like.
• As shown in FIG. 31, the endoscopic surgery system 5113, the patient bed 5183, the ceiling camera 5187, the surgical field camera 5189, and the illumination 5191 are connected to each other via the audiovisual controller 5107 and the operating room control device 5109 (not shown in FIG. 33) so that they can cooperate with each other.
  • a centralized operation panel 5111 is provided in the operating room, and as described above, the user can appropriately operate these devices present in the operating room via the centralized operation panel 5111.
• The endoscopic surgery system 5113 includes an endoscope 5115, other surgical tools 5131, a support arm device 5141 that supports the endoscope 5115, and a cart 5151 on which various devices for endoscopic surgery are mounted.
• In endoscopic surgery, trocars 5139a to 5139d are punctured into the abdominal wall. Then, the lens barrel 5117 of the endoscope 5115 and the other surgical tools 5131 are inserted into the body cavity of the patient 5185 through the trocars 5139a to 5139d.
  • an insufflation tube 5133, an energy treatment instrument 5135, and a forceps 5137 are inserted into the body cavity of the patient 5185 as other surgical instruments 5131.
  • the energy treatment tool 5135 is a treatment tool that performs incision and peeling of tissue, sealing of a blood vessel, and the like by high-frequency current or ultrasonic vibration.
  • the illustrated surgical tool 5131 is merely an example, and various surgical tools generally used in endoscopic surgery, such as forceps and retractors, may be used as the surgical tool 5131, for example.
  • An image of the operation site in the body cavity of the patient 5185 taken by the endoscope 5115 is displayed on the display device 5155.
  • the operator 5181 performs a treatment such as excision of the affected area using the energy treatment tool 5135 and the forceps 5137 while viewing the image of the operative part displayed on the display device 5155 in real time.
  • the insufflation tube 5133, the energy treatment tool 5135 and the forceps 5137 are supported by the operator 5181 or an assistant during the operation.
  • the support arm device 5141 includes an arm 5145 extending from the base 5143.
  • the arm 5145 includes joints 5147a, 5147b, 5147c, and links 5149a, 5149b, and is driven by control from the arm controller 5159.
  • the endoscope 5115 is supported by the arm 5145, and its position and posture are controlled. In this way, stable position fixation of the endoscope 5115 can be realized.
  • the endoscope 5115 includes a lens barrel 5117 whose region of a predetermined length from the tip is inserted into the body cavity of the patient 5185, and a camera head 5119 connected to the proximal end of the lens barrel 5117.
• In the illustrated example, the endoscope 5115 is configured as a so-called rigid endoscope having a rigid lens barrel 5117, but the endoscope 5115 may also be configured as a so-called flexible endoscope having a flexible lens barrel 5117.
• A light source device 5157 is connected to the endoscope 5115, and light generated by the light source device 5157 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 5117 and is emitted toward the observation target in the body cavity of the patient 5185 through the objective lens.
• The endoscope 5115 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an imaging device are provided inside the camera head 5119, and reflected light (observation light) from the observation target is condensed on the imaging device by the optical system.
  • the observation light is photoelectrically converted by the imaging element to generate an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image.
  • the image signal is transmitted to a camera control unit (CCU: Camera Control Unit) 5153 as RAW data.
  • the camera head 5119 has a function of adjusting the magnification and the focal length by appropriately driving the optical system.
  • a plurality of imaging devices may be provided in the camera head 5119 in order to cope with, for example, stereoscopic vision (3D display).
  • a plurality of relay optical systems are provided inside the lens barrel 5117 in order to guide observation light to each of the plurality of imaging elements.
  • the CCU 5153 is constituted by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and controls the operation of the endoscope 5115 and the display device 5155 in a centralized manner. Specifically, the CCU 5153 subjects the image signal received from the camera head 5119 to various types of image processing, such as development processing (demosaicing processing), for displaying an image based on the image signal. The CCU 5153 provides the display device 5155 with the image signal subjected to the image processing. Further, an audiovisual controller 5107 shown in FIG. 31 is connected to the CCU 5153. The CCU 5153 also provides the audiovisual controller 5107 with the image signal subjected to the image processing.
  • the CCU 5153 transmits a control signal to the camera head 5119 to control the driving thereof.
  • the control signal may include information on imaging conditions such as magnification and focal length.
  • the information related to the imaging condition may be input through the input device 5161 or may be input through the above-described centralized operation panel 5111.
  • the display device 5155 displays an image based on the image signal subjected to the image processing by the CCU 5153 under the control of the CCU 5153.
• In the case where the endoscope 5115 supports high-resolution imaging such as 4K (3840 horizontal pixels × 2160 vertical pixels) or 8K (7680 horizontal pixels × 4320 vertical pixels) and / or 3D display, a display device capable of high-resolution display and / or 3D display may be used as the display device 5155 accordingly. In the case of high-resolution imaging such as 4K or 8K, using a display device 5155 having a size of 55 inches or more provides a further sense of immersion.
  • a plurality of display devices 5155 having different resolutions and sizes may be provided depending on the application.
  • the light source device 5157 is configured of a light source such as an LED (light emitting diode), for example, and supplies illumination light at the time of imaging the surgical site to the endoscope 5115.
  • the arm control device 5159 is constituted by a processor such as a CPU, for example, and operates in accordance with a predetermined program to control the driving of the arm 5145 of the support arm device 5141 according to a predetermined control method.
  • the input device 5161 is an input interface to the endoscopic surgery system 5113.
  • the user can input various information and input instructions to the endoscopic surgery system 5113 through the input device 5161.
  • the user inputs, via the input device 5161, various types of information related to surgery, such as physical information of a patient and information on a surgery procedure.
• For example, the user inputs, via the input device 5161, an instruction to drive the arm 5145, an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 5115, an instruction to drive the energy treatment tool 5135, and the like.
  • the type of the input device 5161 is not limited, and the input device 5161 may be various known input devices.
  • a mouse, a keyboard, a touch panel, a switch, a foot switch 5171, and / or a lever may be applied as the input device 5161.
  • the touch panel may be provided on the display surface of the display device 5155.
• Alternatively, the input device 5161 may be a device worn by the user, such as a glasses-type wearable device or an HMD (Head Mounted Display), in which case various inputs are performed in accordance with the user's gesture or line of sight detected by these devices. The input device 5161 may also include a camera capable of detecting the motion of the user, and various inputs may be performed in accordance with the user's gesture and line of sight detected from the image captured by the camera. Furthermore, the input device 5161 may include a microphone capable of picking up the user's voice, and various inputs may be performed by voice via the microphone.
• As described above, since the input device 5161 is configured so that various kinds of information can be input in a non-contact manner, the user (for example, the operator 5181) can operate the devices without touching them.
• In addition, since the user can operate a device without releasing his / her hand from the surgical tool being held, convenience for the user is improved.
  • the treatment instrument control device 5163 controls the drive of the energy treatment instrument 5135 for ablation of tissue, incision, sealing of a blood vessel or the like.
• The insufflation apparatus 5165 sends gas into the body cavity of the patient 5185 via the insufflation tube 5133 in order to inflate the body cavity for the purpose of securing the field of view of the endoscope 5115 and securing a working space for the operator.
• The recorder 5167 is a device capable of recording various types of information regarding the surgery.
  • the printer 5169 is a device capable of printing various types of information related to surgery in various types such as text, images, and graphs.
  • the support arm device 5141 includes a base 5143 which is a base and an arm 5145 extending from the base 5143.
  • the arm 5145 includes a plurality of joints 5147a, 5147b, and 5147c, and a plurality of links 5149a and 5149b connected by the joints 5147b.
• In the figure, the structure of the arm 5145 is shown in a simplified manner. In practice, the shape, number, and arrangement of the joints 5147a to 5147c and the links 5149a and 5149b, the directions of the rotation axes of the joints 5147a to 5147c, and the like can be set appropriately so that the arm 5145 has the desired degrees of freedom.
  • the arm 5145 may be preferably configured to have six or more degrees of freedom.
• This allows the endoscope 5115 to be moved freely within the movable range of the arm 5145, so that the lens barrel 5117 of the endoscope 5115 can be inserted into the body cavity of the patient 5185 from a desired direction.
  • the joints 5147a to 5147c are provided with an actuator, and the joints 5147a to 5147c are configured to be rotatable around a predetermined rotation axis by driving the actuators.
  • the driving of the actuator is controlled by the arm control device 5159 to control the rotation angles of the joint portions 5147a to 5147c, and the driving of the arm portion 5145 is controlled. Thereby, control of the position and posture of the endoscope 5115 can be realized.
  • the arm control device 5159 can control the driving of the arm unit 5145 by various known control methods such as force control or position control.
• When an operation input is made, the driving of the arm 5145 is appropriately controlled by the arm control device 5159 in accordance with the operation input, and the position and posture of the endoscope 5115 may be controlled. With this control, after the endoscope 5115 at the tip of the arm 5145 is moved from an arbitrary position to another arbitrary position, it can be fixedly supported at the position after the movement.
  • the arm 5145 may be operated by a so-called master slave method. In this case, the arm 5145 can be remotely controlled by the user via the input device 5161 installed at a location distant from the operating room.
• The arm control device 5159 may perform so-called power assist control, in which it receives an external force from the user and drives the actuators of the joints 5147a to 5147c so that the arm 5145 moves smoothly following the external force.
• Thereby, the arm 5145 can be moved with a relatively light force, so that the endoscope 5115 can be moved more intuitively and with a simpler operation, and convenience for the user can be improved.
• In general, in endoscopic surgery, the endoscope 5115 is supported by a doctor called a scopist.
• In contrast, by using the support arm device 5141, the position of the endoscope 5115 can be fixed more reliably without manual operation, so that an image of the operative site can be obtained stably and the surgery can be performed smoothly.
  • the arm control device 5159 may not necessarily be provided in the cart 5151. Also, the arm control device 5159 may not necessarily be one device. For example, the arm control device 5159 may be provided at each joint 5147 a to 5147 c of the arm 5145 of the support arm device 5141, and the arm control devices 5159 cooperate with one another to drive the arm 5145. Control may be realized.
  • the light source device 5157 supplies the endoscope 5115 with illumination light for imaging the operative part.
  • the light source device 5157 is configured of, for example, a white light source configured by an LED, a laser light source, or a combination thereof.
• In the case where a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color can be controlled with high accuracy, so that adjustment of the captured image (for example, its white balance) can be performed in the light source device 5157.
• Further, the observation target may be irradiated with the laser light from each of the RGB laser light sources in time division, and the driving of the imaging element of the camera head 5119 may be controlled in synchronization with the irradiation timing, so that images corresponding to R, G, and B are captured in time division. According to this method, a color image can be obtained without providing a color filter in the imaging element.
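• A minimal sketch of this idea (the function name is an assumption, and the actual synchronization control is not shown) is to stack the three monochrome frames captured while the R, G, and B lasers are switched in turn:

    # Illustrative only: build a color image from three sequentially captured monochrome frames.
    import numpy as np

    def merge_time_division(frame_r, frame_g, frame_b):
        """Stack time-division R, G, B frames into one RGB image (no color filter needed)."""
        return np.stack([frame_r, frame_g, frame_b], axis=-1)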
  • the drive of the light source device 5157 may be controlled to change the intensity of the light to be output at predetermined time intervals.
• By controlling the driving of the imaging element of the camera head 5119 in synchronization with the timing of the change in light intensity to acquire images in time division and combining these images, an image of high dynamic range without so-called blocked-up shadows and blown-out highlights can be generated.
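• The following Python sketch is one hedged illustration of such a high-dynamic-range merge (the weighting and normalization are assumptions, not the method of the disclosure): frames captured at different illumination intensities are normalized by their relative intensity and averaged with weights that discount blown-out or crushed pixels.

    # Illustrative HDR merge of time-division frames captured at different light intensities.
    import numpy as np

    def merge_hdr(frames, intensities):
        """frames: float arrays scaled to [0, 1]; intensities: relative illumination levels."""
        acc = np.zeros_like(frames[0], dtype=float)
        wsum = np.zeros_like(frames[0], dtype=float)
        for frame, level in zip(frames, intensities):
            weight = 1.0 - np.abs(frame - 0.5) * 2.0   # trust mid-tones, not black/white-out
            acc += weight * (frame / level)            # radiance estimate normalized by intensity
            wsum += weight
        return acc / np.maximum(wsum, 1e-6)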
  • the light source device 5157 may be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation.
• In special light observation, for example, so-called narrow band imaging is performed, in which a predetermined tissue such as a blood vessel in the surface layer of the mucous membrane is imaged with high contrast by irradiating light in a narrower band than the irradiation light (that is, white light) used in normal observation, utilizing the wavelength dependency of light absorption in body tissue.
  • fluorescence observation may be performed in which an image is obtained by fluorescence generated by irradiation with excitation light.
• In fluorescence observation, the body tissue may be irradiated with excitation light to observe fluorescence from the body tissue (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the body tissue may be irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 5157 can be configured to be able to supply narrow band light and / or excitation light corresponding to such special light observation.
  • FIG. 34 is a block diagram showing an example of a functional configuration of the camera head 5119 and the CCU 5153 shown in FIG.
  • the camera head 5119 has a lens unit 5121, an imaging unit 5123, a drive unit 5125, a communication unit 5127, and a camera head control unit 5129 as its functions.
  • the CCU 5153 also includes a communication unit 5173, an image processing unit 5175, and a control unit 5177 as its functions.
  • the camera head 5119 and the CCU 5153 are communicably connected in both directions by a transmission cable 5179.
  • the lens unit 5121 is an optical system provided at the connection with the lens barrel 5117.
  • the observation light taken in from the tip of the lens barrel 5117 is guided to the camera head 5119 and is incident on the lens unit 5121.
  • the lens unit 5121 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the optical characteristic of the lens unit 5121 is adjusted so as to condense the observation light on the light receiving surface of the imaging element of the imaging unit 5123.
  • the zoom lens and the focus lens are configured such that the position on the optical axis can be moved in order to adjust the magnification and the focus of the captured image.
  • the imaging unit 5123 is configured by an imaging element, and is disposed downstream of the lens unit 5121.
  • the observation light which has passed through the lens unit 5121 is condensed on the light receiving surface of the imaging device, and an image signal corresponding to the observation image is generated by photoelectric conversion.
  • the image signal generated by the imaging unit 5123 is provided to the communication unit 5127.
  • As the imaging element constituting the imaging unit 5123, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor capable of color imaging is used.
  • As the imaging element, one capable of capturing a high-resolution image of 4K or more may be used, for example.
  • the imaging device constituting the imaging unit 5123 is configured to have a pair of imaging devices for acquiring image signals for the right eye and for the left eye corresponding to 3D display.
  • the 3D display enables the operator 5181 to more accurately grasp the depth of the living tissue in the operation site.
  • When the imaging unit 5123 is configured as a multi-plate type, a plurality of lens units 5121 are also provided corresponding to the respective imaging elements.
  • the imaging unit 5123 may not necessarily be provided in the camera head 5119.
  • the imaging unit 5123 may be provided inside the lens barrel 5117 immediately after the objective lens.
  • the drive unit 5125 is constituted by an actuator, and moves the zoom lens and the focus lens of the lens unit 5121 by a predetermined distance along the optical axis under the control of the camera head control unit 5129. Thereby, the magnification and the focus of the captured image by the imaging unit 5123 may be appropriately adjusted.
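To illustrate how moving the focus lens adjusts focus, here is a generic contrast-maximization sweep; it is not the method of the disclosure, and `capture_frame`, `move_focus_lens`, and the candidate positions are hypothetical interfaces supplied by the caller.

```python
import numpy as np

def sharpness(img: np.ndarray) -> float:
    """Variance of the horizontal gradient as a simple contrast/focus metric."""
    return float(np.var(np.diff(img.astype(np.float64), axis=1)))

def focus_sweep(capture_frame, move_focus_lens, positions):
    """Move the focus lens through candidate positions and keep the sharpest one."""
    best_pos, best_score = None, -1.0
    for pos in positions:
        move_focus_lens(pos)   # drive unit moves the lens along the optical axis
        score = sharpness(capture_frame())
        if score > best_score:
            best_pos, best_score = pos, score
    move_focus_lens(best_pos)
    return best_pos
```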
  • the communication unit 5127 is configured of a communication device for transmitting and receiving various types of information to and from the CCU 5153.
  • the communication unit 5127 transmits the image signal obtained from the imaging unit 5123 to the CCU 5153 via the transmission cable 5179 as RAW data.
  • It is preferable that the image signal be transmitted by optical communication in order to display the captured image of the surgical site with low latency.
  • This is because the operator 5181 performs surgery while observing the condition of the affected area through the captured image, and for safer and more reliable surgery, the moving image of the operative site is required to be displayed in real time as much as possible.
  • the communication unit 5127 is provided with a photoelectric conversion module which converts an electrical signal into an optical signal.
  • the image signal is converted into an optical signal by the photoelectric conversion module, and then transmitted to the CCU 5153 via the transmission cable 5179.
  • the communication unit 5127 also receives, from the CCU 5153, a control signal for controlling the drive of the camera head 5119.
  • The control signal includes, for example, information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image, and other information related to imaging conditions.
  • the communication unit 5127 provides the received control signal to the camera head control unit 5129.
  • the control signal from the CCU 5153 may also be transmitted by optical communication.
  • the communication unit 5127 is provided with a photoelectric conversion module that converts an optical signal into an electric signal, and the control signal is converted into an electric signal by the photoelectric conversion module and is then provided to the camera head control unit 5129.
  • Note that imaging conditions such as the frame rate, the exposure value, the magnification, and the focus described above are automatically set by the control unit 5177 of the CCU 5153 based on the acquired image signal. That is, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are installed in the endoscope 5115.
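How detection processing could feed AE and AWB is sketched below: mean luminance drives an exposure correction, and gray-world statistics drive per-channel white-balance gains. The target luminance, gain limits, and the assumption that the input is a float RGB image normalized to [0, 1] are illustrative, not taken from the disclosure.

```python
import numpy as np

def auto_exposure_gain(rgb: np.ndarray, target_luma: float = 0.45) -> float:
    """Return a multiplicative exposure correction from mean luminance (AE)."""
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    mean = max(float(luma.mean()), 1e-6)
    return float(np.clip(target_luma / mean, 0.25, 4.0))

def gray_world_awb_gains(rgb: np.ndarray) -> tuple:
    """Return per-channel gains that equalize channel means (AWB), relative to green."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    g = float(means[1])
    return tuple(float(g / max(m, 1e-6)) for m in means)
```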
  • The camera head control unit 5129 controls the drive of the camera head 5119 based on the control signal from the CCU 5153 received via the communication unit 5127. For example, the camera head control unit 5129 controls the drive of the imaging element of the imaging unit 5123 based on the information specifying the frame rate of the captured image and/or the information specifying the exposure at the time of imaging. In addition, for example, the camera head control unit 5129 appropriately moves the zoom lens and the focus lens of the lens unit 5121 via the drive unit 5125 based on the information specifying the magnification and the focus of the captured image.
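A hedged sketch of how the camera head side might parse and apply such a control signal; the field names and the actuator/sensor setters (`set_frame_rate`, `set_exposure`, `move_zoom`, `move_focus`) are hypothetical placeholders, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class ControlSignal:
    frame_rate: Optional[float] = None   # fps
    exposure: Optional[float] = None     # exposure value
    zoom: Optional[float] = None         # magnification
    focus: Optional[float] = None        # focus position

def apply_control_signal(sig: ControlSignal,
                         set_frame_rate: Callable[[float], None],
                         set_exposure: Callable[[float], None],
                         move_zoom: Callable[[float], None],
                         move_focus: Callable[[float], None]) -> None:
    """Apply only the fields that the CCU actually specified."""
    if sig.frame_rate is not None:
        set_frame_rate(sig.frame_rate)   # imaging element drive
    if sig.exposure is not None:
        set_exposure(sig.exposure)
    if sig.zoom is not None:
        move_zoom(sig.zoom)              # via the drive unit / lens unit
    if sig.focus is not None:
        move_focus(sig.focus)
```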
  • the camera head control unit 5129 may further have a function of storing information for identifying the lens barrel 5117 and the camera head 5119.
  • Note that the camera head 5119 can be made resistant to autoclave sterilization processing.
  • the communication unit 5173 is configured of a communication device for transmitting and receiving various information to and from the camera head 5119.
  • the communication unit 5173 receives an image signal transmitted from the camera head 5119 via the transmission cable 5179.
  • the image signal can be suitably transmitted by optical communication.
  • the communication unit 5173 is provided with a photoelectric conversion module which converts an optical signal into an electrical signal.
  • the communication unit 5173 provides the image processing unit 5175 with the image signal converted into the electrical signal.
  • the communication unit 5173 transmits, to the camera head 5119, a control signal for controlling the drive of the camera head 5119.
  • the control signal may also be transmitted by optical communication.
  • the image processing unit 5175 performs various types of image processing on an image signal that is RAW data transmitted from the camera head 5119.
  • The image processing includes, for example, development processing, image quality enhancement processing (band emphasis processing, super-resolution processing, NR (noise reduction) processing, camera shake correction processing, and the like), and/or enlargement processing (electronic zoom processing), and other various known signal processing.
  • the image processing unit 5175 also performs detection processing on the image signal to perform AE, AF, and AWB.
  • the image processing unit 5175 is configured by a processor such as a CPU or a GPU, and the image processing and the detection processing described above can be performed by the processor operating according to a predetermined program.
  • When the image processing unit 5175 is configured by a plurality of GPUs, the image processing unit 5175 appropriately divides the information related to the image signal and performs image processing in parallel using the plurality of GPUs.
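One way to picture this parallel split, as a sketch under stated assumptions: the frame is divided into horizontal strips (with a small overlap so filters behave at the seams) and each strip is processed independently, as it could be on separate GPUs. The example uses CPU processes and a trivial 1-2-1 smoothing purely for illustration; it is not the image processing of the disclosure.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def denoise_strip(strip: np.ndarray) -> np.ndarray:
    """Stand-in per-strip processing (a simple 1-2-1 vertical smoothing)."""
    s = strip.astype(np.float64)
    out = s.copy()
    out[1:-1] = (s[:-2] + 2.0 * s[1:-1] + s[2:]) / 4.0
    return out

def process_in_parallel(img: np.ndarray, n_parts: int = 4, overlap: int = 2) -> np.ndarray:
    """Split the image into overlapping horizontal strips and process them in parallel.

    When run as a script on spawn-based platforms, call this from under an
    `if __name__ == "__main__":` guard.
    """
    h = img.shape[0]
    bounds = [(max(0, i * h // n_parts - overlap),
               min(h, (i + 1) * h // n_parts + overlap)) for i in range(n_parts)]
    with ProcessPoolExecutor(max_workers=n_parts) as pool:
        strips = list(pool.map(denoise_strip, [img[a:b] for a, b in bounds]))
    out = np.empty(img.shape, dtype=np.float64)
    for (a, b), strip in zip(bounds, strips):
        core_a = a + (overlap if a > 0 else 0)   # drop the overlap margins
        core_b = b - (overlap if b < h else 0)
        out[core_a:core_b] = strip[core_a - a:core_b - a]
    return out
```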
  • The control unit 5177 performs various types of control regarding imaging of the operative site by the endoscope 5115 and display of the captured image. For example, the control unit 5177 generates a control signal for controlling the drive of the camera head 5119. At this time, when imaging conditions are input by the user, the control unit 5177 generates the control signal based on the input by the user. Alternatively, when the endoscope 5115 is equipped with the AE function, the AF function, and the AWB function, the control unit 5177 appropriately calculates the optimum exposure value, focal length, and white balance according to the result of the detection processing by the image processing unit 5175, and generates the control signal.
  • control unit 5177 causes the display device 5155 to display an image of the operative site based on the image signal subjected to the image processing by the image processing unit 5175.
  • Further, the control unit 5177 recognizes various objects in the operative site image using various image recognition techniques. For example, by detecting the shape, color, and the like of the edges of objects included in the operative site image, the control unit 5177 can recognize surgical tools such as forceps, a specific living body part, bleeding, mist during use of the energy treatment tool 5135, and the like.
  • The control unit 5177 uses the recognition result to superimpose various types of operation support information on the image of the operative site. The operation support information is superimposed and presented to the operator 5181, which makes it possible to proceed with the operation more safely and reliably.
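A toy illustration of the edge/color-based recognition and overlay idea, using only NumPy: a gradient-magnitude mask approximates "edges of objects", a red-dominance test approximates "bleeding-colored regions", and their union is painted onto a copy of the frame as overlay information. The thresholds and the highlight color are arbitrary illustrative values, not the recognition techniques of the disclosure.

```python
import numpy as np

def edge_mask(gray: np.ndarray, thresh: float = 20.0) -> np.ndarray:
    """Boolean mask of strong gradients (crude edge detection)."""
    g = gray.astype(np.float64)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    gx[:, 1:-1] = g[:, 2:] - g[:, :-2]
    gy[1:-1, :] = g[2:, :] - g[:-2, :]
    return np.hypot(gx, gy) > thresh

def red_dominant_mask(rgb: np.ndarray, margin: float = 40.0) -> np.ndarray:
    """Boolean mask where red strongly dominates green and blue (e.g. bleeding)."""
    r, g, b = (rgb[..., i].astype(np.float64) for i in range(3))
    return (r > g + margin) & (r > b + margin)

def overlay_support_info(rgb: np.ndarray) -> np.ndarray:
    """Paint detected regions in a highlight color on a copy of the image (uint8 RGB)."""
    gray = rgb.mean(axis=-1)
    mask = edge_mask(gray) | red_dominant_mask(rgb)
    out = rgb.copy()
    out[mask] = (0, 255, 255)   # cyan highlight as "operation support information"
    return out
```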
  • a transmission cable 5179 connecting the camera head 5119 and the CCU 5153 is an electric signal cable corresponding to communication of an electric signal, an optical fiber corresponding to optical communication, or a composite cable of these.
  • Here, communication is performed by wired communication using the transmission cable 5179, but communication between the camera head 5119 and the CCU 5153 may be performed wirelessly.
  • When the communication between the two is performed wirelessly, it is not necessary to lay the transmission cable 5179 in the operating room, so a situation in which the movement of the medical staff in the operating room is hindered by the transmission cable 5179 can be eliminated.
  • An example of the operating room system 5100 to which the technology according to the present disclosure can be applied has been described above.
  • Note that, here, the case where the medical system to which the operating room system 5100 is applied is the endoscopic surgery system 5113 has been described as an example, but the configuration of the operating room system 5100 is not limited to such an example.
  • the operating room system 5100 may be applied to a flexible endoscopic system for examination or a microsurgery system instead of the endoscopic surgery system 5113.
  • The camera system according to the present embodiment described with reference to FIGS. 1 to 27 can be suitably applied to the ceiling camera 5187, the operative site camera 5189, and the camera head 5119 of the endoscope 5115 among the configurations described above.
  • By applying the technology according to the present disclosure to the ceiling camera 5187, the operative site camera 5189, and the camera head 5119 of the endoscope 5115, hemoglobin in blood can be observed accurately, and the depth of internal organs and the like can be measured accurately.
  • In this specification, a system means a set of a plurality of components (apparatuses, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • An imaging device including: an imaging unit that images a subject; and an image generation unit that generates an image of the subject based on a pixel signal obtained by performing imaging in a state in which light of a predetermined pattern from a structured light source is irradiated on a projection area of a specific pixel of the imaging unit.
  • The imaging device according to (1) or (2), wherein the image generation unit generates an image of the subject based on a pixel signal obtained by performing imaging in a state where IR light from the structured light source is irradiated on the projection area of an IR pixel.
  • The imaging device according to (3), wherein the image generation unit generates an IR image based on a signal from the IR pixel irradiated with the IR light from the structured light source, and generates a visible image based on a signal from a pixel not irradiated with the IR light.
  • The imaging device according to (1) or (2), wherein the image generation unit generates an image of the subject based on a pixel signal obtained by performing imaging in a state where light of a predetermined pattern from each of a plurality of structured light sources having different wavelength bands is irradiated on a projection area of a pixel corresponding to each of the plurality of structured light sources.
  • The imaging device according to (1) or (2), wherein the image generation unit generates an image of the subject based on a pixel signal obtained by performing imaging in a state where IR light from the structured light source is irradiated on the projection area of a TOF pixel.
  • The imaging device according to (6), wherein the image generation unit calculates a distance for AF control based on a signal from the TOF pixel irradiated with the IR light from the structured light source, and generates a visible image based on a signal from a pixel not irradiated with the IR light.
  • The imaging device according to (1) or (2), wherein the image generation unit generates an image of the subject based on a pixel signal obtained by performing imaging in a state where IR light from the structured light source is irradiated on the projection area of a triangulation pixel.
  • The imaging device according to (8), wherein the image generation unit calculates a distance for AF control based on a signal from the triangulation pixel irradiated with the IR light from the structured light source, and generates a visible image based on a signal from a pixel not irradiated with the IR light.
  • The imaging device according to any one of (1) to (10), further including a light irradiation unit serving as the structured light source.
  • (12) The imaging device according to (11), further including a mirror that reflects the light irradiated from the light irradiation unit and transmits the light reflected by the projection area of the pixel, in order to make the irradiation area boundary of the light irradiation unit substantially coincide with the angle of view of the imaging unit.
  • (13) The imaging device according to (11), wherein the light irradiation unit includes a diffraction grating on a front surface.
  • The imaging device wherein the light emitting unit is configured integrally with the imaging device.
  • The imaging device according to any one of (11) to (13), wherein the light emitting unit is mounted exchangeably on the imaging device.
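To make the IR/visible separation described in the configurations above concrete, here is a hedged sketch, not the claimed implementation: given a boolean map of which pixels' projection areas receive the structured IR light, the IR image is taken from those pixels and the visible image from the remaining pixels, with a single-pass neighbor average filling the holes left in each image. The hole-filling method and the assumption of a sparse irradiation pattern are illustrative choices.

```python
import numpy as np

def fill_holes_with_neighbor_mean(img: np.ndarray, valid: np.ndarray) -> np.ndarray:
    """Replace invalid pixels by the mean of their valid 4-neighbors (one pass).

    One pass is enough only when the invalid pixels are sparse and isolated.
    """
    out = img.astype(np.float64).copy()
    acc = np.zeros_like(out)
    cnt = np.zeros_like(out)
    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        shifted = np.roll(out, (dy, dx), axis=(0, 1))
        v = np.roll(valid, (dy, dx), axis=(0, 1))
        acc += np.where(v, shifted, 0.0)
        cnt += v
    hole = ~valid & (cnt > 0)
    out[hole] = (acc / np.maximum(cnt, 1))[hole]
    return out

def separate_ir_and_visible(raw: np.ndarray, ir_irradiated: np.ndarray):
    """Split one captured frame into an IR image and a visible image.

    raw           : H x W sensor frame
    ir_irradiated : H x W boolean map of pixels whose projection areas receive
                    the structured IR light
    """
    ir_image = fill_holes_with_neighbor_mean(np.where(ir_irradiated, raw, 0), ir_irradiated)
    visible_image = fill_holes_with_neighbor_mean(np.where(~ir_irradiated, raw, 0), ~ir_irradiated)
    return ir_image, visible_image
```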

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The invention relates to an imaging device that can efficiently separate signals having different characteristics. An image generation unit generates an image of a subject based on a pixel signal obtained by performing imaging in a state in which a projection area of a specific pixel of an imaging unit that images the subject is irradiated with light of a prescribed pattern from a structured light source. The present invention can be applied, for example, to a camera system including a structured light source and an imaging device.
PCT/JP2018/038042 2017-10-27 2018-10-12 Dispositif d'imagerie WO2019082686A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/756,171 US11372200B2 (en) 2017-10-27 2018-10-12 Imaging device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2017208111 2017-10-27
JP2017-208111 2017-10-27
JP2018-008193 2018-01-22
JP2018008193A JP2019083501A (ja) 2017-10-27 2018-01-22 撮像装置

Publications (1)

Publication Number Publication Date
WO2019082686A1 true WO2019082686A1 (fr) 2019-05-02

Family

ID=66246327

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/038042 WO2019082686A1 (fr) 2017-10-27 2018-10-12 Dispositif d'imagerie

Country Status (1)

Country Link
WO (1) WO2019082686A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012117508A1 (fr) * 2011-02-28 2012-09-07 株式会社Pfu Dispositif, procédé et programme de traitement d'informations
WO2016157593A1 (fr) * 2015-03-27 2016-10-06 富士フイルム株式会社 Appareil d'acquisition d'image de distance et procédé d'acquisition d'image de distance
WO2016199518A1 (fr) * 2015-06-09 2016-12-15 富士フイルム株式会社 Dispositif d'acquisition d'image de distance et procédé d'acquisition d'image de distance
JP2017187471A (ja) * 2016-03-31 2017-10-12 パナソニックIpマネジメント株式会社 撮像装置

Similar Documents

Publication Publication Date Title
US11372200B2 (en) Imaging device
WO2018135315A1 (fr) Dispositif de capture d'images, procédé de traitement d'images et système de traitement d'images
CN111108436B (zh) 镜筒和成像装置
JP7229988B2 (ja) 測距システム、及び、受光モジュール、並びに、バンドパスフィルタの製造方法
JP7248037B2 (ja) 画像処理装置、画像処理方法、およびプログラム
US11119633B2 (en) Information processing device and method
WO2018003245A1 (fr) Dispositif de traitement de signaux, dispositif d'imagerie et procédé de traitement de signaux
WO2018016344A1 (fr) Dispositif de capture d'image à semi-conducteurs et instrument électronique
JP7131554B2 (ja) 画像処理装置、画像処理方法、およびプログラム
WO2018037680A1 (fr) Dispositif d'imagerie, système d'imagerie, et procédé de traitement de signal
US11953376B2 (en) Imaging apparatus, signal processing apparatus, signal processing method, and program
WO2018100992A1 (fr) Système optique d'imagerie, module de caméra et appareil électronique
US11482159B2 (en) Display control device, display control method, and display control program
JP7081609B2 (ja) レンズ装置及び撮像装置
WO2019082686A1 (fr) Dispositif d'imagerie
US11470295B2 (en) Signal processing device, signal processing method, and imaging device
JP7405132B2 (ja) レンズ鏡筒及び撮像装置
WO2022124166A1 (fr) Élément d'affichage à cristaux liquides, dispositif d'affichage, dispositif électronique, substrat d'entraînement et procédé de fabrication de substrat d'entraînement
US20240118571A1 (en) Liquid crystal display element, display device, electronic device, drive substrate, and method for manufacturing drive substrate

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18869802; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18869802; Country of ref document: EP; Kind code of ref document: A1)