WO2022196210A1 - Imaging device, imaging method, and electronic apparatus - Google Patents
Imaging device, imaging method, and electronic apparatus
- Publication number
- WO2022196210A1 (PCT/JP2022/005661)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- filter
- pixel array
- region
- imaging device
- image
- Prior art date
Links
- 238000003384 imaging method Methods 0.000 title claims abstract description 217
- 238000012545 processing Methods 0.000 claims abstract description 84
- 230000002194 synthesizing effect Effects 0.000 claims description 9
- 239000000758 substrate Substances 0.000 claims description 6
- 239000011159 matrix material Substances 0.000 claims description 4
- 238000004891 communication Methods 0.000 description 37
- 238000010586 diagram Methods 0.000 description 35
- 230000000875 corresponding effect Effects 0.000 description 31
- 238000001514 detection method Methods 0.000 description 28
- 238000005516 engineering process Methods 0.000 description 27
- 210000003128 head Anatomy 0.000 description 26
- 230000003287 optical effect Effects 0.000 description 19
- 230000001276 controlling effect Effects 0.000 description 15
- 230000006870 function Effects 0.000 description 15
- 238000003860 storage Methods 0.000 description 15
- 238000002674 endoscopic surgery Methods 0.000 description 12
- 230000007246 mechanism Effects 0.000 description 10
- 238000000034 method Methods 0.000 description 10
- RZVHIXYEVGDQDX-UHFFFAOYSA-N 9,10-anthraquinone Chemical compound C1=CC=C2C(=O)C3=CC=CC=C3C(=O)C2=C1 RZVHIXYEVGDQDX-UHFFFAOYSA-N 0.000 description 9
- 239000003086 colorant Substances 0.000 description 7
- 238000005259 measurement Methods 0.000 description 7
- 230000005540 biological transmission Effects 0.000 description 6
- 238000009826 distribution Methods 0.000 description 6
- 239000004065 semiconductor Substances 0.000 description 6
- 238000001356 surgical procedure Methods 0.000 description 6
- 230000001133 acceleration Effects 0.000 description 5
- 239000000047 product Substances 0.000 description 5
- 230000008859 change Effects 0.000 description 4
- 238000006243 chemical reaction Methods 0.000 description 4
- 230000005284 excitation Effects 0.000 description 4
- 208000005646 Pneumoperitoneum Diseases 0.000 description 3
- VYPSYNLAJGMNEJ-UHFFFAOYSA-N Silicium dioxide Chemical compound O=[Si]=O VYPSYNLAJGMNEJ-UHFFFAOYSA-N 0.000 description 3
- 238000004458 analytical method Methods 0.000 description 3
- 238000010336 energy treatment Methods 0.000 description 3
- 239000000284 extract Substances 0.000 description 3
- 238000004519 manufacturing process Methods 0.000 description 3
- 230000008569 process Effects 0.000 description 3
- 229920005989 resin Polymers 0.000 description 3
- 239000011347 resin Substances 0.000 description 3
- 229910052581 Si3N4 Inorganic materials 0.000 description 2
- PPBRXRYQALVLMV-UHFFFAOYSA-N Styrene Chemical compound C=CC1=CC=CC=C1 PPBRXRYQALVLMV-UHFFFAOYSA-N 0.000 description 2
- 210000004204 blood vessel Anatomy 0.000 description 2
- 230000003139 buffering effect Effects 0.000 description 2
- 239000003153 chemical reaction reagent Substances 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 238000005401 electroluminescence Methods 0.000 description 2
- MOFVSTNWEDAEEK-UHFFFAOYSA-M indocyanine green Chemical compound [Na+].[O-]S(=O)(=O)CCCCN1C2=CC=C3C=CC=CC3=C2C(C)(C)C1=CC=CC=CC=CC1=[N+](CCCCS([O-])(=O)=O)C2=CC=C(C=CC=C3)C3=C2C1(C)C MOFVSTNWEDAEEK-UHFFFAOYSA-M 0.000 description 2
- 229960004657 indocyanine green Drugs 0.000 description 2
- 238000009434 installation Methods 0.000 description 2
- 239000000463 material Substances 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000002093 peripheral effect Effects 0.000 description 2
- 230000035790 physiological processes and functions Effects 0.000 description 2
- HQVNEWCFYHHQES-UHFFFAOYSA-N silicon nitride Chemical compound N12[Si]34N5[Si]62N3[Si]51N64 HQVNEWCFYHHQES-UHFFFAOYSA-N 0.000 description 2
- 239000013589 supplement Substances 0.000 description 2
- 238000012546 transfer Methods 0.000 description 2
- 239000004925 Acrylic resin Substances 0.000 description 1
- 229920000178 Acrylic resin Polymers 0.000 description 1
- 240000004050 Pentaglottis sempervirens Species 0.000 description 1
- 235000004522 Pentaglottis sempervirens Nutrition 0.000 description 1
- 229910004298 SiO 2 Inorganic materials 0.000 description 1
- 230000003213 activating effect Effects 0.000 description 1
- 230000003321 amplification Effects 0.000 description 1
- 239000011230 binding agent Substances 0.000 description 1
- 230000015572 biosynthetic process Effects 0.000 description 1
- 230000000740 bleeding effect Effects 0.000 description 1
- 230000036772 blood pressure Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 229920006026 co-polymeric resin Polymers 0.000 description 1
- 238000002485 combustion reaction Methods 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 239000002131 composite material Substances 0.000 description 1
- 239000000470 constituent Substances 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 230000002596 correlated effect Effects 0.000 description 1
- 239000002537 cosmetic Substances 0.000 description 1
- 238000013500 data storage Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- KPUWHANPEXNPJT-UHFFFAOYSA-N disiloxane Chemical class [SiH3]O[SiH3] KPUWHANPEXNPJT-UHFFFAOYSA-N 0.000 description 1
- 239000000975 dye Substances 0.000 description 1
- 238000002073 fluorescence micrograph Methods 0.000 description 1
- 230000036541 health Effects 0.000 description 1
- 230000001678 irradiating effect Effects 0.000 description 1
- 230000031700 light absorption Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000012423 maintenance Methods 0.000 description 1
- 238000002406 microsurgery Methods 0.000 description 1
- 239000003595 mist Substances 0.000 description 1
- 230000000116 mitigating effect Effects 0.000 description 1
- 238000003199 nucleic acid amplification method Methods 0.000 description 1
- 239000013307 optical fiber Substances 0.000 description 1
- 230000001151 other effect Effects 0.000 description 1
- 230000000737 periodic effect Effects 0.000 description 1
- 239000000049 pigment Substances 0.000 description 1
- 229920001296 polysiloxane Polymers 0.000 description 1
- 238000007639 printing Methods 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 238000007789 sealing Methods 0.000 description 1
- 230000035939 shock Effects 0.000 description 1
- 229910052710 silicon Inorganic materials 0.000 description 1
- 239000010703 silicon Substances 0.000 description 1
- 229910052814 silicon oxide Inorganic materials 0.000 description 1
- 229920001909 styrene-acrylic polymer Polymers 0.000 description 1
- 239000002344 surface layer Substances 0.000 description 1
- 238000002198 surface plasmon resonance spectroscopy Methods 0.000 description 1
- 238000003786 synthesis reaction Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/20—Filters
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/12—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
Definitions
- The lens 102 may be, for example, a convex lens or a hemispherical asymmetric lens, and is not particularly limited.
- The lens 102 can be made of an inorganic material such as silicon oxide (SiO2) or silicon nitride (SiN), or of a resin material such as styrene resin, acrylic resin, styrene-acrylic copolymer resin, or siloxane resin.
- FIG. 3 is a schematic diagram showing an example of the configuration of the color filter 200 shown in FIG.
- the color filter 200 can transmit light of specific wavelengths as described above.
- The Bayer array is an arrangement pattern in which the color filter blocks 204 that transmit green light are laid out in a checkered pattern, and the remaining positions are filled, in alternating rows, with color filter blocks 204 that transmit red light and color filter blocks 204 that transmit blue light (sketched below).
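As a concrete illustration of this arrangement, here is a minimal Python/NumPy sketch; the 2x2 RGGB unit-cell convention and the function name are assumptions for illustration, not the patent's specification:

```python
import numpy as np

def bayer_mask(rows: int, cols: int) -> np.ndarray:
    """Build a Bayer color-filter mask: 'G' on a checkerboard, with 'R'
    and 'B' filling the remaining sites on alternating rows."""
    mask = np.empty((rows, cols), dtype="<U1")
    for r in range(rows):
        for c in range(cols):
            if (r + c) % 2 == 0:
                mask[r, c] = "G"   # green blocks form the checkered pattern
            elif r % 2 == 0:
                mask[r, c] = "R"   # red fills even rows
            else:
                mask[r, c] = "B"   # blue fills odd rows
    return mask

print(bayer_mask(4, 4))
# [['G' 'R' 'G' 'R']
#  ['B' 'G' 'B' 'G']
#  ['G' 'R' 'G' 'R']
#  ['B' 'G' 'B' 'G']]
```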
- Since the filter region 202b has the configuration described above, the imaging apparatus 10 can use it to capture light over a wide wavelength band, obtaining images comparable to those of a general visible-light (RGB) camera.
- The filter region 202b is, however, not limited to the Bayer array, nor to the wavelengths mentioned above.
- The band corresponding to the distribution range of human skin color is divided into a plurality of sub-bands, and each of the filter regions 202a, 202c, and 202d is composed of a plurality of color filter blocks 204 that transmit light of the wavelengths in one of those sub-bands.
- The filter regions 202a, 202c, and 202d may each be, for example, a Bayer array, or may be filters that transmit predetermined monochromatic light (infrared or near-infrared light of a predetermined wavelength).
- the output circuit section 338 performs signal processing on the pixel signals sequentially supplied from each of the column signal processing circuit sections 334 described above through the horizontal signal line 346 and outputs the processed signal.
- The output circuit section 338 may function, for example, as a section that performs buffering, or it may perform processing such as black level adjustment, column variation correction, and various kinds of digital signal processing (a black-level sketch follows below). Here, buffering means temporarily storing pixel signals in order to absorb differences in processing speed and transfer speed when pixel signals are exchanged.
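As an illustration of the black level adjustment mentioned above, a hedged sketch; estimating the black level from optically shielded pixels and clipping at zero are common practice but assumptions here, not details given in the text:

```python
import numpy as np

def adjust_black_level(raw: np.ndarray, optical_black: int) -> np.ndarray:
    """Subtract the sensor's black level (e.g., estimated from optically
    shielded pixels) and clip so that no pixel value goes negative."""
    corrected = raw.astype(np.int32) - optical_black
    return np.clip(corrected, 0, None).astype(raw.dtype)
```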
- the input/output terminal 348 is a terminal for exchanging signals with an external device.
- The pixel array region 304b can capture an image of the entire subject 800 in cooperation with the filter region 202b, which transmits light over a wide band of wavelengths.
- Each of the pixel array regions 304a, 304c, and 304d cooperates with the filter regions 202a, 202c, and 202d, which transmit narrow-band light optimized for the color of a portion (e.g., the face) of the subject 800, and can capture an image of that portion.
- In other words, one pixel array unit 302 can capture not only the entire subject 800 but also a part of the subject 800, matched to the color characteristics of that part. Therefore, according to the present embodiment, the imaging apparatus 10 can improve the resolution for some of the colors of the subject 800 and detect those colors more accurately.
- the processing unit 410 can process pixel signals from the pixel array unit 302 to obtain images (first image, second image). Then, the processing unit 410 can output the image data to the output unit 440, which will be described later.
- The processing unit 410 can detect a predetermined subject (object) 800, or a portion of the subject 800, in the image as a region of interest (ROI).
- the number of ROIs to be detected is not limited to one, and a plurality of ROIs may be detected.
- The processing unit 410 may detect the ROI by finding, among the contours contained in a two-level-tone image generated by binarizing the image, a contour having predetermined characteristics.
- the processing unit 410 may directly detect the ROI by extracting predetermined feature points (shape, color, etc.) from the image without using the two-tone image.
- The processing unit 410 may also detect the ROI based on a predetermined region specified in advance by the user in the image, without using the two-tone image. For example, the processing unit 410 can extract the eyes from the image and detect the surrounding face region as the ROI. Further, in the present embodiment, when the user performs a touch operation (possibly in real time) to specify a range on an image (first image) that is based on the pixel signals from the plurality of pixels 306 in the pixel array region (first pixel array region) 304b and is displayed on a display device (not shown) overlaid with a touch panel (not shown), the processing unit 410 may set the ROI based on the specified range. Furthermore, in the present embodiment, the processing unit 410 may set, for example, the area on which the lens unit 100 is focused as the ROI (subject 800). A contour-based detection sketch follows below.
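A sketch of the binarize-and-find-contours approach described above, assuming OpenCV (cv2); the Otsu threshold and the minimum-area test standing in for the "predetermined characteristic" are illustrative choices:

```python
import cv2
import numpy as np

def detect_rois(image_bgr: np.ndarray, min_area: float = 500.0):
    """Detect candidate ROIs as bounding boxes of contours found in a
    two-level (binarized) version of the image."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu's method picks the binarization threshold automatically.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    rois = []
    for contour in contours:
        if cv2.contourArea(contour) >= min_area:    # keep only plausible contours
            rois.append(cv2.boundingRect(contour))  # (x, y, w, h)
    return rois
```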
- Based on the color and color distribution of the ROI detected from the image (first image) that is based on the pixel signals from the plurality of pixels 306 in the pixel array region (first pixel array region) 304b, the control unit 420 selects which of the filter regions (second filter regions) 202a, 202c, and 202d to use.
- Specifically, the control unit 420 can match the color distribution in the ROI against a table 432 (see FIG. 11) stored in the storage unit 430, described later, and thereby select the filter regions (second filter regions) 202a, 202c, and 202d to use, as sketched below.
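The table lookup can be pictured as follows; the sub-band boundaries and region labels are hypothetical stand-ins, since the contents of table 432 (FIG. 11) are not reproduced in this excerpt:

```python
# Hypothetical stand-in for table 432: a color sub-band (here, a dominant
# wavelength range in nm measured from the ROI) -> filter region to use.
FILTER_TABLE = [
    ((580.0, 600.0), "202a"),
    ((600.0, 620.0), "202c"),
    ((620.0, 640.0), "202d"),
]

def select_filter_region(dominant_wavelength_nm: float):
    """Return the filter region whose sub-band contains the ROI's color,
    or None if the color falls outside every narrow band."""
    for (low, high), region in FILTER_TABLE:
        if low <= dominant_wavelength_nm < high:
            return region
    return None

print(select_filter_region(605.0))  # -> "202c"
```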
- The control unit 420 also selects the pixels 306 from which pixel signals are to be read during ROI imaging.
- Because only some of the pixels 306 in the pixel array regions 304a, 304c, and 304d are driven according to the ROI, an increase in the power consumption of the imaging device 10 can be suppressed; a readout-addressing sketch follows below.
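A sketch of how an ROI bounding box might translate into a partial readout; simple row/column addressing is an assumption made for illustration:

```python
def addresses_for_roi(roi):
    """Translate an ROI bounding box (x, y, w, h) into the row and column
    addresses to enable; all other pixels stay unread, saving power."""
    x, y, w, h = roi
    return list(range(y, y + h)), list(range(x, x + w))

rows, cols = addresses_for_roi((64, 32, 16, 8))
print(rows[:4], cols[:4])  # [32, 33, 34, 35] [64, 65, 66, 67]
```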
- The output unit 440 can acquire the image data from the processing unit 410 and output it to, for example, the synthesizing unit 500.
- The synthesizing unit 500 may be configured by a CPU, a GPU, or the like separate from the processing unit 400 provided in the imaging device 10, or it may be provided within the processing unit 400.
- The synthesizing unit 500 may synthesize the image (first image) based on the pixel signals from the plurality of pixels 306 in the pixel array region (first pixel array region) 304b with the image (second image) of the subject 800, or of the portion of the subject 800, corresponding to the ROI (region of interest), as sketched below.
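One plausible reading of the synthesis step, sketched below: paste the narrow-band ROI image (second image) back into the wide-band whole image (first image) at the ROI's position. The alpha-blending option is an assumption, not something the text specifies:

```python
import numpy as np

def synthesize(first_image: np.ndarray, second_image: np.ndarray,
               roi, alpha: float = 1.0) -> np.ndarray:
    """Overlay the ROI image (h x w) onto the whole image at (x, y);
    alpha < 1 blends the two instead of replacing outright."""
    x, y, w, h = roi
    out = first_image.copy()
    base = out[y:y + h, x:x + w].astype(np.float64)
    blend = alpha * second_image.astype(np.float64) + (1.0 - alpha) * base
    out[y:y + h, x:x + w] = blend.astype(out.dtype)
    return out
```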
- FIG. 7 is a flowchart of the imaging method according to the present embodiment, FIGS. 8 to 10 are explanatory diagrams for explaining the imaging method according to the embodiment of the present disclosure, and FIG. 11 is an explanatory diagram for explaining an example of the table 432 used in the imaging method according to the present embodiment.
- the imaging device 10 sets imaging conditions to capture an entire image based on pixel signals from the plurality of pixels 306 in the pixel array region 304b (step S101).
- the imaging device 10 images the entire subject 802 under the imaging conditions set in step S101 (step S102).
- Next, the imaging device 10 identifies the ROIs 810a to 810c from the entire image 808 obtained in step S102 (step S103). For example, in the example of FIG. 9, the imaging device 10 extracts eyes from the entire image and detects the face image regions as the ROIs 810a to 810c.
- The imaging device 10 then images the ROIs 810a to 810c under the imaging conditions set in step S106 (step S107). At this time, the imaging device 10 may also capture an area ROUI (not shown) obtained by excluding the ROIs 810a to 810c from the entire image 808. The overall flow is sketched below.
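Putting the steps together, a high-level sketch of the capture loop of FIG. 7; the camera object and its method names are placeholders, and only the step ordering comes from the text (steps S104 and S105 are elided in this excerpt):

```python
def capture_with_roi(camera):
    camera.configure_whole()                 # S101: set whole-image conditions
    whole = camera.capture_whole()           # S102: image the entire subject
    rois = camera.detect_rois(whole)         # S103: identify ROIs 810a-810c
    for roi in rois:
        camera.configure_roi(roi)            # S106: set ROI imaging conditions
        roi_image = camera.capture_roi(roi)  # S107: image the ROI
        whole = camera.synthesize(whole, roi_image, roi)
    return whole
```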
- As described above, in the present embodiment, the colors of the subjects 800 and 802, or of parts of the subjects 800 and 802, can be accurately detected with high resolution, the imaging device 10 can be downsized, and an increase in power consumption can be suppressed.
- In the present embodiment, the imaging device 10 has a simple and compact configuration and can detect light in a predetermined wavelength band with high resolution, so the colors of the subjects 800 and 802 can be accurately detected and reproduced. Furthermore, in the present embodiment, by determining in advance the locations where color is to be detected with high accuracy and driving only the pixels corresponding to those locations, an increase in the image data to be output, and in turn an increase in power consumption, can be suppressed.
- the CPU 901 functions as an arithmetic processing device and a control device, and controls all or part of the operations within the smartphone 900 according to various programs recorded in the ROM 902, RAM 903, storage device 904, or the like.
- The ROM 902 stores programs and calculation parameters used by the CPU 901.
- The RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during execution, and the like.
- The CPU 901, ROM 902, and RAM 903 are interconnected by a bus 914.
- The storage device 904 is a data storage device configured as an example of a storage unit of the smartphone 900.
- The sensor module 907 includes various sensors such as, for example, a motion sensor (e.g., an acceleration sensor, a gyro sensor, or a geomagnetic sensor), a biological information sensor (e.g., a pulse sensor, a blood pressure sensor, or a fingerprint sensor), and a position sensor (e.g., a GNSS (Global Navigation Satellite System) receiver).
- the speaker 911 can output, for example, the voice of a call, the voice accompanying the video content displayed by the display device 910 described above, and the like to the user.
- For example, the user's face can be imaged by the imaging device 10 mounted on the smartphone 900, and the skin color of the user's face can be accurately detected. By applying the technology according to the present disclosure, it therefore becomes possible to provide services that propose, for example, cosmetics and makeup methods, or color coordination of clothes, according to the skin color detection result. Furthermore, the smartphone 900 to which the technology according to the present disclosure is applied can analyze the skin color detection result, recognize the user's physiological and psychological state, and make suggestions for treatment, health promotion, products, services, and the like.
- A plurality of propellers 930 are provided above the unmanned flying object 920 and are rotated by power transmitted from a propeller drive section 932 provided inside the unmanned flying object 920, thereby keeping the attitude of the unmanned flying object 920 horizontal.
- a propeller driving section 932 is provided inside the unmanned flying object 920 and rotates each propeller 930 according to control from a flight control section 934, which will be described later.
- the positioning unit 940 mainly has an attitude detection section 942, a GPS (Global Positioning System) unit 944, and an altimeter 946, as shown in FIG.
- The attitude detection section 942 includes, for example, a gyro sensor or the like combining an acceleration sensor and an angular velocity sensor, and detects the attitude (inclination, orientation, etc.) and acceleration of the unmanned flying object 920.
- the GPS unit 944 is composed of a current position measurement device that performs measurement using GPS signals from GPS satellites, and can obtain two-dimensional position information (latitude information and longitude information) of the unmanned flying object 920 on the ground surface.
- The altimeter 946 can acquire altitude information (height above the ground) of the unmanned flying object 920.
- The positioning unit 940 may omit the altimeter 946 if the GPS unit 944 can acquire altitude information with sufficient accuracy. However, depending on the positioning state, the altitude information obtained by the GPS unit 944 may have low accuracy, so the positioning unit 940 preferably includes the altimeter 946 in order to obtain altitude information with sufficient accuracy.
- When the flight control unit 934 receives a control signal from a control device (not shown) operated by the pilot, it uses the position and attitude information acquired by the positioning unit 940 described above and controls the propeller drive section 932 in accordance with the flight instructions contained in the control signal.
- the technology (the present technology) according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be applied to an endoscopic surgery system.
- The endoscope 11100 is composed of a lens barrel 11101, whose distal end portion is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
- In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible scope having a flexible lens barrel.
- the CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), etc., and controls the operations of the endoscope 11100 and the display device 11202 in an integrated manner. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various image processing such as development processing (demosaicing) for displaying an image based on the image signal.
- The display device 11202 displays, under the control of the CCU 11201, an image based on the image signal that has undergone image processing by the CCU 11201.
- The input device 11204 is an input interface for the endoscopic surgery system 11000.
- The user can input various kinds of information and instructions to the endoscopic surgery system 11000 via the input device 11204.
- For example, the user inputs instructions to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.).
- The driving of the light source device 11203 may also be controlled so as to change the intensity of the output light at predetermined intervals.
- By controlling the driving of the imaging element of the camera head 11102 in synchronism with the timing of those intensity changes, acquiring images in a time-division manner, and synthesizing those images, an image with a high dynamic range can be generated, as sketched below.
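A hedged sketch of the time-division idea: capture frames under alternating light intensities and merge them, weighting well-exposed pixels. The hat-shaped weighting is an assumption for illustration, not the CCU 11201's actual algorithm:

```python
import numpy as np

def fuse_exposures(frames):
    """Merge 8-bit frames captured under different illumination intensities
    into one high-dynamic-range image, favoring mid-range pixels that are
    neither underexposed nor saturated."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for frame in frames:
        f = frame.astype(np.float64) / 255.0
        w = 1.0 - 2.0 * np.abs(f - 0.5) + 1e-6  # weight peaks at mid-gray
        acc += w * f
        weight_sum += w
    return acc / weight_sum  # normalized radiance estimate in [0, 1]
```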
- the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band corresponding to special light observation.
- In special light observation, for example, the wavelength dependence of light absorption in body tissue is exploited: by irradiating light in a band narrower than the irradiation light used during normal observation (i.e., white light), so-called narrow band imaging is performed, in which predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast.
- fluorescence observation may be performed in which an image is obtained from fluorescence generated by irradiation with excitation light.
- FIG. 16 is a block diagram showing an example of functional configurations of the camera head 11102 and CCU 11201 shown in FIG.
- the camera head 11102 has a lens unit 11401, an imaging section 11402, a drive section 11403, a communication section 11404, and a camera head control section 11405.
- The CCU 11201 has a communication section 11411, an image processing section 11412, and a control section 11413.
- The camera head 11102 and the CCU 11201 are communicably connected to each other via a transmission cable 11400.
- The number of imaging elements constituting the imaging unit 11402 may be one (a so-called single-plate type) or more than one (a so-called multi-plate type).
- In the multi-plate case, image signals corresponding to R, G, and B may be generated by the respective imaging elements, and a color image may be obtained by synthesizing them; a sketch follows below.
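In the multi-plate case, combining the per-element signals can be pictured as simply stacking the planes; a sketch (real systems would also register and color-correct the planes, which is omitted here):

```python
import numpy as np

def merge_planes(r: np.ndarray, g: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Stack the R, G, and B images from three imaging elements into a
    single H x W x 3 color image (assumes the planes are already aligned)."""
    return np.stack([r, g, b], axis=-1)
```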
- the imaging unit 11402 may be configured to have a pair of imaging elements for respectively acquiring right-eye and left-eye image signals corresponding to 3D (dimensional) display.
- the 3D display enables the operator 11131 to more accurately grasp the depth of the living tissue in the surgical site.
- a plurality of systems of lens units 11401 may be provided corresponding to each imaging element.
- The drive unit 11403 is configured by an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Thereby, the magnification and the focus of the image captured by the imaging unit 11402 can be adjusted appropriately.
- The imaging conditions such as frame rate, exposure value, magnification, and focus may be designated by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal.
- In the latter case, the endoscope 11100 is equipped with so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions.
- the camera head control unit 11405 controls driving of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
- The communication unit 11411 is composed of a communication device for transmitting and receiving various kinds of information to and from the camera head 11102.
- the communication unit 11411 receives image signals transmitted from the camera head 11102 via the transmission cable 11400 .
- The control unit 11413 causes the display device 11202 to display a captured image showing the surgical site and the like, based on the image signal that has undergone image processing in the image processing unit 11412.
- At that time, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the shapes, colors, and so on of the edges of objects included in the captured image, the control unit 11413 can recognize surgical instruments such as forceps, specific body parts, bleeding, mist during use of the energy treatment instrument 11112, and the like.
- When causing the display device 11202 to display the captured image, the control unit 11413 may use the recognition results to superimpose various kinds of surgery support information on the image of the surgical site. Superimposing and presenting the surgery support information to the operator 11131 reduces the burden on the operator 11131 and allows the operator 11131 to proceed with the surgery reliably.
- a transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electrical signal cable compatible with electrical signal communication, an optical fiber compatible with optical communication, or a composite cable of these.
- FIG. 17 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
- the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
- The drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
- the vehicle exterior information detection unit 12030 detects information outside the vehicle in which the vehicle control system 12000 is installed.
- The vehicle exterior information detection unit 12030 is connected with an imaging section 12031.
- the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
- Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like.
- the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light.
- the imaging unit 12031 can output the electric signal as an image, and can also output it as distance measurement information.
- the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
- the in-vehicle information detection unit 12040 detects in-vehicle information.
- the in-vehicle information detection unit 12040 is connected to, for example, a driver state detection section 12041 that detects the state of the driver.
- The driver state detection section 12041 includes, for example, a camera that captures an image of the driver. Based on the detection information input from the driver state detection section 12041, the in-vehicle information detection unit 12040 may calculate the driver's degree of fatigue or concentration, or may determine whether the driver is dozing off.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, side mirrors, rear bumper, back door, and windshield of the vehicle 12100, for example.
- The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided at the top of the windshield inside the passenger compartment mainly acquire images ahead of the vehicle 12100.
- The imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the sides of the vehicle 12100.
- The imaging unit 12104 provided on the rear bumper or back door mainly acquires images behind the vehicle 12100.
- the imaging unit 12105 provided above the windshield in the passenger compartment is mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
- For example, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into motorcycles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into those that are visible to the driver of the vehicle 12100 and those that are difficult to see. The microcomputer 12051 then judges the collision risk, which indicates the degree of danger of collision with each obstacle; when the collision risk is at or above a set value and there is a possibility of collision, the microcomputer 12051 can provide driving support for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 and the display section 12062, or by performing forced deceleration and avoidance steering via the drive system control unit 12010. A sketch of this threshold logic follows below.
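The collision-risk logic reads like a threshold test on an estimated risk value; a sketch under that reading, with time-to-collision as an assumed stand-in for the (unspecified) collision-risk metric:

```python
def collision_support(distance_m: float, closing_speed_mps: float,
                      risk_threshold_s: float = 2.0) -> str:
    """Return a driving-support action for one obstacle: warn the driver when
    the time-to-collision falls at or below the set value, and escalate to
    forced deceleration / avoidance steering when it is far below it."""
    if closing_speed_mps <= 0.0:
        return "none"                 # obstacle is not closing in
    time_to_collision = distance_m / closing_speed_mps
    if time_to_collision <= risk_threshold_s / 2.0:
        return "brake_and_steer"      # via drive system control unit 12010
    if time_to_collision <= risk_threshold_s:
        return "warn_driver"          # via audio speaker 12061 / display 12062
    return "none"

print(collision_support(20.0, 15.0))  # TTC is about 1.33 s -> "warn_driver"
```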
- The imaging device according to any one of (5) to (7) above, wherein
- (9) The imaging device according to any one of (1) to (8) above, wherein the first and second pixel array regions are composed of a plurality of pixels arranged in a matrix on the substrate.
- (10) The imaging device according to (9) above, wherein the area of the pixels forming the first pixel array region is different from the area of the pixels forming the second pixel array region.
- (11) The imaging device according to (9) or (10) above, wherein the number of pixels forming the first pixel array region is different from the number of pixels forming the second pixel array region.
- (12) The imaging device according to any one of (1) to (11) above, further comprising a lens unit that guides light to the color filter.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Power Engineering (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Electromagnetism (AREA)
- Condensed Matter Physics & Semiconductors (AREA)
- Computer Hardware Design (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Solid State Image Pick-Up Elements (AREA)
- Studio Devices (AREA)
- Optical Filters (AREA)
- Color Television Image Signal Generators (AREA)
Abstract
The invention relates to an imaging device (10) comprising: a color filter (200) including a first filter region that transmits light of a wavelength in a first band and a second filter region that transmits light of a wavelength in a second band; a pixel array unit (302) including a first pixel array region that receives light having passed through the first filter region and a second pixel array region that receives light having passed through the second filter region; a processing unit (410) that detects a region of interest from a first image obtained by processing pixel signals from a plurality of pixels located in the first pixel array region; and a control unit (420) that, in order to acquire a second image of an object corresponding to the region of interest, controls some of a plurality of pixels located in the second pixel array region based on the position of the region of interest.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-043627 | 2021-03-17 | ||
JP2021043627A JP2022143220A (ja) | 2021-03-17 | 2021-03-17 | Imaging device, imaging method, and electronic apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022196210A1 (fr) | 2022-09-22 |
Family
ID=83322237
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/005661 WO2022196210A1 (fr) Imaging device, imaging method, and electronic apparatus | 2022-02-14 | 2021-03-17 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2022143220A (fr) |
WO (1) | WO2022196210A1 (fr) |
- 2021-03-17 JP JP2021043627A patent/JP2022143220A/ja active Pending
- 2022-02-14 WO PCT/JP2022/005661 patent/WO2022196210A1/fr active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013108788A (ja) * | 2011-11-18 | 2013-06-06 | Tokyo Institute Of Technology | Multispectral image information acquisition device and multispectral image information acquisition method |
JP2018096834A (ja) * | 2016-12-13 | 2018-06-21 | Sony Semiconductor Solutions Corporation | Data processing device, data processing method, program, and electronic apparatus |
JP2019036843A (ja) * | 2017-08-15 | 2019-03-07 | Canon Inc. | Imaging device, camera, and transport equipment |
Also Published As
Publication number | Publication date |
---|---|
JP2022143220A (ja) | 2022-10-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22770969; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 22770969; Country of ref document: EP; Kind code of ref document: A1 |