JP2013242441A - Imaging apparatus and method for deciding illumination light quantity - Google Patents

Imaging apparatus and method for deciding illumination light quantity

Info

Publication number
JP2013242441A
Authority
JP
Japan
Prior art keywords
illumination
information
map
luminance
subject
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2012115856A
Other languages
Japanese (ja)
Other versions
JP6160030B2 (en)
Inventor
Tomoyuki Shindo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to JP2012115856A priority Critical patent/JP6160030B2/en
Publication of JP2013242441A publication Critical patent/JP2013242441A/en
Application granted granted Critical
Publication of JP6160030B2 publication Critical patent/JP6160030B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Exposure Control For Cameras (AREA)
  • Stroboscope Apparatuses (AREA)
  • Studio Devices (AREA)

Abstract

PROBLEM TO BE SOLVED: To determine a flash emission quantity in advance, without performing a pre-emission.
SOLUTION: An imaging unit 11 acquires an actual image, and a depth map acquisition unit 12 acquires a depth map simultaneously with the actual image. A determination unit 26 divides the actual image into regions and creates a reduced luminance map in which each region holds the average of the luminance values of its pixels. The determination unit 26 processes the depth map in the same way to create a reduced depth map and then, based on the reduced depth map and the emission quantity information of a built-in flash 17, creates a virtual illumination luminance map in which the quantity of light reflected from the subject when illuminated only by the built-in flash 17 is calculated for each region. The determination unit 26 combines the illumination luminance map with the reduced luminance map to create a virtual luminance map and, based on the virtual luminance map, determines the emission quantity of the built-in flash 17 so that the luminance value of the region containing the subject of interest becomes a predetermined luminance value.

Description

The present invention relates to an imaging apparatus and a method for determining an illumination light quantity.

To obtain proper exposure with a strobe, the strobe's emission quantity must be controlled. Two control schemes are known: the flashmatic scheme, which controls the aperture based on the guide number and the shooting distance, and the auto-strobe scheme, which stops the flash emission once the quantity of light reflected from the subject reaches a predetermined level. With the former, any measurement error in the shooting distance appears directly as an exposure error. With the latter, dimming performance degrades when the window of the light-control sensor becomes dirty. A photographing apparatus is therefore known that performs a pre-emission prior to shooting, measures with photometric means the subject light reflected under the pre-emission, and calculates the strobe emission for the main emission (quantity and duration) so that proper exposure is obtained based on the measured subject light (Patent Document 1).
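As a rough illustrative sketch (not part of the patent) of the flashmatic relation, the f-number follows directly from the guide number and the subject distance, which is why a distance error propagates straight into exposure:

```python
def flashmatic_aperture(guide_number: float, distance_m: float) -> float:
    """Flashmatic rule of thumb: f-number = guide number / subject distance.

    Any error in distance_m appears proportionally in the f-number,
    i.e. directly as an exposure error.
    """
    return guide_number / distance_m

# Example: GN 12 (ISO 100, meters) at a 3 m subject suggests f/4.
print(flashmatic_aperture(12, 3.0))  # 4.0
```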

JP 2003-161987 A

However, when flash photography is performed on a person, a child for example, the main emission follows immediately after the pre-emission, so the shot may be taken while the subject's eyes are closed against the glare of the pre-emission. If the subject is an animal, the animal may move in response to the pre-emission. Furthermore, flash photography with a pre-emission consumes more power than flash photography with the main emission alone, and may drain the camera's battery sooner.

In view of the above problems, an object of the present invention is to provide an imaging apparatus and an illumination light quantity determination method capable of determining, in advance and without a pre-emission, the illumination light quantity with which to illuminate a subject.

One aspect of an imaging apparatus exemplifying the present invention comprises: actual image acquisition means for imaging a subject and acquiring an actual image; three-dimensional information acquisition means for acquiring three-dimensional information including depth information corresponding to the actual image; determination means for determining, before shooting and based on the actual image and the three-dimensional information, the illumination light quantity of illumination means for illuminating the subject; and illumination control means for controlling the illumination means at the time of shooting so that the determined illumination light quantity is obtained.

According to one aspect of the illumination light quantity determination method of the present invention, the method includes: a step of inputting an actual image acquired by imaging a subject; a step of inputting three-dimensional information including depth information corresponding to the actual image; and a step of generating, based on the actual image and the three-dimensional information, virtual subject luminance information that takes the depth information into account and that would be obtained when illumination means illuminates the subject, and determining the illumination light quantity of the illumination means based on the virtual subject luminance information so that the luminance value of a region containing a subject of interest becomes a predetermined luminance value.

According to the present invention, the illumination light quantity of the illumination means that illuminates the subject can be determined before shooting, without performing a pre-emission.

FIG. 1 is a block diagram outlining an electronic camera as one example of the present invention.
FIG. 2 is a block diagram showing the configuration of the determination unit that determines the emission quantity.
FIG. 3 is a flowchart showing the operating procedure by which the determination unit determines the emission quantity.
FIG. 4 is an explanatory diagram illustrating the maps acquired and created by the determination unit: (A) an actual image, (B) a depth map, (C) a reduced luminance map, (D) a reduced depth map, (E) a virtual illumination luminance map, and (F) a virtual luminance map, each shown as an example.
FIG. 5 is an explanatory diagram showing one form of a photographing system including an external flash and an electronic camera.
FIG. 6 is a block diagram showing the configuration of the photographing system described in FIG. 5.
FIG. 7 is a block diagram showing the configuration of the determination unit of the electronic camera described in FIG. 6.
FIG. 8 is a flowchart showing the operating procedure by which the determination unit described in FIG. 7 determines the emission quantity.

[First Embodiment]
As shown in FIG. 1, an electronic camera 10 representing one embodiment of the imaging apparatus of the present invention comprises an imaging unit 11, a depth map acquisition unit 12, a ROM 13, a RAM 14, a recording unit 15, a flash control unit 16, a built-in flash 17, an operation unit 18, a display control unit 19, a display unit 20, a touch panel 21, and a CPU 22. The imaging unit 11, the depth map acquisition unit 12, the ROM 13, the RAM 14, the recording unit 15, the display control unit 19, and the CPU 22 are each connected to a bus 23. The flash control unit 16, the operation unit 18, and the touch panel 21 are connected to the CPU 22. The touch panel 21 is provided on the display unit 20. The display unit 20 displays a through image and the like, and its display is controlled by the display control unit 19. The built-in flash 17 is connected to the flash control unit 16, which controls its light emission.

The operation unit 18 includes a power operation unit, a release operation unit, and the like. The ROM 13 stores the emission information of the built-in flash 17 (information including the guide number and peripheral light falloff) and programs to be executed by the CPU 22, such as a shooting program and an emission quantity determination program. The RAM 14 is used as a work area for the CPU 22 and as a temporary storage area for data. The recording unit 15 records the data of the actual image acquired at the time of main shooting.

The imaging unit 11 forms an optical image of the subject, focused by the imaging lens, on the imaging surface of the image sensor at a predetermined exposure, and converts the formed optical image into an electrical signal with the image sensor to generate an actual image. The imaging unit 11 is an example of actual image acquisition means.

The depth map acquisition unit 12 acquires a depth map, which is depth information, or three-dimensional information including it (hereinafter referred to as a "depth map"), substantially simultaneously with the actual image and over the same imaging range. The acquired depth map is stored in the RAM 14 in association with the actual image. The depth map acquisition unit 12 includes a depth map generation circuit that generates the depth map using well-known techniques such as depth map generation with a stereo camera, the TOF (Time of Flight) technique using infrared light, techniques using an infrared reflection pattern, and depth map generation from motion vectors. The generated depth map is a grayscale image in which a luminance distribution corresponding to the depth of each pixel is obtained relative to some reference position, for example by raising the luminance of a pixel the greater its depth (distance to the subject). The depth map acquisition unit 12 is an example of three-dimensional information acquisition means.
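For illustration only (the patent does not prescribe a particular encoding), such a grayscale depth map can be sketched as a normalization of per-pixel distances to gray levels, with farther pixels rendered brighter as in the example above:

```python
import numpy as np

def depth_to_grayscale(depth_m: np.ndarray) -> np.ndarray:
    """Encode per-pixel distances (meters) as an 8-bit grayscale depth map.
    Farther pixels get higher gray levels; the min-max normalization to
    a reference range is an illustrative assumption."""
    near, far = depth_m.min(), depth_m.max()
    normalized = (depth_m - near) / max(far - near, 1e-9)
    return (normalized * 255).astype(np.uint8)
```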

The CPU 22 comprises an AE/AF unit 24, a judgment unit 25, and a determination unit 26. The AE/AF unit 24 acquires subject luminance information and subject distance information (AF evaluation values indicating subject contrast) from parts of the actual image (the ranging region and the photometric region), and based on this information controls the imaging unit 11 so that the exposure, comprising ISO sensitivity, shutter speed, and aperture value, becomes a proper exposure and so that the photographing lens moves to the in-focus position.

In response to a half-press of the release operation unit, the CPU 22 performs a preliminary shot, acquiring an actual image from the imaging unit 11 in order to judge whether the subject is a low-luminance subject. The actual image acquired in the preliminary shot is temporarily stored in the RAM 14. The judgment unit 25 judges the subject to be a low-luminance subject when the subject luminance of part of that actual image is, for example, at or below a predetermined threshold.

When the judgment unit 25 judges the subject to be a low-luminance subject, the CPU 22 operates the depth map acquisition unit 12 to acquire a depth map, determines the emission quantity of the built-in flash 17 needed for proper exposure using the acquired depth map and the actual image from the preliminary shot, and, in response to a full press of the release operation unit, fires the built-in flash 17 at the previously determined emission quantity to perform the main shot.

When performing flash photography, the CPU 22 sets the exposure of the imaging unit 11 to predetermined exposure values (ISO sensitivity, aperture value, and shutter speed). The actual image acquired in the main shot is stored in the recording unit 15 as image data. A release operation unit with a single-press structure may be used instead of the two-stage half-press/full-press structure; in that case, the main shot may be performed a fixed time after the preliminary shot has been taken and the emission quantity determined in response to the press.

Based on the actual image and the depth map acquired in the preliminary shot, the determination unit 26 creates the virtual quantity of reflected light that would be obtained when the built-in flash 17 illuminates the subject according to its distance, that is, virtual subject luminance information (hereinafter referred to as "virtual luminance information"), and based on the created virtual luminance information determines the emission quantity of the built-in flash 17 so that the luminance value of the region containing the subject of interest becomes a predetermined luminance value. The determination unit 26 is an example of determination means.

Specifically, as shown in FIG. 2, the determination unit 26 includes a reduced luminance map creation unit 27, a reduced depth map creation unit 28, an illumination luminance map creation unit 29, a virtual luminance map creation unit 30, and an emission quantity determination unit 31. When the judgment unit 25 judges the subject to be a low-luminance subject, these components determine the emission quantity according to the procedure shown in FIG. 3.

First, when the subject is judged to be a low-luminance subject, the CPU 22 operates the depth map acquisition unit 12 to acquire a depth map and stores the acquired depth map in the RAM 14 (S-1). At this point, the actual image has already been stored in the RAM 14 in response to the preliminary shot.

The reduced luminance map creation unit 27 reads the actual image from the RAM 14 and, for each region obtained by dividing the effective image area of the actual image, creates from the actual image a reduced luminance map in which each region holds the average luminance value computed over its pixels (S-2). The actual image is shown in FIG. 4(A), the depth map corresponding to it in FIG. 4(B), and the reduced luminance map created from the actual image in FIG. 4(C).

For each area corresponding to one of those regions, the reduced depth map creation unit 28 creates from the depth map a reduced depth map in which each area holds the average depth value computed over its pixels (see FIG. 4(D)) (S-3). In FIGS. 4(C) and 4(D), the reduced maps are shown enlarged.
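Steps S-2 and S-3 are the same block-averaging reduction applied to the luminance image and the depth map respectively. A minimal sketch in Python (the 8×8 grid and the 2-D array layout are assumptions, not taken from the patent):

```python
import numpy as np

def reduce_map(full_map: np.ndarray, rows: int = 8, cols: int = 8) -> np.ndarray:
    """Divide a 2-D map (luminance or depth) into rows x cols regions and
    return the per-region average: the 'reduced' map of S-2/S-3."""
    h, w = full_map.shape
    trimmed = full_map[: h - h % rows, : w - w % cols]  # drop remainder pixels
    blocks = trimmed.reshape(rows, trimmed.shape[0] // rows,
                             cols, trimmed.shape[1] // cols)
    return blocks.mean(axis=(1, 3))  # shape (rows, cols)
```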

Based on the emission information of the built-in flash 17 (information such as emission quantity and irradiation angle) and the reduced depth map, the illumination luminance map creation unit 29 creates a virtual illumination luminance map (see FIG. 4(E)) in which the quantity of light reflected from the subject when illuminated at the base emission quantity contained in the emission information is calculated for each area (S-4). Storing in the ROM 13 in advance a table of luminance values indexed by emission quantity and distance, and reading it out for use, simplifies the computation when creating the illumination luminance map.
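The patent defers the photometry of S-4 to a precomputed ROM table indexed by emission quantity and distance; as an assumed stand-in for that table, an inverse-square falloff conveys the idea:

```python
import numpy as np

def illumination_luminance_map(reduced_depth_m: np.ndarray,
                               emission: float) -> np.ndarray:
    """Virtual per-area luminance from the flash alone (S-4).
    The inverse-square model here is an assumption; the patent instead
    reads a ROM table indexed by emission quantity and distance."""
    return emission / np.maximum(reduced_depth_m, 0.1) ** 2
```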

The virtual luminance map creation unit 30 combines the illumination luminance map with the luminance map to create a virtual luminance map (see FIG. 4(F)) (S-5). The virtual luminance map is virtual subject luminance information in which the quantity of light reflected from the subject when the built-in flash is fired under the actual ambient conditions (under steady light) is calculated for each region.

Then, based on the virtual luminance map, the emission quantity determination unit 31 determines the emission quantity of the built-in flash 17 so that the luminance value of the region 32 containing the subject of interest (see FIG. 4(F)) becomes a predetermined luminance value. To do this, it extracts the luminance value of the region 32 containing the subject of interest from the virtual luminance map (S-6) and judges whether the extracted luminance value equals the predetermined luminance value (S-7). If it does not, the emission quantity to be set is increased or decreased relative to the base emission quantity and the illumination luminance map is recreated (S-8); the recreated illumination luminance map is combined with the luminance map to recreate the virtual luminance map (S-9); the luminance value of the region 32 containing the subject of interest is extracted from the recreated virtual luminance map (S-6); and whether the extracted luminance value equals the predetermined luminance value is judged again (S-7). When the extracted luminance value equals the predetermined luminance value, the emission quantity set at that time is determined as the emission quantity for the main shot (S-10).
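Steps S-5 through S-10 amount to a search over candidate emission quantities. A minimal sketch, reusing illumination_luminance_map from the sketch above and assuming additive composition of ambient and flash light plus a tolerance band standing in for the "predetermined luminance value" (all of these are assumptions, as are the parameter values):

```python
def decide_emission(reduced_luminance, reduced_depth_m, region,
                    target=180.0, tol=10.0,
                    start=1.0, step=0.25, max_emission=16.0):
    """Iterate S-5..S-10: raise the flash emission from a low starting value
    until the region of interest reaches the target luminance band.
    `region` is e.g. a tuple of slices selecting the area of interest."""
    emission = start
    while emission <= max_emission:
        illum = illumination_luminance_map(reduced_depth_m, emission)  # S-4/S-8
        virtual = reduced_luminance + illum                            # S-5/S-9
        value = virtual[region].mean()                                 # S-6
        if abs(value - target) <= tol:                                 # S-7
            return emission                                            # S-10
        emission += step
    return max_emission  # clamp to the flash's maximum if never reached
```

Starting low and stepping upward mirrors the preference stated below of beginning from a base emission quantity lower than later candidates.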

The predetermined luminance value may be a value with an allowable range; in that case, it suffices to judge whether the luminance value of the region 32 containing the subject of interest falls within that range. The region 32 containing the subject of interest is preferably taken to be the region with the highest luminance value among the regions of the virtual luminance map.

The camera may also be configured so that the actual image is displayed on the display unit 20 after the preliminary shot, and the subject of interest, or a region containing it, is selected on the displayed actual image using the touch panel 21. The touch panel 21 is an example of selection means.

The base emission quantity used when the illumination luminance map is first created is preferably set lower than the emission quantities set afterward. Gradually raising the emission quantity and recreating the virtual luminance map for judgment makes it less likely that the luminance value of any region of the virtual luminance map becomes saturated (blown out), so the luminance judgment can proceed quickly. Conversely, the maximum emission quantity of the built-in flash 17 may be set as the base emission quantity and the emission quantity gradually lowered afterward.

[Second Embodiment]
As shown in FIG. 5, the second embodiment is an example of a photographing system 48 in which a flash device 45 is placed at a position away from an electronic camera 46 and flash photography is performed on a person 47. Reference numeral 49 denotes a reflecting surface of an object that reflects the flash light. The flash device 45 is an example of illumination means.

As shown in FIG. 6, the electronic camera 46 comprises, in addition to the configuration described in FIG. 1, a transceiver 50 that performs wireless communication with the flash device 45. The flash device 45 comprises a light emitting unit 51, a transmission/reception unit 52, a position detection unit 53, a direction detection unit 54, an emission information storage unit 55, and a control unit 56 that controls them centrally. The position detection unit 53 includes, for example, a GPS (Global Positioning System) sensor that receives radio waves from positioning satellites and obtains position information indicating the measured current position. In response to a request from the electronic camera 46, the control unit 56 transmits the position information obtained from the position detection unit 53 to the electronic camera 46 via the transmission/reception unit 52.

The direction detection unit 54 comprises, for example, an electronic compass and an acceleration sensor. The electronic compass is built from a magnetic element such as a Hall element and detects the azimuth from the geomagnetic field, while the acceleration sensor detects the tilt (elevation angle). In response to a request from the electronic camera 46, the control unit 56 transmits information on the emission direction of the light emitting unit 51, obtained from the direction detection unit 54, to the electronic camera 46 via the transmission/reception unit 52.

The emission information storage unit 55 stores in advance light-emitting-unit information such as the emission quantity (guide number) and the irradiation angle. In response to a request from the electronic camera 46, the control unit 56 reads the light-emitting-unit information from the emission information storage unit 55 and transmits it to the electronic camera 46 via the transmission/reception unit 52. In this embodiment, the position information, the emission direction information, and the light-emitting-unit information are examples of emission information.

The control unit 56 controls the light emission of the light emitting unit 51 based on the emission quantity information determined by the electronic camera 46, a trigger signal, and the like transmitted via the transmission/reception unit 52. Although the flash device 45 and the electronic camera 46 are connected wirelessly here, a wired connection may be used instead.

The CPU 22 of the electronic camera 46 comprises a determination unit 57. As shown in FIG. 7, the determination unit 57 includes a reduced luminance map creation unit 27, a reduced depth map creation unit 28, a reflected light luminance map creation unit 60, an illumination luminance map creation unit 61, a virtual luminance map creation unit 30, and an emission quantity determination unit 31. Compared with the configuration described in FIG. 2, the second embodiment adds the reflected light luminance map creation unit 60, which obtains the reflectance of the reflecting surfaces 49 of objects based on the reduced luminance map and the reduced depth map and creates a reflected light luminance map. The illumination luminance map creation unit 61 creates an illumination luminance map based on the reduced luminance map, the emission information, and the reflected light luminance map. Components identical to those described in FIG. 2 are given the same reference numerals and their detailed description is omitted.

When the judgment unit 25 judges from the preliminary shot that the subject is a low-luminance subject, the emission quantity is determined according to the procedure shown in FIG. 8. Compared with the procedure described in FIG. 3, the second embodiment adds a step (S-11) in which the reflected light luminance map creation unit 60 obtains the reflectance of the reflecting surfaces 49 of objects based on the reduced luminance map and the reduced depth map and creates the reflected light luminance map. Specifically, the reflecting surfaces 49 are identified by smoothing the reduced depth map according to distance similarity, and the reflectance of each reflecting surface 49 is identified by smoothing the reduced luminance map, within the extent of each identified reflecting surface, according to color similarity.
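The patent describes S-11 only at the level of "smoothing by similarity". The sketch below is one possible reading, with fixed-quantum depth binning standing in for the distance-similarity smoothing and per-surface mean luminance standing in for reflectance; both substitutions are assumptions, not the patent's method:

```python
import numpy as np

def estimate_reflectance(reduced_luminance: np.ndarray,
                         reduced_depth_m: np.ndarray,
                         depth_bin_m: float = 0.5) -> np.ndarray:
    """S-11 sketch: group areas whose depths are similar into candidate
    reflecting surfaces, then assign each surface a reflectance proxy from
    its mean luminance (assuming 0-255 luminance values)."""
    labels = np.round(reduced_depth_m / depth_bin_m).astype(int)
    reflectance = np.zeros_like(reduced_luminance, dtype=float)
    for label in np.unique(labels):
        mask = labels == label
        reflectance[mask] = reduced_luminance[mask].mean() / 255.0
    return reflectance
```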

The illumination luminance map creation unit 61 transmits a request for emission information to the flash device 45 and, based on the emission information received in response (information such as emission position, emission direction, emission quantity, and irradiation angle), the previously created reduced depth map, and the reflected light luminance map, creates a virtual illumination luminance map in which the quantity of light reflected from the subject, illuminated both by the direct light that arrives straight from the flash device 45 when it fires and by the reflected light that arrives indirectly after bouncing off surrounding objects, is calculated for each region (S-12). Thereafter, as described in the first embodiment, a virtual luminance map is created, and the emission quantity set when creating the illumination luminance map is found such that the luminance value of the region containing the subject of interest, obtained from the virtual luminance map, becomes the predetermined luminance value.
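Again the patent does not give the light-transport model for S-12; a hedged sketch might combine a direct inverse-square term with a single reflectance-attenuated bounce. The single average bounce path length is a simplifying assumption, as is the inverse-square model itself:

```python
import numpy as np

def illumination_with_bounce(reduced_depth_m: np.ndarray,
                             reflectance: np.ndarray,
                             emission: float,
                             flash_to_surface_m: float = 2.0) -> np.ndarray:
    """S-12 sketch: direct inverse-square light from the off-camera flash
    plus one indirect bounce attenuated by the surface reflectance."""
    direct = emission / np.maximum(reduced_depth_m, 0.1) ** 2
    bounce_path = flash_to_surface_m + reduced_depth_m
    indirect = reflectance * emission / np.maximum(bounce_path, 0.1) ** 2
    return direct + indirect
```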

In the second embodiment above, a reflected light luminance map is created and the virtual luminance map is built from three kinds of light: steady light, illumination light, and reflected light. The reflected light luminance map creation unit may be omitted, however, and the map built from steady light and illumination light alone. Conversely, the reflected light luminance map creation unit described in the second embodiment may be used in the first embodiment, so that the first embodiment also builds its virtual luminance map from steady light, illumination light, and reflected light.

Although each embodiment above is described as the electronic camera 10, a camera-equipped mobile phone, a camera-equipped electronic device, or the like may be used. Likewise, although the built-in flash 17 and the flash device (external flash) 45 are described as using a flash light source that produces a burst of light by discharge, the illumination means of the present invention is not limited to this, and illumination means using an LED (Light Emitting Diode) may be used. A plurality of illumination means may also be used; in that case, a master flash and one or more slave flashes that fire in synchronization with the main emission of the master flash may be used.

In each embodiment above, the actual image (luminance map) and the depth map are each reduced in order to simplify the computations for creating the illumination luminance map and the virtual luminance map, but the present invention is not limited to this, and the maps need not be reduced. Also, although each embodiment uses the bilinear method, averaging the luminance and depth values of each region to obtain the reduced maps, the bicubic method may be used for the reduction instead.

In each embodiment above, the emission quantity for flash photography is determined when the subject is a low-luminance subject, but the present invention is not limited to this; for example, when a daylight synchro shooting mode is selected, the emission quantity may be determined and flash photography performed regardless of the subject luminance.

Each embodiment above may also be configured so that one frame's worth of the actual image and the depth map is always kept in the RAM before shutter release, and at shutter release the previously stored actual image and depth map are read from the RAM to determine the emission quantity.

Although each embodiment above is described as the electronic camera 10 in which the imaging unit 11 and the depth map acquisition unit 12 are provided integrally, a configuration in which one or both of the imaging unit 11 and the depth map acquisition unit 12 are provided separately may also be used. In that case, one or both of the actual image and the depth map may be input externally from the separately provided unit. The imaging unit 11, the depth map acquisition unit 12, and the illumination means may also all be provided separately. In that case, the present invention may take the form of an illumination light quantity determination method and apparatus in which the imaging unit 11, the depth map acquisition unit 12, and the illumination control means (CPU 22) are omitted, the actual image and the depth map are taken as external inputs, and the illumination light quantity is determined based on the input actual image and depth map.

Although the present invention has been described concretely above based on preferred embodiments, the present invention is not limited to the configurations described in the embodiments above, and it goes without saying that various modifications are possible without departing from the gist of the invention.

10, 46 Electronic camera
17 Built-in flash
26, 57 Determination unit
45 Flash device

Claims (8)

An imaging apparatus comprising:
actual image acquisition means for imaging a subject and acquiring an actual image;
three-dimensional information acquisition means for acquiring three-dimensional information including depth information corresponding to the actual image;
determination means for determining, before shooting and based on the actual image and the three-dimensional information, an illumination light quantity of illumination means for illuminating the subject; and
illumination control means for controlling the illumination of the illumination means at the time of shooting so that the illumination light quantity determined by the determination means is obtained.
The imaging apparatus according to claim 1, wherein
the determination means generates virtual subject luminance information that takes the depth information into account and that is obtained when the illumination means illuminates the subject, and determines the illumination light quantity based on the generated virtual subject luminance information so that a luminance value of a region containing a subject of interest becomes a predetermined luminance value.
The imaging apparatus according to claim 1 or 2, wherein
the illumination means is provided at a position away from the apparatus,
the imaging apparatus comprises illumination information acquisition means for acquiring, from the illumination means, illumination information relating to the illumination of the illumination means, and
the determination means determines the illumination light quantity with the illumination information added to the conditions for determining it.
The imaging apparatus according to claim 3, wherein
the illumination information includes information on a light quantity, an illumination position, and an illumination angle, and
the determination means determines the illumination light quantity with the information on the light quantity, the illumination position, and the illumination angle added to the conditions.
The imaging apparatus according to any one of claims 1 to 4, comprising
selection means for selecting, on the actual image, the subject of interest or a range containing it.
The imaging apparatus according to any one of claims 1 to 5, comprising
reflected light information estimation means for estimating, based on the actual image and the three-dimensional information, reflection information including a reflection position and a reflectance of an object contained in the actual image, wherein
the determination means determines the illumination light quantity with the reflection information added to the conditions for determining it.
The imaging apparatus according to any one of claims 2 to 6, wherein the determination means comprises:
means for creating, based on the actual image, a luminance map having luminance value information for each region into which the area of the actual image is divided;
means for creating, based on the three-dimensional information, a depth map having distance information for each region;
means for calculating, based on the illumination information and the depth map, a virtual illumination luminance map obtained when illumination is performed by the illumination means alone; and
means for generating the virtual luminance information by combining the illumination luminance map and the luminance map.
An illumination light quantity determination method comprising:
a step of inputting an actual image obtained by imaging a subject;
a step of inputting three-dimensional information including depth information corresponding to the actual image; and
a step of generating, based on the actual image and the three-dimensional information, virtual subject luminance information that takes the depth information into account and that is obtained when illumination means illuminates the subject, and determining an illumination light quantity of the illumination means based on the virtual subject luminance information so that a luminance value of a region containing a subject of interest becomes a predetermined luminance value.
JP2012115856A 2012-05-21 2012-05-21 Imaging apparatus and illumination light quantity determination method Active JP6160030B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012115856A JP6160030B2 (en) 2012-05-21 2012-05-21 Imaging apparatus and illumination light quantity determination method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2012115856A JP6160030B2 (en) 2012-05-21 2012-05-21 Imaging apparatus and illumination light quantity determination method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
JP2017117783A Division JP2017200214A (en) 2017-06-15 2017-06-15 Imaging apparatus

Publications (2)

Publication Number Publication Date
JP2013242441A (en) 2013-12-05
JP6160030B2 JP6160030B2 (en) 2017-07-12

Family

ID=49843389

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012115856A Active JP6160030B2 (en) 2012-05-21 2012-05-21 Imaging apparatus and illumination light quantity determination method

Country Status (1)

Country Link
JP (1) JP6160030B2 (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07159845A (en) * 1993-12-09 1995-06-23 Nikon Corp Exposure controller for camera
JP2002369075A (en) * 2001-06-04 2002-12-20 Olympus Optical Co Ltd Image pickup unit, picture processor, image pickup program and picture processing program
JP2004264783A (en) * 2003-03-04 2004-09-24 Olympus Corp Camera and electronic camera
JP2004320284A (en) * 2003-04-15 2004-11-11 Nikon Gijutsu Kobo:Kk Digital camera
JP2007033715A (en) * 2005-07-25 2007-02-08 Eastman Kodak Co Imaging apparatus
JP2007133422A (en) * 2006-12-28 2007-05-31 Canon Inc Strobe system
JP2008233381A (en) * 2007-03-19 2008-10-02 Fujifilm Corp Imaging apparatus
JP2010134363A (en) * 2008-12-08 2010-06-17 Canon Inc Illumination control device and method
JP2011069893A (en) * 2009-09-24 2011-04-07 Nikon Corp Illuminator and camera system

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016213717A (en) * 2015-05-11 2016-12-15 キヤノン株式会社 Image processing apparatus, image processing method, program, and storage medium
JP2017003709A (en) * 2015-06-08 2017-01-05 キヤノン株式会社 Image-capturing device, light emission control method, and program
JP2017200214A (en) * 2017-06-15 2017-11-02 株式会社ニコン Imaging apparatus
JP2019068306A (en) * 2017-10-02 2019-04-25 キヤノン株式会社 Imaging apparatus and imaging method
JP6995554B2 (en) 2017-10-02 2022-01-14 キヤノン株式会社 Imaging device and imaging method

Also Published As

Publication number Publication date
JP6160030B2 (en) 2017-07-12

Similar Documents

Publication Publication Date Title
KR102604336B1 (en) Methods and apparatus for performing exposure estimation using a time-of-flight sensor
JP6946188B2 (en) Methods and equipment for multi-technology depth map acquisition and fusion
JP6139017B2 (en) Method for determining characteristics of light source and mobile device
JP6200151B2 (en) Imaging apparatus and dimming control method
WO2015027807A1 (en) Indoor positioning terminal, network, system and method thereof
CN105812673B (en) Flash lamp control system and method
WO2009139154A1 (en) Image pickup device and image pickup method
WO2009110082A1 (en) Image photographic device, image photographic method, and image photographic program
US20160330434A1 (en) Control method of a depth camera
JP6160030B2 (en) Imaging apparatus and illumination light quantity determination method
JP2013124941A (en) Distance measuring apparatus and distance measuring method
CN108965579A (en) Method and device thereof, terminal and the storage medium of ranging are realized based on TOF camera
EP3381015B1 (en) Systems and methods for forming three-dimensional models of objects
JP2011227372A5 (en)
JP2019140698A (en) Imaging apparatus and program
JP2017200214A (en) Imaging apparatus
JP7129196B2 (en) IMAGING DEVICE, CONTROL METHOD AND PROGRAM THEREOF
JP2008093131A (en) Glare evaluation apparatus
KR20120000234A (en) The method of auto-exposure control for white light 3d scanner using an illuminometer
JP2019128412A (en) Imaging device and control method of the same
JP6168874B2 (en) Strobe device, imaging system, strobe device control method and program
TW201624098A (en) System and method for controlling FLASH
JP2016085248A (en) Exposure computation device
JP2020091357A (en) Strobe device and control method of the same and program thereof
JP6671932B2 (en) Light emitting device and control method thereof

Legal Events

Date Code Title Description
2015-05-12 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2016-03-17 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2016-03-29 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2016-05-30 A521 Request for written amendment filed (JAPANESE INTERMEDIATE CODE: A523)
2016-10-18 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2016-12-14 A521 Request for written amendment filed (JAPANESE INTERMEDIATE CODE: A523)
2017-03-23 RD01 Notification of change of attorney (JAPANESE INTERMEDIATE CODE: A7426)
2017-03-23 RD03 Notification of appointment of power of attorney (JAPANESE INTERMEDIATE CODE: A7423)
2017-03-27 RD04 Notification of resignation of power of attorney (JAPANESE INTERMEDIATE CODE: A7424)
TRDD Decision of grant or rejection written
2017-05-16 A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2017-05-29 A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R150 Certificate of patent or registration of utility model (Ref document number: 6160030; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150)
R250 Receipt of annual fees (JAPANESE INTERMEDIATE CODE: R250)
R250 Receipt of annual fees (JAPANESE INTERMEDIATE CODE: R250)
R250 Receipt of annual fees (JAPANESE INTERMEDIATE CODE: R250)
R250 Receipt of annual fees (JAPANESE INTERMEDIATE CODE: R250)
R250 Receipt of annual fees (JAPANESE INTERMEDIATE CODE: R250)