WO2012014946A1 - Image processing apparatus and image processing program - Google Patents
Image processing apparatus and image processing program
- Publication number
- WO2012014946A1 (PCT/JP2011/067145)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- template
- evaluation value
- matching
- edge
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/164—Detection; Localisation; Normalisation using holistic features
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/635—Region indicators; Field of view indicators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/675—Focus control based on electronic image sensor signals comprising setting of focusing regions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- the present invention relates to an image processing apparatus and an image processing program.
- Pattern matching methods are known in which an image is divided into a plurality of regions, template matching is performed for each region, and the region with the highest similarity is extracted as the matching region (Patent Document 1).
- An image processing apparatus according to a first aspect of the present invention includes: an edge image generation device that extracts edges from an image to generate an edge image; a matching device that performs template matching on the edge image generated by the edge image generation device, using a template showing the shape of a fixed pattern of a predetermined shape; an evaluation value calculation device that calculates, based on the matching result by the matching device, an evaluation value for specifying the position of the fixed pattern of the predetermined shape in the image; and a specifying device that specifies the position of the fixed pattern of the predetermined shape in the image based on the evaluation value calculated by the evaluation value calculation device.
- According to a second aspect, it is preferable that the evaluation value calculation device moves the template within the image and, at each template position, multiplies the pixel value of each template pixel by the pixel value of the edge-image pixel at the same position, and calculates the evaluation value by summing or multiplying the per-pixel products over all pixels of the template.
- According to a third aspect, it is preferable that the specifying device specifies the position of the template at which the calculated evaluation value is maximal as the position of the fixed pattern of the predetermined shape in the image.
- According to a fourth aspect, the fixed pattern of the predetermined shape may be an AF area arranged in a shooting screen of a camera.
- An image processing program according to a fifth aspect of the present invention causes a computer to execute: an edge image generation procedure that extracts edges from an image to generate an edge image; a matching procedure that performs template matching on the edge image generated by the edge image generation procedure, using a template showing the shape of a fixed pattern of a predetermined shape; an evaluation value calculation procedure that calculates, based on the matching result in the matching procedure, an evaluation value for specifying the position of the fixed pattern of the predetermined shape in the image; and a specifying procedure that specifies the position of the fixed pattern of the predetermined shape in the image based on the evaluation value calculated in the evaluation value calculation procedure.
- According to the present invention, the position of a fixed pattern in an image can be specified with high accuracy.
- FIG. 1 is a block diagram showing a configuration of an embodiment of a camera.
- FIG. 2 is a diagram illustrating a display example of the AF frame in the shooting screen.
- FIG. 3 is a diagram illustrating a display example of a face detection frame.
- FIG. 4 is a diagram illustrating a specific example when the facial feature points and the AF frame overlap.
- FIGS. 5A to 5C are diagrams schematically showing an erasing method using adjacent pixels of the AF frame.
- FIG. 6 is a diagram illustrating a face detection result after AF frame deletion.
- FIG. 7 is a diagram illustrating a specific example of a blurred image.
- FIGS. 8A to 8C are diagrams showing an example of setting a detection area.
- FIG. 9 is a diagram illustrating a specific example of an edge image.
- FIG. 10 is a diagram illustrating a specific example of a template.
- FIG. 11 is a diagram illustrating how the image processing program is provided via a recording medium or a data signal such as the Internet.
- FIG. 1 is a block diagram showing a configuration of an embodiment of a camera to which an image processing apparatus according to the present invention is applied.
- the camera 100 includes an operation member 101, a lens 102, an image sensor 103, a control device 104, a memory card slot 105, and a monitor 106.
- the operation member 101 includes various input members operated by the user, such as a power button, a release button, a zoom button, a cross key, an enter button, a play button, and a delete button.
- The lens 102 is composed of a plurality of optical lenses, but is represented by a single lens in FIG. 1.
- The image sensor 103 is, for example, a CCD or CMOS sensor, and captures the subject image formed by the lens 102; the image signal obtained by imaging is output to the control device 104.
- The control device 104 generates image data in a predetermined image format, for example the JPEG format (hereinafter referred to as "main image data"), based on the image signal input from the image sensor 103. The control device 104 also generates display image data, for example thumbnail image data, based on the generated image data. The control device 104 then generates an image file containing the generated main image data and thumbnail image data together with header information, and outputs the image file to the memory card slot 105. In the present embodiment, both the main image data and the thumbnail image data are assumed to be image data expressed in the RGB color system.
- The memory card slot 105 is a slot for inserting a memory card as a storage medium, and writes the image file output from the control device 104 to the memory card.
- the memory card slot 105 reads an image file stored in the memory card based on an instruction from the control device 104.
- The monitor 106 is a liquid crystal monitor (rear monitor) mounted on the back of the camera 100, and displays images stored in the memory card, a setting menu for the camera 100, and the like. When the user sets the camera 100 to shooting mode, the control device 104 outputs display image data acquired from the image sensor 103 in time series to the monitor 106, so that a through image is displayed on the monitor 106.
- the control device 104 includes a CPU, a memory, and other peripheral circuits, and controls the camera 100.
- the memory constituting the control device 104 includes SDRAM and flash memory.
- the SDRAM is a volatile memory, and is used as a work memory for the CPU to develop a program when the program is executed or as a buffer memory for temporarily recording data.
- the flash memory is a non-volatile memory in which data of a program executed by the control device 104, various parameters read during program execution, and the like are recorded.
- The control device 104 superimposes frames (AF frames) corresponding to the arrangement positions of the distance measuring sensors on the through image (shooting screen) displayed on the monitor 106.
- In the present embodiment, 51 AF frames are displayed on the shooting screen as shown in FIG. 2.
- The camera 100 performs focus adjustment using the ranging information of the distance measuring sensor corresponding to one AF frame selected from the 51 AF frames by the control device 104 through known AF processing, or to one AF frame designated by the user.
- the camera 100 has a face detection function
- The control device 104 executes known face detection processing on the shooting screen, so that a human face present in the shooting screen can be detected.
- the control device 104 surrounds a region including the detected face with a face detection frame 3a and displays it on the through image, thereby clearly indicating the face detection result to the user.
- The control device 104 can also track the detected face between frames, thereby performing subject tracking during live view display, or automatically select an AF frame located near the detected face to adjust the focus.
- Face detection in the shooting screen is performed by extracting facial feature points such as the eyes and mouth from the shooting screen and judging, based on the positional relationship of the feature points, whether they constitute a human face.
- If feature points such as a person's eyes or mouth overlap the display position of an AF frame 4a, the control device 104 may fail to detect the facial feature points and thus fail to detect the human face accurately.
- the following method can be considered.
- the processing shown below is recorded as an image processing program, for example, in the flash memory of the control device 104, and is executed by the control device 104 functioning as an image processing device.
- The following processing is performed on a face detection image obtained by recording the image in the shooting screen to the buffer memory, and therefore does not affect the through image displayed on the monitor 106. That is, while the following processing is performed, the shooting screen with the AF frames shown in FIG. 2 continues to be displayed on the monitor 106.
- The control device 104 erases every AF frame 4a in the face detection image by replacing each pixel on which a frame line of the AF frame 4a lies with an adjacent pixel, thereby interpolating the pixels hidden by the AF frame 4a. Since the AF frames 4a are arranged at predetermined positions in the shooting screen as shown in FIG. 2, position information of the AF frames 4a in the shooting screen is recorded in advance in the flash memory or the like, so the control device 104 can determine where each AF frame 4a is located in the face detection image.
- the width of the frame line of the AF frame 4a is 2 pixels.
- A process will be described for the case where the AF frame 4a is constituted by pixels 5a and 5b forming a vertical frame line and pixels 5c and 5d forming a horizontal frame line.
- the control device 104 replaces the pixels 5a and 5b constituting the vertical frame line among these pixels as shown in FIG. 5B. In other words, the control device 104 replaces the pixel 5a using the pixel 5e adjacent to the right side of the pixel 5a, and replaces the pixel 5b using the pixel 5f adjacent to the left side of the pixel 5b. Further, the control device 104 replaces the pixels 5c and 5d constituting the horizontal frame line as shown in FIG. 5C. In other words, the control device 104 replaces the pixel 5c using the pixel 5g adjacent to the upper side of the pixel 5c, and replaces the pixel 5d using the pixel 5h adjacent to the lower side of the pixel 5d.
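The pixel-replacement step above can be sketched as follows. This is a minimal sketch assuming a grayscale image stored as a 2-D list of integers and a 2-pixel-wide frame line; the function name and the copy-from-outside ordering are illustrative assumptions, not taken from the patent.

```python
def erase_frame_lines(img, v_cols, h_rows):
    """Erase a 2-pixel-wide frame by copying the nearest pixels outside the line.

    img    : 2-D list of grayscale values (modified in place)
    v_cols : (left, right) column indices of the vertical frame line
    h_rows : (top, bottom) row indices of the horizontal frame line
    """
    left, right = v_cols
    for row in img:
        row[left] = row[left - 1]        # neighbor just left of the line
        row[right] = row[right + 1]      # neighbor just right of the line
    top, bottom = h_rows
    for x in range(len(img[0])):
        img[top][x] = img[top - 1][x]    # neighbor just above the line
        img[bottom][x] = img[bottom + 1][x]  # neighbor just below the line
    return img
```

Erasing the vertical line first lets the horizontal pass draw on already-interpolated pixels at the intersections; this ordering is one reasonable choice, not something the patent mandates.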
- the control device 104 can detect a person's face by face detection processing and display the detection frame 3a on the through image.
- the control device 104 needs to grasp the position of the AF frame 4a in the face detection image.
- As described above, the position information of the AF frame 4a in the shooting screen is recorded in advance in the flash memory or the like, so that the control device 104 can specify where the AF frame 4a is located in the face detection image.
- This presupposes that the AF frame 4a is always displayed at the same position in the shooting screen.
- In practice, however, the position of the AF frame 4a in the shooting screen may be mechanically shifted, or an optical shift may occur.
- In such a case, if the AF frame 4a is erased and the pixels are interpolated using the pre-recorded position information of the AF frame 4a, the interpolation accuracy may decrease.
- To cope with this, it is conceivable to record an image of the AF frame 4a in the flash memory in advance, and to perform template matching on the image in the shooting screen using the recorded image of the AF frame 4a as a template.
- However, this method also has the following problem. Known matching calculations for template matching, such as the cross-correlation method or the residual sequential test method, compute a quantity from the signal strengths at corresponding positions of the partial signal under calculation and the template signal, and total the results over the entire signal; the result therefore depends on the pixel values of the image itself and can become inaccurate when the image is unclear.
- the control device 104 detects the position of the AF frame 4a in the shooting screen as follows.
- Here, a case will be described in which the position of the AF frame 4a in the shooting screen is detected for a blurred image such as that shown in FIG. 7.
- Since the position information of the AF frame 4a is recorded in advance, the control device 104 can estimate the approximate position of the AF frame 4a based on that position information.
- FIG. 8A is an enlarged view of a region including a human face in the image shown in FIG. 7.
- The control device 104 sets a search area 8a of a predetermined size around the estimated position of the AF frame 4a, as shown in FIG. 8B. FIG. 8C is an enlarged view of the image within the set search area 8a. The control device 104 then extracts edges by taking differences between adjacent pixels within the search area 8a. As a result, an edge image such as that shown in FIG. 9 is generated for the search area 8a shown in FIG. 8B.
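The edge-extraction step might look like the sketch below. The patent only says edges are taken as differences between adjacent pixels, so summing the absolute horizontal and vertical differences is an assumption; any similar difference operator would fit the description.

```python
def edge_image(img):
    """Edge extraction by differences between adjacent pixels.

    Each output pixel is the absolute difference to the right neighbor plus
    the absolute difference to the neighbor below (0 at the image borders).
    """
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dx = abs(img[y][x + 1] - img[y][x]) if x + 1 < w else 0
            dy = abs(img[y + 1][x] - img[y][x]) if y + 1 < h else 0
            out[y][x] = dx + dy
    return out
```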
- Next, the control device 104 performs template matching on the generated edge image in the search area 8a, using a template for specifying the position of the AF frame 4a.
- The template used here is a mask image representing the shape of the AF frame 4a, as shown in FIG. 10: the outermost pixels have the value 1 and the inner pixels have the value 0.
- By performing template matching with this template, the position of the AF frame 4a in the search area 8a can be specified.
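Such a mask template (outermost pixels 1, inner pixels 0) can be generated as in this sketch; the rectangular shape and the function name are illustrative:

```python
def frame_template(height, width):
    """Mask image of a rectangular frame: border pixels are 1, interior is 0."""
    return [[1 if y in (0, height - 1) or x in (0, width - 1) else 0
             for x in range(width)]
            for y in range(height)]
```

With a 0/1 mask like this, the evaluation value described next reduces to summing the edge strengths that fall directly under the frame outline.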
- Specifically, the control device 104 moves the template shown in FIG. 10 within the search area 8a and, at each template position, multiplies the pixel value of each template pixel by the pixel value of the edge-image pixel at the same position, and sums the products over all pixels of the template.
- The control device 104 uses this sum as the evaluation value and judges that the AF frame 4a exists at the template position where the evaluation value is largest, thereby specifying the position of the AF frame 4a in the search area 8a and, consequently, the position of the AF frame 4a in the shooting screen.
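The search described above (slide the template, multiply corresponding pixels, sum over the template, keep the position of the maximum) can be sketched in pure Python as follows; names are illustrative, and the loop bounds assume the template fits entirely inside the search area.

```python
def locate_frame(edge, template):
    """Return the (x, y) offset in `edge` maximizing the evaluation value,
    i.e. the sum of template[j][i] * edge[y + j][x + i] over all template
    pixels, together with that maximum value."""
    eh, ew = len(edge), len(edge[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = None, None
    for y in range(eh - th + 1):
        for x in range(ew - tw + 1):
            score = sum(template[j][i] * edge[y + j][x + i]
                        for j in range(th) for i in range(tw))
            if best_score is None or score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos, best_score
```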
- (1) The control device 104 erases the AF frame 4a by replacing the pixels on which the frame lines of the AF frame 4a lie with adjacent pixels. Thereby, even when the AF frame 4a overlaps facial feature points, face detection becomes possible by erasing the AF frame 4a and interpolating the pixels, such as those of the eye portion 6a, that were hidden by the AF frame 4a.
- The control device 104 extracts edges from the search area 8a to generate an edge image, performs template matching on the generated edge image using a template indicating the shape of the AF frame 4a, calculates an evaluation value for specifying the position of the AF frame 4a in the shooting screen based on the matching result, and specifies the position of the AF frame 4a in the shooting screen based on the evaluation value. Thereby, even when the image is unclear, the position of the AF frame 4a in the shooting screen can be specified with high accuracy.
- The control device 104 moves the template within the search area 8a set in the shooting screen and, at each template position, multiplies the pixel value of each template pixel by the pixel value of the edge-image pixel at the same position, and calculates the evaluation value by summing the products over all pixels of the template. Thereby, the position of the AF frame 4a in the search area 8a can be specified with high accuracy.
- the control device 104 specifies the position of the template that maximizes the calculated evaluation value as the position of the AF frame 4a in the shooting screen. Thereby, the position of the AF frame 4a in the shooting screen can be specified with a simple process.
- the camera of the above-described embodiment can be modified as follows.
- (1) in the above-described embodiment, an example in which the control device 104 detects the position of the AF frame 4a in the shooting screen has been described.
- However, the control device 104 can also use the method of the above-described embodiment to detect the position of any fixed pattern of a predetermined shape contained in the shooting screen or an image. For example, it can detect a rectangle other than the AF frame 4a contained in an image, or the position of an alignment mark on a wafer.
- In the above-described embodiment, the control device 104 estimates the approximate position of the AF frame 4a based on the recorded position information and sets the search area 8a around it; it then generates an edge image within the search area 8a and performs template matching. However, when the position of the fixed pattern of the predetermined shape in the shooting screen or the image cannot be estimated, the control device 104 may generate an edge image for the entire shooting screen or the entire image and perform template matching on that.
- In the above-described embodiment, an example was described in which the control device 104 moves the template within the search area 8a and, at each template position, multiplies the pixel value of each template pixel by the pixel value of the edge-image pixel at the same position, calculating the evaluation value by summing the products over all pixels of the template.
- However, the control device 104 may instead move the template within the search area 8a and, at each template position, multiply the pixel value of each template pixel by the pixel value of the edge-image pixel at the same position, calculating the evaluation value by multiplying the per-pixel products together over all pixels of the template.
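A sketch of this multiplicative variant follows, with one caveat: taken literally, multiplying over all template pixels always yields zero because of the template's zero-valued interior, so this sketch multiplies only over the nonzero template pixels. That restriction is an interpretation, not something the patent spells out.

```python
def evaluation_by_product(edge, template, x, y):
    """Multiplicative evaluation value at offset (x, y): the product of
    template[j][i] * edge[y + j][x + i] over the template's nonzero pixels."""
    score = 1
    for j, row in enumerate(template):
        for i, t in enumerate(row):
            if t:  # skip interior zeros, which would collapse the product to 0
                score *= t * edge[y + j][x + i]
    return score
```

A product-based score rewards positions where every frame pixel lies on a strong edge, but a single zero-valued edge pixel under the frame still zeroes the score, so it is far more brittle than the summation variant.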
- the present invention can also be applied to other devices having a photographing function, such as a mobile phone with a camera and a video camera.
- The image processing program can also be provided through a recording medium or a data signal such as the Internet; FIG. 11 is a diagram showing this arrangement.
- the personal computer 300 is provided with a program via the CD-ROM 304.
- the personal computer 300 also has a connection function with the communication line 301.
- a computer 302 is a server computer that provides the program, and stores the program in a recording medium such as a hard disk 303.
- the communication line 301 is a communication line such as the Internet or personal computer communication, or a dedicated communication line.
- The computer 302 reads the program from the hard disk 303 and transmits it to the personal computer 300 via the communication line 301. That is, the program is embodied as a data signal on a carrier wave and transmitted via the communication line 301.
- the program can be supplied as a computer-readable computer program product in various forms such as a recording medium and a carrier wave.
- The present invention is not limited to the configurations of the above-described embodiment as long as the characteristic functions of the present invention are not impaired. A configuration combining the above-described embodiment with one or more of the modifications is also possible.
- This application is based on Japanese Patent Application No. 2010-170035 (filed on Jul. 29, 2010), the contents of which are incorporated herein by reference.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
Abstract
Description
Claims (5)
- An image processing apparatus comprising: an edge image generation device that extracts edges from an image to generate an edge image;
a matching device that performs template matching on the edge image generated by the edge image generation device, using a template showing the shape of a fixed pattern of a predetermined shape;
an evaluation value calculation device that calculates, based on the matching result by the matching device, an evaluation value for specifying the position of the fixed pattern of the predetermined shape in the image; and
a specifying device that specifies, based on the evaluation value calculated by the evaluation value calculation device, the position of the fixed pattern of the predetermined shape in the image. - The image processing apparatus according to claim 1,
wherein the evaluation value calculation device moves the template within the image and, at each template position, multiplies the pixel value of each template pixel by the pixel value of the edge-image pixel at the same position, and calculates the evaluation value by summing or multiplying the products over all pixels of the template. - The image processing apparatus according to claim 2,
wherein the specifying device specifies the position of the template at which the calculated evaluation value is maximal as the position of the fixed pattern of the predetermined shape in the image. - The image processing apparatus according to any one of claims 1 to 3,
wherein the fixed pattern of the predetermined shape is an AF area arranged in a shooting screen of a camera. - An image processing program for causing a computer to execute: an edge image generation procedure that extracts edges from an image to generate an edge image;
a matching procedure that performs template matching on the edge image generated by the edge image generation procedure, using a template showing the shape of a fixed pattern of a predetermined shape;
an evaluation value calculation procedure that calculates, based on the matching result in the matching procedure, an evaluation value for specifying the position of the fixed pattern of the predetermined shape in the image; and
a specifying procedure that specifies, based on the evaluation value calculated in the evaluation value calculation procedure, the position of the fixed pattern of the predetermined shape in the image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/812,418 US20130129226A1 (en) | 2010-07-29 | 2011-07-27 | Image processing apparatus and image processing program |
CN2011800374432A CN103039068A (zh) | 2010-07-29 | 2011-07-27 | Image processing apparatus and image processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010170035A JP2012034069A (ja) | 2010-07-29 | Image processing apparatus and image processing program |
JP2010-170035 | 2010-07-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012014946A1 true WO2012014946A1 (ja) | 2012-02-02 |
Family
ID=45530149
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/067145 WO2012014946A1 (ja) | 2011-07-27 | Image processing apparatus and image processing program |
Country Status (4)
Country | Link |
---|---|
US (1) | US20130129226A1 (ja) |
JP (1) | JP2012034069A (ja) |
CN (1) | CN103039068A (ja) |
WO (1) | WO2012014946A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10729579B2 (en) | 2014-07-11 | 2020-08-04 | National Institutes Of Health | Surgical tool and method for ocular tissue transplantation |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104641626B (zh) * | 2012-09-19 | 2018-02-27 | Fujifilm Corp | Imaging device and focus confirmation display method |
RU2677764C2 (ru) * | 2013-10-18 | 2019-01-21 | Конинклейке Филипс Н.В. | Координатная привязка медицинских изображений |
US10504267B2 (en) * | 2017-06-06 | 2019-12-10 | Adobe Inc. | Generating a stylized image or stylized animation by matching semantic features via an appearance guide, a segmentation guide, and/or a temporal guide |
US10825224B2 (en) | 2018-11-20 | 2020-11-03 | Adobe Inc. | Automatic viseme detection for generating animatable puppet |
CN115552460A (zh) * | 2020-05-22 | 2022-12-30 | Fujifilm Corp | Image data processing device and image data processing system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005173649A (ja) * | 2003-12-05 | 2005-06-30 | Institute Of Physical & Chemical Research | 画像処理におけるテンプレートマッチング処理方法と処理装置 |
JP2007293732A (ja) * | 2006-04-27 | 2007-11-08 | Hitachi High-Technologies Corp | 検査装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0252587A (ja) * | 1988-08-17 | 1990-02-22 | Olympus Optical Co Ltd | Line-sequential color-difference signal storage system |
JPH11252587A (ja) * | 1998-03-03 | 1999-09-17 | Matsushita Electric Ind Co Ltd | Object tracking device |
JP4290100B2 (ja) * | 2003-09-29 | 2009-07-01 | Canon Inc | Imaging apparatus and control method therefor |
JP2006254321A (ja) * | 2005-03-14 | 2006-09-21 | Matsushita Electric Ind Co Ltd | Person tracking device and person tracking program |
JP4457358B2 (ja) * | 2006-05-12 | 2010-04-28 | Fujifilm Corp | Face detection frame display method, character information display method, and imaging apparatus |
KR101295433B1 (ko) * | 2007-06-19 | 2013-08-09 | Samsung Electronics Co., Ltd. | Apparatus and method for auto focusing of a camera |
JP4961282B2 (ja) * | 2007-07-03 | 2012-06-27 | Canon Inc | Display control apparatus and control method therefor |
JP2009152725A (ja) * | 2007-12-19 | 2009-07-09 | Fujifilm Corp | Automatic tracking apparatus and method |
US20090174805A1 (en) * | 2008-01-07 | 2009-07-09 | Motorola, Inc. | Digital camera focusing using stored object recognition |
JP2010152135A (ja) * | 2008-12-25 | 2010-07-08 | Fujinon Corp | Safe area warning device |
US8520131B2 (en) * | 2009-06-18 | 2013-08-27 | Nikon Corporation | Photometric device, imaging device, and camera |
-
2010
- 2010-07-29 JP JP2010170035A patent/JP2012034069A/ja active Pending
-
2011
- 2011-07-27 CN CN2011800374432A patent/CN103039068A/zh active Pending
- 2011-07-27 WO PCT/JP2011/067145 patent/WO2012014946A1/ja active Application Filing
- 2011-07-27 US US13/812,418 patent/US20130129226A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005173649A (ja) * | 2003-12-05 | 2005-06-30 | Institute Of Physical & Chemical Research | 画像処理におけるテンプレートマッチング処理方法と処理装置 |
JP2007293732A (ja) * | 2006-04-27 | 2007-11-08 | Hitachi High-Technologies Corp | 検査装置 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10729579B2 (en) | 2014-07-11 | 2020-08-04 | National Institutes Of Health | Surgical tool and method for ocular tissue transplantation |
Also Published As
Publication number | Publication date |
---|---|
US20130129226A1 (en) | 2013-05-23 |
JP2012034069A (ja) | 2012-02-16 |
CN103039068A (zh) | 2013-04-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8254630B2 (en) | Subject extracting method and device by eliminating a background region using binary masks | |
TWI425828B (zh) | Imaging device, method for determining image region, and computer-readable recording medium | |
JP5959923B2 (ja) | Detection device, control method and control program therefor, and imaging device and display device | |
KR101537948B1 (ko) | Imaging method and apparatus using face pose estimation | |
JP5246078B2 (ja) | Subject position specifying program, and camera | |
WO2012014946A1 (ja) | Image processing apparatus and image processing program | |
JP2005353010A (ja) | Image processing device and imaging device | |
US10013632B2 (en) | Object tracking apparatus, control method therefor and storage medium | |
JP4868046B2 (ja) | Image processing apparatus, image processing method, and program | |
KR20110089655A (ko) | Digital image capturing apparatus and method for guiding shooting composition | |
KR101204888B1 (ko) | Digital photographing apparatus, control method therefor, and recording medium storing a program for executing the control method | |
JP2009141475A (ja) | Camera | |
JP2011040993A (ja) | Subject tracking program, and camera | |
JP2008035125A (ja) | Imaging apparatus, image processing method, and program | |
JP4894708B2 (ja) | Imaging apparatus | |
JP5083080B2 (ja) | Image matching device, and camera | |
JP2008209306A (ja) | Camera | |
JP2007312206A (ja) | Imaging device and image reproduction device | |
JP2011191860A (ja) | Imaging apparatus, imaging processing method, and program | |
JP4565273B2 (ja) | Subject tracking device, and camera | |
JP4632417B2 (ja) | Imaging apparatus and control method therefor | |
JP2009146033A (ja) | Partial image extraction method | |
JP2006109005A (ja) | Image output device and imaging device | |
JP2010157792A (ja) | Subject tracking device | |
JP2005004287A (ja) | Face image detection device and control method therefor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180037443.2 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11812535 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13812418 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11812535 Country of ref document: EP Kind code of ref document: A1 |